SucculentBeetle7
Moderator
5 Questions, 16 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges: 1
12 × Eureka!
0 Votes
26 Answers
1K Views
Hello community! Is there an option to only download a part of a Dataset with .get_local_copy()? I imagine something like this, but I can't find the right wa...
3 years ago
0 Votes
5 Answers
977 Views
I have another pipeline-related question. In a pipeline controller Task, I would like to add several steps that are based on the same base Task (I'm passing ...
3 years ago
0 Votes
5 Answers
858 Views
Hello! Is it possible to run pipeline controller Tasks locally, similar to regular Tasks that run locally by default if task.execute_remotely() is not set?
3 years ago
0 Votes
13 Answers
901 Views
Hi, our server IP address has changed, and this breaks all the paths to artifacts / datasets. Is there a way to fix the old paths so that they can be accesse...
3 years ago
0 Votes
3 Answers
960 Views
Hi! I'm trying clearml 1.1.3. I'm trying to get a dataset with Dataset.get(dataset_id='my_id') . I get this error message: TypeError: clearml.task.Task.__get...
2 years ago
0 I have another pipeline-related question. In a pipeline controller Task, I would like to add several steps that are based on the same base Task (I'm passing the

Here's the example: Even though I'm passing different parameters to the two clones, they will end up configured with the same (the second) parameter set.
from clearml import Task
from clearml.automation import PipelineController

project_name = 'pipeline_test'

# Controller Task that drives the pipeline
task = Task.init(project_name=project_name,
                 task_name='pipeline_main',
                 task_type=Task.TaskTypes.controller,
                 reuse_last_task_id=False)

pipe = PipelineController(default_execution_queue='default_queue',
                          add_pipeline_tags=False)

#%%
...
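For context, a guess at how the elided part might continue (the step names, base Task name, and the 'General/...' parameter keys below are assumptions, not the original code):

# Two steps cloned from the same base Task, each with its own parameter_override.
# The override keys must include the section name, e.g. 'General/...'.
pipe.add_step(name='step_a',
              base_task_project=project_name,
              base_task_name='pipeline_step',            # assumed base Task name
              parameter_override={'General/param': 1})   # assumed parameter key

pipe.add_step(name='step_b',
              base_task_project=project_name,
              base_task_name='pipeline_step',
              parameter_override={'General/param': 2})

pipe.start()
pipe.wait()
pipe.stop()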

3 years ago
0 Hello, is there any way to download all scalars, not only last metrics, via the Python interface? I am going to analyze my learning curves by myself.

Hello! I'm doing it like this:
scalars = task.get_reported_scalars()
It returns a dictionary with all the scalars for all iterations that you can access like so:
scalars['epoch_accuracy']['validation: epoch_accuracy']['y']
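A minimal sketch of reading such a curve back for analysis (the task ID is a placeholder, and the 'epoch_accuracy' title and series name are taken from the example above):

from clearml import Task
import matplotlib.pyplot as plt

# Placeholder ID: replace with the Task whose scalars you want to analyze.
task = Task.get_task(task_id='your_task_id')

# Dictionary of {graph title: {series name: {'x': [...], 'y': [...]}}}
scalars = task.get_reported_scalars()
series = scalars['epoch_accuracy']['validation: epoch_accuracy']

plt.plot(series['x'], series['y'])
plt.xlabel('iteration')
plt.ylabel('validation accuracy')
plt.show()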

3 years ago
0 Hi! I'm trying clearml 1.1.3. I'm trying to get a dataset with

Oh yes! I should have checked the last messages before posting. Thank you for pointing me to it! I will try the fix.

2 years ago
0 Hi! I'm trying clearml 1.1.3. I'm trying to get a dataset with

It's working now for me with 1.1.4rc0 as well, thank you!

2 years ago
0 Hello! Is it possible to run pipeline controller Tasks locally, similar to regular Tasks that run locally by default if

The use case is for example my other question from today. I want to test/debug the parameter_override functionality (and pipelines in general). For this it would be fastest for me if the Tasks that are part of the pipeline are also running locally.

3 years ago
0 Hi there! We've recently started to explore the Dataset page in the enterprise version and the corresponding HyperDatasets. However, we are using 3D image data in hd5 format. Is there any way to visualize either a slice of the 3D data or visualize it as v

How does clearml detect a preview or thumbnail associated with a file? e.g. if we would add a ['preview'] group into the .hdf5 file (containing a png / tiff /.. image), would it be able to find it?

2 years ago
0 Hello! Is it possible to run pipeline controller Tasks locally, similar to regular Tasks that run locally by default if

Thank you! In fact I'm already using "start". I should have been more clear: Can I make the Tasks that I'm adding to the pipeline also run locally, such that the entire pipeline runs locally?
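For what it's worth, more recent clearml releases expose a start_locally method on the controller for exactly this; a sketch, assuming such a version and the 'pipe' controller from the snippet above:

# Runs the controller in this process and executes every pipeline step
# as a local subprocess instead of enqueuing it on an agent.
pipe.start_locally(run_pipeline_steps_locally=True)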

3 years ago
0 Hello community! Is there an option to only download a part of a Dataset with .get_local_copy()? I imagine something like this, but I can't find the right way to do it.

Thank you! Yes that might be the best option. I'll have to divide it already when I create the datasets then, right?
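A rough sketch of that splitting approach, assuming hypothetical part names and paths (one Dataset per part, so each part can be downloaded on its own):

from clearml import Dataset

# Create one Dataset per part instead of a single monolithic Dataset.
for part in ('train', 'validation', 'test'):                  # assumed part names
    ds = Dataset.create(dataset_name=f'my_dataset_{part}',    # assumed naming scheme
                        dataset_project='dataset_examples')
    ds.add_files(path=f'./data/{part}')                       # assumed local layout
    ds.upload()
    ds.finalize()

# Later, fetch only the part that is actually needed:
val_path = Dataset.get(dataset_name='my_dataset_validation',
                       dataset_project='dataset_examples').get_local_copy()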

3 years ago
0 I have another pipeline-related question. In a pipeline controller Task, I would like to add several steps that are based on the same base Task (I'm passing the

The section name needs to be added in both the base task as well as the pipeline task for it to work. Since the parameters also show up in the "General" section in the web interface when the parameters are connected only with their name (without section name), I didn't think that this could matter. Thank you for your help!
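A sketch of that pairing (project, task, section, and parameter names are placeholders): the section passed to task.connect() in the base Task has to reappear as the prefix of the parameter_override key in the controller.

from clearml import Task

# --- base task script ---
task = Task.init(project_name='pipeline_test', task_name='pipeline_step')
params = {'learning_rate': 0.01}            # assumed parameter
task.connect(params, name='Args')           # explicit section name

# --- controller script (shown as comments, it lives in a separate file) ---
# pipe.add_step(name='step_a',
#               base_task_project='pipeline_test',
#               base_task_name='pipeline_step',
#               parameter_override={'Args/learning_rate': 0.001})  # same 'Args/' prefix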

3 years ago
0 Hi, our server IP address has changed, and this breaks all the paths to artifacts / datasets. Is there a way to fix the old paths so that they can be accessed again? Thank you!

Or, when I try to load a dataset from an old task, this is the error that I get:
File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/clearml/datasets/dataset.py", line 835, in get raise ValueError('Could not load Dataset id={} state'.format(task.id)) ValueError: Could not load Dataset id=7d05e1cad34441799f79931337612ae1 state

3 years ago
0 If I have a Task and a Dataset is being created in a Task, how can I get a “link” that this Dataset is created in this Task, similar to how a Model has the Task where it came from

This also helped me 🙂 Really, I'd like it both ways, such that the Task links to the Dataset it created, as well as the Dataset to the Task it was created by.
Right now, for the second direction, I'm doing:
dataset = Dataset.create(...)
task.connect({'dataset_id': dataset.id}, name='Datasets')
Is there a better way to do this? (I'm using it to pass Datasets between Tasks, one Task operating on a Dataset that was created by another Task.) Thank you!
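On the consuming side, a sketch of how that ID might be read back (project and task names are placeholders):

from clearml import Task, Dataset

# Consuming Task: read the dataset_id from its configuration and fetch the data.
task = Task.init(project_name='my_project', task_name='consume_dataset')
config = {'dataset_id': ''}
task.connect(config, name='Datasets')       # filled in by the producing Task / the UI

dataset = Dataset.get(dataset_id=config['dataset_id'])
local_path = dataset.get_local_copy()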

3 years ago
0 If I have a Task and a Dataset is being created in a Task, how can I get a “link” that this Dataset is created in this Task, similar to how a Model has the Task where it came from

Nice!
I can't really think of a reason why not to do it automatically, at least for my use case. What name would you give the dataset(s) in the Configuration? Also, the IDs as an entry in the Configuration will not be clickable in the web interface, right?

3 years ago