GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity 8 months ago

Reputation: 0
Badges: 1 (183 × Eureka!)
0 Votes 12 Answers 231 Views
2 years ago
0 Votes 11 Answers 213 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
2 years ago
0 Votes 6 Answers 191 Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
2 years ago
0 Votes 7 Answers 236 Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
2 years ago
0 Votes 6 Answers 189 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
2 years ago
0 Votes 0 Answers 227 Views
Hi, Let's say I have several functions decorated with PipelineDecorator.component (functions A, B and C). Function C can only be executed after functions A a...
one year ago
0 Votes 9 Answers 248 Views
Hi, I just updated clearml to version v1.1.3. Right after launching a training pipeline, the system crashed due to the following error: Traceback (most recen...
one year ago
0 Votes 4 Answers 215 Views
I'm trying to implement a cleanup service by following this example https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_servic...
one year ago
0 Votes 12 Answers 217 Views
one year ago
0 Votes 30 Answers 200 Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
one year ago
0 Votes 3 Answers 219 Views
one year ago
0 Votes 21 Answers 228 Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
one year ago
0 Votes 10 Answers 216 Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
2 years ago
0 Votes 1 Answer 203 Views
Is there any way to create a queue from code?
2 years ago
0 Votes 8 Answers 188 Views
one year ago
0 Votes 29 Answers 216 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
one year ago
0 Votes 14 Answers 194 Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
one year ago
0 Votes 2 Answers 188 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
one year ago
0 Votes 1 Answer 176 Views
Is it possible to attach to an OutputModel an object closely related to it (such as some product of data preprocessing that has been done specifically for that mo...
2 years ago
0 Votes 1 Answer 198 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
2 years ago
0 Votes 2 Answers 222 Views
one year ago
0 Votes 13 Answers 189 Views
one year ago
0 Votes 11 Answers 271 Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
2 years ago
0 Votes 6 Answers 186 Views
one year ago
0 Votes 11 Answers 228 Views
Hi! I was wondering why ClearML recognizes Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...
one year ago
0 Votes 10 Answers 195 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
one year ago
0 Votes 3 Answers 209 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
one year ago
0 Votes 2 Answers 180 Views
Why does the Task.add_tags method have no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...
one year ago
0 Votes 1 Answer 218 Views
one year ago
0 Votes 9 Answers 196 Views
Is it good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
2 years ago
0 Hi All! When I Set A List As A Task Parameter And Later Try To Retrieve It, What I Get Is A String. Is This The Expected Behavior? I Have Prepared The Following Snippet So That You Can Reproduce It.

If I try to connect a dictionary of type dict[str, list] with task.connect, when retrieving this dictionary with task.get_parameter I get another dictionary of type dict[str, str]. Therefore, I see the same behavior using task.connect :/

one year ago
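The behavior described above matches ClearML's habit of storing task parameter values as strings on the server. A common workaround is to parse the stored string back into a Python object (the stored value below is illustrative, standing in for what get_parameters() would return):

```python
import ast

# A list connected via task.connect comes back from the server as
# its string representation, e.g. "['a', 'b', 'c']".
stored = "['a', 'b', 'c']"           # illustrative server-side value
restored = ast.literal_eval(stored)  # safely parse it back to a list
assert restored == ["a", "b", "c"]
```

ast.literal_eval only accepts Python literals, so it is a safer choice than eval for untrusted parameter strings.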
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

By adding the slash I have been able to see that indeed the dataset is stored in output_url. However, when calling finalize, I get the same error. And yes, I have installed the version corresponding to the last commit :/

one year ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 In the 'status.json' file I could see the 'is_dirty' flag is set to True

one year ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 Oops, something still seems to be wrong. When trying to retrieve the dataset using get_local_copy() I get the following error:
Traceback (most recent call last):
  File "/home/user/myproject/lab.py", line 27, in <module>
    print(dataset.get_local_copy())
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 554, in get_local_copy
    target_folder = self._merge_datasets(
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-p...

one year ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Yes, I'm working with the latest commit. Anyway, I have tried to run dataset.get_local_copy() on another machine and it works. I have no idea why this happens. However, on the new machine get_local_copy() does not return the path I expect. If I have this code:
dataset.upload(output_url="/home/user/server_local_storage/mock_storage")
I would expect the dataset to be stored under the path specified in output_url. But what I get with get_local_copy() is the follo...

one year ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Indeed it does! But what still puzzles me so badly is why I get below path when running dataset.get_local_copy() on one of the machines of my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

one year ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Mmm what would be the implications of not being part of the DAG? I mean, how could that step be launched if it is not part of the execution graph?

one year ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Hi AgitatedDove14 ,
Any updates on the new ClearML release that fixes the bugs we mentioned in this thread? :)

one year ago
0 Hi All, I Am Faced With The Situation That My Company'S Gitlab Is Temporarily Out Of Service At A Certain Time In The Early Morning (Due To Regular Maintenance Service, Something I Cannot Control). Normally, My System'S Inference Pipelines Are Scheduled T

Mmm well, I can think of a pipeline that could save its state just before the error occurred, so that with some crontab/scheduler the pipeline could be resumed from the point where it stopped if it had not completed. Is there any functionality like this? Something like PipelineDecorator/PipelineController.resume_from(state_filepath)?

one year ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Exactly, at first I was trying to call a component from another component, but it didn't work. Then I thought it would be more natural to do this using a pipeline, but it didn't recognize the user_config_creation function even though I imported it as I would under PipelineDecorator.component. I really like the idea of enabling an argument to specify the components you are going to use in the pipeline so they are in the step's context! I will be eagerly waiting for that feature :D

one year ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

I mean to use a function decorated with PipelineDecorator.pipeline inside another pipeline decorated in the same way.
In the traceback attached below you can see that I am trying to use a component named user_config_creation inside the create_user_configs sub-pipeline. I have imported user_config_creation inside create_user_configs but a KeyError is raised (however I assume the function has been imported correctly because no ImportError or ModuleNo...

one year ago
0 Hi! Can Someone Show Me An Example Of How

I don't know if you remember the need I had some time ago to launch the same pipeline through configuration. I've been thinking about it and I think PipelineController fits my needs better than PipelineDecorator in that respect.

one year ago
0 Hi! Can Someone Show Me An Example Of How

If I have the ability to clone and modify the same task over and over again, then in principle I would no longer need the multi_instance support feature from PipelineDecorator.pipeline. Is this correct, or are they different things?

one year ago
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14 , so isn't it ClearML best practice to create a draft pipeline to have the task on the server so that it can be cloned, modified and executed at any time?

one year ago