GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges: 1 (183 × Eureka!)

0 Votes 10 Answers 984 Views
Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the follow...
2 years ago
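For context, a minimal sketch that reproduces the behavior asked about above; the original snippet in the post is truncated, so the project and task names here are made-up illustrations rather than the poster's code:

```python
from clearml import Task

# Hypothetical reproduction: connect a dict that contains a list, then read
# the parameters back from the task.
task = Task.init(project_name="examples", task_name="list_param_check")
task.connect({"my_list": [1, 2, 3]})

# Per the question, the retrieved value comes back as a string
# (e.g. "[1, 2, 3]") rather than as a Python list.
print(task.get_parameters())
```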
0 Votes 11 Answers 1K Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
3 years ago
0 Votes 1 Answer 1K Views
It is possible to attach to an OutputModel an object closely related to it (as some product of data preprocessing that has been done specifically for that mo...
3 years ago
0 Votes 13 Answers 1K Views
3 years ago
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 7 Answers 1K Views
3 years ago
0 Votes 1 Answer 923 Views
Is there any way to create a queue from code?
3 years ago
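A sketch of one way to do this through the server's REST API wrapper; it assumes valid credentials in clearml.conf, and the queue name is made up:

```python
from clearml.backend_api.session.client import APIClient

# APIClient wraps the ClearML server REST endpoints using the credentials
# configured in clearml.conf.
client = APIClient()
client.queues.create(name="my_new_queue")  # hypothetical queue name
```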
0 Votes 2 Answers 916 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
3 years ago
0 Votes 6 Answers 892 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
3 years ago
0 Votes 10 Answers 925 Views
Hi! Is there any reason why integer/float values are casted to string when connecting arguments dictionary to task and then retrieve them using task.get_para...
3 years ago
0 Votes 18 Answers 994 Views
Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by PipelineDecorator.component in the function arguments....
3 years ago
0 Votes 10 Answers 906 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
3 years ago
0 Votes 11 Answers 977 Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
3 years ago
0 Votes 3 Answers 920 Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
3 years ago
0 Votes 6 Answers 956 Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
3 years ago
0 Votes 29 Answers 996 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
3 years ago
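The snippet in the post above is truncated, so as a reference point, here is a typical minimal dataset-creation flow; the dataset name, project and local folder are illustrative assumptions:

```python
from clearml import Dataset

# Create a new dataset version, add a local folder, upload it and close it.
ds = Dataset.create(dataset_name="my_dataset", dataset_project="examples")
ds.add_files("data/")   # hypothetical local folder with the files to version
ds.upload()
ds.finalize()
print(ds.id)
```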
0 Votes 3 Answers 936 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
3 years ago
0 Votes 8 Answers 880 Views
2 years ago
0 Votes 4 Answers 967 Views
I'm trying to implement a cleanup service by following this example https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_servic...
3 years ago
0 Votes 11 Answers 1K Views
Hi! I was wondering why ClearML recognize Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...
3 years ago
0 Votes 9 Answers 1K Views
Hi, I just updated clearml to version v1.1.3. Right after launching a training pipeline, the system crashed due to the following error: Traceback (most recen...
3 years ago
0 Votes 12 Answers 1K Views
3 years ago
0 Votes 11 Answers 987 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
3 years ago
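For reference, a minimal sketch of the output_uri usage this question refers to; the bucket path is a made-up example:

```python
from clearml import Task

# output_uri tells ClearML where to upload the task's models and artifacts.
task = Task.init(
    project_name="example_project",
    task_name="example_task",
    output_uri="s3://my-bucket/clearml-artifacts",  # illustrative destination
)
```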
0 Votes 14 Answers 951 Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
3 years ago
0 Votes 5 Answers 990 Views
Hi! From a task created using PipelineDecorator.pipeline , is there any way to get a task ID from the name of the step listed in the table below? My plan is ...
3 years ago
0 Votes 7 Answers 1K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
3 years ago
0 Votes 14 Answers 982 Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
2 years ago
0 Votes 6 Answers 1K Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
3 years ago
0 Votes 10 Answers 1K Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
3 years ago
0 Votes 1 Answer 959 Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Hi AgitatedDove14, great, glad it was fixed quickly!

By the way, before releasing version 1.1.3 you might want to take a look at this mock example. I'm trying to run the same pipeline (with different configurations) in a single for loop, as you can see below:
```python
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="myqueue1")
def step_1(msg: str):
    msg += "\nI've survived step 1!"
    re...
```

3 years ago
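Since the snippet above is cut off, here is a self-contained sketch of the kind of loop being described; the pipeline name, project, queue and configuration values are illustrative assumptions, not taken from the original post:

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="myqueue1")
def step_1(msg: str):
    # Stub body standing in for the truncated original step.
    msg += "\nI've survived step 1!"
    return msg

@PipelineDecorator.pipeline(name="mock_pipeline", project="examples", version="0.1")
def execute_orchestrator(msg: str):
    return step_1(msg)

if __name__ == "__main__":
    # Run the same pipeline several times with different configurations;
    # as noted later in the thread, each iteration blocks until the
    # previous run has finished.
    for config_msg in ["config A", "config B"]:
        execute_orchestrator(msg=config_msg)
```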
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Sure, converting pipelines into components also works for me (ignoring still having to fix the problem with LazyEvalWrapper return values). But this way some interesting features of the pipeline are missing, such as displaying the step execution DAG in the PLOTS tab.

3 years ago
0 Hi Guys, Suppose I Have The Following Script:

So ClearML should also detect the dependencies of the imported scripts, right? In that case, shouldn't it detect that I am going to use tensorflow and install it as well? Because right now it is not actually recognizing it.

3 years ago
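One general way to force a package that automatic analysis misses into the task's requirements (a standard ClearML facility, not necessarily what was recommended in this thread) is Task.add_requirements, called before Task.init; the project and task names are made up:

```python
from clearml import Task

# Explicitly add tensorflow to the detected requirements;
# this must run before Task.init().
Task.add_requirements("tensorflow")
task = Task.init(project_name="examples", task_name="manual_requirements")
```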
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Mmm what would be the implications of not being part of the DAG? I mean, how could that step be launched if it is not part of the execution graph?

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Yes, I like it! I had already gotten used to the execute_steps_as_functions argument of PipelineDecorator.debug_pipeline(), but I find your proposal more intuitive.

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Indeed it does! But what still puzzles me is why I get the path below when running dataset.get_local_copy() on one of the machines of my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

BTW, how can I run 'execute_orchestrator' concurrently? That is, launch it for several configurations at the same time? The way it's implemented now, it doesn't start the next configuration until the current one is finished.

3 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Exactly, at first I was trying to call a component from another component, but it didn't work. Then I thought it would be more natural to do this using a pipeline, but it didn't recognize the user_config_creation function even though I imported it just as I would under PipelineDecorator.component. I really like the idea of an argument for specifying the components you are going to use in the pipeline so they are in the step's context! I will be eagerly awaiting that feature :D

3 years ago
0 Let's Say That I Specify The

Sure, but I mean, apart from labeling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Mmm, that's weird, because I can see the type hints in the function arguments of the automatically generated script. So maybe I'm doing something wrong, or it's a bug, since they have been passed to the created step (I'm using clearml 1.1.2 and clearml-agent 1.1.0).

3 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Hi AgitatedDove14,
Any updates on the new ClearML release that fixes the bugs we mentioned in this thread? :)

3 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

Well, I need to write boilerplate parsing code if I want to use the original values after I connect the dictionary to the task, so it's a bit messy.
Currently I'm using clearml v1.0.5 and clearml-agent v1.0.0

3 years ago
0 Hi, Is There A Simple Way To Make

I see, but I don't understand the part where you talk about passing the task ID to the child processes. Sorry if it's something trivial. I recently started working with ClearML.

3 years ago
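As background for the "pass the task ID to the child processes" idea, a rough sketch of what that could look like; the Dask setup, worker function and file names here are illustrative assumptions, not the approach confirmed in the thread:

```python
from clearml import Task
from dask.distributed import Client

def process_chunk(path: str, task_id: str):
    # Re-attach to the already-initialized task inside the worker instead of
    # calling Task.init() again in every child process.
    task = Task.get_task(task_id=task_id)
    task.get_logger().report_text(f"processing {path}")
    return path

if __name__ == "__main__":
    main_task = Task.init(project_name="examples", task_name="dask_reader")
    client = Client()  # local Dask cluster, for illustration only
    futures = client.map(process_chunk, ["a.nc", "b.nc"], task_id=main_task.id)
    print(client.gather(futures))
```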
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

SuccessfulKoala55 I have not tried with argparse yet, but maybe I will encounter the same problem.

3 years ago
0 Hi! Can Someone Show Me An Example Of How

I don't know if you remember the need I had some time ago to launch the same pipeline through configuration. I've been thinking about it and I think PipelineController fits my needs better than PipelineDecorator in that respect.

2 years ago
0 Hi All, I Am Testing The New

I'm using the latest version (1.1.1)

3 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

For some reason I can't get the values back in their original types. Only the dictionary keys are returned in the raw nested structure; the values remain cast to strings.

3 years ago
0 Hi All, I Am Testing The New

Sure, it's already enabled. I noticed another parameter related to environment caching in the ClearML agent configuration, named venv_update (I believe it's still in beta). Do you think enabling this parameter helps significantly in building environments faster?

Yes, I guess so. Since pipelines are designed to be executed remotely, it may be pointless to enable an output_uri parameter in PipelineDecorator.component. Anyway, could another task be initialized in the same scr...

3 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

AnxiousSeal95 I see. That's why I was thinking of storing the model inside a task, just like with the Dataset class, so that you can either use just the model via InputModel, or the model and all its artifacts via Task.get_task using the ID of the task where the model is located.
I would like my cleanup service to remove all tasks older than two weeks, but not the models. Right now, if I delete all the tasks, the model does not work (as it needs the training tasks). For now, I ...

3 years ago
0 Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

Since there is still time, I would like to report another minor bug, related to the add_pipeline_tags parameter of PipelineDecorator.pipeline. It turns out that when the pipeline consists of components that in turn use other components (via helper_functions), these nested components are not tagged with 'pipe: <pipeline_task_id>'. I assume it shouldn't work that way, right?

3 years ago
0 Hi, Is There A Simple Way To Make

Are you suggesting just taking the read_and_process_file function out of the read_dataset method, or maybe decoupling the read_dataset method from the NetCDFReader class so that it is not pickled along with the class instance itself?

As for the second option, do you mean creating the task in the __init__ method of the NetCDFReader class?

It would be great to make the Task picklable, since at the moment what are the most frequently used options for integrating ...

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

Well, this is just a mock example 🙂. In the real application I'm working on, there will be more than one configuration file (in principle, one for the data and one for the DL model). Regarding the fix, I am not in a hurry at the moment. I'll happily wait for tomorrow (or the day after) when the commit is pushed!

2 years ago
0 Hi All, I Am Testing The New

How can I tell ClearML that I will use the same virtual environment in all steps, so there is no need to waste time re-installing all the packages for each step?

3 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

Mmm, I see. So the agent is taking the parameters from the base task registered on the server. Then, if I call task.get_parameters_as_dict for a task that has not been executed by an agent, should I get the original types of the values?

3 years ago
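For reference, the retrieval pattern being discussed, as a minimal sketch; the task ID is a placeholder and the example parameter values are made up:

```python
from clearml import Task

# Fetch a task by ID and read its connected parameters back as a nested dict.
task = Task.get_task(task_id="<task_id>")  # placeholder ID
params = task.get_parameters_as_dict()
# Per this thread, leaf values arrive as strings once the task has gone through
# the server, e.g. {'General': {'batch_size': '32', 'lr': '0.001'}}.
print(params)
```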
0 Hi All! When I Set A List As A Task Parameter And Later Try To Retrieve It, What I Get Is A String. Is This The Expected Behavior? I Have Prepared The Following Snippet So That You Can Reproduce It.

Thanks AgitatedDove14! Wow, I was definitely not expecting that behavior 🤣. I will check it out tomorrow. Just one more thing: what do you mean by "my_task_id_that_i_generated_before_here"?

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14, so isn't it ClearML best practice to create a draft pipeline, so that the task is on the server and can be cloned, modified and executed at any time?

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Sure! Thank you 🙂

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Exactly!! That's what I was looking for: creating the pipeline but not launching it. Thanks again AgitatedDove14

2 years ago
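A short sketch of the "create but don't launch" flow discussed in this thread; the step function and names are made up, and the exact behavior of create_draft should be checked against the docs:

```python
from clearml import PipelineController

def step_one(param: int = 1):
    # Hypothetical pipeline step, used only for illustration.
    return param * 2

pipe = PipelineController(name="template_pipeline", project="examples", version="0.1")
pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(param=3),
    function_return=["result"],
)
# Register the pipeline as a draft task on the server without launching it,
# so it can be cloned, modified and enqueued later (per the discussion above).
pipe.create_draft()
```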