GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges (1): 183 × Eureka!
0 Votes · 11 Answers · 981 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
3 years ago
0 Votes · 6 Answers · 886 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
3 years ago
0 Votes · 29 Answers · 989 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
3 years ago
0 Votes · 14 Answers · 943 Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
3 years ago
0 Votes · 12 Answers · 1K Views
3 years ago
0 Votes · 5 Answers · 983 Views
Hi! From a task created using PipelineDecorator.pipeline , is there any way to get a task ID from the name of the step listed in the table below? My plan is ...
3 years ago
0 Votes · 6 Answers · 896 Views
3 years ago
0 Votes · 3 Answers · 915 Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
3 years ago
0 Votes · 11 Answers · 973 Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
3 years ago
0 Votes · 1 Answer · 1K Views
2 years ago
0 Votes · 3 Answers · 931 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
3 years ago
0 Votes · 6 Answers · 950 Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
3 years ago
0 Votes · 3 Answers · 1K Views
3 years ago
0 Votes · 13 Answers · 1K Views
3 years ago
0 Votes · 10 Answers · 902 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
3 years ago
0 Votes · 6 Answers · 1K Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
3 years ago
0 Votes · 1 Answer · 1K Views
Is it possible to attach to an OutputModel an object closely related to it (as some product of data preprocessing that has been done specifically for that mo...
3 years ago
0 Votes · 7 Answers · 1K Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
3 years ago
0 Votes · 1 Answer · 953 Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
3 years ago
0 Votes · 10 Answers · 1K Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
3 years ago
0 Votes · 12 Answers · 1K Views
3 years ago
0 Votes · 2 Answers · 910 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
3 years ago
0 Votes · 9 Answers · 950 Views
Is it good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
3 years ago
0 Votes · 10 Answers · 921 Views
Hi! Is there any reason why integer/float values are cast to string when connecting arguments dictionary to task and then retrieve them using task.get_para...
3 years ago
0 Votes · 1 Answer · 940 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
3 years ago
0 Votes · 11 Answers · 1K Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
3 years ago
0 Votes · 1 Answer · 919 Views
Is there any way to create a queue from code?
3 years ago
0 Votes · 8 Answers · 875 Views
2 years ago
0 Votes · 5 Answers · 942 Views
Hi, can anyone help me with this code? (just a mock example, but it nicely captures the behavior of the real code) import pandas as pd from clearml import Ta...
2 years ago
0 Votes · 14 Answers · 976 Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
2 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Cast To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

For some reason I can't get the values back in their original types. Only the dictionary keys are returned in the raw nested dictionary, while the values remain cast to strings.
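
A minimal sketch reproducing the behavior in question (the project/task names are illustrative, and it assumes a reachable ClearML server):

` from clearml import Task

task = Task.init(project_name="Examples", task_name="cast repro")

args = {"epochs": 10, "learning_rate": 0.01}
task.connect(args)

# Both values typically come back as strings (e.g. '10', '0.01')
# rather than the original int/float types.
params = task.get_parameters()
print(type(params["General/epochs"]))
print(type(params["General/learning_rate"])) `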

3 years ago
0 Hi, Not Sure If I'm Doing Something Wrong Or I Found A Bug. When I Try To Overwrite Some Parameters In A Cloned Task Using

Yes, when the parameters that are connected do not have nested dictionaries, everything works fine. The problem comes when I try to do something like this:

` from clearml import Task

task = Task.init(project_name="Examples", task_name="task with connected dict")

args = {}
args["period"] = {"start": "2020-01-01 00:00", "end": "2020-12-31 23:00"}

task.connect(args) `
and the cloned task is like this:

` from clearml import Task

template_task = Task.get_task(task_id="<Your template task id>"...

3 years ago
0 Hello Folks! I Don't Know If This Issue Has Already Been Addressed. I Have A Basic PipelineController Script With Two Steps: One Of The Tasks Is For Preprocessing Purposes And The Other For Training A Model. Currently I Am Placing The Code Related To The Pack

Hi Martin,

Actually Task.add_requirements behaves as I expect, since that part of the code is in the preprocessing script and for that task it does install all the specified packages. So my question could be rephrased as follows: when working with PipelineController , is there any way to avoid creating a new development environment for each step of the pipeline?

According to the https://clear.ml/docs/latest/docs/clearml_agent provided in the official ClearML documentatio...

3 years ago
0 Let's Say That I Specify The

Sure, but I mean, apart from labeling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

3 years ago
0 Hello, I Have The Following Basic Snippet Where I'm Trying To Add Another Value To The Task's Connected Arguments After Calling

Currently I'm working with v1.0.5. Anyway, I found that it is possible to connect the new argument if I store the arguments returned by task.connect(args) in a variable. I expected that, since it is a mutable object, it would not be necessary to overwrite args , but apparently it is required in this version of ClearML.
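
A minimal sketch of this workaround (the project/task names are illustrative):

` from clearml import Task

task = Task.init(project_name="Examples", task_name="connect workaround")

args = {"lr": 0.01}

# Keep the dictionary returned by task.connect and mutate that one;
# mutating the original args object was not picked up in this version.
args = task.connect(args)
args["extra_param"] = 42 `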

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Or maybe you could bundle some parameters that belong to PipelineDecorator.component into a high-level configuration variable (something like PipelineDecorator.global_config (?))

3 years ago
0 Hi! From A Task Created Using

That's right, I don't know why I was trying to make it so complicated 😅

3 years ago
0 Let's Say That I Specify The

I currently deal with that by skipping the first 5 characters of the path, i.e. the 'file:' part. But I'm sure there is a cleaner way to proceed.
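
A cleaner alternative might be to parse the URI with the standard library instead of slicing off a fixed number of characters (the example URI is hypothetical):

` from urllib.parse import urlparse

uri = "file:///tmp/example_artifact.pkl"  # hypothetical local URI

# urlparse handles both 'file:/...' and 'file:///...' forms.
local_path = urlparse(uri).path
print(local_path)  # /tmp/example_artifact.pkl `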

3 years ago
0 Let's Say That I Specify The

But this path actually does not exist in my system, so how should I fix that?

3 years ago
0 Let's Say That I Specify The

Now it's okay. I have found a more intuitive way to get around it. I was facing the classic 'XY problem' :)

3 years ago
0 Hi All! When I Set A List As A Task Parameter And Later Try To Retrieve It, What I Get Is A String. Is This The Expected Behavior? I Have Prepared The Following Snippet So That You Can Reproduce It.

Sure, just by changing a few things from the previous example:
` from clearml import Task

task = Task.init()
task.connect({"metrics": ["nmae", "bias", "r2"]})

metrics_names = task.get_parameter("General/metrics")

print(metrics_names)
print(type(metrics_names)) `
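
If the value does come back as a string, one possible workaround (an assumption on my part, not documented behavior) is to parse it back with the standard library:

` import ast
from clearml import Task

task = Task.init()
task.connect({"metrics": ["nmae", "bias", "r2"]})

raw = task.get_parameter("General/metrics")  # e.g. "['nmae', 'bias', 'r2']"
metrics = ast.literal_eval(raw)  # parsed back into a Python list
print(metrics, type(metrics)) `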

2 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

I have found it is not possible to start a pipeline B after a pipeline A. Following the previous example, I have added one more pipeline to the script:
` from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="model_trainings")
def step_1(msg: str):
msg += "\nI've survived step 1!"
return msg

@PipelineDecorator.component(return_values=["msg"], execution_queue="model_trainings")
def st...

3 years ago
0 Hi All, I Am Testing The New

How can I tell ClearML that I will use the same virtual environment in all steps, so there is no need to waste time re-installing all the packages for each step?

3 years ago
0 Hi! Can Someone Show Me An Example Of How

With the ability to clone and modify the same task over and over again, in principle I would no longer need the multi_instance support feature of PipelineDecorator.pipeline. Is this correct, or are they different things?

2 years ago
0 Hi! Can Someone Show Me An Example Of How

I don't know if you remember the need I had some time ago to launch the same pipeline through configuration. I've been thinking about it and I think PipelineController fits my needs better than PipelineDecorator in that respect.

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Exactly!! That's what I was looking for: create the pipeline but not launching it. Thanks again AgitatedDove14

2 years ago
0 What Is The Recommended Way To Stop The Execution Of A Specific Agent? This Command Doesn't Allow Me To Specify The Agent IP I Want To Stop:

Sure, it would be very intuitive if the command to stop an agent were as easy as:
clearml-agent daemon --stop AGENT_PID

3 years ago
0 Hello Folks! I Don't Know If This Issue Has Already Been Addressed. I Have A Basic PipelineController Script With Two Steps: One Of The Tasks Is For Preprocessing Purposes And The Other For Training A Model. Currently I Am Placing The Code Related To The Pack

From what I understood, ClearML creates a virtual environment from scratch for each task it runs. To detect each script's dependencies, apparently it inspects the script for imports, along with any packages specified via Task.add_requirements . You mean that's not the recommended way for ClearML to create the environments for each task? What is the right way to proceed in this case?
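
For context, a minimal sketch of the Task.add_requirements usage mentioned above (the package name and version are illustrative):

` from clearml import Task

# Must be called before Task.init so the requirement is recorded
# for the environment the agent will later build.
Task.add_requirements("pandas", "1.3.0")

task = Task.init(project_name="Examples", task_name="requirements demo") `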

3 years ago
0 Hi! Can Someone Show Me An Example Of How

I see the point. The reason I'm using PipelineController now is that I've realised that in the code I only send IDs from one step of the pipeline to another, and not artefacts as such. So I think it makes more sense in this case to work with the former.

2 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Cast To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

Mmm, I see. So the agent is taking the parameters from the base task registered on the server. Then, if I call task.get_parameter_as_dict for a task that has not been executed by an agent, should I get the original types of the values?

3 years ago
0 Since

For instance, the auto_connect family of arguments.
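
For reference, these are the kinds of arguments being referred to, shown here on a plain Task.init call (whether they can be forwarded through PipelineDecorator.component is exactly the open question):

` from clearml import Task

task = Task.init(
    project_name="Examples",
    task_name="auto connect demo",
    auto_connect_arg_parser=False,  # skip automatic argparse capture
    auto_connect_frameworks=False,  # skip automatic framework binding
    auto_connect_streams=True,      # keep stdout/stderr logging
) `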

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Mmm, that's weird, because I can see the type hints in the function's arguments in the automatically generated script. So maybe I'm doing something wrong, or it's a bug, since they have been passed to the created step (I'm using clearml version 1.1.2 and clearml-agent version 1.1.0).

3 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95!
That's it. My idea is that artifacts can be linked to the model. These artifacts are typically links to serialized objects (such as datasets or scalers). They are usually directories or temporary files in mount units that I want to be loaded as artifacts of the task and then removed (as they are temporary), so that later I can get a new local path via task.artifacts["scalers"].get_local_copy() . I think this way the model's dependence on the task that created it could be re...
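
A minimal sketch of the flow described above (the artifact name and file path are illustrative):

` from clearml import Task

task = Task.init(project_name="Examples", task_name="artifact demo")

# Upload a (temporary) local file as a task artifact.
task.upload_artifact(name="scalers", artifact_object="/tmp/scalers.pkl")

# Later, fetch a fresh local copy of the artifact from the task.
local_path = task.artifacts["scalers"].get_local_copy() `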

3 years ago