GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
Badges: 1 (183 × Eureka!)
0 Votes 13 Answers 2K Views
4 years ago
0 Votes 30 Answers 2K Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
4 years ago
0 Votes 6 Answers 2K Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
4 years ago
0 Votes 14 Answers 2K Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
3 years ago
0 Votes 6 Answers 2K Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
4 years ago
0 Votes 7 Answers 2K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
4 years ago
0 Votes 29 Answers 2K Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
4 years ago
0 Votes 6 Answers 2K Views
4 years ago
0 Votes 21 Answers 2K Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
4 years ago
0 Votes 3 Answers 2K Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
4 years ago
0 Votes 11 Answers 2K Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
4 years ago
0 Votes 1 Answer 2K Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
4 years ago
0 Votes 10 Answers 2K Views
Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the follow...
3 years ago
0 Votes 10 Answers 2K Views
Hi! Is there any reason why integer/float values are casted to string when connecting arguments dictionary to task and then retrieve them using task.get_para...
4 years ago
0 Votes 2 Answers 2K Views
4 years ago
0 Votes 3 Answers 2K Views
4 years ago
0 Hi, I Have A Question Regarding The New

But when I call Task.current_task().task_id within the code of one of the pipeline components, I get the task ID of the component itself. I want the pipeline's task ID.
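
In case it helps, here is a minimal sketch of one way to reach the controller task from inside a component, assuming the component's task records the pipeline task as its parent (the parent attribute and the printed fields are my assumption, so double-check against your clearml version):

from clearml import Task

component_task = Task.current_task()      # the component's own task
pipeline_task_id = component_task.parent  # assumption: parent points at the pipeline task
if pipeline_task_id:
    pipeline_task = Task.get_task(task_id=pipeline_task_id)
    print(pipeline_task.id, pipeline_task.name)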

4 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Oh, I see. This explains the surprising behavior. But what if the Task.init code is created automatically by PipelineDecorator.component? How can I pass arguments to the init method in that case?
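
As a possible workaround (just a sketch, not the official answer): since the component's Task is created for you, one option might be to grab that task inside the component body and adjust it there, instead of passing arguments to Task.init directly. The component below is hypothetical:

from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["out"])
def my_step(x):
    task = Task.current_task()        # the auto-created component task
    task.add_tags(["preprocessing"])  # example adjustment after creation
    return x * 2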

4 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Yes, I like it! I had already gotten used to the 'execute_steps_as_functions' argument of PipelineDecorator.debug_pipeline(), but I find your proposal more intuitive.
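
For reference, a minimal sketch of the two local-execution modes being discussed, as I understand them (the pipeline function name is made up; availability depends on the clearml version):

from clearml.automation.controller import PipelineDecorator

# Debug mode: every step runs as a plain function in the current process.
PipelineDecorator.debug_pipeline()

# Alternative mode: run the pipeline locally, each step still as its own Task.
# PipelineDecorator.run_locally()

# my_pipeline()  # hypothetical function decorated with @PipelineDecorator.pipeline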

4 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Well, I can see the difference here. With the new pipeline generation, the user has the flexibility to play with the returned values of each step. We can process those values before passing them to the next step, so maybe it makes little sense to include those callbacks in this case.
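
A minimal sketch of what playing with the returned values between steps can look like with the decorator-based pipelines (all names below are illustrative):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["data"])
def load_data():
    return [1, 2, 3]

@PipelineDecorator.component(return_values=["result"])
def train(data):
    return sum(data)

@PipelineDecorator.pipeline(name="example", project="examples", version="0.1")
def my_pipeline():
    data = load_data()
    data = [d * 10 for d in data]  # transform the value before the next step
    result = train(data)
    print(result)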

4 years ago
0 Let's Say That I Specify The

But this path actually does not exist in my system, so how should I fix that?

4 years ago
0 Hello Folks! I Don't Know If This Issue Has Already Been Addressed. I Have A Basic PipelineController Script With Two Steps: One Of The Tasks Is For Preprocessing Purposes And The Other For Training A Model. Currently I Am Placing The Code Related To The Pack

Hi Martin,

Actually Task.add_requirements behaves as I expect, since that part of the code is in the preprocessing script, and for that task it does install all the specified packages. So my question could be rephrased as follows: when working with PipelineController, is there any way to avoid creating a new development environment for each step of the pipeline?

According to the https://clear.ml/docs/latest/docs/clearml_agent provided in the official ClearML documentatio...
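
For context, a minimal sketch of how Task.add_requirements is typically used, assuming it is called before Task.init (package name and version are placeholders):

from clearml import Task

Task.add_requirements("pandas", "1.3.5")  # placeholder package/version
task = Task.init(project_name="examples", task_name="preprocessing")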

4 years ago
0 Let's Say That I Specify The

I currently deal with that by skipping the first 5 characters of the path, i.e. the 'file:' part. But I'm sure there is a cleaner way to proceed.
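
One cleaner option, using only the Python standard library instead of slicing off the first five characters (the URI below is a placeholder):

from urllib.parse import urlparse
from urllib.request import url2pathname

uri = "file:///tmp/artifacts/model.pkl"        # placeholder artifact URI
local_path = url2pathname(urlparse(uri).path)  # strips the 'file:' scheme
print(local_path)                              # /tmp/artifacts/model.pkl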

4 years ago
0 Hi! If There Are Several Tasks Running Concurrently, Which Task Should

Great, thank you very much for the info! I just spotted the get_logger classmethod. As for the initial question, that's just the behavior I expected!

4 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Exactly, at first I was trying to call a component from another component, but it didn't work. Then I thought it would be more natural to do this using a pipeline, but it didn't recognize the user_config_creation function even though I imported it as I would do under PipelineDecorator.component. I really like the idea of enabling an argument to specify the components you are going to use in the pipeline so they are in the step's context! I will be eagerly waiting for that feature :D
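
A sketch of the kind of argument being asked for here; whether a parameter with this exact name exists depends on your clearml version, so treat helper_functions below as illustrative:

from clearml.automation.controller import PipelineDecorator

def user_config_creation():   # plain helper, not a component
    return {"lr": 0.01}

@PipelineDecorator.component(return_values=["cfg"],
                             helper_functions=[user_config_creation])  # illustrative argument
def create_config():
    return user_config_creation()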

4 years ago
0 Let's Say That I Specify The

My idea is to take advantage of being able to read, from another task, the parameters connected to a given task, so I can get the path where the artifacts are stored locally without having to define it again in each script corresponding to a different task.
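
A minimal sketch of reading parameters that were connected to one task from a different script (the task ID and parameter key are placeholders):

from clearml import Task

producer = Task.get_task(task_id="<producer-task-id>")  # placeholder ID
params = producer.get_parameters()                      # flat dict of connected parameters
artifacts_path = params.get("General/artifacts_path")   # placeholder key
print(artifacts_path)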

4 years ago
0 Hi! Can Someone Show Me An Example Of How

Having the ability to clone and modify the same task over and over again, in principle I would no longer need the multi_instance support feature from PipelineDecorator.pipeline. Is this correct, or are they different things?
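
For reference, a sketch of the clone-and-modify loop mentioned above (the task ID, parameter key, and queue name are placeholders):

from clearml import Task

template = Task.get_task(task_id="<template-task-id>")
for lr in (0.1, 0.01, 0.001):
    cloned = Task.clone(source_task=template, name=f"run lr={lr}")
    cloned.set_parameter("General/lr", lr)      # placeholder parameter key
    Task.enqueue(cloned, queue_name="default")  # placeholder queue name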

3 years ago
0 Hi! From A Task Created Using

That's right, I don't know why I was trying to make it so complicated 😅

4 years ago
0 Hi Guys, Suppose I Have The Following Script:

Thanks for helping. You and your team are doing a great job for the ML community.

4 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Mmm that's weird. Because I can see the type hints in the function's arguments of the automatically generated script. So, maybe I'm doing something wrong or it's a bug, since they have been passed to the created step (I'm using clearml version 1.1.2 and clearml-agent version 1.1.0).

4 years ago
0 Hi, I Am Experiencing Issues When Uploading Artifacts To The Dataset Task With ClearML Version v1.1.4rc0. The Problem Is The Artifacts Are Uploaded To The Default ClearML Server, Even Though I Have Specified The Path To Our Storage Medium. The Code To Dem

Hi AgitatedDove14, gotcha. So how can I temporarily fix it? I'm not able to find anything like task.set_output_uri() in the official docs. Or do you plan to solve this problem in the very short term?
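
For what it's worth, the only way I know of pointing artifact uploads at a specific storage target is the output_uri argument of Task.init (the bucket below is a placeholder), rather than a task.set_output_uri() helper:

from clearml import Task

task = Task.init(
    project_name="examples",
    task_name="dataset task",
    output_uri="s3://my-bucket/clearml",  # placeholder storage target
)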

4 years ago
0 What Is The Recommended Way To Stop The Execution Of A Specific Agent? This Command Doesn't Allow Me To Specify The Agent IP I Want To Stop:

After doing so, the agent is removed from the list provided by ps -ef | grep clearml-agent, but it is still visible in the ClearML UI and also when I run clearml-agent list.

4 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

The scheme is similar to the following:
                 main_pipeline
          (PipelineDecorator.pipeline)
                       |
          |---------------------------------|
          |                                 |
inference_orchestrator_1        inference_orchestrator_2
(PipelineDecorator.component,   (PipelineDecorator.component,
 acting as a pipeline)           acting as a pipeline)
          |                                 |
         ...
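
A minimal sketch of the structure in the diagram: a top-level pipeline whose two components each act as an orchestrator (all names are illustrative):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["done"])
def inference_orchestrator_1():
    # ...runs its own sequence of inference steps...
    return True

@PipelineDecorator.component(return_values=["done"])
def inference_orchestrator_2():
    # ...runs its own sequence of inference steps...
    return True

@PipelineDecorator.pipeline(name="main_pipeline", project="examples", version="0.1")
def main_pipeline():
    a = inference_orchestrator_1()
    b = inference_orchestrator_2()
    return a and b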

4 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

I have found it is not possible to start a pipeline B after a pipeline A. Following the previous example, I have added one more pipeline to the script:
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="model_trainings")
def step_1(msg: str):
    msg += "\nI've survived step 1!"
    return msg

@PipelineDecorator.component(return_values=["msg"], execution_queue="model_trainings")
def st...

4 years ago
0 Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

Can you think of any other way to launch multiple pipelines concurrently? As we have already seen, it is only possible to run a single PipelineController in a single Python process.
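
One generic way around the one-controller-per-process limitation is to start each pipeline in its own Python process; the sketch below uses the standard library and hypothetical pipeline functions:

import multiprocessing as mp

def run_pipeline_a():
    from my_pipelines import pipeline_a  # hypothetical module/function
    pipeline_a()

def run_pipeline_b():
    from my_pipelines import pipeline_b  # hypothetical module/function
    pipeline_b()

if __name__ == "__main__":
    procs = [mp.Process(target=run_pipeline_a), mp.Process(target=run_pipeline_b)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()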

4 years ago
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14, just one last thing before closing the thread. I was wondering what the use of PipelineController.create_draft is if you can't use it to clone and run tasks, as we have seen.

3 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95!
Yes, the main reason is to unclutter the ClearML Web UI, but also to free up space on our server (mainly due to the large size of the datasets). Once the models are trained, I want to retrain them periodically, and to do so I would like all the data specifications and artifacts generated during training to be linked to the model found under the "Models" section.
What I propose is somewhat similar to the functionality of clearml.Dataset. These datasets are themselves a task t...
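
For context, a minimal sketch of the clearml.Dataset usage being referenced (paths and names are placeholders):

from clearml import Dataset

ds = Dataset.create(dataset_name="training-data", dataset_project="examples")
ds.add_files(path="/data/train")  # placeholder local folder
ds.upload()                       # push the files to storage
ds.finalize()                     # close this dataset version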

4 years ago