GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
Badges: 1 (183 × Eureka!)

0 Votes 11 Answers 981 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
3 years ago
0 Votes 11 Answers 973 Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
3 years ago
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 6 Answers 950 Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
3 years ago
0 Votes 3 Answers 1K Views
3 years ago
0 Votes 10 Answers 902 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
3 years ago
0 Votes 6 Answers 1K Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
3 years ago
0 Votes 1 Answer 1K Views
It is possible to attach to an OutputModel an object closely related to it (as some product of data preprocessing that has been done specifically for that mo...
3 years ago
0 Votes 1 Answer 953 Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
3 years ago
0 Votes 1 Answer 940 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
3 years ago
0 Votes 11 Answers 1K Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
3 years ago
0 Votes 1 Answer 919 Views
Is there any way to create a queue from code?
3 years ago
0 Votes 8 Answers 875 Views
2 years ago
0 Votes 5 Answers 942 Views
Hi, can anyone help me with this code? (just a mock example, but it nicely captures the behavior of the real code) import pandas as pd from clearml import Ta...
2 years ago
0 Votes 2 Answers 915 Views
Why Task.add_tags method has no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...
2 years ago
0 Votes 30 Answers 943 Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
3 years ago
0 Votes 9 Answers 1K Views
Hi, I just updated clearml to version v1.1.3. Right after launching a training pipeline, the system crashed due to the following error: Traceback (most recen...
3 years ago
0 Votes 2 Answers 1K Views
3 years ago
0 Votes 4 Answers 965 Views
I'm trying to implement a cleanup service by following this example https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_servic...
3 years ago
0 Votes 16 Answers 1K Views
Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it: from clearml.automation...
3 years ago
0 Votes 18 Answers 990 Views
Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by PipelineDecorator.component in the function arguments....
3 years ago
0 Votes 0 Answers 1K Views
Hi, Let's say I have several functions decorated with PipelineDecorator.component (functions A, B and C). Function C can only be executed after functions A a...
3 years ago
0 Votes 2 Answers 1K Views
Hello, I was wondering if clearML offers the option to spin up again the clearml-agent automatically every time the machine where it was being executed as a ...
3 years ago
0 Votes 10 Answers 984 Views
Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the follow...
2 years ago
0 Votes 7 Answers 1K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
3 years ago
0 Votes 3 Answers 932 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
3 years ago
0 Votes 13 Answers 1K Views
3 years ago
0 Votes 12 Answers 1K Views
3 years ago
0 Votes 11 Answers 1K Views
Hi! I was wondering why ClearML recognize Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...
3 years ago
0 Votes 21 Answers 1K Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
3 years ago
0 Hi All, I Am Testing The New

Sure, it's already enabled. I noticed another parameter in the ClearML agent configuration related to environment caching, named venv_update (I believe it's still in beta). Do you think enabling this parameter would significantly speed up environment creation?

Yes, I guess. Since pipelines are designed to be executed remotely, it may be pointless to enable an output_uri parameter in PipelineDecorator.component. Anyway, could another task be initialized in the same scr...

3 years ago
3 years ago
0 What Is The Recommended Way To Stop The Execution Of A Specific Agent? This Command Doesn't Allow Me To Specify The Agent IP I Want To Stop:

But how can I reference that exact daemon execution? I tried with the ID but it fails:

clearml-agent daemon AGENT_ID --stop

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And ClearML-Agent 1.1.0

I think it could be a convenient approach. The new abort_on_failed_steps parameter could be a list containing the names of the steps for which the pipeline will stop its execution if any of them fail (so that we can ignore other steps that are not crucial for continuing the pipeline execution).
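
Something along these lines (purely hypothetical: abort_on_failed_steps does not exist in ClearML today, and the pipeline/step names are placeholders):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.pipeline(
    name="inference_pipeline",
    project="example_project",
    version="0.1",
    # abort_on_failed_steps=["create_dataset", "preprocess"],  # proposed parameter, not a real argument yet
)
def inference_pipeline():
    # Steps would be called here; the pipeline would abort if any listed step failed.
    ...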

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 Oops, something still seems to be wrong. When trying to retrieve the dataset using get_local_copy() I get the following error:
` Traceback (most recent call last):
File "/home/user/myproject/lab.py", line 27, in <module>
print(dataset.get_local_copy())
File "/home/user/.conda/envs/myenv/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 554, in get_local_copy
target_folder = self._merge_datasets(
File "/home/user/.conda/envs/myenv/lib/python3.9/site-p...

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Mmm, but what if the dataset size is too large to be stored in the .cache path? Will it be stored there anyway?
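
For illustration, one way around that might be pointing the copy at an explicit folder with get_mutable_local_copy() instead of relying on the default cache (assuming that method fits here; the project, dataset name and target path are placeholders):

from clearml import Dataset

# Placeholders: dataset_project, dataset_name and target_folder are illustrative.
dataset = Dataset.get(dataset_project="example_project", dataset_name="example_dataset")

# Copies the dataset to an explicit location instead of the default cache path.
local_path = dataset.get_mutable_local_copy(target_folder="/mnt/big_disk/example_dataset")
print(local_path)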

3 years ago
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14, so isn't it ClearML best practice to create a draft pipeline to have the task on the server, so that it can be cloned, modified and executed at any time?

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

BTW, let's say I accidentally removed the 'default' queue from the queue list. As a result, when I try to stop an agent using clearml-agent daemon --stop, I get the following error:
clearml_agent: ERROR: APIError: code 400/707: No queue is tagged as the default queue for this company
I have already created another queue also called 'default' but it had no effect :/

3 years ago
0 Hi All, I Am Testing The New

I am aware of the option to enable virtual environment caching, but that is still very time-consuming.

3 years ago
0 Let's Say That I Specify The

My idea is to take advantage of the ability to read the parameters connected to one task from another task, so I can look up the path where the artifacts are stored locally instead of defining it again in every script corresponding to a different task. Roughly as sketched below.
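
A minimal sketch of what I mean, assuming the first task connected a parameter holding that path (the project/task names and the parameter key are placeholders):

from clearml import Task

# Placeholders: the producing task's project/name and the parameter key.
producer = Task.get_task(project_name="example_project", task_name="example_task")

# get_parameters() returns a flat dict, e.g. {"General/artifacts_path": "/data/artifacts"}
params = producer.get_parameters()
artifacts_path = params.get("General/artifacts_path")
print(artifacts_path)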

3 years ago
0 Hi! Can Someone Show Me An Example Of How

Sure! Thank you 🙂

2 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

I'm getting a NameError because the 'Optional' type hint is not defined in the global scope.
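
For reference, this is the kind of component signature that triggers it (a simplified, illustrative example; the step name and body are made up):

from typing import Optional

from clearml.automation.controller import PipelineDecorator

# Illustrative component: the Optional hint in the signature is what ends up
# undefined in the standalone script generated for the step.
@PipelineDecorator.component(return_values=["msg"])
def step(msg: Optional[str] = None):
    return msg or "default message"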

3 years ago
0 Hi All, I Am Testing The New

I'm using the latest version (1.1.1)

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Indeed it does! But what still puzzles me is why I get the path below when running dataset.get_local_copy() on one of the machines in my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And ClearML-Agent 1.1.0

In my use case I have a pipeline that executes inference tasks with several models simultaneously. Each inference task is actually a component that acts as a pipeline, since it executes the steps required to generate the predictions (dataset creation, preprocessing and prediction). For this I'm using the new pipeline functionality (PipelineDecorator), roughly as sketched below.
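
A simplified sketch of that structure (made-up step and model names, not my actual code):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["dataset_id"])
def create_dataset(model_name: str):
    # Placeholder for the real dataset-creation logic.
    return f"dataset-for-{model_name}"

@PipelineDecorator.component(return_values=["predictions"])
def predict(dataset_id: str, model_name: str):
    # Placeholder for preprocessing + prediction.
    return f"predictions of {model_name} on {dataset_id}"

@PipelineDecorator.pipeline(name="inference", project="example_project", version="0.1")
def inference_pipeline(model_names=("model_a", "model_b")):
    # Each model effectively gets its own mini-pipeline of steps.
    for model_name in model_names:
        dataset_id = create_dataset(model_name)
        predict(dataset_id, model_name)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # debug locally; remove to enqueue the steps on agents
    inference_pipeline()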

3 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

AnxiousSeal95 I see. That's why I was thinking of storing the model inside a task, just like with the Dataset class, so that you can either use just the model via InputModel, or the model together with all its artifacts via Task.get_task, using the ID of the task where the model is located.
I would like my cleanup service to remove all tasks older than two weeks, but not the models. Right now, if I delete all the tasks, the model does not work (as it needs the training tasks). For now, I ...
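
Something like this rough sketch is the behaviour I'm after, assuming Task.delete(delete_artifacts_and_models=False) really does leave the registered models in place (the project name and task filter are placeholders, and a real service would also filter by age):

from clearml import Task

# Placeholder selection: a real cleanup service would also filter on the
# task's last-update time (e.g. older than two weeks).
old_tasks = Task.get_tasks(
    project_name="example_project",
    task_filter={"status": ["completed"]},
)

for task in old_tasks:
    # Assumption: delete_artifacts_and_models=False keeps the models registered.
    task.delete(delete_artifacts_and_models=False)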

3 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Yes, before removing the 'default' queue I was able to shut down agents without specifying further options after the --stop command. I just had to run clearml-agent daemon --stop as many times as there were agents. Of course, I will open the issue as soon as possible :D

3 years ago
0 Hi All, I Am Testing The New

Okay, so the idea behind the new decorator is not to group all the defined steps under the same script so that they share the same environment, but rather to simplify the process of creating scripts for each step and avoid manually calling Task.init on those scripts.

Regarding virtual environment creation from the cache, I will keep running benchmarks (from what you say, it might be due to the high workload on the servers we use).

So far I've been unlucky in the attempt of clearml recog...

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

Hi AgitatedDove14, great, glad it was fixed quickly!

By the way, before releasing version 1.1.3 you might want to take a look at this mock example. I'm trying to run the same pipeline (with different configurations) in a single for loop, as you can see below:
` from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="myqueue1")
def step_1(msg: str):
    msg += "\nI've survived step 1!"
    re...

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And ClearML-Agent 1.1.0

Or perhaps the complementary scenario, with a continue_on_failed_steps parameter which could be a list containing only the steps that can be ignored in case of failure.

3 years ago
0 Why

I just placed the tagging code before Task.execute_remotely() and now it works. Thank you! 🙂
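
For context, the ordering that works looks roughly like this (project, tag and queue names are placeholders):

from clearml import Task

task = Task.init(project_name="example_project", task_name="example_task")

# Tags are added while the task is still running locally...
task.add_tags(["my-tag"])

# ...and only then is execution handed off to an agent.
task.execute_remotely(queue_name="default")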

2 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And ClearML-Agent 1.1.0

Okay, so I could signal the exception raised in any of the pipeline components to the main pipeline, and it should halt the whole pipeline. However, are you thinking of including these callback features in the new pipelines as well?

3 years ago