GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
Badges (1): 183 × Eureka!
0 Votes 16 Answers 2K Views
Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it: from clearml.automation...
4 years ago
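Since the snippet above is truncated, here is a minimal sketch of the kind of mock pipeline being described: one decorated function reused as two steps with different configurations. The function, project and config names are assumptions, not the original code.

```python
from clearml.automation.controller import PipelineDecorator

# One decorated function reused as two pipeline steps with different
# configurations (hypothetical reconstruction of the reported setup).
@PipelineDecorator.component(return_values=["result"], cache=False)
def process(config: dict):
    result = {"processed_with": config}
    return result

@PipelineDecorator.pipeline(name="reuse-demo", project="examples", version="0.1")
def run_pipeline():
    first = process(config={"mode": "train"})
    second = process(config={"mode": "eval"})
    print(first, second)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # execute the steps locally for debugging
    run_pipeline()
```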
0 Votes 9 Answers 2K Views
Is it a good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
4 years ago
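For context, a sketch of what calling a decorated component inside a for loop could look like; whether each iteration becomes a separate step is exactly what the question is probing, and all names here are made up.

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["out"])
def train(seed: int):
    return seed * 2

@PipelineDecorator.pipeline(name="loop-demo", project="examples", version="0.1")
def run_pipeline():
    results = []
    for seed in range(3):
        # Each iteration is intended to schedule a separate step
        # (whether this is supported is what the question asks).
        results.append(train(seed=seed))
    print(results)

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    run_pipeline()
```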
0 Votes 2 Answers 2K Views
Why does the Task.add_tags method have no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...
3 years ago
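For illustration, a minimal sketch of tagging the currently running task from inside its own code, which is one way a tag can be applied during a remote (agent) run as well; the tag value is arbitrary.

```python
from clearml import Task

# Tag the currently running task from inside its own code, so the tag
# is applied even when an agent executes it (sketch; tag is arbitrary).
task = Task.current_task()
if task is not None:
    task.add_tags(["deployed"])
```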
0 Votes 2 Answers 2K Views
Hello, I was wondering if ClearML offers the option to automatically spin up the clearml-agent again every time the machine where it was being executed as a ...
4 years ago
0 Votes 30 Answers 2K Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
4 years ago
0 Votes 10 Answers 2K Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
4 years ago
0 Votes 8 Answers 2K Views
3 years ago
0 Votes 2 Answers 2K Views
4 years ago
0 Votes 7 Answers 2K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
4 years ago
0 Votes 21 Answers 2K Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
4 years ago
0 Votes 3 Answers 2K Views
4 years ago
0 Votes 11 Answers 3K Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
4 years ago
0 Votes 11 Answers 2K Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
4 years ago
0 Votes 6 Answers 2K Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
4 years ago
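For reference, a minimal sketch of fetching a pipeline controller's task once its ID is known, for example copied from the web UI; the ID below is a placeholder.

```python
from clearml import Task

# Fetch the pipeline controller's underlying Task by its ID
# (placeholder ID; replace with the real one from the UI).
pipeline_task = Task.get_task(task_id="<pipeline-task-id>")
print(pipeline_task.id, pipeline_task.name)
```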
0 Votes 10 Answers 2K Views
Hi! Is there any reason why integer/float values are cast to string when connecting an arguments dictionary to a task and then retrieving them using task.get_para...
4 years ago
0 Votes 1 Answer 2K Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
4 years ago
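For comparison, this is the Task-level behavior the question wants mirrored on PipelineController; a minimal sketch, assuming a queue named "default".

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="remote-demo")
# Stops local execution and enqueues the task to run on an agent;
# the question asks for an equivalent on the PipelineController class.
task.execute_remotely(queue_name="default")
print("this line only runs on the agent side")
```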
Answers
0 Votes
Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it:

They share the same code (i.e. the same decorated functions), but using a different configuration.

4 years ago
0 Votes
Hi! Is there any reason why integer/float values are cast to string when connecting an arguments dictionary to a task and then retrieving them using...

Mmm, I see. So the agent is taking the parameters from the base task registered on the server. Then, if I call task.get_parameter_as_dict for a task that has not been executed by an agent, should I get the original types of the values?

4 years ago
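A small sketch of the round trip being discussed; note the SDK method is spelled get_parameters_as_dict, and the exact return layout shown in the comment is an assumption.

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="params-demo")
params = {"epochs": 10, "lr": 0.001}
task.connect(params)  # hyperparameters are stored as strings server-side

# When read back, values typically arrive as strings,
# e.g. {'General': {'epochs': '10', 'lr': '0.001'}} (assumed layout).
print(task.get_parameters_as_dict())
```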
0 Votes
Hi guys, suppose I have the following script:

So ClearML will scan all the repository code searching for package dependencies? Is that right?

4 years ago
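A minimal sketch of steering that automatic analysis with the Task.add_requirements helper; the package and version pinned here are arbitrary examples.

```python
from clearml import Task

# Force or pin a package before Task.init, in addition to the automatic
# import analysis of the repository code (sketch; pin is arbitrary).
Task.add_requirements("pandas", "1.3.5")
task = Task.init(project_name="examples", task_name="deps-demo")
```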
0 Votes
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

Okay, so I could signal to the main pipeline the exception raised in any of the pipeline components and it should halt the whole pipeline. However, are you thinking of including these callback features in the new pipelines as well?

4 years ago
0 Votes
Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it:

The thing is, I don't know in advance how many models there will be in the inference stage. My approach is to read the configurations of the operational models from a database in a for loop, and in that loop all the inference tasks would be enqueued (one task for each deployed model). For this I need the system to be able to run several pipelines at the same time. Since, as you told me, this is not possible for now because pipelines are based on singletons, my alternative is to use components.

4 years ago
0 Votes
Hi, I am experiencing issues when uploading artifacts to the dataset task with ClearML version v1.1.4rc0. The problem is the artifacts are uploaded to the default ClearML server, even though I have specified the path to our storage medium. The code to dem...

Hi AgitatedDove14, gotcha. So how can I temporarily fix it? I'm not able to find something like task.set_output_uri() in the official docs. Or do you maybe plan to solve this problem in the very short term?

4 years ago
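A hedged sketch of the usual workaround in such cases: set the storage destination at Task.init time. The bucket URL is hypothetical, and the output_uri property assignment may depend on the SDK version.

```python
from clearml import Task

# Point artifact/model uploads at your own storage when creating the
# task, instead of the default files server (hypothetical bucket URL).
task = Task.init(
    project_name="examples",
    task_name="storage-demo",
    output_uri="s3://my-bucket/clearml",
)
# The task also exposes an output_uri property that can be set afterwards
# (assumption; availability may vary with the SDK version):
task.output_uri = "s3://my-bucket/clearml"
```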
0 Votes
Hi, I am having difficulties when using the dataset functionality. I am trying to create a dataset with the following simple code:

AgitatedDove14, in the 'status.json' file I could see that the 'is_dirty' flag is set to True.

4 years ago
0 Votes
Hi, I am having difficulties when using the dataset functionality. I am trying to create a dataset with the following simple code:

Indeed it does! But what still puzzles me so badly is why I get the path below when running dataset.get_local_copy() on one of the machines of my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

4 years ago
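A minimal reproduction sketch; the dataset ID is lifted from the cache path quoted above.

```python
from clearml import Dataset

# Dataset ID taken from the cache path in the message above.
ds = Dataset.get(dataset_id="61ff8d4335dd4b74bd78c3576fa44131")
local_path = ds.get_local_copy()  # expected: a cached folder, not a .lock file
print(local_path)
```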
0 Votes
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

The scheme is similar to the following:

main_pipeline
(PipelineDecorator.pipeline)
                |
        |----------------------------------------|
        |                                        |
inference_orchestrator_1            inference_orchestrator_2
(PipelineDecorator.component,       (PipelineDecorator.component,
 acting as a pipeline)               acting as a pipeline)
        |                                        |
       ...

4 years ago
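A sketch of that scheme with PipelineDecorator, mirroring the names in the diagram; the orchestration internals are omitted and assumed.

```python
from clearml.automation.controller import PipelineDecorator

# Components that act as orchestrators, launched from one main pipeline
# (names mirror the diagram above; internals are assumed/omitted).
@PipelineDecorator.component(return_values=["done"])
def inference_orchestrator(name: str):
    # In the described design, each component would enqueue and monitor
    # its own set of inference tasks here.
    print(f"orchestrating {name}")
    return True

@PipelineDecorator.pipeline(name="main_pipeline", project="examples", version="0.1")
def main_pipeline():
    results = [
        inference_orchestrator(name="inference_orchestrator_1"),
        inference_orchestrator(name="inference_orchestrator_2"),
    ]
    print(results)

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    main_pipeline()
```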
0 Votes
Hi! I was wondering why ClearML recognizes scikit-learn scalers as input models... Am I missing something here? For me it would make sense to include the scalers as a configuration object of the trained model, not outside

Well, just as you can pass the 'task_type' argument in PipelineDecorator.component, it might be a good option to pass the rest of the Task.init arguments as they are passed in the original method (without using a dictionary).

4 years ago
0 Votes
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

Or perhaps the complementary scenario: a continue_on_failed_steps parameter, which could be a list containing only the steps that can be ignored in case of failure.

4 years ago
0 Votes
Is it a good practice to call a function decorated by...

Great! That feature would make the work much easier, instead of having to clone the task and launch it with different parameters. It could even be considered more Pythonic. Do you have an immediate solution in mind to keep moving forward before the new release is ready? :)

4 years ago
0 Votes
When ClearML converts a...

Nice. In the meantime, as a workaround, I will implement temporary parsing code at the beginning of the step functions.

4 years ago
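A sketch of what that temporary parsing code could look like: a hypothetical helper that casts the strings injected by the agent back to int/float/bool, called at the top of each step function.

```python
# Hypothetical helper for the workaround mentioned above: restore the
# original types of parameter values that arrive as strings.
def restore_types(params: dict) -> dict:
    def cast(value):
        if isinstance(value, str):
            for converter in (int, float):
                try:
                    return converter(value)
                except ValueError:
                    pass
            if value.lower() in ("true", "false"):
                return value.lower() == "true"
        return value
    return {key: cast(value) for key, value in params.items()}

# Example: restore_types({"epochs": "10", "lr": "0.001", "debug": "False"})
# -> {"epochs": 10, "lr": 0.001, "debug": False}
```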
0 Votes
Hi, is there a simple way to make...

Yep, you were absolutely right. What Dask did not like was the object self.preprocesser inside read_and_process_file, not Task.init. Since the dask.distributed.Client is initialized in that same class, maybe it's something that Dask doesn't allow.

Sorry for blaming ClearML without solid evidence x)

4 years ago
0 Votes
Hi guys, suppose I have the following script:

It's my own module (called 'tf_funcs.py')

4 years ago
0 Votes
Hi all, I am testing the new...

I'm using the latest version (1.1.1)

4 years ago
0 Votes
Hi, I have a question regarding the new...

But I was actually asking about accessing the Pipeline task ID, not the tasks corresponding to the components.

4 years ago
0 Votes
Is it a good practice to call a function decorated by...

Oh, I see. In the meantime I will duplicate the function and rename it so I can work with a different configuration. I really appreciate your effort, as well as the continuous feedback, to keep improving this wonderful library!

4 years ago