GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation

0

Badges 1

183 × Eureka!
0 Hi all! I have a question regarding clearml-agents that I have not been able to find in the documentation. I have seen an agent is also called a 'worker'. When an agent is spun up, is a new process spawned? Or can it also be a thread? What limits the m

Hi CostlyOstrich36 AgitatedDove14
Oh no, I am not trying to say that I am using each agent to run a single task. I have several agents listening to a number of queues so that they are busy most of the time. As we talked about, it is not possible to run multiple pipelines (PipelineDecorator.pipeline) simultaneously in a single process. That's why I have been testing launching pipelines locally in different subprocesses, and this way I have managed to run several pipelines concurrently....
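A minimal sketch of that subprocess-based workaround might look like this (the my_pipelines module and its executing_pipeline function are hypothetical):

```python
import multiprocessing as mp

def launch_pipeline(params):
    # Each subprocess gets its own interpreter, so each one can run a
    # single PipelineDecorator.pipeline without clashing with the others.
    from my_pipelines import executing_pipeline  # hypothetical module
    executing_pipeline(**params)

if __name__ == "__main__":
    jobs = [
        mp.Process(target=launch_pipeline, args=(params,))
        for params in ({"seed": 1}, {"seed": 2})
    ]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
```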

3 years ago
0 Hi all! When I set a list as a task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the following snippet so that you can reproduce it.

If I connect a dictionary of type dict[str, list] with task.connect, when retrieving it with task.get_parameter I get a dictionary of type dict[str, str]. So I see the same behavior using task.connect :/
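A minimal sketch reproducing the behavior described (project and parameter names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="list-param-repro")

# Connect a dict whose value is a list.
params = {"layers": [64, 32, 16]}
task.connect(params)

# Parameters are stored as strings in the backend, so the value
# comes back as the string '[64, 32, 16]' rather than a list.
value = task.get_parameter("General/layers")
print(type(value), value)
```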

2 years ago
0 Hi! I am implementing a cleanup service. After completing several training tasks, I am only interested in the trained models and some artifacts resulting from the training process (such as scalers, etc.). Therefore, I would like to remove all the tasks th

Hi AnxiousSeal95 !
Yes, the main reason is to unclutter the ClearML Web UI, but also to free up space on our server (mainly due to the large size of the datasets). Once the models are trained, I want to retrain them periodically, and to do so I would like all the data specifications and artifacts generated during training to be linked to the model found under the "Models" section.
What I propose is somewhat similar to the functionality of clearml.Dataset. These datasets are themselves a task t...
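A rough sketch of that cleanup idea, assuming the completed training tasks live in one project (filters and names are illustrative; delete_artifacts_and_models=False keeps the trained models visible under "Models"):

```python
from clearml import Task

# Delete completed training tasks but keep their models registered.
tasks = Task.get_tasks(
    project_name="examples",               # illustrative project
    task_filter={"status": ["completed"]},
)
for task in tasks:
    task.delete(delete_artifacts_and_models=False)
```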

3 years ago
0 When ClearML converts a

Glad to hear that! Thanks!

3 years ago
0 Hello folks! I don't know if this issue has already been addressed. I have a basic PipelineController script with two steps: one task is for preprocessing purposes and the other for training a model. Currently I am placing the code related to the pack

Hi Martin,

Actually, Task.add_requirements behaves as I expect, since that part of the code is in the preprocessing script, and for that task it does install all the specified packages. So my question could be rephrased as follows: when working with PipelineController, is there any way to avoid creating a new development environment for each step of the pipeline?

According to the https://clear.ml/docs/latest/docs/clearml_agent provided in the official ClearML documentatio...
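For context, a minimal sketch of how Task.add_requirements is used, called before Task.init so the agent installs the extra packages when it recreates the environment (package names and versions are illustrative):

```python
from clearml import Task

# add_requirements must be called before Task.init so the agent picks
# up the extra packages when it rebuilds the environment remotely.
Task.add_requirements("pandas")                 # latest available version
Task.add_requirements("scikit-learn", "1.0.2")  # pinned version (illustrative)

task = Task.init(project_name="examples", task_name="preprocessing")
```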

3 years ago
0 Hello folks! I don't know if this issue has already been addressed. I have a basic PipelineController script with two steps: one task is for preprocessing purposes and the other for training a model. Currently I am placing the code related to the pack

Yep, I've already unchecked the venv caching setting, but the agent still reinstalls all the requirements.
Maybe it has to do with the fact that I am not working in a Git repository and ClearML is not able to locate the requirements.txt file?

3 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using clearml v1.1.3rc0 and clearml-agent 1.1.0

The scheme is similar to the following:
```
main_pipeline
(PipelineDecorator.pipeline)
              |
              |------------------------------------|
              |                                    |
inference_orchestrator_1             inference_orchestrator_2
(PipelineDecorator.component,        (PipelineDecorator.component,
 acting as a pipeline)                acting as a pipeline)
              |                                    |
             ...                                  ...
```

3 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using clearml v1.1.3rc0 and clearml-agent 1.1.0

Well, I can see the difference here. Using the new pipeline generation, the user has the flexibility to play with the values returned by each step. We can process those values before passing them to the next step, so maybe it makes little sense to include those callbacks in this case.
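A minimal sketch of that flexibility, assuming a local run (names are illustrative): the pipeline body is plain Python, so a step's return value can be transformed before the next step receives it.

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["numbers"])
def load_numbers():
    return list(range(10))

@PipelineDecorator.component(return_values=["total"])
def summarize(values):
    return sum(values)

@PipelineDecorator.pipeline(name="returned-values", project="examples", version="0.1")
def run_pipeline():
    numbers = load_numbers()
    # The pipeline body is plain Python: a step's return value can be
    # processed here before it is handed to the next step.
    evens = [n for n in numbers if n % 2 == 0]
    print(summarize(evens))

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    run_pipeline()
```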

3 years ago
0 Hi all, I am faced with the situation that my company's GitLab is temporarily out of service at a certain time in the early morning (due to regular maintenance service, something I cannot control). Normally, my system's inference pipelines are scheduled t

Mmm, well, I can think of a pipeline that saves its state just before the error occurred, so that, using some crontab/scheduler, the pipeline could be resumed from the point where it stopped if it had not completed. Is there any functionality like this? Something like PipelineDecorator/PipelineController.resume_from(state_filepath)?
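No such resume_from() appears to exist; one way to approximate the idea is a hand-rolled checkpoint file, sketched here independently of ClearML (paths and step names are hypothetical):

```python
import json
import os

STATE_FILE = "/tmp/pipeline_state.json"                # hypothetical location
STEPS = ["fetch_data", "preprocess", "run_inference"]  # hypothetical steps

def load_completed():
    # Steps finished by a previous (possibly failed) run.
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return set(json.load(f))
    return set()

def mark_completed(completed):
    with open(STATE_FILE, "w") as f:
        json.dump(sorted(completed), f)

def run_step(name):
    print(f"running {name}")  # placeholder for the real step logic

completed = load_completed()
for step in STEPS:
    if step in completed:
        continue  # skip work already done before the failure
    run_step(step)
    completed.add(step)
    mark_completed(completed)

os.remove(STATE_FILE)  # clean up once the whole pipeline succeeds
```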

3 years ago
0 Hello, I have the following basic snippet where I'm trying to add another value to the task's connected arguments after calling

Currently I'm working with v1.0.5. Anyway, I found that it is possible to connect the new argument if I store the arguments returned by task.connect(args) in a variable. I expected that, since it is a mutable object, it would not be necessary to overwrite args, but apparently it is required in this version of ClearML.
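A minimal sketch of that workaround (names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="connect-args")

args = {"lr": 0.001}
# On this version, the object returned by connect() must be kept;
# mutating the original dict alone is not enough.
args = task.connect(args)
args["batch_size"] = 32  # the new key is now tracked as well
```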

3 years ago
0 It is a good practice to call a function decorated by

Hi AgitatedDove14 ,
I have already developed a mock test that is somewhat similar to the pipeline we are developing, and the same problem arises: the task is created only for the first set of parameters in the for loop. Here, the configuration text file is created only for user 1. Can you reproduce it?
```
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    return_values=["admin_config_path"], cache=False, task_type=Task.Task...
```

3 years ago
0 When ClearML converts a

BTW, I would like to mention another problem related to this that I have encountered. It seems that arguments of type int, float or list (it may also happen with other types) are converted to str when passed to a function decorated with PipelineDecorator.component, at the time it is called in the pipeline itself. Again, is this intentional?
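A small sketch that would make the reported coercion visible, assuming a local run (names are illustrative):

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["summary"], cache=False)
def inspect_types(an_int, a_float, a_list):
    # Reportedly these arrive as 'str' rather than the original types.
    return f"{type(an_int)} {type(a_float)} {type(a_list)}"

@PipelineDecorator.pipeline(name="type-check", project="examples", version="0.1")
def run_pipeline():
    print(inspect_types(1, 0.5, [1, 2, 3]))

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    run_pipeline()
```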

3 years ago
0 Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by

I tried specifying helper functions but it still gives the same error. If I define a component with the following code:
```
from typing import Optional
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(...)
def step_data_loading(path: str, target_dir: Optional[str] = None):
    pass
```

then in the automatically created script I find the following code:

```
from clearml.automation.controller import PipelineDecorator

def step_data_loading(path: str, target...
```

3 years ago
0 Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following error?

Hi AgitatedDove14 , it's nice to know you've already pinpointed the problem! I think the solution you propose is a good one, but does that mean I have to unpack all the dictionary values as parameters of the pipeline function? Wouldn't that make the function too "dirty"? Or do you mean you will soon push a commit that will allow me to keep passing a dictionary and ClearML automatically flatten it?

2 years ago
0 Hi, I am having difficulties when using the dataset functionality. I am trying to create a dataset with the following simple code:

Yes, I'm working with the latest commit. Anyway, I have tried running dataset.get_local_copy() on another machine and it works. I have no idea why this happens. However, on the new machine get_local_copy() does not return the path I expect. If I have this code:

dataset.upload(output_url="/home/user/server_local_storage/mock_storage")

I would expect the dataset to be stored under the path specified in output_url. But what I get with get_local_copy() is the follo...
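For reference, a minimal sketch of the flow under discussion (paths are illustrative). Note that get_local_copy() returns the local cache path the files are downloaded to, not the output_url they were uploaded to:

```python
from clearml import Dataset

dataset = Dataset.create(dataset_name="mock_dataset", dataset_project="examples")
dataset.add_files("/home/user/data")  # hypothetical source folder
dataset.upload(output_url="/home/user/server_local_storage/mock_storage")
dataset.finalize()

# get_local_copy() returns the path of the local cache the files were
# downloaded to, not the output_url the dataset was uploaded to.
local_path = Dataset.get(dataset_id=dataset.id).get_local_copy()
print(local_path)
```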

3 years ago
0 Hi! Can someone show me an example of how

If I have the ability to clone and modify the same task over and over again, in principle I would no longer need the multi_instance support feature of PipelineDecorator.pipeline. Is this correct, or are they different things?

2 years ago