GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity one year ago
Reputation: 0
Badges (1): 183 × Eureka!
0 Votes · 6 Answers · 508 Views
2 years ago
0 Votes · 13 Answers · 520 Views
When ClearML converts a PipelineDecorator.component decorated function to script code, I have noticed that indexing syntax like A[:, 0] is rewritten as A[(:,...
2 years ago
0 Votes · 2 Answers · 508 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
2 years ago
0 Votes · 11 Answers · 573 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
2 years ago
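For reference, a call like the one described in the question above typically looks as follows (the bucket URI is hypothetical; output_uri also accepts a local path or the ClearML file server):

    from clearml import Task

    # output_uri controls where models and artifacts produced by this task are uploaded
    task = Task.init(
        project_name="example_project",
        task_name="example_task",
        output_uri="s3://my-bucket/clearml",  # hypothetical destination
    )
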
0 Votes · 1 Answer · 501 Views
Is it possible to attach to an OutputModel an object closely related to it (such as some product of data preprocessing that has been done specifically for that mo...
2 years ago
0 Votes · 11 Answers · 603 Views
Hi! I was wondering why ClearML recognizes Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...
2 years ago
0 Votes · 3 Answers · 537 Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
2 years ago
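A minimal sketch of the pattern that question describes, i.e. mutating the dictionary after task.connect (argument names are hypothetical):

    from clearml import Task

    task = Task.init(project_name="example_project", task_name="connect_args")

    args = {"batch_size": 32, "lr": 0.001}   # hypothetical arguments
    args = task.connect(args)                # register the dict as task parameters

    # adding a key after connect() is what the question is about;
    # whether it shows up in the UI depends on the ClearML version
    args["epochs"] = 10
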
0 Votes · 13 Answers · 610 Views
2 years ago
0 Votes · 7 Answers · 638 Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
2 years ago
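For context, the clone-and-override flow referred to above usually looks something like this (the task ID, parameter name and queue are hypothetical):

    from clearml import Task

    template = Task.get_task(task_id="abc123")            # hypothetical template task
    cloned = Task.clone(source_task=template, name="cloned_run")

    params = cloned.get_parameters()                      # e.g. {"General/lr": "0.001", ...}
    params["General/lr"] = "0.01"                         # hypothetical override
    cloned.set_parameters(params)

    Task.enqueue(cloned, queue_name="default")
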
0 Votes · 5 Answers · 559 Views
Hi! From a task created using PipelineDecorator.pipeline , is there any way to get a task ID from the name of the step listed in the table below? My plan is ...
2 years ago
0 Votes · 10 Answers · 534 Views
Hi! Is there any reason why integer/float values are cast to string when connecting an arguments dictionary to a task and then retrieving them using task.get_para...
2 years ago
0 Votes · 3 Answers · 586 Views
2 years ago
0 Votes · 30 Answers · 536 Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
2 years ago
0 Votes · 4 Answers · 558 Views
I'm trying to implement a cleanup service by following this example https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_servic...
2 years ago
0 Votes · 12 Answers · 642 Views
2 years ago
0 Votes · 1 Answer · 529 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
2 years ago
0 Votes · 1 Answer · 548 Views
Is there any way to create a queue from code?
2 years ago
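One way this is commonly done is through the APIClient, assuming server credentials are already configured; a sketch (the queue name is hypothetical):

    from clearml.backend_api.session.client import APIClient

    client = APIClient()
    client.queues.create(name="my_new_queue")   # hypothetical queue name
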
0 Votes · 1 Answer · 642 Views
2 years ago
0 Votes · 14 Answers · 534 Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
2 years ago
0 Votes · 3 Answers · 528 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
2 years ago
0 Votes · 29 Answers · 570 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
2 years ago
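For reference, a minimal dataset-creation snippet generally follows this shape (project, dataset name and path are hypothetical):

    from clearml import Dataset

    dataset = Dataset.create(
        dataset_name="my_dataset",           # hypothetical name
        dataset_project="example_project",
    )
    dataset.add_files(path="/data/raw")      # hypothetical local folder
    dataset.upload()
    dataset.finalize()
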
0 Votes · 2 Answers · 577 Views
2 years ago
0 Votes · 8 Answers · 522 Views
2 years ago
0 Votes · 10 Answers · 518 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
2 years ago
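For context, a bare-bones decorator-based pipeline looks roughly like this (names are hypothetical; nesting one decorated pipeline inside another, which is what the question asks about, is not shown):

    from clearml.automation.controller import PipelineDecorator

    @PipelineDecorator.component(return_values=["doubled"])
    def double(x):
        # each component runs as its own task when executed by agents
        return x * 2

    @PipelineDecorator.pipeline(name="example_pipeline", project="example_project", version="0.0.1")
    def pipeline_logic(x=1):
        print(double(x))

    if __name__ == "__main__":
        PipelineDecorator.run_locally()   # debug the whole pipeline in the local process
        pipeline_logic(x=3)
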
0 Votes · 2 Answers · 587 Views
Hello, I was wondering if ClearML offers the option to automatically spin up the clearml-agent again every time the machine where it was being executed as a ...
2 years ago
0 Votes · 11 Answers · 673 Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
2 years ago
0 Votes · 6 Answers · 522 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
2 years ago
0 Votes · 10 Answers · 588 Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
2 years ago
0 Votes · 6 Answers · 545 Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
2 years ago
0 Votes · 13 Answers · 529 Views
2 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

AnxiousSeal95 I see. That's why I was thinking of storing the model inside a task, just like with the Dataset class, so that you can either use just the model via InputModel, or the model and all its artifacts via Task.get_task using the ID of the task where the model is located.
I would like my cleanup service to remove all tasks older than two weeks, but not the models. Right now, if I delete all the tasks, the model does not work (as it needs the training tasks). For now, I ...
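A rough sketch of that idea, deleting stale tasks while asking ClearML to keep the models they produced (the project name is hypothetical, and the filter fields and last_update attribute are assumptions about the API rather than a verified recipe):

    from datetime import datetime, timedelta
    from clearml import Task

    cutoff = datetime.utcnow() - timedelta(weeks=2)

    for task in Task.get_tasks(project_name="example_project",
                               task_filter={"status": ["completed"]}):
        last_update = task.data.last_update  # assumed field on the backend task object
        if last_update and last_update.replace(tzinfo=None) < cutoff:
            # keep output models/artifacts, delete only the task itself
            task.delete(delete_artifacts_and_models=False)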

2 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95 !
That's it. My idea is that artifacts can be linked to the model. Typically these artifacts are links to serialized objects (such as datasets or scalers). They are usually directories or temporary files in mount units that I want to be loaded as artifacts of the task, removed (as they are temporary), and later retrieved as a new local path via task.artifacts["scalers"].get_local_copy(). I think this way the model's dependence on the task that created it could be re...
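The upload/retrieve pattern mentioned here, roughly (the artifact name 'scalers' comes from the message above; the path is hypothetical):

    from clearml import Task

    # in the training task: register the temporary folder as an artifact
    task = Task.init(project_name="example_project", task_name="train")
    task.upload_artifact(name="scalers", artifact_object="/tmp/scalers")  # hypothetical path

    # later, from any other script: fetch a fresh local copy through the creating task
    train_task = Task.get_task(task_id=task.id)
    local_path = train_task.artifacts["scalers"].get_local_copy()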

2 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95 !
Yes, the main reason is to unclutter the ClearML Web UI, but also to free up space on our server (mainly due to the large size of the datasets). Once the models are trained, I want to retrain them periodically, and to do so I would like all the data specifications and artifacts generated during training to be linked to the model found under the "Models" section.
What I propose is somewhat similar to the functionality of clearml.Dataset. These datasets are themselves a task t...

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

BTW, let's say I accidentally removed the 'default' queue from the queue list. As a result, when I try to stop an agent using clearml-agent daemon --stop , I get the following error:
clearml_agent: ERROR: APIError: code 400/707: No queue is tagged as the default queue for this company
I have already created another queue also called 'default' but it had no effect :/

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Yes, before removing the 'default' queue I was able to shut down agents without specifying further options after the --stop command. I just had to run clearml-agent daemon --stop as many times as there were agents. Of course, I will open the issue as soon as possible :D

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Well, just as you can pass the 'task_type' argument in PipelineDecorator.component , it might be a good option to pass the rest of the 'Task.init' arguments as they are passed in the original method (without using a dictionary)

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Oh, I see. This explains the surprising behavior. But what if Task.init code is created automatically by PipelineDecorator.component ? How can I pass arguments to the init method in that case?

2 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

I mean to use a function decorated with PipelineDecorator.pipeline inside another pipeline decorated in the same way.
In the traceback attached below you can see that I am trying to use a component named user_config_creation inside the create_user_configs sub-pipeline. I have imported user_config_creation inside create_user_configs but a KeyError is raised (however, I assume the function has been imported correctly because no ImportError or ModuleNo...

2 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Mmm what would be the implications of not being part of the DAG? I mean, how could that step be launched if it is not part of the execution graph?

2 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Beautiful. I have tested the new functionality with several use cases and it works just as I expected. Excellent work, as usual :D

2 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Hi AgitatedDove14 ,
Any updates on the new ClearML release that fixes the bugs we mentioned in this thread? :)

2 years ago
0 Let'S Say That I Specify The

I currently deal with that by skipping the first 5 characters of the path, i.e. the 'file:' part, but I'm sure there is a cleaner way to proceed.
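A slightly more robust alternative to slicing off the first 5 characters is to parse the URI scheme explicitly; a small sketch in plain Python:

    from urllib.parse import urlparse

    def to_local_path(uri: str) -> str:
        # strip a 'file://' (or bare 'file:') scheme instead of hard-coding an offset
        parsed = urlparse(uri)
        return parsed.path if parsed.scheme == "file" else uri

    print(to_local_path("file:///home/user/artifacts/model.pkl"))  # /home/user/artifacts/model.pkl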

2 years ago
0 Let'S Say That I Specify The

But this path actually does not exist in my system, so how should I fix that?

2 years ago
0 Hi, Not Sure If I'M Doing Something Wrong Or I Found A Bug. When I Try To Overwrite Some Parameters In A Cloned Task Using

I have only checked it from code.
Exactly, I have followed that same workflow shown in that example. Maybe it has something to do with the dictionary's mutability?

2 years ago
0 Hi All, I Am Faced With The Situation That My Company'S Gitlab Is Temporarily Out Of Service At A Certain Time In The Early Morning (Due To Regular Maintenance Service, Something I Cannot Control). Normally, My System'S Inference Pipelines Are Scheduled T

Mmm well, I can think of a pipeline that saves its state just before the error occurs, so that, using some crontab/scheduler, the pipeline could be resumed from the point where it stopped if it had not completed. Is there any functionality like this? Something like PipelineDecorator/PipelineController.resume_from(state_filepath)?

2 years ago
0 Hi! From A Task Created Using

Anyway, is there any way to retrieve the information stored in the RESULTS tab of ClearML Web UI?

2 years ago
0 Hi! From A Task Created Using

That's right, I don't know why I was trying to make it so complicated 😅

2 years ago
0 Hi All, I Am Testing The New

I mean the agent that will run the function (which represents a pipeline step) should clone the repo in order to find the location of the project modules required for the function to be executed. Also, I have found that clearml does not automatically detect the imports specified within the function decorated with PipelineDecorator.component (even though I followed a scheme similar to the one in the example https://github.com/allegroai/clearml/blob/master/examples/pipeline/pi...
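A workaround often suggested for this is to place the imports inside the decorated function itself, since each step is exported as a standalone script for the agent; a sketch (names are hypothetical):

    from clearml.automation.controller import PipelineDecorator

    @PipelineDecorator.component(return_values=["frame"])
    def load_data(csv_path):
        # imported inside the component so the generated step script carries the dependency
        import pandas as pd
        return pd.read_csv(csv_path)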

2 years ago
0 Hi All, I Am Testing The New

Sure, it's already enabled. I noticed in the ClearML agent configuration another parameter related to environment caching, named venv_update (I believe it's still in beta). Do you think enabling this parameter significantly helps to build environments faster?

Yes, I guess. Since pipelines are designed to be executed remotely, it may be pointless to enable an output_uri parameter in PipelineDecorator.component. Anyway, could another task be initialized in the same scr...

2 years ago
0 Hi All, I Am Testing The New

By the way, where can I change the default artifacts location (output_uri) if I have a script similar to this example (I mean, from the code, not the agent's config):
https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py

2 years ago
0 Let'S Say That I Specify The

My idea is to take advantage of the ability to read, from another task, the parameters connected to a task, in order to get the path where the artifacts are stored locally, so I don't have to define it again in every script corresponding to a different task.
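What is described here, reading another task's connected parameters, usually boils down to something like this (the task and parameter names are hypothetical):

    from clearml import Task

    # look up the task that defined the artifacts path and read its connected parameters
    producer = Task.get_task(project_name="example_project", task_name="preprocessing")
    params = producer.get_parameters()                    # keys look like "General/artifacts_dir"
    artifacts_dir = params.get("General/artifacts_dir")   # hypothetical parameter name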

2 years ago
0 Let'S Say That I Specify The

Sure, but I mean, apart from labeling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

2 years ago
0 Hi, Is There Any Reason Why Artifacts Linked To A Task Are Not Removed When The Task Is Removed From The Experiment List?

Yeah, but after doing that, a message pops up showing a list of artifacts from the task that could not be deleted.

2 years ago
0 When Clearml Converts A

BTW, I would like to mention another related problem I have encountered. It seems that arguments of type 'int', 'float' or 'list' (this may also happen with other types) are transformed to 'str' when passed to a function decorated with PipelineDecorator.component at the time of calling it in the pipeline itself. Again, is this intentional?

2 years ago
0 When Clearml Converts A

Sure, I will post a mock example in a while

2 years ago