GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges (1): 183 × Eureka!
0 Votes 14 Answers 951 Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
3 years ago
0 Votes 14 Answers 982 Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
2 years ago
0 Votes 11 Answers 979 Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
3 years ago
0 Votes 12 Answers 1K Views
3 years ago
0 Votes 7 Answers 1K Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
3 years ago
0 Votes 11 Answers 1K Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
3 years ago
0 Votes 1 Answer 1K Views
It is possible to attach to an OutputModel an object closely related to it (as some product of data preprocessing that has been done specifically for that mo...
3 years ago
0 Votes 6 Answers 893 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
3 years ago
0 Votes 9 Answers 957 Views
It is a good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
3 years ago
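
For context, the pattern being asked about above looks roughly like this (a sketch with made-up names; only the PipelineDecorator.component / PipelineDecorator.pipeline decorators and PipelineDecorator.run_locally are existing ClearML calls, the rest is illustrative):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["doubled"])
def process_chunk(chunk_id):
    # Each call becomes its own pipeline step/task.
    return chunk_id * 2

@PipelineDecorator.pipeline(name="loop-example", project="examples", version="0.0.1")
def run_pipeline():
    results = []
    for i in range(3):
        # The same decorated component is invoked several times inside a for loop.
        results.append(process_chunk(i))
    return results

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # assumption: run the steps locally for testing
    run_pipeline()
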
0 Votes 21 Answers 1K Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
3 years ago
0 Votes 10 Answers 1K Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
3 years ago
0 Votes 6 Answers 904 Views
3 years ago
0 Votes 8 Answers 883 Views
2 years ago
0 Votes 3 Answers 1K Views
3 years ago
0 Votes 5 Answers 951 Views
Hi, can anyone help me with this code? (just a mock example, but it nicely captures the behavior of the real code) import pandas as pd from clearml import Ta...
2 years ago
0 Votes 3 Answers 937 Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
3 years ago
0 Votes 2 Answers 917 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
3 years ago
0 Votes 1 Answer 946 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
3 years ago
0 Votes 3 Answers 921 Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
3 years ago
0 Votes 1 Answer 924 Views
Is there any way to create a queue from code?
3 years ago
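
For context, one possible way to create a queue from code (a sketch, untested; it assumes the backend-API client exposes the server's queues service, and the queue name is made up):

from clearml.backend_api.session.client import APIClient

# Assumption: the dynamic API client maps attribute access to server services,
# so client.queues.create reaches the server-side queue-creation endpoint.
client = APIClient()
client.queues.create(name="my_new_queue")  # hypothetical queue name
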
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 12 Answers 1K Views
3 years ago
0 Votes 1 Answer 960 Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
3 years ago
0 Votes 6 Answers 957 Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
3 years ago
0 Votes 29 Answers 998 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
3 years ago
0 Votes 10 Answers 908 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator . In a pipeline I call s...
3 years ago
0 Votes 9 Answers 1K Views
Hi, I just updated clearml to version v1.1.3. Right after launching a training pipeline, the system crashed due to the following error: Traceback (most recen...
3 years ago
0 Votes 2 Answers 1K Views
Hello, I was wondering if clearML offers the option to spin up again the clearml-agent automatically every time the machine where it was being executed as a ...
3 years ago
0 Votes 13 Answers 1K Views
3 years ago
0 Votes 10 Answers 928 Views
Hi! Is there any reason why integer/float values are casted to string when connecting arguments dictionary to task and then retrieve them using task.get_para...
3 years ago
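
For context, the behaviour described in the question above can be reproduced with something like this (a sketch; project and task names are made up, and get_parameters is assumed to be the retrieval call the excerpt truncates):

from clearml import Task

task = Task.init(project_name="examples", task_name="param-casting-demo")  # made-up names
args = {"epochs": 10, "learning_rate": 0.1}
task.connect(args)

# Per the question above, the values retrieved here come back as strings
# (e.g. '10' instead of 10), not as the original int/float types.
params = task.get_parameters()
print(params)
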
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14 , so isn't it ClearML best practice to create a draft pipeline to have the task on the server so that it can be cloned, modified and executed at any time?

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Sure! Thank you 🙂

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Exactly!! That's what I was looking for: create the pipeline but not launching it. Thanks again AgitatedDove14
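
For anyone landing here later, the pattern looks roughly like this (a sketch; the pipeline, project and step names are made up):

from clearml.automation.controller import PipelineController

# Define the pipeline from existing template tasks, without starting it.
pipe = PipelineController(
    name="my-pipeline",       # made-up name
    project="examples",       # made-up project
    version="0.0.1",
)
pipe.add_step(
    name="preprocess",
    base_task_project="examples",           # made-up template location
    base_task_name="preprocess template",
)
# Register the pipeline as a draft task on the server so it can be
# cloned, modified and enqueued later, instead of launching it now.
pipe.create_draft()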

2 years ago
0 Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14 , just one last thing before closing the thread. I was wondering what is the use of PipelineController.create_draft if you can't use it to clone and run tasks, as we have seen

2 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Yes, before removing the 'default' queue I was able to shut down agents without specifying further options after the --stop command. I just had to run clearml-agent daemon --stop as many times as there were agents. Of course, I will open the issue as soon as possible :D

3 years ago
0 Hi All, I Am Testing The New

Of course it's always a good idea to have that extra option just in case 🙂

Nevermind, I've already found a cleaner way to address this problem. I really appreciate your help!

3 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

BTW, let's say I accidentally removed the 'default' queue from the queue list. As a result, when I try to stop an agent using clearml-agent daemon --stop , I get the following error:
clearml_agent: ERROR: APIError: code 400/707: No queue is tagged as the default queue for this company
I have already created another queue also called 'default' but it had no effect :/

3 years ago
0 Hi All, I Am Testing The New

By the way, where can I change the default artifacts location (output_uri) if I have a script similar to this example (I mean, from the code, not the agent's config):
https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py
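
For comparison, in a plain (non-decorator) script the destination can be set from code when the task is created; whether the decorator exposes an equivalent option is exactly what I'm asking (a sketch with made-up names):

from clearml import Task

task = Task.init(
    project_name="examples",              # made-up project
    task_name="artifact-location-demo",   # made-up task name
    output_uri="s3://my-bucket/clearml",  # assumption: any storage URI supported by ClearML
)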

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn'T It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'M Using Clearml V1.1.3Rc0 And Clearml-Agent 1.1.0

I totally agree with the PipelineController/decorator part. Regarding the proposal for the component parameter, I also think it would be a good feature, although it might obscure the fact that there will be times when the pipeline fails anyway because the failing component is an intrinsically crucial step, so it doesn't matter whether 'continue_pipeline_on_failure' is set to True or False. Anyway, I can't think of a better way to deal with that right now.

3 years ago
0 Hi, I Just Updated Clearml To Version V1.1.3. Right After Launching A Training Pipeline, The System Crashed Due To The Following Error:

Sure, here is a trivial example:
from clearml import Dataset

dataset = Dataset.create(dataset_name="Dataset_v1.1.3", dataset_project="Mocks")
dataset.finalize()
loaded_dataset = Dataset.get(dataset_id=dataset.id)

3 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Well, just as you can pass the 'task_type' argument in PipelineDecorator.component , it might be a good option to pass the rest of the 'Task.init' arguments as they are passed in the original method (without using a dictionary)
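
In other words, something along these lines; only task_type is confirmed to exist today, and the commented-out argument is the kind of pass-through I'm suggesting (sketch):

from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    return_values=["df"],
    task_type=Task.TaskTypes.data_processing,   # already supported, per this thread
    # output_uri="s3://my-bucket/clearml",      # hypothetical pass-through Task.init argument
)
def preprocess(n_rows):
    # Imports live inside the component so the generated task is self-contained.
    import pandas as pd
    return pd.DataFrame({"x": range(n_rows)})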

3 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

I'm getting a NameError because 'Optional' type hint is not defined in the global scope
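
That is, the generated script would need something like the following at module level for the hint to resolve (a sketch; the function and its signature are hypothetical):

from typing import Optional  # the import the generated script is missing

def step(threshold: Optional[float] = None) -> None:  # hypothetical signature using the hint
    ...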

3 years ago
0 It Is A Good Practice To Call A Function Decorated By

Oh, I see. In the meantime I will duplicate the function and rename it so I can work with a different configuration. I really appreciate your effort, as well as the continuous feedback loop that keeps improving this wonderful library!

3 years ago
0 Hi Guys, Suppose I Have The Following Script:

So ClearML will scan all the repository code searching for package dependencies? Is that right?

3 years ago
0 Is There Any Example Showing How To Work With Nested Pipelines? In My Case I Have Several Functions Decorated With

Beautiful. I have tested the new functionality with several use cases and it works just as I expected. Excellent work, as usual :D

3 years ago
0 Hi, Can Anyone Help Me With This Code? (Just A Mock Example, But It Nicely Captures The Behavior Of The Real Code)

Hey CostlyOstrich36 AgitatedDove14 ! Any news on this? Should I open an issue?

2 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 Oops, something still seems to be wrong. When trying to retrieve the dataset using get_local_copy() I get the following error:
Traceback (most recent call last):
  File "/home/user/myproject/lab.py", line 27, in <module>
    print(dataset.get_local_copy())
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 554, in get_local_copy
    target_folder = self._merge_datasets(
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-p...

3 years ago
0 Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

They share the same code (i.e. the same decorated functions), but using a different configuration.

3 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Oh, I see. This explains the surprising behavior. But what if Task.init code is created automatically by PipelineDecorator.component ? How can I pass arguments to the init method in that case?

3 years ago
0 Hi! I Was Wondering Why Clearml Recognize Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

I'm using the latest commit. I'm just fitting a scikit-learn MinMaxScaler object to a dataset of type tf.data.Dataset inside a function (which represents the model training step) decorated with PipelineDecorator.component. The function does not even return the scaler object as an artifact, yet the scaler is still logged as an artifact of the task, as shown in the image below.
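
The setup is essentially this (a simplified sketch: NumPy data stands in for the tf.data.Dataset, and all names are made up):

import numpy as np
from sklearn.preprocessing import MinMaxScaler
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["model_path"])
def train_model(features):
    # The scaler is only fitted and used internally and is never returned,
    # yet it still gets picked up on the task, as described above.
    scaler = MinMaxScaler()
    scaled = scaler.fit_transform(np.asarray(features).reshape(-1, 1))
    ...  # train a model on `scaled` and save it
    return "model.ckpt"  # placeholder path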

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

Yes, although I use both terms interchangeably. The information will actually be contained in JSON files.

2 years ago
0 Hello Folks! I Don'T Know If This Issue Has Already Been Addressed. I Have A Basic Pipelinecontroller Script With Two Steps: One Of Task Is For Preprocessing Purposes And The Other For Training A Model. Currently I Am Placing The Code Related To The Pack

Thanks for the background. I now have a big picture of the process ClearML goes through. It was helpful in clarifying some of the questions that I didn't know how to ask properly. So, the idea is that a base task is already stored on the ClearML server for later use in a production environment. This is because such a task will always be created during the model development process.

Going back to my initial question, as far as I understood, if the environment caching option is ena...

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

By adding the slash I have been able to see that indeed the dataset is stored in output_url . However, when calling finalize , I get the same error. And yes, I have installed the version corresponding to the last commit :/

3 years ago