GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
Badges: 183 × Eureka!
Questions

0 Votes · 3 Answers · 1K Views · 3 years ago

0 Votes · 12 Answers · 1K Views · 3 years ago

0 Votes · 12 Answers · 1K Views · 3 years ago

0 Votes · 6 Answers · 889 Views · 3 years ago
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...

0 Votes · 14 Answers · 979 Views · 2 years ago
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...

0 Votes · 2 Answers · 917 Views · 2 years ago
Why Task.add_tags method has no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...

0 Votes · 13 Answers · 911 Views · 3 years ago
When ClearML converts a PipelineDecorator.component decorated function to script code, I have noticed that indexing syntax like A[:, 0] is rewritten as A[(:,...

0 Votes · 7 Answers · 1K Views · 3 years ago
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...

0 Votes · 11 Answers · 1K Views · 3 years ago
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...

0 Votes · 10 Answers · 1K Views · 3 years ago
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...

0 Votes · 16 Answers · 1K Views · 3 years ago
Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it: from clearml.automation...

0 Votes · 14 Answers · 948 Views · 3 years ago
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...

0 Votes · 9 Answers · 954 Views · 3 years ago
It is a good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...

0 Votes · 29 Answers · 993 Views · 3 years ago
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...

0 Votes · 11 Answers · 1K Views · 3 years ago
Hi! I was wondering why ClearML recognize Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...

0 Votes · 3 Answers · 919 Views · 3 years ago
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args) . Script e...
Answers
0 Hi! Can someone show me an example of how

Hi AgitatedDove14, so isn't it ClearML best practice to create a draft pipeline, so that the task is on the server and can be cloned, modified and executed at any time?
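
For reference, a minimal sketch of that draft-pipeline pattern (the pipeline, project and step names are hypothetical, and it assumes a clearml version where PipelineController.create_draft is available):

from clearml.automation.controller import PipelineController

def step_one():
    # stand-in for a real pipeline step function
    print("running step one")

# Build the pipeline definition, then register it on the server as a
# draft instead of executing it, so it can be cloned, modified and
# enqueued later from the UI.
pipe = PipelineController(name="my-pipeline", project="examples", version="1.0")
pipe.add_function_step(name="step_one", function=step_one)
pipe.create_draft()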

2 years ago
0 Hi all, I am testing the new

I am aware of the option to enable virtual environment caching, but that is still very time-consuming.

3 years ago
0 Let's say that I specify the

My idea is to take advantage of the ability to read, from one task, the parameters connected to another task: that way I can look up the path where the artifacts are stored locally, without having to define it again in each script corresponding to a different task.
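
A minimal sketch of that pattern (the task ID and the parameter name are hypothetical):

from clearml import Task

# Fetch the task that connected the arguments, then read its parameters.
producer = Task.get_task(task_id="<producer-task-id>")
params = producer.get_parameters()  # flat dict, e.g. {"General/artifacts_path": "/data/artifacts"}
artifacts_path = params.get("General/artifacts_path")
print(artifacts_path)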

3 years ago
0 Hi! Can someone show me an example of how

Sure! Thank you 🙂

2 years ago
0 Hi all, I am testing the new

I'm using the latest version (1.1.1)

3 years ago
0 Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code:

Indeed it does! But what still puzzles me is why I get the path below when running dataset.get_local_copy() on one of the machines of my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

3 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

In my use case I have a pipeline that executes inference tasks with several models simultaneously. Each inference task is actually a component that acts as a pipeline, since it executes the required steps to generate the predictions (dataset creation, preprocessing and prediction). For this, I'm using the new pipeline functionality (PipelineDecorator).

3 years ago
0 Hi! I am implementing a cleanup service. After completing several training tasks, I am only interested in the trained models and some artifacts resulting from the training process (such as scalers, etc.). Therefore, I would like to remove all the tasks th

AnxiousSeal95 I see. That's why I was thinking of storing the model inside a task, just like with the Dataset class, so that you can either use just the model via InputModel, or the model and all its artifacts via Task.get_task with the ID of the task where the model is located.
I would like my cleanup service to remove all tasks older than two weeks, but not the models. Right now, if I delete all the tasks, the models no longer work (since they need the training tasks). For now, I ...
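
A sketch of the two access patterns being proposed (the IDs and the artifact name are hypothetical):

from clearml import InputModel, Task

# 1) Just the model, addressed by its model ID
model = InputModel(model_id="<model-id>")
weights_path = model.get_weights()  # downloads a local copy of the weights

# 2) The model plus its artifacts, via the ID of the training task
training_task = Task.get_task(task_id="<training-task-id>")
scaler = training_task.artifacts["scaler"].get()  # assumes an artifact named "scaler"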

3 years ago
0 Hi! I was wondering why ClearML recognize Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scalers as a configuration object of the trained model, not outside

Yes, before removing the 'default' queue I was able to shut down agents without specifying further options after the --stop command. I just had to run clearml-agent daemon --stop as many times as there were agents. Of course, I will open the issue as soon as possible :D

3 years ago
0 Hi all, I am testing the new

Okay, so the idea behind the new decorator is not to group all the defined steps under the same script so that they share the same environment, but rather to simplify the process of creating scripts for each step and avoid manually calling Task.init on those scripts.

Regarding virtual environment creation from the cache, I will keep running benchmarks (from what you say, it might be due to the high workload on the servers we use)

So far I've been unlucky in the attempt of clearml recog...
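
A minimal sketch of that usage pattern (names are hypothetical, and it assumes a clearml version providing PipelineDecorator.run_locally so the example executes without agents): each step is a plain decorated function, with no manual Task.init in the step code.

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["doubled"])
def double(x: int):
    # runs as its own task; no explicit Task.init needed here
    return 2 * x

@PipelineDecorator.pipeline(name="demo pipeline", project="examples", version="1.0")
def run_pipeline(x: int = 3):
    print(double(x))

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # execute the steps in local processes
    run_pipeline()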

3 years ago
0 Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by

Hi AgitatedDove14 , great, glad it was fixed quickly!

By the way, before releasing version 1.1.3 you might want to take a look at this mock example. I'm trying to run the same pipeline (with different configurations) in a single for loop, as you can see below:
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"], execution_queue="myqueue1")
def step_1(msg: str):
    msg += "\nI've survived step 1!"
    re...
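
The message is truncated above; a runnable reconstruction of that kind of mock might look like the following (the pipeline name/project and run_locally are assumptions, and the queue argument is dropped so it runs without agents). Note this is exactly the pattern being reported as problematic: calling the decorated pipeline once per configuration in a loop.

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["msg"])
def step_1(msg: str):
    msg += "\nI've survived step 1!"
    return msg

@PipelineDecorator.pipeline(name="mock pipeline", project="examples", version="1.0")
def my_pipeline(msg: str):
    print(step_1(msg))

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    # launch the same pipeline once per configuration
    for config in ("config-a", "config-b"):
        my_pipeline(msg=config)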

3 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

Or perhaps the complementary scenario, with a continue_on_failed_steps parameter that could be a list containing only the steps that can be ignored in case of failure.

3 years ago
0 Why

I just placed tagging code before Task.execute_remotely() and now it works. Thank you! 🙂
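
In code, the fix described here amounts to something like this (the project, tag and queue names are hypothetical):

from clearml import Task

task = Task.init(project_name="examples", task_name="tagged step")
task.add_tags(["my-tag"])  # runs while still local, so the tag is applied
task.execute_remotely(queue_name="default")  # the local process exits here
# ...code below this point runs on the agent...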

2 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

Okay, so I could signal to the main pipeline the exception raised in any of the pipeline components, and it should halt the whole pipeline. However, are you thinking of including this callbacks feature in the new pipelines as well?

3 years ago
0 When ClearML converts a

Nice. In the meantime, as a workaround, I will implement temporary parsing code at the beginning of the step functions.

3 years ago
0 It is a good practice to call a function decorated by

I tested cache=False and I still get the same error 😕 In the dashboard, the task corresponding to step_two does not appear duplicated, so I assume the tasks are being launched sequentially. I'm going to prepare a more elaborate example to see what happens. Currently I can't run PipelineDecorator.debug_pipeline() because I need at least two devices: one to read some data and another to process it.
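
For context, the two knobs mentioned here look roughly like this (names are hypothetical; debug_pipeline runs every step as a plain function call in a single process, which is why it needs all resources on one machine):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["data"], cache=False)  # disable caching for this step
def step_two():
    return [1, 2, 3]

@PipelineDecorator.pipeline(name="debug demo", project="examples", version="1.0")
def run_pipeline():
    print(step_two())

if __name__ == "__main__":
    PipelineDecorator.debug_pipeline()  # execute everything in-process, for debugging
    run_pipeline()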

3 years ago
0 It is a good practice to call a function decorated by

Oh, I see. In the meantime I will duplicate the function and rename it so I can work with a different configuration. I really appreciate your effort, as well as the continuous feedback, to keep improving this wonderful library!

3 years ago
0 When ClearML converts a

Sure, I will post a mock example in a while

3 years ago
0 Hi, I have a question regarding the new

But I was actually asking about accessing the Pipeline task ID, not the tasks corresponding to the components.

3 years ago
0 Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by

BTW, how can I run 'execute_orchestrator' concurrently? That is, launch it for several configurations at the same time? The way it's implemented now, it doesn't start the next configuration until the current one is finished.

3 years ago
0 Hi! From a task created using

Well, I am thinking of the case where there are several pipelines in the system, so that filtering a task by its name and project can return several tasks. How could I build a filter for Task.get_task(task_filter=...) that returns only the task whose parent task is the pipeline task?
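
One possible shape for such a filter is sketched below; the 'parent' filter field is an assumption based on the server's Tasks.get_all API, and the names/IDs are hypothetical:

from clearml import Task

tasks = Task.get_tasks(
    project_name="examples",
    task_name="my step",
    task_filter={"parent": "<pipeline-task-id>"},  # keep only children of the pipeline task
)
print([t.id for t in tasks])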

3 years ago
0 Hi, I have a question regarding the new

But when I call Task.current_task().task_id within the code of one of the pipeline components, I get the task ID of the component itself. I want the pipeline task ID.
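
A sketch of one way to get it from inside a component, assuming the component task's parent is the pipeline controller task:

from clearml import Task

component_task = Task.current_task()
pipeline_task_id = component_task.parent  # parent task ID (assumption: the controller)
pipeline_task = Task.get_task(task_id=pipeline_task_id)
print(pipeline_task.id, pipeline_task.name)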

3 years ago
0 Hi! If there are several tasks running concurrently, which task should

Great, thank you very much for the info! I just spotted the get_logger classmethod. As for the initial question, that's just the behavior I expected!

3 years ago