GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity 2 years ago

Reputation: 0
Badges (1): 183 × Eureka!
0 Votes · 3 Answers · 1K Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args). Script e...
3 years ago
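For context, a minimal sketch of the pattern this question is about, with illustrative names; the note on re-connecting is an assumption, since the sync behavior of connected dicts has varied across clearml versions:

```
from clearml import Task

task = Task.init(project_name="examples", task_name="connect-args-sketch")

# Connect a plain dictionary; ClearML records its keys as task parameters.
args = {"batch_size": 32, "lr": 0.001}
task.connect(args)

# Keys added to the dict afterwards may not be picked up automatically
# (assumption: behavior differs between versions); calling connect()
# again is one way to (re)register the updated dictionary.
args["epochs"] = 10
task.connect(args)
```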
0 Votes · 2 Answers · 1K Views
3 years ago
0 Votes · 8 Answers · 1K Views
3 years ago
0 Votes · 3 Answers · 1K Views
3 years ago
0 Votes · 29 Answers · 1K Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
3 years ago
0 Votes · 30 Answers · 1K Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
3 years ago
0 Votes · 21 Answers · 1K Views
Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tas...
3 years ago
0 Votes · 1 Answer · 1K Views
Is there any way to create a queue from code?
3 years ago
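For reference, one way this is commonly done is through the ClearML server API client; a minimal sketch, assuming server credentials are already configured (the queue name is illustrative):

```
from clearml.backend_api.session.client import APIClient

client = APIClient()

# Create a new execution queue on the server by name.
client.queues.create(name="my_new_queue")
```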
0 Votes · 1 Answer · 1K Views
3 years ago
0 Votes · 13 Answers · 1K Views
3 years ago
0 Votes · 6 Answers · 1K Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
3 years ago
0 Votes · 13 Answers · 1K Views
When ClearML converts a PipelineDecorator.component decorated function to script code, I have noticed that indexing syntax like A[:, 0] is rewritten as A[(:,...
3 years ago
0 Votes · 1 Answer · 1K Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
3 years ago
0 Votes · 14 Answers · 1K Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
3 years ago
0 Votes · 14 Answers · 1K Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
3 years ago
0 Votes · 10 Answers · 1K Views
Hi, Is there a simple way to make Task.init compatible with Dask.distributed client? When I try to run a script where I want to read concurrently a dataset i...
3 years ago
0 Votes · 0 Answers · 1K Views
Hi, Let's say I have several functions decorated with PipelineDecorator.component (functions A, B and C). Function C can only be executed after functions A a...
3 years ago
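For context, in the decorator API step ordering is normally driven by data dependencies: a step that consumes the outputs of two other steps starts only after both finish. A minimal sketch of that idea (names, project, and queue handling are illustrative):

```
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["a"])
def step_a():
    return 1

@PipelineDecorator.component(return_values=["b"])
def step_b():
    return 2

@PipelineDecorator.component(return_values=["c"])
def step_c(a, b):
    # Consuming the outputs of step_a and step_b makes this step
    # wait until both of them have completed.
    return a + b

@PipelineDecorator.pipeline(name="abc-sketch", project="examples", version="0.1")
def my_pipeline():
    a = step_a()
    b = step_b()
    step_c(a, b)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # debug locally instead of enqueueing
    my_pipeline()
```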
0 Votes · 11 Answers · 1K Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
3 years ago
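For context, a minimal sketch of the output_uri usage the question refers to; the storage destination below is illustrative:

```
from clearml import Task

# output_uri redirects model checkpoints and artifacts uploaded by the
# task to the given storage target instead of the default files server.
task = Task.init(
    project_name="example_project",
    task_name="example_task",
    output_uri="s3://my-bucket/models",  # illustrative destination
)
```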
0 Votes · 5 Answers · 1K Views
Hi, can anyone help me with this code? (just a mock example, but it nicely captures the behavior of the real code) import pandas as pd from clearml import Ta...
3 years ago
0 Votes · 12 Answers · 1K Views
3 years ago
0 Votes · 2 Answers · 1K Views
Why does the Task.add_tags method have no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...
3 years ago
0 Votes · 10 Answers · 1K Views
Hi! Is there any reason why integer/float values are cast to strings when connecting an arguments dictionary to a task and then retrieving them using task.get_para...
3 years ago
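For context, a minimal sketch of the round trip the question describes; the cast flag on get_parameters is available in recent clearml versions (an assumption about the installed version):

```
from clearml import Task

task = Task.init(project_name="examples", task_name="param-types-sketch")
task.connect({"epochs": 10, "lr": 0.1})

# Parameters are stored server-side as strings, so by default they come
# back as strings, e.g. {'General/epochs': '10', ...}:
as_strings = task.get_parameters()

# Recent clearml versions can cast values back to their original types:
as_typed = task.get_parameters(cast=True)
```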
0 Votes · 2 Answers · 1K Views
Hello, I was wondering if ClearML offers the option to automatically spin up the clearml-agent again every time the machine where it was being executed as a ...
3 years ago
0 Votes · 18 Answers · 1K Views
Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by PipelineDecorator.component in the function arguments....
3 years ago
0 Votes · 7 Answers · 1K Views
3 years ago
0 Votes · 4 Answers · 1K Views
I'm trying to implement a cleanup service by following this example https://github.com/allegroai/clearml/blob/master/examples/services/cleanup/cleanup_servic...
3 years ago
0 Votes · 9 Answers · 1K Views
Is it good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
3 years ago
0 Votes · 6 Answers · 1K Views
3 years ago
0 Votes · 7 Answers · 1K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
3 years ago
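For reference, a minimal sketch of the call in question; as far as I know it returns the main task of the calling process, which is what makes the concurrent case interesting:

```
from clearml import Task

# Returns the main Task of the current process (or None if no task
# has been initialized in this process).
task = Task.current_task()
```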
0 Votes · 6 Answers · 1K Views
Hi, I have a question regarding the new PipelineDecorator feature: how to access the task created by PipelineDecorator.pipeline through its ID ...
3 years ago
0 Let's Say That I Specify The

Sure, but I mean, apart from labeling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 After checking, I discovered that apparently it doesn't matter whether each pipeline is executed by a different worker; the error persists. Honestly, this has me puzzled. I'm really looking forward to getting this functionality right, because it's an aspect that would make ClearML shine even more.

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

Well, this is just a mock example 🙂. In the real application I'm working on there will be more than one configuration file (in principle, one for the data and one for the DL model). Regarding the fix, I'm not in a hurry at the moment. I'll happily wait for tomorrow (or the day after) when the commit is pushed!

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

But maybe another solution would be to pass the configuration file paths as function arguments, then read and parse them inside the pipeline, as in the sketch below.
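A minimal sketch of that idea, assuming a JSON configuration file (the step name and return value are illustrative):

```
import json

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["config"])
def load_config(config_path: str):
    # Only the path (a plain string) crosses the pipeline boundary;
    # the file is read and parsed inside the step itself.
    with open(config_path) as f:
        config = json.load(f)
    return config
```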

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

What exactly do you mean by that? From VS Code I execute the following script, and then the agents take care of executing the code remotely:

```
import pandas as pd

from clearml import Task, TaskTypes
from clearml.automation.controller import PipelineDecorator

CACHE = False

@PipelineDecorator.component(
    name="Wind data creator",
    return_values=["wind_series"],
    cache=CACHE,
    execution_queue="data_cpu",
    task_type=TaskTypes.data_processing,
)
def generate_wind(start_date: st...
```

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 The pipelines are executed by the agents that are listening to the queue given by pipeline_execution_queue="controllers"

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 I ended up with two pipelines being executed until they completed the workflow, but each of their steps was duplicated. You can check it here:
https://clearml.slack.com/files/U02A5DGPMPU/F02SR3G9RDK/image.png

3 years ago
0 Let's Say That I Specify The

Now it's okay. I have found a more intuitive way to work around it. I was facing the classic 'XY problem' :)

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 I have the strong feeling it must be an agent issue, because when I place PipelineDecorator.run_locally() before calling the pipeline, everything works perfectly. See:

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Indeed it does! But what still really puzzles me is why I get the path below when running dataset.get_local_copy() on one of the machines of my cluster:
/home/user/.clearml/cache/storage_manager/datasets/.lock.000.ds_61ff8d4335dd4b74bd78c3576fa44131.clearml
Why is it pointing to a .lock file?

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 By adding PipelineDecorator.run_locally() everything seems to work perfectly. This is what I expect the experiment listing to look like when the agents are the ones running the code. With this, I'm pretty sure the error search can be narrowed down to the agents' code.

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Or perhaps the complementary scenario, with a continue_on_failed_steps parameter that could be a list containing only the steps whose failure can be ignored.
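To make the proposal concrete, hypothetical usage of such a parameter might look like this (continue_on_failed_steps does not exist in clearml; it is the suggested API, and all names are illustrative):

```
from clearml.automation.controller import PipelineDecorator

# Hypothetical, proposed API -- not part of clearml:
@PipelineDecorator.pipeline(
    name="my_pipeline",
    project="examples",
    version="0.1",
    continue_on_failed_steps=["optional_reporting_step"],  # proposed parameter
)
def my_pipeline():
    ...
```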

3 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 BTW, I got the notification from GitHub telling me you had committed the fix, so I went ahead. After testing the code again, I see the task parameter dictionary has been removed properly (it has now been broken down into flat parameters). However, I still have the same problem with duplicate tasks, as you can see in the image.

3 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Or maybe you could bundle some of the parameters that belong to PipelineDecorator.component into a high-level configuration variable (something like PipelineDecorator.global_config?)

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 Oops, something still seems to be wrong. When trying to retrieve the dataset using get_local_copy() I get the following error:

```
Traceback (most recent call last):
  File "/home/user/myproject/lab.py", line 27, in <module>
    print(dataset.get_local_copy())
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 554, in get_local_copy
    target_folder = self._merge_datasets(
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-p...
```

3 years ago
0 Hi, I Just Updated ClearML To Version v1.1.3. Right After Launching A Training Pipeline, The System Crashed Due To The Following Error:

Sure, here is a trivial example:

```
from clearml import Dataset

dataset = Dataset.create(dataset_name="Dataset_v1.1.3", dataset_project="Mocks")
dataset.finalize()
loaded_dataset = Dataset.get(dataset_id=dataset.id)
```

3 years ago
0 Hi, Can Anyone Help Me With This Code? (Just A Mock Example, But It Nicely Captures The Behavior Of The Real Code)

CostlyOstrich36 Yes, it happens on the following line, at the time of calling the pipeline:

```
forecast = prediction_service(config=default_config)
```

Were you able to reproduce the example?

3 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

By adding the slash I have been able to see that the dataset is indeed stored in output_url. However, when calling finalize, I get the same error. And yes, I have installed the version corresponding to the latest commit :/

3 years ago