GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity 2 years ago

Reputation: 0
Badges: 1
183 × Eureka!
0 Hi! Can Someone Show Me An Example Of How

I don't know if you remember the need I had some time ago to launch the same pipeline through configuration. I've been thinking about it and I think PipelineController fits my needs better than PipelineDecorator in that respect.
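For reference, a minimal sketch of the PipelineController route I mean, with project, task, and parameter names made up for illustration:
`
from clearml import PipelineController

# all names and values here are placeholders driven by my configuration
pipe = PipelineController(
    name="my_pipeline",
    project="my_project",
    version="1.0.0",
)
pipe.add_step(
    name="step_data_loading",
    base_task_project="my_project",
    base_task_name="data loading",  # an existing template task (assumed)
    parameter_override={"General/path": "/data/train.csv"},  # taken from config
)
pipe.start()  # enqueues the controller; pipe.start_locally() runs it in-process
`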

3 years ago
0 Hi All! Let's Say I Have Two Functions Decorated With

Mmmm, you are right. Even if I had 1000 components spread across different project modules, only the components imported in the script where the pipeline is defined would be included in the DAG plot, is that right?

4 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

BTW, how can I run 'execute_orchestrator' concurrently? That is, launch it for several configurations at the same time? The way it's implemented now, it doesn't start the next configuration until the current one is finished.
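What I would like is roughly the pattern below. This is plain Python concurrency as a stand-in, not a clearml API; run_configuration is a placeholder for whatever currently launches one configuration:
`
import time
from concurrent.futures import ProcessPoolExecutor

def run_configuration(config):
    # Stand-in for the call that launches one run (e.g. the code that ends up
    # in execute_orchestrator); replace the body with the real launch code.
    time.sleep(1)
    return config

configs = [{"lr": 0.1}, {"lr": 0.01}, {"lr": 0.001}]

if __name__ == "__main__":
    # run all configurations at the same time instead of one after another
    with ProcessPoolExecutor(max_workers=len(configs)) as pool:
        results = list(pool.map(run_configuration, configs))
    print(results)
`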

4 years ago
0 Regarding The New Version 1.1.2, I Have Noticed Type Hints Are Now Included In The Script Generated By

I tried specifying helper functions but it still gives the same error. If I define a component through the following code:
`
from typing import Optional
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(...)
def step_data_loading(path: str, target_dir: Optional[str] = None):
    pass
`
Then in the automatically created script I find the following code:
`
from clearml.automation.controller import PipelineDecorator

def step_data_loading(path: str, target...
`

4 years ago
0 Hi All! Let's Say I Have Two Functions Decorated With

Mmm, I see. However, I think only the components used in that pipeline should be shown; you might have defined, say, 1000 components and only use 10 in a given pipeline. Listing them all would just clutter up the results tab for that pipeline task.

4 years ago
0 Since

For instance, the auto_connect family of arguments.

4 years ago
0 I'm Trying To Implement A Cleanup Service By Following This Example

Hi SuccessfulKoala55
So, how can I get the ID of the requested project through the resp object? I tried with resp["id"] but it didn't work.
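In case it helps anyone: a sketch of what I would expect to work with APIClient, assuming get_all returns project objects whose ID is an attribute rather than a dict key ("my_project" is a made-up name):
`
from clearml.backend_api.session.client import APIClient

client = APIClient()
# projects.get_all returns a list of project objects, so the ID is an
# attribute, not resp["id"]
projects = client.projects.get_all(name="my_project")
if projects:
    print(projects[0].id)
`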

4 years ago
0 Hi! From A Task Created Using

Well, I am thinking of the case where there are several pipelines in the system, so that when filtering a task by its name and project I could get several tasks. How could I build a filter for Task.get_task(task_filter=...) that returns only the task whose parent task is the pipeline task?
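Something like this is what I have in mind, assuming task_filter fields are passed through to the tasks.get_all API and that it accepts a parent field (all names and IDs below are placeholders):
`
from clearml import Task

# restrict the query to children of a known pipeline controller task
tasks = Task.get_tasks(
    project_name="my_project",                    # assumed project
    task_name="my_task",                          # assumed task name
    task_filter={"parent": "<pipeline_task_id>"}  # assumed filter field
)
`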

4 years ago
0 Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

AnxiousSeal95 I see. That's why I was thinking of storing the model inside a task just like with the Dataset class. So that you can either use just the model via InputModel or the model and all its artifacts via Task.get_task by using the ID of the task where the model is located.
I would like my cleanup service to remove all tasks older than two weeks, but not the models. Right now, if I delete all tasks the model does not work (as it needs the training tasks). For now, I ...
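Roughly the usage I am imagining, with made-up IDs and an assumed artifact name:
`
from clearml import InputModel, Task

# fetch just the model weights by model ID
model = InputModel(model_id="<model_id>")
weights_path = model.get_weights()  # downloads a local copy of the weights

# or fetch the whole training task to reach its artifacts as well
task = Task.get_task(task_id="<training_task_id>")
scaler_path = task.artifacts["scaler"].get_local_copy()  # "scaler" is an assumed artifact name
`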

4 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

SuccessfulKoala55 I have not tried yet with argparse, but maybe I will encounter the same problem

4 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

AgitatedDove14 Oops, something still seems to be wrong. When trying to retrieve the dataset using get_local_copy() I get the following error:
`
Traceback (most recent call last):
  File "/home/user/myproject/lab.py", line 27, in <module>
    print(dataset.get_local_copy())
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 554, in get_local_copy
    target_folder = self._merge_datasets(
  File "/home/user/.conda/envs/myenv/lib/python3.9/site-p...
`

4 years ago
0 Hi! If There Are Several Tasks Running Concurrently, Which Task Should

I have tried it and it depends on the context. When I call the method inside a function decorated with PipelineDecorator.component, I get the component task, while if I call it inside PipelineDecorator.pipeline, I get the task corresponding to the pipeline. However, as you said, that is not the expected behavior, although I think it makes sense.
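Assuming the method in question is Task.current_task(), a small sketch of what I tried (project and task names are made up):
`
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["task_id"])
def which_task():
    # inside a component this resolves to the component's own task
    return Task.current_task().id

@PipelineDecorator.pipeline(name="probe", project="examples", version="0.1")
def pipeline():
    component_id = which_task()
    # here it resolves to the pipeline controller's task instead
    print("pipeline task:", Task.current_task().id, "component task:", component_id)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # execute everything in the local process
    pipeline()
`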

4 years ago
0 Hi! Can Someone Show Me An Example Of How

I see the point. The reason I'm using PipelineController now is that I've realised that in the code I only send IDs from one step of the pipeline to another, and not artefacts as such. So I think it makes more sense in this case to work with the former.
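A sketch of what I mean by sending IDs between steps, assuming the ${step_name.id} reference resolves as documented (task names are made up):
`
from clearml import PipelineController

pipe = PipelineController(name="id_passing", project="examples", version="0.1")
pipe.add_step(
    name="step_one",
    base_task_project="examples",
    base_task_name="produce data",   # assumed template task
)
pipe.add_step(
    name="step_two",
    parents=["step_one"],
    base_task_project="examples",
    base_task_name="consume data",   # assumed template task
    # pass the previous step's task ID instead of an artifact
    parameter_override={"General/source_task_id": "${step_one.id}"},
)
pipe.start()
`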

3 years ago
0 What Is The Recommended Way To Stop The Execution Of A Specific Agent? This Command Doesn't Allow Me To Specify The Agent IP I Want To Stop:

After doing so, the agent is removed from the list provided by ps -ef | grep clearml-agent, but it is still visible in the ClearML UI and also when I run clearml-agent list.

4 years ago
0 Why

I just placed tagging code before Task.execute_remotely() and now it works. Thank you! 🙂
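For anyone landing here, the ordering I mean looks roughly like this (project, task, and tag names are made up):
`
from clearml import Task

task = Task.init(project_name="examples", task_name="tagged run")  # assumed names
task.add_tags(["my-tag"])                    # tag while the task is still local
task.execute_remotely(queue_name="default")  # then hand off to an agent
`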

3 years ago
0 Hi! Is There Any Reason Why Integer/Float Values Are Casted To String When Connecting Arguments Dictionary To Task And Then Retrieve Them Using

Well, I need to write boilerplate code to do parsing stuff if I want to use the original values after I connect the dictionary to the task, so it's a bit messy.
Currently I'm using clearml v1.0.5 and clearml-agent v1.0.0
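The boilerplate I mean looks roughly like this (names and values are made up):
`
from clearml import Task

task = Task.init(project_name="examples", task_name="params")  # assumed names
params = {"lr": 0.001, "epochs": 10}
task.connect(params)

# after connect() the values may come back as strings, so I have to cast
# everything explicitly before using it
lr = float(params["lr"])
epochs = int(params["epochs"])
`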

4 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Yes, I'm working with the latest commit. Anyway, I have tried to run dataset.get_local_copy() on another machine and it works. I have no idea why this happens. However, on the new machine get_local_copy() does not return the path I expect. If I have this code:
dataset.upload(output_url="/home/user/server_local_storage/mock_storage")
I would expect the dataset to be stored under the path specified in output_url. But what I get with get_local_copy() is the follo...
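For context, the full flow I am running is roughly this (dataset name and file path are made up):
`
from clearml import Dataset

dataset = Dataset.create(dataset_name="my_dataset", dataset_project="examples")  # assumed names
dataset.add_files("/home/user/data")  # assumed path
dataset.upload(output_url="/home/user/server_local_storage/mock_storage")
dataset.finalize()

# retrieval goes through the local cache, which is why the returned path
# is not the one given in output_url
local_path = Dataset.get(dataset_id=dataset.id).get_local_copy()
print(local_path)
`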

4 years ago
0 Hi, I Am Having Difficulties When Using The Dataset Functionality. I Am Trying To Create A Dataset With The Following Simple Code:

Mmm, but what if the dataset is too large to be stored in the .cache path? Will it be stored there anyway?

4 years ago
0 Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 BTW, I got the notification from GitHub telling me you had committed the fix and I went ahead. After testing the code again, I see the task parameter dictionary has been removed properly (now it has been broken down into flat parameters). However, I still have the same problem with duplicate tasks, as you can see in the image.

3 years ago
0 Let's Say That I Specify The

Sure, but I mean, apart from labelling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

4 years ago
0 Hi All, I Am Testing The New

By the way, where can I change the default artifacts location (output_uri) if I have a script similar to this example (I mean, from the code, not the agent's config):
https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py
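What I ended up looking at: Task.init accepts an output_uri argument, so from code it would look roughly like this (names and the bucket are made up, and I am not sure where this call best fits in the decorator example):
`
from clearml import Task

# setting output_uri at init redirects artifact/model uploads for this task
task = Task.init(
    project_name="examples",                # assumed
    task_name="artifacts location",         # assumed
    output_uri="s3://my-bucket/artifacts",  # assumed destination
)
`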

4 years ago
0 Hi! Can Someone Show Me An Example Of How

Exactly!! That's what I was looking for: creating the pipeline but not launching it. Thanks again AgitatedDove14

3 years ago
0 When ClearML Converts A

BTW, I would like to mention another problem related to this I have encountered. It seems that arguments of type 'int', 'float' or 'list' (maybe also happens with other types) are transformed to 'str' when passed to a function decorated with PipelineDecorator.component at the time of calling it in the pipeline itself. Again, is this something intentional?
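A minimal repro of what I am seeing would look like this (names are made up):
`
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["kinds"])
def inspect(n: int, ratio: float, items: list):
    # reportedly these may arrive as 'str' instead of the declared types
    return [type(n).__name__, type(ratio).__name__, type(items).__name__]

@PipelineDecorator.pipeline(name="type_check", project="examples", version="0.1")
def pipeline():
    print(inspect(5, 0.5, [1, 2]))

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    pipeline()
`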

4 years ago
0 Let's Say That I Specify The

Now it's okay. I have found a more intuitive way to work around it. I was facing the classic 'XY' problem :)

4 years ago
0 Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Well, I see the same utility as it has in the first generation of pipelines. After all, isn't the new decorator about keeping the same functionality while saving the user some boilerplate code?

4 years ago