GiganticTurtle0
Moderator
46 Questions, 183 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges: 183 × Eureka!
0 Votes 6 Answers 651 Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline. Then I have a set of functions decorated with PipelineDecorator.component ...
2 years ago
0 Votes 9 Answers 653 Views
Is it good practice to call a function decorated by PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
2 years ago
0 Votes 11 Answers 691 Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
2 years ago
0 Votes 29 Answers 706 Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
2 years ago
0 Votes 5 Answers 681 Views
Hi! From a task created using PipelineDecorator.pipeline, is there any way to get a task ID from the name of the step listed in the table below? My plan is ...
2 years ago
0 Votes 6 Answers 625 Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
2 years ago
0 Votes 10 Answers 645 Views
Hi! Is there any reason why integer/float values are cast to strings when connecting an arguments dictionary to a task and then retrieving them using task.get_para...
2 years ago
0 Votes 6 Answers 608 Views
2 years ago
0 Votes 10 Answers 695 Views
Hi, is there a simple way to make Task.init compatible with the Dask.distributed client? When I try to run a script where I want to concurrently read a dataset i...
2 years ago
0 Votes 1 Answer 610 Views
Is it possible to attach to an OutputModel an object closely related to it (such as some product of data preprocessing that has been done specifically for that mo...
2 years ago
0 Votes 1 Answer 655 Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
2 years ago
0 Votes 0 Answers 734 Views
Hi, Let's say I have several functions decorated with PipelineDecorator.component (functions A, B and C). Function C can only be executed after functions A a...
2 years ago
0 Votes 11 Answers 799 Views
Hi guys, Suppose I have the following script: import numpy as np import pandas as pd from clearml import Task # Import required project dependencies. from tf...
2 years ago
0 Votes 2 Answers 617 Views
Since PipelineDecorator automatically starts the task for you, is there any way to specify arguments to Task.init in the task created for a function decorate...
2 years ago
0 Votes 7 Answers 771 Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
2 years ago
0 Votes 10 Answers 632 Views
Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the follow...
2 years ago
0 Votes 8 Answers 618 Views
2 years ago
0 Votes 10 Answers 630 Views
Is there any example showing how to work with nested pipelines? In my case I have several functions decorated with PipelineDecorator. In a pipeline I call s...
2 years ago
0 Votes 14 Answers 661 Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
2 years ago
0 Votes 18 Answers 674 Views
Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by PipelineDecorator.component in the function arguments....
2 years ago
0 Votes 12 Answers 752 Views
2 years ago
0 Votes 1 Answer 640 Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
2 years ago
0 Votes 30 Answers 650 Views
Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following ...
2 years ago
0 Votes 14 Answers 647 Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
2 years ago
0 Votes 11 Answers 701 Views
What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop: clearml-agent da...
2 years ago
0 Votes 3 Answers 632 Views
I have another question regarding creating a Task with PipelineDecorator.component. Where can I specify the reuse_last_task_id parameter? I need to set it t...
2 years ago
0 Votes 13 Answers 628 Views
When ClearML converts a PipelineDecorator.component decorated function to script code, I have noticed that indexing syntax like A[:, 0] is rewritten as A[(:,...
2 years ago
0 Votes 2 Answers 711 Views
Hello, I was wondering if ClearML offers the option to automatically spin up the clearml-agent again every time the machine where it was being executed as a ...
2 years ago
0 Votes 2 Answers 705 Views
2 years ago
0 Votes 7 Answers 762 Views
2 years ago
When ClearML Converts A

Nice, in the meantime as a workaround I will implement temporary parsing code at the beginning of the step functions.
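
For illustration, a minimal sketch of what such a temporary workaround might look like (the step and parameter names are hypothetical, not the original code):

```python
import ast

# Hypothetical step: cast arguments that may arrive as strings back to
# their intended Python types before using them.
def wind_step(start_date, n_samples):
    if isinstance(n_samples, str):
        n_samples = ast.literal_eval(n_samples)  # e.g. "128" -> 128
    return start_date, n_samples
```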

2 years ago
Hi! I Was Wondering Why ClearML Recognizes Scikit-Learn Scalers As Input Models... Am I Missing Something Here? For Me It Would Make Sense To Include The Scalers As A Configuration Object Of The Trained Model, Not Outside

Oh, I see. This explains the surprising behavior. But what if the Task.init code is created automatically by PipelineDecorator.component? How can I pass arguments to the init method in that case?

2 years ago
Since

For instance, the auto_connect family of arguments.
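
For reference, these arguments exist on a manual Task.init call; whether PipelineDecorator.component exposes a way to forward them was the open question in this thread. A minimal sketch of the manual form:

```python
from clearml import Task

# Manual Task.init with the auto_connect family of arguments.
task = Task.init(
    project_name="Examples",
    task_name="manual task",
    auto_connect_arg_parser=False,   # don't auto-log argparse arguments
    auto_connect_frameworks=False,   # don't auto-log framework objects
    auto_connect_streams=True,       # keep stdout/stderr/logging capture
)
```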

2 years ago
Hi, Not Sure If I'm Doing Something Wrong Or I Found A Bug. When I Try To Overwrite Some Parameters In A Cloned Task Using

Yes, when the parameters that are connected do not have nested dictionaries, everything works fine. The problem comes when I try to do something like this:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="task with connected dict")

args = {}
args["period"] = {"start": "2020-01-01 00:00", "end": "2020-12-31 23:00"}

task.connect(args)
```
and the cloned task is like this:

```python
from clearml import Task

template_task = Task.get_task(task_id="<Your template task id>"...
```
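
The snippet above is cut off in the archive. As a hedged sketch (not the original code), the clone side of this flow typically looks something like:

```python
from clearml import Task

# Illustrative clone-and-override flow; names and parameter keys here are
# assumptions, and the "/"-separated flattening of the connected dict is
# exactly the behavior under discussion in this thread.
template_task = Task.get_task(task_id="<Your template task id>")
cloned = Task.clone(source_task=template_task, name="clone with new period")

params = cloned.get_parameters()
params["General/period/start"] = "2021-01-01 00:00"
cloned.set_parameters(params)

Task.enqueue(cloned, queue_name="default")
```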

2 years ago
When ClearML Converts A

Exactly: when 'extra' has a default value (in this case, 43), the argument preserves its original type. However, when 'extra' is a positional argument, it is converted to 'str'.
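
A minimal sketch reproducing the two cases described (component and argument names are illustrative):

```python
from clearml.automation.controller import PipelineDecorator

# Argument with a default value: its type survives code generation.
@PipelineDecorator.component(return_values=["out"])
def step_with_default(extra: int = 43):
    return extra + 1            # 'extra' stays an int

# Purely positional argument: reportedly arrives as a string, so cast it.
@PipelineDecorator.component(return_values=["out"])
def step_positional(extra):
    return int(extra) + 1       # 'extra' may arrive as "43"
```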

2 years ago
Let's Say That I Specify The

My idea is to take advantage of the capability of getting parameters connected to a task from another task to read the path where the artifacts are stored locally, so I don't have to define it again in each script corresponding to a different task.

2 years ago
Let's Say That I Specify The

Sure, but I mean, apart from labeling it as a local path, what's the point of renaming the original path if my goal is to access it later using the name I gave it?

2 years ago
Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

AgitatedDove14 BTW, I got the notification from GitHub telling me you had committed the fix and I went ahead. After testing the code again, I see the task parameter dictionary has been removed properly (now it has been broken down into flat parameters). However, I still have the same problem with duplicate tasks, as you can see in the image.

2 years ago
Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

What exactly do you mean by that? From VS Code I execute the following script, and then the agents take care of executing the code remotely:
```python
import pandas as pd

from clearml import Task, TaskTypes
from clearml.automation.controller import PipelineDecorator

CACHE = False

@PipelineDecorator.component(
    name="Wind data creator",
    return_values=["wind_series"],
    cache=CACHE,
    execution_queue="data_cpu",
    task_type=TaskTypes.data_processing,
)
def generate_wind(start_date: st...
```

2 years ago
Is There Any Reason Why Doing The Following Is Not Possible? Am I Doing It Right? I Want To Run A Pipeline With Different Parameters But I Get The Following Error?

Yes, although I use both terms interchangeably. The information will actually be contained in JSON files.

2 years ago
Hi! Can Someone Show Me An Example Of How

Hi AgitatedDove14, just one last thing before closing the thread. I was wondering what the use of PipelineController.create_draft is if you can't use it to clone and run tasks, as we have seen.

2 years ago
Hi! Can Someone Show Me An Example Of How

If I have the ability to clone and modify the same task over and over again, in principle I would no longer need the multi_instance support feature from PipelineDecorator.pipeline. Is this correct, or are they different things?
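
For context, a hedged sketch of the feature being compared against, assuming the multi_instance_support flag of recent clearml versions (the pipeline body is made up):

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.pipeline(
    name="example pipeline",
    project="Examples",
    version="1.0",
    multi_instance_support=True,  # allow launching several instances
)
def pipeline_logic(period_start: str = "2020-01-01 00:00"):
    print(f"running from {period_start}")
```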

2 years ago
When ClearML Converts A

Sure, I will post a mock example in a while

2 years ago
Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95!
Yes, the main reason is to unclutter the ClearML Web UI, but also to free up space on our server (mainly due to the large size of the datasets). Once the models are trained, I want to retrain them periodically, and to do so I would like all the data specifications and artifacts generated during training to be linked to the model found under the "Models" section.
What I propose is somehow similar to the functionality of clearml.Dataset . These datasets are themselves a task t...

2 years ago
Hi! I Am Implementing A Cleanup Service. After Completing Several Training Tasks, I Am Only Interested In The Trained Models And Some Artifacts Resulting From The Training Process (Such As Scalers, Etc.). Therefore, I Would Like To Remove All The Tasks Th

Hi AnxiousSeal95!
That's it. My idea is that artifacts can be linked to the model. Typically these artifacts are links to serialized objects (such as datasets or scalers). They are usually directories or temporary files on mounted units that I want to upload as task artifacts and then remove (as they are temporary), so that later I can get a new local path via task.artifacts["scalers"].get_local_copy(). I think this way the model's dependence on the task that created it could be re...
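
A hedged sketch of the artifact round-trip described above (project, task, and file names are hypothetical):

```python
from clearml import Task

# Upload a temporary serialized object as a task artifact...
task = Task.init(project_name="Examples", task_name="training with scalers")
task.upload_artifact(name="scalers", artifact_object="/tmp/scalers.pkl")

# ...then, later (possibly from another process), fetch a fresh local copy.
train_task = Task.get_task(task_id=task.id)
local_path = train_task.artifacts["scalers"].get_local_copy()
```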

2 years ago
Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

Well, instead of plain functions or files I use components because I need some of those steps to run on one machine and some on another. And it works perfectly fine (ignoring some minor bugs like this one). So I'm actually inserting component-decorated functions into the 'helper_functions' parameter.
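
A minimal sketch of that pattern (the functions themselves are made up):

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["clean"])
def preprocess(raw):
    return [x for x in raw if x is not None]

# A component-decorated function passed through 'helper_functions' so the
# generated step script can call it directly.
@PipelineDecorator.component(
    return_values=["total"],
    helper_functions=[preprocess],
)
def train(raw):
    clean = preprocess(raw)
    return sum(clean)
```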

2 years ago
Hi! I Noticed A Bug Related To Reusing The Same Component In A Pipeline. I Have Prepared A Mock Example So That You Can Reproduce It:

Can you think of any other way to launch multiple pipelines concurrently? Since we have already seen it is only possible to run a single PipelineController in a single Python process.

2 years ago
Hi, Is There A Simple Way To Make

For sure! Excluding some parts related to preprocessing, this is the code I would like to parallelize with dask.distributed.Client.

```python
from typing import Any, Dict, List, Tuple, Union
from pathlib import Path

import xarray as xr
from clearml import Task
from dask.distributed import Client, LocalCluster

def start_dask_client(
    n_workers: int = None, threads_per_worker: int = None, memory_limit: str = "2Gb"
) -> Client:
    cluster = LocalCluster(
        n_workers=n_workers,
        ...
```

2 years ago
Hi, Is There A Simple Way To Make

I see, but I don't understand the part where you talk about passing the task ID to the child processes. Sorry if it's something trivial. I recently started working with ClearML.

2 years ago
Hi, Is There A Simple Way To Make

Yep, you were absolutely right. What Dask did not like was the object self.preprocesser inside read_and_process_file, not Task.init. Since the dask.distributed.Client is initialized in that same class, maybe it's something that Dask doesn't allow.

Sorry for blaming ClearML without solid evidence x)
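
A minimal illustration of that pitfall (hypothetical names, not the original code):

```python
from dask.distributed import Client

class Preprocessor:
    def __init__(self, client: Client):
        # A live Client connection cannot be pickled.
        self.client = client

    def read_and_process_file(self, path: str) -> int:
        return len(path)

# A plain function with picklable arguments serializes fine.
def read_and_process_file(path: str) -> int:
    return len(path)

if __name__ == "__main__":
    client = Client()  # spins up a local cluster
    pre = Preprocessor(client)
    # client.submit(pre.read_and_process_file, "x.nc")  # bound method drags
    # 'self' (and its live Client) through pickling and fails to serialize
    future = client.submit(read_and_process_file, "x.nc")  # works
    print(future.result())
```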

2 years ago
Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

Or maybe you could bundle the parameters that belong to PipelineDecorator.component into a high-level configuration variable (something like PipelineDecorator.global_config?)

2 years ago
Hi All! I Noticed When A Pipeline Fails, All Its Components Continue Running. Wouldn't It Make More Sense For The Pipeline To Send An Abort Signal To All Tasks That Depend On The Pipeline? I'm Using ClearML v1.1.3rc0 And clearml-agent 1.1.0

I totally agree with the PipelineController/decorator part. Regarding the proposal for the component parameter, I also think it would be a good feature, although it might obscure the fact that there will be times when the pipeline fails because a step is intrinsically crucial, so it doesn't matter whether 'continue_pipeline_on_failure' is set to True or False. Anyway, I can't think of a better way to deal with that right now.

2 years ago