GiganticTurtle0
Moderator
46 Questions, 183 Answers
Active since 10 January 2023
Last activity 2 years ago

Reputation: 0
Badges (1): 183 × Eureka!
0 Votes 11 Answers 2K Views
Hi! I was wondering why ClearML recognizes Scikit-learn scalers as Input Models... Am I missing something here? For me it would make sense to include the scal...
4 years ago
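For reference, ClearML's framework auto-logging is what registers joblib/scikit-learn saved objects (scalers included) as models. A minimal sketch of switching that off for scikit-learn only; the "scikit" key is from memory, so verify it against your ClearML version:

```python
from clearml import Task

# hedged sketch: disable scikit-learn auto-logging so saved scalers are not
# registered as Input/Output Models (the "scikit" key is an assumption)
task = Task.init(
    project_name="Examples",
    task_name="no sklearn autolog",
    auto_connect_frameworks={"scikit": False},
)
```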
0 Votes 9 Answers 2K Views
Is it good practice to call a function decorated with PipelineDecorator in a for loop? I tried it in a real-world example and I didn't get the results I expe...
4 years ago
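For context, a minimal sketch of the pattern this question describes; all names are made up, and whether each loop iteration becomes a separate pipeline step is exactly what is being asked:

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["doubled"], cache=False)
def double(x: int):
    # each call made from the pipeline body is expected to become its own step
    return 2 * x

@PipelineDecorator.pipeline(name="loop demo", project="Examples", version="0.1")
def run_pipeline():
    # calling the component inside a for loop
    for i in range(3):
        print(double(i))

if __name__ == "__main__":
    # run all steps in the local process for quick testing
    PipelineDecorator.run_locally()
    run_pipeline()
```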
0 Votes 13 Answers 2K Views
When ClearML converts a PipelineDecorator.component decorated function to script code, I have noticed that indexing syntax like A[:, 0] is rewritten as A[(:,...
4 years ago
0 Votes 29 Answers 2K Views
Hi, I am having difficulties when using the Dataset functionality. I am trying to create a dataset with the following simple code: from clearml import Task, ...
4 years ago
0 Votes 8 Answers 2K Views
3 years ago
0 Votes 14 Answers 2K Views
Hi! Can someone show me an example of how PipelineController.create_draft works? I'm trying to create a template of a pipeline to run it later but I can't ge...
3 years ago
0 Votes 3 Answers 2K Views
I have another question regarding creating a Task with PipelineDecorator.component . Where can I specify the reuse_last_task_id parameter? I need to set it t...
4 years ago
0 Votes 7 Answers 2K Views
Hi! If there are several tasks running concurrently, which task should Task.current_task() return?
4 years ago
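A minimal sketch of the call in question, assuming a single process that called Task.init (names are made up):

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="current task demo")

# in the process that called Task.init, current_task() returns that main task
assert Task.current_task().id == task.id
```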
0 Votes 6 Answers 2K Views
Hi all! Let's say I have two functions decorated with PipelineDecorator.pipeline . Then I have a set of functions decorated with PipelineDecorator.component ...
4 years ago
0 Votes 2 Answers 2K Views
Why does the Task.add_tags method have no effect when running remotely? What if I want to tag a step based on a parameter passed to the pipeline through PipelineContro...
3 years ago
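A minimal sketch of the call being discussed (tag values made up); locally this is expected to work, and the question is about the remote case:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="tagging demo")

# reported to work locally but to have no effect when run by an agent
task.add_tags(["nightly", "demo"])
```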
0 Votes 10 Answers 2K Views
Hi! Is there any reason why integer/float values are cast to strings when connecting an arguments dictionary to a task and then retrieving them using task.get_para...
4 years ago
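A minimal repro sketch of the reported behavior, with made-up parameter names:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="param types demo")

args = {"epochs": 10, "lr": 0.001}
task.connect(args)

# connected dicts are flattened under the "General" section; the values may
# come back as strings (e.g. "10"), which is the behavior described above
print(task.get_parameters())
```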
0 Votes 3 Answers 2K Views
4 years ago
0 Votes 1 Answer 2K Views
Is there any similar functionality for the PipelineController class that resembles the behavior of task.execute_remotely() (no arguments supplied)? I mean ju...
4 years ago
0 Votes 1 Answer 2K Views
Is there any way to create a queue from code?
4 years ago
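For reference, one way this is commonly done is through the backend APIClient rather than the Task SDK. A hedged sketch; the queue name is made up and the client methods are worth verifying against your server version:

```python
from clearml.backend_api.session.client import APIClient

client = APIClient()
client.queues.create(name="my_new_queue")  # hypothetical queue name

# list queues to verify the creation
for queue in client.queues.get_all():
    print(queue.name)
```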
0 Votes 2 Answers 2K Views
4 years ago
0 Votes 14 Answers 2K Views
Hi all, I am testing the new PipelineDecorator feature. Is there any way to automatically detect the Git repository in the pipeline step decorated with Pipel...
4 years ago
0 Votes 10 Answers 2K Views
Hi, is there a simple way to make Task.init compatible with the Dask.distributed client? When I try to run a script where I want to concurrently read a dataset i...
4 years ago
0 Votes 5 Answers 2K Views
Hi, can anyone help me with this code? (just a mock example, but it nicely captures the behavior of the real code) import pandas as pd from clearml import Ta...
3 years ago
0 Votes 0 Answers 2K Views
Hi, Let's say I have several functions decorated with PipelineDecorator.component (functions A, B and C). Function C can only be executed after functions A a...
4 years ago
0 Votes 3 Answers 2K Views
Hello, I have the following basic snippet where I'm trying to add another value to the Task's connected arguments after calling task.connect(args). Script e...
4 years ago
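A minimal sketch of the situation described, with made-up values; re-connecting the dict after mutating it is one possible workaround, not necessarily the intended API:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="connect then extend")

args = {"lr": 0.001}
task.connect(args)

# a key added after connect() is not necessarily tracked automatically;
# re-connecting the same dict is one way to register the new value
args["batch_size"] = 32
task.connect(args)
```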
0 Votes 11 Answers 2K Views
Let's say that I specify the output_uri parameter in Task.init like this: task = Task.init( project_name="example_project", task_name="example_task", output_...
4 years ago
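For reference, the snippet the question refers to looks roughly like this; the bucket URI is a placeholder:

```python
from clearml import Task

# output_uri redirects model checkpoints and artifacts to the given storage
# target instead of the default files server
task = Task.init(
    project_name="example_project",
    task_name="example_task",
    output_uri="s3://my-bucket/clearml",  # placeholder bucket
)
```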
0 Votes 1 Answer 2K Views
Is it possible to attach to an OutputModel an object closely related to it (such as some product of data preprocessing that has been done specifically for that mo...
4 years ago
0 Votes 6 Answers 2K Views
4 years ago
0 Votes 1 Answer 2K Views
3 years ago
0 Votes 12 Answers 2K Views
4 years ago
0 Votes 1 Answer 2K Views
Hi everybody, Where can I find the documentation about the new TaskScheduler feature?
4 years ago
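A hedged sketch of TaskScheduler usage based on the public SDK; the task ID and queue names are placeholders, and argument names may differ between versions:

```python
from clearml.automation import TaskScheduler

scheduler = TaskScheduler()

# clone-and-enqueue a template task every day at 07:00
scheduler.add_task(
    schedule_task_id="<template task id>",  # placeholder
    queue="default",
    hour=7,
    minute=0,
)

# run the scheduler itself as a task on the services queue
scheduler.start_remotely(queue="services")
```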
0 Votes 6 Answers 2K Views
Hi, I have a question regarding the new PipelineDecorator feature and it's about how to access the task created by PipelineDecorator.pipeline through its ID ...
4 years ago
0 Votes 7 Answers 2K Views
Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using get_parameters and set_parameters ...
4 years ago
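A minimal sketch of the clone-and-overwrite workflow being described; the task ID, parameter name, and queue are placeholders:

```python
from clearml import Task

template = Task.get_task(task_id="<template task id>")  # placeholder ID
cloned = Task.clone(source_task=template, name="clone with new params")

# read, modify, and write back the flattened parameter dict
params = cloned.get_parameters()
params["General/lr"] = 0.01  # assumes the value lives under General
cloned.set_parameters(params)

Task.enqueue(cloned, queue_name="default")
```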
0 Votes 10 Answers 2K Views
Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the follow...
3 years ago
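A minimal repro sketch of the described behavior, with a made-up parameter:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="list param demo")

params = {"layers": [64, 32, 16]}
task.connect(params)

# the list may come back serialized as the string "[64, 32, 16]"
retrieved = task.get_parameters()["General/layers"]
print(type(retrieved), retrieved)
```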
0 Votes 6 Answers 2K Views
Hi, Is there any reason why artifacts linked to a task are not removed when the task is removed from the experiment list?
4 years ago
0 Hi all! I noticed when a pipeline fails, all its components continue running. Wouldn't it make more sense for the pipeline to send an abort signal to all tasks that depend on the pipeline? I'm using ClearML v1.1.3rc0 and clearml-agent 1.1.0

Well, I see the same utility as it has in the first generation of pipelines. After all, isn't the new decorator about keeping the same functionality while saving the user some boilerplate code?

4 years ago
0 Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following error?

AgitatedDove14 After checking, I discovered that apparently it doesn't matter if each pipeline is executed by a different worker; the error persists. Honestly, this has me puzzled. I'm really looking forward to getting this functionality right because it's an aspect that would make ClearML shine even more.

3 years ago
0 Regarding the new version 1.1.2, I have noticed type hints are now included in the script generated by

Sure, converting pipelines into components also works for me (ignoring still having to fix the problem with LazyEvalWrapper return values). But this way some interesting features of the pipeline are missing, such as displaying the step execution DAG in the PLOTS tab.

4 years ago
0 Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it:

To sum up, we agree that it would be nice to enable tags for nested components. I will continue playing with the capabilities of nested components and keep reporting bugs as I come across them!

4 years ago
0 Hi all, I am testing the new

I am aware of the option to enable virtual environment caching, but that is still very time-consuming.

4 years ago
0 Hi all, I am testing the new

Sure, it's already enabled. I noticed in the ClearML agent configuration another parameter related to environment caching, named venv_update (I believe it's still in beta). Do you think enabling this parameter significantly helps to build environments faster?

Yes, I guess so. Since pipelines are designed to be executed remotely, it may be pointless to enable an output_uri parameter in PipelineDecorator.component. Anyway, could another task be initialized in the same scr...

4 years ago
0 Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following error?

AgitatedDove14 I have the strong feeling it must be an agent issue, because when I place PipelineDecorator.run_locally() before calling the pipeline, everything works perfectly. See:

3 years ago
0 Hi, can anyone help me with this code? (Just a mock example, but it nicely captures the behavior of the real code)

Hey CostlyOstrich36 AgitatedDove14 ! Any news on this? Should I open an issue?

3 years ago
0 Hello folks! I don't know if this issue has already been addressed. I have a basic PipelineController script with two steps: one of the tasks is for preprocessing purposes and the other for training a model. Currently I am placing the code related to the pack

I mean that I have a script for a data preprocessing task where I need the following dependencies:

```python
import sys
from pathlib import Path
from contextlib import contextmanager

import numpy as np
from clearml import Task

# add_temporary_module_search_path is a helper context manager defined
# elsewhere in the script (hence the contextmanager import above)
with add_temporary_module_search_path("/home/user/myclearML/"):
    from helpers import (
        read_netcdf_dataset,
        write_records,
    )
```

However, the xarray package is a dependency of the helpers module, which is required by the read_netcdf_dataset ...

4 years ago
0 Hi all, I am faced with the situation that my company's GitLab is temporarily out of service at a certain time in the early morning (due to regular maintenance service, something I cannot control). Normally, my system's inference pipelines are scheduled t

Mmm, well, I can imagine a pipeline that saves its state at the instant before the error occurs, so that with some crontab/scheduler the pipeline could be resumed from the point where it stopped if it never completed. Is there any functionality like this? Something like PipelineDecorator/PipelineController.resume_from(state_filepath)?

4 years ago
0 Hi all! When I set a list as a Task parameter and later try to retrieve it, what I get is a string. Is this the expected behavior? I have prepared the following snippet so that you can reproduce it.

Thanks AgitatedDove14! Wow, I was definitely not expecting that behavior 🤣 I will check it out tomorrow. Just one more thing, what do you mean by "my_task_id_that_i_generated_before_here"?

3 years ago
0 Hello folks! I don't know if this issue has already been addressed. I have a basic PipelineController script with two steps: one of the tasks is for preprocessing purposes and the other for training a model. Currently I am placing the code related to the pack

When you said clearml-agent initial setup, are you talking about the agent section in clearml.conf or the CLI instructions? If it is the second case, I am starting the agent with the basic command:

clearml-agent daemon --queue default

Are there any other settings I should specify for the agent?

4 years ago
0 Hi all, I am testing the new

How can I tell ClearML that I will use the same virtual environment in all steps, so there is no need to waste time re-installing all packages for each step?

4 years ago
0 What is the recommended way to stop the execution of a specific agent? This command doesn't allow me to specify the agent IP I want to stop:

But how can I reference that exact daemon execution? I tried with the ID but it fails:

clearml-agent daemon AGENT_ID --stop

4 years ago
0 Hi! Can someone show me an example of how

Hi AgitatedDove14, just one last thing before closing the thread. I was wondering what is the use of PipelineController.create_draft if you can't use it to clone and run tasks, as we have seen.

3 years ago
0 Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using

I have only checked it from code.
Exactly, I have followed that same workflow shown in that example. Maybe it has something to do with the dictionary's mutability?

4 years ago
0 Hi! I noticed a bug related to reusing the same component in a pipeline. I have prepared a mock example so that you can reproduce it:

Nested pipelines do not depend on each other. You can think of it as several models being trained or doing inference at the same time, but each one delivering results for a different client. So you don't use the output from one nested pipeline to feed another one running concurrently, if that's what you mean.

4 years ago
0 Is there any reason why doing the following is not possible? Am I doing it right? I want to run a pipeline with different parameters but I get the following error?

What exactly do you mean by that? From VS Code I execute the following script, and then the agents take care of executing the code remotely:
```python
import pandas as pd

from clearml import Task, TaskTypes
from clearml.automation.controller import PipelineDecorator

CACHE = False

@PipelineDecorator.component(
    name="Wind data creator",
    return_values=["wind_series"],
    cache=CACHE,
    execution_queue="data_cpu",
    task_type=TaskTypes.data_processing,
)
def generate_wind(start_date: st...
```

3 years ago
0 Hi, can anyone help me with this code? (Just a mock example, but it nicely captures the behavior of the real code)

CostlyOstrich36 Yes, it happens on the following line, at the time of calling the pipeline:

forecast = prediction_service(config=default_config)

Were you able to reproduce the example?

3 years ago
0 Hi, not sure if I'm doing something wrong or I found a bug. When I try to overwrite some parameters in a cloned task using

Yes, when the parameters that are connected do not have nested dictionaries, everything works fine. The problem comes when I try to do something like this:

```python
from clearml import Task

task = Task.init(project_name="Examples", task_name="task with connected dict")

args = {}
args["period"] = {"start": "2020-01-01 00:00", "end": "2020-12-31 23:00"}

task.connect(args)
```

and the cloning task looks like this:

```python
from clearml import Task

template_task = Task.get_task(task_id="<Your template task id>"...
```

4 years ago
0 Hi all, I am testing the new

Okay, so the idea behind the new decorator is not to group all the defined steps under the same script so that they share the same environment, but rather to simplify the process of creating scripts for each step and avoid manually calling Task.init on those scripts.

Regarding virtual environment creation from the cache, I will keep running benchmarks (from what you say, it might be due to high workload on the servers we use).

So far I've been unlucky in the attempt of clearml recog...

4 years ago