AgitatedDove14
Moderator
48 Questions, 8049 Answers
  Active since 10 January 2023
  Last activity 6 months ago

Reputation: 0
Badges: 25 × Eureka!
0 Hi. I Have A

Are you saying this component should pull a specific git repo?
PipelineDecorator.component(...) seems like there is no reference to a specific repo (the repo and repo_branch arguments, etc., are missing), is that correct?
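For reference, a minimal sketch of pinning a component to a repo, assuming a clearml version where PipelineDecorator.component() accepts repo/repo_branch; the URL and function are placeholders:

from clearml import PipelineDecorator

# Sketch only: run this component's code from a specific repo/branch.
@PipelineDecorator.component(
    return_values=["result"],
    repo="https://github.com/<org>/<repo>.git",  # placeholder URL
    repo_branch="main",
)
def preprocess(value):
    return value * 2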

2 years ago
0 Hi Everyone! I'm A ClearML Newbie Trying It Out In My Local Environment With The Docker Compose Installation Described Here:

Hi @<1668065560107159552:profile|VivaciousPenguin20>
I think you are looking at the wrong experiment, this is a 3-year-old experiment, no? It does not seem to be your currently executed experiment, right?

8 months ago
0 Hi, I Am Getting The Following Error While Trying To Check Out A GitHub Repo. Error: Rpc Failed; Curl 56 Gnutls Recv Error (-54): Error In The Pull Function. Fatal: The Remote End Hung Up Unexpectedly Fatal: Early Eof Fatal: Index-Pack Failed Repository Cloni

Hmm, you are missing the entry point in the execution section (script path).
Also, as I mentioned, you can either have a git repo or a script in the uncommitted changes, but not both (if you have a git repo, then the uncommitted changes are the git diff)

4 years ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Hi @<1529633468214939648:profile|CostlyElephant1>
what seems to be the issue? I could not locate anything in the log

"Environment setup completed successfully
Starting Task Execution:"

Do you mean it takes a long time to setup the environment inside the container?

CLEARML_AGENT_SKIP_PIP_VENV_INSTALL and CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL,

It seems to be working, as you can see no virtual environment is created; the only thing that is installed is the clearml-agent that i...
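For context, a minimal sketch of how these variables might be set when launching the agent inside a pre-built container; the queue name is a placeholder:

import os
import subprocess

# Sketch: skip python environment setup and use the container's python as-is.
env = dict(os.environ)
env["CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL"] = "1"
# alternatively, point the agent at an existing python instead of creating a venv:
# env["CLEARML_AGENT_SKIP_PIP_VENV_INSTALL"] = "/usr/bin/python3"
subprocess.run(["clearml-agent", "daemon", "--queue", "default"], env=env, check=True)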

8 months ago
0 Hi Again, I Am Trying To Execute A Pipeline Remotely, However I Am Running Into A Problem With The Steps That Require A Local Package. Basically I Have A Repo, That I Created Specifically For This Pipeline And I Have Packaged It So That I Can Split It I

I would just add git+ None to your requirements (either in the requirements.txt, or even better as part of the pipeline/component where you also specify the repo to be used).
The agent will automatically inject the credentials when it installs the repo as a wheel.
wdyt?
btw: you might also get away with adding -e . to the requirements.txt (but you will need to test that one)
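A rough sketch of the component variant, with a placeholder URL standing in for the scrubbed link above:

from clearml import PipelineDecorator

# Sketch: install the pipeline's own package from git into the component's
# environment, alongside the (placeholder) repo the component runs from.
@PipelineDecorator.component(
    packages=["git+https://github.com/<org>/<repo>.git"],
    repo="https://github.com/<org>/<repo>.git",
    repo_branch="main",
)
def step_one(data):
    ...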

7 months ago
0 Hi, Recently Came Across Trains And Very Impressed By The Work So Far. But A Problem Has Been Bugging Me, This Is Part Of The Trains Log Files I Thought Might Be Useful From Cloning And Enqueuing The Same Task On 2 Remote Machines. The First Machine Defau

AntsySeagull45 kudos on sorting it out 🙂
quick note: trains-agent will try to run the python version specified by the original Task, i.e. if you were running python3.7 it will first look for python3.7, and if it is not there it will run the default python3. This allows a system with multiple python versions to run exactly the python version you had on your original machine. The fact that it was trying to run python2 is quite odd; one explanation I can think of is if the original e...

4 years ago
0 Hi Again, I Am Trying To Execute A Pipeline Remotely, However I Am Running Into A Problem With The Steps That Require A Local Package. Basically I Have A Repo, That I Created Specifically For This Pipeline And I Have Packaged It So That I Can Split It I

Hi @<1523701168822292480:profile|ExuberantBat52>

I am trying to execute a pipeline remotely,

How are you creating your pipeline? And are you referring to an issue with the pipeline logic, or is it a component that needs that repo installed?

7 months ago
0 I Have A Situation Where I'd Like To "Promote" The Pipeline (And Dataset) By Creating It In A Completely Separate Instance Of ClearML Server Which Is Used For Production Retraining (Vs. The Dev ClearML Server That Is Used For Experiments) A) Is This Some

Hi RoughTiger69
A. Yes, makes total sense. Basically you can use Task.export / Task.import to achieve this process (notice we assume the dataset artifact links are available on both; usually this is the case).

B. The easiest way would be to use subprocesses: one subprocess exports from dev, with the credentials and configuration passed via OS environment variables. Another subprocess imports it to the prod server (again with OS environment variables pointing to the prod server). Make sense?
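A rough sketch of that two-subprocess flow, assuming the Task.export_task() / Task.import_task() API hinted at above; server URLs, credentials, and the task ID are placeholders:

import os
import subprocess
import sys

EXPORT = """
import json, sys
from clearml import Task
data = Task.get_task(task_id=sys.argv[1]).export_task()
json.dump(data, open(sys.argv[2], "w"))
"""

IMPORT = """
import json, sys
from clearml import Task
Task.import_task(json.load(open(sys.argv[1])))
"""

# Each subprocess talks to its own server via the environment (add the matching
# CLEARML_API_ACCESS_KEY / CLEARML_API_SECRET_KEY for each server as well).
dev_env = dict(os.environ, CLEARML_API_HOST="https://dev-server:8008")
prod_env = dict(os.environ, CLEARML_API_HOST="https://prod-server:8008")

subprocess.run([sys.executable, "-c", EXPORT, "<task-id>", "task.json"], env=dev_env, check=True)
subprocess.run([sys.executable, "-c", IMPORT, "task.json"], env=prod_env, check=True)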

2 years ago
0 Hi, Recently Came Across Trains And Very Impressed By The Work So Far. But A Problem Has Been Bugging Me, This Is Part Of The Trains Log Files I Thought Might Be Useful From Cloning And Enqueuing The Same Task On 2 Remote Machines. The First Machine Defau

One more question: in the second log, trains-agent is configured with Conda, while in the first it is configured with pip, or at least this is what it looks like. Can you confirm?

4 years ago
0 If I Have A Task And A Dataset Is Being Created In A Task, How Can I Get A “Link” That This Dataset Is Created In This Task, Similar To How Model Has The Task Where It Came From

Also, the IDs as an entry in the Configuration will not be clickable in the web interface, right?

No, but on the other hand, it will be editable if you clone the Task.
Which brings me to a different scenario,
In the original one, the Main Task created the Dataset, i.e. Output Dataset (and stored it both ways).
I could think of a situation where the Task is using the Dataset as input (say preprocessing or training); then we might want to enable users to clone and change the input dataset. wdyt?

3 years ago
0 If I Have A Task And A Dataset Is Being Created In A Task, How Can I Get A “Link” That This Dataset Is Created In This Task, Similar To How Model Has The Task Where It Came From

Regarding the first direction, this was just pushed 🙂
https://github.com/allegroai/clearml/commit/597a7ed05e2376ec48604465cf5ebd752cebae9c

Regarding the opposite direction:
That is a good question, I really like the idea of just adding another section named Datasets.
SucculentBeetle7 should we do that automatically?

3 years ago
0 If I Have A Task And A Dataset Is Being Created In A Task, How Can I Get A “Link” That This Dataset Is Created In This Task, Similar To How Model Has The Task Where It Came From

Basically you create the Task and make sure the "Dataset" is attached to it:
task = Task.init(...)
dataset = Dataset.create(task=task)
dataset.add_files(...)
This will make sure the code is attached to the Dataset.

3 years ago
0 If I Have A Task And A Dataset Is Being Created In A Task, How Can I Get A “Link” That This Dataset Is Created In This Task, Similar To How Model Has The Task Where It Came From

So you want to have two Tasks and connect the two?
Maybe the best approach is to make the current_task the parent of the Dataset Task?
dataset._task.set_parent(Task.current_task())

3 years ago
0 If I Have A Task And A Dataset Is Being Created In A Task, How Can I Get A “Link” That This Dataset Is Created In This Task, Similar To How Model Has The Task Where It Came From

Seems like passing the Task object is not working as expected (I'll make sure it is fixed).
Try:
dataset._task.set_parent(Task.current_task().id)
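Putting the thread together, a sketch of the working version using the newer Dataset.create() keyword arguments (project/dataset names are placeholders, and note that dataset._task is internal API):

from clearml import Dataset, Task

task = Task.init(project_name="examples", task_name="create dataset")
dataset = Dataset.create(dataset_name="my_dataset", dataset_project="examples")
dataset.add_files("data/")
# link the Dataset's backing task to the current task
dataset._task.set_parent(Task.current_task().id)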

3 years ago
0 Kindly Let Me Know How To Change The Hyper-Parameter Value?

So now, for it to take place, you need to enqueue the Task and set an agent to pick it up and run it.
When the agent is running the Task, the new parameter will be passed.
does that make sense?
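A minimal sketch of that flow (the task ID, parameter name, and queue name are placeholders):

from clearml import Task

template = Task.get_task(task_id="<template-task-id>")
cloned = Task.clone(source_task=template, name="clone with new value")
cloned.set_parameter("General/learning_rate", 0.01)  # placeholder hyperparameter
Task.enqueue(cloned, queue_name="default")  # an agent listening on "default" runs it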

one year ago
0 Hi, Is A Remote Task Execution With Azure DevOps Private Repository Working? I Am Hitting A Problem Where ClearML Is Creating A Wrong Url For Loading:

is it also possible to somehow propagate ssh keys to the agent pod? Not sure how to approach that

I would use the k8s secret manager for that (there is a way to mount secret files into a pod, and mounting SSH keys this way is relatively standard).

one year ago
0 I Am Using Pipelines (Just Starting) And I Am Checking Different Options For Overriding Parts Of Configuration Of The Base Task (Step Of My Pipeline). In The Docs For Parameter_Override One Can Find:

Hi UpsetTurkey67

"General/my_parameter_name" so that only this part of the configuration will be updated?

I'm assuming this is a hyperparameter, not a configuration object (i.e. task.connect, not task.connect_configuration); if this is the case, then yes 🙂
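For illustration, a sketch with a controller-based pipeline (project and task names are placeholders):

from clearml import PipelineController

pipe = PipelineController(name="my pipeline", project="examples", version="1.0.0")
pipe.add_step(
    name="train",
    base_task_project="examples",
    base_task_name="base training task",
    parameter_override={"General/my_parameter_name": 42},  # only this value is replaced
)
pipe.start()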

2 years ago
0 Hi Again, Is There A Way To Pass Secrets As Parameters Of A Task? I Have An Experiment That Requires Connecting To A Database, And I Need To Be Able To Pass The Creds As Task Params (Or In Another Way, I Don't Know Yet). But I Don't Want To Expose My Cred

I'm sorry JitteryCoyote63, no 😞
I do know that the enterprise edition has these features (a.k.a. vault & permissions), basically to answer these types of situations.

3 years ago
0 Hi, I'm Trying To Clone And Queue Experiments For Running Them On My Workers. I Am Able To Successfully Clone And Queue The Task, But Seems Like The Task Does Not Pass The Correct Parameters To My Python Script On The Worker. We Use Hydra For Configuring

I just cloned it from the examples that are available in the SaaS console upon account creation

Ohhh! That would explain it. Maybe it is broken there?! Let me check, one second.

2 years ago
0 Hi, I'm Trying To Clone And Queue Experiments For Running Them On My Workers. I Am Able To Successfully Clone And Queue The Task, But Seems Like The Task Does Not Pass The Correct Parameters To My Python Script On The Worker. We Use Hydra For Configuring

The package detection is done when running the code on your laptop, and this is when it first logs the packages and versions. On that note, what do you have on your laptop? OS/Conda/Python?

2 years ago
0 Hi, I'm Trying To Clone And Queue Experiments For Running Them On My Workers. I Am Able To Successfully Clone And Queue The Task, But Seems Like The Task Does Not Pass The Correct Parameters To My Python Script On The Worker. We Use Hydra For Configuring

Oh no, you are absolutely correct, it is broken (I mean, I have no idea why it lists Hydra, or how it got there). I will let the guys know and fix it.
Bottom line: after you clone it, please edit the installed packages, remove the "Hydra" line, and replace it with just "hydra-core" (no need for a version).
The format is the same as "requirements.txt" and will affect the venv created by the agent.

2 years ago