VexedCat68
Moderator
60 Questions, 381 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
Badges: 371 × Eureka!
0 Votes · 25 Answers · 2K Views
I'll just ask this question again to get some fresh attention to this. Is there any way to run a pipeline step conditionally? E.g., under a certain condition, e...
3 years ago
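For the conditional-step question above, a minimal sketch, not taken from this thread: PipelineController steps accept a pre_execute_callback, and returning False from it skips the node. The step names and the RUN_TRAINING environment variable are illustrative assumptions.

    # Sketch: skip the "train" step unless RUN_TRAINING is set (assumed convention).
    import os
    from clearml import PipelineController

    def preprocess_step():
        print("preprocessing...")

    def train_step():
        print("training...")

    def maybe_skip_training(pipeline, node, parameters):
        # Returning False tells the controller to skip this node.
        return os.environ.get("RUN_TRAINING", "1") == "1"

    pipe = PipelineController(name="conditional-example", project="examples", version="1.0.0")
    pipe.add_function_step(name="preprocess", function=preprocess_step)
    pipe.add_function_step(
        name="train",
        function=train_step,
        parents=["preprocess"],
        pre_execute_callback=maybe_skip_training,
    )
    pipe.start_locally(run_pipeline_steps_locally=True)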
0 Votes · 7 Answers · 2K Views
When I create a clearml-dataset from the CLI, I get an ID. The same doesn't happen when I use the Python API. Is there any way to get the ID in Python?
4 years ago
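A rough sketch for the dataset-ID question above, assuming the clearml Python SDK (names and paths are made up): the Dataset object returned by Dataset.create() exposes the same ID the clearml-data CLI prints.

    from clearml import Dataset

    ds = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
    ds.add_files(path="data/")   # illustrative local folder
    ds.upload()
    ds.finalize()
    print(ds.id)                 # dataset ID, equivalent to what the CLI returns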
0 Votes · 7 Answers · 2K Views
3 years ago
0 Votes · 3 Answers · 2K Views
Is there any way to stop all ClearML agent workers on a machine, or stop workers from the ClearML UI?
3 years ago
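A hedged note on the worker question above: assuming a standard clearml-agent setup, a daemon started on a machine can usually be stopped from that same machine with clearml-agent daemon --stop (optionally naming the queue it was started with); as far as I know, the web UI's Workers page only lists workers and does not stop them.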
0 Votes · 2 Answers · 2K Views
How can I register a JSON file I'm creating as an artifact?
3 years ago
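A small sketch for the JSON-artifact question above, assuming the clearml SDK (file and task names are made up): passing a file path to upload_artifact() registers that file as an artifact of the task.

    import json
    from clearml import Task

    task = Task.init(project_name="examples", task_name="artifact-demo")
    with open("report.json", "w") as f:
        json.dump({"accuracy": 0.93}, f)
    task.upload_artifact(name="report", artifact_object="report.json")  # path -> uploaded file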
0 Votes · 14 Answers · 2K Views
3 years ago
0 Votes · 11 Answers · 2K Views
3 years ago
0 Votes · 10 Answers · 2K Views
4 years ago
0 Votes · 1 Answer · 2K Views
In the configuration section of an experiment, I can see the args I passed via argparse. Is there any way to create sections other than args and tf_define?
3 years ago
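A hedged sketch for the configuration-sections question above (section and key names are illustrative): besides the auto-logged argparse section, connecting a dict with a name shows it as its own hyperparameter section, and connect_configuration() adds a named configuration object.

    from clearml import Task

    task = Task.init(project_name="examples", task_name="config-sections")
    preprocessing = {"resize": 256, "normalize": True}
    task.connect(preprocessing, name="Preprocessing")   # separate hyperparameter section
    task.connect_configuration({"optimizer": "adam"}, name="TrainingConfig")  # configuration object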
0 Votes · 13 Answers · 2K Views
4 years ago
0 Votes · 10 Answers · 2K Views
4 years ago
0 Votes · 4 Answers · 2K Views
I'm trying to run a task on an agent. I've passed the requirements file but it isn't able to install it. The error is in the reply. Help would be appreciated.
3 years ago
0 Votes · 3 Answers · 2K Views
3 years ago
0 Votes · 5 Answers · 2K Views
Would appreciate some help. Getting this Error. ValueError: Node train_model, parameter '${split_dataset.split_dataset_id}', input type 'split_dataset_id' is...
3 years ago
0 Votes · 12 Answers · 2K Views
Is there a quicker way to abort all running experiments in a project? I have over a thousand running anonymous data tasks in a specific project and I want to...
3 years ago
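A rough sketch for the bulk-abort question above, not a verified recipe (project name and the status filter are assumptions): fetch the running tasks of one project and mark them stopped.

    from clearml import Task

    tasks = Task.get_tasks(
        project_name="my_project",
        task_filter={"status": ["in_progress"]},
    )
    for t in tasks:
        t.mark_stopped()   # aborts the run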
0 Votes · 25 Answers · 2K Views
Um, is there a way to delete an artifact from a task that is running?
3 years ago
0 Votes · 10 Answers · 2K Views
3 years ago
0 Votes · 5 Answers · 2K Views
Is it possible to add just a string or some other object as an artifact? If yes, then how?
3 years ago
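A hedged sketch for the string-artifact question above (names are illustrative): arbitrary Python objects can be passed to upload_artifact(); a dict is stored as JSON, and other objects generally fall back to pickle, which is worth verifying for your clearml version.

    from clearml import Task

    task = Task.init(project_name="examples", task_name="string-artifact")
    task.upload_artifact(name="note", artifact_object={"text": "trained on batch 42"})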
0 Votes · 9 Answers · 2K Views
Trying to create a data pipeline on my own. Wanted to ask, for each batch of data, do I have to create a new Dataset Object or do I just create one Dataset O...
4 years ago
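A sketch under assumptions for the batching question above (names and the parent ID are placeholders): one common pattern is a new Dataset per batch that declares the previous version as its parent, so each batch only adds new files on top of the existing lineage.

    from clearml import Dataset

    child = Dataset.create(
        dataset_name="raw_data",
        dataset_project="my_project",
        parent_datasets=["<previous_dataset_id>"],   # hypothetical parent ID
    )
    child.add_files(path="incoming_batch/")
    child.upload()
    child.finalize()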
0 Votes · 3 Answers · 2K Views
In your docs for Dataset at https://clear.ml/docs/latest/docs/references/sdk/dataset#class-dataset , I think you might have duplicate explanations for list_m...
3 years ago
0 Votes · 11 Answers · 2K Views
3 years ago
0 Votes · 2 Answers · 2K Views
Given this pipeline step, is there any way to get the return value outside of the pipeline? Like put split_dataset_id in a variable in the main pipeline file.
3 years ago
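A minimal sketch for the return-value question above, using the PipelineDecorator flavour (function and project names are illustrative): the value a component returns can be captured like a normal function call inside the pipeline function.

    from clearml.automation.controller import PipelineDecorator

    @PipelineDecorator.component(return_values=["split_dataset_id"])
    def split_dataset(raw_dataset_id):
        # ... split logic would go here ...
        return "new-dataset-id"

    @PipelineDecorator.pipeline(name="data-pipeline", project="examples", version="1.0.0")
    def run_pipeline(raw_dataset_id):
        split_dataset_id = split_dataset(raw_dataset_id)   # usable as a variable here
        print(split_dataset_id)

    if __name__ == "__main__":
        PipelineDecorator.run_locally()
        run_pipeline(raw_dataset_id="<raw_dataset_id>")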
0 Votes · 4 Answers · 2K Views
Is it possible to write a text file and see it in the results of the experiment? I want to use it to version data, as in keeping track of what images have been tr...
4 years ago
0 Votes · 3 Answers · 2K Views
When I upload and publish data to clearml-data, it says successful. Now when I try to get it using the ID, I get the following error: __get_tasks() got mu...
4 years ago
0 Votes · 13 Answers · 2K Views
3 years ago
0 Votes · 8 Answers · 2K Views
Is there a direct way to get a model using its id like it works with Dataset.get?
3 years ago
0 Votes · 22 Answers · 2K Views
I've set up my own ClearML server. The only problem is, I can't seem to find where the credentials are. I've attached a screenshot.
4 years ago
0 Votes · 1 Answer · 2K Views
4 years ago
0 Votes · 5 Answers · 2K Views
4 years ago
0 Votes · 5 Answers · 2K Views
3 years ago
0 I Keep Facing This Issue. I'm Trying To Set Up My Own Clearml Server Using This Tutorial.

I think I get what you're saying, yeah. I don't know how I would give each server a different cookie name. I can see this problem being resolved by clearing cookies or manually entering /login at the end of the URL.

4 years ago
0 Should Dataset Triggers Also Be Activated If There Is No Trigger Condition Except Dataset_Project And A New Task Starts In That Project? Is This Expected Behavior?

So in my case, where I schedule a task every time I publish a dataset, when I publish my dataset once, it triggers and starts a new task.

4 years ago
0 Should Dataset Triggers Also Be Activated If There Is No Trigger Condition Except Dataset_Project And A New Task Starts In That Project? Is This Expected Behavior?

I just assumed it should only be triggered by dataset-related things, but after a lot of experimenting I realized it's also triggered by tasks, if the only condition passed is dataset_project and no other specific trigger condition, like on publish or on tags, is added.

4 years ago
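A hedged sketch related to the trigger answers above (task ID, queue, and project names are placeholders): adding an explicit condition such as trigger_on_publish should narrow the trigger to publish events rather than any activity in the dataset project.

    from clearml.automation import TriggerScheduler

    trigger = TriggerScheduler(pooling_frequency_minutes=3)
    trigger.add_dataset_trigger(
        schedule_task_id="<task_to_clone_and_enqueue>",
        schedule_queue="default",
        trigger_project="my_dataset_project",
        trigger_on_publish=True,    # only fire when a dataset is published
        name="retrain-on-publish",
    )
    trigger.start()   # or start_remotely() to run the scheduler as a service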
0 Is There Any Way To Stop All Clearml Agent Workers On A Machine Or Stop Workers From The Clearml Ui?

Basically, there is an agent still listening to a queue on a machine, which I might've started at some point, but I can't seem to stop it.

3 years ago
0 I'm Looking At How Triggers Work In Clearml. Is There An Example, Maybe With Clearml Data And A Dataset Being Uploaded Or Some Other Example?

It works; however, it shows the task is enqueued and pending. Note I am using .start() and not .start_remotely() for now.

4 years ago
0 Is There Any Way To Stop All Clearml Agent Workers On A Machine Or Stop Workers From The Clearml Ui?

Alright, so is there no way to kill it using the worker ID or worker name?

3 years ago
0 When Using Dataset.get_local_copy(), Once I Get The Location, Can I Add Another Folder Inside The Location, Add Some Files In It, Create A New Dataset Object, And Then Do dataset.upload(location)? Should This Work? Or Since It's get_local_copy, I Won't Be Able

Also, since I plan to not train on the whole dataset and instead only on a subset of the data, I was thinking of making each batch of data a new dataset and then just merging the subset of data I want to train on.

3 years ago
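A hedged sketch for the get_local_copy question above (paths and names are illustrative): get_local_copy() returns a cached, read-only copy, so for editing files the SDK also offers get_mutable_local_copy(); a new Dataset with the old one as parent can then pick up the modified folder.

    from clearml import Dataset

    parent = Dataset.get(dataset_id="<parent_dataset_id>")
    work_dir = parent.get_mutable_local_copy(target_folder="working_copy")
    # ... add folders/files inside work_dir here ...

    child = Dataset.create(
        dataset_name="raw_data",
        dataset_project="my_project",
        parent_datasets=[parent.id],
    )
    child.add_files(path=work_dir)
    child.upload()
    child.finalize()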
0 Trying To Create A Data Pipeline On My Own. Wanted To Ask: For Each Batch Of Data, Do I Have To Create A New Dataset Object, Or Do I Just Create One Dataset Object And Add Batches To It? If It's The Latter, Then How?

I'm kind of new to developing end-to-end applications, so I'm also learning how the predefined pipelines work. I'll take a look at the ClearML custom pipelines.

4 years ago
0 Hey Guys, Sorry For The Rapid-Fire Questions In The Past Few Days. I Have Another Issue Though. I Initially Ran A Task Directly From A Repo. It Successfully Installed The Requirements From The Requirements File In The Repo And Ran The Task Without Any Iss

The situation is such that I needed a continuous training pipeline to train a detector, the detector being Ultralytics YOLOv5.

To me, it made sense that I would have a training task. The whole training code seemed complex to me, so I modified it just a bit to fit my needs of it getting the dataset and model from ClearML. Nothing more.

I then created a task using clearml-task and pointed it at the repo I had created. The task runs fine.

I am unsure about the details of the training code...

3 years ago
0 Is The Only Available Resource To Learn And Use Clearml-Serving The Github Repo, Or Are There Other Resources As Well? Also, In The Repo, Once The Model Is Served, It Says I Can Curl To The Endpoint And It Mentions <serving-engine-ip> But I Have No Ide

I have never done something like this before, and I'm unsure about the whole process from successfully serving the model to sending requests to it for inference. Is there any tutorial or example for it?

3 years ago
0 I Keep Facing This Issue. I'm Trying To Set Up My Own Clearml Server Using This Tutorial.

You can see there's no task bar on the left. Basically, I can't get any credentials for the server or check queues or anything.

4 years ago
0 Once I Set Up A ClearML Server On A Machine, I Understand I Need An Agent Listening On A Queue To Run Code. Do I Absolutely Have To Have An Agent And The GPU On The Same Machine As The Server? Also, Do The GPU And Agent Have To Be On The Same Machine?

OK, I'm a bit confused now. Suppose I have an agent listening to some Queue X. If someone else on some other machine enqueues their task on Queue X, will my agent run it?

4 years ago
0 Is There A Direct Way To Get A Model Using Its ID, Like It Works With Dataset.get?

I did this, but it gets me an InputModel. I went through the InputModel class, but I'm still unsure how to get the actual TensorFlow model.

3 years ago
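A hedged sketch for the model question above (the model ID is a placeholder): InputModel is only the registry entry; get_local_copy() downloads the stored weights, which are then loaded with the framework itself, e.g. Keras for a TensorFlow-saved model.

    import tensorflow as tf
    from clearml import InputModel

    model_entry = InputModel(model_id="<model_id>")
    weights_path = model_entry.get_local_copy()        # downloads the model file(s)
    model = tf.keras.models.load_model(weights_path)   # assumes a Keras/TF SavedModel or .h5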
0 I Keep Facing This Issue. I'm Trying To Set Up My Own Clearml Server Using This Tutorial.

I think maybe it does this because of a cache or something. Maybe it keeps a record of an older login, and when you restart the server it keeps trying to use the older details.

4 years ago
0 Is It Possible To Add Just A String Or Some Other Object As An Artifact? If Yes, Then How?

Alright, but is it saved as a text file or pickle file?

3 years ago
0 Once I Set Up A ClearML Server On A Machine, I Understand I Need An Agent Listening On A Queue To Run Code. Do I Absolutely Have To Have An Agent And The GPU On The Same Machine As The Server? Also, Do The GPU And Agent Have To Be On The Same Machine?

Basically, since I want to train AI models, I'm trying to set up an architecture where I can automate the process from data fetching to model training, and I need a GPU for training.

4 years ago
0 When Using Dataset.get_local_copy(), Once I Get The Location, Can I Add Another Folder Inside The Location, Add Some Files In It, Create A New Dataset Object, And Then Do dataset.upload(location)? Should This Work? Or Since It's get_local_copy, I Won't Be Able

My current approach is to watch a folder; when there are sufficient data points, I move N of them into another folder, create a raw dataset, and call the pipeline with this dataset.

It gets downloaded, preprocessed, and then uploaded again.

In the final step, the preprocessed dataset is downloaded and used to train the model.

3 years ago