VexedCat68
Moderator
60 Questions, 381 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0
Badges (1): 371 × Eureka!
0 Votes · 19 Answers · 2K Views
Is this the right way to add a tag to an output model artifact of a task? torch.save(model, 'best.pt') output_model = task.models['output'][-1] outp...
3 years ago
0 Votes · 3 Answers · 2K Views
3 years ago
0 Votes · 7 Answers · 2K Views
Hi guys, needed a bit of clarification. Every 15 minutes, the scheduler prints, "Syncing scheduler, sleeping for 15 minutes until next sync". Can you guide m...
3 years ago
0 Votes · 8 Answers · 2K Views
4 years ago
0 Votes · 13 Answers · 2K Views
4 years ago
0 Votes · 5 Answers · 2K Views
3 years ago
0 Votes · 3 Answers · 2K Views
Is there any way to stop all clearml agent workers on a machine or stop workers from the clearml ui?
3 years ago
0 Votes · 27 Answers · 2K Views
I wanted to ask, how to run pipeline steps conditionally? E.g if step returns a specific value, exit the pipeline or run another step instead of the sequenti...
3 years ago
0 Votes · 12 Answers · 2K Views
I'm on the machine with ClearML Server hosted. Is there any way to see datasets uploaded to ClearML Data without downloading them using ClearML Data?
3 years ago
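
One way to browse registered datasets without pulling the files, sketched with the standard clearml Python SDK; the project name below is only an illustration, not from the original thread:

```python
from clearml import Dataset

# List dataset versions registered in ClearML Data without downloading any files;
# each entry is a metadata dict (id, name, tags, ...).
for ds_info in Dataset.list_datasets(dataset_project="my_datasets"):
    print(ds_info.get("id"), ds_info.get("name"), ds_info.get("tags"))
```
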
0 Votes · 9 Answers · 2K Views
I have a ClearML server deployed. I can see in the docs that one way of doing things is through API calls. The docs mention the endpoint URL but do not me...
3 years ago
0 Votes · 2 Answers · 2K Views
How can I register a JSON file I'm creating as an artifact?
3 years ago
0 Votes · 17 Answers · 2K Views
Should Dataset Triggers also be activated if there is no trigger condition except dataset_project and a new task starts in that project? Is this expected beh...
4 years ago
0 Votes · 13 Answers · 2K Views
Another simple query, guys. I've installed clearml on Ubuntu. However, it says command not found when I run any command with clearml. I feel like it might be...
4 years ago
0 Votes · 0 Answers · 2K Views
Just following instructions from the clearml-serving repo.
3 years ago
0 Votes · 7 Answers · 2K Views
3 years ago
0 Votes · 8 Answers · 2K Views
3 years ago
0 Votes · 10 Answers · 2K Views
Given I want to run a task in a pipeline using a base task id. One of my steps just finds the latest model to use. I want the task to output the id, and the ...
3 years ago
0 Votes · 5 Answers · 2K Views
3 years ago
0 Votes · 5 Answers · 2K Views
4 years ago
0 Votes · 13 Answers · 2K Views
3 years ago
0 Votes · 12 Answers · 2K Views
Is there a quicker way to abort all running experiments in a project? I have over a thousand running anonymous data tasks in a specific project and I want to...
3 years ago
0 Votes · 7 Answers · 2K Views
When I create a clearml-dataset from the CLI, I get an ID. The same doesn't happen when I use the Python API. Is there any way to get the ID in Python?
4 years ago
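
For reference, a minimal sketch (project, dataset name, and folder are made up) showing that the Dataset object created through the SDK exposes the same ID the CLI prints:

```python
from clearml import Dataset

ds = Dataset.create(dataset_project="demo", dataset_name="my_dataset")
ds.add_files(path="./data")  # illustrative local folder
ds.upload()
ds.finalize()
print(ds.id)  # the dataset ID, equivalent to what `clearml-data create` prints
```
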
0 Votes · 3 Answers · 2K Views
In your docs for Dataset at https://clear.ml/docs/latest/docs/references/sdk/dataset#class-dataset , I think you might have duplicate explanations for list_m...
3 years ago
0 Votes · 10 Answers · 2K Views
3 years ago
0 Votes · 10 Answers · 2K Views
4 years ago
0 Votes · 3 Answers · 2K Views
When I upload and publish data to clearml-data, it says successful. Now when I try to get it using the ID, I get the following error. Error: __get_tasks() got mu...
4 years ago
0 Votes · 4 Answers · 2K Views
I'm trying to run a task on an agent. I've passed the requirements file but it isn't able to install it. The error is in the reply. Help would be appreciated.
3 years ago
0 Votes · 12 Answers · 3K Views
I'm training a tensorflow model and saving it in the end. I looked at the OutputModel class. How do I connect the model I'm saving to the OutputModel?
3 years ago
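
A minimal sketch of one way to attach saved weights to a task via OutputModel; the project, task, and file names here are assumptions, not from the thread:

```python
from clearml import Task, OutputModel

task = Task.init(project_name="demo", task_name="tf-training")
# ... build and train the TensorFlow model, then save it, e.g. model.save("my_model.h5")

output_model = OutputModel(task=task, framework="tensorflow")
# Register and upload the saved weights file so it appears as the task's output model
output_model.update_weights(weights_filename="my_model.h5")
```
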
0 Votes · 25 Answers · 2K Views
I'll just ask this question again to get some fresh attention to this. Is there any way to run a pipeline step conditionally? E.g, under certain condition, e...
3 years ago
0 Votes · 7 Answers · 2K Views
3 years ago
0 Votes · When using Dataset.get_local_copy(), once I get the location, can I add another folder inside the location, add some files in it, create a new Dataset object, and then do dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able...

My current approach is to watch a folder; when there are sufficient data points, I move N of them into another folder, create a raw dataset, and call the pipeline with this dataset.

It gets downloaded, preprocessed, and then uploaded again.

In the final step, the preprocessed dataset is downloaded and is used to train the model.

3 years ago
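
For the approach described above, a minimal sketch of registering the moved batch as a new raw dataset before kicking off the pipeline; project, name, and paths are illustrative:

```python
from clearml import Dataset

ds = Dataset.create(dataset_project="demo", dataset_name="raw_batch")
ds.add_files(path="/data/staging_batch")  # the folder the N data points were moved into
ds.upload()
ds.finalize()
raw_dataset_id = ds.id  # hand this ID to the pipeline for download/preprocess/train
```
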
0 Votes · Is it possible to add just a string or some other object as an artifact? If yes, then how?

AgitatedDove14 Just wanted to confirm: what kind of file is the string artifact stored in? A txt file or a pkl file?

3 years ago
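
For context, a minimal sketch of uploading a plain string as an artifact and reading it back; the project, task, and artifact names are made up, and how the string is serialized on disk is exactly what the question above asks:

```python
from clearml import Task

task = Task.init(project_name="demo", task_name="artifact-example")
task.upload_artifact(name="notes", artifact_object="just a plain string")

# Later, from anywhere:
text = Task.get_task(task_id=task.id).artifacts["notes"].get()
print(text)
```
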
0 Votes · How do I get args like epochs to show up in the UI configuration panel under Hyperparameters? I want to be able to change the number of epochs and the learning rate from within the UI.

Basically, when I have to re-run the experiment with different hyperparameters, I should clone the previous experiment and change the hyperparameters before putting it in the queue?

4 years ago
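
A minimal sketch of the usual pattern for getting values like epochs into the Hyperparameters section so they can be edited on a cloned run; the names and values are illustrative:

```python
from clearml import Task

task = Task.init(project_name="demo", task_name="train")
params = {"epochs": 10, "learning_rate": 1e-3}
params = task.connect(params)  # appears in the UI under CONFIGURATION > Hyperparameters

# On a cloned-and-edited run executed by an agent, params reflects the values set in the UI
print(params["epochs"], params["learning_rate"])
```
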
0 Votes · I'm trying to store some data in clearml-data. Then I want to get it back elsewhere using Dataset.get_local_copy. It returns a directory, but I don't know how to actually access and use that data, and remove it when it's done. Help would be appreciated.

To me it still looks like the only difference is that the non-mutable copy is downloaded to the cache folder, while the mutable copy downloads to the directory I want. I could delete files from both sets, so it seems it's up to the user to make sure not to mutate the non-mutable download in the cache folder.

4 years ago
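
For reference, a minimal sketch of the two calls being compared; the dataset ID and target folder are placeholders, and the cached copy is shared, so it is indeed up to the caller not to modify it:

```python
from clearml import Dataset

ds = Dataset.get(dataset_id="<dataset-id>")

read_only_path = ds.get_local_copy()  # cached, shared copy: treat as read-only
work_path = ds.get_mutable_local_copy("/tmp/my_working_copy")  # private copy you can edit or delete
```
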
0 Votes · I'm trying to understand how ClearML Serving works and trying to set it up. I have an agent listening to the serving queue and I'm trying to set up ClearML Serving to launch on the serving queue. Do I need to have clearml-serving installed on the machine...

Also, the steps say that I should run the serving process on the default queue but I've run it on a queue I created called a serving queue and have an agent listening for it.

3 years ago
0 Votes · Hi everyone, is it possible to not create a copy of a dataset when adding to ClearML? My data is already in a directory on the clearml-server machine and I do not want to copy it, just add it to ClearML as a dataset.

I understand your problem. I think you can normally specify where you want the data to be stored in a conf file somewhere; people here can guide you better. However, in my experience it essentially uploads the data and stores it in its own format.

3 years ago
0 Votes · When using Dataset.get_local_copy(), once I get the location, can I add another folder inside the location, add some files in it, create a new Dataset object, and then do dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able...

Sorry for the late response. Agreed, that can work, although I would prefer a way to access the data by the number of batches added (M) instead of by a certain range, since these cases aren't interchangeable. Also, a simple thing you can do is create an empty Dataset at the start and then make it the parent of every dataset you add.

3 years ago
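
A minimal sketch of the "empty root dataset as parent" idea mentioned above; the project, names, paths, and the root dataset ID are placeholders:

```python
from clearml import Dataset

# Register each new batch as a child of a previously created "root" dataset
batch = Dataset.create(
    dataset_project="demo",
    dataset_name="batch_002",
    parent_datasets=["<root-dataset-id>"],  # ID of the empty dataset created up front
)
batch.add_files(path="/data/batch_002")
batch.upload()
batch.finalize()
```
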
0 Votes · Is there a direct way to get a model using its ID, like it works with Dataset.get?

Basically when I'm loading the model in InputModel, it loads it fine but I can't seem to get a local copy.

3 years ago
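
For reference, a minimal sketch of fetching a model by ID and pulling its weights locally; the model ID is a placeholder:

```python
from clearml import InputModel

model = InputModel(model_id="<model-id>")
weights_path = model.get_local_copy()  # downloads the weights file and returns the local path
print(weights_path)
```
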
0 Votes · I'm trying to store some data in clearml-data. Then I want to get it back elsewhere using Dataset.get_local_copy. It returns a directory, but I don't know how to actually access and use that data, and remove it when it's done. Help would be appreciated.

Here they are. I've created and published the dataset. Then when I try to get a local copy, the code works but I'm not sure how to proceed to be able to use that data.

4 years ago
0 Votes · When it comes to continuous training, I wanted to know how you train or would train if you have annotated data incoming? Do you train completely online, where you train as soon as you have a training example available? Do you instead train when you have a...

I get what you're saying. I was considering training on just the new data to see how it works; to me it felt like that was the fastest way to deal with data drift. I understand that it may introduce instability, however. I was curious how other developers who have successfully set up continuous training deal with it: 100% new data, or a ratio between new and old data? And if it is the latter, which should be the majority, old data or new data?

3 years ago
0 Votes · I have no prior DevOps experience. I've been able to set up a simple continuous training setup using ClearML. I wanted to ask what I should learn that would help me move a project from MLOps level 0 to level 1, and then level 2, using ClearML. I would a...

Honestly, anything. I tried looking on YouTube, but there's very little material there, especially anything up to date. It's understandable given that ClearML is still in beta. I can look at courses/docs. I just want to be pointed in the right direction as to what I should look up and study.

4 years ago
0 Votes · I'm looking at how triggers work in ClearML. Is there an example, maybe with ClearML Data and a dataset being uploaded, or some other example?

It works; however, it shows the task is enqueued and pending. Note that I am using .start() and not .start_remotely() for now.

4 years ago
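
A minimal sketch of the schedule_function variant discussed in this thread, assuming the clearml.automation TriggerScheduler API; the project and names are placeholders, and the callback argument is assumed to be the triggering dataset's ID:

```python
from clearml.automation import TriggerScheduler

def on_new_dataset(dataset_id):
    # Assumed callback signature: receives the ID of the dataset version that fired the trigger
    print("new dataset version:", dataset_id)

scheduler = TriggerScheduler(pooling_frequency_minutes=3)
scheduler.add_dataset_trigger(
    schedule_function=on_new_dataset,
    trigger_project="data",
    name="retrain-on-new-data",
)
scheduler.start()  # or scheduler.start_remotely(queue="services") to run it on an agent
```
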
0 Votes · I'm on the machine with ClearML Server hosted. Is there any way to see datasets uploaded to ClearML Data without downloading them using ClearML Data?

We want to get a clearer picture here to compare versioning with ClearML Data vs our own custom versioning

3 years ago
0 Votes · I'm looking at how triggers work in ClearML. Is there an example, maybe with ClearML Data and a dataset being uploaded, or some other example?

I'd like to add an update to this: when I use schedule_function instead of schedule_task with the dataset trigger scheduler, it works as intended. It runs the desired function when triggered, and is then asleep again until the next time, since no other trigger was fired.

4 years ago
0 Votes · Would appreciate some help. Getting this error. ValueError: Node 'train_model', parameter '${split_dataset.split_dataset_id}', input type 'split_dataset_id' is invalid

AgitatedDove14 Can you help me with this? Maybe something like storing the returned values or something in a variable outside the pipeline?

3 years ago
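
For comparison, a minimal sketch of how a value returned by one function step is normally wired into the next step with PipelineController; the step and return names mirror the error message above, but the functions, IDs, and project names are made up:

```python
from clearml import PipelineController

def split_dataset(dataset_id):
    # Placeholder step: would split `dataset_id` and return the new dataset's ID
    return "split-dataset-id"

def train_model(split_dataset_id):
    print("training on", split_dataset_id)

pipe = PipelineController(name="demo-pipeline", project="demo", version="1.0.0")
pipe.add_function_step(
    name="split_dataset",
    function=split_dataset,
    function_kwargs=dict(dataset_id="<raw-dataset-id>"),
    function_return=["split_dataset_id"],  # must be declared for '${split_dataset.split_dataset_id}' to resolve
)
pipe.add_function_step(
    name="train_model",
    function=train_model,
    function_kwargs=dict(split_dataset_id="${split_dataset.split_dataset_id}"),
    parents=["split_dataset"],
)
pipe.start_locally(run_pipeline_steps_locally=True)
```
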