VexedCat68
Moderator
60 Questions, 381 Answers
  Active since 10 January 2023
  Last activity 7 months ago

Reputation: 0
Badges (1): 371 × Eureka!
0 Votes · 3 Answers · 952 Views
In your docs for Dataset at https://clear.ml/docs/latest/docs/references/sdk/dataset#class-dataset , I think you might have duplicate explanations for list_m...
2 years ago
0 Votes · 9 Answers · 970 Views
Trying to create a data pipeline on my own. Wanted to ask, for each batch of data, do I have to create a new Dataset Object or do I just create one Dataset O...
3 years ago
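For context, a minimal sketch of the batch-per-version pattern this question is about (not the accepted answer from the thread): each incoming batch is registered as a new Dataset version whose parent is the previous version, so lineage is kept. The project, folder, and ID strings are placeholders.

    from clearml import Dataset

    # Hypothetical: ID of the previous dataset version, if one exists
    parent_id = "abcdef1234567890"

    # Each incoming batch becomes a new child version of the previous dataset
    ds = Dataset.create(
        dataset_name="raw-batches",            # placeholder name
        dataset_project="data-pipeline-demo",  # placeholder project
        parent_datasets=[parent_id] if parent_id else None,
    )
    ds.add_files(path="./incoming_batch")      # folder holding the new batch of files
    ds.upload()                                # push the files to storage
    ds.finalize()                              # close this version; the next batch starts a new one
    print("new dataset version:", ds.id)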
0 Votes · 5 Answers · 977 Views
3 years ago
0 Votes · 10 Answers · 1K Views
3 years ago
0 Votes · 7 Answers · 1K Views
2 years ago
0 Votes · 8 Answers · 1K Views
2 years ago
0 Votes · 8 Answers · 917 Views
3 years ago
0 Votes · 25 Answers · 940 Views
Um, is there a way to delete an artifact from a task that is running?
2 years ago
0 Votes · 2 Answers · 950 Views
When saving the model, there's a label tab but it's empty.
3 years ago
0 Votes · 1 Answer · 902 Views
3 years ago
0 Votes · 15 Answers · 990 Views
How do I get args like epochs to show up in the UI configuration panel under hyperparameters? I want to be able to change number of epochs and learning rate ...
3 years ago
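A hedged sketch of the usual way to surface script arguments as hyperparameters: argparse arguments are captured automatically once Task.init has run, and a plain dict can be connected explicitly with task.connect so it appears (and can be edited) in the UI. The project/task names and argument names below are illustrative.

    import argparse
    from clearml import Task

    task = Task.init(project_name="demo", task_name="train")  # placeholder names

    # Option 1: argparse arguments are picked up automatically after Task.init
    parser = argparse.ArgumentParser()
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--lr", type=float, default=1e-3)
    args = parser.parse_args()

    # Option 2: connect a plain dict so it shows up under CONFIGURATION in the UI
    params = {"epochs": 10, "lr": 1e-3}
    params = task.connect(params)  # the returned dict reflects UI overrides on remote runs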
0 Votes · 10 Answers · 1K Views
3 years ago
0 Votes · 12 Answers · 1K Views
I'm on the machine with ClearML Server hosted. Is there any way to see datasets uploaded to ClearML Data without downloading them using ClearML Data?
3 years ago
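A rough sketch of browsing registered datasets without pulling any data, assuming Dataset.list_datasets and list_files behave as in the SDK docs; the project name and dataset ID are placeholders.

    from clearml import Dataset

    # List dataset versions registered under a project (no download involved)
    for entry in Dataset.list_datasets(dataset_project="data-pipeline-demo"):  # placeholder project
        print(entry)  # dict with id, name, and other metadata

    # Inspect the file listing of one version without fetching the files
    ds = Dataset.get(dataset_id="abcdef1234567890")  # placeholder id
    print(ds.list_files())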
0 Votes · 10 Answers · 1K Views
Given I want to run a task in a pipeline using a base task id. One of my steps just finds the latest model to use. I want the task to output the id, and the ...
2 years ago
0 Votes · 0 Answers · 946 Views
Just following instructions from the clearml-serving repo.
2 years ago
0 Votes · 2 Answers · 1K Views
How can I register a JSON file I'm creating as an artifact?
2 years ago
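One way to do this, sketched with placeholder names: upload_artifact accepts either a path to the JSON file or the Python dict itself.

    import json
    from clearml import Task

    task = Task.init(project_name="demo", task_name="artifacts")  # placeholder names

    payload = {"classes": ["cat", "dog"], "version": 3}
    with open("labels.json", "w") as f:
        json.dump(payload, f)

    # Register the file itself as an artifact...
    task.upload_artifact(name="labels-file", artifact_object="labels.json")
    # ...or register the dict directly and let ClearML serialize it
    task.upload_artifact(name="labels-dict", artifact_object=payload)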
0 Votes · 8 Answers · 1K Views
Is there a direct way to get a model using its id like it works with Dataset.get?
2 years ago
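A short sketch of fetching a model by ID via InputModel, the rough counterpart of Dataset.get; the ID is a placeholder and get_local_copy is used as I recall it from the SDK.

    from clearml import InputModel

    # Fetch a registered model by its ID (placeholder ID)
    model = InputModel(model_id="abcdef1234567890")
    print(model.name)
    weights_path = model.get_local_copy()  # downloads the weights file locally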
0 Votes · 19 Answers · 1K Views
Is this the right way to add a tag to an output model artifact of a task? torch.save(model, 'best.pt') output_model = task.models['output'][-1] outp...
2 years ago
0 Votes · 2 Answers · 1K Views
Given this pipeline step, is there any way to get the return value outside of the pipeline? Like put split_dataset_id in a variable in the main pipeline file.
2 years ago
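A hedged sketch of how a function step's return value is usually consumed: declare it in function_return and reference it in a later step as ${step_name.value}. The controller resolves the reference at runtime, so it is normally passed to another step rather than read into a local variable in the controller script. All names below are placeholders.

    from clearml.automation import PipelineController

    def split_dataset(dataset_id: str):
        # ... placeholder logic ...
        return "new-dataset-id"  # exposed to the pipeline as split_dataset.split_dataset_id

    def train_model(split_dataset_id: str):
        print("training on", split_dataset_id)

    pipe = PipelineController(name="demo-pipeline", project="demo", version="1.0.0")
    pipe.add_parameter(name="dataset_id", default="abcdef1234567890")
    pipe.add_function_step(
        name="split_dataset",
        function=split_dataset,
        function_kwargs=dict(dataset_id="${pipeline.dataset_id}"),
        function_return=["split_dataset_id"],
    )
    pipe.add_function_step(
        name="train_model",
        function=train_model,
        function_kwargs=dict(split_dataset_id="${split_dataset.split_dataset_id}"),
    )
    pipe.start_locally(run_pipeline_steps_locally=True)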
0 Votes · 17 Answers · 1K Views
Should Dataset Triggers also be activated if there is no trigger condition except dataset_project and a new task starts in that project? Is this expected beh...
3 years ago
0 Votes · 23 Answers · 1K Views
I'm looking at how triggers work in ClearML. Is there an example, maybe with clearml data and a dataset being uploaded or some other example?
3 years ago
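A rough trigger sketch, not the answer from the thread: TriggerScheduler with add_dataset_trigger clones and enqueues a template task when a dataset in the watched project is published. Parameter names are as I recall them from the SDK and should be checked against the TriggerScheduler docs; the task ID, queue, and project are placeholders.

    from clearml.automation import TriggerScheduler

    trigger = TriggerScheduler(pooling_frequency_minutes=3)

    # When a dataset in this project is published, clone and enqueue the given task
    trigger.add_dataset_trigger(
        schedule_task_id="<template-task-id>",  # placeholder: task to clone and enqueue
        schedule_queue="default",
        trigger_project="data-pipeline-demo",   # placeholder project to watch
        trigger_on_publish=True,                # fire on publish only, not on every poll
    )
    trigger.start()  # or start_remotely(queue="services") to run it as a service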
0 Votes · 12 Answers · 1K Views
I'm training a tensorflow model and saving it in the end. I looked at the OutputModel class. How do I connect the model I'm saving to the OutputModel?
2 years ago
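A minimal sketch of attaching saved TensorFlow weights to an OutputModel, with placeholder file and project names; update_weights is used as I recall it from the SDK.

    from clearml import Task, OutputModel

    task = Task.init(project_name="demo", task_name="tf-train")  # placeholder names

    # ... train and save the TensorFlow model, e.g. model.save("model.h5") ...

    output_model = OutputModel(task=task, framework="TensorFlow")
    # Register the saved weights file as the task's output model
    output_model.update_weights(weights_filename="model.h5")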
0 Votes · 7 Answers · 1K Views
When I create a clearml-dataset from the CLI, I get an ID. The same doesn't happen when I use the Python API. Is there any way to get the ID in Python?
3 years ago
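A short sketch showing where the ID lives in the Python API; names are placeholders.

    from clearml import Dataset

    ds = Dataset.create(dataset_name="my-dataset", dataset_project="demo")  # placeholders
    ds.add_files(path="./data")
    ds.upload()
    ds.finalize()
    print("dataset id:", ds.id)  # same identifier the CLI prints

    # The ID of an existing dataset can also be looked up by name and project
    existing = Dataset.get(dataset_name="my-dataset", dataset_project="demo")
    print(existing.id)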
0 Votes · 15 Answers · 1K Views
I'm publishing the model artifact after the tensorflow model is saved. But in the UI, the download button is grayed out. And get_local_copy() doesn't seem to...
2 years ago
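For reference, a hedged sketch of one common cause (not necessarily the resolution from the thread): if the weights were only written to the local disk of the training machine, the UI has nothing to serve. Passing output_uri to Task.init asks ClearML to upload them to the files server or another storage URI.

    from clearml import Task

    # Upload model weights to the files server (or any storage URI) so the
    # download button in the UI points at an actual uploaded file.
    task = Task.init(
        project_name="demo",   # placeholder
        task_name="tf-train",  # placeholder
        output_uri=True,       # True = default files server; or e.g. "s3://bucket/path"
    )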
0 Votes · 3 Answers · 1K Views
In the case where I'm passing a schedule_fn to add_task in a TaskScheduler, how do I pass the function arguments?
3 years ago
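A hedged sketch of one way to do this: since the scheduler invokes the function itself, bind the arguments beforehand with functools.partial (a lambda works too). The scheduling parameters are as I recall them from the SDK; check the TaskScheduler docs for the exact minute/hour semantics.

    from functools import partial
    from clearml.automation import TaskScheduler

    def publish_folder(folder: str, min_files: int):
        print("checking", folder, "for at least", min_files, "files")

    scheduler = TaskScheduler()
    scheduler.add_task(
        schedule_function=partial(publish_folder, "./incoming", 50),  # bind the arguments here
        minute=30,  # scheduling field; see the TaskScheduler docs for exact semantics
    )
    scheduler.start()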
0 Votes · 5 Answers · 1K Views
Would appreciate some help. Getting this Error. ValueError: Node train_model, parameter '${split_dataset.split_dataset_id}', input type 'split_dataset_id' is...
2 years ago
0 Votes · 2 Answers · 993 Views
Is there a ClearML way to write a scheduler to watch a folder and publish a dataset when there are X number of files in that folder, or do I have to write a ...
3 years ago
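A rough sketch of a hand-rolled version using a schedule_function that counts files and publishes a dataset once a threshold is reached; the folder, names, and threshold are placeholders, and Dataset.publish is used as I recall it from the SDK.

    import os
    from clearml import Dataset
    from clearml.automation import TaskScheduler

    WATCH_FOLDER = "./incoming"  # placeholder folder
    THRESHOLD = 100              # publish once this many files have accumulated

    def publish_if_ready():
        if len(os.listdir(WATCH_FOLDER)) < THRESHOLD:
            return
        ds = Dataset.create(dataset_name="incoming-batch", dataset_project="demo")  # placeholders
        ds.add_files(path=WATCH_FOLDER)
        ds.upload()
        ds.finalize()
        ds.publish()

    scheduler = TaskScheduler()
    scheduler.add_task(schedule_function=publish_if_ready, minute=15)  # check periodically
    scheduler.start()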
0 Votes · 2 Answers · 1K Views
2 years ago
0 Votes · 12 Answers · 909 Views
Is there a quicker way to abort all running experiments in a project? I have over a thousand running anonymous data tasks in a specific project and I want to...
2 years ago
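A hedged sketch of doing this from the SDK rather than clicking through the UI: fetch the running tasks in the project and stop them. The status filter value is as I recall it from the SDK.

    from clearml import Task

    # Fetch every task in the project that is still running, then stop it
    running = Task.get_tasks(
        project_name="my-project",                # placeholder project
        task_filter={"status": ["in_progress"]},  # only tasks currently running
    )
    for t in running:
        t.mark_stopped()  # aborts the task
    print("stopped", len(running), "tasks")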
0 Votes · 4 Answers · 932 Views
Is it possible to write a text file and see it in the results of the experiment? I want to use it to version data, as in keeping track of what images have been tr...
3 years ago
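One way to sketch this, with placeholder names: write the list to a text file and register it as an artifact, which shows up under the task in the UI; short notes can also go straight to the console log.

    from clearml import Task

    task = Task.init(project_name="demo", task_name="data-version-log")  # placeholders

    # Keep a plain-text record of which images went into this run
    with open("trained_images.txt", "w") as f:
        f.write("\n".join(["img_001.png", "img_002.png"]))  # illustrative content

    # The file appears under the task's ARTIFACTS tab in the UI
    task.upload_artifact(name="trained-images", artifact_object="trained_images.txt")

    # Short notes can also be sent to the console log
    task.get_logger().report_text("trained on 2 images from batch 7")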
0 Votes · I'm on the machine with ClearML Server hosted. Is there any way to see datasets uploaded to ClearML Data without downloading them using ClearML Data?

Also, do I have to manually keep track of dataset versions in a separate database? Or am I provided that as well in ClearML?

3 years ago
2 years ago
0 Votes · When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able...

I already have the dataset id as a hyperparameter. I get said dataset. I'm only handling one dataset right now but merging multiple ones is a simple task as well.

Also, I'm not very experienced and am unsure what the proposed querying is, and how or whether it works in ClearML here.

2 years ago
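For reference, a hedged sketch of the pattern this question title is about: get_local_copy returns a cached, read-only copy, so one usual approach is get_mutable_local_copy plus registering the edited folder as a new child version. Names and IDs are placeholders.

    from clearml import Dataset

    # Pull an editable copy of an existing dataset version (placeholder id)
    base = Dataset.get(dataset_id="abcdef1234567890")
    work_dir = base.get_mutable_local_copy(target_folder="./working_copy")

    # ... add a folder / files inside work_dir ...

    # Register the result as a new child version instead of re-uploading into the old one
    child = Dataset.create(
        dataset_name="my-dataset",  # placeholder
        dataset_project="demo",     # placeholder
        parent_datasets=[base.id],
    )
    child.add_files(path=work_dir)
    child.upload()
    child.finalize()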
0 Votes · Once I set up a ClearML Server on a machine, I understand I need an agent listening on a queue to run code. Do I absolutely have to have the agent and the GPU on the same machine as the server? Also, do the GPU and agent have to be on the same machine?

Just to be absolutely clear.

Agent with GPU on Machine A, listening to Queue X.

Task enqueued onto queue X from Machine B with no GPU.

Task runs on Machine A and experiment gets published to server?

3 years ago
0 Votes · When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able...

My current approach is: watch a folder, and when there are sufficient data points, move N of them into another folder, create a raw dataset, and call the pipeline with this dataset.

It gets downloaded, preprocessed, and then uploaded again.

In the final step, the preprocessed dataset is downloaded and is used to train the model.

2 years ago
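A rough sketch of the workflow described above, with placeholder paths and names: move N files into a staging folder, register them as a raw dataset, then clone and enqueue a pipeline template task with the new dataset ID. The parameter section name "Args/dataset_id" is an assumption, not something confirmed by the thread.

    import os
    import shutil
    from clearml import Dataset, Task

    WATCH_FOLDER = "./incoming"     # placeholder paths
    STAGING_FOLDER = "./staging"
    N = 100

    # Move the first N files into a staging folder
    os.makedirs(STAGING_FOLDER, exist_ok=True)
    for name in sorted(os.listdir(WATCH_FOLDER))[:N]:
        shutil.move(os.path.join(WATCH_FOLDER, name), STAGING_FOLDER)

    # Register the staged files as a raw dataset version
    ds = Dataset.create(dataset_name="raw-batch", dataset_project="demo")  # placeholders
    ds.add_files(path=STAGING_FOLDER)
    ds.upload()
    ds.finalize()

    # Clone the pipeline template task and enqueue it with the new dataset id
    template = Task.get_task(project_name="demo", task_name="data-pipeline")  # placeholder
    run = Task.clone(source_task=template)
    run.set_parameter("Args/dataset_id", ds.id)  # parameter section/name is an assumption
    Task.enqueue(run, queue_name="default")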
0 Votes · When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able...

Also, since I plan not to train on the whole dataset but only on a subset of the data, I was thinking of making each batch of data a new dataset and then merging just the subset I want to train on.

2 years ago
0 Votes · Is there an example of a simple pipeline using TaskScheduler? I want to create a simple pipeline where first a folder is monitored using a TaskScheduler, and under certain conditions, data is uploaded and the dataset ID is given to the next step in the pipeline...

OK, since it's my first time working with pipelines, I wanted to ask: does the pipeline controller run endlessly, or does it run from start to end, with me telling it when to start based on a trigger?

2 years ago
0 Votes · Hi everyone, is it possible to not create a copy of a dataset when adding it to ClearML? My data is already in a directory on the clearml-server machine and I do not want to copy it, just add it to ClearML as a dataset.

I understand that storing data outside ClearML won't ensure its immutability. I guess this could be built into ClearML as a feature at some point in the future.

2 years ago
0 Votes · Sorry for the barrage of questions. I can't seem to figure out how best to get a Python script I need to run on an agent. I have an agent listening to the default queue and I also have the script that I need to run on the agent. Can you guide me on how to...

So the API is something new for me; I've already seen the SDK. Am I misremembering sending a Python script and requirements to run on an agent directly from the CLI? Was there no such way?

3 years ago
0 Votes · I'm looking at how triggers work in ClearML. Is there an example, maybe with ClearML Data and a dataset being uploaded, or some other example?

But what's happening is that I only publish a dataset once, yet every time it polls, it gets triggered and enqueues a task even though the dataset was published only once.

3 years ago