ExasperatedCrab78
Moderator
2 Questions, 221 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges: 2 × Eureka!
0 Votes 0 Answers 1K Views
A little something else: Using ClearML, an OAK-1 AI camera and a Raspberry Pi to create a pushup counter that locks my PC every hour and only unlocks again w...
2 years ago
0 Votes 5 Answers 1K Views
We're working on ClearML serving right now and are very interested in what you all are searching for in a serving engine, so we can make the best serving eng...
2 years ago
0 When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able

It's part of the design, I think. It makes sense that if we want to keep track of changes, we always build on top of what we already have 🙂 I think of it like a commit: I'm adding files in a NEW commit, not in the old one.

2 years ago
0 When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able

That makes sense! Maybe something like dataset querying, as used in ClearML Hyperdatasets, might be useful here? Basically you'd query your dataset to only include the samples you want, and have the query itself be a hyperparameter in your experiment.

2 years ago
0 Hello! Is there any way to access the

I'm not quite sure what you mean here? From the docs it seems like you should be able to simply send an HTTP request to the localhost URL to get the metrics. Is this not working for you? Otherwise, all the metrics end up in Prometheus, so you can also query that instead, or use something like Grafana to visualize it.
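If it helps, a minimal check could look like this (the port is an assumption; use whatever your clearml-serving deployment maps for the metrics endpoint):

import requests

# Port/path are assumptions; adjust to your deployment
response = requests.get("http://localhost:9999/metrics")
print(response.text)  # raw Prometheus-format metrics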

one year ago
0 Hi, is there any option to get a preview for the images on a dataset in case upload with

AstonishingRabbit13 If I'm not mistaken, you can add images to the preview tab by reporting them as debug samples.

So you'd run: dataset.get_logger().report_image() or report_media()
This is not scalable though, so don't expect the server to handle millions of images well, for that you'd need Hyperdatasets 🙂
But it works well (as the name suggests) for some previews of the images!

Relevant docs:
https://clear.ml/docs/latest/docs/references/sdk/dataset/#get_logger
https://...
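A minimal sketch of the idea (project/dataset names and the image path are placeholders):

from clearml import Dataset

# Placeholders: point these at your own dataset version
dataset = Dataset.get(dataset_project="MyProject", dataset_name="MyDataset")

# Report an image as a debug sample so it shows up in the preview tab
dataset.get_logger().report_image(
    title="preview",
    series="sample_0",
    local_path="images/sample_0.jpg",
)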

2 years ago
0 When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able

Cool! 😄 Yeah, that makes sense.

So (just brainstorming here) imagine you have your dataset with all samples inside. Every time N new samples arrive, they're just added to the larger dataset in an incremental way (with the 3 lines I sent earlier).
So imagine if we could query/filter that large dataset to only include a certain datetime range. That range filter is then stored as a hyperparameter too, so in that case, you could easily rerun the same training task multiple times, on differe...
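Just to make the brainstorm concrete, a rough sketch (the datetime filtering itself is hypothetical; plain ClearML datasets don't support server-side querying, that's the Hyperdatasets feature):

from clearml import Dataset, Task

task = Task.init(project_name="examples", task_name="train")

# Store the filter itself as a hyperparameter, so it can be changed per rerun from the UI
params = task.connect({"range_start": "2022-01-01", "range_end": "2022-02-01"})

# Fetch the one big dataset and filter the local copy by the range (placeholder names)
dataset = Dataset.get(dataset_project="MyProject", dataset_name="all_samples")
local_path = dataset.get_local_copy()
# ... keep only the files whose timestamp falls inside [range_start, range_end] ...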

2 years ago
0 HPO app question: my config includes 11 parameter values (0 - 1, step 0.1). I'd expect to see 11 experiments, but in fact it was "52 iterations". What am I missing? (Last time I asked a similar question, but this time there is no issue with HPO-app integratio

Ok, so I think I recreated your issue. Problem is, HPO was designed to handle more possible combinations of items than is reasonable to test. In this case though, there are only 11 possible parameter "combinations". But by default, ClearML sets the maximum number of jobs much higher than that (check the advanced settings in the wizard).

It seems like HPO doesn't check for duplicate experiments though, so that means it will keep spawning experiments (even though it might have executed the exact s...
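For reference, something along these lines should cap it (the base task id and the objective metric are placeholders):

from clearml.automation import DiscreteParameterRange, GridSearch, HyperParameterOptimizer

optimizer = HyperParameterOptimizer(
    base_task_id="<your_base_task_id>",  # placeholder
    hyper_parameters=[
        # the 11 discrete values: 0.0, 0.1, ..., 1.0
        DiscreteParameterRange("General/threshold", values=[round(0.1 * i, 1) for i in range(11)]),
    ],
    objective_metric_title="validation",  # placeholder metric
    objective_metric_series="accuracy",
    objective_metric_sign="max",
    optimizer_class=GridSearch,
    total_max_jobs=11,  # cap at the real number of combinations
)
optimizer.start_locally()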

one year ago
0 I am trying to run the UrbanSounds8K example, but when I run "preprocessing" I get the error in the line

VivaciousBadger56 hope you had a great time while away :)

That looks correct indeed. Do you mind checking for me if the dataset actually contains the correct metadata?

Go to the datasets section, select the one you need, and on the right click on "more information". It should send you to the experiment manager view. Then, under artifacts, do you see a key in the list named metadata? Can you post a screenshot?
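You could also check it from code; a small sketch (project/name are placeholders, and get_metadata assumes the metadata was stored via set_metadata):

from clearml import Dataset

# Placeholders: use your own project/name
ds = Dataset.get(dataset_project="ClearML Examples/Urbansounds", dataset_name="UrbanSounds example")
print(ds.get_metadata())  # should print the stored metadata, if any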

2 years ago
0 I am trying to run the UrbanSounds8K example, but when I run "preprocessing" I get the error in the line

VivaciousBadger56 Thank you for the screenshots! I appreciate the effort. You indeed clicked on the right link; I was on mobile, so I had to instruct from memory 🙂

First of all: every 'object' in the ClearML ecosystem is a task. Experiments are tasks, so are dataset versions and even pipelines! Each task can be viewed using the experiment manager UI, that's just how the backend is structured. Of course we keep experiments and data separate by giving them a separate tab and different UI, but...

2 years ago
0 I am trying to run the UrbanSounds8K example, but when I run "preprocessing" I get the error in the line

VivaciousBadger56 Thanks for your patience, I was away for a week 🙂 Can you check that you properly changed the project name in the line above the one you posted?

In the example, by default, the project name is "ClearML Examples/Urbansounds". But it should give you an error when first running the get_data.py script that you can't actually modify that project (by design). You need to change it to one of your own choice. You might have done that in get_data.py but forgot to do s...

2 years ago
0 When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able

"Wait, is it possible to do what I'm doing but with just one big Dataset object or something?"

Don't know if that's possible yet, but maybe something like the proposed querying could help here?

2 years ago
0 When using Dataset.get_local_copy(), once I get the location, can I add another folder inside location, add some files in it, create a new Dataset object, and then do Dataset.upload(location)? Should this work? Or since it's get_local_copy, I won't be able

Hi Fawad!
You should be able to get a local mutable copy using Dataset.get_mutable_local_copy and then create a new dataset from it.
But personally I prefer this workflow:

dataset = Dataset.get(
    dataset_project=CLEARML_PROJECT,
    dataset_name=CLEARML_DATASET_NAME,
    auto_create=True,
    writable_copy=True,
)
dataset.add_files(path=save_path, dataset_path=save_path)
dataset.finalize(auto_upload=True)
The writable_copy argument gets a dataset and creates a child of it (a new dataset with your ...

2 years ago
0 Hey, I was planning to save the best trained model and get a task ID for it, so I could use it later as

Also, this might be a little stupid, sorry, but your torch save command saves the model in the current folder, whereas you give ClearML the 'model_folder/model' path instead. Could it be that the path is just incorrect?
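In other words, something like this keeps the two in sync (a sketch with a stand-in model):

import os

import torch
from torch import nn

model = nn.Linear(4, 2)  # stand-in for the real model

model_folder = "model_folder"
os.makedirs(model_folder, exist_ok=True)
model_path = os.path.join(model_folder, "model")

# Save to the SAME path that is later handed to ClearML
torch.save(model.state_dict(), model_path)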

2 years ago
0 Hi all! I recently started working with ClearML Serving. I got this example working

Thank you so much, sorry for the inconvenience and thank you for your patience! I've pushed it internally and we're looking for a patch 🙂

one year ago
0 Hey, we are using ClearML 1.9.0 with Transformers 4.25.1… and we started getting errors that do not reproduce in earlier versions (only works in 1.7.2, all 1.8.x don't work):

@<1523701949617147904:profile|PricklyRaven28> Please use this patch instead of the one previously shared. It excludes the dict hack :)

one year ago
0 Hey, we are using ClearML 1.9.0 with Transformers 4.25.1… and we started getting errors that do not reproduce in earlier versions (only works in 1.7.2, all 1.8.x don't work):

Hi @<1523701949617147904:profile|PricklyRaven28>, sorry that this is happening. I tried to run your minimal example, but get an IndexError: Invalid key: 5872 is out of bounds for size 0 error. That said, I get the same error without the code running in a pipeline. There seems to be no difference between simply running the code and the pipeline (for me). Do you have an updated example, maybe also including getting a local copy of an artifact, so I can check?
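For the artifact part, I'd expect something along these lines (task id and artifact name are placeholders):

from clearml import Task

# Placeholders: the task that produced the artifact, and the artifact's name
producer = Task.get_task(task_id="<producer_task_id>")
local_copy = producer.artifacts["dataset"].get_local_copy()
print(local_copy)  # local path of the downloaded artifact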

one year ago
0 Hello, I am using datasets in community server. I have been trying to create a child dataset the following way: dataset_name = "Training2501" dataset_project = "Datasets" dataset_path = Dataset.create( dataset_name=dataset_name, dataset_project=da

I think that would defeat the purpose of lineage, no? The point is to keep track of where data came from in the real world. Rewriting that record is just kind of... metadata?
As for the (*) line, could it be that "0385db..." does not have parents itself? So "0385db..." is the base dataset, without parents, and it has 1 child, which has "0385db..." as its parent.
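For reference, a child version is normally created roughly like this (ids/names are placeholders):

from clearml import Dataset

child = Dataset.create(
    dataset_name="Training2501",
    dataset_project="Datasets",
    parent_datasets=["<parent_dataset_id>"],  # e.g. the "0385db..." id
)
child.add_files(path="new_samples/")
child.finalize(auto_upload=True)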

2 years ago
0 Hey, we are using ClearML 1.9.0 with Transformers 4.25.1… and we started getting errors that do not reproduce in earlier versions (only works in 1.7.2, all 1.8.x don't work):

It's been accepted in master, but has not been released yet indeed!

As for the other issue, it seems like we won't be adding support for non-string dict keys anytime soon. I'm thinking of adding a specific example/tutorial on how to work with Huggingface + ClearML so people can do it themselves.

For now (using the patch) the only thing you need to be careful about is to not connect a dict or object with ints as keys. If you do need to (e.g. usually Huggingface models need the id2label dict some...
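In that case, a workaround sketch: stringify the keys before connecting (the label map here is made up):

from clearml import Task

task = Task.init(project_name="examples", task_name="hf-config")

id2label = {0: "negative", 1: "positive"}  # hypothetical Huggingface-style label map
# Connect a copy with string keys to sidestep the int-key issue
task.connect({str(k): v for k, v in id2label.items()}, name="id2label")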

one year ago
0 Hi all, what is the appropriate way to mount a volume when running the docker container for a task? I'm executing a task from the experiment manager and adding in

Nice! Well found and thanks for posting the solution!

May I ask out of curiosity, why mount X11? Are you planning to use a GUI app on the k8s cluster?

one year ago
0 Hi. I am experimenting with

Hi PanickyMoth78 ,

I've just recreated your example and it works for me on clearml==1.6.2, but indeed not on clearml==1.6.3rc1, which means we have some work to do before the full release 🙂 Can you try clearml==1.6.2 to check that it does work there?

2 years ago
0 Hi there, another Triton-related question: are we able to deploy

Hi @<1547028116780617728:profile|TimelyRabbit96> Awesome that you managed to get it working!

one year ago
0 Hello, we want to serve a simple rule-based model. Think of it as a .py with a simple if...else function. 1) How do you deliver a rule-based model? Or do I need to train a TensorFlow/PyTorch/scikit-learn model to serve a simple rule-based model. 2) How do you manage

Like Nathan said, custom engines are a TODO, but for your second question: you can add that API request in the model preprocessing, which is a function you can define yourself! It will be run every time a request comes in, and you can do whatever you want in it and change the incoming data however you wish 🙂

example: https://github.com/allegroai/clearml-serving/blob/main/examples/keras/preprocess.py
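As a rough sketch of the idea (the class/method layout follows the linked example; the external request and URL are made up):

import requests

class Preprocess(object):
    def preprocess(self, body, state, collect_custom_statistics_fn=None):
        # Hypothetical: enrich the incoming request with data from another service
        extra = requests.get(
            "http://my-internal-service/features",
            params={"id": body.get("id")},
        ).json()
        body["features"] = extra
        return body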

2 years ago
0 Hi there, another Triton-related question: are we able to deploy

@<1547028116780617728:profile|TimelyRabbit96>
Pipelines have little to do with serving, so let's not focus on that for now.

Instead, if you need an ensemble_scheduling block, you can use the CLI's --aux-config flag to add any extra stuff that needs to be in the config.pbtxt

For example here, under the Setup section, step 2, we use the --aux-config flag to add a dynamic batching block.
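From memory, the flag usage looks roughly like this (service id, endpoint, and the exact keys are placeholders):

clearml-serving --id <service_id> model add \
    --engine triton \
    --endpoint "my_model" \
    --preprocess preprocess.py \
    --aux-config dynamic_batching.max_queue_delay_microseconds=100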

one year ago