ExasperatedCrab78
Moderator
2 Questions, 221 Answers
Active since 10 January 2023
Last activity one year ago

Reputation: 0

Badges (1): 2 × Eureka!
0 Votes
0 Answers
910 Views
A little something else: Using ClearML, an OAK-1 AI camera and a raspberry pi to create a pushup counter that locks my PC every hour and only unlocks again w...
2 years ago
0 Votes
5 Answers
1K Views
We're working on ClearML serving right now and are very interested in what you all are searching for in a serving engine, so we can make the best serving eng...
2 years ago
0 When Dumping Model Via Clearml Serving, What Are The Things That The Clearml Will Look At To Populate The Input_Size And Output_Size? I Tried To Dump An Sklearn Model, And The Input_Size And Output_Size Is Null. I Prefer Not To Update It Separately Using

Just to be sure I understand you correctly: you're saving/dumping an sklearn model in the ClearML experiment manager, then want to serve it using ClearML Serving, but you do not wish to specify the model input and output shapes in the CLI?

one year ago
0 Hey, We Are Using Clearml 1.9.0 With Transformers 4.25.1… And We Started Getting Errors That Do Not Reproduce In Earlier Versions (Only Works In 1.7.2 All 1.8.X Don’T Work):

However, I actually do think I can already open the Huggingface PR in the meantime. It has relatively little to do with the second bug.

one year ago
0 Hey, We Are Using Clearml 1.9.0 With Transformers 4.25.1… And We Started Getting Errors That Do Not Reproduce In Earlier Versions (Only Works In 1.7.2 All 1.8.X Don’T Work):

When creating it, I found that this hack should be on our side, not on Huggingface's. So I'm only going to fix issue 1 with the PR, issue 2 is ours 🙂

one year ago
0 Hi, I'M Using Hyperparameteroptimizer Alongside Optimizeroptuna And I Am Unsure How To Implement Pruning On Tasks That Are Not Producing Good Results. Is There A Way To Implement This On These Modules?

Yeah, I do the same thing all the time. You can limit the number of tasks that are kept in HPO with the save_top_k_tasks_only parameter, and you can create subprojects by simply using a slash in the name 🙂 https://clear.ml/docs/latest/docs/fundamentals/projects#creating-subprojects
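A minimal sketch of what that can look like together. All IDs, project names, and metric names below are placeholders, and running it needs a live ClearML server plus an existing template task, so treat this as a configuration sketch rather than a drop-in script:

```python
from clearml import Task
from clearml.automation import HyperParameterOptimizer, UniformIntegerParameterRange
from clearml.automation.optuna import OptimizerOptuna

# A slash in the project name creates a subproject in the UI
task = Task.init(project_name="My Project/HPO", task_name="optimizer")

optimizer = HyperParameterOptimizer(
    base_task_id="<template-task-id>",  # placeholder: the task cloned per trial
    hyper_parameters=[
        UniformIntegerParameterRange(
            "General/epochs", min_value=5, max_value=20, step_size=5
        ),
    ],
    objective_metric_title="validation",  # placeholder metric title/series
    objective_metric_series="accuracy",
    objective_metric_sign="max",
    optimizer_class=OptimizerOptuna,
    save_top_k_tasks_only=5,  # archive all but the 5 best trials
)
optimizer.start()
optimizer.wait()
optimizer.stop()
```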

2 years ago
0 Hello Everyone, I Am A New User For Clearml, I Have One Question: I Created The Dataset, And Upload Files Successfully By Class

Hmm, I can't really follow your explanation. The removed file SHOULD not exist, right? 😅 And what exactly do you mean by the last sentence? An artifact is an output generated as part of a task. Can you show me what you mean, with screenshots for example?

2 years ago
0 I Am Looking For The Dataset Used In Sarcasm Detection Demo

Great to hear! Then it comes down to waiting for the next Huggingface release!

one year ago
0 Hey All, Is Anyone Able To Access The Clear Ml Website?

Isitdown seems to be reporting it as up. Any issues with other websites?

2 years ago
0 Is There A Tutorial For Clearml Serving? I Followed The Steps On Its Repo But I Still Don'T Understand It. Also The Serving Engine Keeps Failing After A While. I Also Don'T Know How To Access The Serving Engine Or How To Send Inference Requests To It.

Hi Fawad, maybe this can help you get started! There are both C++ and Python examples of Triton inference. Be careful though: the pre- and postprocessing used is specific to the model (in this case YOLOv4) and you'll have to change it to your own model's needs.

2 years ago
0 Hello Everyone ! I Tried To Reproduce Your Tutorial :

The point of the alias is better visibility in the Experiment Manager. Check the screenshots above for what it looks like in the UI. Essentially, setting an alias makes sure the task that gets the dataset automatically logs the ID it resolves via Dataset.get(). That way, if you later look back at your experiment, you can also see which dataset was fetched at the time.

ExuberantBat52 When you still get the log messages, where did you specify the alias?...
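For reference, a rough sketch of where the alias goes. The project name, dataset name, and alias below are all made up, and it needs a reachable ClearML server, so this is a sketch of the call shape only:

```python
from clearml import Dataset

# The alias ("raw-images" here is a placeholder) is what the consuming
# task logs together with the resolved dataset ID, so you can later see
# which dataset version this particular run actually used.
dataset = Dataset.get(
    dataset_project="My Project",   # placeholder project
    dataset_name="training-data",   # placeholder dataset name
    alias="raw-images",
)
local_path = dataset.get_local_copy()
```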

one year ago
0 Hi All

Ah I see. So then I would guess it is due to the remote machine (the clearml agent) not being able to properly access your clearml server

one year ago
0 Hey, We Are Using Clearml 1.9.0 With Transformers 4.25.1… And We Started Getting Errors That Do Not Reproduce In Earlier Versions (Only Works In 1.7.2 All 1.8.X Don’T Work):

Hi @<1523701949617147904:profile|PricklyRaven28> just letting you know I still have this on my TODO, I'll update you as soon as I have something!

one year ago
0 When Using Dataset.Get_Local_Copy(), Once I Get The Location, Can I Add Another Folder Inside Location Add Some Files In It, Create A New Dataset Object, And Then Do Dataset.Upload(Location)? Should This Work? Or Since Its Get_Local_Copy, I Won'T Be Able

Wait is it possible to do what i'm doing but with just one big Dataset object or something?

Don't know if that's possible yet, but maybe something like the proposed querying could help here?

2 years ago
0 When Dumping Model Via Clearml Serving, What Are The Things That The Clearml Will Look At To Populate The Input_Size And Output_Size? I Tried To Dump An Sklearn Model, And The Input_Size And Output_Size Is Null. I Prefer Not To Update It Separately Using

Unfortunately no, ClearML Serving does not infer input or output shapes from the saved models as of today. Maybe you could open an issue on the ClearML Serving GitHub to request it? Preferably with a clear, minimal example; that would be awesome! We'd take it into account for future releases.

one year ago
0 Hello

Can you run

ls -la /opt/clearml

and paste the results here?

one year ago
0 When Dumping Model Via Clearml Serving, What Are The Things That The Clearml Will Look At To Populate The Input_Size And Output_Size? I Tried To Dump An Sklearn Model, And The Input_Size And Output_Size Is Null. I Prefer Not To Update It Separately Using

No inputs and outputs are ever set automatically 🙂 For Keras, for example, you'll have to specify them using the CLI when creating the endpoint, so Triton knows how to optimize, and also set them correctly in your preprocessing so Triton receives the format it expects.
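For illustration, roughly what that CLI call looks like for a Keras/Triton endpoint. The service ID, model ID, endpoint name, and tensor names/shapes below are placeholders for your own model, so double-check the exact flags against the clearml-serving README before using them:

```shell
clearml-serving --id <service-id> model add \
  --engine triton \
  --endpoint "keras_mnist" \
  --model-id <model-id> \
  --input-size 1 784 --input-name "dense_input" --input-type float32 \
  --output-size -1 10 --output-name "activation_2" --output-type float32
```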

one year ago
0 When Using Dataset.Get_Local_Copy(), Once I Get The Location, Can I Add Another Folder Inside Location Add Some Files In It, Create A New Dataset Object, And Then Do Dataset.Upload(Location)? Should This Work? Or Since Its Get_Local_Copy, I Won'T Be Able

That makes sense! Maybe something like the dataset querying used in ClearML hyperdatasets could be useful here? Basically you'd query your dataset to include only the samples you want, and have the query itself be a hyperparameter in your experiment.
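With regular (non-hyper) datasets there's no server-side querying, but the idea can be approximated client-side. A small sketch where the query pattern would be connected as a hyperparameter; the parameter name and helper function are made up for illustration:

```python
import fnmatch

def select_samples(file_names, pattern):
    # Keep only the files matching the query pattern
    return [f for f in file_names if fnmatch.fnmatch(f, pattern)]

# In a real task you'd register the query so it is tracked per experiment:
#   params = task.connect({"sample_query": "cats/*.jpg"})
files = ["cats/001.jpg", "dogs/002.jpg", "cats/003.jpg"]
print(select_samples(files, "cats/*.jpg"))  # ['cats/001.jpg', 'cats/003.jpg']
```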

2 years ago
0 Hi Everyone! I Faced The Problem With Clearml-Serving. I'Ve Deployed Onnx Model From Huggingface In Clearml-Serving, But

Hey! Sorry, I didn't fully read your question and missed that you already did it. It should not be done inside the clearml-serving-triton service but instead inside the clearml-serving-inference service. That is where the preprocessing script is run, and it seems to be where the error is coming from.

one year ago
0 Hi Team, When Clearml-Agent Is Used To Run The Code, It Will Set Up The Environment. How Does It Pick The Python Package Version?

Hi @<1533257278776414208:profile|SuperiorCockroach75>

I must say I don't really know where this comes from. As far as I understand, the agent should install the packages exactly as they are saved on the task itself. Can you go to the original experiment of the pipeline step in question? (You can do this by selecting the step and clicking on "Full Details" in the info panel.) There, under the Execution tab, you should see which version the task detected.

The task itself will try to autodetect t...

one year ago
0 Hello, Does Clearml_Apiserver Needed To Listen To 8008 Only? Can I Change To Other Ports Likes 9008?

Hi VictoriousPenguin97! I think you should be able to change it in the docker-compose file here: https://github.com/allegroai/clearml-server/blob/master/docker/docker-compose.yml

You can map the internal 8008 port to another port on your local machine. But be sure to provide the new port number to any client that tries to connect (using clearml-init).
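For example, to expose the API server on host port 9008 while the container keeps listening on 8008 internally, the relevant part of docker-compose.yml would look roughly like this (service name abbreviated; check the linked file for the full definition):

```yaml
services:
  apiserver:
    ports:
      - "9008:8008"   # host port 9008 -> container port 8008
```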

2 years ago