OddShrimp85
Moderator
35 Questions, 65 Answers
Active since 10 January 2023
Last activity 8 months ago

Reputation: 0

Badges: 1
59 × Eureka!
0 Hi, Is There A Means To Leverage On Clearml To Run A Ml Inference Container That Does Not Terminate?

Can clearml-serving do a helm install or upgrade? We have cases where the ML models do not come from ML experiments in ClearML, but we would still like to tap on the ClearML queue to enable resource queuing.

9 months ago
0 Hello, I Have A Trained Model (Saved As

I figured out that it may be possible to do this:
experiment_task = Task.current_task()
OutputModel(experiment_task).update_weights('model.pt') to attach it to the ClearML experiment task.
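A minimal sketch of that idea, assuming the weights were saved locally as "model.pt" (a placeholder path) and the script is already running under a ClearML task:

```python
from clearml import OutputModel, Task

# Grab the task this script is running under.
experiment_task = Task.current_task()

# Create an output model attached to the task and register the local weights
# file with it; ClearML uploads the file to the task's configured output_uri
# if one is set.
output_model = OutputModel(task=experiment_task, framework="PyTorch")
output_model.update_weights(weights_filename="model.pt")  # placeholder path
```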

2 years ago
0 Hi, Is There A Means To Leverage On Clearml To Run A Ml Inference Container That Does Not Terminate?

Thanks @<1523701205467926528:profile|AgitatedDove14>. What I could think of is to write a task that runs a Python subprocess to do "helm install". In that Python script, we could point to / download the helm chart from somewhere (e.g. NFS, S3).

Does this sound right to you?
One other thing I was wondering is whether we could pass the helm charts/files when we use the ClearML SDK, so we could skip the step of pushing them to NFS/S3.
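A rough sketch of that approach, assuming the chart was already pushed to shared storage (the chart path, release name, and project/task names below are placeholders):

```python
import subprocess

from clearml import Task

# An agent picks this task off a ClearML queue and runs it like any other job.
task = Task.init(project_name="deployments", task_name="helm install inference app")

chart_path = "/mnt/nfs/charts/my-inference-app"  # hypothetical chart location on NFS
release_name = "my-inference-app"                # hypothetical release name

# Shell out to helm; check=True makes the task fail if the install/upgrade fails.
subprocess.run(
    ["helm", "upgrade", "--install", release_name, chart_path],
    check=True,
)
```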

8 months ago
0 Hello, Is There A Way To Pull Dataset From A Local Hdd When Using Pipeline As Function?

When I run it as a regular remote task it works, but when I run it as a step in a pipeline, the step cannot access the same folder on my local machine.
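One hedged workaround sketch for this (folder path and names are placeholders): run the pipeline steps in the local process with PipelineDecorator.run_locally(), so a step can still see folders on the local HDD.

```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["num_files"])
def count_local_files(folder="/mnt/local_hdd/my_dataset"):  # placeholder local path
    # Imports stay inside the component so it is self-contained if run as a task.
    import os
    return len(os.listdir(folder))

@PipelineDecorator.pipeline(name="local-data-pipeline", project="examples", version="0.0.1")
def pipeline_logic():
    num_files = count_local_files()
    print("files found:", num_files)

if __name__ == "__main__":
    # Execute all steps in this process instead of enqueuing them to remote
    # agents, so local folders stay visible to every step.
    PipelineDecorator.run_locally()
    pipeline_logic()
```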

2 years ago
0 Hi, Is There A Means To Leverage On Clearml To Run A Ml Inference Container That Does Not Terminate?

A more advanced case would be to decide how long this job should run and terminate it after that, to improve GPU utilisation.
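A hedged sketch of that "fixed runtime budget" idea (the budget and names are placeholders, and this is just one possible mechanism): a watchdog thread that stops the current task and exits once the budget elapses, freeing the GPU for the next queued job.

```python
import os
import threading

from clearml import Task

MAX_RUNTIME_SEC = 2 * 60 * 60  # hypothetical 2-hour budget

task = Task.init(project_name="inference", task_name="time-boxed inference service")

def stop_when_budget_exhausted():
    # Mark the task as stopped on the server, then hard-exit so the container
    # terminates and releases the GPU.
    task.mark_stopped()
    os._exit(0)

watchdog = threading.Timer(MAX_RUNTIME_SEC, stop_when_budget_exhausted)
watchdog.daemon = True
watchdog.start()

# ... long-running inference / serving loop would go here ...
```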

9 months ago
0 Any Plans To Add Unpublished State For Clearml-Serving?

Not exactly sure yet, but I would think a user tag for "deployed" makes sense, as it should be a deliberate user action. An additional system state is required too, since a deployed state should have some prerequisite system state.

I would also like to ask whether ClearML has different states for a task, a model, or even for different task types? Right now I don't see any differences; is this a deliberate design?
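A minimal sketch of the "deployed" user-tag idea, assuming a serving task ID (placeholder) and that tagging is done as a deliberate, manual action:

```python
from clearml import Task

# Placeholder ID of the serving/deployment task.
serving_task = Task.get_task(task_id="<serving-task-id>")

# Add an explicit user tag to record the deliberate "deployed" action.
serving_task.add_tags(["deployed"])
```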

3 years ago
0 Any Plans To Add Unpublished State For Clearml-Serving?

I guess we need to understand the purpose of the various states. So far there are only "archive, draft, publish". Did I miss any?

3 years ago
0 I Have Uploaded Dataset In Clearml With

Hi @<1523701070390366208:profile|CostlyOstrich36> , basically

  • I uploaded the dataset using ClearML Datasets. The output_uri points to my S3, so the dataset is stored in S3. My S3 is set up with HTTP only.
  • When I retrieve the dataset for training using Dataset.get(), I encountered an SSL cert error because the URL used to retrieve the data was https://<s3url>/... instead of s3://<s3url>/..., which is HTTP. This is weird, as the dataset URL is without https (see the sketch after this list).
  • I am not too sure why and I susp...
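For context, a hedged sketch of the retrieval side (project and dataset names are placeholders). Which scheme the SDK uses for the download is driven by the sdk.aws.s3 section of clearml.conf; for an HTTP-only, S3-compatible endpoint the matching credentials entry is expected to set secure: false.

```python
from clearml import Dataset

# Fetch the dataset registered earlier; the actual files are pulled from the
# S3 endpoint configured in clearml.conf.
dataset = Dataset.get(
    dataset_project="my_project",  # placeholder project
    dataset_name="my_dataset",     # placeholder dataset name
)
local_path = dataset.get_local_copy()  # downloads to a local cache folder
print(local_path)
```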
2 years ago
0 Hello, There Is A Means To Export / Import Task Using Task.Export_Task, Task.Import_Task. Is There A Way To Preserve The Task Id When We Bring This Task From One Clearml Server To Another? Both Clearml Server Are Not Connected.

Thanks AgitatedDove14 and TimelyMouse69. The intention was to have some traceability between the two setups. I think the best way is to enforce a naming convention (for project and name) so we know how they are related? Any better suggestions?
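A hedged sketch of one way to keep traceability without relying only on naming, using the Task.export_task / Task.import_task calls mentioned in the question and a JSON file moved between the two servers (file name and IDs are placeholders): record the original task ID as a tag on the imported copy.

```python
import json

from clearml import Task

# --- On server A: export the task to a portable dict ---
source_task = Task.get_task(task_id="<source-task-id>")  # placeholder ID
exported = source_task.export_task()
with open("task_export.json", "w") as f:
    json.dump(exported, f)

# --- On server B: import it and keep a pointer back to the original ---
with open("task_export.json") as f:
    task_data = json.load(f)
imported_task = Task.import_task(task_data)
imported_task.add_tags(["origin-id:<source-task-id>"])  # placeholder traceability tag
```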

2 years ago
0 Hello, For Deployment Of Clearml Server In K8S, We Need To Label Just One Node App=Clearml And Also Set The Daemon.Json. Wouldn'T This Make Clearml Server Tie To This Node Only And If This Node Fails, Clearml Server Will Not Work. Any Advice? Similarly

Hi SuccessfulKoala55, thanks for pointing me to this repo. I was using this repo.

I didn't manage to find in this repo whether we still need to label the node app=clearml, as was mentioned in the deprecated repo, although in the values.yaml the node selector is empty. Would you be able to advise?

How is the ClearML data handled now, then? Thanks

4 years ago
0 Hi, Is There A Way To List All Agents Running In A Host, I Do Not Find Relevant One In Clearml-Agent -H.

Yes, but I'm not sure which agents are running. I only know how to stop an agent if I have its agent ID.
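As a generic fallback sketch (not a ClearML API; assumes the agents run as plain daemons on the host and that psutil is installed), one can list the clearml-agent processes directly:

```python
import psutil

# Print the PID and full command line of every clearml-agent process on this host.
for proc in psutil.process_iter(["pid", "cmdline"]):
    cmdline = " ".join(proc.info["cmdline"] or [])
    if "clearml-agent" in cmdline:
        print(proc.info["pid"], cmdline)
```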

2 years ago
0 Hi Did Anyone Encounter

Yes. The training is working well with CUDA.

2 years ago
0 Hello Guys, Not Sure If This Is The Right Place To Ask About Clearml Serving. May I Know If An Updated Readme Will Be Released Soon? I Did Not Manage To Get Clearml Serving Work With My Own Clearml Server And Triton Setup.

And just a suggestion, which maybe I can post as a GitHub issue too:
it is not very clear what the purpose of the project name and name is, even after reading the --help. Perhaps this is something that can be made clearer when updating the docs?

4 years ago
0 Hi, Is There A Means To Leverage On Clearml To Run A Ml Inference Container That Does Not Terminate?

@<1523701205467926528:profile|AgitatedDove14> I am looking at the queue system that the ClearML queue offers, which allows a user to queue a job to deploy an app / inference service. This can be as simple as a pod or a more complete helm chart.
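A minimal sketch of that queue flow, assuming a pre-existing deployment task to clone and a queue named "k8s-deploy" served by an agent (all names are placeholders):

```python
from clearml import Task

# Clone an existing deployment task so we get a fresh, re-runnable draft copy.
template = Task.get_task(project_name="deployments", task_name="helm install inference app")
deploy_task = Task.clone(source_task=template, name="deploy my-inference-app")

# Push it onto a ClearML queue; an agent watching that queue will execute it.
Task.enqueue(task=deploy_task, queue_name="k8s-deploy")
```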

9 months ago
0 Hi Guys, I Have Been Pondering How Does Clearml Generate The "Installed Packages" List. I Triggered My Training Through Machine A (With Some Python Packages) And The Actual Training Is Done In A Docker Container (With Both Global Packages + Packages Insta

For example, I build my docker image from an image on Docker Hub. In this image, I installed the torch and cupy packages, but when I run my experiment in this image, the packages are not found.

Yes, I ran the experiment inside it.
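A hedged sketch of one way to make packages that exist only inside the docker image appear in the task's "Installed Packages" (package names mirror the example above; pinning exact versions is optional):

```python
from clearml import Task

# Declare the packages before Task.init so they are added to the detected
# requirements even if the launching machine does not have them installed.
Task.add_requirements("torch")
Task.add_requirements("cupy")

task = Task.init(project_name="examples", task_name="training inside docker image")
```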

4 years ago
0 Hello Guys, Not Sure If This Is The Right Place To Ask About Clearml Serving. May I Know If An Updated Readme Will Be Released Soon? I Did Not Manage To Get Clearml Serving Work With My Own Clearml Server And Triton Setup.

Thanks AgitatedDove14. Specifically, I wanted to use my own ClearML server and Triton, so I attempted to use --engine-container-args during launch, but I got an error saying there is no such flag. I looked into --help, but I guess it is not updated yet.

4 years ago