NuttyCamel41
Moderator
12 Questions, 42 Answers
  Active since 18 January 2023
  Last activity one year ago

Reputation: 0

Badges: 1

42 × Eureka!
0 Votes 3 Answers 2K Views
Hi all! I am in the process of setting up clearml-serving on my kubernetes cluster using the provided helm charts. Currently I am stuck with running the cont...
one year ago
0 Votes 5 Answers 2K Views
Hi all, I have the same problem as stated in this Thread. The file encoding of all files in my project is utf-8 and I already set the environment variable PY...
2 years ago
0 Votes 13 Answers 2K Views
Hi there! Can anybody help me with specifying the 'platform' for a model in clearml-serving. I am using the k8s clearml-serving setup (version 1.3.1). I alre...
one year ago
0 Votes 6 Answers 2K Views
Hi there! :) I have an issue regarding the get_local_copy(..) function of the Model class. Whenever this function is called in the course of a hyperparameter...
2 years ago
0 Votes 2 Answers 2K Views
Hi all, I again have a problem which was already reported here. It seems like the agent ignores the requirements even when I am explicitly adding them by Tas...
2 years ago
0 Votes 0 Answers 2K Views
Hi all, I am trying to add a model to my recently set up k8s self-hosted clearml-serving. The command looks like this and worked with my previous docker setu...
one year ago
0 Votes 6 Answers 2K Views
2 years ago
0 Votes 22 Answers 2K Views
Hi all! I recently started working with clearML serving. I got this example working https://github.com/allegroai/clearml-serving/tree/main/examples/pytorch a...
2 years ago
0 Votes 3 Answers 2K Views
2 years ago
0 Votes 4 Answers 3K Views
Hi all! Does anyone know a solution to my issue with deploying models saved on azure on the clearml-serving docker container?
2 years ago
0 Votes 6 Answers 2K Views
one year ago
0 Votes 8 Answers 2K Views
2 years ago
0 Hi all! I was just wondering what is the best way to log additional information? Right now I'm only printing it to the console, but that's not the most pleasant way to retrieve the information later on. As far as I can see, the 'Logger.report_text(...)' m

Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks for your message! 🙂 I am aware that the console is also logged on the server, but I find it suboptimal to dig through the console log for relevant information and would prefer to store it in a more structured way.
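As a rough sketch of the more structured alternatives (project/task names and values here are placeholders, not taken from this thread):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="structured-logging")
logger = task.get_logger()

# Free-form text still ends up in the task's log, much like printing to the console.
logger.report_text("fold 3 finished, best checkpoint at epoch 17")

# Scalars are stored as named series per iteration and can be plotted and compared.
logger.report_scalar(title="val_accuracy", series="fold_3", value=0.912, iteration=17)

# Small key/value metadata can also be attached to the task itself.
task.set_parameters({"notes/best_epoch": 17, "notes/data_version": "v2"})
```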

2 years ago
0 Hi all! Does anyone know a solution to my issue with deploying models saved on azure on the clearml-serving docker container?

Hi @<1523701205467926528:profile|AgitatedDove14> , thanks for your answer! Can you tell me how exactly I map my clearml.conf into the containers? By the way, the credentials are already set (and working) in the clearml.conf.

2 years ago
0 Hi all! I am in the process of setting up clearml-serving on my kubernetes cluster using the provided helm charts. Currently I am stuck with running the control task. When I call

Hi @<1523701070390366208:profile|CostlyOstrich36> , I just have solved the issue! :) After calling clearml-serving create --name "model serving" the printed task ID has to be entered in the values.yaml of the clearml-serving helm chart under clearml.servingTaskId. After installing the helm chart, the draft of the service task is started automatically, so there is no need to enqueue it manually.
Would it be possible to add this info to the docs? Maybe a small hint on this page [None](https...
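For anyone landing here with the same problem, a rough sketch of the workflow described above; only the create command and the clearml.servingTaskId key come from this thread, while the chart reference, release name and namespace are assumptions about a typical setup:

```bash
# 1. Create the serving service; the command prints the ID of the new serving task.
clearml-serving create --name "model serving"

# 2. Put that ID into the chart's values.yaml under clearml.servingTaskId,
#    or pass it on the command line. Repo alias, release name and namespace
#    below are placeholders for your own setup.
helm upgrade --install clearml-serving allegroai/clearml-serving \
  --namespace clearml-serving --create-namespace \
  --set clearml.servingTaskId=<serving-task-id>

# 3. No manual enqueue needed: the draft service task starts automatically
#    once the chart is installed.
```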

one year ago
0 Hi all! I recently started working with clearML serving. I got this example working

Hi @<1523701118159294464:profile|ExasperatedCrab78> , I have a sad update on this issue. It does not seem to be completely solved yet. 😕 But I think I can at least describe it a bit better now:

  • Models which are located on the clearML servers (created by Task.init(..., output_uri=True)) still run perfectly.
  • Models which are located on azure blob storage cause different problems in different scenarios (which made me think we had resolved this issue): When I start the docker con...
2 years ago
0 Hi all! I recently started working with clearML serving. I got this example working

Ok, I have found the issue. 🙌 When I try to serve a model which is saved on azure (generated by Task.init(..., output_uri='azure://...')) I get the error poll failed for model directory 'test_model_pytorch': failed to open text file for read /models/test_model_pytorch/config.pbtxt: No such file or directory. A model which was saved on the clearML server (generated by Task.init(..., output_uri=True)) can be served without any problems.
For now I am not sure why th...
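To make the two cases explicit, a minimal sketch of the two Task.init variants being compared; project/task names and the Azure destination are placeholders, not the actual values from this thread:

```python
from clearml import Task

# Case that serves fine: checkpoints are uploaded to the ClearML file server.
task = Task.init(
    project_name="examples",        # placeholder names
    task_name="train-fileserver",
    output_uri=True,
)
task.close()

# Case that triggers the Triton "config.pbtxt: No such file or directory"
# error when served: checkpoints are uploaded to Azure blob storage.
task = Task.init(
    project_name="examples",
    task_name="train-azure",
    output_uri="azure://<container>/<path>",  # placeholder Azure destination
)
```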

2 years ago
0 Hi all! I recently started working with clearML serving. I got this example working

I think you are correct with your guess that the services were not shut down properly. I noticed that some services were still shown as running on the ClearML dashboard. I aborted them all and at least got rid of the error ValueError: triton-server process ended with error code 1. But the two errors you named are still there and I also got these two warnings:
clearml-serving-triton | Warning: more than one valid Controller Tasks found, using Task ID=4709b0b383a04bb1a033e99fd325dc...

2 years ago
0 Hi there! Can anybody help me with specifying the 'platform' for a model in clearml-serving. I am using the k8s clearml-serving setup (version 1.3.1). I already tried a bunch of variants like

Hi @<1523701205467926528:profile|AgitatedDove14> thanks for your hint! I already converted it to TorchScript using tracing. Everything around the model should be fine, since it already worked with the docker clearml-serving setup.
I think the real issue is that I am not able to specify a platform for the model, as the error above tells me that no platform is given, no matter how I try to pass it.
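For context, a minimal sketch of the "convert to TorchScript via tracing" step mentioned above; the tiny model and input shape are stand-ins, not the actual network from this thread:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Stand-in model; the real network is irrelevant to the tracing step."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 2)

    def forward(self, x):
        return self.fc(x)

model = TinyNet().eval()
example_input = torch.randn(1, 16)

# Tracing records the ops executed on the example input and produces a
# TorchScript module that Triton's libtorch backend can load without the
# Python class definition.
traced = torch.jit.trace(model, example_input)
traced.save("traced_model.pt")  # with an active ClearML task, this file is typically picked up as an output model
```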

one year ago
0 Hi there! I want to set up clearml on a kubernetes cluster. The whole setup including authentication of fixed users seems to work fine. I can log in on the webapp and generate credentials for connecting my local clearml installation. The clearml-init also

Hi @<1523701827080556544:profile|JuicyFox94> I figured out what the problem is! For some recent experimentation I set an access_key and secret_key as environment variables in my OS. When I deleted them, everything worked fine, so the environment variables overrode the keys given by the clearml.conf. Is that the desired default behaviour?
And just one tip for everybody having similar problems: switch to using the SDK instead of the CLI for better debugging. This helped me to find the cause of m...
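A small sketch of that debugging tip, assuming the standard ClearML environment variable names; the project/task names are placeholders:

```python
import os
from clearml import Task

# Check whether credential environment variables are set; if they are, they
# take precedence over the values in clearml.conf, which matches the
# behaviour described above.
for var in ("CLEARML_API_ACCESS_KEY", "CLEARML_API_SECRET_KEY", "CLEARML_API_HOST"):
    print(f"{var} = {os.environ.get(var, '<not set>')}")

# Task.init goes through the same authentication path as clearml-init, but a
# failure surfaces as a full Python traceback, which is easier to debug.
task = Task.init(project_name="debug", task_name="connectivity-check")
print("connected, task id:", task.id)
task.close()
```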

one year ago
0 Hi all! I recently started working with clearML serving. I got this example working

By the way, the example which worked for me in the beginning now also produces the same error: poll failed for model directory 'test_model_pytorch': failed to open text file for read /models/test_model_pytorch/config.pbtxt: No such file or directory. So there really seems to be something wrong with the docker containers.

2 years ago
0 Hi all! I am in the process of setting up clearml-serving on my kubernetes cluster using the provided helm charts. Currently I am stuck with running the control task. When I call

Hi @<1523701070390366208:profile|CostlyOstrich36> , of course! Here it is (with blurred urls, paths and account names)

one year ago
0 Hi there! Can anybody help me with specifying the 'platform' for a model in clearml-serving. I am using the k8s clearml-serving setup (version 1.3.1). I already tried a bunch of variants like

What do you mean by "How are you creating the model?"? I executed a pytorch model training and saved a traced version of the model, so it was saved with the executed task. This was also no problem with the docker container setup.

one year ago