Hi guys, I am trying to upload and serve a pre-existing 3rd-party PyTorch model inside my ClearML cluster. However, after following the sequence of operations suggested by the official docs (and later even GPT o3), I am getting errors I cannot solve.
Also, AgitatedDove14, thank you very much for your advice regarding archiving. I did that: removed all the current clearml-serving services, created a new one, attached its ID to the ENV file, stopped all running serving Docker containers, then restarted the clearml-serving-triton-gpu container and added a model file afterwards.
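For reference, this is roughly the sequence I followed (a minimal sketch, not my exact commands; the service ID and compose file names assume the standard clearml-serving docker setup, where CLEARML_SERVING_TASK_ID is the variable read from the env file):

```bash
# 1. Create a fresh serving service and note the ID it prints
clearml-serving create --name "pytorch-serving"

# 2. Point the compose env file at the new service ID (placeholder value)
echo "CLEARML_SERVING_TASK_ID=<new-serving-service-id>" >> example.env

# 3. Restart the GPU Triton serving stack with the updated env file
docker compose --env-file example.env -f docker-compose-triton-gpu.yml down
docker compose --env-file example.env -f docker-compose-triton-gpu.yml up -d
```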
I no longer see any Docker run errors in the ClearML WebUI task console, but serving is still unable to locate the model file itself, even though that file is listed in the model repository. Please take a look at the screenshots.
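In case it helps pinpoint where it goes wrong, this is the shape of the endpoint registration I used (a sketch with placeholder values, assuming the model file was already registered as a ClearML Model so it has a model ID; input/output names follow Triton's PyTorch naming convention, and the sizes here are illustrative only):

```bash
# Add the pre-existing PyTorch model as a Triton endpoint on the serving service
clearml-serving --id <serving-service-id> model add \
    --engine triton \
    --endpoint "my_pytorch_model" \
    --model-id <registered-clearml-model-id> \
    --input-name "INPUT__0"  --input-type float32 --input-size 1 28 28 \
    --output-name "OUTPUT__0" --output-type float32 --output-size -1 10
```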