Hello everyone. I have no idea why the clearml-serving inference server tries to get the model from that URL (pic 1), while in the ClearML UI I have the correct URL (pic 2). Could you help me with this?
clearml-serving --id my_service_id model add --engine triton --endpoint "test_ocr_model" --preprocess "preprocess.py" --name "test-model" --project "clear-ml-test-serving-model" --input-size 1 3 384 384 --input-name "INPUT__0" --input-type float32 --output-size 1 -1 --output-name "OUTPUT__0" --output-type int32
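For context, the `preprocess.py` passed via `--preprocess` normally defines a `Preprocess` class that clearml-serving loads for the endpoint. Below is a minimal sketch assuming the standard clearml-serving class/method convention; the request-body key `"image"` and the reshape logic are illustrative assumptions, not the poster's actual code, and method signatures may differ between versions:

```python
import numpy as np


class Preprocess:
    """Hypothetical preprocess/postprocess pair matching the endpoint's
    declared Triton shapes (INPUT__0: 1x3x384x384 float32, OUTPUT__0: int32)."""

    def preprocess(self, body, state=None, collect_custom_statistics_fn=None):
        # Assume the request body carries a flat or nested list under "image";
        # cast and reshape it to the declared input shape.
        image = np.asarray(body["image"], dtype=np.float32)
        return image.reshape(1, 3, 384, 384)

    def postprocess(self, data, state=None, collect_custom_statistics_fn=None):
        # OUTPUT__0 is declared int32; convert to a JSON-friendly structure.
        return {"output": np.asarray(data, dtype=np.int32).tolist()}
```

The input/output names in the sketch only need to agree with the `--input-name`/`--output-name` flags given to `model add`; the URL the serving container pulls the model from comes from the model artifact registered in ClearML, not from this script.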