Hi ComfortableShark77!
Which commands did you use exactly to deploy the model?
clearml-serving --id my_service_id model add --engine triton --endpoint "test_ocr_model" --preprocess "preprocess.py" --name "test-model" --project "clear-ml-test-serving-model" --input-size 1 3 384 384 --input-name "INPUT__0" --input-type float32 --output-size 1 -1 --output-name "OUTPUT__0" --output-type int32
docker-compose --env-file example.env -f docker-compose-triton-gpu.yml up
for clearml-serving
ComfortableShark77 it seems clearml-serving is trying to upload data to a different server (not download the model).
I'm assuming this has to do with CLEARML_FILES_HOST and missing credentials. It has nothing to do with downloading the model (which, as you posted, will come from the S3 bucket).
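If that's the cause, a sketch of the relevant entries in the env file passed to docker-compose (values here are placeholders, assuming a standard ClearML setup; swap in your own server URLs and credentials):

```
# example.env -- ClearML endpoints and credentials used by the serving containers
# (placeholder values; CLEARML_FILES_HOST must point at YOUR files server)
CLEARML_WEB_HOST="https://app.clear.ml"
CLEARML_API_HOST="https://api.clear.ml"
CLEARML_FILES_HOST="https://files.clear.ml"
CLEARML_API_ACCESS_KEY="<your_access_key>"
CLEARML_API_SECRET_KEY="<your_secret_key>"
```

If CLEARML_FILES_HOST points at the wrong server, or the access/secret keys are missing or invalid, uploads from the serving containers will fail even though the model itself downloads fine from S3.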
Does that make sense ?