AgitatedDove14 SweetBadger76 I hope you can help with one more question.
For the test I logged my new model to the clearml-server file host and took models for clearml-serving from there. It works with clearml-serving model add, but with clearml-serving model auto-update I don't fully understand what happens. I see my auto-update model in the serving models list, but in the output logs Triton prints that the poll failed.
command for add model (which works):
clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model add --engine triton --endpoint "test_model_pytorch_auto" --preprocess "examples/pytorch/preprocess.py" --name "test torch auto update v2" --project "serving test" --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 --output-size -1 10 --output-name "OUTPUT__0" --output-type float32
command for model auto-update (which has some problems):
clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model auto-update --engine triton --endpoint "test_model_pytorch_auto_v2" --preprocess "examples/pytorch/preprocess.py" --name "test torch auto update v2" --project "serving test" --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 --output-size -1 10 --output-name "OUTPUT__0" --output_type float32
And in the output of this command:
clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model list
I see the model I added with model add under endpoints, and I can make requests to it and get responses. But the model I added with model auto-update appears under Model Monitoring, and I can't make requests to it.
Thx, I will try it and say if it works for me
If this is the case, and assuming you were able to use clearml to upload them, then adding
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
to your env file should just work:
https://github.com/allegroai/clearml-serving/blob/main/docker/example.env
Make sense?
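To illustrate, the additions to docker/example.env would look roughly like this — a sketch assuming standard boto-style variables; the key values are placeholders, and the region name is only an example:

```
# boto-style credentials picked up by clearml-serving containers
AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
# region may be required by some S3-compatible providers
AWS_DEFAULT_REGION=ru-central1
```

The bucket name and host are not set here — they are read from the model's registered URL, so only the credentials need to go in the env file.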
Yandex Cloud — it's an S3-compatible storage provider. I've already uploaded my models to this S3 bucket. Now I want to pass this S3 bucket's credentials to clearml-serving-triton.
Just making sure I understand: you were able to upload your models with clearml to the Yandex-compatible s3 storage?
AgitatedDove14 thx for your answer, but my question is just about how to pass the environment variables for Yandex Cloud S3 storage to clearml-serving. The documentation has examples for Azure etc., but when I try something similar I get a connection error to my S3 storage.
SweetBadger76 thx for your answer, but can you send a link to the documentation about this? I didn't see it there. For clearml-server it's okay, but for clearml-serving…
AbruptHedgehog21 the bucket and the full link are registered on the model object itself; you can see them in the UI, under the Models tab. The only thing you actually need to pass in is the credentials. Make sense?
hi AbruptHedgehog21
clearml-serving will use your clearml.conf file
Configure it to access your s3 bucket — that is where the bucket, host, etc. go
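For reference, an S3-compatible endpoint such as Yandex Cloud is typically declared in clearml.conf under sdk.aws.s3.credentials — a sketch with placeholder host, bucket, and key values:

```
sdk {
    aws {
        s3 {
            credentials: [
                {
                    # S3-compatible endpoint (placeholder — use your provider's host)
                    host: "storage.yandexcloud.net:443"
                    bucket: "my-models-bucket"
                    key: "YOUR_ACCESS_KEY_ID"
                    secret: "YOUR_SECRET_ACCESS_KEY"
                    secure: true
                }
            ]
        }
    }
}
```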
Hi QuaintStork2, I had a look at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
and didn't see any variables for bucket name and host. Access key and token — OK. But I also need the host and bucket name)
Hi AbruptHedgehog21
How can I add S3 credentials for my S3 bucket in example.env for clearml-serving-triton? I need to add the bucket name, keys, and endpoint.
Basically boto (s3) environment variables would just work:
https://clear.ml/docs/latest/docs/clearml_serving/clearml_serving#advanced-setup---s3gsazure-access-optional
Hi SweetBadger76, I use Yandex Cloud. I already connected this bucket to the clearml server (in clearml.conf), and my trained models are saved to this bucket. And yes, I've got the list of credentials I need to use.
hi AbruptHedgehog21
which s3 service provider will you use ?
do you have a precise list of the vars you need to add to the configuration to access your bucket? 🙂