Hello community! How can I add S3 credentials for an S3 bucket in example.env for clearml-serving-triton? I need to add the bucket name, keys and endpoint


  
  
Posted 2 years ago

Answers 13


Hi AbruptHedgehog21,
which S3 service provider will you use?
Do you have a precise list of the variables you need to add to the configuration to access your bucket? 🙂

  
  
Posted 2 years ago

Hi SweetBadger76, I use Yandex Cloud. I've already connected this bucket to the clearml server (in clearml.conf) and my trained models are saved to this bucket. And yes, I've got the list of credentials I need to use.

  
  
Posted 2 years ago

Hi AbruptHedgehog21

How can I add S3 credentials for an S3 bucket in example.env for clearml-serving-triton? I need to add the bucket name, keys and endpoint

Basically boto (s3) environment variables would just work:
https://clear.ml/docs/latest/docs/clearml_serving/clearml_serving#advanced-setup---s3gsazure-access-optional
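To sketch what that means in practice: boto3 (and therefore ClearML's S3 driver) reads credentials from standard environment variables, which you can export in the shell that launches the serving containers. The values below are placeholders, and the endpoint variable is only honored by newer boto3 releases, so treat that line as an assumption for your setup:

```shell
# Standard boto3/AWS credential environment variables (placeholder values)
export AWS_ACCESS_KEY_ID="YOUR_KEY_ID"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET"
# Region is optional for many S3-compatible providers
export AWS_DEFAULT_REGION="ru-central1"
# Assumption: only recent boto3 versions (1.28+) read this; older ones ignore it
export AWS_ENDPOINT_URL_S3="https://storage.yandexcloud.net"
```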

  
  
Posted 2 years ago

Hi QuaintStork2, I had a look at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
and didn't see any variables for the bucket name and host. Access key and token are fine, but I also need the host and bucket name)

  
  
Posted 2 years ago

Hi AbruptHedgehog21,
clearml-serving will use your clearml.conf file.
Configure it to access your S3 bucket - that is the place for the bucket, host, etc.
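For reference, the per-bucket section of clearml.conf looks roughly like this (host, bucket and key values are placeholders; check the ClearML storage docs for the exact fields your version supports):

```
sdk {
    aws {
        s3 {
            # per-bucket credentials for an S3-compatible provider
            credentials: [
                {
                    host: "storage.yandexcloud.net:443"   # assumed endpoint, placeholder
                    bucket: "my-models-bucket"            # placeholder
                    key: "YOUR_KEY_ID"
                    secret: "YOUR_SECRET"
                    multipart: false
                    secure: true
                }
            ]
        }
    }
}
```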

  
  
Posted 2 years ago

AbruptHedgehog21 the bucket and the full link are registered on the model object itself; you can see them in the UI, under the Models tab. The only thing you actually need to pass in is the credentials. Make sense?

  
  
Posted 2 years ago

SweetBadger76 thanks for your answer, but can you send a link to where this is documented? I didn't see it there. For clearml-server it's okay, but for clearml-serving…

  
  
Posted 2 years ago

AgitatedDove14 thanks for your answer, but my question is just about how to pass the environment variables for Yandex Cloud S3 storage to clearml-serving. The documentation only has examples for Azure etc., and if I try something like that I get a connection error to my S3 storage.

  
  
Posted 2 years ago

Just making sure I understand: you were able to upload your models with clearml to the Yandex-compatible S3 storage?

  
  
Posted 2 years ago

Yandex Cloud is my S3 storage provider, and I already upload my models to this S3. Now I want to pass this S3 bucket's credentials to clearml-serving-triton.

  
  
Posted 2 years ago

If this is the case, and assuming you were able to use clearml to upload them, then adding

AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY

to your env file should just work:
https://github.com/allegroai/clearml-serving/blob/main/docker/example.env
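Assuming the example.env linked above, the additions would look something like this (values are placeholders; the region line is optional for many S3-compatible providers):

```
# docker/example.env — S3 credentials picked up by the serving containers
AWS_ACCESS_KEY_ID=YOUR_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_SECRET
AWS_DEFAULT_REGION=ru-central1
```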

Make sense?

  
  
Posted 2 years ago

Thanks, I'll try it and report back whether it works for me.

  
  
Posted 2 years ago

AgitatedDove14 SweetBadger76 I hope you can help with one more question.
For the test I logged my new model to the clearml-server file host and take the models for clearml-serving from there. It works with clearml-serving model add, but with clearml-serving model auto-update I don't quite understand what happens. I see my auto-update model in the models list in serving, but in the output logs triton prints that the poll failed.

Command to add a model (which works):
clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model add --engine triton --endpoint "test_model_pytorch_auto" --preprocess "examples/pytorch/preprocess.py" --name "test torch auto update v2" --project "serving test" --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 --output-size -1 10 --output-name "OUTPUT__0" --output-type float32
Command for model auto-update (which has some problems):
clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model auto-update --engine triton --endpoint "test_model_pytorch_auto_v2" --preprocess "examples/pytorch/preprocess.py" --name "test torch auto update v2" --project "serving test" --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 --output-size -1 10 --output-name "OUTPUT__0" --output-type float32

In the output of clearml-serving --id 5e4851e9ec3f4f148e6bd21e68fe22c1 model list, I see the model I added with model add under endpoints, and I can make requests and get responses, but the model I added with model auto-update appears under Model Monitoring and I can't make requests to it.

  
  
Posted 2 years ago