Answered
Deploying ClearML on EKS: issue authenticating the server with the S3 bucket

Hi,

We're deploying ClearML on EKS and have an issue authenticating the server with the S3 bucket.

The connection to the S3 bucket is not working.

Our current diagnosis: ClearML internally uses AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, but we authenticate using saml2aws, where AWS_SESSION_TOKEN and AWS_SECURITY_TOKEN are also necessary. ClearML seems to leave these out when creating the boto object, so the authentication fails.
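
For illustration, a minimal boto3 sketch of the difference we suspect (the bucket name is a placeholder): temporary STS credentials are only valid when the session token is passed along with the key pair.

```python
import os

import boto3

# Hypothetical bucket name, for illustration only.
BUCKET = "my-company-ml-artifacts"

# What we believe ClearML effectively does today: key + secret only.
# With temporary saml2aws credentials this fails, because STS keys
# (the ones starting with "ASIA") are only valid together with a session token.
s3_without_token = boto3.client(
    "s3",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)

# What temporary credentials actually require: the session token as well.
s3_with_token = boto3.client(
    "s3",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    aws_session_token=os.environ["AWS_SESSION_TOKEN"],
)

s3_with_token.head_bucket(Bucket=BUCKET)  # succeeds only if the token is valid
```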

Is it possible to make a connection to an S3 bucket via this authentication method with the open source version on EKS?

Thanks 🙏

  
  
Posted one year ago

Answers 12


Good point!
I'll make sure we do 🙂

  
  
Posted one year ago

Is it possible to make a connection to an S3 bucket via this authentication method with the open source version on EKS?

Hi BoredBluewhale23
In your setup, are we talking about agents running inside the Kubernetes cluster, or clients connecting from their own machines?

  
  
Posted one year ago

As you mentioned, it seems the credentials you're using are not passed in when the boto object is created. Is there a specific reason you don't want to use access/secret key pairs for authentication?

  
  
Posted one year ago

In the case of clients in the Kubernetes cluster, we use IAM policies attached to the serviceaccount to enable access to the s3 bucket. In this case I wonder how the clearml sdk gets access to the s3 bucket if it relies on the secret access key and access key id.

Right, basically someone needs to configure the "regular" environment variables for boto to use the IAM role. clearml basically uses boto, so it should be transparent. Does that make sense? How do you spin up the job on the k8s cluster, and how do you configure it?
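
As a rough sketch of what "transparent" would look like on EKS (nothing here is ClearML-specific; AWS_ROLE_ARN and AWS_WEB_IDENTITY_TOKEN_FILE are the variables the EKS pod-identity webhook injects when the service account is annotated with a role):

```python
import os

import boto3

# On EKS with IAM Roles for Service Accounts, the pod-identity webhook
# injects these two variables; boto3's default chain uses them to call
# AssumeRoleWithWebIdentity behind the scenes.
print("AWS_ROLE_ARN:", os.environ.get("AWS_ROLE_ARN"))
print("AWS_WEB_IDENTITY_TOKEN_FILE:", os.environ.get("AWS_WEB_IDENTITY_TOKEN_FILE"))

# No explicit keys anywhere: the default credential chain does the work.
credentials = boto3.Session().get_credentials()
print("resolved access key:", credentials.access_key if credentials else None)
```

If the default chain resolves credentials here, any library that simply builds a boto3 client without explicit keys should work the same way.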

Since these are temp credentials, we need to use the session token and security token for access.

Hmm, and were you able to get a token that will last for the entire "running time" of the Task? When are you calling saml2aws, and are you using the output to configure the OS environment variables?

  
  
Posted one year ago

ColorfulBeetle67

  
  
Posted one year ago

Right, basically someone needs to configure the "regular" environment variables for boto to use the IAM role. clearml basically uses boto, so it should be transparent. Does that make sense? How do you spin up the job on the k8s cluster, and how do you configure it?

Yep, I was thinking the same, that the design choice must have been inspired by transparency. At the moment we just use the SDK to log training runs, model artifacts etc. and upload the model. We don't use the clearml agent. The pod has an annotation with an AWS role which has write access to the s3 bucket.
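
For context, SDK-only usage of that kind looks roughly like the sketch below; project, task and bucket names are made up, while Task.init(output_uri=...) and upload_artifact are standard SDK calls.

```python
from clearml import Task

# Hypothetical project/task/bucket names; output_uri points artifact and
# model uploads at S3, which is where the credentials question comes in.
task = Task.init(
    project_name="demo-project",
    task_name="training-run",
    output_uri="s3://my-company-ml-artifacts/clearml",
)

# ... training happens here ...

# Artifacts registered on the task are uploaded to the output_uri above.
task.upload_artifact(name="metrics", artifact_object={"accuracy": 0.93})
```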

Hmm, and were you able to get a token that will last for the entire "running time" of the Task? When are you calling saml2aws, and are you using the output to configure the OS environment variables?

Yes. The token lasts 1 hour, and my task takes less than 5 mins. Yes, saml2aws configures the default variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN. I was expecting clearml to pick them up from the environment by default. From what I understand from https://github.com/allegroai/clearml/blob/master/clearml/storage/helper.py only the access_key_id and secret_access_key are used. Isn't this a bit restricting, since the use of temporary credentials is quite common?
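
A quick diagnostic for this (nothing ClearML-specific) is to confirm, inside the same Python process, that the saml2aws variables are actually visible and that boto3's default chain resolves them, token included:

```python
import os

import boto3

# Are the saml2aws-exported variables visible to this process?
for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN"):
    print(var, "is set" if os.environ.get(var) else "is MISSING")

# boto3's default chain reads all three; if the token shows up here, the
# environment is fine and the question is how the S3 client gets built.
credentials = boto3.Session().get_credentials()
print("session token resolved:", bool(credentials and credentials.token))
```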

  
  
Posted one year ago

The pod has an annotation with an AWS role which has write access to the s3 bucket.

So assuming the boto environment variables are configured to use the IAM role, it should be transparent, no? (I can't remember what the exact envs are, but Google will probably solve it 🙂)

AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN. I was expecting clearml to pick them up from the environment by default.

Yes it should; the OS env will always override the configuration file section.
Are you saying it is not working for you?
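
For completeness, a small sketch of that precedence point: whatever ends up in the standard AWS_* variables wins over the config file, so one option is to populate them programmatically from whatever boto3 resolves (for example the profile saml2aws wrote) before the SDK is used. Whether the session token is then actually consumed is what the rest of the thread works out.

```python
import os

import boto3

# Resolve whatever credentials the default chain finds (env vars, the
# ~/.aws/credentials profile written by saml2aws, an IAM role, ...).
credentials = boto3.Session().get_credentials()
assert credentials is not None, "no AWS credentials resolved"
frozen = credentials.get_frozen_credentials()

# Re-export them as the standard variables, so anything that only reads
# the environment sees the full triple, session token included.
os.environ["AWS_ACCESS_KEY_ID"] = frozen.access_key
os.environ["AWS_SECRET_ACCESS_KEY"] = frozen.secret_key
if frozen.token:
    os.environ["AWS_SESSION_TOKEN"] = frozen.token
```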

  
  
Posted one year ago

This works as expected! Thanks AgitatedDove14. Maybe we could add it to the documentation at https://clear.ml/docs/latest/docs/integrations/storage/ ? I think it's important.

  
  
Posted one year ago

AgitatedDove14 Thank you! I was wondering what this flag was about! I will test this and update here for future reference!

  
  
Posted one year ago

Hi. It's regarding boto clients inside the Kubernetes cluster as well as clients on the developers' machines.
In the case of clients in the Kubernetes cluster, we use IAM policies attached to the serviceaccount to enable access to the s3 bucket. In this case I wonder how the clearml sdk gets access to the s3 bucket if it relies on the secret access key and access key id.
In the case of clients on developer machines, we use https://github.com/Versent/saml2aws to retrieve temporary credentials. Since these are temp credentials, we need to use the session token and security token for access. In both of the above cases, could you clarify if it's possible to connect clearml to s3?

  
  
Posted one year ago

AgitatedDove14 TBH, I haven't tested out the IAM role scenario, since I couldn't get it to work with temp credentials. I can get back to you on this!

And yes, I assumed from the docs that the OS env will overwrite the config file (which I didn't provide, since I set the credentials based on your answer on https://stackoverflow.com/questions/66216294/clearml-how-to-change-clearml-conf-file-in-aws-sagemaker). But somehow it does not work with temp credentials (where, apart from the secret access key and access key ID, the session token is also necessary). Can it be that internally in the clearml python sdk, when the boto3 resource is created, only the secret access key and access key ID are used and the session token is left out?

  
  
Posted one year ago

ColorfulBeetle67 you might need to configure use_credentials_chain, see here:
https://github.com/allegroai/clearml/blob/a9774c3842ea526d222044092172980ae505e24f/docs/clearml.conf#L85
Regarding the token, I did not find any reference to "AWS_SESSION_TOKEN" in the clearml code; my guess is it's used internally by boto?!
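
For reference, what the suggestion amounts to: with use_credentials_chain enabled in the aws.s3 section of clearml.conf, ClearML should defer to boto3's default credential chain, which does handle AWS_SESSION_TOKEN and IAM roles internally. A quick end-to-end check could look like the sketch below; it assumes StorageManager.upload_file is available in the installed SDK version, and the bucket path is a placeholder.

```python
from clearml import StorageManager

# Assumes `use_credentials_chain: true` is set in clearml.conf and that the
# environment (saml2aws variables or the pod's IAM role) is already in place.
# The bucket path and file name are placeholders.
remote_url = StorageManager.upload_file(
    local_file="model.ckpt",
    remote_url="s3://my-company-ml-artifacts/clearml/model.ckpt",
)
print("uploaded to:", remote_url)
```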

  
  
Posted one year ago