Maybe you've already seen this and it doesn't help, but:
Based on ClearML's S3 setup guide, it looks like you can handle #1 and #2 with Boto3, with config that sets the keys per session/profile so they're picked up from a shared config file or AWS config file. Not 100% sure that avoids the credential-expiry-mid-task issue, but I think it should, maybe with some additional code on your side to check for credential updates if you get kicked out of your session.
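Something like this is the "check for cred updates" idea I mean - the helper name is mine, not from boto3 or ClearML, and the boto3 usage at the bottom is an untested sketch:

```python
# Rough sketch: catch an S3 ClientError, and if the error code says the
# STS token expired, rebuild the boto3 session from the shared profile
# and retry. Error codes AWS commonly returns for expired/stale tokens:
EXPIRED_CODES = {"ExpiredToken", "ExpiredTokenException", "RequestExpired"}

def needs_credential_refresh(error_response):
    """Return True if a botocore-style error response means the creds expired."""
    return error_response.get("Error", {}).get("Code") in EXPIRED_CODES

# With boto3 it would look roughly like (not tested against a live session):
#
#   import boto3, botocore.exceptions
#   session = boto3.Session(profile_name="some-profile")
#   s3 = session.client("s3")
#   try:
#       s3.list_buckets()
#   except botocore.exceptions.ClientError as e:
#       if needs_credential_refresh(e.response):
#           # re-read ~/.aws after you refresh your SSO/MFA login
#           session = boto3.Session(profile_name="some-profile")
#           s3 = session.client("s3")
```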
Apologies if I'm repeating things you've read but hope this helps a bit!
Hi @<1541954607595393024:profile|BattyCrocodile47> , sorry - I seem to have missed this... 🙏
The easiest way to achieve #1 would be to use the AWS_PROFILE env var supported by boto3. For #2, if you have an IAM role assigned to an instance, you'll need to make sure the use_credentials_chain option is set to true.
#3 should work 🙂
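For reference, the combination looks roughly like this (double-check the key path against your clearml.conf version - AWS_PROFILE is boto3's env var, and use_credentials_chain sits under the sdk.aws.s3 section as far as I recall):

```
# boto3 picks the profile up from the environment:
export AWS_PROFILE=mlops-tooling

# and in clearml.conf, let boto3 resolve credentials itself
# (profile, env vars, or the instance's IAM role):
sdk {
    aws {
        s3 {
            use_credentials_chain: true
        }
    }
}
```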
Regarding the last questions, yes you can - there's an init bash script you can provide where you can do whatever you wish 🙂
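For example, the init script could pull a deploy key from SecretsManager - the secret name and paths here are made up, adjust to yours, and the instance's IAM role needs secretsmanager:GetSecretValue on that secret:

```bash
#!/bin/bash
# Hypothetical autoscaler init script (untested sketch).
mkdir -p /root/.ssh
aws secretsmanager get-secret-value \
    --secret-id mlops/git-deploy-key \
    --query SecretString --output text > /root/.ssh/id_rsa
chmod 600 /root/.ssh/id_rsa
# trust your git host so the first clone doesn't prompt
ssh-keyscan github.com >> /root/.ssh/known_hosts
```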
@<1541954607595393024:profile|BattyCrocodile47> to answer the original question, I'm not sure each user having their own credentials is such a deviation from the way AWS expected you to use their credentials anyway - seems to me this is their standard mode of operation - you come in to work, use your MFA to generate a new token, then use it for 12 hours 🙂
Basically you can configure ClearML to just use the existing AWS credentials set on the machine without changing the clearml.conf every time (that can be the developer's workstation, or the training machine), so anything you would normally be doing with AWS setup is already supported
Glad I can help Eric!
More or less - it may not be the best (or even a good) approach, but it seems like a decent workaround.
You can read the credentials in with Boto from the shared config:
import boto3

session = boto3.Session(profile_name='some-profile')
dev_s3_client = session.client('s3')
but yes, you may need a separate script to update the values (which I don't think necessarily needs to use Boto), or maybe load the clearml.conf from a shared space (though of course that adds other dependencies)
The Config Vault definitely seems to address this (it's also an Enterprise feature). It looks like you can update the config through the UI and have it merged into the config file while ClearML and ClearML Agent are in use, but I'm not sure whether that could still cause any blips. It comes with this "Productivity tip: Keep the vault disabled while you edit your configuration, and enable it when the configuration is ready."
Yay! Man, I want to do ClearML with "hard mode" (non-enterprise, self-hosted) first, before trying to sell BENlabs (my work) on it. I could see us paying for enterprise to get the Hyper Datasets and Vault features if our scientists/developers fall in love with it--they probably will if we can get them to adopt it since right now we have a homemade system that isn't nearly as nice as ClearML.
@<1523701087100473344:profile|SuccessfulKoala55> how exactly do you configure ClearML to use the credentials on the client machine? Is there a way to set the AWS profile somehow (maybe from the clearml.conf file) to achieve something similar to what @<1550289509273309184:profile|CooperativeBeetle24> was saying?
For example, we just created an "MLOps Tooling" AWS account. And our developers/scientists all have an AWS profile called [mlops-tooling] in their ~/.aws/credentials file (well, actually it's in the ~/.aws/config file since we use AWS SSO, AKA AWS Identity Center).
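(For context, the SSO profile in our ~/.aws/config looks roughly like this, with placeholder values:)

```
[profile mlops-tooling]
sso_start_url  = https://example.awsapps.com/start
sso_region     = us-east-1
sso_account_id = 123456789012
sso_role_name  = MLOpsTooling
region         = us-east-1
```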
- Setting the profile in clearml.conf would be a really nice solution for us because we use the exact workflow you just described. (Except we don't use the default profile - is that what you were suggesting?) Is this possible? I don't see a setting called "aws_profile" in the boto section of clearml.conf
- I guess our auto-scaled instances could do something similar, but with an IAM role. Would giving those instances a role and leaving their clearml.conf files alone achieve that?
- And laaaaaast question: we'll need to authorize our AWS autoscaled ClearML agents to git clone our repositories. I'm assuming if a developer's laptop is being used as an agent, ClearML will just use their SSH keys at the default location of ~/.ssh/id_rsa*. Can we have the autoscaled ClearML EC2 instances in AWS run a script to download those keys from AWS SecretsManager on startup? (That would work nicely because, presumably, the instances will have an IAM role that allows that.)
Thanks Vasil! Can you elaborate on what you mean by using boto3? Do you mean writing a script using boto that pulls the credentials down and writes them to the user's clearml.conf?
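i.e. something like this? (The render_s3_section helper is just my guess at it - the exact clearml.conf field names are worth double-checking against the docs, and the boto3 part at the bottom is untested:)

```python
# Rough guess at the script: pull the current creds from the shared AWS
# profile with boto3, then render the sdk.aws.s3 block of clearml.conf.

def render_s3_section(access_key, secret_key, token=""):
    """Render an sdk.aws.s3 credentials block for clearml.conf (HOCON).
    Field names (key/secret/token) are from memory - verify against docs."""
    return (
        "sdk.aws.s3 {\n"
        f'    key: "{access_key}"\n'
        f'    secret: "{secret_key}"\n'
        f'    token: "{token}"\n'
        "}\n"
    )

# Pulling the live values would be roughly:
#
#   import boto3
#   creds = boto3.Session(profile_name="mlops-tooling").get_credentials()
#   frozen = creds.get_frozen_credentials()
#   section = render_s3_section(frozen.access_key, frozen.secret_key, frozen.token)
#   # ...then splice `section` into ~/clearml.conf
```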
Also, I've been seeing references to "credentials vault" in the docs. I can see this is the problem that it solves.
Just chiming in regarding the vault feature: yes, it does address this issue and allows centralized configuration as well as per-user configuration :)