Hmm, must be something more arcane then. I guess official support would be able to provide an answer; they usually reply within 24 hours
Oops, sorry, didn't read the entire backtrace
I had some credentials issues too in some pipeline steps, and I solved them using:
task = Task.current_task()
task.setup_aws_upload(...)
It allows you to explicitly specify credentials
If you're using Helm it would be at the service level in your values.yaml, not at the pod level
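For reference, explicit S3 credentials can also be set in the SDK's clearml.conf under the sdk.aws.s3 section (a sketch; the host and bucket values below are placeholders, not taken from this thread):

```
sdk {
  aws {
    s3 {
      # default credentials, used for any bucket not listed in `credentials`
      key: "<access_key>"
      secret: "<secret_key>"
      region: ""
      credentials: [
        {
          # per-bucket credentials for a non-AWS S3 endpoint (placeholder values)
          host: "my-s3-host:9000"
          bucket: "my-bucket"
          key: "<access_key>"
          secret: "<secret_key>"
          multipart: false
          secure: false
        }
      ]
    }
  }
}
```

With the agent, this file would need to be present (or mounted) wherever the remote task actually runs.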
If you feel you have a specific enough issue, you can also open a GitHub issue and link this thread in it
FierceHamster54 Should we write to some other chat?
Hello FierceHamster54
We specified the credentials of the S3 bucket in clearml.fileserverSecret and clearml.fileserverKey in the values.yaml of clearml, and specified the address of the S3 bucket in fileServerUrlReference in the values.yaml of clearml-agent.
When we run the task remotely we get an error:
clearml.storage - ERROR - Failed creating storage object s3:// Reason: Missing key and secret for S3 storage access ( s3://)
clearml.metrics - WARNING - Failed uploading to s3:// ('NoneType' object has no attribute 'upload')
clearml.metrics - ERROR - Not uploading 1/1 events because the data upload failed
FierceHamster54 sorry, we got the same error ((
from clearml import Task

task = Task.init(project_name='example', task_name='task template')
task = Task.current_task()
task.setup_aws_upload(
    bucket="bucket",
    host="host:port",
    key="key",
    secret="secret",
    multipart=True,
    secure=False,
    verify=True,
)
task.execute_remotely(queue_name='default')
task.upload_artifact("list", [1, 2, 3])
Hello, FierceHamster54!
Can you please clarify where in the clearml-agent's values.yaml we can specify the credentials? The only thing we've found there is fileServerUrlReference
Or do we have to go to some other file?
The only other place I see is the values.yaml of the main helm chart clearml
(not clearml-agent). But there, I think, we can only set the credentials for the default built-in fileserver, not for some external bucket