Answered

Hello.
In which Kubernetes YAML files, and where exactly, should I specify the AWS file_server and credentials for clearml-agent so that when tasks run remotely in the queue, the artifacts are saved to the S3 bucket?

  
  
Posted 10 months ago

Answers 13


If you're using Helm, it would be at the service level in your values.yml, not at the pod level
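For illustration, a rough sketch of that service-level part of the clearml-agent chart's values.yml; fileServerUrlReference is the key mentioned further down this thread, while its placement under a clearml section and the s3:// value are assumptions that may differ between chart versions:

clearml:
  # Assumed location -- check your chart version's values.yaml
  # Point file/artifact uploads at the bucket instead of the built-in fileserver
  fileServerUrlReference: "s3://<bucket>"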

  
  
Posted 10 months ago

If you feel you have a specific enough issue, you can also open a GitHub issue and link this thread to it

  
  
Posted 10 months ago

No, that's the right one

  
  
Posted 10 months ago

Thank you! We will try.

  
  
Posted 10 months ago

@<1523702000586330112:profile|FierceHamster54> sorry, we got the same error. Here is the code we ran:

from clearml import Task

# Create the task that will be enqueued and executed remotely
task = Task.init(project_name='example', task_name='task template')

# Explicitly point artifact uploads at the S3 (or S3-compatible) bucket
task.setup_aws_upload(bucket="bucket",
                      host="host:port",
                      key="key",
                      secret="secret",
                      multipart=True, secure=False, verify=True)

# Enqueue for remote execution; the artifact is uploaded by the remote run
task.execute_remotely(queue_name='default')

task.upload_artifact("list", [1, 2, 3])
  
  
Posted 10 months ago

Hello @<1523702000586330112:profile|FierceHamster54>
We specified the S3 bucket credentials in clearml.fileserverKey and clearml.fileserverSecret in the values.yml of clearml, and the S3 bucket address in fileServerUrlReference in the values.yml of clearml-agent.

When we run the task remotely, we get an error:

clearml.storage - ERROR - Failed creating storage object s3:// Reason: Missing key and secret for S3 storage access ( s3://)
clearml.metrics - WARNING - Failed uploading to s3:// ('NoneType' object has no attribute 'upload')
clearml.metrics - ERROR - Not uploading 1/1 events because the data upload failed
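
For context, the error means the SDK could not find a key and secret for the s3:// destination. The SDK reads those from the sdk.aws.s3 section of clearml.conf on whatever machine or pod performs the upload; below is a minimal sketch with placeholder values, assuming a non-AWS, S3-compatible endpoint reachable at host:port:

sdk {
  aws {
    s3 {
      # Default credentials used for any s3:// destination
      key: "<access-key>"
      secret: "<secret-key>"
      # Per-endpoint credentials, e.g. for a MinIO or other S3-compatible server
      credentials: [
        {
          host: "host:port"
          key: "<access-key>"
          secret: "<secret-key>"
          multipart: true
          secure: false
        }
      ]
    }
  }
}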

  
  
Posted 10 months ago

I got some credentials issues too in some pipeline steps, and I solved them with:

task = Task.current_task()
task.setup_aws_upload(...)

It lets you explicitly specify the credentials.

  
  
Posted 10 months ago

Hello, @<1523702000586330112:profile|FierceHamster54> !
Can you please clarify where in the clearml-agent's values.yaml we can specify the credentials? We've only found fileServerUrlReference there.

Or do we have to go to some other file?
The only other place I see is the values.yaml of the main clearml Helm chart (not clearml-agent), but there I think we can only set credentials for the default built-in fileserver, not for an external bucket.

  
  
Posted 10 months ago

Hmm, it must be something more arcane then. I guess the official support would be able to provide an answer; they usually respond within 24 hours.

  
  
Posted 10 months ago

Oops, sorry, I didn't read the entire backtrace

  
  
Posted 10 months ago

As environment variables
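
For instance, a minimal sketch of credentials passed as env variables to the agent (and the task pods it spawns); the extraEnvs section name is an assumption and varies between chart versions, while the variable names themselves are the standard ones picked up by boto3 and the ClearML SDK:

agentk8sglue:
  # Hypothetical env section name -- check your chart version's values.yaml
  extraEnvs:
    - name: AWS_ACCESS_KEY_ID
      value: "<access-key>"
    - name: AWS_SECRET_ACCESS_KEY
      value: "<secret-key>"
    - name: AWS_DEFAULT_REGION
      value: "<region>"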

  
  
Posted 10 months ago

Thanks a lot! @<1523702000586330112:profile|FierceHamster54>

  
  
Posted 10 months ago

@<1523702000586330112:profile|FierceHamster54> Should we write to some other chat?

  
  
Posted 10 months ago