Answered
Hi everyone, I'm currently trying to set up S3 as a file server for storing all my data (artifacts, datasets, etc.) using ClearML, but I'm encountering some issues with the configuration. I am getting this error

Hi everyone,

I'm currently trying to set up S3 as a file server for storing all my data (artifacts, datasets, etc.) using ClearML, but I'm encountering some issues with the configuration.
I am getting this error:

clearml.storage - ERROR - Failed creating storage object None Reason: Missing key and secret for S3 storage access (None)
ValueError: Could not get access credentials for 'None', check configuration file ~/clearml.conf

  
  
Posted one month ago

Answers 3


After copying the clearml.conf file to /opt/clearml/config/, the error has disappeared, but the files still aren't being reflected on S3.

I'm using the code below to upload the file:

from clearml import Task

# point the task's output (artifacts, models) at the S3 bucket
task = Task.init(project_name="Test Project", task_name="Test Task", output_uri="s3://clearml-s3/")
# task = Task.init(project_name="Test Project", task_name="Test Task")
task.set_parameter(name='disable_caching', value=True, description='Disable caching for this task')
task.upload_artifact(name="/content/sample_data/README.md", artifact_object={"key": "value"})
# artifact = task.artifacts.get("/content/sample_data/README.md")
# print(artifact.get_local_copy())
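
For reference, a minimal sketch (assuming the same s3://clearml-s3 bucket and the credentials in ~/clearml.conf; the destination path is made up) that uploads a file directly with StorageManager, to check whether the SDK can write to the bucket at all:

# Hypothetical standalone check: upload a file straight to the bucket,
# bypassing Task artifacts, using the credentials from ~/clearml.conf.
from clearml import StorageManager

local_path = "/content/sample_data/README.md"      # same local file as above
remote_url = StorageManager.upload_file(
    local_file=local_path,
    remote_url="s3://clearml-s3/debug/README.md",  # assumed destination path
)
print("Uploaded to:", remote_url)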
  
  
Posted one month ago

Here’s the relevant portion of my clearml.conf file located at ~/clearml.conf:

sdk {
    aws {
        s3 {
            key: "AWS_ACCESS_KEY"
            secret: "AWS_SECRET_KEY"
            region: "us-east-1"
            use_credentials_chain: true

            credentials: [
                {
                    bucket: "clearml-s3"
                    key: "AWS_ACCESS_KEY"
                    secret: "AWS_SECRET_KEY"
                    region: "us-east-1"
                    use_credentials_chain: true
                }
            ]
        }

        boto3 {
            pool_connections: 512
            max_multipart_concurrency: 16
        }
    }

    storage {
        default_output_uri: "s3://clearml-s3/"
    }
}
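
As a side check, here is a minimal boto3 sketch (with the AWS_ACCESS_KEY / AWS_SECRET_KEY placeholders above standing in for the real values) to confirm the keys can reach the bucket outside of ClearML:

# Hypothetical sanity check: verify the access key/secret can reach the
# clearml-s3 bucket, independent of the ClearML SDK (which uses boto3 internally).
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AWS_ACCESS_KEY",       # placeholder, as in clearml.conf
    aws_secret_access_key="AWS_SECRET_KEY",   # placeholder
    region_name="us-east-1",
)

# head_bucket raises botocore.exceptions.ClientError if the bucket is
# unreachable or the credentials are invalid
s3.head_bucket(Bucket="clearml-s3")
print("Credentials can access s3://clearml-s3/")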

I'm using a t3a.large instance and running the default Docker Compose file on Ubuntu.

Despite following the documentation, I’m having trouble getting it to work. If anyone has experience setting up S3 with ClearML or can point me in the right direction, I would really appreciate your help!

Thanks in advance!

  
  
Posted one month ago

Hey @CostlyOstrich36, could you help me out? I’m having trouble figuring out why it’s not working. I’d really appreciate your help!

  
  
Posted one month ago