
I configured S3 storage in my clearml.conf file on a worker machine. Then I ran an experiment that produced a small artifact, and it doesn't appear in my cloud storage. What am I doing wrong? How do I make artifacts appear in my S3 storage?
Below is a sample of clearml.conf with the S3 configuration.
s3 {
    key: "mykey"
    secret: "mysecret"
    region: "myendpoint"
    credentials: [
        # specifies key/secret credentials to use when handling s3 urls (read or write)
        {
            bucket: "mybucket"
            key: "mykey"
            secret: "mysecret"
        },
    ]
}

  
  
Posted 2 years ago

Answers 41


@<1526734383564722176:profile|BoredBat47> Yeah. This is an example:

 s3 {
    key: "mykey"
    secret: "mysecret"
    region: "us-east-1"
    credentials: [
        {
            bucket: ""
            key: "mykey"
            secret: "mysecret"
            region: "us-east-1"
        },
    ]
}
# some other config
default_output_uri: ""
  
  
Posted 2 years ago

` from random import random
from clearml import Task, TaskTypes

args = {}
task: Task = Task.init(
    project_name="My Proj",
    task_name='Sample task',
    task_type=TaskTypes.inference,
    auto_connect_frameworks=False
)
task.connect(args)
task.execute_remotely(queue_name="default")
value = random()
task.get_logger().report_single_value(name="sample_value", value=value)
with open("some_artifact.txt", "w") as f:
    f.write(f"Some random value: {value}\n")
task.upload_artifact(name="test_artifact", artifact_object="some_artifact.txt") `

  
  
Posted 2 years ago

May I know which environment variable to set the cert in?
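Assuming the certificate needs to reach Python's requests/botocore stack (which handles ClearML's HTTP and S3 traffic), the standard environment variables are these; the bundle path is a placeholder:

```shell
# Point requests (HTTP API calls) and botocore (S3 calls) at a custom CA bundle.
# /path/to/ca-bundle.pem is a placeholder for your actual certificate bundle.
export REQUESTS_CA_BUNDLE=/path/to/ca-bundle.pem
export AWS_CA_BUNDLE=/path/to/ca-bundle.pem
```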

  
  
Posted 2 years ago

` s3 {
    # S3 credentials, used for read/write access by various SDK elements

    # default, used for any bucket not specified below
    key: "mykey"
    secret: "mysecret"
    region: ""

    credentials: [
        {
            bucket: "mybucket"
            key: "mykey"
            secret: "mysecret"
            region: ""
        }, `
  
  
Posted 2 years ago

I think that will work, but I'm not actually sure. I know for sure that something like us-east-2 is supported

  
  
Posted 2 years ago

check the output_uri parameter in Task.init
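Besides passing output_uri per task in Task.init, the same default can be set globally in clearml.conf (a sketch; the bucket name is a placeholder):

```
sdk {
    development {
        # Default destination for artifacts/models when output_uri is not passed to Task.init
        default_output_uri: "s3://mybucket"
    }
}
```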

  
  
Posted 2 years ago

@<1523701435869433856:profile|SmugDolphin23> @<1523701087100473344:profile|SuccessfulKoala55>
2023-02-03 20:38:14,515 - clearml.metrics - WARNING - Failed uploading to <my-endpoint> (HTTPSConnectionPool(host='endpoint', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131)'))))
2023-02-03 20:38:14,517 - clearml.metrics - ERROR - Not uploading 1/2 events because the data upload failed

  
  
Posted 2 years ago

@<1523701087100473344:profile|SuccessfulKoala55> I figured out where to find a region, but we don't have an AWS dashboard. We have a custom S3 solution on our own enterprise servers, like many companies do; data is not stored on Amazon servers. That is why we have an endpoint, which is a URL starting with http:// If I were to connect to our bucket via boto3, I would pass the endpoint to a client session with endpoint_url

  
  
Posted 2 years ago

Oh, it's configured on the agent machine, got you

  
  
Posted 2 years ago

The code is run from another machine, where clearml.conf is configured to connect to the ClearML server; no other configuration is provided

  
  
Posted 2 years ago

@<1523701435869433856:profile|SmugDolphin23> I actually don't know where to get the region for the S3 credentials I am using. From what I figured, I have to plug my secret key, access key, and bucket into the credentials in the agent's config, and the output URI must be my S3 endpoint, a complete URI with the protocol. Is that correct?

  
  
Posted 2 years ago
176K Views