Answered
I configured S3 storage in my clearml.conf file on a worker machine. Then I ran an experiment which produced a small artifact and it doesn't appear in my cloud storage. What am I doing wrong? How do I make artifacts appear in my S3 storage?

I configured S3 storage in my clearml.conf file on a worker machine. Then I ran an experiment which produced a small artifact, but it doesn't appear in my cloud storage. What am I doing wrong? How do I make artifacts appear in my S3 storage?
Below is a sample of clearml.conf with the S3 configuration.
        s3 {
            key: "mykey"
            secret: "mysecret"
            region: "myendpoint"
            # specifies key/secret credentials to use when handling s3 urls (read or write)
            credentials: [
                {
                    bucket: "mybucket"
                    key: "mykey"
                    secret: "mysecret"
                },
            ]
        }

Posted 2 years ago

Answers 41


A bit overwhelmed by the configuration, since there's an agent, a server, and a bunch of configuration files; it's easy to mess something up

Posted 2 years ago

SmugDolphin23 I added a region and ran the experiment again. Didn't work

Posted 2 years ago

@<1523701435869433856:profile|SmugDolphin23> I didn't use a region at first and that was not working. Now I use a region and it still doesn't work.
With boto3 from Python I can create a session where I specify the access key and secret key, then create a client from the session where I pass service_name and endpoint_url. That works just fine
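
Roughly what I mean, as a minimal sketch (the key/secret/endpoint/bucket values are placeholders):

    import boto3

    # Sketch of the boto3 access that works for me;
    # credentials and endpoint are placeholders, not real values.
    session = boto3.Session(
        aws_access_key_id="mykey",
        aws_secret_access_key="mysecret",
    )
    client = session.client(
        service_name="s3",
        endpoint_url="https://myendpoint:443",
    )
    # Listing the bucket succeeds, so the credentials and endpoint are fine
    print(client.list_objects_v2(Bucket="mybucket").get("KeyCount"))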

Posted 2 years ago

I think that will work, but I'm not sure, actually. I know for sure that something like us-east-2 is supported.

Posted 2 years ago

It's the same file you added your S3 creds to

Posted 2 years ago

@<1523701087100473344:profile|SuccessfulKoala55> Hey Jake, getting back to you. I haven't been able to resolve my issue. I can access my bucket by any other means just fine, e.g. with the S3 CLI client. All the tools I use require four parameters: access key, secret key, endpoint, and bucket. I wonder why ClearML doesn't have an explicit endpoint parameter and you have to use output_uri for it, and why there is a region when other tools don't require one.
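
For context, this is a minimal sketch of how output_uri gets passed (project/task names and the bucket URL are placeholders; output_uri itself is a documented Task.init parameter):

    from clearml import Task

    # Placeholder names; the endpoint and bucket here have to match
    # the credentials section in clearml.conf
    task = Task.init(
        project_name="My Proj",
        task_name="Sample task",
        output_uri="s3://myendpoint:443/mybucket",
    )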

Posted 2 years ago

Oh, it's configured on the agent machine, got you

Posted 2 years ago

Hi again, @<1526734383564722176:profile|BoredBat47> ! I actually took a closer look at this. The config file should look like this:

        s3 {
            key: "KEY"
            secret: "SECRET"
            use_credentials_chain: false

            credentials: [
                {
                    host: "myendpoint:443"  # no http(s):// and no s3:// prefix, also no bucket name
                    key: "KEY"
                    secret: "SECRET"
                    secure: true  # if https
                },
            ]
        }
        default_output_uri: "s3://myendpoint:443/mybucket"  # notice the s3:// prefix (not http(s))

The region should be optional, but try setting it as well if this doesn't work.
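
If it still fails, a quick way to smoke-test the credentials from the same machine is to upload a file directly (file name and bucket URL below are placeholders; StorageManager.upload_file is the clearml helper):

    from clearml import StorageManager

    # Hypothetical smoke test: upload a local file straight to the bucket
    # using the credentials from clearml.conf. Paths are placeholders.
    remote_url = StorageManager.upload_file(
        local_file="some_artifact.txt",
        remote_url="s3://myendpoint:443/mybucket/some_artifact.txt",
    )
    print(remote_url)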

Posted 2 years ago

SmugDolphin23

Posted 2 years ago

@<1523701087100473344:profile|SuccessfulKoala55> I figured out where to find a region, but we don't have an AWS dashboard. We have a custom S3 solution on our own enterprise servers, like many companies do; the data is not stored on Amazon servers. That is why we have an endpoint, which is a URL starting with http://. If I were connecting to our bucket via boto3, I would pass the endpoint to the client session as endpoint_url.

Posted 2 years ago

    from random import random
    from clearml import Task, TaskTypes

    args = {}
    task: Task = Task.init(
        project_name="My Proj",
        task_name='Sample task',
        task_type=TaskTypes.inference,
        auto_connect_frameworks=False,
    )
    task.connect(args)
    task.execute_remotely(queue_name="default")
    value = random()
    task.get_logger().report_single_value(name="sample_value", value=value)
    with open("some_artifact.txt", "w") as f:
        f.write(f"Some random value: {value}\n")
    task.upload_artifact(name="test_artifact", artifact_object="some_artifact.txt")

Posted 2 years ago