Answered

Hey guys, do any of you have experience configuring the ClearML agent to use a custom non-AWS S3 cloud which uses an HTTP endpoint? If so, how did you do it? I would really like to use ClearML, but I can't figure out how to set it up properly. I tried dozens of different ways of filling in the values: output_uri, host, bucket, etc., with no luck. It's really a shame that something as simple as connecting to an S3 cloud requires so much fiddling and juggling, and the documentation has very little info.

  
  
Posted one year ago

Answers 24


My current setup is:
sdk.development.default_output_uri=< None > # no port, no bucket
sdk.aws.s3.key=<my-access-key>
sdk.aws.s3.secret=<my-secret-key>
sdk.aws.s3.region=<my-region> # I think this can be skipped, but somewhere in the ClearML code it says it must be specified if it's not the default (us-east-1 or something)
sdk.aws.s3.credentials.bucket=<my-bucket> # just a bucket name
sdk.aws.s3.credentials.host=< None : 443> # the same as output_uri, with port 443 plugged in
sdk.aws.s3.credentials.key=<my-access-key>
sdk.aws.s3.credentials.secret=<my-secret-key>
sdk.aws.s3.credentials.region=<my-region>
sdk.aws.s3.credentials.secure=true
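
For comparison, the ClearML docs' MinIO-style example puts the per-host settings inside a credentials list rather than flat keys, with the port appearing both in host and in the output URI. A sketch with placeholder values (hostname, port, keys and bucket are illustrative):

```
sdk {
    development {
        # port and bucket go in the URI itself
        default_output_uri: "s3://my-s3.example.com:443/my-bucket"
    }
    aws {
        s3 {
            credentials: [
                {
                    # for a non-AWS endpoint, host must include the port
                    host: "my-s3.example.com:443"
                    key: "<my-access-key>"
                    secret: "<my-secret-key>"
                    multipart: false
                    secure: true  # true for https endpoints
                }
            ]
        }
    }
}
```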

  
  
Posted one year ago

Thanks for the reply! We have a custom S3 server, and it has a URL endpoint like https://<some-domain>.<sub-domain>. I've read in the docs that when you provide credentials.host, the port must be specified. @<1523701070390366208:profile|CostlyOstrich36>

  
  
Posted one year ago

Hi @<1526734383564722176:profile|BoredBat47> , it should be very easy and I've done it multiple times. For the quickest fix you can use api.files_server in clearml.conf

  
  
Posted one year ago

Maybe @<1523701087100473344:profile|SuccessfulKoala55> has more insight into this 🙂

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36>

  
  
Posted one year ago

Runs perfectly with Minio too 🙂

  
  
Posted one year ago

How are you currently setting it up?

  
  
Posted one year ago

@<1523701087100473344:profile|SuccessfulKoala55> It's the URL I use when creating a boto3 session from Python, like this for example:

s3 = self.session.client(
    service_name='s3',
    endpoint_url=endpoint,
    verify=False
)
  
  
Posted one year ago

sdk.development.default_output_uri=< None > # no port, no bucket

@<1526734383564722176:profile|BoredBat47> this points to the s3 server?

  
  
Posted one year ago

@<1523701087100473344:profile|SuccessfulKoala55> No port needed when accessing this URL from things like boto3 or s3-client CLI

  
  
Posted one year ago

session = boto3.Session(
    aws_access_key_id=self.access_key,
    aws_secret_access_key=self.secret_key)
  
  
Posted one year ago

He tried to help me in another thread but I still couldn't make things work

  
  
Posted one year ago

If it points to your own S3 server, it must have a port

  
  
Posted one year ago

@<1523701087100473344:profile|SuccessfulKoala55> Right

  
  
Posted one year ago

Sorry guys, maybe I'm not expressing myself clearly, or there's something I'm missing; I'm not a native speaker, so I'll try to rephrase. What we have is an enterprise solution built on S3 technology. I don't have access to the servers it runs on, and I don't have a port. All I have been provided with are: a secret key, an access key, an endpoint that looks like a regular web URL, and a bucket name. Using these creds I can access this cloud storage just fine by any means except ClearML.

  
  
Posted one year ago

Docstring from inside the boto3 lib says:

:param endpoint_url: The complete URL to use for the constructed
    client. Normally, botocore will automatically construct the
    appropriate URL to use when communicating with a service.  You
    can specify a complete URL (including the "http/https" scheme)
    to override this behavior.  If this value is provided,
    then ``use_ssl`` is ignored.

I want ClearML to use my endpoint

  
  
Posted one year ago

@<1526734383564722176:profile|BoredBat47> , what happens if you configure it like @<1523701087100473344:profile|SuccessfulKoala55> is suggesting?

  
  
Posted one year ago

No port needed when accessing this URL from things like boto3 or s3-client CLI

A port is needed when using it through ClearML

  
  
Posted one year ago

My question boils down to this: what gets plugged into endpoint_url in the boto3 client inside ClearML?
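
For what it's worth, a rough sketch of what presumably happens under the hood (illustrative only, not ClearML's actual internals): the credentials.host value and the secure flag get combined into the endpoint_url handed to boto3, which would explain why the port has to live inside host:

```python
# Illustrative sketch only -- not ClearML's actual code.
# Assumption: credentials.host is "hostname:port" and credentials.secure
# picks the scheme; together they form boto3's endpoint_url.

def build_endpoint_url(host: str, secure: bool = True) -> str:
    """Turn a clearml.conf-style 'hostname:port' into a boto3 endpoint_url."""
    scheme = "https" if secure else "http"
    return f"{scheme}://{host}"

print(build_endpoint_url("my-s3.example.com:443"))          # https://my-s3.example.com:443
print(build_endpoint_url("my-s3.example.com:9000", False))  # http://my-s3.example.com:9000
```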

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36> You mean using a port in credentials.host?

  
  
Posted one year ago

This is what it does when you specify a port...

  
  
Posted one year ago

After I run my experiment, I get a console error saying I am missing security headers; it's a custom XML response. The same behaviour occurs when just curling the endpoint or plugging it into the browser. But when I run e.g. a boto3 client where I explicitly specify the endpoint, access key, secret key and bucket, I can do whatever I want. So it seems to me ClearML is trying to reach this endpoint in some incorrect way.

  
  
Posted one year ago

@<1523701087100473344:profile|SuccessfulKoala55> So I have to provide a host for it to work and no other way around it?

  
  
Posted one year ago

@<1526734383564722176:profile|BoredBat47> if your S3 server is using https (which I assume it does) the port will be 443
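
So, assuming an https endpoint, the relevant entries would look something like this (keeping the placeholders from above; the exact domain and bucket are of course yours to fill in):

```
sdk.aws.s3.credentials.host: "<some-domain>.<sub-domain>:443"
sdk.aws.s3.credentials.secure: true
sdk.development.default_output_uri: "s3://<some-domain>.<sub-domain>:443/<my-bucket>"
```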

  
  
Posted one year ago