Answered
Why Is Async_Delete Not Working?

Why is async_delete not working?

  • The bucket is not right in the logs
  • This is really misleading in the web UI, because it says "success" although async_delete failed miserably.
  • I'm using the latest versions
  • Self-hosted ClearML, self-hosted S3
    [screenshots attached]
  
  
Posted 9 months ago

Answers 80


Hi @<1590514584836378624:profile|AmiableSeaturtle81> , you need to add the port to the credentials when you input them in the webUI

  
  
Posted 8 months ago

I don't have a region. I guess I will wait till tomorrow then?

  
  
Posted 9 months ago

Unable to see the images with that link, though

  
  
Posted 8 months ago

OK, slight update: it seems like artifacts are uploading to the bucket now. Maybe my folder explorer used an old cache or something.
However, reported images are uploaded to the fileserver instead of S3.

Here is the script I'm using to test things. Thanks
[screenshots attached]

  
  
Posted 9 months ago

@<1523701070390366208:profile|CostlyOstrich36> Hello, I'm still unable to understand how to fix this

  
  
Posted 8 months ago

You might want to prefix both the host in the configuration file and the URI in Task.init / StorageHelper.get with "s3." and see if the script above works when you do that.
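A sketch of that suggestion, assuming ClearML's standard aws.s3.credentials layout in clearml.conf (the host name and keys here are placeholders, not the actual values from this thread):

```
# clearml.conf fragment -- note the "s3." prefix on the host
aws {
    s3 {
        credentials: [
            {
                host: "s3.our-host.com"
                key: "xxx"
                secret: "xxx"
                secure: true
            }
        ]
    }
}
```

The output_uri passed to Task.init would then use the same prefixed host, e.g. output_uri="s3://s3.our-host.com/my-bucket".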

  
  
Posted 9 months ago

It looks like I'm moving forward.

Setting the url in clearml.conf without "s3" as suggested works (but I don't add a port there, not sure if that breaks something; we don't have a port)
host: " our-host.com "

Then in test_task.py:
task: clearml.Task = clearml.Task.init(
    project_name="project",
    task_name="task",
    output_uri=" None ",
)

I think the connection is created.
What I'm getting now is a bucket error; I suppose I have to specify it somewhere?
[screenshot attached]

  
  
Posted 9 months ago

I tried it with the port, but I'm still having the same issue.
Tried it with/without secure and multipart.
[screenshots attached]

  
  
Posted 9 months ago

py file:
task: clearml.Task = clearml.Task.init(
    project_name="project",
    task_name="task",
    output_uri=" None ",
)

clearml.conf:
{
    # This will apply to all buckets in this host (unless key/value is specifically provided for a given bucket)
    host: " our-host.com "
    key: "xxx"
    secret: "xxx"
    multipart: false
    secure: true
}
[screenshot attached]

  
  
Posted 9 months ago

There is a typo in the clearml.conf I sent you: on line 87 it should be "key", not "ey". I'm aware of it.

  
  
Posted 9 months ago

2024-02-08 11:23:52,150 - clearml.storage - ERROR - Failed creating storage object
Reason: Missing key and secret for S3 storage access ( )


This looks unrelated to the hotfix; it looks like you misconfigured something and are therefore failing to write to S3.

  
  
Posted 9 months ago

We do, yes. Changing it to https in the settings doesn't help.

  
  
Posted 9 months ago

Hi @<1590514584836378624:profile|AmiableSeaturtle81> ! To help us debug this: are you able to simply use the boto3 python package to interact with your cluster?
If so, what does that code look like? This would give us some insight into how the config should actually look or what changes need to be made.

  
  
Posted 9 months ago

Removing it doesn't fix the problem

  
  
Posted 9 months ago

[screenshot attached]

  
  
Posted 8 months ago

This is unrelated to your routers. There are two things at play here: the configuration of WHERE the data will go (output_uri), and the clearml.conf that you need to set up with credentials. I am telling you, you are setting it wrong. Please follow the documentation.
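To illustrate those two pieces (all values hypothetical): the destination is set in code, e.g. output_uri="s3://our-host.com:443/my-bucket" in Task.init, while the credentials for that same host live in clearml.conf:

```
# clearml.conf fragment -- host must match the one used in output_uri
aws {
    s3 {
        credentials: [
            {
                host: "our-host.com:443"
                key: "xxx"
                secret: "xxx"
                multipart: false
                secure: true
            }
        ]
    }
}
```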

  
  
Posted 9 months ago

So from our IT guys I now know that:
the "s3" part of the url is a subdomain; we use it in all other libs like boto3 and cloudpathlib and never had any problems.
This is where the crash happens inside the clearml Task:
[screenshot attached]

  
  
Posted 9 months ago

@<1590514584836378624:profile|AmiableSeaturtle81> weren't you using https for the s3 host? Maybe the issue has something to do with that?

  
  
Posted 9 months ago

@<1523701435869433856:profile|SmugDolphin23> Setting it without http is not possible, as it auto-fills it back in

  
  
Posted 8 months ago

Bump, still waiting. It's closing in on a month that we've been unable to deploy. We have a team of 10+ people.

  
  
Posted 9 months ago

host: "my-minio-host:9000"

The port should be whatever port is used by your S3 solution
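For example, with a MinIO host on port 9000, the credentials entry and the destination URI would pair up like this (the bucket name is hypothetical, and secure: false assumes plain HTTP on that port):

```
# clearml.conf fragment
aws {
    s3 {
        credentials: [
            {
                host: "my-minio-host:9000"
                key: "xxx"
                secret: "xxx"
                secure: false  # assuming plain HTTP on port 9000
            }
        ]
    }
}

# and in code:
#   output_uri = "s3://my-minio-host:9000/my-bucket"
```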

  
  
Posted 9 months ago

Digging deeper, it seems like a parsing issue:
[screenshot attached]

  
  
Posted 9 months ago

@<1523701435869433856:profile|SmugDolphin23> Any news?

  
  
Posted 8 months ago

Hi @<1590514584836378624:profile|AmiableSeaturtle81> , any non-AWS S3-like storage must have a port in this setup. How did you configure the SDK?
Also, the two ways you're showing are the same: the popup will fill in the details in the settings page.

  
  
Posted 8 months ago

in the code, the output uri should be with None :<PORT>

  
  
Posted 9 months ago

I know these keys work; the url and everything else works, because I use these creds daily.

  
  
Posted 9 months ago

In which UI? Because there are two ways to do it. When clicking on the artifact url there is a popup (but it has no way to change the host url).
Our s3 host doesn't have a port (I didn't specify a port anywhere in clearml.conf and upload works)
[screenshots attached]

  
  
Posted 8 months ago

It looks like the problem is the host field; whenever I add it I get:
2024-01-22 13:27:16,489 - clearml.storage - ERROR - Failed creating storage object None Reason: Missing key and secret for S3 storage access ( None )

  
  
Posted 9 months ago

Yes, the credentials seem to work.
Now I'm trying to figure out why I don't see the uploaded files / folders:

  • I checked whether the clearml task uses the fileserver instead, but I don't see any files in the fileserver folder
  • Nothing is uploaded to the bucket (I will ask the IT guy to check the logs to see whether I'm uploading any files)
    [screenshot attached]
  
  
Posted 9 months ago

This is an actual AWS S3 bucket?

  
  
Posted 9 months ago