AmiableSeaturtle81
Moderator
32 Questions, 118 Answers
  Active since 14 July 2023
  Last activity 21 days ago

Reputation: 0

Badges (1): 115 × Eureka!
0 Votes · 1 Answer · 561 Views
Another annoying problem we have is that task deletion doesn't delete any files at all. I have selected deletion of images, but nothing is deleted. I have to w...
8 months ago
0 Votes · 80 Answers · 22K Views
Why is async_delete not working? - bucket is not right in logs - This is really misleading in the web UI, because it says "success" although async_delete failed ...
8 months ago
0 I've Had This Bug Where Every Few Weeks All My Current Running Experiments Are Stopped And Then Deleted. This Has Now Happened Like 3-4 Times. I Don't Understand What Is Causing It. Model Files, Debug Images Are Saved In Fileserver Folder, But The Task Itse

Hi, thanks for reaching out. Getting desperate here.
Yes, it's self-hosted.
No, only currently running experiments are deleted (the task itself is gone, but debug images and models are present in the fileserver folder).

What I do see is some random Elasticsearch errors popping up from time to time:

[2024-01-05 09:16:47,707] [9] [WARNING] [elasticsearch] POST None ` [status:N/A requ...

8 months ago
0 What Could Be The Reason That I'm Not Getting Any Scalars Reported To ClearML Using Example Script?

I solved the problem.
I had to add a TensorBoard logger and pass it to the pytorch_lightning Trainer as logger=logger.
Is that normal?
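
A minimal sketch of that fix, in case anyone hits the same thing (the save_dir, model and dataloader names are placeholders):

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import TensorBoardLogger

# Create an explicit TensorBoard logger and hand it to the Trainer;
# ClearML's auto-logging then captures the scalars from the TensorBoard reports.
logger = TensorBoardLogger(save_dir="lightning_logs")  # assumed log directory
trainer = Trainer(logger=logger, max_epochs=10)
# trainer.fit(MyLightningModule(), train_dataloaders=train_loader)  # hypothetical module/loader

Since ClearML picks Lightning scalars up via the TensorBoard reports, it is not surprising that an explicit TensorBoard logger was needed here.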

8 months ago
0 Why Is Async_Delete Not Working?

It looks like the problem is the host field; whenever I add it I get:
2024-01-22 13:27:16,489 - clearml.storage - ERROR - Failed creating storage object None Reason: Missing key and secret for S3 storage access ( None )

8 months ago
0 Why Is Async_Delete Not Working?

I can't get the conf credentials to work.
Specifying it like this gives me:
Exception has occurred: ValueError
Could not get access credentials for ' None ' , check configuration file ~/clearml.conf
(screenshot attached)
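
One way to exercise the same credential lookup outside a task (a sketch, assuming the ClearML SDK is installed; the URL below is a placeholder and should use the exact scheme/host you configured in clearml.conf):

from clearml import StorageManager

# Uses the same ~/clearml.conf credential resolution as the task itself.
# If the conf section matches this URL, the call returns the remote path;
# otherwise it fails with the same "Could not get access credentials" error,
# which narrows the problem down to the configuration file.
remote = StorageManager.upload_file(
    local_file="test.txt",                                 # any small local file
    remote_url="s3://my-ceph-host:443/my-bucket/test.txt"  # hypothetical endpoint/bucket
)
print(remote)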

8 months ago
0 I've Had This Bug Where Every Few Weeks All My Current Running Experiments Are Stopped And Then Deleted. This Has Now Happened Like 3-4 Times. I Don't Understand What Is Causing It. Model Files, Debug Images Are Saved In Fileserver Folder, But The Task Itse
  1. Is 50GB of Elasticsearch data normal? Have you seen it elsewhere, or are we doing something wrong? One thing I suspect is that we are probably logging too frequently (a quick way to check index sizes is sketched below).
  2. Is it possible to somehow clean this up?
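
A small sketch for sizing this up, assuming Elasticsearch is reachable on localhost:9200 (adjust the URL and any auth for your deployment); it only inspects index sizes and deletes nothing:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed address of the ES container

# One row per index, largest first, sizes reported in GB.
print(es.cat.indices(v=True, bytes="gb", s="store.size:desc"))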
8 months ago
0 Why Is Async_Delete Not Working?

Also, if I try to set the url to None it auto-replaces it with None : None

6 months ago
0 Why Is Async_Delete Not Working?

Specifying it like this gets me a different error:

Exception has occurred: ValueError

  • Insufficient permissions (delete failed) for None
    botocore.exceptions.ClientError: An error occurred (IllegalLocationConstraintException) when calling the DeleteObject operation: The me-south-1 location constraint is incompatible for the region specific endpoint this request was sent to.

During handling of the above exception, another exception occurred:

File "/home/ma...

8 months ago
0 Why Is Async_Delete Not Working?
  • Here is how the client-side clearml.conf looks, together with the script I'm using to create the tasks. Uploads seem to work and are fixed thanks to you guys 🙌
    (screenshots attached)
6 months ago
0 Why Is Async_Delete Not Working?
  1. This is how the web UI configuration looks:
    (screenshot attached)
6 months ago
0 Why Is Async_Delete Not Working?

The problem is that the clearml.conf s3 config doesn't support an empty region field; even an empty string crashes it.

7 months ago
0 Hi, I've Set Up A ClearML Server With The Default_Output_Uri Pointed To S3. We're Planning To Migrate From S3 To Azure Blob Storage. Is There A Direct Way To Migrate The Data, Or Should We Simply Transfer The Data From S3 To Azure And Update The Default_

We had a similar problem. ClearML doesn't support data migration (not that I know of).
So you have two ways to fix this:

  • Recreate the dataset once it's already in Azure
  • Edit each Elasticsearch database entry to point to the new destination (we did this; a rough sketch follows below)
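
A very rough sketch of the Elasticsearch approach, not an official ClearML migration path. Back up the Elasticsearch data first; the index pattern and the "url" field name below are assumptions, so inspect your own documents and adjust both (the exact client call also differs slightly between elasticsearch-py versions):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed ES address

OLD_PREFIX = "s3://old-bucket/"    # hypothetical source prefix
NEW_PREFIX = "azure://container/"  # hypothetical target prefix

# Rewrite every stored URL that starts with the old prefix, in place.
es.update_by_query(
    index="events-*",                       # assumption: adjust to the real indices
    query={"prefix": {"url": OLD_PREFIX}},  # assumption: field name and keyword mapping
    script={
        "lang": "painless",
        "source": "ctx._source.url = ctx._source.url.replace(params.old, params.new)",
        "params": {"old": OLD_PREFIX, "new": NEW_PREFIX},
    },
    conflicts="proceed",
)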
3 months ago
0 Why Is Async_Delete Not Working?

Yes, the credentials seem to work.
I'm now trying to figure out why I don't see the uploaded files / folders.

  • I checked whether the ClearML task uses the fileserver instead, but I don't see any files in the fileserver folder
  • Nothing is uploaded to the bucket (I will ask the IT guy to check the logs to see whether I'm uploading any files)
    (screenshot attached)
7 months ago
0 Why Is Async_Delete Not Working?

We use a Ceph storage cluster; the interface to it is the same as S3.
I don't get what I have misconfigured.
The only thing I have not added is the "region" field in clearml.conf, because we literally don't have one; it's a self-hosted cluster.
You can try to replicate the s3 config I posted earlier.
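
A quick sanity check below ClearML, assuming an S3-compatible endpoint (Ceph RGW / MinIO style); the endpoint, bucket and credentials are placeholders:

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ceph.example.com:443",  # hypothetical endpoint
    aws_access_key_id="MY_KEY",                   # placeholder
    aws_secret_access_key="MY_SECRET",            # placeholder
    region_name="us-east-1",  # boto3 still wants some region string; S3-compatible backends generally ignore it
)

# List a few objects to confirm the credentials and bucket work at all,
# independently of what clearml.conf does with them.
resp = s3.list_objects_v2(Bucket="clearml-artifacts", MaxKeys=10)  # hypothetical bucket
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])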

7 months ago
0 Hello, Please Don't Tell Me I Just Deleted Something: I Wanted To Do Two Things:

I purged all Docker images and it still doesn't seem right.
I see no side panel and it doesn't ask for a login name.

one year ago
0 Hello, Please Don't Tell Me I Just Deleted Something: I Wanted To Do Two Things:

But it seems like the data is gone; I'm not sure how to get it back.

one year ago
0 Does Dataset.Add_Files Support Uploading From S3 URI? I Have No Problem Uploading To S3 But Can't Use Data That Is Already In S3? Or Am I Doing Something Wrong? I Read In Documentation That Add_External_Files Supports This Feature, But I Want To Be Able To

Our datasets are more than 1TB in size and will grow (probably to 4TB and up), which means we also need 4TB of local storage just to upload the dataset back in zipped format. This is not a good solution.

What we can do, I guess, is do the downloading locally in chunks of files:
download 100 files locally, add them to the ClearML dataset, repeat (a rough sketch follows below).
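
A rough sketch of that chunked approach, under the assumption that the source files sit in an S3 bucket reachable with boto3 and that ClearML's Dataset accepts repeated add_files()/upload() calls before a single finalize(); the bucket, prefix and dataset names are placeholders:

import os
import tempfile

import boto3
from clearml import Dataset

BUCKET = "my-source-bucket"   # hypothetical
PREFIX = "raw-data/"          # hypothetical
CHUNK_SIZE = 100              # files per round trip

s3 = boto3.client("s3")
dataset = Dataset.create(dataset_name="migrated-data", dataset_project="datasets")


def flush(keys):
    """Download one chunk, register it in the dataset, upload, drop the local copy."""
    with tempfile.TemporaryDirectory() as tmp:
        for key in keys:
            # NOTE: basenames only; collisions are possible if keys repeat file names.
            s3.download_file(BUCKET, key, os.path.join(tmp, os.path.basename(key)))
        dataset.add_files(path=tmp)
        dataset.upload()


chunk = []
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        chunk.append(obj["Key"])
        if len(chunk) >= CHUNK_SIZE:
            flush(chunk)  # local disk usage stays at roughly one chunk, not the full 1TB+
            chunk = []
if chunk:
    flush(chunk)

dataset.finalize()  # close the dataset version once everything is uploaded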

5 months ago
0 Why Is Async_Delete Not Working?

But there are still some weird issues; I cannot see the files uploaded in the bucket.

7 months ago