StaleElk72
Moderator
2 Questions, 12 Answers
Active since 17 March 2023
Last activity one year ago

Reputation: 0
Badges: 1
12 × Eureka!
0 Votes 12 Answers 913 Views
Hi! I use a self-hosted server. I uploaded datasets with clearml-data. After a while I fetch one of them: clearml-data get --copy shows_test --id 155299fcad...
one year ago
0 Votes 6 Answers 843 Views
one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

okay, I think I see the pattern: datasets that I added from the storage server itself have "localhost" in the URIs of the files, because clearml.conf on the server has it like that. Datasets that I added remotely have the old IP address.

one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

on the server itself there is a clearml.conf with:

# ClearML SDK configuration file
api {
    # Notice: 'host' is the api server (default port 8008), not the web server.
    api_server: 

    web_server: 

    files_server: 
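
for comparison, a filled-in api section on a self-hosted box usually looks smth like this (the addresses and ports below are just the ClearML defaults, used as placeholders, not my real values):

api {
    api_server: http://<server-address>:8008
    web_server: http://<server-address>:8080
    # afaiu this is the value that gets baked into the URIs of newly added dataset files
    files_server: http://<server-address>:8081
}
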
one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

any docs where I can learn a bit more about the structure of the database? I managed to connect to the MongoDB container. Databases:

> show dbs
admin    0.000GB
auth     0.000GB
backend  0.027GB
config   0.000GB
local    0.000GB

I assume it's backend, so:

> use backend
> show collections
company
model
project
queue
settings
task
task__trash
url_to_delete
user
versions

nothing related to datasets. I would assume a dataset is a task, but I'm not sure.
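
if they really are tasks, smth like this should show them (the system_tags field and the "dataset" tag are my guess from what the API returns, not verified):

> db.task.find({ system_tags: "dataset" }, { name: 1, type: 1 }).limit(5).pretty()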

one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

tbh I have no experience with MongoDB. From what I can see, it's a nested schema, smth like:

execution -> artifacts -> { hash1_output: {uri: ...},  hash2_output: {uri: ... }, ... }

can't compose a compelling find for it
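
since the keys under artifacts are dynamic, a forEach seems easier than a plain find. roughly smth like this just to list the URIs (same caveat, the system_tags filter is a guess and the layout is only what I saw in my own documents):

> db.task.find({ system_tags: "dataset" }).forEach(function (doc) {
      var artifacts = (doc.execution || {}).artifacts || {};
      Object.keys(artifacts).forEach(function (key) {
          // print every artifact URI, so the localhost / old-IP ones stand out
          print(doc._id + "  " + key + "  " + artifacts[key].uri);
      });
  });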

one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

added a couple of prints to the dataset object. It seems ClearML hardcodes the IP in the state.json URL. The problem is that the server migrated to a new IP. Is there a way to change the IP that is hardcoded?

one year ago
0 Hi, All I Have Some Issues Uploading A Big (100Gb) Dataset To Self-Hosted Clearml Server. Is There Any Tricks I Should Be Aware When Launching The Server? Maybe Configuring Timeout Or Giving More Resources? Right Now The Upload Freezes And In The Web-Int

it's a directory (the SHA generation step is actually successful:

Generating SHA2 hash for 1136604 files

as in the GitHub issue). Given previous experience, I would expect it to be uploaded as multiple zip files.

yes, I don't use S3. I have a dedicated machine with RAID configured, where the ClearML server is running.

one year ago
0 Hi, All I Have Some Issues Uploading A Big (100Gb) Dataset To Self-Hosted Clearml Server. Is There Any Tricks I Should Be Aware When Launching The Server? Maybe Configuring Timeout Or Giving More Resources? Right Now The Upload Freezes And In The Web-Int

I am not sure about that. I have another dataset of similar structure which is smaller (40GB) and which was uploaded successfully. Seems like how it works: first it computes the SHA for all the files, then during uploading it aggregates small files into zip archives of approx. 512 MB each.

one year ago
0 Hi! I Use Self-Hosted Server. I Uploaded Datasets With

now I can't download either of them 😕 It would be nice if the address of the artifacts (state and zips) was assembled on the fly and not hardcoded into the DB. If you have any tips on how to fix it in the MongoDB that would be great. I found this tip on model relocation: None . I think I need smth really similar, but for datasets.
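
if anyone ends up in the same situation, this is roughly the rewrite I have in mind, directly in the mongo shell (just a sketch: OLD and NEW are placeholders for the real addresses, the artifacts layout is the one I pasted above, and I would back up the DB before running anything like this):

> var OLD = "http://OLD-IP:8081", NEW = "http://NEW-IP:8081";
> db.task.find({ system_tags: "dataset" }).forEach(function (doc) {
      var artifacts = (doc.execution || {}).artifacts || {};
      var changed = false;
      Object.keys(artifacts).forEach(function (key) {
          var uri = artifacts[key].uri;
          if (uri && uri.indexOf(OLD) === 0) {
              artifacts[key].uri = NEW + uri.slice(OLD.length);
              changed = true;
          }
      });
      if (changed) {
          // write back only the modified artifacts sub-document
          db.task.updateOne({ _id: doc._id }, { $set: { "execution.artifacts": artifacts } });
      }
  });

if the state.json URI is stored as one of these artifact entries as well, the same loop should pick it up.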

one year ago