Answered
Hi Guys, I Think There's Something Wrong On

Hi guys, I think there's something wrong on https://app.clear.ml . The font etc. is changing to caps, and "."s are being introduced into the paths of project folders.

  
  
Posted 2 years ago

Answers 16


```python
import os
import glob

from clearml import Dataset

DATASET_NAME = "Bug"
DATASET_PROJECT = "ProjectFolder"
TARGET_FOLDER = "clearml_bug"
S3_BUCKET = os.getenv('S3_BUCKET')

# create a local folder with a single test file
if not os.path.exists(TARGET_FOLDER):
    os.makedirs(TARGET_FOLDER)

with open(f'{TARGET_FOLDER}/data.txt', 'w') as f:
    f.write('Hello, ClearML')

target_files = glob.glob(TARGET_FOLDER + "/**/*", recursive=True)

# upload dataset
dataset = Dataset.create(dataset_name=DATASET_NAME, dataset_project=DATASET_PROJECT)
dataset.add_files(TARGET_FOLDER)
dataset.upload(
    show_progress=True,
    verbose=False,
    output_url=S3_BUCKET,
    compression=None,
)

# get a local copy of the dataset
dataset_folder = Dataset.get(dataset.id).get_local_copy()

target_files = glob.glob(TARGET_FOLDER + "/**/*", recursive=True)
downloaded_files = glob.glob(dataset_folder + "/**/*", recursive=True)

# test upload: the local and downloaded copies should contain the same file names
assert target_files
assert downloaded_files
assert [os.path.basename(x) for x in target_files] == [
    os.path.basename(x) for x in downloaded_files
]
```
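One note on the script above (an assumption on my part, not something stated in this thread): it uploads the dataset but never finalizes it. In recent clearml releases, `dataset.finalize()` is what marks a dataset version as completed, which can affect how it is listed in the UI. A minimal sketch of the extra step:

```python
# Hedged sketch: close the dataset version once the upload succeeds.
# finalize() marks the dataset as completed; without it the version stays
# open/in-progress, which may change how the Datasets panel displays it.
dataset.finalize()
```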

  
  
Posted 2 years ago

Yes, correct. In fact, we cannot see any old datasets.

  
  
Posted 2 years ago

Looks like it's picking up the projects, but when viewing in the UI they disappear.

  
  
Posted 2 years ago

[screenshot]

  
  
Posted 2 years ago

[screenshot]

  
  
Posted 2 years ago

[screenshot]

  
  
Posted 2 years ago

I thought new datasets were not appearing; I didn't realize old ones were missing. Am I getting it right?

  
  
Posted 2 years ago

It was the CORS-like error we were getting.

  
  
Posted 2 years ago

RobustRat47, are you using the new SDK? You should see these in the new Datasets panel.
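For reference, one quick way to check from the SDK whether the datasets still exist server-side, independent of what the UI shows (a minimal sketch; `Dataset.list_datasets` and its `dataset_project` argument are assumed to behave as in recent clearml releases, and "ProjectFolder" is the project name from the repro script above):

```python
from clearml import Dataset

# List dataset versions registered under the project from the repro script.
# If entries come back here but the UI shows nothing, the data still exists
# and this is a display/visibility issue rather than data loss.
for entry in Dataset.list_datasets(dataset_project="ProjectFolder"):
    print(entry)
```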

  
  
Posted 2 years ago

(deepmirror) ryan@ryan:~$ python -c "import clearml; print(clearml.__version__)"
1.1.4

  
  
Posted 2 years ago

python upload_data_to_clearml_copy.py
Generating SHA2 hash for 1 files
100%|████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 733.91it/s]
Hash generation completed
0%| | 0/1 [00:00<?, ?it/s]
Compressing local files, chunk 1 [remaining 1 files]
100%|████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 538.77it/s]
File compression completed: total size 130 bytes, 1 chunked stored (average size 130 bytes)
Uploading compressed dataset changes 1/1 (1 files 130 bytes) to BUCKET
Upload completed (130 bytes)

  
  
Posted 2 years ago

I think this is a bug

  
  
Posted 2 years ago

This was the error I was getting from uploads using the old SDK:
has been rejected for invalid domain. heap-2443312637.js:2:108655
Referrer Policy: Ignoring the less restricted referrer policy "no-referrer-when-downgrade" for the cross-site request:

  
  
Posted 2 years ago

Hi SuccessfulKoala55, yes, I can see the one upload using 1.6.1, but all old datasets have now been removed. I guess you want people to start moving over?

  
  
Posted 2 years ago

Can you try to go into 'Settings' -> 'Configuration' and verify that you have 'Show Hidden Projects' enabled?
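If the UI toggle doesn't help, a similar check can be made against the backend directly. The sketch below is an assumption on my part: it uses `APIClient` and the `search_hidden` flag of `projects.get_all`, which newer ClearML server versions are believed to support for including hidden dataset projects in the listing.

```python
from clearml.backend_api.session.client import APIClient

client = APIClient()
# search_hidden is the assumed flag here: newer servers mark dataset projects
# as hidden, and this asks the API to include them in the results.
projects = client.projects.get_all(search_hidden=True)
for project in projects:
    print(project.id, project.name)
```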

  
  
Posted 2 years ago

Same with the new version:
(deepmirror) ryan@ryan:~/GitHub/deepmirror/ml-toolbox$ python -c "import clearml; print(clearml.__version__)"
1.6.1
Generating SHA2 hash for 1 files
100%|███████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 2548.18it/s]
Hash generation completed
Uploading dataset changes (1 files compressed to 130 B) to BUCKET
File compression and upload completed: total size 130 B, 1 chunked stored (average size 130 B)

  
  
Posted 2 years ago