python upload_data_to_clearml_copy.py
Generating SHA2 hash for 1 files
100%|████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 733.91it/s]
Hash generation completed
0%|                                                                      | 0/1 [00:00<?, ?it/s]
Compressing local files, chunk 1 [remaining 1 files]
100%|████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 538.77it/s]
File compression completed: total size 130 bytes, 1 chunked stored (average size 130 bytes)
Uploading compressed dataset changes 1/1 (1 files 130 bytes) to BUCKET
Upload completed (130 bytes)
```
import os
import glob
from clearml import Dataset
DATASET_NAME = "Bug"
DATASET_PROJECT = "ProjectFolder"
TARGET_FOLDER = "clearml_bug"
S3_BUCKET = os.getenv('S3_BUCKET')
if not os.path.exists(TARGET_FOLDER):
    os.makedirs(TARGET_FOLDER)

with open(f'{TARGET_FOLDER}/data.txt', 'w') as f:
    f.write('Hello, ClearML')
target_files = glob.glob(TARGET_FOLDER + "/**/*", recursive=True)
# upload dataset
dataset = Dataset.create(dataset_name=DATASET_NAME, dataset_project=DATASET_PROJECT)
dataset.add_files(TARGET_FOLDER)
dataset.upload(
    show_progress=True,
    verbose=False,
    output_url=S3_BUCKET,
    compression=None,
)
# get a local copy of the dataset
dataset_folder = Dataset.get(dataset.id).get_local_copy()
target_files = glob.glob(TARGET_FOLDER + "/**/*", recursive=True)
downloaded_files = glob.glob(dataset_folder + "/**/*", recursive=True)
# test upload
assert target_files
assert downloaded_files
assert [os.path.basename(x) for x in target_files] == [
    os.path.basename(x) for x in downloaded_files
]
```
Can you try to go into 'Settings' -> 'Configuration' and verify that you have 'Show Hidden Projects' enabled?
Looks like it's picking up the projects, but when viewing them in the UI they disappear
RobustRat47 are you using the new SDK? You should see these in the new Datasets panel
Yes, correct. In fact, we cannot see any old datasets
Hi SuccessfulKoala55, yes I can see the one uploaded using 1.6.1, but all old datasets have now been removed. I guess you want people to start moving over?
I thought the issue was that new datasets do not appear; I didn't realize old ones were missing. Am I getting it right?
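One quick way to check whether the old datasets were actually deleted or are just hidden from the new UI is to list them through the SDK. A minimal sketch, assuming `Dataset.list_datasets()` is available in the installed clearml version and that the old datasets live under `ProjectFolder` (swap in the real project name):

```
from clearml import Dataset

# Ask the server for the dataset versions registered under this project.
# "ProjectFolder" is an assumption here - use the project your old datasets were created in.
datasets = Dataset.list_datasets(dataset_project="ProjectFolder")

# Each entry should be a plain dict; print the id and name so they can be
# cross-checked against what the web UI shows (or no longer shows).
for entry in datasets:
    print(entry.get("id"), entry.get("name"))
```

If the old datasets show up here but not in the web UI, they still exist on the server and it is purely a visibility issue (e.g. the hidden-projects setting mentioned above).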
(deepmirror) ryan@ryan:~$ python -c "import clearml; print(clearml.__version__)"
1.1.4
Same with the new version:
(deepmirror) ryan@ryan:~/GitHub/deepmirror/ml-toolbox$ python -c "import clearml; print(clearml.__version__)"
1.6.1
Generating SHA2 hash for 1 files
100%|███████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 2548.18it/s]
Hash generation completed
Uploading dataset changes (1 files compressed to 130 B) to BUCKET
File compression and upload completed: total size 130 B, 1 chunked stored (average size 130 B)
This was the error I was getting from uploads using the old SDK:
has been rejected for invalid domain. heap-2443312637.js:2:108655
Referrer Policy: Ignoring the less restricted referrer policy "no-referrer-when-downgrade" for the cross-site request: