When downloading the dataset using Python code, it shows a little bit more information, like this one
@<1523701070390366208:profile|CostlyOstrich36> the model tab is empty as well ;'(
@<1523701087100473344:profile|SuccessfulKoala55> The following is how I create the dataset and how I am trying to retrieve it. Is there any other way to retrieve it without actually downloading (copying) the dataset, and to use direct link access instead?
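For reference, a minimal sketch of the create/retrieve flow being discussed, assuming the `clearml` SDK is installed; the project, dataset name, and folder path below are placeholders, not values from this thread:

```python
def create_dataset(project: str, name: str, folder: str) -> str:
    """Create, upload and finalize a ClearML dataset (sketch)."""
    from clearml import Dataset  # assumed: `pip install clearml`
    ds = Dataset.create(dataset_project=project, dataset_name=name)
    ds.add_files(path=folder)  # stage the local files
    ds.upload()                # copy them to the configured storage
    ds.finalize()
    return ds.id

def fetch_dataset(project: str, name: str) -> str:
    """Return a read-only copy; this downloads into the ClearML cache folder."""
    from clearml import Dataset  # assumed installed
    ds = Dataset.get(dataset_project=project, dataset_name=name)
    return ds.get_local_copy()
```

Note that `get_local_copy()` always materializes the files in the cache folder, which is exactly the duplication being asked about here.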
My cache folder is /mnt/ssd2t/clearml
@<1523701435869433856:profile|SmugDolphin23> Great, I am able to do it now. Thank you so much 🙇
@<1578193419065364480:profile|SillyLobster91> Here you go
@<1523701070390366208:profile|CostlyOstrich36> haha, how could I miss it. Anw, thank you for your answer 🙇
@<1523701070390366208:profile|CostlyOstrich36> Following is the video
can you give me an example of use of direct_access
setup 🙇
I have a large dataset and would like to register it in ClearML without uploading the actual data. Then, when running a task, I'd like to get the dataset without creating a cache folder
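One way to register a dataset without uploading its bytes is `add_external_files`, which records links to files that already live on shared or remote storage. A sketch, assuming the `clearml` SDK; the names and the `source_url` scheme are placeholders and depend on your storage backend:

```python
def register_by_links(project: str, name: str, source_url: str) -> str:
    """Register a dataset by reference only -- no file content is uploaded."""
    from clearml import Dataset  # assumed: `pip install clearml`
    ds = Dataset.create(dataset_project=project, dataset_name=name)
    # Records links (e.g. s3://bucket/path or file:///mnt/share/data)
    # instead of copying the files into ClearML storage.
    ds.add_external_files(source_url=source_url)
    ds.finalize()  # no upload() needed: nothing was staged locally
    return ds.id
```

Whether a later `get_local_copy()` still copies the linked files into the cache depends on the SDK version and where the links point, so this only addresses the registration half of the question.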
I used the upload button => fail
I used save-to-clipboard and pasted it into a ClearML report => fail as well
@<1523701087100473344:profile|SuccessfulKoala55> Is there any way for .get_local_copy
to return direct links to those local file items? The data gets duplicated: it already exists in local storage, but it is still saved into the cache folder as well
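If the goal is only to inspect the dataset's contents without triggering the cache copy, the file listing comes from the dataset's metadata, so a sketch like this should not download anything (assuming the `clearml` SDK; names are placeholders):

```python
def list_dataset_files(project: str, name: str):
    """List relative file paths from dataset metadata, without downloading."""
    from clearml import Dataset  # assumed: `pip install clearml`
    ds = Dataset.get(dataset_project=project, dataset_name=name)
    # list_files() reads the dataset's own file index; no bytes are fetched
    return ds.list_files()
```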
@<1523701070390366208:profile|CostlyOstrich36>
I tried to add a CORS configuration like this, but it doesn't work
None
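In case it helps anyone hitting the same wall: for an S3-compatible backend, a CORS rule that lets the web UI's origin read objects usually looks like the sketch below. The origin and bucket name are placeholders, and applying the rule with boto3's `put_bucket_cors` is left commented out since it needs real credentials:

```python
# Hypothetical CORS rule allowing a web UI origin to GET objects.
cors_configuration = {
    "CORSRules": [
        {
            "AllowedOrigins": ["https://app.example.com"],  # placeholder origin
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedHeaders": ["*"],
            "ExposeHeaders": ["ETag"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

# import boto3
# boto3.client("s3").put_bucket_cors(
#     Bucket="my-bucket",  # placeholder bucket name
#     CORSConfiguration=cors_configuration,
# )
```

A common reason a rule like this "doesn't work" is that `AllowedOrigins` must match the browser's origin exactly (scheme, host, and port).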
5 days ago, I could still run this without the download issue
Thank you very much for your help
@<1590152178218045440:profile|HarebrainedToad56> I just downloaded to another folder and found it working. Then I went back to recheck the state.json file of the original folder and realized it was corrupted. I don't know why that happened, but after I deleted the state.json file, I could start downloading to the cache folder again 🙇
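The recovery above can be automated. A stdlib-only sketch that scans a cache root for state.json files that fail to parse (the `/mnt/ssd2t/clearml` path is the cache folder mentioned earlier in this thread; adjust as needed, and note the delete line is left commented out):

```python
import json
import pathlib

def find_corrupt_state_files(cache_root: str):
    """Return state.json paths under cache_root that fail to parse as JSON."""
    corrupt = []
    for state_file in pathlib.Path(cache_root).rglob("state.json"):
        try:
            json.loads(state_file.read_text())
        except (json.JSONDecodeError, OSError, UnicodeDecodeError):
            corrupt.append(state_file)
            # state_file.unlink()  # uncomment to delete and force a re-download
    return corrupt

if __name__ == "__main__":
    for path in find_corrupt_state_files("/mnt/ssd2t/clearml"):
        print("corrupt:", path)
```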
When I run it, it doesn't show much information
@<1590152178218045440:profile|HarebrainedToad56> the strange thing is that we downloaded the dataset previously, but now, when I look locally, there is no folder in the cache folder
calculate_metrics.py
we just download the dataset, then postprocess it later