CharmingStarfish14 can you check something from the code side, just to see if this would solve the issue?
so probably the metadata was too large to fit... Any way to describe the metadata and its scope?
Sure, AgitatedDove14 !
I will get to it next week. Thank you for the answer!
Hi DrabOwl94, how did you create/save/finalize the dataset?
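For context, the usual flow looks roughly like this (a minimal sketch using the standard clearml Dataset API; the project/dataset names and the local path are just placeholders):
```python
from clearml import Dataset

# Create a new dataset version, add the local files, upload the data and close the version
dataset = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
dataset.add_files(path="/path/to/local/data")
dataset.upload()      # pushes the data itself as compressed chunk artifacts
dataset.finalize()    # seals the version; the file listing is stored as metadata
```
It would help to see where your own calls differ from that, especially anything that sets the chunk size.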
Hi DrabOwl94
I think, if I understand you correctly, you have a lot of chunks (which translates to a lot of links to small 1MB files, because this is how you set up the chunk size). Now apparently you have reached the maximum number of chunks per specific Dataset version (in the end this metadata is stored in a document with a limited size, specifically 16MB; see the rough numbers below).
How many chunks do you have there?
(In other words, what's the size of the entire dataset in MB?)
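To put rough numbers on it (the ~200 bytes per link entry here is an assumption for illustration, not a measured figure):
```python
# Back-of-the-envelope: how chunk size drives the size of the version's metadata document
DATASET_SIZE_MB = 100 * 1024      # example: a ~100 GB dataset
BYTES_PER_LINK = 200              # assumed metadata cost per chunk link (URL, hash, size)
DOC_LIMIT_MB = 16                 # maximum size of the metadata document

for chunk_mb in (1, 1024):
    n_chunks = DATASET_SIZE_MB // chunk_mb
    metadata_mb = n_chunks * BYTES_PER_LINK / (1024 * 1024)
    print(f"{chunk_mb:>4} MB chunks -> {n_chunks:>6} links, "
          f"~{metadata_mb:.2f} MB of metadata (limit {DOC_LIMIT_MB} MB)")
```
With 1MB chunks the link list alone can blow past the 16MB limit, while 1GB chunks keep it tiny.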
so 78000 entries ...
wow, a lot! Would it make sense to do 1GB chunks? Any reason for the initial 1MB chunk size?
So you are saying 156 chunks, with each chunk holding ~6500 files?
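If re-uploading with bigger chunks is an option, something along these lines should do it (a sketch; I believe upload() accepts a chunk_size in MB in recent clearml releases, so please double-check against the SDK version you are on):
```python
from clearml import Dataset

dataset = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
dataset.add_files(path="/path/to/local/data")
dataset.upload(chunk_size=1024)   # ~1 GB chunks instead of 1 MB, so far fewer links in the metadata
dataset.finalize()
```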
Check the latest RC, it solved an issue with dataset uploading.
Let me check if it also solved this issue.
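(If it helps, installing with pip's pre-release flag, e.g. `pip install -U --pre clearml`, should pull in the latest RC.)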
AgitatedDove14
Hello, Martin. Any news about this issue?
We really want to use ClearML for datasets that are hundreds of GB worth of data.
Are you saying that ClearML is not able to do that?
DrabOwl94 how many 1MB files did you end up having?
DrabOwl94 can you attach a code snippet? This error basically means you've hit the maximum size allowed for the task's BSON document, but the dataset itself should be uploaded as an artifact