Are datasets of size 90GB considered HyperDatasets?
It happened again. get_local_copy() worked as expected, but then when I tried .get_mutable_local_copy(local_data_path, overwrite=True, raise_on_error=False), the contents of every 'data' folder on the share were deleted and the same error was displayed.
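For reference, a minimal sketch of the sequence I'm running (the dataset ID and target folder here are placeholders):

```python
from clearml import Dataset

# Placeholder ID -- the real dataset is fetched the same way
ds = Dataset.get(dataset_id="<dataset_id>")

# Read-only cached copy: this call works as expected
cached_path = ds.get_local_copy()

# Mutable copy into a writable target folder: this is the call that fails
local_data_path = "/path/to/writable/folder"
mutable_path = ds.get_mutable_local_copy(
    local_data_path,
    overwrite=True,
    raise_on_error=False,
)
print(mutable_path)  # with raise_on_error=False, None indicates the copy failed
```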
No, the small test dataset is only 32MB. I created the dataset using Dataset.create(...), dataset.add_files(...) and then dataset.finalize() (rough sketch below). I unfortunately don't have S3. I poked around in the saved data on the share and it seems that for some reason the folders 'data' to 'data_11' have had their contents deleted. What's even weirder is that they were deleted right at the time when I first tried to get a mutable copy today; the other folders are untouched since Monday when I created the dataset. I will remake the dataset, but any ideas why this happened?
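Roughly like this (project name, dataset name and source path are placeholders, and I'm assuming the upload step happens before finalize):

```python
from clearml import Dataset

# Placeholder project/dataset names and source path
ds = Dataset.create(
    dataset_project="my_project",
    dataset_name="my_dataset",
)
ds.add_files(path="/path/to/source/data")
ds.upload()    # pushes the files to the default fileserver / network share
ds.finalize()
```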
Hi DeliciousKoala34 , is there also an exceptionally large number of files in that Dataset? How do you create the dataset? What happens if you use something like S3, if you have it available?
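For example, something along these lines (the bucket and path are placeholders) should send the dataset files to S3 instead of the fileserver/share:

```python
from clearml import Dataset

ds = Dataset.create(dataset_project="my_project", dataset_name="my_dataset")
ds.add_files(path="/path/to/source/data")
# output_url redirects the uploaded dataset artifacts to S3 instead of the default fileserver
ds.upload(output_url="s3://my-bucket/datasets")
ds.finalize()
```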