it's a directory (the SHA generation step actually succeeded:
Generating SHA2 hash for 1136604 files
as in the GitHub issue). Given previous experience, I would expect it to be uploaded as multiple zip files.
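For reference, the upload is driven by the standard clearml Dataset flow, roughly like this minimal sketch (dataset name, project, and path here are placeholders, not my real ones; as far as I can tell the SHA log line comes from add_files() and the zipping happens in upload()):
```python
from clearml import Dataset

# Minimal sketch of the upload flow; names and paths are placeholders.
ds = Dataset.create(
    dataset_name="my-large-dataset",
    dataset_project="datasets",
)
# This is the step that prints "Generating SHA2 hash for N files"
ds.add_files(path="/data/my-large-dataset")

# Upload to the default ClearML fileserver, then freeze the version
ds.upload(show_progress=True)
ds.finalize()
```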
Yes, I don't use S3. I have a dedicated machine with RAID configured, where the ClearML server is running.
In that case I assume this is just a series of many small (?) uploads, which takes a lot of time.
I ended up using DVC for dataset management. It doesn't have a fancy UI, but it works flawlessly with large datasets.
I am not sure about that. I have another dataset with a similar structure which is smaller (40 GB) and which was uploaded successfully. This seems to be how it works: first it computes the SHA for all the files, then during upload it aggregates the small files into zip archives of approximately 512 MB each.
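If that's right, the archive size should be controllable at upload time. A hedged sketch (chunk_size is in MB and, in recent clearml SDK versions, seems to default to roughly 512 MB, which would match what I saw; worth double-checking against your SDK version):
```python
from clearml import Dataset

# Placeholder names and paths, same flow as above.
ds = Dataset.create(dataset_name="my-large-dataset", dataset_project="datasets")
ds.add_files(path="/data/my-large-dataset")

# chunk_size is the target archive size in MB; it appears to default
# to ~512 MB, which would explain the archives I saw for the 40 GB dataset.
ds.upload(show_progress=True, chunk_size=512)
ds.finalize()
```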
Hi @<1547390422483996672:profile|StaleElk72> , are you getting an error at any point? This is indeed a large file, and I assume you're uploading it to the ClearML fileserver, and not to some object storage like S3?