Also, as a hack to get past the problem, you can split the file into several zips in your code before uploading (see the sketch below).
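A minimal sketch of that hack, assuming a plain Python helper (the paths, chunk size, and part naming are all placeholders, not anything ClearML-specific):

```python
import os
import zipfile

CHUNK_SIZE = 1024 * 1024 * 1024  # 1GB per part, safely under the 2GB limit

def split_into_zips(src_path, dst_dir):
    """Split one large file into several zip parts, each under the per-file limit."""
    os.makedirs(dst_dir, exist_ok=True)
    part_paths = []
    with open(src_path, "rb") as src:
        part = 0
        while True:
            chunk = src.read(CHUNK_SIZE)  # note: holds one chunk in memory
            if not chunk:
                break
            zip_path = os.path.join(dst_dir, f"part_{part:03d}.zip")
            with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
                # Store this chunk as a single member inside its own zip
                zf.writestr(f"part_{part:03d}.bin", chunk)
            part_paths.append(zip_path)
            part += 1
    return part_paths
```

Each resulting part can then be uploaded as its own artifact and reassembled on the other side.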
Hmmm, interesting. According to the byte count it looks like 2GB. What type is the file?
Also, how are you uploading? Because if you don't zip the folder yourself and upload with task.upload_artifact('local folder', artifact_object=Path('<PATH_TO_FOLDER>'))
this should work.
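For reference, a runnable sketch of that call (the project/task names are placeholders, and `<PATH_TO_FOLDER>` stays as in the original; ClearML should zip a folder path into a single artifact before uploading):

```python
from pathlib import Path
from clearml import Task

task = Task.init(project_name="examples", task_name="folder upload")  # placeholder names

# Passing the folder itself as the artifact object, rather than
# zipping it manually first
task.upload_artifact("local folder", artifact_object=Path("<PATH_TO_FOLDER>"))
```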
Hi SmoggyGoat53
There is a storage limit on the file server (basically a 2GB per-file limit); this is the cause of the error.
You can upload the 10GB to any S3-compatible solution (or a shared folder). Just set the "output_uri" on the Task (either at Task.init or with Task.output_uri = "s3://bucket").
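Something like this (the project/task names and bucket are placeholders):

```python
from clearml import Task

# Option 1: set the destination when creating the Task
task = Task.init(
    project_name="examples",          # placeholder
    task_name="big artifact upload",  # placeholder
    output_uri="s3://bucket",         # any S3-compatible store or shared folder
)

# Option 2: set it on an existing Task
task.output_uri = "s3://bucket"
```

With output_uri pointing at your own storage, artifacts bypass the file server and its 2GB per-file limit.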