Pretty sure it's not the reason. By now I've encountered this issue on 5+ datasets I'm using across different projects. Some worked quite well before, but not recently.
The error message is as shown in the image.
The file size is 415MB, but the download "succeeds" at 107MB.
It shows we're still within the free tier quota.
AgitatedDove14 In our case, re-downloading doesn't help because it leads to the same result. The download gets interrupted by a network error.
But being able to tell a download that merely finished apart from one that actually succeeded is a good start.
AgitatedDove14 Yes, I think that's the problem. And if there's also a way to keep resuming the download when using the SDK, our Python code will work like before. That's basically all we need.
I'm not familiar with that either. But downloading with the Chrome browser, plus some perseverance in clicking "continue", does work. It's quite cool.
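For reference, here's a minimal sketch of what the browser's resume button effectively does: re-request the remaining bytes with an HTTP Range header and append to the partial file. It assumes the fileserver honors Range requests; the URL, headers, and retry policy are placeholders, not ClearML's actual API.

```python
# Hedged sketch: resume a partially downloaded file via HTTP Range requests.
# The url/headers are placeholders; this is not the ClearML SDK's own logic.
import os
import requests

def resume_download(url, dest_path, headers=None, chunk_size=1024 * 1024, max_retries=20):
    headers = dict(headers or {})
    for _ in range(max_retries):
        # Resume from however many bytes we already have on disk.
        offset = os.path.getsize(dest_path) if os.path.exists(dest_path) else 0
        headers["Range"] = f"bytes={offset}-"
        try:
            with requests.get(url, headers=headers, stream=True, timeout=60) as r:
                if r.status_code == 416:
                    # Requested range starts past the end: file is already complete.
                    return dest_path
                r.raise_for_status()
                with open(dest_path, "ab") as f:
                    for chunk in r.iter_content(chunk_size=chunk_size):
                        f.write(chunk)
                return dest_path
        except requests.exceptions.RequestException:
            # Network dropped mid-transfer; loop and resume from the new offset.
            continue
    raise RuntimeError(f"Gave up after {max_retries} attempts")
```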
Anyway, thanks for the help. As a workaround, we'll avoid uploading large files from now on. Looking forward to hearing from you if you manage to reproduce the issue or implement a fix.
https://clearml.slack.com/archives/CTK20V944/p1642735039222200?thread_ts=1642731461.221700&cid=CTK20V944
Like I said here, using the browser doesn't work; it shows the same behavior.
Since the error says "network error", could it be because I'm in Taiwan? Like, downloading from Asia leads to this kind of issue.
AgitatedDove14
https://github.com/allegroai/clearml/issues/552
Just did. Hope the format looks okay.
AgitatedDove14 Earlier my colleague said he managed to download the dataset in the browser by repeatedly "resuming" the download whenever it stopped due to a network error. So no, I don't think it's a problem with the file itself...
Yes, this always happens. Task creation and dataset upload both work fine.
Say a 400+ MB dataset: the download will fail at around 80 MB. It doesn't matter whether we use the SDK or the ClearML experiment page.
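For anyone trying to reproduce: this is roughly how we fetch the dataset with the SDK, with a size check added so the truncation is visible instead of silently "succeeding". The dataset ID is a placeholder.

```python
# Rough reproduction of our SDK download path using the standard
# Dataset.get / get_local_copy calls. The dataset_id is a placeholder.
import os
from clearml import Dataset

dataset = Dataset.get(dataset_id="<our-dataset-id>")  # placeholder ID
local_path = dataset.get_local_copy()  # for us this stops partway through

# Sum up what actually landed on disk to compare against the expected size.
total_bytes = sum(
    os.path.getsize(os.path.join(root, name))
    for root, _, files in os.walk(local_path)
    for name in files
)
print(f"Downloaded to {local_path}: {total_bytes / 1e6:.0f} MB on disk")
```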