On the original 30GB dataset, it took just a few seconds to go from uploading the last chunk of data to "File compression and upload completed", so I find it odd that the upload of the update hangs indefinitely while processing, without utilizing the disk at all.
@<1523701070390366208:profile|CostlyOstrich36> I'd be glad for any ideas on what might be happening
Huh. So it looks like this was an issue of spawning too many upload workers, which overwhelmed the fileserver that is limited to a single core...? When I limited max_workers in upload() on the client side, it went smoothly with no hanging. Funny thing is, I had no issues with this when using sync_folder(), which I used for the original data upload, hence my perceived difference in performance despite similar file sizes.
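For reference, roughly what the change looks like (a minimal sketch assuming the standard clearml Dataset API; the dataset name, project, parent ID, path, and the max_workers value are all placeholders/illustrative):

```python
from clearml import Dataset

# Create a new dataset version on top of the existing one
# (name/project/parent ID are placeholders)
ds = Dataset.create(
    dataset_name="my_dataset",
    dataset_project="my_project",
    parent_datasets=["<parent_dataset_id>"],
)

# Register the added/modified files from the local folder
ds.add_files(path="/path/to/updated/data")

# Cap the number of parallel upload workers so the single-core
# fileserver isn't flooded with concurrent uploads (value is illustrative)
ds.upload(max_workers=4)
ds.finalize()
```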
Hi @<1547028074090991616:profile|ShaggySwan64> , so the issue is when writing to the files server? Is it possible that the machine itself is having a hard time writing the data?
It does feel like the server is struggling, since the webUI is also having trouble loading debug sample artifacts during the upload. But I'm not sure why that would be the case. The client console hangs after "uploading dataset changes", and I can see the fileserver.py process putting load on the server CPU, but I don't see any files being added or changed in the local fileserver folder. Is there a way to check what the fileserver is doing? I don't see anything suspicious in the log.
I should probably add that a lot of the update is file modifications...