Hi @<1523701087100473344:profile|SuccessfulKoala55> , the data is definitely stored locally - we have more data than even the 100 GB artifact storage would allow. Plus I doubt the actual dataset would be counted against the metrics storage quota, right?
Your second remark is pretty much what email support told me before sending me here. The problem is that I still don't know how I can:
a) prevent my datasets from taking up so much metrics space (can I disable previews?)
b) find and remove any unneeded data so I can keep using ClearML
At the same time, I'm also looking into self-hosting the open-source version to get rid of these limits, but there seems to be no migration path whatsoever for moving the existing data from SaaS to a self-hosted instance, which is very frustrating.
Any idea what I could do? I'm starting to wonder how anyone can use the dataset functionality productively if it just fills up the metrics quota like this...