Hi,
I am training with a large number of images. I have created a ClearML Dataset containing two folders: images and labels. Until now I have created a local copy (dataset.get_local_copy()) and worked with the data at the ‘data_set’ path. I now want (a) to migrate to ClearML agents, and (b) to avoid loading the entire dataset before training.
How can I sequentially load parts of the dataset? dataset.get_num_chunks(include_parents=True) returns 0 …
Thank you!
That sounds interesting. Will this also work in an on-prem hosting environment?
Hi ObedientTurkey46, this capability is only available in the Hyperdatasets feature. There you can both chunk the data and query specific metadata.