thank you @<1705029129084080128:profile|SmoothPigeon36> glad to hear you enjoy ClearML.
I believe we've identified the source of the problem, and we're working on a bug fix.
@<1523701087100473344:profile|SuccessfulKoala55> here is a screen capture to show you what I mean
Hi @<1705029129084080128:profile|SmoothPigeon36> , can you perhaps provide a concrete example?
Thanks, I'll pass it on to the UI team to try and reproduce
there is no limitation on reporting the same images on different iterations. What we think might be the issue is that multiple images are reported with the same iteration + metric + variant, using report_image
can this be the case?
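To make the collision concrete, here is a rough stand-in illustration (not ClearML's actual storage code; all names are hypothetical) of why reports sharing the same iteration + metric + variant shadow each other: if the viewer effectively keys samples by that triple, later reports overwrite earlier ones.

```python
# Stand-in model of the viewer keying samples by (iteration, metric, variant).
samples = {}

def report_image(metric, variant, iteration, image):
    # Same (iteration, metric, variant) key -> the previous image is overwritten.
    samples[(iteration, metric, variant)] = image

for img in ["img_a", "img_b", "img_c"]:
    report_image("debug_samples", "label_cat", iteration=0, image=img)

print(len(samples))  # only one entry survives for the shared key
```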
Hi @<1705029129084080128:profile|SmoothPigeon36>
we found out that, unintentionally, the debug samples view will display up to 3 images with the same iteration/metric/variant, which was not the initial intent, while the image viewer supports only 1 image per combination.
I have 2 questions:
- can you confirm that this is indeed the issue in your case?
- if so, was reporting more than 1 image for the same combination intentional?
Just a +1 here. When we use the same name for 3 different images, the thumbnails show 3 different images, but when clicking on any of them, only one is displayed; there is no way to display the others.
hi @<1523703436166565888:profile|DeterminedCrab71> thank you for looking into this! I'm not sure if this is the problem. What I'm trying to do is simply display some of the images produced by my data generator in ClearML. I have a callback that, at the beginning of each epoch, simply calls report_image on every element of the batch. Because data augmentation is part of what I want to investigate, the same images are reported at every epoch. However, if no augmentation is selected, it's possible that the images are identical for each iteration.
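A minimal sketch of the callback pattern described above, with a stand-in logger so it runs without a ClearML server (in a real setup the logger would be ClearML's, and `report_image` its method; the callback class and batch provider here are hypothetical):

```python
reported = []

class StubLogger:
    """Stand-in for a real logger; records what would be reported."""
    def report_image(self, title, series, iteration, image):
        reported.append((title, series, iteration))

class DebugSampleCallback:
    """Reports every element of the batch at the start of each epoch."""
    def __init__(self, logger, get_batch):
        self.logger = logger
        self.get_batch = get_batch  # returns a list of (label, image) pairs

    def on_epoch_begin(self, epoch):
        for label, img in self.get_batch():
            # Using the label as the series means two samples with the
            # same label collide on (title, series, iteration).
            self.logger.report_image(
                title="debug_samples", series=label,
                iteration=epoch, image=img)

cb = DebugSampleCallback(StubLogger(), lambda: [("cat", "img1"), ("cat", "img2")])
cb.on_epoch_begin(0)
print(len(reported))  # 2 reports share the same (title, series, iteration)
```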
I just checked that, you're right! The iteration and metric are definitely the same (same epoch, metric is "debug_samples"), but the variant was initially different (it was a simple f"{batch}{elem}"). The bug started showing up when I started using the variant (or series) to show the label as text. As a consequence, I ended up with samples from the same batch, with the same label, being reported with the same iteration+metric+variant.
adding a running index to the variant name should take care of this issue
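One way to do that (a sketch with assumed helper and variable names, not ClearML API code) is to append a running index to the label so each variant is unique within an iteration:

```python
def unique_variant(label, index):
    """Append a running index so variants with the same label don't collide."""
    return f"{label}_{index:03d}"

# Three samples with the same label now get three distinct variant names.
variants = [unique_variant("cat", i) for i in range(3)]
print(variants)  # ['cat_000', 'cat_001', 'cat_002']
```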
I just did that after realising the issue, thank you! 🙂