
hi everyone! I've been using ClearML for a while and I love it, but I recently noticed a strange issue. In the "debug samples" tab, where I can see some training samples from my dataset, when I click on some thumbnails the image shown does not match the thumbnail I clicked on. It doesn't happen all the time, and only with some thumbnails. I haven't been able to figure out what causes it; I thought it was a caching issue, but it's not. Since I haven't been able to reproduce it consistently (it seems to be random), I haven't opened a bug ticket for it. Has anyone experienced this before? Is there any way I can figure out what's happening?
edit: to add to that, if I look at the browser console for the thumbnail and the full-screen image, I can see that when it's working properly they load the exact same image. When the issue arises, the paths of the thumbnail and the full-screen image are different.

  
  
Posted 6 months ago

Answers 12


Thank you @<1705029129084080128:profile|SmoothPigeon36> , glad to hear you enjoy ClearML.
I believe we identified the source of the problem and are working on a bug fix.

  
  
Posted 6 months ago

@<1523701087100473344:profile|SuccessfulKoala55> here is a screen capture to show you what I mean

  
  
Posted 6 months ago

Hi @<1705029129084080128:profile|SmoothPigeon36> , can you perhaps provide a concrete example?

  
  
Posted 6 months ago

Thanks, I'll pass it on to the UI team to try and reproduce it

  
  
Posted 6 months ago

There is no limitation on reporting the same images in different iterations. What we think might be the issue is that multiple images are reported with the same iteration + metric + variant combination, using report_image .
Can this be the case?
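Schematically, the collision would look something like this (a rough sketch, not actual ClearML internals; the dict stands in for however the viewer resolves a sample from its key):

```python
# Rough illustration: suppose the image viewer resolves a debug sample
# by its (iteration, metric, variant) key. Two report_image calls that
# share the key then collide, and only one image can be resolved when
# the thumbnail is clicked.
samples = {}

def report(iteration, metric, variant, image_path):
    # A later report silently overwrites an earlier one for the same key.
    samples[(iteration, metric, variant)] = image_path

report(0, "debug_samples", "cat", "batch0_elem0.png")
report(0, "debug_samples", "cat", "batch0_elem3.png")  # same key: collides

print(len(samples))  # 1 -- only one entry survives for the duplicated key
```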

  
  
Posted 5 months ago

that's amazing, thank you!!

  
  
Posted 6 months ago

Hi @<1705029129084080128:profile|SmoothPigeon36>
We found out that, unintentionally, the debug samples tab will display up to 3 images with the same iteration/metric/variant combination, which was not the initial intent, while the image viewer supports only 1 image per combination.
I have 2 questions.

  • Can you confirm that this is indeed the issue in your case?
  • And if so, was reporting more than 1 image for the same combination intentional?
  
  
Posted 6 months ago

Just a +1 here. When we use the same name for 3 different images, the thumbnails show 3 different images, but clicking on any of them displays only one. There is no way to display the others.

  
  
Posted 5 months ago

hi @<1523703436166565888:profile|DeterminedCrab71> thank you for looking into this! I'm not sure this is the problem. What I'm trying to do is simply display, in ClearML, some of the images produced by my data generator. I have a callback that, at the beginning of each epoch, calls report_image on every element of the batch. Because data augmentation is one of the things I want to inspect, the same images are reported at every epoch; however, if no augmentation is selected, it's possible that the images are identical across iterations.
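The shape of the callback is roughly this (names are made up, and a recording stand-in replaces clearml.Logger so the sketch is self-contained; the real call would be Logger.current_logger().report_image(title, series, iteration, image=...)):

```python
# Stand-in for clearml.Logger that just records the keys it is given.
class RecordingLogger:
    def __init__(self):
        self.keys = []

    def report_image(self, title, series, iteration, image=None):
        self.keys.append((title, series, iteration))

class DebugSampleCallback:
    def __init__(self, logger, batch):
        self.logger = logger  # in practice: clearml.Logger.current_logger()
        self.batch = batch    # iterable of (image, label) pairs

    def on_epoch_begin(self, epoch):
        for image, label in self.batch:
            # Using only the label as the series means two samples with
            # the same label in one epoch share the same
            # iteration + metric + variant key.
            self.logger.report_image("debug_samples", label, epoch, image=image)

logger = RecordingLogger()
cb = DebugSampleCallback(logger, [("img0", "cat"), ("img1", "cat")])
cb.on_epoch_begin(0)
print(len(logger.keys), len(set(logger.keys)))  # 2 reports, 1 unique key
```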

  
  
Posted 5 months ago

I just checked that, and you're right! The iteration and metric are definitely the same (same epoch, metric is "debug_samples" ), but the variant was initially different (it was simply f"{batch}{elem}" ). The bug started showing up when I started using the variant (or series) to show the label as text. As a consequence, I end up with samples from the same batch, with the same label, being reported with the same iteration + metric + variant.

  
  
Posted 5 months ago

Adding a running index to the variant name should take care of this issue
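Something like this, for example (a minimal sketch, helper name is hypothetical):

```python
# Appending a running index to the series/variant makes the
# iteration + metric + variant key unique for every element,
# even when several samples in a batch share the same label.
def variant_name(label: str, index: int) -> str:
    return f"{label}_{index}"

variants = [variant_name("cat", i) for i in range(3)]
print(variants)  # ['cat_0', 'cat_1', 'cat_2'] -- three distinct keys
```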

  
  
Posted 5 months ago

I just did that after realising the issue, thank you! 🙂

  
  
Posted 5 months ago