Hi everyone! I started using ClearML to track some of our experiments last week (currently using the Pro tier), but I'm having some issues trying to compare the plots of two experiments. Each experiment has three tables as plots - one as a plot with a sin
I'm reporting `Validation Report (best)` every time I find a better model, and I report `Validation Report (latest)` every time. The `Evaluation Report` is something I run after the training itself is complete, so it's not tied to a specific iteration (I'm passing `None`).
So, if this only shows the latest iteration, would the solution be to report all three series at the last iteration? Is there a different way to report plots that are not tied to an iteration?
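For context, here is a minimal sketch of what I mean by reporting all the series at one fixed iteration. It uses `Logger.report_table` from the ClearML SDK; the helper function name and the idea of passing the logger in are just mine for illustration:

```python
import pandas as pd


def report_validation_tables(logger, best_df, latest_df, iteration):
    """Report both validation tables at the same fixed iteration.

    `logger` is expected to be a ClearML Logger (e.g. from
    Task.current_task().get_logger()); `best_df` and `latest_df` are
    pandas DataFrames passed as `table_plot`.
    """
    # Reporting every series under the same title and at the same
    # iteration should keep them visible together in the comparison view.
    logger.report_table(title="Validation Report", series="best",
                        iteration=iteration, table_plot=best_df)
    logger.report_table(title="Validation Report", series="latest",
                        iteration=iteration, table_plot=latest_df)
```

I'm not sure whether this is the intended pattern, or whether there's a dedicated way to report iteration-less plots.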