AmusedParrot89 are you reporting these for every iteration, or once?
SuccessfulKoala55 - yes, plots are reported every iteration.
AmusedParrot89 - the plot comparison indeed compares the latest iteration of the experiments. I will see if this can be better indicated somewhere.
AmusedParrot89 I'll have to check regarding the `None` value for the iteration, but for sure this means it's overwriting the last report every time you report a new one.
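For context, a minimal sketch of the pattern being discussed, assuming this is the ClearML `Logger` API (the project/task names and figure contents below are placeholders):

```python
from clearml import Task
import plotly.graph_objects as go

task = Task.init(project_name="examples", task_name="plot-reporting-sketch")
logger = task.get_logger()

# Placeholder figure standing in for the real evaluation plot.
fig = go.Figure(data=go.Scatter(y=[0.1, 0.2, 0.3]))

# With iteration=None, each call under the same title/series replaces the
# previous plot instead of adding a new point in the iteration history.
logger.report_plotly(
    title="Evaluation Report",
    series="evaluation",
    iteration=None,
    figure=fig,
)
```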
AmusedParrot89 - I see the logic in displaying the last iteration per metric in the compare screen. We'll need to think about whether this could cause any other issues.
In the meantime, may I ask you to open a GitHub issue, so it will be easier to track?
I’m reporting `Validation Report (best)` every time I find a better model, and report `Validation Report (latest)` every time. The `Evaluation Report` is something I run after the training itself is complete, so it’s not tied to a specific iteration (I’m passing `None`).
So, if this only shows the latest iteration, the solution would be to report all three series at the last iteration? Is there a different way to report plots that are not tied to an iteration?
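To make that proposed workaround concrete, a sketch of reporting all three plots at the same final iteration (again assuming the ClearML `Logger` API; `final_iteration` and the figures are illustrative):

```python
from clearml import Task
import plotly.graph_objects as go

logger = Task.current_task().get_logger()

# Placeholder figures standing in for the real validation/evaluation plots.
best_fig = go.Figure(data=go.Scatter(y=[0.90]))
latest_fig = go.Figure(data=go.Scatter(y=[0.85]))
eval_fig = go.Figure(data=go.Scatter(y=[0.88]))

final_iteration = 1000  # hypothetical: the last training iteration

# Report all three at the same final iteration so the compare screen,
# which shows the latest iteration per metric, picks them all up.
logger.report_plotly(title="Validation Report (best)", series="validation",
                     iteration=final_iteration, figure=best_fig)
logger.report_plotly(title="Validation Report (latest)", series="validation",
                     iteration=final_iteration, figure=latest_fig)
logger.report_plotly(title="Evaluation Report", series="evaluation",
                     iteration=final_iteration, figure=eval_fig)
```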
AmusedParrot89 - let me check this and get back to you