I see that trains is automatically plotting scalars like epoch_acc and epoch_loss, but it is not clear where it is picking these up from. In one of the runs I see an epoch_lr plot, but I don't see it in another. Hence I'm manually calling report_scalar on the variables of interest, which causes duplicated plots in some cases.
Figured it out: I can use logger.report_scalar to do this.
Hi FriendlyKoala70, trains will report all the TensorBoard graphs; I'm assuming that's what is creating the epoch_lr graph. On top of that, you can always report manually with the logger (as you pointed out). Does that make sense to you?
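For reference, manual reporting can be wrapped in a small helper like the sketch below. `report_scalar(title, series, value, iteration)` is the trains `Logger` method mentioned above; the helper name, metric names, and project/task names are illustrative, not part of the SDK. To avoid the duplicated plots described earlier, give manually reported scalars titles that differ from the ones TensorBoard already emits.

```python
def report_epoch_metrics(logger, epoch, metrics):
    """Report each metric (name -> value) as its own scalar plot.

    `logger` is expected to be a trains Logger (e.g. task.get_logger()).
    `title` names the plot; `series` names the line inside that plot.
    """
    for name, value in metrics.items():
        logger.report_scalar(title=name, series="train", value=value, iteration=epoch)


# Typical wiring (requires the trains SDK and a reachable trains server;
# project/task names here are hypothetical):
#
#   from trains import Task
#   task = Task.init(project_name="examples", task_name="manual-scalars")
#   report_epoch_metrics(task.get_logger(), epoch=0,
#                        metrics={"epoch_acc_manual": 0.91})
```

Note that any scalar already going through TensorBoard is captured automatically, so the helper is only needed for values you are not writing to TensorBoard.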