hi ScaryBluewhale66
Can you please elaborate? I am not sure I get your question. What do you need to compare?
I see! Thanks AnxiousSeal95 and SweetBadger76!
I have another related question. Are the items in "+ metric" limited? I can only select the items in Results -> Scalars.
Hi SweetBadger76
For example, I want to compare accuracy (the metric I'm interested in) among different experiments. This metric isn't automatically recorded by ClearML, so I want to record it manually.
I've found a workaround to achieve it (as mentioned in the original message), but I'm still wondering if there is any suggestion other than using logger.report_scalar?
Hi ScaryBluewhale66 , I believe the new server that's about to be released soon (this or next week) will allow you to report a "single value metric". So if you want to report just a number per experiment you can, and then you can also compare between runs.
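A rough sketch of what that could look like once the feature lands; the method name report_single_value is an assumption here, so check the SDK release notes for the actual API:
```python
# Sketch only: reporting one value per experiment with the upcoming feature.
# The method name report_single_value is an assumption; verify against the SDK docs.
from clearml import Task

task = Task.init(project_name="examples", task_name="single-value-demo")
task.get_logger().report_single_value(name="accuracy", value=0.93)  # assumed API
```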
report_scalar lets you manually report a scalar series; this is the dedicated function. There could be other ways to report a scalar, for example through TensorBoard - in that case you would report to TensorBoard, and ClearML will automatically capture the values.
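For reference, a minimal sketch of both routes (project/task names and values are placeholders; the second route assumes PyTorch's SummaryWriter is available):
```python
from clearml import Task

task = Task.init(project_name="examples", task_name="accuracy-report")
logger = task.get_logger()

# Dedicated API: manually report a scalar series.
logger.report_scalar(title="accuracy", series="validation", value=0.91, iteration=5)

# Alternative: write to TensorBoard and let ClearML capture the values automatically.
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()
writer.add_scalar("accuracy/validation", 0.91, global_step=5)
writer.close()
```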
report_scalar() with a constant iteration is a hack that you can use in the meantime 🙂
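Something like this, assuming a task is already initialized (the title, series, and value are made up):
```python
# Constant-iteration hack: always report under iteration 0, so each experiment
# carries a single comparable "accuracy" value you can add as a column in the UI.
from clearml import Logger

final_accuracy = 0.93  # hypothetical result from your evaluation code
Logger.current_logger().report_scalar(
    title="summary", series="accuracy", value=final_accuracy, iteration=0
)
```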