Hi, I have a question regarding automation.HyperParameterOptimizer(). We are seeing the objective being logged (on the experiments table - results/plots...
I always save the checkpoint of the min/max loss, so that won't be a problem. We were seeing numerical discrepancies between the loss value of that checkpoint and the objective reported by the hyperparameter optimizer; that's how we noticed.
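To make it concrete, here is roughly what our reporting looks like; it's only a sketch, the project/metric names and the dummy validation loop are illustrative, not our real code:

```python
import random
from clearml import Task

# Rough sketch: report the val loss each epoch and keep the min-loss checkpoint.
task = Task.init(project_name="examples", task_name="hpo-trial")
logger = task.get_logger()

best_val_loss = float("inf")
for epoch in range(20):
    val_loss = random.random()  # stand-in for our real validation loss
    # This scalar (title="validation", series="loss") is what we point the
    # HyperParameterOptimizer at as its objective metric.
    logger.report_scalar(title="validation", series="loss", value=val_loss, iteration=epoch)
    if val_loss < best_val_loss:
        best_val_loss = val_loss
        # stand-in for saving the checkpoint corresponding to the min loss
        task.upload_artifact("best_checkpoint", artifact_object={"epoch": epoch, "val_loss": val_loss})
```

The checkpoint's loss (the minimum) is what we compare against the objective value the optimizer reports, and the two don't match.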
In case of overfitting (using val loss), the last value and the minimum might not even be close, but maybe the hyperparameter optimizer aborts in those cases? I'm not too familiar with when the hyperparameter optimizer aborts an experiment.
Ok, we will give it a try, but out of curiosity, what's the point of doing a hyperparameter search on the loss value at the last epoch of the experiment vs using the minimum loss over the full experiment?
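For reference, here is roughly how I'd set it up; this is just a sketch, the parameter names, ranges and queue are placeholders, and the "min_global" behaviour (rank by the minimum over the whole run instead of the last reported value) is my reading of the docs rather than something we've verified:

```python
from clearml.automation import (
    HyperParameterOptimizer,
    UniformParameterRange,
    DiscreteParameterRange,
    RandomSearch,
)

optimizer = HyperParameterOptimizer(
    base_task_id="<base task id>",  # template experiment to clone for each trial
    hyper_parameters=[
        UniformParameterRange("General/lr", min_value=1e-5, max_value=1e-2),
        DiscreteParameterRange("General/batch_size", values=[32, 64, 128]),
    ],
    objective_metric_title="validation",
    objective_metric_series="loss",
    # "min" seems to rank trials by the LAST reported value, while "min_global"
    # seems to rank them by the minimum over the whole experiment, which is what
    # we'd want when the val loss overfits after its minimum.
    objective_metric_sign="min_global",
    optimizer_class=RandomSearch,
    execution_queue="default",
    max_number_of_concurrent_tasks=2,
    # the abort-related knobs I could find: per-trial time / iteration limits
    time_limit_per_job=60.0,        # minutes before a trial is aborted
    max_iteration_per_job=100,
)

optimizer.start()
optimizer.wait()
optimizer.stop()
```

If "min_global" really does use the global minimum, that would cover the overfitting case I mentioned above, so please correct me if I'm misreading it.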