Answered
Hi, I Have a Question Regarding automation.HyperParameterOptimizer()

Hi, I have a question regarding automation.HyperParameterOptimizer(). We are seeing that the objective being logged (in the experiments table, under results/plots) is the one from the last epoch, not the best one for that experiment. So picking the best experiment by objective doesn't guarantee we get the best possible result: an experiment further down the list, with a worse last-epoch objective, may actually contain the best loss of all experiments.

We are using this configuration:
"objective_metric_title": "epoch_loss", "objective_metric_series": "validation: epoch_loss", "objective_metric_sign": "min",Now we realized there is an option tu use "min_global" on the sign, is this what we need?

  
  
Posted 2 years ago

10 Answers


I think it does make sense and is what we were looking for.

  
  
Posted 2 years ago

We are giving it a try

  
  
Posted 2 years ago

In case of overfitting (using val loss), the last value and the min might not even be close, but maybe the hyperparam optimizer aborts in those cases? I am not too familiar with when the hyperparam optimizer aborts an experiment.

  
  
Posted 2 years ago

thank you!

  
  
Posted 2 years ago

but out of curiosity, what's the point of doing a hyperparam search on the value of the loss at the last epoch of the experiment

The problem is that you might end up with a global min that is really nice, but it was 3 epochs ago, and you only have the last checkpoint ...
BTW, the global min and the last value should not be very different if the model converges, wdyt?

  
  
Posted 2 years ago

but maybe the hyperparam optimizer aborts in those cases?

From the hyperparam perspective it will be trying to optimize the global minimum, basically "ignoring" the last value reported. Does that make sense?
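To illustrate with toy numbers (plain Python, not ClearML code), the two signs compare different values from the same experiment:

    # hypothetical per-epoch validation losses for one experiment
    val_loss = [0.90, 0.45, 0.31, 0.52, 0.60]

    last_value = val_loss[-1]   # what "min" compares across experiments -> 0.60
    global_min = min(val_loss)  # what "min_global" compares -> 0.31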

  
  
Posted 2 years ago

... the one for the last epoch and not the best one for that experiment,

Well,

Now we realized there is an option to use "min_global" for the sign, is this what we need?

Yes 🙂 (or max_global)

  
  
Posted 2 years ago

OK, we will give it a try, but out of curiosity, what's the point of doing a hyperparam search on the value of the loss at the last epoch of the experiment vs using the min loss over the full experiment?

  
  
Posted 2 years ago

sure thing 🙂

  
  
Posted 2 years ago

I always save the checkpoint at the min/max loss, so that won't be a problem. We were seeing numerical discrepancies between the loss value of the checkpoint and the objective reported by the hyperparam optimizer; that's how we noticed.
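For context, this is roughly the pattern we use to keep the best checkpoint (a sketch with toy stand-ins; the real training step goes where the comment is):

    import math
    import torch
    import torch.nn as nn

    # toy stand-ins so the sketch runs; replace with the real model and validation
    model = nn.Linear(4, 1)

    def validate(m):
        x, y = torch.randn(8, 4), torch.randn(8, 1)
        return nn.functional.mse_loss(m(x), y).item()

    best_loss = math.inf
    for epoch in range(5):
        # ... training step goes here ...
        loss = validate(model)
        if loss < best_loss:  # checkpoint the global minimum, not the last epoch
            best_loss = loss
            torch.save(model.state_dict(), "best.pt")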

  
  
Posted 2 years ago