Hi everyone!
Is anybody using log-scale parameter ranges for hyper-parameter optimization? There seems to be a bug in the hpbandster module: I'm getting negative learning rates.
This code snippet produces numbers in the range from 10 to 1000 instead of [10^-3, 10]. It could be fixed by changing https://github.com/allegroai/clearml/blob/master/clearml/automation/parameters.py#L168 :
Now: values = [v*step_size for v in range(0, int(steps))]
Should be: values = [self.min_value + v * step_size for v in range(0, int(steps))]
I've tested the change locally and it behaves as expected. It would also allow negative minimum values, which aren't supported at the moment.
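To illustrate, here is a minimal standalone sketch of the sampling logic (names follow the linked code, but this is an illustration, not the actual clearml class). For a log-scale range the sampled values are exponents, so dropping min_value shifts every exponent up by |min_value|:

```python
def to_list(min_value, max_value, steps):
    # Evenly spaced grid over [min_value, max_value), as in the linked code.
    step_size = (max_value - min_value) / steps
    # Current behavior: min_value is dropped, so the grid starts at 0.
    buggy = [v * step_size for v in range(0, int(steps))]
    # Proposed fix: offset each step by min_value.
    fixed = [min_value + v * step_size for v in range(0, int(steps))]
    return buggy, fixed

# Exponent range [-3, 1], i.e. learning rates in [10^-3, 10].
buggy, fixed = to_list(-3.0, 1.0, 4)
# buggy  -> [0.0, 1.0, 2.0, 3.0]   => 10**v gives [1, 10, 100, 1000]
# fixed  -> [-3.0, -2.0, -1.0, 0.0] => 10**v gives [0.001, 0.01, 0.1, 1]
```

This reproduces the symptom above: with the current code, the log-scale exponents always start at 0 regardless of min_value, so the resulting learning rates land orders of magnitude too high.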