Hi Everyone!
Is anybody using log-scale parameter ranges for hyper-parameter optimization? It seems that there is a bug in the hpbandster module. I'm getting negative learning rates.
Look here, AgitatedDove14:
https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L356
There is no specific handling for LogUniformParameterRange, and since it is an instance of UniformParameterRange (by inheritance), this method returns the raw exponents in [-3, ..., 1] for my example. It should either raise an exception or return the exponentiated values [0.001, ..., 1].
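For anyone hitting the same thing, here is a minimal, self-contained sketch of the dispatch problem (simplified stand-in classes and a hypothetical convert_to_config_space helper, not the actual ClearML/HpBandSter code; the exponent values are just illustrative): because LogUniformParameterRange inherits from UniformParameterRange, an isinstance check against the base class matches first and the exponents are passed through as the value range, which is where the negative learning rates come from. Checking the subclass first and exponentiating (or raising) avoids it.

```python
# Simplified stand-ins for the parameter range classes (not the real ClearML API).
class UniformParameterRange:
    def __init__(self, name, min_value, max_value):
        self.name = name
        self.min_value = min_value
        self.max_value = max_value


class LogUniformParameterRange(UniformParameterRange):
    # min_value/max_value are exponents: the real range is
    # [base**min_value, base**max_value]
    def __init__(self, name, min_value, max_value, base=10):
        super().__init__(name, min_value, max_value)
        self.base = base


def convert_to_config_space(param):
    # BUG: a LogUniformParameterRange is also an instance of
    # UniformParameterRange, so this branch matches and the raw
    # exponents are used as the value range.
    if isinstance(param, UniformParameterRange):
        return {'name': param.name, 'lower': param.min_value, 'upper': param.max_value}
    raise ValueError('unsupported parameter type: %r' % param)


def convert_to_config_space_fixed(param):
    # Fix: handle the more specific subclass first and exponentiate,
    # so the sampled learning rates stay positive.
    if isinstance(param, LogUniformParameterRange):
        return {'name': param.name,
                'lower': param.base ** param.min_value,
                'upper': param.base ** param.max_value,
                'log': True}
    if isinstance(param, UniformParameterRange):
        return {'name': param.name, 'lower': param.min_value, 'upper': param.max_value}
    raise ValueError('unsupported parameter type: %r' % param)


lr = LogUniformParameterRange('learning_rate', min_value=-3, max_value=0)
print(convert_to_config_space(lr))        # lower=-3, upper=0  -> negative learning rates
print(convert_to_config_space_fixed(lr))  # lower=0.001, upper=1.0, log=True
```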