What do you think? Thanks for your feedback!
GreasyLeopard35 I think you are on to something: UniformParameterRange is just missing the min value offset:
https://github.com/allegroai/clearml/blob/fcad50b6266f445424a1f1fb361f5a4bc5c7f6a3/clearml/automation/parameters.py#L168
Should be: `[self.min_value + v * step_size for v in range(0, int(steps))]`
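Not the actual clearml code, just a standalone sketch of what that comprehension would yield (using the example values min_value=-3.0, max_value=1.0, step_size=0.5 from below):
` min_value, max_value, step_size = -3.0, 1.0, 0.5
steps = (max_value - min_value) / step_size  # 8.0
values = [min_value + v * step_size for v in range(0, int(steps))]
print(values)  # [-3.0, -2.5, -2.0, -1.5, -1.0, -0.5, 0.0, 0.5] `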
Look here AgitatedDove14:
https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L356
There is no implementation for LogUniformParameterRange, but since it is an instance of UniformParameterRange (by inheritance), this method will return raw exponents in the range [-3, 1] for my example. It should either raise an exception or return [0.001, ..., 10].
` from clearml.automation.parameters import LogUniformParameterRange
sampler = LogUniformParameterRange(name='test', min_value=-3.0, max_value=1.0, step_size=0.5)
sampler.to_list()
Out[2]:
[{'test': 1.0},
{'test': 3.1622776601683795},
{'test': 10.0},
{'test': 31.622776601683793},
{'test': 100.0},
{'test': 316.22776601683796},
{'test': 1000.0},
{'test': 3162.2776601683795}] `
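For contrast, here is the output I would expect after the fix, computed by hand (assuming base=10; whether max_value itself shows up depends on how the range handles the upper bound):
` fixed_exponents = [-3.0 + v * 0.5 for v in range(0, 8)]  # [-3.0, -2.5, ..., 0.5]
expected = [10 ** e for e in fixed_exponents]
print(expected)  # [0.001, 0.00316..., 0.01, ..., 1.0, 3.162...] `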
Moreover, LogUniformParameterRange is not implemented for the hpbandster optimizer, and sampling falls back to the raw range [-3, 1] since LogUniformParameterRange inherits from UniformParameterRange. See https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L355
GreasyLeopard35
I can update that the fix to UniformIntegerParameterRange should be pushed with tomorrow's release 🙂
(which would in turn fix LogUniformParameterRange)
Hi AgitatedDove14 ,
The get_value() method works fine. The issue is in to_list(), which calls super().to_list(), which in turn returns a list starting at 0 (thus only positive values). My suggested modification to UniformParameterRange.to_list() would return a list starting at self.min_value (which could be negative) instead.
` from clearml.automation.parameters import LogUniformParameterRange
sampler = LogUniformParameterRange(name='test', min_value=-3.0, max_value=1.0, step_size=0.5)
sampler.to_list() `
This code snippet produces values ranging from 1 to about 3162 instead of [10^-3, 10^1]. This could be fixed by changing https://github.com/allegroai/clearml/blob/master/clearml/automation/parameters.py#L168 :
Now: `values = [v*step_size for v in range(0, int(steps))]`
Should be: `values = [self.min_value + v * step_size for v in range(0, int(steps))]`
I've tested it locally and it behaves as expected. Also, it would allow for negative values which aren't supported at the moment.
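To illustrate the negative-values point, a standalone sketch (not the clearml source) of the patched comprehension with a hypothetical negative-to-positive range:
` min_value, max_value, step_size = -1.0, 1.0, 0.5
steps = (max_value - min_value) / step_size
values = [min_value + v * step_size for v in range(0, int(steps))]
print(values)  # [-1.0, -0.5, 0.0, 0.5] (upper-bound handling aside) `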
But the missing implementation of LogUniformParameterRange for hpbandster still causes problems.
wdym?
GreasyLeopard35 from the implementation:
https://github.com/allegroai/clearml/blob/fcad50b6266f445424a1f1fb361f5a4bc5c7f6a3/clearml/automation/parameters.py#L215
Which basically returns `self.base` (default 10) to the power of the selected value: `10**-3 = 0.001`
So how would I get a negative value?
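To spell that out: the sampled value is only an exponent, so the returned value is positive even when the exponent is negative (plain Python, assuming base=10):
` base = 10
for exponent in (-3.0, -1.5, 0.0, 1.0):
    print(base ** exponent)  # 0.001, 0.0316..., 1.0, 10.0 `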
But the missing implementation of LogUniformParameterRange for hpbandster still causes problems.
Hmm GreasyLeopard35 can you specify the range you are passing to the HPO, as well as the type of optimization class? (grid/random/optuna etc.)