
Hi everyone!

Is anybody using log-scale parameter ranges for hyper-parameter optimization? It seems that there is a bug in the hpbandster module. I'm getting negative learning rates..

Posted one year ago

15 Answers


But the missing implementation of LogUniformRange for hpbandster still causes problems.

What do you mean?

Posted one year ago

Awesome!

Posted one year ago

But the missing implementation of LogUniformRange for hpbandster still causes problems.

Posted one year ago

Yup, I think that's it! 🙂

Posted one year ago

This code snippet produces numbers in the range from 10 to 1000 instead of [10^-3, 10]. This could be fixed by changing https://github.com/allegroai/clearml/blob/master/clearml/automation/parameters.py#L168 :

Now:
values = [v*step_size for v in range(0, int(steps))]
Should be:
values = [self.min_value + v * step_size for v in range(0, int(steps))]

I've tested it locally and it behaves as expected. Also, it would allow for negative values which aren't supported at the moment.
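To make the before/after concrete, here is a minimal, self-contained sketch. UniformRangeSketch is a hypothetical stand-in for clearml's UniformParameterRange, reduced to just the grid logic; it is not the real class.

# Simplified stand-in for UniformParameterRange, showing only to_list()
class UniformRangeSketch:
    def __init__(self, name, min_value, max_value, step_size):
        self.name = name
        self.min_value = min_value
        self.max_value = max_value
        self.step_size = step_size

    def to_list(self, fixed=True):
        steps = (self.max_value - self.min_value) / self.step_size
        if fixed:
            # proposed fix: offset every grid point by min_value
            values = [self.min_value + v * self.step_size for v in range(0, int(steps))]
        else:
            # current behaviour: min_value is ignored, so the grid always starts at 0
            values = [v * self.step_size for v in range(0, int(steps))]
        return [{self.name: v} for v in values]

r = UniformRangeSketch('exponent', min_value=-3.0, max_value=1.0, step_size=0.5)
print(r.to_list(fixed=False))  # exponents 0.0, 0.5, ..., 3.5  (wrong start)
print(r.to_list(fixed=True))   # exponents -3.0, -2.5, ..., 0.5 (starts at min_value, negatives work)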

Posted one year ago

Moreover, LogUniformParameterRange is not implemented for the hpbandster optimizer, and because LogUniformParameterRange inherits from UniformParameterRange, the optimizer ends up sampling raw values from the range [-3, 1]. See https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L355
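A small check that makes the inheritance trap visible (this uses the real clearml classes, with the same import path as the snippets later in this thread; behaviour as of the code linked above):

from clearml.automation.parameters import UniformParameterRange, LogUniformParameterRange

lr = LogUniformParameterRange(name='lr', min_value=-3.0, max_value=1.0, step_size=0.5)
# bandster.py has no branch for LogUniformParameterRange, and this check passes...
print(isinstance(lr, UniformParameterRange))  # True
# ...so the optimizer treats it as a plain uniform range and samples the raw
# exponents in [-3, 1] directly, which is where negative learning rates come from.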

Posted one year ago

Hi AgitatedDove14,
The get_value() method works fine. The issue is in to_list(), which calls super().to_list(), which in turn returns a list starting at 0 (thus only positive values). My suggested modification to UniformParameterRange.to_list() would return a list starting at self.min_value (which could be negative) instead.
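For reference, the two call paths side by side (real clearml import, same parameters as in the snippets below; get_value() draws a random sample, so the exact number will vary):

from clearml.automation.parameters import LogUniformParameterRange

p = LogUniformParameterRange(name='lr', min_value=-3.0, max_value=1.0, step_size=0.5)
print(p.get_value())   # base ** sampled_exponent -> somewhere in [0.001, 10], never negative
print(p.to_list()[0])  # {'lr': 1.0} -> the grid starts at 10**0 instead of 10**-3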

Posted one year ago

Look here AgitatedDove14:
https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L356

There is no implementation for LogUniformParameterRange, but since it is an instance of UniformParameterRange (by inheritance), this method will return values between [-3, ..., 1] for my example. It should either raise an exception or return [0.001, ..., 1].
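A sketch of that guard, assuming a conversion step shaped roughly like the one in bandster.py (the function name and return shape here are illustrative, not clearml's actual API):

from clearml.automation.parameters import UniformParameterRange, LogUniformParameterRange

def convert_range(param):
    # The subclass must be checked BEFORE the parent class; otherwise a log
    # range silently falls through to the plain-uniform branch below.
    if isinstance(param, LogUniformParameterRange):
        # Option 1: fail loudly instead of sampling raw exponents.
        # Option 2 (alternative): exponentiate the bounds so the optimizer
        # samples [param.base ** param.min_value, param.base ** param.max_value],
        # i.e. [0.001, 10] for this example.
        raise NotImplementedError('LogUniformParameterRange is not implemented for hpbandster')
    if isinstance(param, UniformParameterRange):
        return (param.min_value, param.max_value)  # illustrative uniform handling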

Posted one year ago

GreasyLeopard35 I think you are on to something; I think UniformParameterRange just misses adding the min value:
https://github.com/allegroai/clearml/blob/fcad50b6266f445424a1f1fb361f5a4bc5c7f6a3/clearml/automation/parameters.py#L168
Should be:
[self.min_value + v*step_size for v in range(0, int(steps))]

Posted one year ago

GreasyLeopard35
I can update that the fix to UniformIntegerParameterRange should be pushed with tomorrow's release 🙂
(which would in turn fix LogUniformParameterRange)

Posted one year ago

Hmm GreasyLeopard35 can you specify the range you are passing to the HPO, as well as the type of optimization class? (grid/random/optuna, etc.)

Posted one year ago

from clearml.automation.parameters import LogUniformParameterRange
sampler = LogUniformParameterRange(name='test', min_value=-3.0, max_value=1.0, step_size=0.5)
sampler.to_list()

Out[2]:
[{'test': 1.0},
 {'test': 3.1622776601683795},
 {'test': 10.0},
 {'test': 31.622776601683793},
 {'test': 100.0},
 {'test': 316.22776601683796},
 {'test': 1000.0},
 {'test': 3162.2776601683795}]
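For what it's worth, the grid above is exactly what you get when the exponents start at 0 instead of min_value and are then raised to base 10; a quick arithmetic check (plain Python, no clearml needed):

# to_list() builds 8 grid points; the buggy parent ignores min_value=-3,
# so the exponents run 0.0 .. 3.5 instead of -3.0 .. 0.5:
steps = int((1.0 - (-3.0)) / 0.5)                  # 8
buggy_exponents = [v * 0.5 for v in range(steps)]  # 0.0, 0.5, ..., 3.5
print([10 ** e for e in buggy_exponents])          # 1.0, 3.162..., ..., 3162.277... (matches above)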

Posted one year ago

GreasyLeopard35 from the implementation:
https://github.com/allegroai/clearml/blob/fcad50b6266f445424a1f1fb361f5a4bc5c7f6a3/clearml/automation/parameters.py#L215
This basically returns self.base (default 10) to the power of the selected value:
10**-3 = 0.001
So how would I get a negative value?
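The exponentiation step in isolation (plain Python), showing why get_value() on its own can never produce a negative number:

base = 10  # default self.base in LogUniformParameterRange
for exponent in (-3.0, -1.5, 0.0, 1.0):
    print(base ** exponent)  # 0.001, 0.0316..., 1.0, 10.0 -> always positive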

Posted one year ago

What do you think? Thanks for your feedback!

Posted one year ago

from clearml.automation.parameters import LogUniformParameterRange
sampler = LogUniformParameterRange(name='test', min_value=-3.0, max_value=1.0, step_size=0.5)
sampler.to_list()

Posted one year ago