Python 3.9 runs fine, but there's an issue with the PyTorch dataloaders that seems to be related to that Python version. The ClearML version is 1.6.2 and the agents are 1.3.0.
The command that's prompted already contains the reference to "python=3.1", so this won't work.
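If it helps narrow it down: my guess is that the version is being handled as a float somewhere along the way, which would silently drop the trailing zero. Just an illustration of the effect, not the actual agent code:

version = 3.10                  # hypothetical: Python version stored as a float
print('python=%s' % version)    # prints python=3.1, which conda can't resolve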
Hi AgitatedDove14 ,
The get_value() method works fine. The issue is in to_list(), which calls super().to_list(), which in turn returns a list starting at 0 (and thus only positive values). My suggested modification to UniformParameterRange.to_list() would return a list starting at self.min_value (which can be negative) instead; see the sketch below the snippet.
Moreover, LogUniformParameterRange is not implemented for the hpbandster optimizer and results in values in the range [-3, 1], since LogUniformParameterRange inherits from UniformParameterRange. See https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L355
from clearml.automation.parameters import LogUniformParameterRange
sampler = LogUniformParameterRange(name='test', min_value=-3.0, max_value=1.0, step_size=0.5)
sampler.to_list()  # inherited from UniformParameterRange: the grid starts at 0, not at min_value
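Concretely, the change I have in mind would look roughly like this (a sketch only, attribute names approximated from parameters.py; the include_max handling stays as it is today):

def to_list(self):
    steps = (self.max_value - self.min_value) / self.step_size
    # start the grid at min_value instead of 0, so negative ranges work too
    values = [self.min_value + v * self.step_size for v in range(0, int(steps))]
    if self.include_max and (not values or values[-1] < self.max_value):
        values.append(self.max_value)
    return [{self.name: v} for v in values]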
What do you think? Thanks for your feedback!
Look here, AgitatedDove14:
https://github.com/allegroai/clearml/blob/master/clearml/automation/hpbandster/bandster.py#L356
There is no implementation for LogUniformParameterRange, but since it is an instance of UniformParameterRange (by inheritance), this method will return values in [-3, ..., 1] for my example. It should either raise an exception or return [0.001, ..., 10].
But the missing implementation of LogUniformParameterRange for hpbandster still causes problems.
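One way the missing case could be handled in bandster.py (just a sketch, not the actual ClearML code; the function name to_configspace is made up, and I'm assuming base-10 exponents, which is the LogUniformParameterRange default):

import ConfigSpace.hyperparameters as CSH
from clearml.automation.parameters import LogUniformParameterRange, UniformParameterRange

def to_configspace(param):
    # check the subclass first: every LogUniformParameterRange is also an
    # instance of UniformParameterRange, so the order of the checks matters
    if isinstance(param, LogUniformParameterRange):
        # min_value/max_value are exponents; ConfigSpace expects the actual bounds
        return CSH.UniformFloatHyperparameter(
            param.name, lower=10 ** param.min_value, upper=10 ** param.max_value, log=True)
    if isinstance(param, UniformParameterRange):
        return CSH.UniformFloatHyperparameter(
            param.name, lower=param.min_value, upper=param.max_value)
    raise ValueError('unsupported parameter type: %s' % type(param).__name__)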
It isn't reproducible. I had a silly mistake in my script that parsed the arguments twice. Thanks anyway, you got me on the right track! :)
Hi,
thanks for the prompt reply, AgitatedDove14 . Here are some more details:
I am executing locally (i.e. I set args['run_as_service'] = False as in https://github.com/allegroai/clearml/blob/400c6ec103d9f2193694c54d7491bb1a74bbe8e8/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py#L45 ). Everything was fine until some network issues occurred and my task was aborted. When I restart it, I see these duplicate configurations in the UI.
However, I've just noticed th...
CostlyOstrich36 Any idea what could be going wrong?
This code snippet produces numbers in the range from 10 to 1000 instead of [10^-3, 10]. This could be fixed by changing https://github.com/allegroai/clearml/blob/master/clearml/automation/parameters.py#L168 :
Now:
values = [v*step_size for v in range(0, int(steps))]
Should be:
values = [self.min_value + v * step_size for v in range(0, int(steps))]
I've tested it locally and it behaves as expected. Also, it would allow for negative values, which aren't supported at the moment.
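For reference, a quick standalone check of the two expressions with the values from my snippet above (just the arithmetic, not the library code):

min_value, max_value, step_size = -3.0, 1.0, 0.5
steps = (max_value - min_value) / step_size  # 8 steps
old = [v * step_size for v in range(0, int(steps))]              # [0.0, 0.5, ..., 3.5]
new = [min_value + v * step_size for v in range(0, int(steps))]  # [-3.0, -2.5, ..., 0.5]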