Hi GiganticMole91,
Can you please elaborate on what you are trying to do exactly?
Doesn't HyperParameterOptimizer change parameters out of the box?
Hi CostlyOstrich36,
I have created a base task on which I'm optimizing hyperparameters. With `clearml-param-search` I could use `--params-override` to set a static parameter that should not be optimized, e.g. changing the number of epochs for all experiments. It seems to me that this capability is not present in `HyperParameterOptimizer`. Does that make sense?
From the example on https://clear.ml/docs/latest/docs/apps/clearml_param_search/ :
```
clearml-param-search {...} --params-override '{"name": "epochs", "value": 30}'
```
Hi GiganticMole91. You could use something like
```
from clearml.automation import DiscreteParameterRange

HyperParameterOptimizer(
    ...,
    hyper_parameters=[
        DiscreteParameterRange("epochs", values=[100]),  # epochs is static
        ...,  # ... represents the other params
    ],
)
```
to get the same behaviour `--params-override` provides.
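For completeness, here's a minimal end-to-end sketch of that idea. The base task ID, the metric names, the learning-rate range, and the `General/` section prefix below are all placeholders, not from the thread; the prefix has to match whatever section your base task logged its parameters under (e.g. `Args/epochs` for argparse-connected parameters):
```
from clearml import Task
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    UniformParameterRange,
)

# The optimizer itself runs as a controller task
task = Task.init(
    project_name="examples",
    task_name="HPO controller",
    task_type=Task.TaskTypes.optimizer,
)

optimizer = HyperParameterOptimizer(
    base_task_id="<base-task-id>",  # placeholder: the task to clone
    hyper_parameters=[
        # Single-value range: pins epochs to 30 in every experiment,
        # mimicking --params-override
        DiscreteParameterRange("General/epochs", values=[30]),
        # A parameter that actually gets searched (illustrative)
        UniformParameterRange(
            "General/learning_rate", min_value=1e-4, max_value=1e-1
        ),
    ],
    objective_metric_title="validation",  # placeholder metric title
    objective_metric_series="accuracy",   # placeholder metric series
    objective_metric_sign="max",
    max_number_of_concurrent_tasks=2,
)

optimizer.start()  # a running clearml-agent picks up the queued tasks
optimizer.wait()   # block until the search finishes
optimizer.stop()
```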
Yeah, that makes sense. The only drawback is that you'll get a single point that all lines will go through in the Parallel Coordinates plot when the optimization finishes 🙂