Answered
Hi Guys, I'm Trying To Familiarize Myself With Hyperparameter Optimization Using ClearML. It Seems Like There Is A Discrepancy Between…

Hi guys,
I'm trying to familiarize myself with Hyperparameter Optimization using ClearML.
It seems like there is a discrepancy between the clearml-param-search CLI tool and the HyperParameterOptimizer class: the --params-override flag is apparently only available in the CLI tool, not in the HyperParameterOptimizer class.

Is there a way to override some parameters of the base task when optimizing with the class-based approach, say, changing the number of epochs?

Edit: I'm on ClearML v1.6.2

Posted one year ago

Answers 4


Hi GiganticMole91. You could use something like

```python
from clearml.automation import DiscreteParameterRange

HyperParameterOptimizer(
    ...,
    # "epochs" is static; ... represents the other (searched) parameters
    hyper_parameters=[DiscreteParameterRange("epochs", values=[100]), ...],
)
```

to get the same behaviour --params-override provides.
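For context, a fuller sketch of that suggestion might look like the following. This is a minimal, hypothetical example: the base task ID, the "General/..." parameter names, and the objective metric title/series are placeholders that depend on how your base task logged its hyperparameters and scalars.

```python
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    RandomSearch,
    UniformParameterRange,
)

optimizer = HyperParameterOptimizer(
    # Placeholder: ID of the experiment registered as the base task
    base_task_id="<base-task-id>",
    hyper_parameters=[
        # Pin epochs to a single value so every trial uses it unchanged,
        # the class-based equivalent of
        # --params-override '{"name": "epochs", "value": 30}'
        DiscreteParameterRange("General/epochs", values=[30]),
        # Parameters that are actually searched
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
    ],
    # Placeholders: the scalar your base task reports
    objective_metric_title="validation",
    objective_metric_series="accuracy",
    objective_metric_sign="max",
    optimizer_class=RandomSearch,
    max_number_of_concurrent_tasks=2,
)

optimizer.start()  # enqueue trials (requires a running clearml-agent)
optimizer.wait()   # block until the optimization completes
optimizer.stop()   # make sure all pending jobs are stopped
```

With epochs fixed this way, every trial is a clone of the base task in which only the searched parameters vary, which matches what --params-override provides on the CLI side.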

Posted one year ago

Hi CostlyOstrich36
I have created a base task on which I'm optimizing hyperparameters. With clearml-param-search I could use --params-override to set a static parameter that should not be optimized, e.g. changing the number of epochs for all experiments. It seems to me that this capability is not present in HyperParameterOptimizer. Does that make sense?

From the example on https://clear.ml/docs/latest/docs/apps/clearml_param_search/ :
clearml-param-search {...} --params-override '{"name": "epochs", "value": 30}'

Posted one year ago

Hi GiganticMole91 ,

Can you please elaborate on what you're trying to do, exactly?

Doesn't HyperParameterOptimizer change parameters out of the box?

Posted one year ago

Yeah, that makes sense. The only drawback is that all lines will pass through a single point in the Parallel Coordinates plot when the optimization finishes 🙂

Posted one year ago