Answered

Hey all, very new to ClearML! I am trying to design an HPO setup using the Optuna configuration, and I'm working on getting my template trainer set up. The issue I'm having is that it's unclear to me how to define one of my hyperparameters whose size is dynamic. As a simple example:

num_layers1 = trial.suggest_int("num_layers1", 1, 3)  # itself a hyperparameter

hidden_layers_1 = []
for i in range(num_layers1):
    a = trial.suggest_int(f"a1_{i}", 0, 4)  # 0 to 4
    m = trial.suggest_int(f"m1_{i}", 1, 4)  # 1 to 4
    hidden_layers_1.append([a, m])

In this example num_layers1 is a hyperparameter, but the a1_{i} and m1_{i} values are also hyperparameters. For optimization, num_layers1 can be 1, 2, or 3, which means there can be 1, 2, or 3 values each for a1_{i} and m1_{i}, which themselves range from 0 to 4 and 1 to 4, respectively. In my template trainer, though, I obviously won't be using the suggest_int syntax or providing a range. I need to figure out how to define this so that the arguments can be passed to ClearML as hyperparameters, and also how to declare that hyperparameter in the hyper_parameters list of the HyperParameterOptimizer arguments. In my mind I'm imagining a brute-force solution, but I'm really hoping there's a cleaner, more clever way to go about it.
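To make that concrete, the template-trainer side would just connect plain values to ClearML instead of calling suggest_int. A minimal sketch (the flat a1_i / m1_i names simply mirror the example above):

from clearml import Task

task = Task.init(project_name="hpo-example", task_name="template trainer")

# plain defaults instead of trial.suggest_int calls; ClearML records these
# under the default "General" section and the optimizer overrides them per trial
params = {
    "num_layers1": 2,
    "a1_0": 1, "m1_0": 2,
    "a1_1": 1, "m1_1": 2,
    "a1_2": 1, "m1_2": 2,
}
params = task.connect(params)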

If anyone has any suggestions, or if I did a terrible job explaining this and you could use some more information, please let me know. Thanks!

  
  
Posted one month ago

3 Answers


If anyone is curious, I figured out a solution: I simply define hyperparameters that I might not actually use, depending on the value of num_layers1. This works because of the limited size range, but I'm curious how this would hold up for larger dynamic ranges.
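Roughly, the optimizer side ends up looking like this (a minimal sketch; the metric names, queue, and task ID are placeholders for my actual setup, and the parameters are assumed to live under ClearML's default "General" section):

from clearml.automation import HyperParameterOptimizer, UniformIntegerParameterRange
from clearml.automation.optuna import OptimizerOptuna

MAX_LAYERS = 3  # upper bound on num_layers1, so the superset is declared up front

space = [UniformIntegerParameterRange("General/num_layers1", min_value=1, max_value=3)]
for i in range(MAX_LAYERS):
    # declare every a1_i / m1_i; the trainer just ignores the unused ones
    space.append(UniformIntegerParameterRange(f"General/a1_{i}", min_value=0, max_value=4))
    space.append(UniformIntegerParameterRange(f"General/m1_{i}", min_value=1, max_value=4))

optimizer = HyperParameterOptimizer(
    base_task_id="<template trainer task id>",  # placeholder
    hyper_parameters=space,
    objective_metric_title="validation",  # placeholder metric
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=OptimizerOptuna,
    execution_queue="default",
)
optimizer.start()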

  
  
Posted one month ago

you could also take the route of NOT specifying num_layers1, and instead write your own code to create a set of viable layer designs to choose from and pass that as a parameter, so Optuna selects from a countable set instead of suggesting integer values (sketched below).
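for example, something like this (a rough sketch; DiscreteParameterRange is ClearML's categorical range, and the JSON encoding is just one way to turn each design into a single value):

import itertools
import json

from clearml.automation import DiscreteParameterRange

# every viable [a, m] pair for one layer: a in 0..4, m in 1..4
layer_options = list(itertools.product(range(0, 5), range(1, 5)))

designs = []
for n in (1, 2, 3):  # the possible values of num_layers1
    for combo in itertools.product(layer_options, repeat=n):
        designs.append(json.dumps([list(pair) for pair in combo]))

# 20 + 400 + 8000 = 8420 designs here; this grows fast with the ranges
design_param = DiscreteParameterRange("General/layers_design", values=designs)
# the trainer then rebuilds the spec with json.loads(params["layers_design"])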

the downside of this is the lack of gradient information in the optimization process.

  
  
Posted one month ago

you're basically asking to sample from a distribution where not all parameters are mutually independent.

the short answer is no, this is not directly supported. Optuna needs each hyperparameter to be independent, so unfortunately it's up to you to handle the dependencies between parameters yourself.

your solution of defining them all independently and then using num_layers1 to potentially ignore the other parameters is a valid one.
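concretely, the masking on the trainer side can be as simple as this (params being whatever dict you connected via task.connect; the names just mirror the question):

num_layers1 = int(params["num_layers1"])
# rebuild the layer spec from the first num_layers1 pairs; any a1_i / m1_i
# with i >= num_layers1 was still sampled, but is simply never read
hidden_layers_1 = [
    [int(params[f"a1_{i}"]), int(params[f"m1_{i}"])]
    for i in range(num_layers1)
]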

  
  
Posted one month ago