Answered
Hello! Does Anyone Know How To Do

Hello! Does anyone know how to do HPO when your parameters are in a Hydra configuration file? What is the correct way to do this (e.g. how do I declare the params for the optimizer? Do I need to use task.connect?)?
I can't find any info in the docs. Thanks.

  
  
Posted one year ago

Answers 16


Hmm… probably simpler/cleaner if I do:

hpo_params = {
    'param1': cfg.param_1,
    ...
}

task.connect(hpo_params)

Thoughts?
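
For reference, a minimal sketch of that approach (the key names param1 / batch_size and the project/task names are placeholders, and cfg is assumed to be the Hydra config object loaded elsewhere):

from clearml import Task

task = Task.init(project_name='examples', task_name='hpo base task')

# flatten the Hydra values of interest into a plain dict
hpo_params = {
    'param1': cfg.param_1,
    'batch_size': cfg.trainer.params.batch_size,
}

# connect() returns the dict with any remote/UI overrides applied;
# by default these keys appear in the UI under the "General" section,
# e.g. General/param1, General/batch_size
hpo_params = task.connect(hpo_params)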

  
  
Posted one year ago

So if I want to refer to batch_size in my_hydra_config.yaml:

# dummy config file
trainer:
  params:
    batch_size: 32

do I pass this to the HyperParameterOptimizer as:

Hydra/trainer/params/batch_size?

@<1523701205467926528:profile|AgitatedDove14> 👆 ? Thanks

  
  
Posted one year ago

[screenshot attached]

  
  
Posted one year ago

OmegaConf is the configuration; the overrides are in the Hyperparameters "Hydra" section.

  
  
Posted one year ago

Will this work?

task.connect(OmegaConf.to_object(cfg))

assuming cfg is my Hydra config (an OmegaConf DictConfig)

  
  
Posted one year ago

Hey @<1523701205467926528:profile|AgitatedDove14> in the WebUI the hydra configuration object is under CONFIGURATION OBJECTS > OmegaConf

So should this be OmegaConf/trainer.batch_size?

  
  
Posted one year ago

@<1523701205467926528:profile|AgitatedDove14> Got the overrides working with Hydra/params.batch_size, thank you 🙏

  
  
Posted one year ago

Thanks @<1523701205467926528:profile|AgitatedDove14> happy to PR on the docs 😉

  
  
Posted one year ago

From this video tutorial:
“…the name of the hyperparameter consists of the section it is reported to, followed by a slash, then its name…”

So following that confuses me, because I can't see my Hydra parameters under Hyperparameters > Hydra,
and this is why I thought, OK, well, perhaps use OmegaConf/params.batch_size.

Is this another opportunity to improve the documentation? Happy to help if so.

  
  
Posted one year ago

Hmm… probably not, if I don't have a reference that ClearML can update, right?

What about:

hpo_params = OmegaConf.to_object(cfg)
...
task.connect(hpo_params)

And then I use hpo_params in the code. This way I give ClearML a chance to update the object.

Would this work? Thanks
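
For concreteness, a hedged sketch of that pattern (the project/task names are placeholders; the config name matches the my_hydra_config.yaml from earlier in the thread, and version_base=None assumes Hydra >= 1.2):

import hydra
from omegaconf import DictConfig, OmegaConf
from clearml import Task

@hydra.main(config_path='.', config_name='my_hydra_config', version_base=None)
def main(cfg: DictConfig):
    task = Task.init(project_name='examples', task_name='hpo base task')

    # convert the OmegaConf config into a plain nested dict
    hpo_params = OmegaConf.to_object(cfg)

    # connect() returns the dict with any remote overrides applied;
    # nested keys show up flattened in the UI
    hpo_params = task.connect(hpo_params)

    # read values back from the connected dict, not from cfg
    batch_size = hpo_params['trainer']['params']['batch_size']
    print(batch_size)

if __name__ == '__main__':
    main()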

  
  
Posted one year ago

Glad to hear!
(yeah @<1603198134261911552:profile|ColossalReindeer77> I'm with you the override is not intuitive, I'll pass the info to the technical writers, hopefully they can find a way to make it easier to understand)

  
  
Posted one year ago

Try Hydra/trainer.params.batch_size
Hydra separates nesting with "."
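
To illustrate, a minimal sketch of how that name would be passed to the optimizer (the base task ID, metric names, and candidate values are placeholders):

from clearml.automation import (
    HyperParameterOptimizer,
    DiscreteParameterRange,
    RandomSearch,
)

optimizer = HyperParameterOptimizer(
    base_task_id='<base task id>',  # the task that logged the Hydra config
    hyper_parameters=[
        # section "Hydra", nesting separated with "." like a Hydra CLI override
        DiscreteParameterRange('Hydra/trainer.params.batch_size', values=[16, 32, 64]),
    ],
    objective_metric_title='validation',  # placeholder metric
    objective_metric_series='loss',
    objective_metric_sign='min',
    optimizer_class=RandomSearch,
    max_number_of_concurrent_tasks=2,
)

optimizer.start_locally()
optimizer.wait()
optimizer.stop()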

  
  
Posted one year ago

[screenshot attached]

  
  
Posted one year ago

So it's not intuitive to me to try Hydra/params.batch_size, but I will try it nonetheless as you suggested.

  
  
Posted one year ago

Hi @<1603198134261911552:profile|ColossalReindeer77>

“Hello! does anyone know how to do HPO when your parameters are in a Hydra configuration file?”

Basically, Hydra parameters are overridden with "Hydra/param"
(this is equivalent to Hydra's "override" option in the CLI).
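
If I understand correctly, setting Hydra/trainer.params.batch_size from ClearML is therefore equivalent to this Hydra override on the command line (train.py is a placeholder for whatever script holds the @hydra.main entry point):

python train.py trainer.params.batch_size=64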

  
  
Posted one year ago

Hi @<1523701205467926528:profile|AgitatedDove14> , I see _allow_omegaconf_edit_ under HYPERPARAMETERS > Hydra

  
  
Posted one year ago