Hello! Does Anyone Know How To Do HPO When Your Parameters Are In A Hydra Configuration File?

Hello! Does anyone know how to do HPO when your parameters are in a Hydra configuration file? What is the correct way to do this (e.g. how do I declare the params for the optimizer, and do I need to use task.connect?)
I can’t find any info in the docs. Thanks.

Posted 10 months ago

Answers 16

Glad to hear!
(yeah @<1603198134261911552:profile|ColossalReindeer77>, I'm with you, the override is not intuitive. I'll pass the info to the technical writers; hopefully they can find a way to make it easier to understand)

Posted 10 months ago

OmegaConf is the configuration; the overrides are in the Hyperparameters "Hydra" section

Posted 10 months ago

try Hydra/trainer.params.batch_size
Hydra separates nesting with "."
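To illustrate the naming rule above — this is a sketch of the convention, not ClearML's actual implementation: nested Hydra keys are joined with ".", and the whole dotted path is prefixed with the "Hydra/" section name.

```python
def hydra_param_name(*keys):
    """Build the HPO parameter name for a nested Hydra key path.

    Sketch only: nesting is joined with ".", and the result is
    prefixed with the "Hydra" section name.
    """
    return "Hydra/" + ".".join(keys)

# For a config like:
#   trainer:
#     params:
#       batch_size: 32
name = hydra_param_name("trainer", "params", "batch_size")
# -> "Hydra/trainer.params.batch_size"
```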

Posted 10 months ago

from this video tutorial:
“…the name of the hyperparameter consists of the section it is reported to, followed by a slash, then its name…”

So following that confuses me, because I can’t see my Hydra parameters under Hyperparameters > Hydra,
and this is why I thought: OK, well, perhaps use OmegaConf/params.batch_size

Is this another opportunity to improve the documentation? Happy to help if so.

Posted 10 months ago

Hi @<1603198134261911552:profile|ColossalReindeer77>

“Hello! does anyone know how to do HPO when your parameters are in a Hydra configuration file?”

Basically, Hydra parameters are overridden with "Hydra/param"
(this is equivalent to Hydra's "override" option on the CLI)
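For context, a minimal HyperParameterOptimizer sketch using such an override name — the task ID, metric names, and batch-size values below are placeholders, not from this thread:

```python
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    RandomSearch,
)

# Sketch only: base_task_id and the objective metric are placeholders.
optimizer = HyperParameterOptimizer(
    base_task_id="<your-base-task-id>",
    hyper_parameters=[
        # Hydra parameters are addressed as "Hydra/<dotted.key.path>"
        DiscreteParameterRange("Hydra/params.batch_size", values=[16, 32, 64]),
    ],
    objective_metric_title="validation",
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=RandomSearch,
)
optimizer.start_locally()
optimizer.wait()
optimizer.stop()
```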

Posted 10 months ago

hmm… probably simpler/cleaner if I do

hpo_params = {
    'param1': cfg.param_1,
    ...
}

Posted 10 months ago

so if I want to refer to batch_size in my_hydra_config.yaml :

# dummy config file
         batch_size: 32

do I pass this to the HyperParameterOptimizer as:

Hydra/trainer/params/batch_size ??

@<1523701205467926528:profile|AgitatedDove14> 👆 ? Thanks

Posted 10 months ago



so it’s not intuitive to me to try Hydra/params.batch_size, but I will try it nonetheless as you suggested.

Posted 10 months ago

Will this work?


assuming cfg is my Hydra dict

Posted 10 months ago

Hey @<1523701205467926528:profile|AgitatedDove14> in the WebUI the hydra configuration object is under CONFIGURATION OBJECTS > OmegaConf

So should this be OmegaConf/trainer.batch_size ?

Posted 10 months ago

@<1523701205467926528:profile|AgitatedDove14> Got the overrides working with Hydra/params.batch_size thank you 🙏

Posted 10 months ago

Thanks @<1523701205467926528:profile|AgitatedDove14> happy to PR on the docs 😉

Posted 10 months ago



hmm… probably not if I don’t have a reference that ClearML can update, right?

What about:

hpo_params = OmegaConf.to_object(cfg)

And then I use hpo_params in the code. This way I give clearml a chance to update the object.

Would this work? Thanks
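A toy, ClearML-free illustration of the concern above (all names here are hypothetical; it only shows why keeping a live dict reference matters):

```python
cfg = {"batch_size": 32}  # stand-in for the connected configuration object

# Copying the scalar out "freezes" it -- later in-place edits are invisible:
frozen_batch_size = cfg["batch_size"]

# Reading through the dict at use time sees in-place updates:
def get_batch_size(params):
    return params["batch_size"]

cfg["batch_size"] = 64  # e.g. an HPO agent overrides the connected object
assert frozen_batch_size == 32    # stale copy
assert get_batch_size(cfg) == 64  # live read sees the override
```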

Posted 10 months ago

Hi @<1523701205467926528:profile|AgitatedDove14> , I see _allow_omegaconf_edit_ under HYPERPARAMETERS > Hydra

Posted 10 months ago