Answered
Hey everyone! I’m currently trying to set up hyperparameter optimization with ClearML on a base experiment using Hydra. I got everything working from the examples. However, the child experiments started by the optimization don’t seem to actually get the new parameters.

Hey everyone!
I’m currently trying to set up hyperparameter optimization with ClearML on a base experiment using Hydra. I got everything working from the examples. However, the child experiments started by the optimization don’t seem to actually get the new parameters. They appear changed on the ClearML server, but the experiment results are identical, which means the actual experiments never receive the parameter changes. So far I’ve tried setting parameters with
`hyper_parameters=[UniformIntegerParameterRange("Hydra/model/num_cells", min_value=64, max_value=512, step_size=64)]`
and
`hyper_parameters=[UniformIntegerParameterRange("General/model/num_cells", min_value=64, max_value=512, step_size=64)]`
but neither seems to work…
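For context, here is a trimmed sketch along the lines of the ClearML HPO example I started from (the base task id, execution queue and objective metric below are placeholders, not my actual values):
`
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    RandomSearch,
    UniformIntegerParameterRange,
)

# Controller task that runs the optimization itself
task = Task.init(
    project_name="HPO",
    task_name="hydra hpo controller",
    task_type=Task.TaskTypes.optimizer,
)

optimizer = HyperParameterOptimizer(
    base_task_id="<base-task-id>",  # placeholder: id of the Hydra base experiment
    hyper_parameters=[
        UniformIntegerParameterRange(
            "Hydra/model/num_cells", min_value=64, max_value=512, step_size=64
        ),
    ],
    objective_metric_title="validation",  # placeholder metric title/series
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=RandomSearch,
    execution_queue="default",  # placeholder queue name
    max_number_of_concurrent_tasks=2,
    total_max_jobs=10,
)

optimizer.start()
optimizer.wait()
optimizer.stop()
`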

Has anyone so far managed to successfully get hyperparameter optimization working with hydra?

  
  
Posted 2 years ago

Answers 20


Yes, the screenshot is from one of the HPO child tasks

  
  
Posted 2 years ago

Cool, let me know if I can help in any way. I can also try mocking up a small example on my side.

  
  
Posted 2 years ago

What about the base task?

  
  
Posted 2 years ago

Hi FranticLobster32 , what versions of ClearML, ClearML Agent & Hydra are you using?

  
  
Posted 2 years ago

In the task's hyperparameters section you have a section called Hydra. In that section there should be a configuration called _allow_omegaconf_edit_ . What is it set to?
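For reference, a quick way to check that flag from code rather than the UI (a rough sketch; the task id is a placeholder):
`
from clearml import Task

# Placeholder id of the Hydra base experiment
base_task = Task.get_task(task_id="<base-task-id>")

# get_parameters() returns a flat dict keyed as "Section/name"
params = base_task.get_parameters()
print(params.get("Hydra/_allow_omegaconf_edit_"))
`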

  
  
Posted 2 years ago

but sadly, when you compare different experiments with different parameters, all the scalar graphs are identical, which shouldn’t be the case

  
  
Posted 2 years ago

clearml 1.6.2
clearml-agent 1.3.0
hydra-core 1.1.2

  
  
Posted 2 years ago

I mean the version of the SDK

  
  
Posted 2 years ago

Just making sure we cover all bases - you updated the optimizer to use a base task with _allow_omegaconf_edit_ : True?

  
  
Posted 2 years ago

That’s set to false (the default)

  
  
Posted 2 years ago

It needs to be in the base task

  
  
Posted 2 years ago

Aight. Thanks for the information. I'll take a look and see if it reproduces for me as well 🙂

  
  
Posted 2 years ago

Yes, the base task is the same, minus the parameters I’m trying to overwrite with the HPO. Here’s the screenshot from the base task

  
  
Posted 2 years ago

WebApp: 1.6.0-213 • Server: 1.6.0-213 • API: 2.20
hydra-core 1.1.2

  
  
Posted 2 years ago

[image]

  
  
Posted 2 years ago

Please try setting it to True, that should fix it
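If it helps, one way to flip it from code instead of the UI (a rough sketch; the task id is a placeholder and the base task has to be in draft mode to be editable):
`
from clearml import Task

# Placeholder id of the Hydra base experiment (must be a draft/editable task)
base_task = Task.get_task(task_id="<base-task-id>")
base_task.set_parameter("Hydra/_allow_omegaconf_edit_", True)
`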

  
  
Posted 2 years ago

CostlyOstrich36
I’m fully confident at this point that changing parameters on Hydra experiments doesn’t work with ClearML. Thus I have reverted to converting the Hydra hparams to normal hparams:
`
import hydra
from clearml import Task
from omegaconf import DictConfig, OmegaConf

@hydra.main(config_path="conf", config_name="config")
def main(config: DictConfig):
    task = Task.init(
        config.logging.trackerParams.project,
        config.logging.trackerParams.experimentName,
    )
    # connect() returns the dict with any overrides applied
    config = task.connect(OmegaConf.to_container(config))
`
which seems to work.

All of this being said, I’d very much like Hydra + HPO to work as intended. So I’m thinking: should I escalate this to an issue on GitHub and provide a minimal example?

  
  
Posted 2 years ago

Yep, looks like that for me as well

  
  
Posted 2 years ago

Tried it with both parameter naming schemes ("Hydra/*" and "General/*"), but it didn’t work 😕

  
  
Posted 2 years ago

It should look something like this

  
  
Posted 2 years ago