Answered
I have an environment error when running HPO

I have an environment error when running HPO:
`RuntimeError: Dataset '/home/rini-debian/git-stash/lvgl-ui-detector/datasets/ui_randoms.yaml' error ❌ '/home/rini-debian/git-stash/lvgl-ui-detector/datasets/ui_randoms.yaml' does not exist`

This runtime error occurs because the dataset-path argument is tracked in the ClearML task, as it should be.

However, when I run my HPO, I do not want this argument to be reused, since it is created when my training function executes and is adjusted for the current environment.

How can I tell the HPO to not use these hyperparameters, or to override them when queuing tasks?

  
  
Posted 6 months ago

5 Answers


How can I adjust the parameter overrides for tasks spawned by the hyperparameter optimizer?

My template task has some environment-dependent parameters that I would like to clear for the newly spawned tasks, as the function that is run for each task handles the environment already.

  
  
Posted 6 months ago

Oh I see, glad you found the problem!

  
  
Posted 6 months ago

Hey, I should have closed this earlier.

The thing I was looking for is called `set_parameter` on the task.
The HPO uses a task I created previously, and I had trouble with that, since it contained a path which wasn't available on the Colab instance.
I fixed my code so it always updates this parameter depending on the environment.

It was less of an HPO issue and more of a programming failure in the function, which didn't properly update the parameter, even though I thought it should.
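For reference, the fix amounts to something like this minimal sketch (the `Args/dataset_path` parameter name, the project name, and the path-resolution logic are assumptions for illustration; use whatever parameter your task actually tracks):

```python
from pathlib import Path

from clearml import Task

# Re-resolve the environment-dependent dataset path at runtime and write
# it back to the task, so a stale path cloned from the template task is
# never used. "Args/dataset_path" is a hypothetical parameter name.
task = Task.init(project_name="lvgl-ui-detector", task_name="train")

dataset_yaml = Path.cwd() / "datasets" / "ui_randoms.yaml"
task.set_parameter("Args/dataset_path", str(dataset_yaml))
```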

  
  
Posted 6 months ago

Hi @<1694157594333024256:profile|DisturbedParrot38>! If you want to override the parameter, you could add a `DiscreteParameterRange` to `hyper_parameters` when calling `HyperParameterOptimizer`. The `DiscreteParameterRange` should have just one value: the value you want to override the parameter with.
You could try setting the parameter to an empty string in order to mark it as cleared.
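A minimal sketch of that approach (the metric names, queue, and the hypothetical `Args/dataset_path` parameter are placeholders; the single-value `DiscreteParameterRange` pins the parameter to an empty string for every spawned task):

```python
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    RandomSearch,
)

optimizer = HyperParameterOptimizer(
    base_task_id="<template_task_id>",  # task to clone for each trial
    hyper_parameters=[
        # ... the real search space goes here ...
        # Single-value "range": effectively a constant override that
        # clears the environment-dependent path on every spawned task.
        DiscreteParameterRange("Args/dataset_path", values=[""]),
    ],
    objective_metric_title="validation",
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=RandomSearch,
    execution_queue="default",
)
optimizer.start()
```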

  
  
Posted 6 months ago

Back when I wrote this, I thought the HPO did something magical to overwrite the general args of the task when cloning.
It turns out my code was just missing a more explicit `set_parameter` call for this environment path.

  
  
Posted 6 months ago