Answered
Are there any resources on how I can implement Hyperparameter Optimisation using Ray Tune on ClearML?

Posted 7 months ago

Answers 2


@<1523701435869433856:profile|SmugDolphin23> Quick question: does the SearchStrategy use Bayesian Optimisation?

Posted 6 months ago

Hi @<1581454875005292544:profile|SuccessfulOtter28> ! You could take a look at how the HPO was built using Optuna: None .
Basically: you should create a new class which inherits from SearchStrategy . This class should convert the ClearML hyper_parameters to parameters Ray Tune understands, then create a Tuner and run the Ray Tune hyperparameter optimization.
The function the Tuner optimizes should create a new ClearML task that is a clone of the task you want to optimize. The values of the objectives you want to optimize are then fetched from the cloned task and evaluated inside this function. In optuna.py , the function that does all of this is objective .
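The steps above can be sketched roughly as follows. This is a minimal, self-contained illustration, not either library's real API: the helper names `to_ray_tune_space` and `make_objective`, and the dict format for the parameter ranges, are all assumptions for the sake of the example. In a real strategy you would subclass `clearml.automation.SearchStrategy` and build an actual `ray.tune.Tuner` with `tune.uniform` / `tune.choice` objects.

```python
# Hypothetical sketch of the two pieces described above:
# (1) converting ClearML-style parameter ranges into a Ray Tune-style
#     search space, and (2) wrapping the "clone task, run, fetch metric"
#     loop into a function a Tuner could optimize.
# All names and the range-dict format here are illustrative.

def to_ray_tune_space(hyper_parameters):
    """Map simplified parameter-range descriptions to tuples standing in
    for Ray Tune sampling calls (tune.uniform, tune.choice, ...)."""
    space = {}
    for p in hyper_parameters:
        if p["type"] == "uniform":
            space[p["name"]] = ("tune.uniform", p["min"], p["max"])
        elif p["type"] == "discrete":
            space[p["name"]] = ("tune.choice", tuple(p["values"]))
        else:
            raise ValueError(f"unsupported parameter type: {p['type']!r}")
    return space


def make_objective(fetch_metric):
    """Return a function a Tuner could optimize. In a real strategy the
    body would clone the base ClearML task, override its parameters with
    the sampled config, enqueue it, wait for completion, then fetch the
    scalar objective from the cloned task."""
    def objective(config):
        # Placeholder for: clone task -> set config -> run -> read metric.
        return fetch_metric(config)
    return objective


if __name__ == "__main__":
    params = [
        {"type": "uniform", "name": "lr", "min": 0.0001, "max": 0.1},
        {"type": "discrete", "name": "batch_size", "values": [32, 64, 128]},
    ]
    print(to_ray_tune_space(params))
```

The conversion step is where most of the glue lives: ClearML expresses ranges as parameter objects (e.g. UniformParameterRange), while Ray Tune expects a plain dict of sampling functions, so the strategy's job is largely translating between the two representations.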

Posted 7 months ago
590 Views