Hi there, I am looking for a way to use the hyperparameter optimization tool from ClearML for the different algorithms I build. The idea is that I can use the command line to specify which algorithm I want to optimize, and let ClearML do the rest.

The issue is that the hyperparameter optimization in ClearML apparently works by calling the function where the task was created in order to run the training. So if I have two functions, one for the training and one for the optimization, I cannot call them in the same file, because after the training function the optimization function would be called every time, which leads to a lot of chaos. My takeaway was that I have to separate the training task from the optimization. However, if I then want to use, for example, argparse to configure the training, I would need to pass in the arguments every time the function gets called by the optimizer, which is not possible in a fully automated way. So I am kind of stuck on how to make my idea work. Any ideas?
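For reference, this is a minimal sketch of the kind of standalone training script I mean (the script name, project/task names, and argument names are just placeholders I made up, not anything ClearML requires):

```python
# Hypothetical standalone training script (train.py).
# Project/task names and argument names are placeholders.
import argparse


def parse_args(argv=None):
    """Parse training hyperparameters from the command line."""
    parser = argparse.ArgumentParser(description="Standalone training task")
    parser.add_argument("--algorithm", default="xgboost", help="which algorithm to train")
    parser.add_argument("--lr", type=float, default=0.01, help="learning rate")
    parser.add_argument("--batch_size", type=int, default=32, help="batch size")
    return parser.parse_args(argv)


def main():
    args = parse_args()
    # Imported lazily so the parsing logic above works without a ClearML server.
    from clearml import Task

    # ClearML captures the argparse values automatically once Task.init runs,
    # so an optimizer can later clone this task and override them per trial.
    task = Task.init(project_name="hpo-demo", task_name=f"train-{args.algorithm}")
    # ... actual training code for the chosen algorithm would go here ...


# To run manually: python train.py --algorithm xgboost --lr 0.05
```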

Posted 5 months ago


Hi @<1649946171692552192:profile|EnchantingDolphin84> , what about this example?
You could add an argparser to change the configuration of the HyperParameterOptimizer class.
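Something along these lines, as a rough sketch: the command line picks the algorithm, and that choice selects which base training task gets cloned and which parameter ranges are searched. The project/task names, metric title/series, and the `General/...` parameter paths are assumptions you would adapt to your own tasks:

```python
# Hypothetical optimizer script -- task names, metric names, and parameter
# ranges are placeholders; adapt them to your own training tasks.
import argparse


def optimizer_settings(algorithm):
    """Map an algorithm name to its base task and search space.

    Kept as plain data so the choice can be driven from the command line
    and tested without a ClearML server.
    """
    settings = {
        "xgboost": {
            "base_task_name": "train-xgboost",
            "ranges": {"General/lr": (0.001, 0.3), "General/subsample": (0.5, 1.0)},
        },
        "svm": {
            "base_task_name": "train-svm",
            "ranges": {"General/C": (0.1, 10.0)},
        },
    }
    return settings[algorithm]


def main():
    parser = argparse.ArgumentParser(description="Pick which algorithm to optimize")
    parser.add_argument("--algorithm", choices=["xgboost", "svm"], required=True)
    args = parser.parse_args()
    cfg = optimizer_settings(args.algorithm)

    # Imported lazily so the mapping above stays importable without ClearML.
    from clearml import Task
    from clearml.automation import (
        HyperParameterOptimizer,
        RandomSearch,
        UniformParameterRange,
    )

    base_task = Task.get_task(project_name="hpo-demo",
                              task_name=cfg["base_task_name"])
    optimizer = HyperParameterOptimizer(
        base_task_id=base_task.id,
        hyper_parameters=[
            UniformParameterRange(name, min_value=lo, max_value=hi)
            for name, (lo, hi) in cfg["ranges"].items()
        ],
        objective_metric_title="validation",  # assumed metric title
        objective_metric_series="loss",       # assumed metric series
        objective_metric_sign="min",
        optimizer_class=RandomSearch,
        max_number_of_concurrent_tasks=2,
        total_max_jobs=20,
    )
    optimizer.start_locally()
    optimizer.wait()
    optimizer.stop()


# To run manually: python optimize.py --algorithm xgboost
```

This keeps the training script and the optimizer in separate files, so cloning the training task never re-triggers the optimization.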

What do you think?

Posted 5 months ago
1 Answer