Hey, I was wondering how can I do hparams tuning with Trains? Couldn't find anything in the documentation


AgitatedDove14 Yes, that's exactly what I have when I create the UniformParameterRange(), but it's still not found as a hyperparameter.
I'm using the learning rate and the other parameters in the model when I train, by calling keras.optimizers.Adam(...) with all the Adam configs.
Wish I could've sent you the code, but it's on another network that isn't exposed to the public..

I'm completely lost

Edit:
It's looking like this:

opt = Adam(**configs['training_configuration']['optimizer_params']['Adam'])
model.compile(optimizer=opt, ........more params......)

Configs:
....more params....
training_configuration:
  optimizer_params:
    Adam:
      learning_rate: 0.1
      decay: 0
.....more params....

And at the beginning of the code I do task.connect(configs['training_configuration'], name="Train"), and I do see the right params under Train in the UI.
Later, in the hparams script, I do: UniformParameterRange('Train/optimizer_params/Adam/learning_rate', ....the rest of the min/max/step params.....)
(with the rest of the code like in the example; roughly as sketched below)
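
Roughly, the hparams script boils down to this (a minimal sketch, assuming the clearml package — trains.automation on older versions; the project/task names, base_task_id, metric names, and range values here are placeholders, not my real ones):

from clearml import Task
from clearml.automation import HyperParameterOptimizer, UniformParameterRange

# placeholder project/task names
task = Task.init(project_name='examples', task_name='hparams optimizer',
                 task_type=Task.TaskTypes.optimizer)

optimizer = HyperParameterOptimizer(
    base_task_id='<training-task-id>',  # placeholder: the training task to clone
    hyper_parameters=[
        # full path: "<connect name>/<nested keys joined by '/'>"
        UniformParameterRange('Train/optimizer_params/Adam/learning_rate',
                              min_value=0.001, max_value=0.1, step_size=0.005),
    ],
    objective_metric_title='validation',  # placeholder metric title
    objective_metric_series='loss',       # placeholder metric series
    objective_metric_sign='min',
)
optimizer.start()
optimizer.wait()
optimizer.stop()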

The thing is, in each of the draft tasks in the UI I do see it updating the right parameter under Train/optimizer_params/Adam/learning_rate, with the step and everything. But the script says it can't find the hyperparameter, and it also finishes really quickly, so I know it's not really doing anything.
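
For context, the training side looks roughly like this (a minimal sketch, assuming clearml; project/task names are placeholders). The relevant bit is that task.connect() both logs the dict and, when the task is a clone spawned by the optimizer, overwrites the dict in place with the overridden values, so the connected dict is what has to feed Adam:

from clearml import Task
from tensorflow.keras.optimizers import Adam

task = Task.init(project_name='examples', task_name='train')  # placeholders

# same structure as the YAML above
configs = {
    'training_configuration': {
        'optimizer_params': {
            'Adam': {'learning_rate': 0.1, 'decay': 0},
        },
    },
}

# logs the params under "Train"; in a clone spawned by the optimizer,
# the nested values are replaced by the overrides before training starts
task.connect(configs['training_configuration'], name='Train')

opt = Adam(**configs['training_configuration']['optimizer_params']['Adam'])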

  
  