AgitatedDove14 Yes, that's exactly what I have when I create the UniformParameterRange(),
but it's still not found as a hyperparameter.
I'm using the learning rate and the other parameters in the model when I train, by calling keras.optimizers.Adam(...) with all the Adam configs.
Wish I could've sent you the code, but it's on another network that isn't exposed to the public...
I'm completely lost
Edit:
It's looking like this:
opt = Adam(**configs['training_configuration']['optimizer_params']['Adam'])
model.compile(optimizer=opt, ........more params......)
Configs (....more params....):
training_configuration:
  optimizer_params:
    Adam:
      learning_rate: 0.1
      decay: 0
.....more params....
and at the beginning of the code I do task.connect(configs['training_configuration'], name="Train"),
and I do see the right params under Train in the UI.
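So to put it all together, this is roughly how the training script is wired (I can't paste the real code, so the file name, project/task names and the loss here are placeholders):

import yaml
from clearml import Task
from keras.optimizers import Adam

task = Task.init(project_name='my_project', task_name='train_base')  # placeholder names

with open('configs.yaml') as f:  # placeholder path
    configs = yaml.safe_load(f)

# connect the whole training section; it shows up as "Train" in the UI
task.connect(configs['training_configuration'], name='Train')

# build the optimizer straight from the connected dict, like above
opt = Adam(**configs['training_configuration']['optimizer_params']['Adam'])
model.compile(optimizer=opt, loss='mse')  # model is built earlier; real call has more params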
Later, in the hparams script, I do: UniformParameterRange('Train/optimizer_params/Adam/learning_rate', ....the rest of the min/max/step params.....)
(with the rest of the code like in the example)
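The hparams script is basically the example with my parameter name dropped in; the base task id, queue, metric title/series and the range values below are placeholders, not my real ones:

from clearml import Task
from clearml.automation import HyperParameterOptimizer, UniformParameterRange, RandomSearch

task = Task.init(project_name='my_project', task_name='hparams search',
                 task_type=Task.TaskTypes.optimizer)

optimizer = HyperParameterOptimizer(
    base_task_id='<training_task_id>',  # the training task shown above
    hyper_parameters=[
        UniformParameterRange('Train/optimizer_params/Adam/learning_rate',
                              min_value=0.001, max_value=0.1, step_size=0.001),  # placeholder range
    ],
    objective_metric_title='validation',  # placeholder metric
    objective_metric_series='loss',
    objective_metric_sign='min',
    optimizer_class=RandomSearch,
    execution_queue='default',  # placeholder queue
    max_number_of_concurrent_tasks=2,
    total_max_jobs=10,
)

optimizer.start()
optimizer.wait()
optimizer.stop()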
The thing is, on each of the drafts in the UI, I do see it's updating the right parameter under Train/optimizer_params/Adam/learning_rate
with the step and everything. But in the script it says it can't find the hyperparameter, and it also finishes really quickly, so I know it's not really doing anything.
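In case the naming is the issue, this is the kind of check I can run to see exactly which names the base task exposes (the task id is a placeholder):

from clearml import Task

base_task = Task.get_task(task_id='<training_task_id>')
# get_parameters() returns a flat dict keyed like 'Train/optimizer_params/Adam/learning_rate'
for name, value in base_task.get_parameters().items():
    print(name, value)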