Hi,
I have some questions about hyperparameter optimization. We have a setup where we use the PyTorch Lightning CLI with ClearML for experiment tracking and hyperparameter optimization.
Now, all our configurations are config-file based. Sometimes we have linke
Hi CurvedHedgehog15, thanks for replying!
I guess one could modify the config with variable interpolation (similar to how it's done in YAML, e.g. `${encoder.layers}`) - however, it seems quite invasive to specify that in our trainer script 😞
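For illustration, here is a minimal pure-Python sketch of what `${dotted.path}`-style interpolation does over a nested config dict (the same idea libraries like OmegaConf implement). The keys `encoder.layers` and `decoder.layers` are hypothetical, just to mirror the example above:

```python
import re

# Hypothetical config for illustration: decoder.layers references encoder.layers.
config = {
    "encoder": {"layers": 12},
    "decoder": {"layers": "${encoder.layers}"},
}

def lookup(cfg, dotted):
    """Follow a dotted path like 'encoder.layers' through nested dicts."""
    node = cfg
    for part in dotted.split("."):
        node = node[part]
    return node

def resolve(cfg, node=None):
    """Recursively replace '${path}' strings with the value they point to."""
    node = cfg if node is None else node
    if isinstance(node, dict):
        return {k: resolve(cfg, v) for k, v in node.items()}
    if isinstance(node, str):
        m = re.fullmatch(r"\$\{([^}]+)\}", node)
        if m:
            return resolve(cfg, lookup(cfg, m.group(1)))
    return node

resolved = resolve(config)
print(resolved["decoder"]["layers"])  # -> 12
```

This is only a sketch of the mechanism, not the PyTorch Lightning CLI or ClearML API; in practice the interpolation would be handled by the config loader rather than the trainer script.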