In relation to PyTorch Lightning v1.x, usage in combination with Trains has become much smoother (just pure TensorBoard).
However, when checking the "Configuration" tab of an experiment, it's empty.
How do I get Trains to log the hyperparameters?
I've tr…
DefeatedCrab47 If I remember correctly, v1+ has its arguments coming from argparse.
1. Are you using this feature?
2. How do you set the TB HParams? Currently Trains does not support TB HParams; the reason is that the set of HParams needs to match a single experiment. Is that your case?
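For reference, a minimal sketch of how Trains typically picks up argparse-defined hyperparameters so they appear in the "Configuration" tab. The project/task names and the specific arguments are placeholders, not taken from this thread:

```python
from argparse import ArgumentParser

from trains import Task  # pre-ClearML package name used in this thread

# Initialising the Task lets Trains capture argparse arguments automatically
# and show them under the experiment's "Configuration" tab.
task = Task.init(project_name="examples", task_name="lightning-hparams")  # placeholder names

parser = ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=1e-3)
parser.add_argument("--batch_size", type=int, default=32)
args = parser.parse_args()

# Hyperparameters kept in a plain dict can also be logged explicitly.
extra_hparams = task.connect({"max_epochs": 10, "precision": 16})
```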