Yes, and I double-checked in Python; I get the dictionary with: Args/...
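Roughly the check I ran (a minimal sketch, assuming the base task id is the one used by the optimizer below):

```python
from clearml import Task

# fetch the base task that the optimizer clones
base_task = Task.get_task(task_id="6f3bf2ecbb964ff3b2a6111c34cb0fa3")

# get_parameters() returns a flat dict keyed "Section/name",
# e.g. "Args/patch_size", "Args/nb_conv", ...
print(base_task.get_parameters())
```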
The log of my optimizer looks like this:
```
Task: {'template_task_id': '6f3bf2ecbb964ff3b2a6111c34cb0fa3', 'run_as_service': False}
2021-03-30 10:45:25,413 - trains.automation.optimization - WARNING - Could not find requested hyper-parameters ['Args/patch_size', 'Args/nb_conv', 'Args/nb_fmaps', 'Args/epochs'] on base task 6f3bf2ecbb964ff3b2a6111c34cb0fa3
2021-03-30 10:45:25,433 - trains.automation.optimization - WARNING - Could not find requested metric ('dice', 'dice') report on base task 6f3...
```
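On the metric warning, this is how one can list which scalar titles/series the base task actually reported, so the objective_metric_title/series can be matched against them (a sketch, same task id as above):

```python
from clearml import Task

base_task = Task.get_task(task_id="6f3bf2ecbb964ff3b2a6111c34cb0fa3")

# nested dict: {metric_title: {series_name: {"last": ..., "min": ..., "max": ...}}}
scalars = base_task.get_last_scalar_metrics()
for title, series in scalars.items():
    print(title, list(series.keys()))
```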
For train.py, do I need a setup.py file in my repo to work correctly with the agent? For now it is just the path to train.py.
Ok so I installed the latest version of clearml and the hyperparameters are found now.
It was run with the exact same version, and I got the same message with "epochs" only.
Hi AgitatedDove14
The code is on a private repo (clearml-agent is configured with an SSH key and gets the code correctly); otherwise I run the code directly on my computer. The code was previously run in a task and the task seems to be loaded correctly: I get the right id from the get_task function.
When the optimizer tries to run the first batch of hyperparameters I get this error message in the log: `/home/local/user/.clearml/venvs-builds/3.7/bin/python: can't open file 'train.py': [Errno...`
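In case it helps, a sketch of how I'd inspect where the agent thinks train.py lives (the working_dir/entry_point in the task's script section; the set_script call at the end is only an assumption about a possible fix):

```python
from clearml import Task

base_task = Task.get_task(task_id="6f3bf2ecbb964ff3b2a6111c34cb0fa3")

# the agent resolves the script to run from the task's "script" section:
# repository, branch, working_dir and entry_point
print(base_task.export_task().get("script"))

# if working_dir / entry_point look wrong, they could be corrected, e.g.:
# base_task.set_script(working_dir=".", entry_point="train.py")
```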
```python
an_optimizer = HyperParameterOptimizer(
    base_task_id="6f3bf2ecbb964ff3b2a6111c34cb0fa3",
    hyper_parameters=[
        DiscreteParameterRange('Args/patch_size', values=[32, 64, 128]),
        DiscreteParameterRange('Args/nb_conv', values=[2, 3, 4]),
        DiscreteParameterRange('Args/nb_fmaps', values=[30, 35, 40]),
        DiscreteParameterRange('Args/epochs', values=[30]),
    ],
    objective_metric_title='valid_average_dice_epoch',
    objective_metric_series='valid_average_dice_epoch',
    objective_metric_sign='max',
    max_number_of_...
```
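For completeness, this is roughly how I then launch it (a sketch; the max_number_of_... argument is cut off above and the reporting period here is just an example):

```python
# report optimizer progress every 5 minutes, then run the search
an_optimizer.set_report_period(5)
an_optimizer.start()

# block until all experiments finish, then make sure background threads stop
an_optimizer.wait()
an_optimizer.stop()
```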