Answered
Hello! I think I've found a bug, but couldn't fix it completely to make a pull request. I want to optimize hyperparameters with trains.automation but:

Hello! I think I've found a bug, but couldn't fix it completely to make a pull request.

I want to optimize hyperparameters with trains.automation, but:
In recent trains versions, the hyperparameters in the configuration tab are grouped under an 'Args' key, but the code (trains.automation.optimization, line 564) does not handle this. This one I could fix. However, I could not get the objective function (trains.automation.optimization, line 569): the Task object has such a parameter, but it has a single key with an empty string as its value. This one I could not fix. Because of the second issue, the optimization does not work, as it does not know the values of the optimized scalar.
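
To make the first issue concrete, here is a minimal sketch of what I am seeing (the task id is a placeholder):

`
from trains import Task

# placeholder id of my template experiment
task = Task.get_task(task_id='<template_task_id>')

# In recent trains versions the returned keys carry the section name,
# e.g. {'Args/batch_size': '128', 'Args/epochs': '30'}, while
# trains.automation.optimization looks the hyperparameters up without
# the 'Args/' prefix.
print(task.get_parameters())
`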

P.S. Pictures are in the thread.

  
  
Posted 4 years ago

Answers 30


you are correct, I was referring to the template experiment

  
  
Posted 4 years ago

Cause I ran for a few epochs only

  
  
Posted 4 years ago

Unfortunately I still can't get it

Adding the 'General' prefix to the parameters doesn't work, as the task parameters have no prefixes. There is also no 'General' key in what is returned (pictures 1 and 2 are screenshots of my base experiment; picture 3 shows the keys of the returned task parameters dictionary). The last_metrics argument is still empty, even though my template experiment actually has reported scalars (picture 4) and I am using the right experiment id (picture 5).

  
  
Posted 4 years ago

Should I run the template experiment till the end (I mean until it stops improving), or can I run it for just a few epochs?

  
  
Posted 4 years ago

I want to optimize hyperparameters with trains.automation but: ...

Yes, you are correct. In the case of the example code it should be "General/...", and if you use an ArgParser it should be "Args/...". And yes, it looks like the metric is wrong; it should be "epoch_accuracy" & "epoch_accuracy".
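
For example, with an argparse-based template the ranges would be named like this (a sketch of just the renamed pieces):

`
from trains.automation import DiscreteParameterRange, UniformIntegerParameterRange

# argparse arguments live under the 'Args' section, so the names
# become 'Args/...' instead of 'General/...'
hyper_parameters = [
    UniformIntegerParameterRange('Args/layer_1', min_value=128, max_value=512, step_size=128),
    DiscreteParameterRange('Args/batch_size', values=[96, 128, 160]),
]

# matching objective metric for the Keras example
objective_metric_title = 'epoch_accuracy'
objective_metric_series = 'epoch_accuracy'
`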

  
  
Posted 4 years ago

I'm sorry, but I didn't get you about the original experiment. By 'original', do you mean the experiment I use as a template?

  
  
Posted 4 years ago

Yes, sorry, that wasn't clear 🙂

  
  
Posted 4 years ago

Example use case:

`
from trains.automation import (
    DiscreteParameterRange, HyperParameterOptimizer, UniformIntegerParameterRange)

# 'args' is the argparse dict from the surrounding example script, and
# aSearchStrategy is defined earlier in it (OptimizerBOHB when hpbandster
# is installed, otherwise RandomSearch)
an_optimizer = HyperParameterOptimizer(
    # This is the experiment we want to optimize
    base_task_id=args['template_task_id'],
    # here we define the hyper-parameters to optimize
    hyper_parameters=[
        UniformIntegerParameterRange('General/layer_1', min_value=128, max_value=512, step_size=128),
        UniformIntegerParameterRange('General/layer_2', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('General/batch_size', values=[96, 128, 160]),
        DiscreteParameterRange('General/epochs', values=[30]),
    ],
    # this is the objective metric we want to maximize/minimize
    objective_metric_title='epoch_accuracy',
    objective_metric_series='epoch_accuracy',
    # now we decide if we want to maximize it or minimize it (accuracy we maximize)
    objective_metric_sign='max',
    # let us limit the number of concurrent experiments;
    # this in turn will make sure we don't bombard the scheduler with experiments.
    # if we have an auto-scaler connected, this, by proxy, will limit the number of machines
    max_number_of_concurrent_tasks=2,
    # this is the optimizer class (actually doing the optimization)
    # Currently, we can choose from GridSearch, RandomSearch or OptimizerBOHB (Bayesian Optimization Hyper-Band)
    # more are coming soon...
    optimizer_class=aSearchStrategy,
    # Select an execution queue to schedule the experiments for execution
    execution_queue='1xGPU',
    # Optional: limit the execution time of a single experiment, in minutes.
    # (this is optional, and if using OptimizerBOHB, it is ignored)
    time_limit_per_job=10.,
    # Checking the experiments every 12 seconds is way too often; we should probably set it to 5 min,
    # assuming a single experiment usually takes hours...
    pool_period_min=0.2,
    # set the maximum number of jobs to launch for the optimization, default (None) unlimited.
    # If OptimizerBOHB is used, it defines the maximum budget in terms of full jobs,
    # i.e. the cumulative number of iterations will not exceed total_max_jobs * max_iteration_per_job
    total_max_jobs=10,
    # set the minimum number of iterations for an experiment before early stopping.
    # Does not apply to simple strategies such as RandomSearch or GridSearch
    min_iteration_per_job=10,
    # Set the maximum number of iterations for an experiment to execute
    # (This is optional, unless using OptimizerBOHB, where it is a must)
    max_iteration_per_job=30,
)
`
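
Once constructed, the optimizer is started and waited on, roughly like this (a sketch that assumes the an_optimizer object from the block above):

`
# report progress every 2 minutes
an_optimizer.set_report_period(2.0)
# launch the optimization: clones of the template get queued for execution
an_optimizer.start()
# block until the optimization completes (or the time limit is reached)
an_optimizer.wait()
# inspect the best experiments found, then make sure everything is stopped
top_experiments = an_optimizer.get_top_experiments(top_k=3)
print([t.id for t in top_experiments])
an_optimizer.stop()
`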

  
  
Posted 4 years ago

Yes 😅 It actually worked. Thank you! Now I got values from scalars

  
  
Posted 4 years ago

This is a screen with messages of an optimization process

  
  
Posted 4 years ago

here's my script

  
  
Posted 4 years ago

PungentLouse55 could you test again with the latest from GitHub? I think the issue should be solved:
pip install git+

  
  
Posted 4 years ago

How can I get them? I think I am following the example from the documentation, but I can't get it.
Is it OK that my template experiment is now in 'draft' state?

  
  
Posted 4 years ago

PungentLouse55 from the screenshot I assume the experiment template you are trying to optimize is not the one from the trains/examples 🙂
In that case, and based on the screenshots, the prefix is "Args/" as this is the section name.
Regarding the objective metric, again based on your screenshots:
objective_metric_title="Accuracy", objective_metric_series="Validation"
Make sense?
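
If in doubt, the exact title/series names can be read off the template task itself (a sketch; a recent trains version is assumed and the task id is a placeholder):

`
from trains import Task

template = Task.get_task(task_id='<template_task_id>')  # placeholder id
# returns {title: {series: {'last': ..., 'min': ..., 'max': ...}}},
# e.g. {'Accuracy': {'Validation': {'last': 0.93, ...}}}
print(template.get_last_scalar_metrics())
`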

  
  
Posted 4 years ago

YEY!

  
  
Posted 4 years ago

Hope you are not tired of me, but I am using trains 0.16.1 and adding the prefix does not work. I found the place where a dict with keys <prefix>/<key>: value is transformed into a nested dict <prefix>: {<key>: value} (see screenshots). I'm sorry for my annoyance, but I believe there is a misunderstanding between us, and you think that prefixes work 🙂
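
To illustrate the transformation I mean in plain Python (a sketch, not the actual trains source):

`
def nest_parameters(flat):
    # '<prefix>/<key>': value  ->  {'<prefix>': {'<key>': value}}
    nested = {}
    for name, value in flat.items():
        prefix, _, key = name.partition('/')
        nested.setdefault(prefix, {})[key] = value
    return nested

print(nest_parameters({'Args/batch_size': 128, 'Args/epochs': 30}))
# -> {'Args': {'batch_size': 128, 'epochs': 30}}
`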

  
  
Posted 4 years ago

Hi PungentLouse55
It depends on the trains-server version you are running.
If the trains-server is >= 0.16, then you have to add the "Args/" prefix. If you are running an older version, then you should not add any prefix.
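
A quick way to tell which form your server returns (a sketch; the task id is a placeholder):

`
from trains import Task

task = Task.get_task(task_id='<template_task_id>')  # placeholder id
# trains-server >= 0.16 returns sectioned keys such as 'Args/epochs';
# older servers return plain keys such as 'epochs'
print(list(task.get_parameters().keys()))
`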

  
  
Posted 4 years ago

PungentLouse55 I'm checking something here, you might have stumbled on a bug in parameter overriding. Updating here soon ...

  
  
Posted 4 years ago

I'll try again, but I did it this way 😢
Maybe I am mistaken and did something wrong

  
  
Posted 4 years ago

A few epochs is just fine

  
  
Posted 4 years ago

These are the last_metrics values the task object has

  
  
Posted 4 years ago

In order for the sample to work you have to run the template experiment once. Then the HP optimizer will find the best HP for it.
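
That is, run the template script once, then hand the resulting task id to the optimizer (a sketch; the project and task names are placeholders):

`
from trains import Task

# after the template experiment has been executed once, look it up by name
template = Task.get_task(project_name='<your project>',
                         task_name='<your template experiment>')
print(template.id)  # pass this as base_task_id to HyperParameterOptimizer
`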

  
  
Posted 4 years ago

Hi PungentLouse55
Are you referring to the example code ?

  
  
Posted 4 years ago

Could you please answer the last question? 🙃
Am I right that there is a bug in the first situation I've described (the 'Args' parameter)? Or am I doing something wrong and it should work with prefixes? Because it does not work if I add the prefix.

  
  
Posted 4 years ago

PungentLouse55, make sure you fix the objective metric and args:
Add the "General/" prefix to the list of arguments to optimize, and change the objective metric from "Accuracy" to "epoch_accuracy"

  
  
Posted 4 years ago

I'll make sure we fix the example, because as you pointed out, it is broken :(

  
  
Posted 4 years ago

PungentLouse55 you can find the metrics in the "original" (aka base template) experiment.

  
  
Posted 4 years ago

yeah

  
  
Posted 4 years ago

Hi PungentLouse55

Hope you are not tired of me

Lol 🙂 No worries

I am using trains 0.16.1

Are you referring to the trains-server version or the Python package? (They are not the same and can be of totally different versions.)

  
  
Posted 4 years ago

Thank you! I'll try

  
  
Posted 4 years ago