Answered

Hi, I was running a hyperparameter optimization task using the Optuna optimizer, and even though the HyperParameterOptimizer's argument is set to max_number_of_concurrent_tasks=100, it only seems to spawn 20 tasks. Is there something I am doing incorrectly?
```python
import argparse

from clearml import Task
from clearml.automation import DiscreteParameterRange, HyperParameterOptimizer
from clearml.automation.optuna import OptimizerOptuna

from utils.hpo import job_complete_callback

# Controller task that drives the optimization process
task = Task.init(
    project_name='***', task_name='***', task_type=Task.TaskTypes.optimizer,
    reuse_last_task_id=False
)

parser = argparse.ArgumentParser()

parser.add_argument('--project_name', type=str, default='***')
parser.add_argument('--task_name', type=str, default='***')
parser.add_argument('--execution_queue', type=str, default='***')

args = parser.parse_args()
args = task.connect(args)

task.execute_remotely()

optimizer = HyperParameterOptimizer(
    # Base experiment that gets cloned for every trial
    base_task_id=Task.get_task(project_name=args.project_name, task_name=args.task_name).id,
    hyper_parameters=[
        DiscreteParameterRange(
            'Args/***',
            values=['***', '***', '***', '...'],
        ),
    ],
    objective_metric_title='metrics',
    objective_metric_series='***',
    objective_metric_sign='max_global',
    max_number_of_concurrent_tasks=100,  # expecting 100 concurrent tasks, only 20 spawn
    optimizer_class=OptimizerOptuna,
    execution_queue=args.execution_queue,
    spawn_project=None,
    save_top_k_tasks_only=5,
    time_limit_per_job=None,
    pool_period_min=5,
    total_max_jobs=None,
    min_iteration_per_job=10,
    max_iteration_per_job=100,
)

optimizer.set_report_period(5)
optimizer.start(job_complete_callback=job_complete_callback)
optimizer.wait()
optimizer.stop()
```

  
  
Posted one year ago

2 Answers


Thx so much 👍

  
  
Posted one year ago

Hi UpsetBlackbird87,
This is an Optuna decision on how many concurrent tests to run simultaneously. You limited it to 100, but remember that Optuna runs a Bayesian optimization process, where it decides on the best set of arguments based on the performance of the previous sets. This means it will first try X trials, then decide on the next batch.
That said, you can add a pruner to Optuna specifying how it should start:
https://optuna.readthedocs.io/en/v1.4.0/reference/pruners.html#optuna.pruners.MedianPruner

```python
HyperParameterOptimizer(
    ...,
    optimizer_kwargs=dict(
        optuna_pruner=optuna.pruners.MedianPruner(
            n_startup_trials=50, n_warmup_steps=30, interval_steps=10
        )
    )
)
```

Make sense?
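For reference, a minimal sketch of how that pruner could slot into the original script, reusing the question's `***` placeholders and the `optimizer_kwargs=dict(...)` form from the snippet above (the extra `import optuna` is needed for the pruner class):

```python
import optuna
from clearml import Task
from clearml.automation import DiscreteParameterRange, HyperParameterOptimizer
from clearml.automation.optuna import OptimizerOptuna

optimizer = HyperParameterOptimizer(
    base_task_id=Task.get_task(project_name='***', task_name='***').id,
    hyper_parameters=[
        DiscreteParameterRange('Args/***', values=['***', '***', '***']),
    ],
    objective_metric_title='metrics',
    objective_metric_series='***',
    objective_metric_sign='max_global',
    max_number_of_concurrent_tasks=100,
    optimizer_class=OptimizerOptuna,
    execution_queue='***',
    # Forwarded to the Optuna optimizer; n_startup_trials is the number of
    # trials Optuna completes before the pruner starts acting.
    optimizer_kwargs=dict(
        optuna_pruner=optuna.pruners.MedianPruner(
            n_startup_trials=50,
            n_warmup_steps=30,
            interval_steps=10,
        ),
    ),
)
```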

  
  
Posted one year ago