Answered
How do I use ParameterSet correctly for hyper-parameter tuning with dependencies?

Hi,
I am trying to use ParameterSet for hyper-parameter tuning with dependencies.
An example of how I use it:
ParameterSet([{"prm1": 1, "prm2": 1}, {"prm1": 2, "prm2": 2}])
But I get a warning:
clearml.automation.optimization - WARNING - Could not find requested hyper-parameters [None] on base task ***
It is because the ParameterSet name is None.
How do I use ParameterSet correctly?

Thanks!

  
  
Posted 2 years ago

Answers 15


SweetBadger76 CostlyOstrich36 After trying to run the same code (and ignoring the warning), I get a different error:
ValueError: HyperParameter type <class 'clearml.automation.parameters.ParameterSet'> not supported yet with OptimizerBOHB
I looked at the OptimizerOptuna code (clearml/automation/optuna/optuna.py) and saw that ParameterSet is indeed not supported there either.
Which optimizers support ParameterSet?
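For reference, a minimal sketch of what falling back to the pure-Python GridSearch strategy could look like; whether GridSearch actually accepts a ParameterSet is an assumption here, not a confirmed support matrix, and the template task id and parameter names are hypothetical placeholders.
`
# Hedged sketch: swapping the search strategy from OptimizerBOHB to the
# pure-Python GridSearch. Whether GridSearch handles ParameterSet is an
# assumption; '<template_task_id>' and prm1/prm2 are hypothetical.
from clearml.automation import GridSearch, HyperParameterOptimizer
from clearml.automation.parameters import ParameterSet

my_param_set = ParameterSet([{"prm1": 1, "prm2": 1}, {"prm1": 2, "prm2": 2}])

optimizer = HyperParameterOptimizer(
    base_task_id='<template_task_id>',  # hypothetical: id of the task to clone
    hyper_parameters=[my_param_set],
    objective_metric_title='val_loss',
    objective_metric_series='val_loss',
    objective_metric_sign='min',
    optimizer_class=GridSearch,  # instead of OptimizerBOHB
    execution_queue='default',
)
`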

  
  
Posted 2 years ago

MoodySheep3, a screenshot would be useful just to understand the structure via the UI 🙂

  
  
Posted 2 years ago

Concerning how to use ParameterSet:
I first declare the set
`
my_param_set = ParameterSet([
    {'General/batch_size': 32, 'General/epochs': 30},
    {'General/batch_size': 64, 'General/epochs': 20},
    {'General/batch_size': 128, 'General/epochs': 10},
])
`
This is a very basic example; it is also possible to put more complex things into the set (see https://clear.ml/docs/latest/docs/references/sdk/hpo_parameters_parameterset/ for UniformParameterRange usage in a ParameterSet).
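For instance, a minimal sketch of nesting a range inside one set member, assuming the range-in-set behavior the docs page above describes; the exact expansion semantics are an assumption, and the names under 'General/' follow the example above.
`
# Hedged sketch: a ParameterSet member may itself contain a sampled range,
# per the docs page linked above. Treat the expansion semantics as an
# assumption; names under 'General/' follow the example above.
from clearml.automation import UniformParameterRange
from clearml.automation.parameters import ParameterSet

my_param_set = ParameterSet([
    # plain member: these two values are always applied together
    {'General/batch_size': 32, 'General/epochs': 30},
    # member with a nested range: batch_size 64 is tried with each epoch value
    {'General/batch_size': 64,
     'General/epochs': UniformParameterRange('General/epochs',
                                             min_value=10, max_value=30,
                                             step_size=10, include_max=True)},
])
`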

Then I do as you did 🙂
`
optimizer = HyperParameterOptimizer(
    # specifying the task to be optimized; the task must already be in the system so it can be cloned
    base_task_id=args['template_task_id'],
    # setting the hyper-parameters to optimize
    hyper_parameters=[
        my_param_set,
    ],
)
`
If you need something more complete/complex, do not hesitate to ask 🙂 I kept it simple here because you already do it fine.

P.S. Concerning the warning, I am going to enquire and will keep you updated.

  
  
Posted 2 years ago

Thanks! I added an issue

  
  
Posted 2 years ago

`
parameter_optimization_space = [{
    type = "ParameterSet"
    name = null
    values = [{
        General/data_module = "
        General/model = "
        General/homogeneous = true
        General/model_kwargs/pooling_flow = "source_to_target"
        General/trainer_kwargs/epochs = 2
    }]
},
{
    type = "UniformParameterRange"
    name = "General/data_module_kwargs/
    min_value = 0.001
    max_value = 0.005
    step_size = 0.001
    include_max = true
},
{
    type = "UniformIntegerParameterRange"
    name = "General/data_module_kwargs/batch_size"
    min_value = 2
    max_value = 16
    step_size = 2
    include_max = true
},
{
    type = "UniformIntegerParameterRange"
    name = "General/model_kwargs/number_of_hidden_layers"
    min_value = 2
    max_value = 5
    step_size = 1
    include_max = true
},
{
    type = "UniformParameterRange"
    name = "General/trainer_kwargs/default_lr"
    min_value = 0.0001
    max_value = 0.01
    step_size = null
    include_max = true
},
{
    type = "DiscreteParameterRange"
    name = "General/model_kwargs/global_pooling_reduce"
    values = ["mean", "max", "add"]
}]
`

  
  
Posted 2 years ago

Is that enough, CostlyOstrich36?

  
  
Posted 2 years ago

Hi MoodySheep3,
I think that you are using ParameterSet the way it is supposed to be used 🙂
When I run my examples, I also get this warning - which is weird, because:
- it is just a warning; the script continues anyway (and reaches the end without issue)
- those hyper-parameters exist, and all the sub-tasks corresponding to a given parameter set find them!

  
  
Posted 2 years ago

[image: screenshot of the hyper-parameters as they show in the UI]

  
  
Posted 2 years ago

Just to keep you updated, as promised 🙂
We have found the bug and will release a fix ASAP. For that too, I will keep you updated 🙂

  
  
Posted 2 years ago

Can you please add a screenshot of how the hyper params show in the UI for you?

  
  
Posted 2 years ago

Can you maybe provide me with an example of how to use ParameterSet?

  
  
Posted 2 years ago

Hi MoodySheep3,

Can you please provide screenshots from the experiment - what the configuration looks like?

  
  
Posted 2 years ago

`
hyper_task = Task.init(project_name="***",
                       task_name="hyper-param-tuning",
                       task_type=Task.TaskTypes.optimizer,
                       reuse_last_task_id=False)

optimizer = HyperParameterOptimizer(
    # specifying the task to be optimized, task must be in system already so it can be cloned
    base_task_id=task.id,
    # setting the hyper-parameters to optimize
    hyper_parameters=[
        ParameterSet([{"General/data_module": "", "General/model": "",
                       "General/": True, "General/model_kwargs/": "***",
                       "General/trainer_kwargs/epochs": 100}]),
        UniformParameterRange('General/data_module_kwargs/abundance_cutoff', min_value=0.001, max_value=0.005, step_size=0.001),
        UniformIntegerParameterRange('General/data_module_kwargs/batch_size', min_value=2, max_value=16, step_size=2),
        UniformIntegerParameterRange('General/model_kwargs/number_of_hidden_layers', min_value=2, max_value=5, step_size=1),
        UniformParameterRange('General/trainer_kwargs/default_lr', min_value=0.0001, max_value=0.01),
        DiscreteParameterRange('General/model_kwargs/***', ["mean", "max", "add"]),
    ],
    # setting the objective metric we want to maximize/minimize
    objective_metric_title='val_loss',
    objective_metric_series='val_loss',
    objective_metric_sign='min',

    # setting optimizer
    optimizer_class=OptimizerOptuna,

    # configuring optimization parameters
    pool_period_min=2,
    execution_queue='default',
    max_number_of_concurrent_tasks=1,
    optimization_time_limit=10.,
    compute_time_limit=15,
    total_max_jobs=20,
    min_iteration_per_job=50,
    max_iteration_per_job=150000,
)
`
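As a follow-up, a minimal sketch of the launch sequence that typically comes after this constructor; the report period and top_k values here are arbitrary assumptions, not values from the thread.
`
# Hedged sketch of the usual launch sequence after constructing the optimizer;
# the report period and top_k values are arbitrary assumptions.
optimizer.set_report_period(2)     # report progress every 2 minutes
optimizer.start()                  # start scheduling sub-tasks on the queue
optimizer.wait()                   # block until the optimization completes
top_experiments = optimizer.get_top_experiments(top_k=3)  # best tasks by objective
print([t.id for t in top_experiments])
optimizer.stop()                   # make sure background jobs are stopped
`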

  
  
Posted 2 years ago

One last (very) little thing: could you please open a GitHub issue for this irrelevant warning 🙏? It makes sense to register those bugs on GH, because our code and releases are hosted there.
Thank you!
http://github.com/allegroai/clearml/issues

  
  
Posted 2 years ago

Also how are you running the HPO? From the examples?

  
  
Posted 2 years ago