Hi, I am trying to use ParameterSet for hyper-parameter tuning with dependencies. An example of how I use it: `ParameterSet([{"Prm1": 1, "Prm2": 1}, {"Prm1": 2, "Prm2": 2}])`, but I get a warning:


```python
hyper_task = Task.init(project_name="***",
                       task_name="hyper-param-tuning",
                       task_type=Task.TaskTypes.optimizer,
                       reuse_last_task_id=False)

optimizer = HyperParameterOptimizer(
    # specifying the task to be optimized, task must be in system already so it can be cloned
    base_task_id=task.id,
    # setting the hyper-parameters to optimize
    hyper_parameters=[
        ParameterSet([{"General/data_module": "", "General/model": "", "General/": True,
                       "General/model_kwargs/": "***", "General/trainer_kwargs/epochs": 100}]),
        UniformParameterRange('General/data_module_kwargs/abundance_cutoff', min_value=0.001, max_value=0.005, step_size=0.001),
        UniformIntegerParameterRange('General/data_module_kwargs/batch_size', min_value=2, max_value=16, step_size=2),
        UniformIntegerParameterRange('General/model_kwargs/number_of_hidden_layers', min_value=2, max_value=5, step_size=1),
        UniformParameterRange('General/trainer_kwargs/default_lr', min_value=0.0001, max_value=0.01),
        DiscreteParameterRange('General/model_kwargs/***', ["mean", "max", "add"]),
    ],
    # setting the objective metric we want to maximize/minimize
    objective_metric_title='val_loss',
    objective_metric_series='val_loss',
    objective_metric_sign='min',

    # setting optimizer
    optimizer_class=OptimizerOptuna,

    # configuring optimization parameters
    pool_period_min=2,
    execution_queue='default',
    max_number_of_concurrent_tasks=1,
    optimization_time_limit=10.,
    compute_time_limit=15,
    total_max_jobs=20,
    min_iteration_per_job=50,
    max_iteration_per_job=150000,
)
```
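For comparison, here is a minimal, self-contained sketch of the same ParameterSet pattern, stripped down from the snippet above. The `General/Prm1` / `General/Prm2` names, the placeholder base task ID, and the queue name are illustrative only, and I am assuming ParameterSet and the range classes import from `clearml.automation` (with OptimizerOptuna from `clearml.automation.optuna`):

```python
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    ParameterSet,
    UniformIntegerParameterRange,
)
from clearml.automation.optuna import OptimizerOptuna

# controller task that owns the optimization run
hyper_task = Task.init(project_name="examples",
                       task_name="hyper-param-tuning-minimal",
                       task_type=Task.TaskTypes.optimizer,
                       reuse_last_task_id=False)

optimizer = HyperParameterOptimizer(
    base_task_id="<base_training_task_id>",  # placeholder: ID of the training task to clone
    hyper_parameters=[
        # ParameterSet takes a list of dicts; each dict is one combination of
        # dependent parameters that is applied together
        ParameterSet([
            {"General/Prm1": 1, "General/Prm2": 1},
            {"General/Prm1": 2, "General/Prm2": 2},
        ]),
        # independent ranges can be mixed in alongside the set
        UniformIntegerParameterRange('General/batch_size', min_value=2, max_value=16, step_size=2),
    ],
    objective_metric_title='val_loss',
    objective_metric_series='val_loss',
    objective_metric_sign='min',
    optimizer_class=OptimizerOptuna,
    execution_queue='default',  # placeholder queue name
    max_number_of_concurrent_tasks=1,
    total_max_jobs=10,
)
```

Each dict in the list is applied as a whole, which is how I understand the dependency between Prm1 and Prm2 is meant to be expressed.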

  
  