Answered
Hi, I Am Running An Optimization Task With OptimizerOptuna (Using Your Doc...

Hi, I am running an optimization task with OptimizerOptuna (using your doc https://clear.ml/docs/latest/docs/references/sdk/hpo_optimization_hyperparameteroptimizer/ ) and I was wondering if it is possible to specify the 'patience' of the pruning algorithm?
If I run a 'regular' Optuna script, I can specify an Optuna callback via:

callbacks = [
    tf.keras.callbacks.EarlyStopping(patience=2),
    optuna.integration.TFKerasPruningCallback(trial, 'val_accuracy'),
]

In your source code, at line 65 of optuna.py, I saw: if trial.should_prune()
but I didn't find any option to control the trial.
Thank you :)
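P.S. for context, a minimal sketch of the 'regular' Optuna script I mean (the model and data below are just illustrative placeholders, not my actual code):

import optuna
import tensorflow as tf

def objective(trial):
    # Placeholder model; only the callbacks matter for this question
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(trial.suggest_int('units', 32, 256), activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    (x_train, y_train), (x_val, y_val) = tf.keras.datasets.mnist.load_data()
    history = model.fit(
        x_train.reshape(-1, 784) / 255.0, y_train,
        validation_data=(x_val.reshape(-1, 784) / 255.0, y_val),
        epochs=10,
        callbacks=[
            tf.keras.callbacks.EarlyStopping(patience=2),                     # Keras-side patience
            optuna.integration.TFKerasPruningCallback(trial, 'val_accuracy'), # Optuna-side pruning
        ],
    )
    return max(history.history['val_accuracy'])

study = optuna.create_study(direction='maximize', pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)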

Posted 3 years ago

Answers 6


Thank you for the clarification, everything is clear now 🙂

Posted 3 years ago

AbruptWorm50 my apologies, I think I misled you: yes, you can pass generic arguments to the optimizer class, but specifically for Optuna this is disabled (not sure why).
Specifically to your case, the way it works is:
1. Your code logs to TensorBoard; ClearML catches the data and moves it to the Task (on clearml-server).
2. The Optuna optimization is running on another machine.
3. Trial values are manually updated, i.e. the ClearML optimization pulls the Task's reported metric from the server and updates Optuna.
4. Optuna's early stopping is called (i.e. trial.should_prune()).
5. If the trial needs to be stopped, the ClearML optimization aborts the Task (the one running on a different machine).
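To illustrate the flow, here is a rough sketch only (NOT the actual implementation, which lives in clearml/automation/optuna/optuna.py); it uses the public Task API to mimic what the optimizer does internally:

import time

import optuna
from clearml import Task

def monitor_remote_trial(trial, task_id, title='epoch_accuracy', series='epoch_accuracy', poll_sec=60.0):
    # Sketch of the loop described above -- not ClearML's real internals
    task = Task.get_task(task_id=task_id)  # the Task running on the other machine
    step = 0
    while True:
        task.reload()  # refresh status and metrics from clearml-server
        if task.get_status() not in ('queued', 'in_progress'):
            break
        # The metric clearml captured from TensorBoard and stored on clearml-server
        last = task.get_last_scalar_metrics().get(title, {}).get(series, {}).get('last')
        if last is not None:
            trial.report(last, step)   # manually update the Optuna trial
            if trial.should_prune():   # Optuna's pruner decides
                task.mark_stopped()    # abort the remote Task
                raise optuna.TrialPruned()
            step += 1
        time.sleep(poll_sec)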

Does that make sense?
Specifically, which part would you want to modify?
(Notice again that the Optuna process is not actually running on the same machine; even if in reality it can be the same one, it is not the same process. This is how it scales to multiple machines so quickly with clearml-agent.)

Posted 3 years ago

Interesting, I am only now seeing **optimizer_kwargs; it seems it will fix my problem. Is it too much to ask if you could add an example of how to initialize the Optuna object with the kwargs (mainly how to initialize the 'trial', 'study', and 'objective' arguments)? 🙂

Posted 3 years ago

Hi AbruptWorm50

I was wondering if it is possible to specify the 'patience' of the pruning algorithm?

Any of the kwargs passed as **optimizer_kwargs will be passed directly to the Optuna object:
https://github.com/allegroai/clearml/blob/2e050cf913e10d4281d0d2e270eea1c7717a19c3/clearml/automation/optimization.py#L1096

It should allow you to control the parameters, no?

Regarding the callback, what exactly would you want to put there?
Is this callback enough?
https://github.com/allegroai/clearml/blob/9624f2c715df933ff17ed5ae9bf3c0a0b5fd5a0e/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py#L23
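For reference, that example boils down to roughly this (an abridged sketch; the queue name, parameter ranges, and metric names are placeholders and must match your template Task):

from clearml import Task
from clearml.automation import (DiscreteParameterRange, HyperParameterOptimizer,
                                UniformIntegerParameterRange)
from clearml.automation.optuna import OptimizerOptuna

task = Task.init(project_name='Hyper-Parameter Optimization',
                 task_name='Automatic HPO',
                 task_type=Task.TaskTypes.optimization)

optimizer = HyperParameterOptimizer(
    base_task_id='<your-template-task-id>',  # the Task cloned for each trial
    hyper_parameters=[
        UniformIntegerParameterRange('General/layer_1', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('General/batch_size', values=[96, 128, 160]),
    ],
    objective_metric_title='epoch_accuracy',
    objective_metric_series='epoch_accuracy',
    objective_metric_sign='max',
    optimizer_class=OptimizerOptuna,
    execution_queue='default',
    max_number_of_concurrent_tasks=2,
    total_max_jobs=10,
)
optimizer.start()
optimizer.wait()
optimizer.stop()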

Posted 3 years ago

Which file are you referring to? Can you link it?

Posted 3 years ago