Is it possible to use the ClearML Hyperparameter Optimization module without the ClearML agent system?

Hello!
I have a quick question about the ClearML Hyperparameter Optimizations module.

Is it possible to use it without using the ClearML agent system?
In other words, launch a script from a few machines manually, but have the hyperparameters provided by ClearML instead of a config file.

If so, are there any docs/examples about this?

Thanks!

  
  
Posted one year ago

7 Answers


Alright, thanks again for the answers 🙂

I'll take a deeper look at everything you mentioned but, sadly, I doubt this would work for me.

  
  
Posted one year ago

Great! Thank you

If I understand correctly, that means many of the arguments in HyperParameterOptimizer become meaningless, right?
execution_queue is not relevant anymore
total_max_jobs is determined by how many machines I launch the script on
same for max_number_of_concurrent_tasks?

Maybe to clarify, I was looking for something with the more classic Ask-and-Tell interface
https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/009_ask_and_tell.html
https://scikit-optimize.github.io/stable/auto_examples/ask-and-tell.html
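Roughly this kind of loop, to be concrete (a minimal Optuna sketch; the objective below is just a placeholder for a real training run):

import optuna

def evaluate(x):
    # placeholder objective; in practice this would be a full training run
    return (x - 2) ** 2

study = optuna.create_study(direction="minimize")
for _ in range(20):
    trial = study.ask()                        # ask: get the next hyperparameter values
    x = trial.suggest_float("x", -10.0, 10.0)
    study.tell(trial, evaluate(x))             # tell: report the result back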

  
  
Posted one year ago

That's not possible, right?

That's actually what "start_locally" does, but the missing part is starting it on another machine without the agent (I mean, it's totally doable, and if it's important I can explain how, but this is probably not what you are after).

I really need to have a dummy experiment pre-made and have the agent clone the code, set up the env and run everything?

The agent caches everything, and can actually also skip installing the env entirely, which would mean very little overhead.
If you want the agent to just run on your local machine with no venv installation whatsoever:
CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=1 CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1 clearml-agent daemon --queue "default" --foreground
Notice I added --foreground so it runs everything in the foreground instead of as a background service, which makes it easier to debug / stop.

  
  
Posted one year ago

Hi MistakenDragonfly51

Is it possible to use it without using the clearml agent system?

Yes it is, which would mean everything is executed locally.
Basically:
an_optimizer.start_locally()
instead of this line:
https://github.com/allegroai/clearml/blob/51af6e833ddc5a8ba1efaaf75980f58616b25e85/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py#L121
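In other words (a minimal sketch of that change against the linked example, assuming an_optimizer is the HyperParameterOptimizer instance built earlier in that script):

# an_optimizer.start()        # sends experiments to an execution queue for agents
an_optimizer.start_locally()  # runs them as subprocesses on the current machine instead
an_optimizer.wait()           # block until the optimization loop finishes
an_optimizer.stop()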

  
  
Posted one year ago

Yeah, I think I understand. The thing I was missing is that I wanted to not use the agent and just call my code directly.
That's not possible, right?
I really need to have a dummy experiment pre-made and have the agent clone the code, set up the env and run everything?

  
  
Posted one year ago

If so, are there any docs/examples about this?

Good point, passing to docs 🙂
https://github.com/allegroai/clearml/blob/51af6e833ddc5a8ba1efaaf75980f58616b25e85/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py#L123
I mean it is mentioned, but we should highlight it better

  
  
Posted one year ago

execution_queue

is not relevant anymore

Correct

total_max_jobs

is determined by how many machines I launch the script on

Actually, this is the number of concurrent subprocesses that are launched on your machine. Notice that local execution means all experiments are launched on the machine that started the HPO process.
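For example (a sketch only; BASE_TASK_ID and the metric names are placeholders):

from clearml.automation import HyperParameterOptimizer, RandomSearch, UniformIntegerParameterRange

an_optimizer = HyperParameterOptimizer(
    base_task_id=BASE_TASK_ID,  # placeholder: the template experiment to clone
    hyper_parameters=[
        UniformIntegerParameterRange("General/epochs", min_value=5, max_value=20, step_size=5),
    ],
    objective_metric_title="validation",  # placeholder metric
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=RandomSearch,
    max_number_of_concurrent_tasks=4,     # concurrent subprocesses on this machine
    total_max_jobs=20,                    # overall experiment budget, not number of machines
)
an_optimizer.start_locally()              # everything runs on the machine that calls this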

Maybe to clarify, I was looking for something with the more classic Ask-and-Tell interface

So the way to connect the "ask" part of the model is to just run an agent on that machine.
Notice that you can just create a dedicated queue and pip install the agent on that machine, and run it in venv mode, which means very little infrastructure.
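Something like (a rough sketch; it assumes clearml.conf is already configured on that machine, and "hpo_workers" is a placeholder queue name you create in the UI):

pip install clearml-agent
clearml-agent daemon --queue hpo_workers --foreground

and then point execution_queue of the HyperParameterOptimizer at that queue.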

Does that make sense ?

  
  
Posted one year ago