Answered
I Am Getting This Specific Message When Trying to Run Hyperparameter Optimization (Running My Task Remotely). Does It Affect My Flow? Do I Have Something to Worry About?

I am getting this specific message when trying to run hyperparameter optimization (running my task remotely).
Does it affect my flow?
Do I have something to worry about?

  
  
Posted 2 years ago

Answers 5


Hi EmbarrassedSpider34, what's the version you are running?

  
  
Posted 2 years ago

> Although I didn't understand why you mentioned torch in my case?

Just a guess 🙂 other frameworks do multi-processing as well.

> I would guess it relates to parallelization of Tasks execution of the HyperParameterOptimizer class?

Yes, that might be it. It's basically a by-product of using Python's "Process" class for multiprocessing. We are working on a fix; it's not trivial, unfortunately.

  
  
Posted 2 years ago

Hi EmbarrassedSpider34
Long story (see below) short: yes, you can ignore this warning :)

Specifically, torch is spinning up processes and killing them, and every process holds a reference to the parent semaphore (for internal clearml bookkeeping). Python is not very good with this kind of thing (it is getting better in newer Python versions); bottom line, Python "thinks" someone lost a semaphore, but in reality the subprocess never created it in the first place. Does that make sense?
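To make the mechanism concrete, here is a minimal sketch (not ClearML or torch code, and whether the warning actually fires depends on your Python version and process start method): a multiprocessing semaphore created in the parent is inherited by a child process, and at shutdown the resource tracker may report it as "leaked" even though the child never created it.

```python
# Minimal sketch of the pattern described above (hypothetical, not ClearML code).
# A semaphore created in the parent is inherited by the child process; depending
# on Python version and start method, the resource tracker may warn at shutdown:
#   "There appear to be 1 leaked semaphore objects to clean up at shutdown"
import multiprocessing as mp

def worker(sem):
    # The child only uses the semaphore; it never created it.
    with sem:
        print("child holds a reference to the parent's semaphore")

if __name__ == "__main__":
    mp.set_start_method("spawn")   # the start method torch commonly uses
    sem = mp.Semaphore(1)          # created in the parent process
    p = mp.Process(target=worker, args=(sem,))
    p.start()
    p.join()
```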

  
  
Posted 2 years ago

clearml 1.2.0 pypi_0 pypi

  
  
Posted 2 years ago

That helps a lot!
Thanks Martin.
Although I didn't understand why you mentioned torch in my case?
Since I don't use it directly, I guess multiprocessing does get activated somewhere along the way (in HPO).
I would guess it relates to parallelization of Tasks execution of the HyperParameterOptimizer class?
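For reference, this is roughly the setup in question (a minimal sketch of the clearml.automation API; the base task ID, queue name, and metric names below are placeholders). The HyperParameterOptimizer clones the base task and launches the clones concurrently, which is where the multiprocessing comes in:

```python
# Minimal HPO controller sketch (IDs, queue, and metric names are placeholders).
# HyperParameterOptimizer clones the base task and runs the clones concurrently;
# spawning and reaping those child processes is what can surface the warning.
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    UniformParameterRange,
    DiscreteParameterRange,
    RandomSearch,
)

task = Task.init(project_name="examples", task_name="HPO controller",
                 task_type=Task.TaskTypes.optimizer)

optimizer = HyperParameterOptimizer(
    base_task_id="<your-base-task-id>",  # placeholder: the task to optimize
    hyper_parameters=[
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
        DiscreteParameterRange("General/batch_size", values=[16, 32, 64]),
    ],
    objective_metric_title="validation",   # placeholder metric title
    objective_metric_series="loss",        # placeholder metric series
    objective_metric_sign="min",
    optimizer_class=RandomSearch,
    max_number_of_concurrent_tasks=4,      # the parallelization in question
    execution_queue="default",             # placeholder queue name
    total_max_jobs=20,
)

optimizer.start()
optimizer.wait()
optimizer.stop()
```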

  
  
Posted 2 years ago