Hi EmbarrassedSpider34, what's the version you are running with?
Although I didn't understand why you mentioned torch in my case?
Just a guess 🙂 other frameworks do multi-process as well,
I would guess it relates to the parallel execution of Tasks by the HyperParameterOptimizer class?
Yes, that might be it. It's basically a by-product of using Python's "Process" class for multiprocessing. We are working on a fix; it's not trivial, unfortunately.
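For context, this is roughly what such an HPO setup looks like; the concurrency being discussed comes from max_number_of_concurrent_tasks. This is only a sketch: the project, queue, metric and parameter names below are placeholders, not taken from this thread.
```python
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    UniformParameterRange,
    DiscreteParameterRange,
)

# Controller task for the optimization run itself
task = Task.init(
    project_name="examples",
    task_name="HPO controller",
    task_type=Task.TaskTypes.optimizer,
)

optimizer = HyperParameterOptimizer(
    base_task_id="<your-base-task-id>",  # template experiment to clone
    hyper_parameters=[
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
        DiscreteParameterRange("General/batch_size", values=[16, 32, 64]),
    ],
    objective_metric_title="validation",
    objective_metric_series="loss",
    objective_metric_sign="min",
    max_number_of_concurrent_tasks=4,    # parallel Task executions
    execution_queue="default",
)

optimizer.start()   # launches the cloned tasks, run in parallel
optimizer.wait()    # block until the optimization budget is exhausted
optimizer.stop()
```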
Hi EmbarrassedSpider34
Long story (see below) short, yes you can ignore this warning :)
Specifically, torch is spinning up processes and killing them. Every such process holds a reference to the parent's semaphore (used for internal ClearML bookkeeping), and Python does not handle this situation very well (it is getting better in newer Python versions). Bottom line: Python "thinks" someone lost a semaphore, but in reality the subprocess never created it in the first place. Does that make sense?
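To make the mechanism concrete, here is a minimal, standalone sketch (plain Python, no ClearML or torch, and not the exact torch/ClearML situation, just the same class of warning): a semaphore gets registered with the multiprocessing resource tracker, the process is killed before its finalizer can unregister it, and at shutdown the tracker reports a "leaked" semaphore even though nothing was actually lost. POSIX only; exact behavior varies across Python versions.
```python
import multiprocessing as mp
import os
import signal
import time


def worker():
    # The semaphore created here is registered with the resource tracker
    # shared with the parent process.
    sem = mp.Semaphore(1)
    sem.acquire()
    time.sleep(30)  # the process is killed before it can clean up


if __name__ == "__main__":
    mp.set_start_method("spawn")
    p = mp.Process(target=worker)
    p.start()
    time.sleep(2)
    os.kill(p.pid, signal.SIGKILL)  # abrupt kill: finalizers never run
    p.join()
    # On interpreter exit the resource tracker typically prints something like:
    #   UserWarning: resource_tracker: There appear to be 1 leaked
    #   semaphore objects to clean up at shutdown
```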
That helps a lot!
Thanks Martin.
Although I didn't understand why you mentioned torch in my case?
Since I don't use it directly, I guess somewhere along the way multiprocessing does get activated (in HPO)
I would guess it relates to the parallel execution of Tasks by the HyperParameterOptimizer class?