Still, my problem is that calling pipe.start() crashes.
Is it supposed to kill the process?
2022-08-19 09:17:56,626 - clearml - WARNING - Terminating local execution process
This is what it writes before killing the local process:
/opt/homebrew/anaconda3/envs/py39/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 16 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
This is a python cleanup warning (you can safely ignore it)
You should replace pipe.start() with pipe.start_locally(run_pipeline_steps_locally=False)
It should do what you need (i.e. run the pipeline logic locally but the components remotely)
wdyt?
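For context, a minimal sketch of what that change looks like, assuming a pipeline built with PipelineController (the pipeline name, project, and step here are made up, and running it requires a configured ClearML setup):

```python
from clearml import PipelineController


def prepare_data():
    # Hypothetical pipeline step, used only for illustration
    print("preparing data")


pipe = PipelineController(name="my-pipeline", project="examples", version="1.0.0")
pipe.add_function_step(name="prepare_data", function=prepare_data)

# pipe.start() would relaunch the pipeline logic on the services queue
# and terminate this local process. start_locally() instead keeps the
# pipeline logic in this process; with run_pipeline_steps_locally=False
# the individual steps are still enqueued for remote execution.
pipe.start_locally(run_pipeline_steps_locally=False)
```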
Hi FancyWhale93
pipe.start() should actually stop the local pipeline logic execution and fire it on the "services" queue.
The idea is that you can launch the pipeline locally, but the actual execution of the entire logic is remote.
You can have the pipeline logic run locally if you call pipe.start_locally,
and you can also run the steps locally (as sub-processes) with pipe.start_locally(run_pipeline_steps_locally=True)
BTW: based on your example, a more intuitive code might be the pipeline decorator example, you can see it here: https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py
just one more question AgitatedDove14 .
How can I do the same thing as pipe.start_locally(run_pipeline_steps_locally=False)
with decorators?
Oh. If the behavior is intentional, I think I must call it via subprocess.
Thanks for the answers. AgitatedDove14
Thanks for the reply. Yes, I got the point.
Still, my problem is that calling pipe.start() crashes.
I want to run an API server that starts the pipeline on request, and I don't want it to kill the API server.
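Since pipe.start() terminates the process that calls it, one way to protect a long-running API server is to launch the pipeline from a short-lived child process rather than from the server process itself. A minimal sketch of that pattern, with a placeholder command standing in for the real pipeline script (in practice the child would be the script that builds the controller and calls pipe.start()):

```python
import subprocess
import sys


def launch_pipeline() -> int:
    # Launch the pipeline in a separate child process. If pipe.start()
    # terminates that process, only the child exits; the API server
    # process that called launch_pipeline() keeps running.
    proc = subprocess.run(
        [sys.executable, "-c", "print('pipeline controller launched')"],
        capture_output=True,
        text=True,
    )
    return proc.returncode


if __name__ == "__main__":
    print(launch_pipeline())
```

An API request handler would call launch_pipeline() (or subprocess.Popen for fire-and-forget) and return immediately, leaving the pipeline to run on its own.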
Ah! sorry I just found the doc 😉
https://clear.ml/docs/latest/docs/pipelines/pipelines_sdk_function_decorators#running-the-pipeline
Good day AgitatedDove14