Hi,
I'm using PipelineController to launch remote pipelines from a local orchestration script. For each input file, I create and start a pipeline sequentially, like this:
for file in files:
    pipeline = PipelineController(...)
    pipeline.add_step(...)
    pipeline.start(queue="default")
However, as soon as .start() is called, ClearML prints:
ClearML Terminating local execution process - continuing execution remotely
...and the local Python process exits immediately.
I tried isolating each pipeline launch in a separate subprocess (via subprocess.run() from a Poetry-managed script), but the subprocess is still terminated as soon as pipeline.start() is invoked.
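Roughly what that isolation attempt looks like (a minimal sketch; launch_single_pipeline.py and the file list are placeholders, with the helper script expected to build one PipelineController and call .start() for the file it receives):

import subprocess
import sys

files = ["input_a.csv", "input_b.csv"]  # placeholder for the real file list

for file in files:
    # Launch each pipeline from its own Python interpreter so that
    # ClearML only terminates the child process, not this parent loop.
    subprocess.run(
        [sys.executable, "launch_single_pipeline.py", str(file)],
        check=True,
    )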
- How can I launch multiple pipelines (via PipelineController) from a local script without ClearML killing the process?
- Is there a supported way to enqueue remote pipelines without using execute_remotely(), so that I can launch many of them in sequence or in parallel?
Thanks in advance! 🙌