Answered
Hello! I have a problem with tutorial client code crashing on starting pipelines remotely via pipe.start()

Hello!
I have a problem with the tutorial client code: it crashes on starting pipelines remotely via pipe.start(). (I have no problem running it locally with pipe.start_locally(run_pipeline_steps_locally=True).)
```
from clearml import PipelineController

pipe = PipelineController(
    name="test-pipe", project="test-project", version="1.0.1"
)

pipe.add_parameter(
    name='url',
    description='url to pickle file',
    default=' '
)

def step_one(pickle_data_url: str, extra: int = 43):
    data = [1, 2, 3, 4]
    return data

pipe.add_function_step(
    name='step_one',
    function=step_one,
    function_kwargs=dict(pickle_data_url='${pipeline.url}'),
    function_return=['data'],
    cache_executed_step=True,
    execution_queue='services'
)

pipe.start()
```

Output:

```
/opt/homebrew/anaconda3/envs/py39/bin/python /Users/kyuwoo/workspace/clearml-test/pipeline-controller.py
...
2022-08-19 09:17:56,626 - clearml - WARNING - Terminating local execution process
/opt/homebrew/anaconda3/envs/py39/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 16 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
```

The remote tasks are running fine; only the client code crashes while executing pipe.start().
What have I done wrong?

I filed an issue on GitHub: https://github.com/allegroai/clearml/issues/748

  
  
Posted 2 years ago

Answers 6


Hi FancyWhale93
pipe.start() is actually supposed to stop the local pipeline logic execution and fire it on the "services" queue.
The idea is that you can launch the pipeline locally, but the actual execution of the entire logic is remote.
You can keep the pipeline running locally by calling pipe.start_locally(), and you can also run the steps locally (as sub-processes) with pipe.start_locally(run_pipeline_steps_locally=True).
BTW: based on your example, the pipeline decorator interface might be more intuitive; you can see an example here: https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py

  
  
Posted 2 years ago

Thanks for the reply. Yes, I get the point.
Still, my problem is that calling pipe.start() crashes.
I want to run an API server that starts the pipeline on request, and I don't want it to kill the API server.

  
  
Posted 2 years ago

> Still, my problem is that calling pipe.start() crashes.

It is supposed to kill the process.

2022-08-19 09:17:56,626 - clearml - WARNING - Terminating local execution process
This is what it writes before killing the local process.

/opt/homebrew/anaconda3/envs/py39/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 16 leaked semaphore objects to clean up at shutdown
This is a Python cleanup warning (you can safely ignore it).

You should replace pipe.start() with pipe.start_locally(run_pipeline_steps_locally=False); it should do what you need (i.e. run the pipeline logic locally but the components remotely).
wdyt?

  
  
Posted 2 years ago

Oh, if the behavior is intentional, I think I must call it via a subprocess.
Thanks for the answers, AgitatedDove14.
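For anyone landing here with the same need: one general way to keep a long-running server alive when a library call may terminate the calling process is to launch the pipeline script in a child process. This is a minimal stdlib sketch; the script path and its contents are placeholders, and nothing here is ClearML API.

```python
import os
import subprocess
import sys
import tempfile


def launch_pipeline(script_path: str) -> int:
    """Run the pipeline script in its own process, so any sys.exit()
    or process termination inside it cannot take down the caller.
    Returns the child's exit code."""
    proc = subprocess.Popen([sys.executable, script_path])
    return proc.wait()


if __name__ == "__main__":
    # Demo with a stand-in "pipeline" script: it exits immediately,
    # and the parent process keeps running afterwards.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write("import sys; print('pipeline started'); sys.exit(0)\n")
        path = f.name
    code = launch_pipeline(path)
    os.unlink(path)
    print(f"child exited with {code}; server process still alive")
```

In an API server, you would call launch_pipeline (or a non-blocking Popen variant) from the request handler, so pipe.start() terminating its own process leaves the server untouched.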

  
  
Posted 2 years ago

Just one more question, AgitatedDove14:
how can I do the same thing as pipe.start_locally(run_pipeline_steps_locally=False) with decorators?

  
  
Posted 2 years ago

Ah! Sorry, I just found the doc 😉
https://clear.ml/docs/latest/docs/pipelines/pipelines_sdk_function_decorators#running-the-pipeline

Good day AgitatedDove14

  
  
Posted 2 years ago