Answered
Another question on the topic of how a remote execution of a pipeline kills the calling process (previously discussed)

Another question on the topic of how a remote execution of a pipeline kills the calling process (previously discussed https://clearml.slack.com/archives/CTK20V944/p1657622607693089?thread_ts=1657582739.354619&cid=CTK20V944 but I still find it a weird design).
so, just to be sure.
If I were to try and run the code below, which is meant to call for remote execution of the same pipeline with different args three times, will it exit after the first iteration?

```
from clearml.automation.controller import PipelineDecorator

PipelineDecorator.set_default_execution_queue("default")

for i_dataset in [0, 1, 2]:
    my_pipeline(i_dataset)
```

If so, what is the recommended way of doing this?

  
  
Posted one year ago

Answers 13


On the same topic: what if (assuming I were able to iterate) I wanted the pipeline calls to be blocking, so that the next pipeline executes only after the previous one completes?

  
  
Posted one year ago

Hey PanickyMoth78 ,

Is my_pipeline wrapped with @PipelineDecorator.pipeline, or with some other decorator?

  
  
Posted one year ago

You can have parents as one of the @PipelineDecorator.component args. The step will be executed only after all the parents are executed and completed.
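The gating behavior described above can be sketched with plain Python (a toy dependency resolver for illustration only, not ClearML's actual scheduler; the step names are made up):

```python
# Toy sketch of "parents"-style gating (illustration only, not ClearML internals):
# a step runs only after every step listed as its parent has completed.
def run_in_dependency_order(steps):
    """steps: dict mapping step name -> list of parent step names."""
    completed, order = set(), []
    while len(completed) < len(steps):
        progressed = False
        for name, parents in steps.items():
            if name not in completed and all(p in completed for p in parents):
                order.append(name)   # "execute" the step
                completed.add(name)
                progressed = True
        if not progressed:
            raise ValueError("cycle detected in step dependencies")
    return order

# step_two lists step_one as a parent; merge waits for both
print(run_in_dependency_order({
    "step_one": [],
    "step_two": ["step_one"],
    "merge": ["step_one", "step_two"],
}))  # → ['step_one', 'step_two', 'merge']
```

In an actual pipeline, the equivalent would be passing something like parents=["step_one"] to @PipelineDecorator.component, as described in the answer above.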

Is there an example of using parents somewhere? I'm not sure what to pass, and also how to pass a component from one pipeline that was just kicked off to execute remotely (which I'd like to block on) to a component of the next pipeline's run.

  
  
Posted one year ago

yes
here is the true my_pipeline declaration:

```
@PipelineDecorator.pipeline(
    name="fastai_image_classification_pipeline",
    project="lavi-testing",
    target_project="lavi-testing",
    version="0.2",
    multi_instance_support="",
    add_pipeline_tags=True,
    abort_on_failure=True,
)
def fastai_image_classification_pipeline(
    run_tags: List[str],
    i_dataset: int,
    backbone_names: List[str],
    image_resizes: List[int],
    batch_sizes: List[int],
    num_train_epochs: int,
):
```

  
  
Posted one year ago

👍

What if (I were able to iterate and) I wanted the pipelines calls to be blocking so that the next pipeline executes only after the previous one completes

You can have parents as one of the @PipelineDecorator.component args. The step will be executed only after all the parents are executed and completed

  
  
Posted one year ago

Hi PanickyMoth78 , I have made a minimal example and indeed adding multi_instance_support=True prevents ClearML from killing the process, allowing you to launch pipelines in a loop 🙂

  
  
Posted one year ago

nice, so a pipeline of pipelines is sort of possible. I guess that whole script can be run as a (remote) task?

  
  
Posted one year ago

thanks KindChimpanzee37 . Where is that minimal example to be found?

  
  
Posted one year ago

oops, should it have been multi_instance_support=True ?

  
  
Posted one year ago

Every task with ClearML can run remotely 🙂

  
  
Posted one year ago

Also, the answer to blocking on the pipeline might be in the .wait() function: https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller#wait-1

TimelyPenguin76 I can't seem to make it work though, on which object should I run the .wait() method?
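The blocking behavior being asked about can be illustrated with a stdlib-only toy (a sketch of the wait-until-done pattern, not the ClearML PipelineController API): each launch blocks on wait() before the next one starts.

```python
import threading

# Toy stand-in for a pipeline run with a blocking wait() (not ClearML's API):
# wait() blocks the caller until the background run signals completion.
class ToyPipelineRun:
    def __init__(self, work):
        self._done = threading.Event()
        self._thread = threading.Thread(target=self._run, args=(work,))

    def _run(self, work):
        work()
        self._done.set()   # signal completion

    def start(self):
        self._thread.start()
        return self

    def wait(self, timeout=None):
        return self._done.wait(timeout)

results = []
for i_dataset in [0, 1, 2]:
    run = ToyPipelineRun(lambda i=i_dataset: results.append(i)).start()
    run.wait()   # block here until this run completes before launching the next
print(results)  # → [0, 1, 2]
```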

  
  
Posted one year ago

I've also not figured out how to modify the examples above to wait for one pipeline to end before the next begins.

  
  
Posted one year ago

Hey PanickyMoth78
Here is an easy to reproduce, working example. Mind the multi_instance_support=True parameter in the pipeline itself. This code launches 3 pipelines for me just as it should 🙂
```
from clearml.automation.controller import PipelineDecorator
import time

PipelineDecorator.set_default_execution_queue("default")

@PipelineDecorator.component()
def step_one():
    time.sleep(2)

@PipelineDecorator.component()
def step_two():
    time.sleep(2)

@PipelineDecorator.pipeline(name='custom pipeline logic', project='examples', version='0.2', multi_instance_support=True)
def executing_pipeline(_):
    # Use the pipeline argument to start the pipeline and pass it to the first step
    print('launch step one')
    step_one()

    print('launch step two')
    step_two()

if __name__ == '__main__':
    # Start the pipeline execution logic.
    for i_dataset in [0, 1, 2]:
        executing_pipeline(i_dataset)

    print('process completed')
```
  
  
Posted one year ago