You can have `parents` as one of the `@PipelineDecorator.component` args. The step will be executed only after all the parents are executed and completed
Is there an example of using parents somewhere? I'm not sure what to pass. Also, how do I pass a component from one pipeline that was just kicked off to execute remotely (which I'd like to block on) to a component of the next pipeline's run?
On the same topic: what if (assuming I were able to iterate) I wanted the pipeline calls to be blocking, so that the next pipeline executes only after the previous one completes?
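(Not from the thread, just a minimal sketch of what passing `parents` could look like: it takes the names of other steps as strings, and the step names and functions below are made up.)
```python
from clearml.automation.controller import PipelineDecorator


@PipelineDecorator.component()
def prepare_data():
    print("preparing data")


# Hypothetical step: runs only after the step named "prepare_data" has completed,
# even though no return value is passed between the two steps
@PipelineDecorator.component(parents=["prepare_data"])
def train_model():
    print("training")
```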
Also, the answer to blocking on the pipeline might be in the .wait() function: https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller#wait-1
TimelyPenguin76 I can't seem to make it work though. On which object should I run the .wait() method?
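(For reference, `wait()` is a method of the `PipelineController` object linked above; a rough sketch of the controller-class flavor, where the pipeline, step and queue names are only placeholders.)
```python
from clearml import PipelineController


def step_one():
    print("step one")


# Hypothetical controller-based pipeline; .wait() is called on the controller object
pipe = PipelineController(name="my_controller_pipeline", project="examples", version="0.1")
pipe.add_function_step(name="step_one", function=step_one)

pipe.start(queue="services")  # launch the controller (placeholder queue name)
pipe.wait()                   # block until the pipeline run finishes
pipe.stop()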
nice, so a pipeline of pipelines is sort of possible. I guess that whole script can be run as a (remote) task?
thanks KindChimpanzee37. Where is that minimal example to be found?
yes
here is the true "my_pipeline" declaration:
```python
@PipelineDecorator.pipeline(
    name="fastai_image_classification_pipeline",
    project="lavi-testing",
    target_project="lavi-testing",
    version="0.2",
    multi_instance_support="",
    add_pipeline_tags=True,
    abort_on_failure=True,
)
def fastai_image_classification_pipeline(
    run_tags: List[str],
    i_dataset: int,
    backbone_names: List[str],
    image_resizes: List[int],
    batch_sizes: List[int],
    num_train_epochs: int,
)
```
Hey PanickyMoth78, is my_pipeline wrapped with @PipelineDecorator.pipeline? Or some other decorator?
Hi PanickyMoth78, I have made a minimal example and indeed adding multi_instance_support=True prevents ClearML from killing the process, allowing you to launch pipelines in a loop 🙂
👍
> What if (I were able to iterate and) I wanted the pipeline calls to be blocking so that the next pipeline executes only after the previous one completes?

You can have `parents` as one of the `@PipelineDecorator.component` args. The step will be executed only after all the parents are executed and completed
Every task with ClearML can run remotely 🙂
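(Not from the thread, just a sketch of what "running the whole script as a remote task" typically looks like with Task.init plus execute_remotely; the project and queue names are placeholders.)
```python
from clearml import Task

# Hypothetical "pipeline of pipelines" driver script, itself tracked as a ClearML task
task = Task.init(project_name="examples", task_name="pipeline_launcher")

# Re-launch this very script on an agent listening to the given queue and exit the local run
task.execute_remotely(queue_name="default", exit_process=True)

# Everything below this point runs on the agent
print("launching pipelines one after another...")
```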
I've also not figured out how to modify the examples above to wait for one pipeline to end before the next begins
oops, should it have been multi_instance_support=True?
Hey PanickyMoth78, here is an easy-to-reproduce, working example. Mind the multi_instance_support=True parameter in the pipeline itself. This code launches 3 pipelines for me, just as it should 🙂
```python
from clearml.automation.controller import PipelineDecorator
import time

PipelineDecorator.set_default_execution_queue("default")


@PipelineDecorator.component()
def step_one():
    time.sleep(2)


@PipelineDecorator.component()
def step_two():
    time.sleep(2)


@PipelineDecorator.pipeline(
    name='custom pipeline logic',
    project='examples',
    version='0.2',
    multi_instance_support=True,
)
def executing_pipeline(i_dataset):
    # The pipeline argument is not used by the steps here; it only parameterizes each run
    print('launch step one')
    step_one()
    print('launch step two')
    step_two()


if __name__ == '__main__':
    # Start the pipeline execution logic.
    for i_dataset in [0, 1, 2]:
        executing_pipeline(i_dataset)

    print('process completed')
```