In the main pipeline I want to work with the secondary pipeline and other functions decorated with PipelineDecorator. Does ClearML allow this? I have not been able to get it to work.
Usually when we think about pipelines of pipelines, the nested pipeline is just another Task you are running in the DAG (where the target queue is the services queue).
When you say nested pipelines with decorators, what exactly do you have in mind ?
Beautiful. I have tested the new functionality with several use cases and it works just as I expected. Excellent work, as usual :D
Hi AgitatedDove14 ,
Any updates on the new ClearML release that fixes the bugs we mentioned in this thread? :)
I think it was just pushed, including nested calls. You have to use the new argument for the decorator, helper_function:
https://github.com/allegroai/clearml/blob/400c6ec103d9f2193694c54d7491bb1a74bbe8e8/clearml/automation/controller.py#L2392
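To see why a helper has to be listed explicitly at all, here is a standalone mock of the mechanism. The names `component` and `helper_functions` mimic the ClearML decorator API, but this is not ClearML code, just an illustration of why a remotely executed step only sees what is packaged with it:

```python
# Mock of why a decorated step must list its helper functions explicitly.
# NOT ClearML internals -- an illustration of the capture mechanism only.
import types

def component(helper_functions=()):
    def decorator(func):
        # Rebuild the function over a fresh globals dict, the way a remotely
        # executed step starts from a clean interpreter: only builtins plus
        # the explicitly listed helpers are visible inside the step.
        ns = {"__builtins__": __builtins__}
        for helper in helper_functions:
            ns[helper.__name__] = helper
        return types.FunctionType(func.__code__, ns, func.__name__,
                                  func.__defaults__, func.__closure__)
    return decorator

def double(x):
    return 2 * x

@component(helper_functions=[double])
def step_ok(x):
    return double(x) + 1   # works: `double` was shipped with the step

@component()
def step_broken(x):
    return double(x) + 1   # NameError: `double` is not in the step's namespace
```

In real ClearML the step runs in a separate process or machine, so module-level imports from the pipeline script do not travel with it; that is the gap the new decorator argument closes.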
Mmm what would be the implications of not being part of the DAG? I mean, how could that step be launched if it is not part of the execution graph?
Just to get the full picture: are we expecting to see the newly created step (aka eager execution) on the original pipeline (i.e. as part of the DAG visualization)?
I mean to use a function decorated with PipelineDecorator.pipeline inside another pipeline decorated in the same way.
In the traceback attached below you can see that I am trying to use a component named user_config_creation inside the create_user_configs sub-pipeline. I have imported user_config_creation inside create_user_configs, but a KeyError is raised (I assume the function was imported correctly, because no ImportError or ModuleNotFoundError occurred). Any clue on what might be going on? BTW, executing_pipeline is the name of the main pipeline.

Traceback (most recent call last):
  File "/user/project/new_pipeline.py", line 104, in <module>
    executing_pipeline()
  File "/user/anaconda3/envs/myenv/lib/python3.9/site-packages/clearml/automation/controller.py", line 2213, in internal_decorator
    func(**pipeline_kwargs)
  File "/user/project/new_pipeline.py", line 58, in executing_pipeline
    users_config_filenames = create_user_configs(
  File "/user/anaconda3/envs/myenv/lib/python3.9/site-packages/clearml/automation/controller.py", line 2213, in internal_decorator
    func(**pipeline_kwargs)
  File "/user/project/new_pipeline.py", line 29, in create_user_configs
    users_config_filenames[name] = user_config_creation(
  File "/user/anaconda3/envs/myenv/lib/python3.9/site-packages/clearml/automation/controller.py", line 2058, in wrapper
    _node = cls._singleton._nodes[_name]
KeyError: 'user_config_creation'
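The last frame of the traceback (`_node = cls._singleton._nodes[_name]`) suggests the failure mode: the component wrapper resolves itself in the currently active pipeline's node table, so importing the function is not enough if the active pipeline never registered it as a step. A hypothetical, heavily simplified reconstruction (all class and attribute names besides that one line are illustrative, not real ClearML internals):

```python
# Simplified sketch of the lookup that fails in the traceback above.
# NOT ClearML internals -- names are illustrative.
class Pipeline:
    _singleton = None  # the currently active pipeline controller

    def __init__(self):
        self._nodes = {}  # name -> callable: the steps this pipeline knows

    def register(self, func):
        self._nodes[func.__name__] = func

    @classmethod
    def component(cls, func):
        def wrapper(*args, **kwargs):
            # mirrors the traceback's failing line:
            _node = cls._singleton._nodes[func.__name__]  # KeyError if unknown
            return _node(*args, **kwargs)
        return wrapper

def user_config_creation(name):
    return name + ".yaml"

step = Pipeline.component(user_config_creation)

Pipeline._singleton = Pipeline()   # fresh pipeline, nothing registered yet
try:
    step("alice")                  # raises KeyError: 'user_config_creation'
except KeyError as err:
    print("unregistered component:", err)

Pipeline._singleton.register(user_config_creation)
print(step("alice"))               # alice.yaml
```

Under this reading, importing the component merely makes the Python name visible; it is the registration against the active pipeline's DAG that the nested call was missing.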
Exactly. At first I was trying to call a component from another component, but it didn't work. Then I thought it would be more natural to do this using a pipeline, but it didn't recognize the user_config_creation function even though I imported it the same way I would under PipelineDecorator.component. I really like the idea of an argument to specify the components you are going to use in the pipeline so they are in the step's context! I will be eagerly waiting for that feature :D
I mean to use a function decorated with PipelineDecorator.pipeline inside another pipeline decorated in the same way.
Ohh... so would it make sense to add "helper_functions" so that a function will be available in the step's context ?
Or maybe we need new support for a "standalone" decorator?! Currently, to actually "launch" the function step, you have to call it from the "pipeline" main logic function, but, at least in theory, one could do without the Pipeline itself... I really like this idea 🙂
Let me check a few things ...