yes, let me explain better:
I have a task (a script registered as a task) that can execute with different configurations
I want n instances of the task to run with n different configs
hence my pipeline receives a list of n configs
and based on n, I'd like the pipeline to have n dynamically created steps
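For illustration, a minimal sketch of what that could look like with a Task-based PipelineController (project/task names, the config fields and the parameter section are placeholders, not taken from this thread):

```
from clearml import PipelineController

# one step per config, all cloning the same base Task (names are placeholders)
configs = [
    {"lr": 0.01, "batch_size": 32},
    {"lr": 0.001, "batch_size": 64},
    {"lr": 0.0001, "batch_size": 128},
]

pipe = PipelineController(name="dynamic-n-steps", project="examples", version="1.0.0")

for i, cfg in enumerate(configs):
    pipe.add_step(
        name=f"train_{i}",
        base_task_project="examples",
        base_task_name="my training task",  # the script already registered as a Task
        parameter_override={
            "General/lr": cfg["lr"],
            "General/batch_size": cfg["batch_size"],
        },
    )

# run the controller logic locally; the steps themselves are enqueued for agents
pipe.start_locally(run_pipeline_steps_locally=False)
```

Note that, as discussed further down, re-running such a pipeline from the UI deserializes the stored DAG instead of re-executing this loop, which is exactly the limitation being raised here.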
thanks @<1523701205467926528:profile|AgitatedDove14>
yes, an argument saying "always create from code" can be helpful
also, being able to edit the configuration objects of a pipeline would be beneficial too, which we're currently unable to do from the UI
yes, an argument saying "always create from code" can be helpful
@<1523701523954012160:profile|ShallowCormorant89> any chance you can open a github issue on that, just so we do not forget ?
being able to edit the configuration objects of a pipeline would be beneficial too, which we're currently unable to do from the UI
Actually you already can: after you clone the pipeline, press on Details, go to the Configuration tab, and edit the pipeline object. The format is HOCON (like JSON, only it does not break if you miss a comma, a quote, etc.). It should be quite self-explanatory. Not a full drag & drop UI, but it definitely allows you to make some changes. (If this feature is used by more users we might add a nicer UI; currently I'm not sure how many would actually want to edit pipelines.)
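For reference, the same pipeline object can also be inspected and edited programmatically; a minimal sketch assuming the DAG is stored as a configuration object named "Pipeline" (that name, and the task id, are assumptions here; check the Configuration tab of your cloned pipeline task for the actual entries):

```
from clearml import Task

# the task id and the configuration object name ("Pipeline") are assumptions
pipeline_task = Task.get_task(task_id="<cloned-pipeline-task-id>")

# read the stored DAG as HOCON/JSON-like text
dag_text = pipeline_task.get_configuration_object("Pipeline")
print(dag_text)

# after editing the text (HOCON tolerates a missing comma or quote),
# it can be written back before enqueueing the cloned pipeline:
# pipeline_task.set_configuration_object(name="Pipeline", config_text=edited_text)
```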
@<1523701205467926528:profile|AgitatedDove14> sure, I'll open an issue.
thank you for the explanation, you're right, cloning and editing is feasible. However, the pipeline experiment is not visible in the project's experiment list.. it is hidden, which makes cloning the pipeline difficult..
Hi @<1523701523954012160:profile|ShallowCormorant89>
This is generally based on the number of agents, or am I missing something? Also, is it based on Tasks or decorated functions?
yes, but the pipe starts running before we can edit it..
@<1523701205467926528:profile|AgitatedDove14> @<1539417873305309184:profile|DangerousMole43> and I found an issue in the pipeline that could be closely related to this.
- we have a pipeline running perfectly.
- the parent node fails for a valid reason, and the child nodes are skipped.
- but when we try to do a "New Run" from the UI, it tries to follow the DAG of the previous run (the run with all child nodes skipped) and the new run fails too.
- so a single failed pipeline causes all further pipeline runs to fail.
- this shouldn't happen; is it possible to make the "New Run" button always clone a particular known-good pipeline run?
@<1523701523954012160:profile|ShallowCormorant89> can you verify it is reproducible in 1.9.3? Because if it is, I'd like to fix that 🙂
will it be possible for us to configure the "New Run" button so that it always clones from a particular pipeline?
What do you mean by "particular pipeline"? By default it will clone the last successful one, and by right-clicking a specific one you can run a copy of that one. What am I missing?
However, the pipeline experiment is not visible in the project's experiment list.
I mean press on the "full details" in the pipeline page
I see, so in theory you could call add_step with a pipeline parameter (i.e. pipe.add_parameter etc.)
But currently the implementation is such that if you are starting the pipeline from the UI
(i.e. rerunning it with a different argument), the pipeline DAG is deserialized from the Pipeline Task (the idea being that one could control the entire DAG externally without changing the code)
I think a good idea would be to actually allow the pipeline class to have an argument saying "always create from code", and then use a pipeline argument to control that, wdyt?
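For reference, the pipe.add_parameter pattern mentioned above looks roughly like this (a sketch only; parameter, project and task names are placeholders):

```
from clearml import PipelineController

pipe = PipelineController(name="dynamic-n-steps", project="examples", version="1.0.0")

# pipeline-level parameter, editable from the UI when launching a new run
pipe.add_parameter(
    name="dataset_url",
    default="s3://bucket/data.csv",
    description="input passed to the step below",
)

pipe.add_step(
    name="process",
    base_task_project="examples",
    base_task_name="my processing task",
    # "${pipeline.dataset_url}" is resolved to the pipeline parameter at runtime
    parameter_override={"General/dataset_url": "${pipeline.dataset_url}"},
)
```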
(BTW: if you are using pipelines from decorated functions, then there is no real issue: just have a for loop inside the pipeline and run the same component function multiple times, only with different arguments)
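A minimal sketch of that decorator approach (the component body, names and config values are placeholders):

```
from clearml import PipelineDecorator

@PipelineDecorator.component(cache=False)
def train(config: dict):
    # placeholder body; in practice this would run the actual training logic
    print("training with", config)

@PipelineDecorator.pipeline(name="dynamic-n-steps", project="examples", version="1.0.0")
def run_pipeline(configs: list):
    # each call becomes its own pipeline step, one per config
    for cfg in configs:
        train(config=cfg)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # drop this line to enqueue the steps on agents
    run_pipeline(configs=[{"lr": 0.01}, {"lr": 0.001}, {"lr": 0.0001}])
```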
@<1523701205467926528:profile|AgitatedDove14> hi, will it be possible for us to configure the "New Run" button so that it always clones from a particular pipeline?
but when we try to do a "New Run" from the UI, it tries to follow the DAG of the previous run (the run with all child nodes skipped) and the new run fails too.
This is odd, is this reproducible? What's the clearml Python package version?