GiganticTurtle0 we had this discussion in the wrong thread, I moved it here.
Martin.B [1:55 PM]
GiganticTurtle0 the sample mock pipeline seems to be running perfectly on the latest code from GitHub, can you verify?
Martin.B [1:55 PM]
Spoke too soon, sorry 🙂 issue is reproducible, give me a minute here
Alejandro C [1:59 PM]
Oh, and which approach do you suggest to achieve the same goal (simultaneously running the same pipeline with different configurations using a single for loop)? (edited)
Alejandro C [2:00 PM]
Unless there is a straightforward way to support it...
Martin.B [2:01 PM]
So why wouldn't you have:
```python
@PipelineDecorator.component(return_values=["msg"], execution_queue="services", helper_functions=[step_one, ...., step_four])
def execute_orchestrator(config: dict):
    # ... stuff ...
    return str(msg)

@PipelineDecorator.pipeline(...)
def main_pipeline():
    PLAYERS_NAMES = ["Frank", "Alexander", "John"]
    PLAYERS_IDENTITIES = ["Renegade", "Observer", "Lazy"]
    for player_name, player_identity in zip(PLAYERS_NAMES, PLAYERS_IDENTITIES):
        print(f"Executing pipeline for {player_name}")
        config = dict()
        config["player_name"] = player_name
        config["player_identity"] = player_identity
        execute_orchestrator(config)
        print(f"Pipeline finished for {player_name}", end="\n\n")
```
(edited)
Alejandro C [2:10 PM]
Mmm, that is a very good alternative; this way I can leverage the newly introduced nested components. However, I think it would be reasonable (and natural) to concurrently run the same pipeline with different configurations. For example, suppose you have a single agent that asynchronously orchestrates all of them (so that it is not necessary to spin up an agent for each instance of the executed pipeline). I would be happy to have that feature if it were not overly complicated to implement (I mean concurrent pipelines; I know the "asynchronous agent" is already available through the `--services-mode` CLI option) (edited)
Martin.B [2:14 PM]
> Mmm that is a very good alternative, this way I can leverage the newly-introduced nested components.
It actually worked out of the box (well, almost: you have to cast the "msg" return value), this is so cool!
> However, I think it would be reasonable (plus natural) to concurrently run the same pipeline with different configurations.
A pipeline is a Task; the idea is you have one Task pipeline with the decorator, and that pipeline triggers the decorator pipelines with different arguments.
what do you think?
(It might be solvable to have nested decorated pipelines, but it would make the pipeline nested inside the pipeline function, which I'm not sure looks pretty ...) (edited)
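The "one controller task triggers many pipeline runs" pattern Martin describes can be sketched without any ClearML dependency. This is a toy, runnable stand-in: `pipeline_body` and `orchestrator` are hypothetical names, and each "launch" is a plain local call where the real system would clone and enqueue a pipeline Task with overridden parameters:

```python
# Toy stand-in for "one orchestrator triggers the same pipeline
# with different configurations". Names here are illustrative only.

def pipeline_body(config: dict) -> str:
    # Hypothetical pipeline: pretend the real steps run here.
    return f"{config['player_name']} is {config['player_identity']}"

def orchestrator(configs: list) -> list:
    results = []
    for config in configs:
        # In ClearML this would instead clone the pipeline Task,
        # override its parameters with `config`, and enqueue it.
        results.append(pipeline_body(config))
    return results

if __name__ == "__main__":
    configs = [
        {"player_name": "Frank", "player_identity": "Renegade"},
        {"player_name": "Alexander", "player_identity": "Observer"},
    ]
    print(orchestrator(configs))
```

The point of the shape is that the loop lives in one long-running orchestrator rather than in N separately launched pipelines, which is why a single services-mode agent can drive all the runs.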
Alejandro C [2:19 PM]
I see. And is it the same for the `PipelineController`? I mean, can I create several instances of `PipelineController`, store them in a list, and call the `start` method for each instance in a for loop? Would that work? Or does it follow the same rules as `PipelineDecorator`?
Martin.B [3:06 PM]
Hmm, `PipelineController` follows the same logic (a singleton), kind of like `Task.init`
What I'm thinking is something like this example:
https://github.com/allegroai/clearml/blob/0a543d55d0055c9499b8cefdf669135740de9ce6/examples/pipeline/pipeline_from_functions.py#L72
Where the function itself is a self-contained pipeline decorator, wdyt? Is this clean enough?