I see — you can do that manually with add_step, i.e.

for elem in map:
    pipeline.add_step(..., elem)
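To make the shape of that loop concrete, here is a runnable toy sketch — the Pipeline class below is a stand-in I made up for illustration, not the real ClearML PipelineController, and the `name`/`parameter` arguments are assumptions:

```python
# Toy stand-in for a pipeline controller -- NOT the real ClearML API.
# It only records steps, to show the "one step per element" loop.
class Pipeline:
    def __init__(self):
        self.steps = []

    def add_step(self, name, parameter):
        # Each call registers one independent step with its own parameter.
        self.steps.append((name, parameter))

pipeline = Pipeline()
for elem in ["A", "B", "C"]:
    pipeline.add_step(name=f"process_{elem}", parameter=elem)

print(pipeline.steps)
```

Each element gets its own step, so the resulting DAG has one branch per input rather than a single linear chain.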
or you can do that with full logic:

@PipelineDecorator.component(...)
def square_num(num):
    return num**2

@PipelineDecorator.pipeline(...)
def map_flow(nums):
    res = []
    # These calls return immediately, so this will run in parallel
    for num in nums:
        res.append(square_num(num))
    # this is where we actually wait for the results
    for r in res:
        print(r)

map_flow([1, 2, 3, 5, 8, 13])
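Conceptually, the decorator version fans the square_num calls out and only blocks when you read the results. A plain-Python sketch of that same fan-out/fan-in pattern using only the stdlib (concurrent.futures here, not ClearML — the futures are an analogy for the lazy results the pipeline returns):

```python
from concurrent.futures import ThreadPoolExecutor

def square_num(num):
    return num ** 2

def map_flow(nums):
    with ThreadPoolExecutor() as pool:
        # Fan out: submit() returns a future immediately, one per element
        futures = [pool.submit(square_num, n) for n in nums]
        # Fan in: result() blocks until each computation finishes
        return [f.result() for f in futures]

print(map_flow([1, 2, 3, 5, 8, 13]))  # [1, 4, 9, 25, 64, 169]
```

The loop that appends never waits; all the waiting happens in the second pass when the results are consumed, which is what makes the branches independent.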
cool, thanks! the first one was what I had thought of but seemed unpythonic, so I'll give the second a shot
I could just loop through and create separate pipelines with different parameters, but that seems inefficient. Hyperparameter optimization might actually work in this case using grid search, but that feels like kind of a hack
those look like linear DAGs to me, but maybe I'm missing something. I'm thinking something like the map operator in Prefect where I can provide an array of ["A", "B", "C"]
and run the steps outlined with dotted lines independently, with each of those as arguments