So, I have to package all the modules first and find a way to install that package at the beginning of pipeline execution to be able to use these modules, am I right?
What do you mean by "modules first and find a way to install that package"?
Are those modules already in wheels? Are they part of a git repository?
(the pipeline component can also start inside a git repository it clones)
Great ascii tree
GrittyKangaroo27 assuming you are doing:
```python
@PipelineDecorator.component(..., repo='.')
def my_component():
    ...
```
The function my_component will be running in the repository root, so in theory it could access package1/package2.
(I'm assuming here that directory "project" is the repository root.)
Does that make sense?
BTW: when you pass `repo='.'` to `@PipelineDecorator.component`, it takes the current repository that exists on the local machine running the pipeline logic.
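For reference, a minimal sketch of that pattern (the package/module names follow the directory tree further down the thread; `module1.run` and `module3.postprocess` are hypothetical functions):
```python
from clearml import PipelineDecorator

# repo='.' attaches the local repository to the component, so a remote
# worker clones it and executes the function from the repository root
@PipelineDecorator.component(repo='.')
def my_component(data):
    # absolute imports resolve here because the working directory is the repo root
    from package1 import module1
    from package2 import module3
    return module3.postprocess(module1.run(data))
```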
AgitatedDove14 Nice! I'll try this out
Hi GrittyKangaroo27
Is it possible to import user-defined modules when wrapping tasks/steps with functions and decorators?
Sure, any package (local included) can be imported, and will be automatically listed in the "installed packages" section of the pipeline component Task
(This of course assumes that on a remote machine you could do "pip install <package>".)
Make sense?
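As a sketch of the non-local case (assuming a pip-installable dependency such as pandas; the step body is illustrative):
```python
from clearml import PipelineDecorator

# the pandas import below is auto-detected and listed in the component
# Task's "installed packages"; packages=... pins the requirement explicitly
@PipelineDecorator.component(packages=["pandas"])
def load_table(csv_path):
    import pandas as pd  # imported inside the body so the step is self-contained
    return pd.read_csv(csv_path)
```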
AgitatedDove14 Sorry for the confusing question. I mean I cannot use relative imports inside the "wrapping" function.
In detail, my project has this directory structure:
```
└── project
    ├── package1
    │   ├── build_pipeline.py
    │   ├── module1.py
    │   └── module2.py
    └── package2
        ├── __init__.py
        ├── module3.py
        ├── module4.py
        └── subpackage1
            └── module5.py
```
From build_pipeline.py, inside each "wrapping" function, I cannot import module1 and module2 of package1, or the other modules in package2, the way I usually do; so I think this is a limitation of building pipelines from functions and decorators.
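To make the limitation concrete, this is roughly the pattern that fails (a sketch; the step body and `module1.clean` are hypothetical):
```python
# package1/build_pipeline.py: sketch of the import that breaks remotely
from clearml import PipelineDecorator

@PipelineDecorator.component()  # note: no repo='.'
def my_step(x):
    # the wrapped function is shipped as standalone code, so neither the
    # relative form `from . import module1` nor the sibling-package form
    # `from package2 import module3` resolves on the remote worker
    import module1  # ModuleNotFoundError when executed remotely
    return module1.clean(x)
```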
BTW, happy weekend!