So, I have to package all the modules first and find a way to install that package at the beginning of pipeline execution to be able to use these modules, am I right?
Hi GrittyKangaroo27
Is it possible to import user-defined modules when wrapping tasks/steps with functions and decorators?
Sure, any package (local included) can be imported, and will be automatically listed in the "installed packages" section of the pipeline component Task
(This of course assumes that on the remote machine you could do the "pip install <package>")
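For instance, a minimal sketch, assuming "my_local_pkg" is pip-installable on the remote machine (e.g. built into a wheel or published to a private index); the package and function names are hypothetical:

from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=['result'], packages=['my_local_pkg'])
def my_step():
    # Each component runs as a standalone Task, so the import happens
    # inside the function body; the agent pip-installs 'my_local_pkg'
    # on the remote machine before executing it.
    from my_local_pkg import helper  # hypothetical module/function
    return helper()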
Make sense?
AgitatedDove14 Sorry for the confusing question. I mean I cannot use relative imports inside the “wrapping” function.
In detail, my project has this directory structure:
└── project
    ├── package1
    │   ├── build_pipeline.py
    │   ├── module1.py
    │   └── module2.py
    └── package2
        ├── __init__.py
        ├── module3.py
        ├── module4.py
        └── subpackage1
            └── module5.py
From build_pipeline.py, inside each “wrapping” function, I cannot import module1 and module2 of package1, or the other modules in package2, as I usually do; so I think this is a limitation of building pipelines from functions and decorators.
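For example, this is the pattern that fails for me (a sketch using the module names from the tree above; the called functions are hypothetical):

from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=['out'])
def my_step():
    # These imports resolve when build_pipeline.py runs directly, but the
    # component executes as a standalone script on the agent, so the
    # sibling modules are not on its import path:
    import module1                      # sibling of build_pipeline.py
    from package2 import module3        # package2 lives next to package1
    return module3.run(module1.load())  # hypothetical functions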
BTW, happy weekend!
What do you mean by "modules first and find a way to install that package" ?
Are those modules already in wheels? Are they part of a git repository?
(the pipeline component can also start inside a git repository it clones)
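For example, a sketch assuming the repo / repo_branch arguments of the decorator; the URL and branch are placeholders:

from clearml import PipelineDecorator

@PipelineDecorator.component(
    return_values=['out'],
    repo='https://github.com/<org>/<repo>.git',  # placeholder URL
    repo_branch='main',
)
def my_step():
    # The agent clones the repository before running the component,
    # so modules inside the repo are available to the function body.
    ...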
Great ascii tree 🙂
GrittyKangaroo27 assuming you are doing:
@PipelineDecorator.component(..., repo='.')
def my_component():
    ...
The function my_component will be running in the repository root, so in theory it could access package1 / package2
(I'm assuming here directory "project" is the repository root)
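Something along these lines (a sketch; the imports assume "project" is the repo root, and the called functions are hypothetical):

from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=['out'], repo='.')
def my_component():
    # With repo='.' the component clones the repository and runs from
    # its root, so absolute imports of both packages should resolve
    # (package1 may need its own __init__.py, or rely on namespace packages):
    from package1 import module1, module2
    from package2.subpackage1 import module5
    return module5.run(module1.load())  # hypothetical functions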
Does that make sense?
BTW: when you pass repo='.' to @PipelineDecorator.component, it takes the current repository that exists on the local machine running the pipeline logic