@PipelineDecorator.component(repo="..")
The imports are not recognized - they are not on the pythonpath of the task that the agent starts.
RoughTiger69 add the imports inside the function itself; you can also specify them on the component: @PipelineDecorator.component(..., packages=["package", "package==1.2.3"])
or inside the function body:
@PipelineDecorator.component(...)
def my_component(...):
    import pandas as pd  # noqa
    ...
So I had to add it explicitly via a docker init script
Oh yes, that makes sense, can't think of a better hack other than sys.path.append(os.path.join(os.path.dirname(__file__), "src"))
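Spelled out, the hack above would look something like this at the top of the component function (assuming the component file lands at the repo root on the agent and the neighbouring modules live under ./src):

```python
import os
import sys

# Hypothetical in-component workaround: prepend the repo's src/ folder
# (relative to this file) so neighbouring-module imports resolve.
src_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "src")
if src_dir not in sys.path:
    sys.path.insert(0, src_dir)
```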
RoughTiger69 , can you please elaborate a bit on your use case?
and of course this solution forces me to do a git push for all the other dependent modules when creating the task…
AgitatedDove14 the emphasis is that the imports I am doing are not from external/pip packages, they are just neighbouring modules to the function I am importing. Imports that rely on pip-installed packages work well
they are just neighboring modules to the function I am importing.
So I think that if you specify the repo, on the remote machine you will end up with the code of the component sitting at the root folder of the repo; from there I assume you can import the rest, since the root git path should be part of your PYTHONPATH automatically.
wdyt?
AgitatedDove14
the root git path should be part of your PYTHONPATH automatically
That’s true, but it doesn’t respect the root package (sources root or whatever).
i.e. if all my packages are under /path/to/git/root/src/, they are not importable from the repo root.
So I had to add it explicitly via a docker init script…
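A toy reproduction of this mismatch, with a hypothetical layout: the git root is on sys.path (as the agent would set it up), but the module lives one level down under src/, so the import only works once src/ is added manually.

```python
import importlib
import os
import sys
import tempfile

# Hypothetical repo layout: <root>/src/my_module1.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "src"))
with open(os.path.join(root, "src", "my_module1.py"), "w") as f:
    f.write("def my_func1():\n    return 'ok'\n")

sys.path.insert(0, root)  # what the agent gives you automatically
try:
    importlib.import_module("my_module1")
    found_at_root = True
except ModuleNotFoundError:
    found_at_root = False  # fails: the module is one level down, in src/

sys.path.insert(0, os.path.join(root, "src"))  # the manual workaround
my_module1 = importlib.import_module("my_module1")
print(found_at_root, my_module1.my_func1())  # prints: False ok
```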
sure CostlyOstrich36
I have something like the following:
@PipelineDecorator.component(...)
def my_task(...):
    from my_module1 import my_func1
    from my_module2 import ...
my_module1 and 2 are modules that are a part of the same project source. they don’t come as a separate package.
Now when I run this in clearml, these imports don’t work.
These functions may require transitive imports of course, so the following doesn’t work: PipelineDecorator.component(helper_functions=[my_func1])
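A toy illustration of the transitive-import problem (simplified assumption about the mechanics, not ClearML internals): shipping only the helper's source is not enough when the helper itself imports a neighbouring module that never reaches the remote machine.

```python
# Hypothetical helper: its own body imports a neighbouring module
# (my_module2) that is not shipped along with the helper's source.
helper_source = '''
def my_func1():
    from my_module2 import my_func2  # transitive import, never shipped
    return my_func2()
'''

namespace = {}
exec(helper_source, namespace)
try:
    namespace["my_func1"]()
    transitive_ok = True
except ModuleNotFoundError:
    transitive_ok = False  # my_module2 does not exist on the remote side
print(transitive_ok)  # prints: False
```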
Even when I add the repo: @PipelineDecorator.component(repo="..")
The imports are not recognized - they are not on the pythonpath of the task that the agent starts.
My current workaround is to run it in a docker, and add an init script that does something like this:
export PYTHONPATH=${PYTHONPATH}:/root/.clearml/venvs-builds/task_repository/my_repo.git/src
exec "$@"