Hey @<1523701205467926528:profile|AgitatedDove14> , thanks for the reply!
We would like to avoid dockerizing all our repositories. And for the time being we have not used the decorators, but we can do that too.
The pipeline is instead built dynamically at the moment.
The issue is that the components do not have their dependencies. For example:
def step_one(...):
    from internal.repo import private
    # do stuff
When step_one is added as a component to the pipeline, it does not include “internal.repo” as a package dependency, so it crashes.
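For reference, this is roughly how we build it (just a sketch; the PipelineController calls are from memory, and internal.repo / private.process are placeholder names):

from clearml import PipelineController

def step_one(data_path):
    # the internal dependency is only imported inside the step
    from internal.repo import private  # our internal library, not on PyPI
    return private.process(data_path)  # placeholder call

pipe = PipelineController(name="example-pipeline", project="examples")
pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(data_path="/data/raw"),
)
pipe.start_locally(run_pipeline_steps_locally=False)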
How or why is this the issue?
The main issue is a missing requirement on the Task component, and this is why it is failing.
You can however manually specify the package (and I'm assuming this will solve the issue), but it should have been autodetected, no?
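Something like this, I assume (continuing the sketch above; the packages argument and the git URL are placeholders for our internal repo):

pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(data_path="/data/raw"),
    # manually pin the internal dependency that auto-detection missed;
    # the URL is a placeholder for our internal git repository
    packages=["internal-repo @ git+https://example.com/org/internal-repo.git"],
)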
There's no decorator, just e.g.
from typing import Any, Optional

def helper(foo: Optional[Any] = None):
    return foo

def step_one(...):
    # stuff
Then the type hints are not removed from helper and the code immediately crashes when being run
There's code that strips the type hints from the component function; I just think it should be applied to the helper functions too :)
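For completeness, the helper ends up being passed roughly like this (a sketch; I believe the argument is helper_functions on add_function_step, other names as in the sketch above):

pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(data_path="/data/raw"),
    # the helper's source is shipped alongside step_one; its typing
    # annotations are what the remote run currently trips over
    helper_functions=[helper],
)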
Pinging about this still, unresolved 🤔
ClearML does not capture our internal libraries and so our functions (pipeline steps) crash with missing modules.
So from foo.mod import "translates" to foo-mod @ git+None .. ?
I think this is the main issue. Is this reproducible? How can we test that?
… And it’s failing on typing hints for functions passed in pipe.add_function_step(…, helper_function=[…])
… I guess those aren’t being removed like the wrapped function step?
Then the type hints are not removed from helper and the code immediately crashes when being run
Oh yes I see your point, that does make sense (btw removing the type hints will solve the issue)
regardless let me make sure this is solved
Hi @<1523701083040387072:profile|UnevenDolphin73>
How can I ensure tasks in a pipeline have the same environment as the pipeline itself?
...
but the tasks (executed remotely) do not use that same environment?
Just verifying, we are talking about pipeline decorators?
We also wanted this; we preferred to create a docker image with all we need and let the pipeline steps use that docker image
You can specify the docker image on the decorator itself:
None
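Something along these lines (a sketch; assuming the docker argument of PipelineDecorator.component, and the image name is just an example):

from clearml import PipelineDecorator

@PipelineDecorator.component(
    return_values=["result"],
    docker="our-registry/pipeline-deps:latest",  # example image with internal.repo pre-installed
)
def step_one(data_path):
    from internal.repo import private  # resolved from inside the docker image
    return private.process(data_path)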
Regarding capturing the packages: if you import them inside the decorated function, they will be captured based on what is installed in the local (i.e. initial) environment. The idea is that the components are Not the same as the logic; basically, the pipeline logic should not have any real package requirements, only the components (actually doing something) should. What am I missing?
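To sketch that separation (illustrative names, decorator interface assumed):

from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["stats"])
def compute_stats(data_path):
    # heavy requirements are imported inside the component,
    # so only this step picks them up as dependencies
    import pandas as pd
    return pd.read_csv(data_path).describe().to_dict()

@PipelineDecorator.pipeline(name="example", project="examples", version="0.1")
def pipeline_logic(data_path="/data/raw.csv"):
    # the pipeline logic itself needs no real package requirements
    print(compute_stats(data_path))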
It is installed on the machine creating the pipeline.
I have no idea why it did not automatically detect it 😞