Then the type hints are not removed from the helper, and the code immediately crashes when run
Oh yes I see your point, that does make sense (btw removing the type hints will solve the issue)
Regardless, let me make sure this is solved
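For illustration, a minimal sketch of the issue and the workaround (normalize, preprocess, and the file name are hypothetical): helpers passed via helper_functions are copied verbatim into the generated step script, so a hint like pd.DataFrame raises NameError there unless that name is imported; dropping the hints sidesteps it.

from clearml import PipelineController

pipe = PipelineController(name="demo", project="demo", version="0.1")

def normalize(df):
    # hints removed per the workaround above; an annotation such as
    # "df: pd.DataFrame" would raise NameError in the copied source
    return (df - df.mean()) / df.std()

def preprocess(path: str):
    import pandas as pd  # imported inside the step, so it is captured as a requirement
    return normalize(pd.read_csv(path))

pipe.add_function_step(
    name="preprocess",
    function=preprocess,            # per the thread, hints on the step function itself are handled
    function_kwargs=dict(path="data.csv"),
    helper_functions=[normalize],   # helper source is copied as-is
)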
And is this repo installed on the pipeline-creating machine?
Basically I'm asking how come it did not automatically detect it?
It is installed on the machine creating the pipeline.
I have no idea why it did not automatically detect it 😞
Not sure about this; we really like being in control of reproducibility and not depending on the invoking machine… maybe that’s not what you intend
Pinging about this still, unresolved 🤔
ClearML does not capture our internal libraries and so our functions (pipeline steps) crash with missing modules.
… And it’s failing on typing hints for functions passed in pipe.add_function_step(…, helper_functions=[…]) … I guess those aren’t being removed like the wrapped function step?
Hi @UnevenDolphin73
How can I ensure tasks in a pipeline have the same environment as the pipeline itself?
...
but the tasks (executed remotely) do not use that same environment?
Just verifying, we are talking about pipeline decorators?
We also wanted this; we preferred to create a Docker image with everything we need and let the pipeline steps use that image.
You can specify the docker on the decorator itself:
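A minimal sketch (the image name is a placeholder):

from clearml import PipelineDecorator

@PipelineDecorator.component(docker="my-registry/pipeline-base:1.0")
def train(df):
    # runs inside the pinned image, independent of the invoking machine
    ...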
Regarding capturing the packages: if you import them inside the decorated function, they will be captured based on what is installed in the local (i.e. initial) environment. The idea is that the components are not the same as the logic; the logic of the pipeline should not have any real package requirements, only the components (which actually do something) should. What am I missing?
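A minimal sketch of that separation (the step and file names are hypothetical):

from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["df"])
def load(path):
    import pandas as pd  # imported inside the component, so it is captured for this step
    return pd.read_csv(path)

@PipelineDecorator.pipeline(name="demo", project="demo", version="0.1")
def logic(path="data.csv"):
    df = load(path)  # plain control flow, no real package requirements here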
Well the individual tasks do not seem to have the expected environment.
Yes. Though again, just highlighting the naming of foo-mod is arbitrary. The actual module simply has a folder structured with an implicit namespace:
foo/
    mod/
        __init__.py
        # stuff
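(So, since foo/ itself has no __init__.py, this is a PEP 420 implicit namespace package, and the installed distribution name, foo-mod, need not match the import path:)

from foo.mod import something  # "something" is a hypothetical symbol; pip knows the distribution as foo-mod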
FWIW, for the time being I’m just setting the packages to all the packages the pipeline task sees, with:
packages = get_installed_pkgs_detail()  # returns a mapping whose values are (name, version) tuples
packages = [f"{name}=={version}" if version else name for name, version in packages.values()]
# prefer the requirements already captured on the task (pip, falling back to poetry)
packages = task.data.script.requirements.get('pip', task.data.script.requirements.get('poetry')) or packages
print(f"Task requirements:\n{packages}")
tmp_requirements_file = "tmp_reqs.txt"
with open(tmp_requirements_file, "w") as f:
    # the captured requirements may already be a single newline-joined string
    f.write("\n".join(packages) if isinstance(packages, list) else packages)
# ...
pipe.add_function_step(..., packages=tmp_requirements_file)
Yes. Though again, just highlighting the naming of foo-mod is arbitrary. The actual module simply has a folder structured with an implicit namespace (see the structure above).
Yep I think this is exactly why it fails detecting it, let me check that
And it’s failing on typing hints for functions passed in pipe.add_function_step(…, helper_functions=[…]) … I guess those aren’t being removed like the wrapped function step?
Can you provide the log? I think I'm missing what exactly was added into the decorator that somehow fails the Task creation
So from foo.mod import … “translates” to foo-mod @ git+None ..?
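(For reference, a pip direct reference per PEP 508 has exactly that shape; the URL below is a placeholder for the elided link:)

foo-mod @ git+https://example.com/org/repo.git@main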