We’d be happy if ClearML captured that (since it’s installed via e.g. pip, we’d then have the git + commit hash for reproducibility), as it claims it would 😅
Any thoughts CostlyOstrich36 ?
There's no decorator, just e.g.
from typing import Any, Optional

def helper(foo: Optional[Any] = None):
    return foo

def step_one(...):
    # stuff
Then the type hints are not removed from helper and the code immediately crashes when being run
Oh yes I see your point, that does make sense (btw removing the type hints will solve the issue)
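For reference, a minimal sketch of that workaround (the same helper as above, with the annotations dropped):
def helper(foo=None):
    # No type hints, so nothing is left to trip over when the step's
    # source is extracted and run remotely.
    return foo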
regardless let me make sure this is solved
Yes. Though again, just highlighting that the naming of foo-mod is arbitrary. The actual module simply lives in a folder structured as an implicit namespace package:
foo/
    mod/
        __init__.py
        # stuff
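For illustration, a step would then import it like any regular package (a sketch reusing the placeholder names above):
# "foo" is an implicit namespace package (no top-level __init__.py),
# while "foo.mod" is a regular package that has one.
from foo import mod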
FWIW, for the time being I’m just setting the packages to all the packages the pipeline task sees, with:
# get_installed_pkgs_detail comes from ClearML's vendored pigar utilities
# (import path assumed):
from clearml.utilities.pigar.reqs import get_installed_pkgs_detail

# Everything installed locally, pinned as "name==version" where possible.
packages = get_installed_pkgs_detail()
packages = [f"{name}=={version}" if version else name for name, version in packages.values()]
# Prefer whatever requirements ClearML already recorded on the task
# (pip, then poetry), falling back to the full local environment.
packages = task.data.script.requirements.get('pip', task.data.script.requirements.get('poetry')) or packages
print(f"Task requirements:\n{packages}")
tmp_requirements_file = "tmp_reqs.txt"
with open(tmp_requirements_file, "w") as f:
    f.write("\n".join(packages) if isinstance(packages, list) else packages)
# ...
pipe.add_function_step(..., packages=tmp_requirements_file)
Yep, I think the implicit namespace is exactly why it fails to detect it, let me check that
And it’s failing on type hints for helper functions passed in via pipe.add_function_step(…, helper_function=[…]) … I guess those aren’t being stripped the way they are for the wrapped step function?
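A sketch of what that call might look like, assuming the PipelineController API (the parameter appears to be helper_functions in current ClearML; names reused from the snippet above):
pipe.add_function_step(
    name="step_one",
    function=step_one,
    # The annotated helper gets serialized alongside the step; its type
    # hints apparently aren't stripped, hence the crash at runtime.
    helper_functions=[helper],
)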
Can you provide the log? I think I'm missing what exactly was added into the decorator that somehow fails the Task creation
I think this is the main issue. Is this reproducible? How can we test that?
Hi UnevenDolphin73, when you say "the pipeline itself", do you mean the controller? The controller is only in charge of handling the components. Let's say you have a pipeline with many parts: if you have a global environment, it will force a lot of redundant installations throughout the pipeline. What is your use case?
Well, the individual tasks do not seem to have the expected environment.
We have an internal mono-repo and some of its packages are required - they’re all available correctly to the controller, and only some are required by the individual tasks, but the “magic” doesn’t happen 😞
That is, the controller does not identify them as requirements, so they’re not installed in the tasks’ environment.
We also wanted this; we preferred to create a Docker image with everything we need, and let the pipeline steps use that image
That way you don’t rely on ClearML capturing the local env, and you can control exactly what exists in the env
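A minimal sketch of that setup, assuming a PipelineController-based pipeline and a pre-built image (the image name is hypothetical):
pipe.add_function_step(
    name="step_one",
    function=step_one,
    # Pin the step to an image that already contains the mono-repo
    # packages, instead of relying on automatic package detection.
    docker="my-registry/monorepo-env:latest",
)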
It’s just that for the packages argument, ClearML says:
If not provided, packages are automatically added based on the imports used inside the wrapped function.
So… 🤔
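For completeness, a sketch of the explicit override for when auto-detection misses internal packages (the names and git URL are hypothetical; packages also accepts a requirements.txt path, as in the snippet above):
pipe.add_function_step(
    name="step_one",
    function=step_one,
    packages=[
        "pandas==1.5.3",  # pinned public dependency (hypothetical)
        "git+ssh://git@github.com/org/internal-lib.git@main",  # hypothetical internal package
    ],
)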
Pinging about this still, unresolved 🤔
ClearML does not capture our internal libraries and so our functions (pipeline steps) crash with missing modules.