clearml should detect the "main" packages used in the repository (not just in the main script). Their dependencies will be installed automatically by pip when the agent sets up the environment. Once the agent is done setting up the environment, it updates the Task with the full list of packages, including all required dependencies.
So if everything works, you should see the "my_package" package under "installed packages".
The assumption is that if you do: pip install "my_package"
it will list "pandas" as one of its dependencies, and pip will automatically pull pandas in as well.
That way we do not list the entire venv you are running in, just the packages/versions you are actually using, and we let pip sort out the dependencies when the agent installs the environment.
Make sense ?
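For example (just a sketch, assuming "my_package" is your own package; its name and metadata here are hypothetical), declaring pandas in the package's install_requires is what lets pip pull pandas in automatically when the agent installs my_package:

# setup.py of "my_package" (hypothetical example)
from setuptools import setup, find_packages

setup(
    name="my_package",
    version="0.1.0",
    packages=find_packages(),
    # pandas is declared as a dependency, so installing "my_package"
    # automatically installs pandas as well
    install_requires=["pandas"],
)

With that in place, the Task only needs "my_package" in "installed packages" and pip resolves the rest.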
Python 3.7.3 (default, Dec 20 2019, 18:57:59) [GCC 8.3.0]
absl_py == 0.10.0
azure_storage_blob == 12.7.1
clearml == 0.17.4
google_cloud_storage == 1.35.0
Detailed import analysis
**************************
IMPORT PACKAGE absl_py
generate_tfrecord_pipeline.py: 3
IMPORT PACKAGE azure_storage_blob
clearml.storage: 0
IMPORT PACKAGE clearml
generate_tfrecord_pipeline.py: 1
IMPORT PACKAGE google_cloud_storage
clearml.storage: 0
no, my_package is never added manually
Could you test with the latest "clearml": pip install git+
Task.add_requirements(".") should be supported now 🙂
What exactly do you get automatically on the "Installed Packages" (meaning the "my_package" line)?
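Something along these lines (a minimal sketch, assuming the local package's setup.py/pyproject.toml sits at the repository root so "." resolves to it; the project/task names are placeholders):

from clearml import Task

# register the local package itself as a requirement;
# note this must be called before Task.init()
Task.add_requirements(".")

task = Task.init(project_name="examples", task_name="tfrecord pipeline")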
and in dummy_module I have:
import pandas as pd
def func(args):
    pd.read_csv(args.file)
in this example my main script is going to be the one that creates the pipeline controller
the thing is that I have to manually add all imports of packages that don't appear in my main script
Hi AgitatedDove14, now I'm seeing both "." and "my_package" under INSTALLED PACKAGES... what could be the issue here?
In that case, when you create the Tasks for the steps, do not specify any packages/requirements; then the agent will just use the "requirements.txt" from the repository.
If you need to, you can also specify them when you create the Task itself, see https://github.com/allegroai/clearml/blob/912f6f5ba2328b26de042de03f02de5802df360f/clearml/task.py#L608
https://github.com/allegroai/clearml/blob/912f6f5ba2328b26de042de03f02de5802df360f/clearml/task.py#L609
for it to work in a remote worker
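For example (a rough sketch based on the packages argument in the linked lines; the repo URL, names and versions are placeholders):

from clearml import Task

# create the step Task with an explicit package list,
# instead of relying on automatic import detection
step_task = Task.create(
    project_name="examples",
    task_name="generate tfrecords step",
    repo="https://github.com/example-org/example-repo.git",
    packages=["pandas", "clearml"],
)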
MagnificentSeaurchin79 do you have the "." package listed under "installed packages" after you reset the Task ?
is "my_package" a local package ?
what is the output of: pip freeze | grep my_package
MagnificentSeaurchin79
"requirements.txt" is ignored if the Task has an "installed packges" section (i.e. not completely empty) Task.add_requirements('pandas') needs to be called before Task.init() (I'll make sure there is a warning if called after)
and what about those packages that are not being loaded because they don't appear in the main file?
I think that worked, because now I'm having a different issue... it says that it cannot import pandas... I have it both in my requirements.txt
and in task.add_requirements('pandas')
So the "packages" are the packages you need in the steps themselves ?
and then when running in agent mode, it fails because my_package can't be installed using pip... so I have to manually edit the section and remove the "my_package" entry
so in my main file I have:
from my_package import dummy_module
dummy_module.func(args)
MagnificentSeaurchin79 did you manually add both "." and my_package ?
If so, what was the reasoning to add my_package if pip cannot install it ?
it seems that I need to add it (import pandas) in the main file... even though I don't use it there...