The agent does not auto-refresh the configuration. After a conf file change you should restart the agent; it will then print the new configuration when it loads.
Hi @<1523701168822292480:profile|ExuberantBat52>
I am trying to execute a pipeline remotely,
How are you creating your pipeline? And are you referring to an issue with the pipeline logic, or is it a component that needs that repo installed?
The extra_index_url is not even showing..
I would just add git+None to your requirements (either in the requirements.txt or, even better, as part of the pipeline/component where you also specify the repo to be used).
The agent will automatically push the credentials when it installs the repo as a wheel.
wdyt?
btw: you might also get away with adding -e . to the requirements.txt (but you will need to test that one)
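As an illustrative sketch of that suggestion (the URL below is a placeholder for the redacted link above, not the real repo):

```text
# requirements.txt
# hypothetical private repo URL -- replace with your own
git+https://github.com/your-org/ap_pipeline.git

# or, the untested alternative: install the current repo as an editable package
-e .
```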
Hi @<1523701205467926528:profile|AgitatedDove14> ,
Thank you for your prompt response.
I am using the functional pipeline API to create the steps, where each step calls a function. My functions are stored in files under the ap_pipeline directory (filters.py, features.py, etc.). These are packaged as part of this repo.
The modules are imported inside clearml_pipeline.py, so it would look something like:
from ap_pipeline.features import func1, func2 ....
This works locally since ap_pipeline is installed using pip install -e ., which installs the repo as an editable install.
The pipeline installs all the dependencies except ap_pipeline (this repo), which then causes the pipeline to fail, as it says that it can't find the module ap_pipeline.
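For reference, a minimal sketch of that kind of setup with the functional (decorator) pipeline API; the step names, function names, and repo URL are illustrative assumptions, and running it requires a working ClearML setup:

```python
from clearml import PipelineDecorator

# Listing the repo in the component's packages lets the agent install
# ap_pipeline remotely, instead of relying on the local editable install.
@PipelineDecorator.component(
    return_values=["features"],
    packages=["git+https://github.com/your-org/ap_pipeline.git"],  # placeholder URL
)
def feature_step(data_path):
    # imported inside the step, resolved once ap_pipeline is installed
    from ap_pipeline.features import func1, func2
    return func2(func1(data_path))

@PipelineDecorator.pipeline(name="ap_pipeline", project="examples", version="0.0.1")
def run(data_path):
    return feature_step(data_path)
```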
The second problem that I am running into now, is that one of the dependencies in the package is actually hosted in a private repo.
Add your private repo to the extra index section in the clearml.conf:
None
@<1523701205467926528:profile|AgitatedDove14> So I was able to get it to pull the package by defining packages=None
The second problem that I am running into now, is that one of the dependencies in the package is actually hosted in a private repo.
I tried getting around it by defining the environment variable PIP_INDEX_URL and passing it using log_os_environments in the clearml.conf, and I am now getting this message:
md-ap-feature-engineering/.venv/lib/python3.11/site-packages/clearml_agent/external/requirements_parser/parser.py:49: UserWarning: Private repos not supported. Skipping.
warnings.warn('Private repos not supported. Skipping.')
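As a side note on that warning: a common pattern with pip-based tooling is to supply the authenticated private index via extra_index_url rather than as a git+ requirement, embedding the credentials in the URL. A hedged sketch of what that could look like in the clearml.conf on the worker (host, user, and token are placeholders):

```text
agent {
    package_manager: {
        type: pip,
        # placeholder URL -- substitute your real index host and credentials
        extra_index_url: ["https://user:token@pypi.example.com/simple"],
    },
}
```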
Thanks @<1523701205467926528:profile|AgitatedDove14> restarting the agents did the trick!
How do you handle private repos in clearml for packages?
I added the following to the clearml.conf file
The conf file that is on the worker machine?
I set my local laptop up as an agent for testing purposes. I run the code on my laptop, it gets sent to the server, which sends it back to my laptop. So the conf file is technically on the worker, right?
I added the following to the clearml.conf file:
agent {
package_manager: {
# supported options: pip, conda, poetry
type: pip,
extra_index_url: ["my_url"],
},
}
For some reason the changes were not reflected, here are the logs from the agent:
agent.package_manager.type = pip
agent.package_manager.pip_version.0 = <20.2 ; python_version < '3.10'
agent.package_manager.pip_version.1 = <22.3 ; python_version >= '3.10'
agent.package_manager.system_site_packages = false
agent.package_manager.force_upgrade = false
agent.package_manager.conda_channels.0 = pytorch
agent.package_manager.conda_channels.1 = conda-forge
agent.package_manager.conda_channels.2 = defaults
agent.package_manager.priority_optional_packages.0 = pygobject
agent.package_manager.torch_nightly = false
agent.package_manager.poetry_files_from_repo_working_dir = false