if this were possible we wouldn't need pip
now the question is how to make the production setup work
Hi AverageRabbit65, can you elaborate on what you're trying to do?
ClearML-Agent will automatically create a venv and install everything
the question is how ClearML knows to create the env, and which files it copies to /homes/yossi/.clearml/venvs-builds/3.7/
I think this is what you're looking for - https://clear.ml/docs/latest/docs/references/sdk/task#taskforce_requirements_env_freeze
I'm sorry. I think I wrote something wrong. I'll elaborate:
The SDK detects all the packages that are used during the run; the Agent will then create a venv and install those packages into it.
I think there is also an option to specify a requirements file directly in the agent.
Is there a reason you want to install packages from a requirements file instead of just using the automatic detection + agent?
Have you run experiments with the SDK? i.e. added Task.init()
if a == True:
    import torch
else:
    import tensorflow
And the pipeline runs with agents or locally?
and just making sure - by pipeline we're talking about the ClearML pipelines, correct?
https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller
the question is how ClearML knows to create the env, and which files it copies to the task
Either automatically detecting the packages in requirements.txt OR using the packages listed in the task itself
we have a different name for this file in the repo
the only way any computer could figure this out is by running it
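A tiny runnable illustration of that point, using stdlib modules in place of torch/tensorflow so it runs anywhere:

```python
import importlib

def load_backend(use_first: bool):
    # The imported package depends on a runtime value, so no static
    # scan of the source can know the dependency set in advance --
    # only actually running the code reveals which branch is taken.
    name = "json" if use_first else "csv"
    return importlib.import_module(name)

print(load_backend(True).__name__)   # -> json
```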
PIP Mode
By default, ClearML Agent works in PIP Mode, in which it uses pip as the package manager. When ClearML runs, it will create a virtual environment (or reuse an existing one, see Virtual Environment Reuse). Task dependencies (Python packages) will be installed in the virtual environment.
how does it know what the dependencies of a task are?
or how can I add some libraries I'd like it to pip install into the new environment?