Yep, I'm thinking about it, but for security reasons we can't install Docker on our local PCs.
I used the nvcr.io PyTorch image and instructed ClearML to inherit the global dependencies. No need to install torch, and it works well.
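For context, a minimal sketch of the agent-side settings this refers to (the image tag is a placeholder and the exact values depend on your setup):
```
# clearml.conf (agent side) - illustrative sketch
agent {
    # run tasks inside the NGC PyTorch container
    default_docker: {
        image: "nvcr.io/nvidia/pytorch:<tag>-py3"
    }
    package_manager: {
        # let the venv created by the agent see the packages
        # already installed in the container (e.g. torch)
        system_site_packages: true
    }
}
```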
SpicyLion54 the ClearML agent will always create a venv - you can't provide your own venv, hence the best practice is using docker for that purpose 🙂
Alternatively, you can provide an extra_index_url to the agent so it will also look for packages on a different server (you can simply install your own pypi mirror) - see https://github.com/allegroai/clearml-agent/blob/742cbf57670815a80a0c502ef61da12521e1e71f/docs/clearml.conf#L66
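A minimal sketch of what that looks like in the agent's clearml.conf (the URL is a placeholder for your own mirror):
```
# clearml.conf (agent side)
agent {
    package_manager: {
        # additional pip repositories the agent searches,
        # on top of the default PyPI index
        extra_index_url: ["https://my-pypi-mirror.local/simple"]
    }
}
```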
SuccessfulKoala55 I install torch from a local directory by adding a line to the pip config, like find-links = file:///home/user/pypi. It works for pip install, and the clearml-agent looks in this dir (I can see it in the logs), but it doesn't find torch and raises the error: Could not find pytorch with cuda support.
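For reference, roughly what that pip configuration looks like (path as in the message above, assuming a standard pip.conf location):
```
# ~/.config/pip/pip.conf (or /etc/pip.conf)
[global]
# also look for packages (e.g. locally built torch wheels) in this directory
find-links = file:///home/user/pypi
```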
Hi SpicyLion54
the -f flag is not very stable for pip (and cannot be added in requirements.txt). The ClearML agent will automatically find the correct torch (from the torch repository) based on the CUDA version it detects at runtime.
This means it automatically translates torch==1.8.1 and will pull from the correct repo based on the torch support table.
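For illustration only (the exact wheel variant depends on the CUDA version the agent detects; this is an assumed example, not actual agent output):
```
# recorded in the Task's installed packages:
torch==1.8.1

# what the agent effectively resolves at runtime on a machine with CUDA 11.1:
torch==1.8.1+cu111   # pulled from the PyTorch wheel repository
```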