irrespective of what I actually have installed when running the script?
Ok, I did a pip install -r requirements.txt and NOW it picks them up correctly
So packages have to be installed and not just be mentioned in requirements / imported?
Yes, it looks for them locally so it has all the specific versions you need.
If the "installed packages" section is totally empty, the agent will revert to looking for requirements.txt inside the repository.
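To illustrate what "looks for them locally" means, here is a minimal plain-Python sketch (not part of the ClearML API; numpy and pandas are just placeholder package names): the version that can be recorded for a detected import has to come from a package actually installed in the environment running the script.

```python
# Illustration only: a pinned version for a detected import can only come from a
# package that is actually installed in the interpreter running the script.
import importlib.metadata

for pkg in ("numpy", "pandas"):  # hypothetical packages your script imports
    try:
        # This is the kind of pinned entry that ends up in "installed packages"
        print(f"{pkg}=={importlib.metadata.version(pkg)}")
    except importlib.metadata.PackageNotFoundError:
        print(f"{pkg}: not installed locally, so no version can be recorded")
```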
Is it not possible to say just look at my requirements.txt file and the imports in the script?
I think there is a GitHub Issue for this feature
(basically the issue is, requirements.txt files are very often not updated and have no real version lock, so replicating a working env is always safer)
I am doing Task.init but it's not adding the expected libraries imported in the script or from requirements.txt
Does ClearML expect them to be actually installed in order to add them as installed packages for a task?
It should add itself to the list (assuming you will end up calling Task.init in your code)
It analyses the script code itself, going over all imports and adding only the directly imported packages
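As a sketch of what that analysis produces (the project/task names and the numpy/pandas imports below are made-up placeholders, assumed to be installed locally): only packages imported directly in the file get listed, with the versions found in the local environment; their transitive dependencies do not.

```python
from clearml import Task

import numpy as np    # direct import -> listed as a requirement
import pandas as pd   # direct import -> listed as a requirement
# packages pulled in only as dependencies of numpy/pandas are not listed

task = Task.init(project_name="demo", task_name="import-analysis-sketch")

print(pd.DataFrame({"x": np.arange(3)}))
```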
To confirm: if I have a fresh venv with no dependencies installed except clearml,
I have a requirements.txt file in the repo root, and a script at scripts/script1.py
script1.py does Task.init(), execute_remotely(), and then imports a few dependencies
Now I run python scripts/script1.py
And it should pick up the installed packages correctly?
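For reference, a minimal sketch of the scripts/script1.py layout described above (project, task, and queue names are placeholders). The import analysis will see the imports either way, but, as noted earlier in the thread, the pinned versions come from what is installed in the local venv, which is why running pip install -r requirements.txt first made them show up correctly.

```python
# scripts/script1.py -- minimal sketch of the setup described above
from clearml import Task

task = Task.init(project_name="demo", task_name="script1")
task.execute_remotely(queue_name="default")  # queue name is just a placeholder

# These imports are what the script analysis picks up; the versions recorded
# for them come from what is installed in the local venv.
import numpy as np
import pandas as pd

print(np.__version__, pd.__version__)
```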