Hi GrievingTurkey78
If you'd like to have the same environment in trains-agent, you can use the detect_with_pip_freeze option in your ~/trains.conf file on your local machine.
Just change detect_with_pip_freeze: true
( https://github.com/allegroai/trains/blob/master/docs/trains.conf#L168 is an example)
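For reference, a minimal sketch of what that part of ~/trains.conf could look like (assuming the key sits under the sdk.development section, as in the linked example; the rest of the file stays unchanged):

    sdk {
        development {
            # store the full `pip freeze` output as the task's "installed packages"
            # instead of the pigar-based import analysis
            detect_with_pip_freeze: true
        }
    }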
TimelyPenguin76 I found out it's just one package that is causing the error (cloudpickle breaks everything). Is there a way to use Pigar but force a single package to have a version?
GrievingTurkey78 all the packages should have a version, do you have some without?
No, I have all the packages with a version. I just want to know if there is a way to override the requirements versions detected by Pigar when using detect_with_pip_freeze: false. Locally I have cloudpickle==1.4.1, but when running the code and sending the task to the node the environment uses cloudpickle==1.6.0, so I have to manually change the version in the UI. Is there a way to force this single package to have a version? Maybe in the requirements.txt or something similar
Using detect_with_pip_freeze: true runs into "package version not found" for some of the packages I have installed locally.
Which one? If the package is located in additional artifact repositories, you can add it
That's really cool! But I would still prefer to avoid using pip freeze, is there a way?
Basically we can use either Pigar or pip freeze to get the packages & versions (plus you can change them and create a template in the UI). What is the specific scenario you have? Maybe we can think about another solution
Pigar is capturing different versions than the ones I have installed on my local machine (not a problem except for one). I just want to force the version of that package in a way that I don't have to manually change it from the UI for every experiment.
Hi GrievingTurkey78
How are you getting a different version than what is used at runtime? It analyzes the PYTHONPATH just as Python does. How can I reproduce it? Currently you can use Task.add_requirements(package_name, package_version=None)
This will not force it though, it is a recommendation (if it fails to find the package itself). Maybe we can add a force flag?! What do you think?
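For illustration, a minimal sketch of how that call could be used (the project and task names here are placeholders; pinning cloudpickle==1.4.1 matches the version mentioned earlier):

    from trains import Task

    # Recommend a specific version of the derivative package *before* Task.init,
    # so the detected "installed packages" list carries this pin instead of
    # whatever version gets resolved on the agent.
    Task.add_requirements("cloudpickle", "1.4.1")

    task = Task.init(project_name="examples", task_name="pinned cloudpickle")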
AgitatedDove14 I am not sure why the packages get different versions, maybe since the package is not directly imported in my code it is possible to get a different version from what I have locally (?). Should all the library versions match exactly between local and the code that runs in the agent? The Task.add_requirements(package_name, package_version=None) workaround works perfectly! I just add the previous version that doesn't break the code. Yes, definitely a force flag could help, or using the requirements.txt as the rule of thumb for packages and versions.
GrievingTurkey78
maybe since the package is not directly imported in my code it is possible to get a different version to what I have locally (?).
If these are derivative packages (i.e. imported by other packages) they are not automatically logged when executing the Task manually (in order to keep the "installed packages" as lean as possible on the one hand, while still specifying the packages that matter to you).
That said, when the trains-agent executes the Task it will store back the entire venv it created, including the derivative packages.
This might cause the difference in package version, as the trains-agent's pip install will pick the latest derivative packages (within the restrictions imposed by the selected ones)
The Task.add_requirements(package_name, package_version=None) workaround works perfectly!
Great!
Yes, definitely a force flag could help or using the requirements.txt as the rule of thumb for packages and versions.
If you need, you can tell trains to store your entire venv:
set detect_with_pip_freeze: true in trains.conf
https://github.com/allegroai/trains/blob/b2c830f34e0df1cf127c16526a93523c1da66cb4/docs/trains.conf#L169
If it helps, the next trains-agent will have an additional flag to ignore the "installed packages" section and only use the "requirements.txt" from the repo
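In that case the pin could live in the repo's requirements.txt itself, for example (illustrative, matching the version discussed above):

    cloudpickle==1.4.1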