In the successful task, the installed packages list shows dgl_cu113 == 0.9.0.
I assume this is because I manually installed it in my local environment using pip install -f.
However, when trying to run hyperparameter tuning using the same job, it fails to install the package because it isn't available on PyPI.
Therefore I tried adding a requirements.txt file to the task, but it failed to parse with the error included above.
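For reference, a requirements.txt that uses --find-links looks something like this (a sketch of the pattern, not my exact file):
--find-links https://data.dgl.ai/wheels/repo.html
dgl-cu113==0.9.0
As far as I can tell, it's the --find-links line that the parser rejects.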
Maybe AnxiousSeal95 has some input 🙂
Hi DefeatedMoth52, the reason we don't support --find-links is that it isn't part of the requirements.txt standard (or so I'm told 😄).
What you can do instead is put the direct link to the wheel (something like https://data.dgl.ai/wheels/dgl-0.1.2-cp35-cp35m-macosx_10_6_x86_64.whl ) in the requirements.txt, and this should work. Makes sense?
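For example, the requirements.txt could look roughly like this (the wheel filename below is just an illustration, pick the one matching your Python/CUDA combo from https://data.dgl.ai/wheels/repo.html ):
# direct wheel URL instead of a package name plus --find-links
# (hypothetical filename, check the DGL wheels page for the right one)
https://data.dgl.ai/wheels/dgl_cu113-0.9.0-cp38-cp38-manylinux1_x86_64.whl
Any other dependencies can stay as regular name==version lines in the same file.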
CostlyOstrich36 I used it in my local environment to install the package. In the ClearML Web UI, I can see that the correct version was used for the task.
Looking at the source https://github.com/allegroai/clearml-agent/blob/9006c2d28f1fbaa42272473f23d67999cf56ab25/clearml_agent/external/requirements_parser/parser.py#L46 , it seems this is intentionally unsupported. Are there plans to change this in the future?
Can you give an example of how you installed it using --find-links (or which command used it)?
pip install dgl-cu113 -f https://data.dgl.ai/wheels/repo.html
Hi DefeatedMoth52, where have you been using the --find-links flag? When you run the experiment, how does the package show up in the ClearML UI?