On my page they don't appear next to each other. If the name of the table is the same between two experiments, only one of the two tables shows (even if the values are different)
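For reference, the tables are reported roughly like this (the project, task, and table names are placeholders and the DataFrame is dummy data):

import pandas as pd
from clearml import Task

task = Task.init(project_name='examples', task_name='table-demo')  # placeholder names

# Dummy table; in the real runs the values differ between experiments
df = pd.DataFrame({'metric': ['accuracy', 'loss'], 'value': [0.91, 0.23]})

# Tables sharing the same title/series across two experiments seem to
# collide in the comparison view, so distinct titles may be a workaround
task.get_logger().report_table(title='results', series='summary', iteration=0, table_plot=df)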
logs:
Successfully built numpy
Installing collected packages: numpy
Successfully installed numpy-1.23.5
WARNING: The directory '/Users/michaelresplandy/Library/Caches/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Collecting matplotlib==3.5.3
Downloading matplotlib-3.5.3.tar.gz (35.2 MB)
1678725812818 Ordinateur-portabl...
Hi! With
Task.add_requirements('../../requirements.txt')
called before Task.init
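For reference, a minimal sketch of that setup (project and task names are placeholders):

from clearml import Task

# Register the requirements file first; add_requirements only takes
# effect when called before Task.init
Task.add_requirements('../../requirements.txt')

task = Task.init(project_name='examples', task_name='requirements-demo')  # placeholder names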
Related to that issue, I'm not able to compare the tables of two experiments. Is it a known issue?
This is why I'm wondering if running the initial experiment locally in a venv is the reason ClearML is struggling to reproduce the experiment.
The experiment fails.
Yes, this is my scenario.
Basically I'm calling Task.force_requirements_env_freeze(requirements_file='requirements.txt') on a requirements file that I know works locally (if I set up a venv with that requirements file, the script runs).
But when I clone and rerun the experiment, ClearML isn't able to install the requirements (I checked, and the same version of Python is used).
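For context, the call is made before Task.init, roughly like this (project and task names are placeholders):

from clearml import Task

# Pin the remote environment to the exact contents of requirements.txt
# instead of letting ClearML auto-detect imported packages
Task.force_requirements_env_freeze(requirements_file='requirements.txt')

task = Task.init(project_name='examples', task_name='repro-test')  # placeholder names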
I have a script. Before running it, I set up a venv, install the libraries from requirements.txt, and then launch the script.
I then try to relaunch the experiment from the UI, but it keeps failing.
Would you have a tutorial to suggest to help me test the one-click reproducibility?
Thanks a lot, John! Let's see if this works better with Docker.