Also, if you check the logs my package is actually built at step 4:
2023-05-03 10:07:58
Building wheels for collected packages: softgroup
Building wheel for softgroup (setup.py) ...
2023-05-03 10:08:14
2023-05-03 10:08:19
Looks like the -e flag is ignored. But it should work either way 🤔
I need to pip-install the package because I need to build some CUDA extensions
I don't think -e . will work when running from the agent context
Is there any way I can do something equivalent to -e . in the agent context?
In what order does the agent do things?
I assumed it was
- Start the docker container
- Run the docker setup bash script
- Pull the repo, check out the commit, apply changes
- Install pip requirements

In this case, I wouldn't have the correct version of the repo at the time the setup bash script runs.
Can't you do that in the docker bash script?
I attached three logs:
- local_console_output: how I set up my local task. Important commands: apt install, which installs the same dependencies that are in the docker_setup_bash_script; and pip install -r requirements.txt
- local_task_output: clearml experiment console log. The error "the following arguments are required: config" is the expected behavior
- remote_task_output: clearml experiment console log obtained when I clone the local task and enqueue it for remote execution. Notice that the behavior is different: I get
ImportError: cannot import name 'ops' from 'softgroup.ops' (/root/.clearml/venvs-builds/3.7/task_repository/SoftGroup.git/softgroup/ops/__init__.py)
Not sure if I can because of some proprietary stuff in the code. But I'll try writing a minimal working example on Monday!
Basically: locally, when I run pip install -r requirements.txt, the softgroup.ops package is installed correctly. But not on the remote worker.
I install the softgroup.ops package via the last line in requirements.txt, i.e. pip install -e .
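For context, the requirements file described here would look roughly like this sketch; the dependency names above the last line are placeholders, only the final -e . line is from the thread:

```text
# requirements.txt (sketch; pinned dependencies are hypothetical)
torch
spconv
-e .
```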
Hi @<1556450111259676672:profile|PlainSeaurchin97> , can you share the full log and an example of how the requirements file looks?
Looks like it was a python thing, not a clearml thing!
ClearML correctly installs the . from requirements.txt, but the project from the working directory was conflicting with the installed package, so Python couldn't find the compiled extension.
With some small changes to my repo, everything works
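A minimal sketch of that shadowing problem (with a hypothetical package name, mypkg, standing in for softgroup): when the working directory contains the source tree, it sits first on sys.path and wins the import over the pip-installed copy that actually contains the compiled extension.

```python
import pathlib
import sys
import tempfile

# Build a fake "source checkout": a package directory with no compiled extension.
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "mypkg").mkdir()
(tmp / "mypkg" / "__init__.py").write_text("HAS_EXTENSION = False\n")

# Running Python from the repo's working directory effectively does this:
sys.path.insert(0, str(tmp))

import mypkg  # resolves to the shadow copy, not the installed wheel

print(mypkg.__file__)        # points into tmp, i.e. the source tree
print(mypkg.HAS_EXTENSION)   # False: the compiled extension is "missing"
```

Because the shadow copy lacks the .so built during pip install, imports of the extension module fail with exactly the kind of ImportError shown above.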