Hi EagerOtter28, welcome 🙂. A few comments:
agent.system_site_packages: true
will make ClearML at least use the packages already installed in the docker container's system Python, right?
Correct. The agent always creates a (minimal) venv, and this flag makes it use the packages already installed in the system Python instead of installing them again inside the venv.
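In clearml.conf it would look roughly like this (from memory, so the exact key path may differ between agent versions):
```
# Sketch of the agent section in clearml.conf - key path may differ by version.
agent {
    package_manager {
        # reuse packages already installed in the container's system Python
        # instead of reinstalling them inside the agent's venv
        system_site_packages: true
    }
}
```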
I know there is
clearml-agent build
- could this be used to build a general-purpose image, or just an image specific to one task?
It's meant as a way to build a self-contained image for a specific task
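For example, something along these lines (the task id and target image name are placeholders; check `clearml-agent build --help` for the exact flags in your agent version):
```bash
# Sketch: bake a single task's environment (and code) into a docker image.
# <task-id> and the target image name are placeholders.
clearml-agent build --id <task-id> --docker --target my-task-image:latest
```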
Running pre-built environments in an agent works with
clearml-agent daemon --docker --standalone-mode
, right? Is there any documentation on the requirements for standalone mode? I managed to reverse-engineer most of them, but I am stuck at a point where the agent cannot find the git repo to use inside the container.
Well, that's correct, although not mandatory. In standalone mode the agent will not try to fetch anything - no repo cloning, no requirements installation, etc. - it basically assumes everything (including the code) already exists in the image.
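So a typical standalone setup would be launched roughly like this (queue name and image name are just placeholders):
```bash
# Sketch: run the agent against a pre-built image that already contains the
# code, the venv and all requirements - nothing is cloned or installed.
clearml-agent daemon --queue my_prebuilt_queue \
    --docker my-prebuilt-image:latest \
    --standalone-mode
```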
Ah OK, thanks a lot for clarifying, SuccessfulKoala55! 🙂 Then I guess in our case we should just use our Dev image as the default image of the docker agents. For debugging it would be nice to avoid installing libraries and a minimal venv every time, but we do need the repo cloning, so I think we will not run in standalone mode.
For debugging, those 2-3 minutes of setup time are annoying, but for production use, where jobs run for hours or days, it does not matter so much I guess 🤔
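If we go that route, I guess it is just a matter of pointing the agents' default docker image at our Dev image in clearml.conf, something like this (key names from memory, they may differ a bit by version):
```
# Sketch: use the Dev image as the default for docker-mode agents.
# "our-registry/dev-image:latest" is a placeholder.
agent {
    default_docker {
        image: "our-registry/dev-image:latest"
    }
}
```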
EagerOtter28 I’m running into a similar situation to yours.
I think you could use --standalone-mode
and do the cloning yourself in the docker bash script that you can configure in the agent config.
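Something along these lines in the agent's clearml.conf (treat it as a sketch - I'm not 100% sure about the exact key name in every version):
```
# Sketch: clone the repo inside the container before the task starts.
# Repo URL, branch and target path are placeholders.
agent {
    extra_docker_shell_script: [
        "git clone --branch main https://github.com/our-org/our-repo.git /code",
    ]
}
```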
LazyTurkey38 OK thank you for sharing! 🙂 I'll have a look in a few days 👍