Do we support GPUs in a) docker mode b) k8s glue?
yes on both
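A minimal sketch of GPU allocation in docker mode (queue name and image are assumptions; requires the NVIDIA container runtime on the host):

```shell
# Pin GPU 0 to this worker; tasks from "gpu_queue" run in the given container
# with that GPU exposed. In k8s glue, GPU requests are expressed via the pod
# template / resource limits instead of an agent flag.
clearml-agent daemon --queue gpu_queue --docker nvidia/cuda:11.0-base --gpus 0
```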
Is there a good reference to get started with k8s glue?
A few folks here have already set it up. Do you have a k8s cluster with GPU support?
Hi JealousParrot68
Spinning up the clearml-agent with docker support (i.e. each experiment runs inside its own container):
https://clear.ml/docs/latest/docs/clearml_agent#docker-mode
Basically you can specify a default docker image to use (per agent), and a specific docker container to use per Task (configured in the UI under Execution, at the bottom)
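A sketch of the two levels (the queue name and image are assumptions):

```shell
# Per-agent default: any task pulled from "default" that doesn't specify its
# own container runs in this image.
clearml-agent daemon --queue default --docker python:3.9
# A specific Task can override the default from the UI (Execution section),
# or programmatically via task.set_base_docker(...) in the SDK.
```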
So on the ec2 instance (with the agent running), just install prior to running the agent: apt-get install poppler-utils
Would that mean that if you are running 2-3 clearml agents for 2-3 projects, their environments would have to be such that each could run any of the 3 projects (each having different requirements)?
What is the pattern for starting an agent in a project-specific docker container based on the task? Would that be handled via the services queue? Or can you already configure that at the task level by providing a docker file?
Basic question - I am running the clearml agent on an Ubuntu ec2 machine. Does it use docker by default? I thought it uses docker only if I add the --docker flag?
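To illustrate the distinction (queue name assumed; this matches the docker-mode docs linked above):

```shell
# Without --docker the agent runs each task in a local virtualenv on the host.
clearml-agent daemon --queue default
# With --docker the agent launches each task inside its own container instead.
clearml-agent daemon --queue default --docker
```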
In the docker bash startup script: apt-get install poppler-utils
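A sketch of where that would go, assuming the agent.extra_docker_shell_script setting in clearml.conf (which, if I recall the config correctly, runs inside every task container on startup):

```
# clearml.conf on the agent machine - sketch
agent {
    extra_docker_shell_script: ["apt-get update", "apt-get install -y poppler-utils"]
}
```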
I guess this is an advantage of docker mode. Will try that out sometime as well.