Hi!
I am setting up a few ClearML agents to run on a local GPU server. They have to run in their own Docker containers, since we're not allowed to install a Python runtime on the server directly. I got one agent running listening to the default queue, bu
Hi StrongHorse8,
Yes, each ClearML agent can listen to a different queue and use a specific GPU. You can view all the use cases and examples at this link: https://clear.ml/docs/latest/docs/clearml_agent/#allocating-resources
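As a rough sketch of what that setup could look like: each `clearml-agent daemon` runs in Docker mode, bound to its own queue and GPU via `--queue` and `--gpus`. The queue names and the Docker image below are placeholders, assuming queues named `gpu0_queue`/`gpu1_queue` already exist on your ClearML server.

```shell
# Agent 1: serves gpu0_queue, pinned to GPU 0, tasks run inside Docker
clearml-agent daemon --detached --queue gpu0_queue --gpus 0 \
    --docker nvidia/cuda:11.8.0-runtime-ubuntu22.04

# Agent 2: serves gpu1_queue, pinned to GPU 1, same image
clearml-agent daemon --detached --queue gpu1_queue --gpus 1 \
    --docker nvidia/cuda:11.8.0-runtime-ubuntu22.04
```

With `--docker`, the agent spins up a container per task, so nothing beyond `clearml-agent` itself needs to be installed on the host; `--detached` keeps each daemon running in the background.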