Answered
Hi! I am setting up a few ClearML agents to run on a local GPU server. They have to run in their own Docker containers, since we're not allowed to install a Python runtime on the server directly. I got one agent running listening to the default queue, but...

Hi!

I am setting up a few ClearML agents to run on a local GPU server. They have to run in their own Docker containers, since we're not allowed to install a Python runtime on the server directly. I got one agent running and listening to the default queue, but I would like to start up two of them on different queues, targeting two separate GPUs.

Is there an easy way to configure the allegroai/clearml-agent Docker container to use GPU 0 or 1 and listen to a queue other than the default?

  
  
Posted 2 years ago

Answers 10


Thanks StrongHorse8
Where do you think would be a good place to put a more advanced setup? Maybe we should add an entry for DevOps? Wdyt?

  
  
Posted 2 years ago

For future reference, there's actually an easier way.

The entrypoint of the Docker container accepts CLEARML_AGENT_EXTRA_ARGS. So adding CLEARML_AGENT_EXTRA_ARGS=--queue new_queue_name --create-queue to your environment lets it work with the default clearml-agent image.

Unfortunately, it's nowhere to be found in the documentation, but you can see it in the repository: https://github.com/allegroai/clearml-agent/blob/master/docker/agent/entrypoint.sh
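
For anyone finding this later, here is a rough sketch of what that could look like with plain docker run. The queue names, server address and credential values are placeholders, and pinning a GPU this way assumes the NVIDIA container toolkit is set up on the host:

# Agent 1: pin GPU 0 and serve its own queue
docker run -d --name clearml-agent-gpu0 \
  --gpus '"device=0"' \
  -e CLEARML_API_HOST=http://your-clearml-server:8008 \
  -e CLEARML_API_ACCESS_KEY=<access_key> \
  -e CLEARML_API_SECRET_KEY=<secret_key> \
  -e CLEARML_AGENT_EXTRA_ARGS="--queue gpu0_queue --create-queue" \
  allegroai/clearml-agent

# Agent 2: pin GPU 1 and serve a second queue
docker run -d --name clearml-agent-gpu1 \
  --gpus '"device=1"' \
  -e CLEARML_API_HOST=http://your-clearml-server:8008 \
  -e CLEARML_API_ACCESS_KEY=<access_key> \
  -e CLEARML_API_SECRET_KEY=<secret_key> \
  -e CLEARML_AGENT_EXTRA_ARGS="--queue gpu1_queue --create-queue" \
  allegroai/clearml-agent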

  
  
Posted 2 years ago

Can you build your own Docker image with clearml-agent installed in it?
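
Something along these lines could be a starting point; AGENT_QUEUE is just an illustrative variable name, and the API credentials would still come from the usual clearml-agent environment variables or a mounted clearml.conf:

# Sketch of a custom agent image; the queue is passed in when the container starts.
FROM python:3.9-slim
RUN pip install --no-cache-dir clearml-agent
# AGENT_QUEUE is illustrative, e.g. -e AGENT_QUEUE=gpu0_queue on docker run
ENTRYPOINT ["/bin/sh", "-c", "clearml-agent daemon --queue ${AGENT_QUEUE:-default}"]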

  
  
Posted 2 years ago

🤔 Hmm, yes, I suppose I can do that.

Ah, yes, I found the Dockerfile in the clearml-agent repository already. Should be doable!

Thanks for the suggestion!

  
  
Posted 2 years ago

Hi StrongHorse8, do you want to run the agent inside a container, or the agent to run your task in Docker mode?

  
  
Posted 2 years ago

Hi StrongHorse8,

Yes, each ClearML agent can listen to a different queue and use a specific GPU. You can view all the use cases and examples at this link: https://clear.ml/docs/latest/docs/clearml_agent/#allocating-resources
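
For reference, the pattern from that page looks roughly like this (queue names are just examples; you would run one of these commands per agent):

# One agent per GPU, each listening to its own queue
clearml-agent daemon --queue gpu0_queue --gpus 0
clearml-agent daemon --queue gpu1_queue --gpus 1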

  
  
Posted 2 years ago

Hi Alon,

Thanks! I know that already. I am looking more for a solution to spin up the Docker containers automatically, without having to manually log into each one of them and start a clearml-agent from there.

Sorry for the confusion.

  
  
Posted 2 years ago

👍 Great, so if you have an image with clearml-agent, it should solve it 😀

  
  
Posted 2 years ago

Hi TimelyPenguin76

Both. The agent has to run inside a container and it will spin up sibling containers to run the tasks.
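
For the sibling-container part, the usual pattern is to mount the host Docker socket into the agent container, roughly like this (names, queue and flags are illustrative):

# Mounting the host Docker socket lets the containerized agent start task
# containers as siblings on the host; --docker makes the agent run tasks in Docker.
docker run -d --name clearml-agent-gpu0 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e CLEARML_AGENT_EXTRA_ARGS="--queue gpu0_queue --docker" \
  allegroai/clearml-agent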

  
  
Posted 2 years ago

I guess you are using an on-prem server and not a cloud one (AWS, for example).

  
  
Posted 2 years ago