Answered
Dear Developers, I Encountered A Problem Where The Local Module Cannot Be Found When Pulling A Task From The Queue. I Opened An Issue Here

Dear developers, I encountered a problem where the local module cannot be found when pulling a task from the queue. I opened an issue here ( https://github.com/allegroai/clearml/issues/503 ) and hope someone can help me out.

  
  
Posted 2 years ago

Answers 33


Or can I enable agent in this kind of local mode?

You just built a local agent

  
  
Posted 2 years ago

So is there any tutorial on this topic?

Dude, we just invented it 🙂
Any chance you feel like writing something up in a GitHub issue, so other users know how to do this?

Guess I’ll need to implement job scheduling myself

You have a scheduler: it will pull jobs from the queue in order, then run them one after the other (one at a time).

  
  
Posted 2 years ago

Guess my best chance is to check out the agent source code, right?

  
  
Posted 2 years ago

(image attachment)

  
  
Posted 2 years ago

(image attachment)

  
  
Posted 2 years ago

Do you think the local agent will be supported someday in the future?

We can take this code sample and extend it, I can't see any harm in that.
It will make it very easy to run "sweeps" without any "real agent" installed.

I'm thinking of rolling out multiple experiments at once

You mean as multiple subprocesses? Sure, if you have the memory for it.
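A rough sketch of what "multiple subprocesses" could look like, assuming each experiment is its own script that calls Task.init() internally; the script name, arguments, and count below are placeholders, not something from this thread:

import subprocess
import sys

# Launch a few copies of the same experiment script concurrently; each copy
# registers its own ClearML task via Task.init() inside the script itself.
procs = [
    subprocess.Popen([sys.executable, 'train.py', '--seed', str(seed)])
    for seed in range(3)
]

# Wait for all of them to finish (memory permitting, as noted above).
for p in procs:
    p.wait()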

  
  
Posted 2 years ago


from clearml.backend_api.session.client import APIClient
client = APIClient()
result = client.queues.get_next_task(queue='queue_ID_here')

Seems to work for me (latest RC 1.1.5rc2)
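A minimal sketch of how that snippet could grow into the "local agent" loop discussed above: poll the queue, resolve each returned entry into a Task, and hand it off for local execution. The queue ID, the polling interval, and the entry.task field access are assumptions based on the queues.get_next_task response, so adjust as needed:

import time

from clearml import Task
from clearml.backend_api.session.client import APIClient

client = APIClient()
queue_id = 'queue_ID_here'  # placeholder, same as in the snippet above

while True:
    result = client.queues.get_next_task(queue=queue_id)
    entry = getattr(result, 'entry', None)  # assumption: response carries an 'entry' with the task ID
    if not entry:
        time.sleep(5)  # queue is empty, poll again in a bit
        continue
    task = Task.get_task(task_id=entry.task)
    print('pulled task:', task.id, task.name)
    # Actually running the task locally (cloning the repo, applying uncommitted
    # changes, launching the script) is the part this thread is "inventing";
    # plug your own execution step in here, one task at a time.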

  
  
Posted 2 years ago

Well it should work. Make sure the Task "holds" all the information needed (under the Execution tab): repo, uncommitted changes, python packages, etc.
Then configure your agent (choose pip/conda/poetry as the package manager), and spin it up (by default in venv/conda mode, or in docker mode).
Should work 🙂

  
  
Posted 2 years ago

I’ll try it tomorrow and let you know if there is anything wrong

  
  
Posted 2 years ago

I tried "from clearml.backend_api.session import client", no luck

  
  
Posted 2 years ago

Which client should I import for Client.queues?
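For reference, the import that ended up working in the snippet above comes from the client submodule, not from clearml.backend_api.session itself:

from clearml.backend_api.session.client import APIClient  # note the .client submodule

client = APIClient()  # client.queues, client.tasks, etc. are then available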

  
  
Posted 2 years ago

But it seems buggy

  
  
Posted 2 years ago

Just tried the code

  
  
Posted 2 years ago

Oh, there is one line missing in the above code

  
  
Posted 2 years ago

Sure I'm right here with you

  
  
Posted 2 years ago

APIClient will report

  
  
Posted 2 years ago

Sorry, typo: client.task should be client.tasks.
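So the corrected calls go through client.tasks. For example, something like the following; the status filter and the fields accessed on the returned entries are illustrative assumptions, not taken from this thread:

# List queued tasks through the same APIClient, via the (plural) tasks service
queued = client.tasks.get_all(status=['queued'], only_fields=['id', 'name'])
for t in queued:
    print(t.id, t.name)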

  
  
Posted 2 years ago

Works fine, awesome!

  
  
Posted 2 years ago

One sec

  
  
Posted 2 years ago


let me check a sec

  
  
Posted 2 years ago

Yeah, the ultimate goal I'm trying to achieve is to run tasks flexibly. For example, before running, a task could declare how many resources it needs, and the agent would run it as soon as it finds there are enough resources.

Check out Task.execute_remotely(). You can put it anywhere in your code; when execution gets to it and you are running without an agent, it will stop the process and re-enqueue the task to be executed remotely. On the remote machine the call itself becomes a no-op.
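A minimal sketch of how Task.execute_remotely() is usually dropped into a script; the project, task, and queue names below are placeholders:

from clearml import Task

task = Task.init(project_name='examples', task_name='remote execution demo')

# Running locally with no agent: execution stops here, the task is enqueued
# to the 'default' queue and the local process exits. When an agent later
# runs the task, this same call becomes a no-op and the script continues.
task.execute_remotely(queue_name='default', exit_process=True)

# Anything below this line effectively runs only on the remote machine.
print('training starts here...')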

I can comment on it in the GitHub issue

Yes please do 🙂

What can I do to help extend it?

How about a CLI tool, like what we have with "clearml-task"?

This is so awesome

Thank you ! 😊

  
  
Posted 2 years ago