Hey, I'm using ClearML GCP Autoscaler and it seems that


CostlyOstrich36
We created a very basic task to demonstrate the difference in execution time between a task running on an instance spun up by the autoscaler vs. an instance spun up manually with clearml-agent.
The task code is as follows:

from clearml import Task
import time

mydict = {"a": 1, "b": 2}
task = Task.init(project_name="test", task_name="test_small_dict")
task.execute_remotely(queue_name="tomer_queue")

# measure how long task.connect() takes
start = time.time()
task.connect(mydict)
end = time.time()
print("Time elapsed: ", end - start)

The instance spun up by the autoscaler took 13.647857427597046 seconds,
vs.
1.5556252002716064 seconds on the manually started clearml-agent instance.

That's nearly 9 times slower!
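In case it helps with triage, here is a minimal sketch (same structure as the snippet above, only the timer placement changes; the task name "test_timing_breakdown" is made up, the project and queue names are reused from the original snippet) that times Task.init and task.connect separately, so the console log shows which call carries the overhead on the autoscaler instance:

from clearml import Task
import time

mydict = {"a": 1, "b": 2}

# time Task.init on its own
start_init = time.time()
task = Task.init(project_name="test", task_name="test_timing_breakdown")
print("Task.init took: ", time.time() - start_init)

# enqueue on the same queue as the original experiment
task.execute_remotely(queue_name="tomer_queue")

# time only the dictionary connect, as in the original snippet
start_connect = time.time()
task.connect(mydict)
print("task.connect took: ", time.time() - start_connect)

When the agent re-runs the script, execute_remotely becomes a no-op, so both timings end up in the remote task's console log.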

I'm providing the full logs of both experiments.

  
  