Answered
Hi there! I had a question regarding batch inference with ClearML. I would like to serve a model using an inference task (containing the model and the code to perform the inference) as a base to be cloned and edited (change input arguments), and queue to…

Hi there!
I had a question regarding batch inference with ClearML.
I would like to serve a model using an inference task (containing the model and the code to perform the inference) as a base to be cloned and edited (change input arguments), and then queued to be processed by a remote worker.
Is this a correct way to do batch inference? What is the best practice to achieve this using Docker?
Thanks in advance for your answer!
Best regards
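
A minimal sketch of the clone-edit-enqueue flow described above, using the clearml Python SDK. The base task ID, the parameter name "Args/input_data", the input path, and the queue name are all placeholders, not details from this thread:

```python
from clearml import Task

# ID of the existing base inference task (model + inference code); placeholder
BASE_TASK_ID = "<base-inference-task-id>"

# Clone the base task into a new draft task
cloned = Task.clone(source_task=BASE_TASK_ID, name="batch inference run")

# Override the input arguments for this particular batch
cloned.set_parameters({"Args/input_data": "s3://bucket/batch_001.csv"})

# Enqueue the draft so a remote clearml-agent worker picks it up
Task.enqueue(cloned, queue_name="default")
```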

Posted 5 months ago

Answers 3


Hi Damjan, thank you for your message.
If I understand correctly, though, that doc would be great for online serving. I am looking for a solution for batch inference instead.

Posted 5 months ago

I will try to create the base docker image using the "Exporting a Task into a Standalone Docker Container" section from None
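
A rough sketch of how that could combine with the clone-and-enqueue flow. The image name, task ID, and queue are placeholders, and the exact `set_base_docker` keyword may vary between clearml SDK versions:

```python
# First, export the base task into a standalone image (run once, on the CLI):
#   clearml-agent build --id <base-task-id> --docker --target batch-infer-img
from clearml import Task

# Clone the base task and point it at the exported image, so the remote
# agent runs the batch job inside that container
cloned = Task.clone(source_task="<base-task-id>", name="batch inference run")
cloned.set_base_docker(docker_image="batch-infer-img")
Task.enqueue(cloned, queue_name="default")
```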

Posted 5 months ago

Not a ClearML employee (just a recent user), but maybe this will help? None

Posted 5 months ago