Hi there! I had a question regarding batch inference with ClearML. I would like to serve a model using an inference task (containing the model and the code to perform the inference) as a base to be cloned and edited (change input arguments), and queue to…

Hi there!
I had a question regarding batch inference with ClearML.
I would like to serve a model using an inference task (containing the model and the code to perform the inference) as a base to be cloned and edited (change input arguments), then queued to be processed by a remote worker.
Is this a correct way to do batch inference, and what is the best practice for achieving it with Docker? Roughly, I have in mind something like the sketch below.
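A minimal sketch of what I mean (the project, task, and queue names and the parameter keys are placeholders, not my actual setup):

```python
from clearml import Task

# Fetch the "template" inference task that holds the model and the
# inference code (project/task names are placeholders).
template = Task.get_task(
    project_name="batch-inference",
    task_name="inference-template",
)

# Clone the template and override its input arguments for this batch.
# The "Args/..." keys assume the usual argparse-connected parameter section.
cloned = Task.clone(source_task=template, name="inference-batch-001")
cloned.set_parameters({
    "Args/input_path": "s3://my-bucket/batch-001/",
    "Args/output_path": "s3://my-bucket/results-001/",
})

# Enqueue the clone so a remote clearml-agent worker picks it up and runs it.
Task.enqueue(cloned, queue_name="default")
```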
Thanks in advance for your answer!
Best regards

  
  
Posted 11 months ago

3 Answers


Not a ClearML employee (just a recent user), but maybe this will help? None

  
  
Posted 11 months ago

I will try to create the base Docker image using the "Exporting a Task into a Standalone Docker Container" section from None
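If I read that section correctly, the command should look something along these lines (the task id and target image name are placeholders):

```
clearml-agent build --id <task-id> --docker --target batch-inference-image
```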

  
  
Posted 11 months ago

Hi Damjan, thank you for your message.
If I understand correctly, though, that doc is aimed at online serving; I am looking for a solution for batch inference instead.

  
  
Posted 11 months ago