Answered

Hello, we want to serve a simple rule-based model. Think of it as a .py file with a simple if...else function.

  1. How do you deliver a rule-based model? Or do I need to train a TensorFlow/PyTorch/scikit-learn model just to serve a simple rule-based model?

  2. How do you manage your online featurer using ClearML?

Thanks!

  
  
Posted 2 years ago

Answers 7


Hi Oriel!

If you only want to serve an if-else model, why do you want to use clearml-serving for that? And what do you mean by "online featurer"?

  
  
Posted 2 years ago

Hi DeterminedCrocodile36 ,

To use a custom engine you need to change the preprocess code.
https://github.com/allegroai/clearml-serving/tree/main/examples/pipeline
Section 3 is what you're interested in.
And here is an example of the code you need to change; I think it's fairly straightforward.
https://github.com/allegroai/clearml-serving/blob/main/examples/pipeline/preprocess.py
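For a rule-based model, the custom route can be as simple as putting the rules directly inside the `Preprocess` class from that example. A minimal sketch, assuming the method names and signatures from the linked preprocess.py (verify against the repo before using):

```python
# Hedged sketch of a clearml-serving-style Preprocess class that implements
# a rule-based "model" with plain if/else logic, so no trained TensorFlow/
# PyTorch/scikit-learn model is needed. Method signatures are assumed from
# the linked example.
from typing import Any, Callable, Optional


class Preprocess:
    def __init__(self) -> None:
        # A rule-based model has no weights to load.
        pass

    def preprocess(self, body: dict, state: dict,
                   collect_custom_statistics_fn: Optional[Callable] = None) -> Any:
        # Extract the value the rules operate on from the request body.
        return body.get("x", 0)

    def process(self, data: Any, state: dict,
                collect_custom_statistics_fn: Optional[Callable] = None) -> Any:
        # The "model": a simple if...else rule base.
        if data > 10:
            return "high"
        elif data > 5:
            return "medium"
        return "low"

    def postprocess(self, data: Any, state: dict,
                    collect_custom_statistics_fn: Optional[Callable] = None) -> dict:
        # Wrap the rule result in a JSON-serializable response.
        return {"label": data}
```

The serving side would call these methods in order for each incoming request; here the actual decision logic lives entirely in `process`.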

  
  
Posted 2 years ago

Adding a custom engine example is on the 'to do' list, but if you manage to add a PR with an example that would be great 🙂

  
  
Posted 2 years ago

What do you mean exactly? Is it that you want more visibility into what kind of preprocessing code is running for each endpoint?

  
  
Posted 2 years ago

Because I want different model versions behind the same API. Some of them are rule-based, some of them are ML-based.

  1. At inference time I need to fetch some other features from distinct sources, e.g. some third-party API request.
  
  
Posted 2 years ago

Like Nathan said, custom engines are a TODO, but for your second question, you can add that API request in the model preprocessing, which is a function you can define yourself! It will be run every time a request comes in, and you can do whatever you want in it and change the incoming data however you wish 🙂

example: https://github.com/allegroai/clearml-serving/blob/main/examples/keras/preprocess.py
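As a sketch of that idea: the preprocess step can enrich the incoming request with features fetched from an external source at inference time. Here `_fetch_external_features` is a hypothetical stand-in for a real call (e.g. an HTTP request with `requests`):

```python
# Hedged sketch: merging "online" features from an external source into the
# request inside preprocess. The helper below is a placeholder, not a real API.
def _fetch_external_features(user_id: str) -> dict:
    # In practice this could be e.g.
    #   requests.get(f"https://example.internal/features/{user_id}").json()
    # Hard-coded here so the sketch is self-contained.
    return {"account_age_days": 120}


class Preprocess:
    def preprocess(self, body: dict, state: dict,
                   collect_custom_statistics_fn=None) -> dict:
        # Start from the incoming request and add the externally fetched
        # features, so the model (rule-based or ML) sees one merged dict.
        features = dict(body)
        features.update(_fetch_external_features(body["user_id"]))
        return features
```

Since preprocess runs per request, anything fetched here is as fresh as the external source; just keep an eye on the latency that the extra call adds to each inference.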

  
  
Posted 2 years ago

But if I'm going to use the preprocessing layer for that, then I won't be able to see in ClearML Models the methods for the rule-based vs. the ML-based models.

  
  
Posted 2 years ago