Looking at clearml-serving - two questions - 1. What's the status of the project? 2. How does one say how a model is loaded and served etc.? For example, if I have a spaCy NER model, I need to specify some custom code, right?

Looking at clearml-serving - two questions -

1. What's the status of the project?
2. How does one say how a model is loaded and served, etc.? For example, if I have a spaCy NER model, I need to specify some custom code, right?

  
  
Posted 2 years ago

Answers 18


'config.pbtxt' could not be inferred. please provide specific config.pbtxt definition.

This basically means there is no configuration on how to serve the model, i.e. the size/type of the input (lower) layer and the output layer.
You can either store the configuration on the creating Task, as is done here:
https://github.com/allegroai/clearml-serving/blob/b5f5d72046f878bd09505606ca1147d93a5df069/examples/keras/keras_mnist.py#L51
Or you can provide it as a standalone file when registering the model with clearml-serving; an example config.pbtxt:
https://github.com/triton-inference-server/server/blob/main/qa/python_models/identity_fp32/config.pbtxt
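
To make the first option concrete, here is a minimal sketch of attaching a config.pbtxt definition to the creating Task with the ClearML SDK. The project/task names, layer names, shapes and data types are hypothetical placeholders; the linked keras_mnist.py example is the authoritative reference for the exact call it uses:

from pathlib import Path
from clearml import Task

task = Task.init(project_name="examples", task_name="keras mnist")

# Hypothetical Triton layer definitions - adjust names/dims/dtypes to your own model
config_pbtxt = """
input [
  { name: "dense_input" data_type: TYPE_FP32 dims: [ 784 ] }
]
output [
  { name: "activation_2" data_type: TYPE_FP32 dims: [ 10 ] }
]
"""
Path("config.pbtxt").write_text(config_pbtxt)
# Store the file content on the Task so clearml-serving can pick it up later
task.connect_configuration(Path("config.pbtxt"), name="config.pbtxt")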

  
  
Posted 2 years ago

Just to confirm AgitatedDove14 - clearml doesn't do any "magic" in this regard for TensorFlow, PyTorch etc., right?

  
  
Posted 2 years ago

Ah, just saw from the example that even that is doing the config.pbtxt stuff - https://github.com/allegroai/clearml-serving/blob/main/examples/keras/keras_mnist.py#L51

  
  
Posted 2 years ago

Hi TrickySheep9 , can you provide more info on your specific use-case?

  
  
Posted 2 years ago

Here’s an example error I get trying it out on one of the example models:
Error: Requested Model project=ClearML Examples name=autokeras imdb example with scalars tags=None not found. 'config.pbtxt' could not be inferred. please provide specific config.pbtxt definition.

  
  
Posted 2 years ago

So to add a model to be served with an endpoint you can use:

clearml-serving triton --endpoint "<your endpoint>" --model-project "<your project>" --model-name "<your model name>"
When the model gets updated, it should use the new one.
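
For instance, with the example model from the error above (illustrative only - the endpoint name here is an arbitrary choice):

clearml-serving triton --endpoint "autokeras_imdb" --model-project "ClearML Examples" --model-name "autokeras imdb example with scalars"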

  
  
Posted 2 years ago

Hi TrickySheep9, is this model registered in your clearml app?

  
  
Posted 2 years ago

Yes. It's a pickle file that I have added via OutputModel
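
(Roughly how that registration looks with the SDK - a minimal sketch; the project/task names, file name and framework label are assumptions:)

from clearml import Task, OutputModel

task = Task.init(project_name="my project", task_name="train spacy ner")
# ... training produces a pickled pipeline ...
output_model = OutputModel(task=task, framework="spaCy")
# Register the pickle file as the task's output model
output_model.update_weights(weights_filename="ner_model.pkl")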

  
  
Posted 2 years ago

And another question - is clearml-serving ready for serious use?

  
  
Posted 2 years ago

AgitatedDove14 - any thoughts on this?

  
  
Posted 2 years ago

But you have to do the config.pbtxt stuff, right?

  
  
Posted 2 years ago

And another question - is clearml-serving ready for serious use?

Define serious use? KFServing support is in the pipeline, if that helps.
Notice that clearml-serving is basically a control plane for the serving engine; not to neglect its importance, but the heavy lifting is done by Triton 🙂 (or any other backend we will integrate with, maybe Seldon).

  
  
Posted 2 years ago

Sure, got it. Will play around with it 🙂

  
  
Posted 2 years ago

Thanks

  
  
Posted 2 years ago

👍

  
  
Posted 2 years ago

Hey SuccessfulKoala55, like I mentioned, I have a spaCy NER model that I need to serve for inference.

  
  
Posted 2 years ago

clearml doesn't do any "magic" in this regard for TensorFlow, PyTorch etc., right?

No 😞 and if you have an idea on how, that would be great.
Basically the problem is that there is no "standard" way to know which layer is the input and which is the output.
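
(As a hedged illustration of the problem: each framework exposes this information in its own way - e.g. a Keras model object can be inspected as below - but there is no framework-agnostic equivalent, so the config.pbtxt cannot be inferred automatically in the general case:)

# Sketch: reading input/output layer names and shapes from a Keras model,
# which could then be copied by hand into config.pbtxt. The model path is hypothetical.
import tensorflow as tf

model = tf.keras.models.load_model("my_model.h5")
for t in model.inputs:
    print("input ", t.name, t.shape, t.dtype)
for t in model.outputs:
    print("output", t.name, t.shape, t.dtype)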

  
  
Posted 2 years ago

Found the custom backend aspect of Triton - https://github.com/triton-inference-server/python_backend
Is that the right way?
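
(For reference, a minimal sketch of what a python_backend model.py wrapping a spaCy NER pipeline could look like - the tensor names "TEXT" and "ENTITIES" and the pipeline name are hypothetical and would have to match the accompanying config.pbtxt:)

# model.py for Triton's python_backend - a sketch, not a tested implementation
import json
import numpy as np
import spacy
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # Load the spaCy pipeline once per model instance
        self.nlp = spacy.load("en_core_web_sm")

    def execute(self, requests):
        responses = []
        for request in requests:
            # TYPE_STRING inputs arrive as a numpy array of bytes objects
            texts = pb_utils.get_input_tensor_by_name(request, "TEXT").as_numpy()
            entities = [
                json.dumps([(ent.text, ent.label_) for ent in self.nlp(t.decode("utf-8")).ents])
                for t in texts.flatten()
            ]
            out = pb_utils.Tensor("ENTITIES", np.array(entities, dtype=object))
            responses.append(pb_utils.InferenceResponse(output_tensors=[out]))
        return responses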

  
  
Posted 2 years ago