Looking at clearml-serving - two questions:
1. What's the status of the project?
2. How does one say how a model is loaded and served, etc.? For example, if I have a spaCy NER model, I need to specify some custom code, right?
'config.pbtxt' could not be inferred. please provide specific config.pbtxt definition.
This basically means there is no configuration describing how to serve the model, i.e. the size/type of the input and output layers.
You can either store the configuration on the creating Task, as is done here:
https://github.com/allegroai/clearml-serving/blob/b5f5d72046f878bd09505606ca1147d93a5df069/examples/keras/keras_mnist.py#L51
Or you can provide it as a standalone file when registering the model with clearml-serving; an example config.pbtxt:
https://github.com/triton-inference-server/server/blob/main/qa/python_models/identity_fp32/config.pbtxt
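For reference, a minimal config.pbtxt might look like the sketch below. This follows Triton Inference Server's model-configuration format; the model name, platform, tensor names, and dims here are illustrative assumptions and must be replaced with the actual values of your model:

```
name: "my_model"                      # illustrative; must match the model repository directory name
platform: "tensorflow_savedmodel"     # assumption; set to your backend (e.g. onnxruntime_onnx, pytorch_libtorch)
max_batch_size: 8
input [
  {
    name: "input_layer"               # hypothetical input tensor name
    data_type: TYPE_FP32
    dims: [ 28, 28, 1 ]               # example input shape (excluding the batch dimension)
  }
]
output [
  {
    name: "output_layer"              # hypothetical output tensor name
    data_type: TYPE_FP32
    dims: [ 10 ]                      # example output shape, e.g. 10 class scores
  }
]
```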