I'm new to ClearML and I'd like to deploy an inference service based on my trained model, something like what BentoML does wrapping a Flask API... Is there a way to do it within ClearML?
GrumpyPenguin23, AgitatedDove14, thanks for replying! Basically, I'm looking for a real-time inference endpoint exposing a prediction API method, something like:

```shell
curl -i \
  --header "Content-Type: application/json" \
  --request POST \
  --data '[[5.1, 3.5, 1.4, 0.2]]' \
```
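For context, the kind of endpoint that curl command implies could be sketched with Flask (the style of serving the question mentions). This is a minimal illustration, not ClearML's actual serving API; the `DummyModel` class, the `/predict` route, and the response shape are all assumptions standing in for a real trained model:

```python
# Minimal sketch of a real-time prediction endpoint, assuming a
# scikit-learn-style model object with a .predict(rows) method.
# The route path and JSON response shape are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)


class DummyModel:
    """Stand-in for the trained model loaded from ClearML (assumption)."""

    def predict(self, rows):
        # A real model would return one prediction per input row.
        return [0 for _ in rows]


model = DummyModel()


@app.route("/predict", methods=["POST"])
def predict():
    # The request body is a JSON list of feature rows,
    # e.g. [[5.1, 3.5, 1.4, 0.2]] as in the curl example above.
    rows = request.get_json()
    preds = model.predict(rows)
    return jsonify({"predictions": list(preds)})
```

With the server running, the curl command above (pointed at the `/predict` route) would POST the feature row and receive the model's prediction as JSON.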