Unanswered
I'd been following the ClearML Serving example on its GitHub repo here. It basically deploys a Keras MNIST model. The tutorial, however, ends once the model is deployed, and I've tried going through resources on how to do inference but have had trouble.
This is the simplest request I could come up with for inference. The model, input, and output names are the ones the server asked for.
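For reference, a minimal sketch of what such a request can look like against a ClearML Serving endpoint is shown below. The host/port, the endpoint name `test_model_keras`, and the payload key `url` are assumptions modeled on the clearml-serving Keras MNIST example layout, not details from the original post; adjust them to whatever your `preprocess.py` and endpoint registration actually expect.

```python
# Minimal sketch of an inference request to a ClearML Serving endpoint.
# Assumptions (hypothetical, adapt to your deployment):
#   - the serving inference container listens on 127.0.0.1:8080
#   - the model was registered under the endpoint name "test_model_keras"
#   - the example's preprocess.py accepts {"url": "<image url>"} for a digit image
import requests

SERVING_URL = "http://127.0.0.1:8080/serve/test_model_keras"  # assumed endpoint name

payload = {
    # The key and value format depend entirely on the preprocess.py used when
    # registering the endpoint; a URL to an MNIST-style digit image is assumed here.
    "url": "https://example.com/mnist_digit_5.png",
}

response = requests.post(SERVING_URL, json=payload, timeout=30)
response.raise_for_status()

# The serving pipeline's postprocess step defines the response shape,
# e.g. something like {"digit": 5} for an MNIST classifier.
print(response.json())
```

If the preprocessing step instead expects raw pixel data, the payload would carry the flattened 28x28 array under whatever key `preprocess.py` reads.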