I'm new to ClearML and I'd like to deploy an inference service based on my trained model, something like what BentoML does by wrapping a Flask API... Is there a way to do it within ClearML?
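For illustration, a minimal sketch of the kind of hand-rolled Flask wrapper meant here (the model file, route, and payload format are just placeholders):
```python
from flask import Flask, request, jsonify
import joblib  # assuming a scikit-learn-style pickled model artifact

app = Flask(__name__)
model = joblib.load("model.pkl")  # hypothetical trained model file

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [1.0, 2.0, 3.0]}
    features = request.get_json()["features"]
    prediction = model.predict([features]).tolist()
    return jsonify({"prediction": prediction})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```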
ContemplativeCockroach39 Unfortunately, not directly as part of ClearML 😞
I can recommend the NVIDIA Triton Inference Server (I'm hoping we will have an out-of-the-box integration soon).
Meanwhile you can run it manually; see the docs:
https://developer.nvidia.com/nvidia-triton-inference-server
Docker image here:
https://ngc.nvidia.com/catalog/containers/nvidia:tritonserver
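Once the server is up, querying it from Python might look roughly like this; a sketch only, where the model name and tensor names/shapes are assumptions that must match your model repository's `config.pbtxt` (requires `pip install tritonclient[http]`):
```python
# Server launched e.g. per the Triton quickstart, something like:
#   docker run --rm -p 8000:8000 -v /path/to/model_repository:/models \
#       nvcr.io/nvidia/tritonserver:<xx.yy>-py3 tritonserver --model-repository=/models
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request: one FP32 input tensor named "INPUT0" (assumed name/shape)
data = np.random.rand(1, 3).astype(np.float32)
infer_input = httpclient.InferInput("INPUT0", list(data.shape), "FP32")
infer_input.set_data_from_numpy(data)

# Run inference against a model called "my_model" (assumed) and read the output
response = client.infer(model_name="my_model", inputs=[infer_input])
print(response.as_numpy("OUTPUT0"))
```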