Unanswered
Hi there!
I had a question regarding batch inference with ClearML.
I would like to serve a model using an inference task (containing the model and the code to perform the inference) as a base to be cloned and edited (change input arguments), and queued to
Hi Damjan, thank you for your message.
If I understand correctly, that doc is aimed at online serving. I am looking for a solution for batch inference instead.
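The workflow described in the question (keep a base inference task, clone it, override its input arguments, and enqueue the clone) can be sketched with the ClearML SDK. This is a sketch under assumptions, not a confirmed answer from the thread: the task ID, the `Args/input_path` parameter name, and the queue name are placeholders, and the parameter section must match however the base script exposes its arguments.

```python
# Hedged sketch of batch inference via task cloning with the ClearML SDK.
# All IDs, parameter names, and queue names below are placeholders.

def run_batch_inference(base_task_id, input_path, queue_name="default"):
    """Clone the base inference task, point it at a new input, and enqueue it."""
    # Import deferred so this sketch can be defined without clearml installed.
    from clearml import Task

    base_task = Task.get_task(task_id=base_task_id)

    # Clone the base task; the clone starts in "draft" state and can be edited.
    cloned = Task.clone(
        source_task=base_task,
        name=f"batch inference on {input_path}",
    )

    # "Args/input_path" assumes the base script exposes an argparse-connected
    # parameter under the "Args" section; adjust to the actual parameter name.
    cloned.set_parameter("Args/input_path", input_path)

    # Push the edited clone to an execution queue served by a clearml-agent.
    Task.enqueue(cloned, queue_name=queue_name)
    return cloned
```

Calling `run_batch_inference` once per input batch would give one tracked task per batch, each picked up by whichever `clearml-agent` listens on the target queue.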
142 Views
0 Answers
one year ago