And one more question: how can I get the loaded model in the preprocess class in ClearML Serving?
Ohh AbruptHedgehog21, if this is the case, why don't you store the model with torch.jit.save and use Triton to run the model?
See example:
https://github.com/allegroai/clearml-serving/tree/main/examples/pytorch
(BTW: if you want a fully custom model-serving setup, in that case you would need to add torch to the list of Python packages.)
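A minimal sketch of the torch.jit.save step, assuming any nn.Module (the tiny linear model and the "model.pt" filename here are just placeholders):

```python
import torch

# Placeholder model; substitute your trained network here.
model = torch.nn.Linear(4, 2)
model.eval()

# Convert to TorchScript and save it; Triton's PyTorch backend
# can load the resulting file directly.
scripted = torch.jit.script(model)
torch.jit.save(scripted, "model.pt")

# Sanity check: the reloaded TorchScript model matches the original.
reloaded = torch.jit.load("model.pt")
x = torch.randn(1, 4)
assert torch.allclose(model(x), reloaded(x))
```

The saved file is self-contained (weights plus graph), so the serving side doesn't need your model's Python class definition.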