Hey,
Just trying out clearml-serving and getting the following error.
RobustRat47 which Triton container are you using?
BTW, the Triton error is: model_repository_manager.cc:1152] failed to load 'test_model_pytorch' version 1: Internal: unable to create stream: the provided PTX was compiled with an unsupported toolchain.
https://github.com/triton-inference-server/server/issues/3877
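Per the linked issue, the "PTX was compiled with an unsupported toolchain" error usually means the Triton container was built against a newer CUDA toolkit than the host's NVIDIA driver supports. A quick way to compare the two (a sketch; `<triton-tag>` is a placeholder for whatever image tag you are running, and the CUDA version file path may vary by image):

```shell
# Host: show the installed driver version and the highest
# CUDA version that driver supports (top-right of the output)
nvidia-smi

# Container: check which CUDA toolkit the Triton image ships.
# <triton-tag> is a placeholder; on older images the file may be
# /usr/local/cuda/version.txt instead of version.json.
docker run --rm nvcr.io/nvidia/tritonserver:<triton-tag>-py3 \
    cat /usr/local/cuda/version.json
```

If the container's CUDA toolkit is newer than what the host driver supports, either upgrade the driver or switch to an older Triton image, as suggested in the GitHub issue above.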