Hi @<1765547897220239360:profile|FranticShark20> ! Do you have any other logs that could help us debug this, such as tritonserver logs?
Also, can you use model.onnx as the model file name both in the upload and in default_model_filename, just to make sure this is not a file-extension problem? (This can happen with Triton.)
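For reference, a minimal sketch of the Triton model configuration this suggestion refers to. The platform value and model name below are assumptions based on a typical ONNX deployment, not confirmed by the thread:

```
name: "transformer_model"
platform: "onnxruntime_onnx"
default_model_filename: "model.onnx"
```

Triton then expects the weights file at <model-repository>/transformer_model/1/model.onnx; an "unknown model" error often means the directory name, the config name field, and the endpoint name do not all line up.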
Hi, I am familiarising myself with clearml-serving and following the steps from the Hugging Face example.
The model is deployed and available in the endpoint configuration (see screenshot).
But when I try to run the query I get an error message:

{"detail":"Error [<class 'grpc.aio._call.AioRpcError'>] processing request: <AioRpcError of RPC that terminated with:\n\tstatus = StatusCode.UNAVAILABLE\n\tdetails = \"Request for unknown model: 'transformer_model' is not found\"\n\tdebug_error_string = \"UNKNOWN:Error received from peer {grpc_message:\"Request for unknown model: \\'transformer_model\\' is not found\", grpc_status:14, created_time:\"2024-11-07T16:38:37.442221189+00:00\"}\"\n>"}
What am I missing?