
But as of right now I can access and serve models; I just cannot get them listed on the server interface.
Yeah, that is also my understanding. I have it on a separate machine, as resources are not an issue.
Hi, thanks for the reply!
I have set it, and it is downloading (I checked the container logs), but when I try to POST I get that error.
Following up on this: I was unable to fix the issue, but I ran into another complication. When uploading an ONNX model using the upload command, it keeps getting tagged as a TensorFlow model, even with the correct file structure. That leads back to the previous issue, since the serving module then searches for a different format than ONNX.
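For anyone hitting the same thing: the misclassification seems to come from a framework-detection helper. A minimal sketch of the kind of extension-based check involved (the function name and fallback behavior here are hypothetical, not ClearML's actual helper) shows how an unrecognized file can fall through to the wrong framework:

```python
import os

def guess_framework(path: str) -> str:
    """Guess a model's framework from its file extension (hypothetical helper).

    A helper like this mis-tags a model whenever the extension it checks
    for doesn't match, or the fallback branch defaults to another framework.
    """
    ext = os.path.splitext(path)[1].lower()
    if ext == ".onnx":
        return "onnx"
    if ext in (".pb", ".savedmodel"):
        return "tensorflow"
    return "unknown"
```

If the real helper's fallback were "tensorflow" instead of "unknown", any model file it failed to recognize would be tagged exactly the way described above.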
As far as I could see, this comes from the helper inside the Triton engine, but so far I have not been able to fix it.
Is there anything I might be doing ...
From what I could find, since the serving endpoint is not treated as an independent environment, the packages are being installed into a Python 3.8.10 interpreter, while the endpoint is trying to load them from another version that does not contain the packages. But I cannot change the version of either, and I don't understand why...
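One quick way to confirm a mismatch like this is to print which interpreter actually runs on each side: packages installed under one Python's site-packages are invisible to a different interpreter. A minimal sketch (run it both where `pip install` happens and inside the endpoint process):

```python
import sys
import sysconfig

def interpreter_info() -> dict:
    """Report which interpreter is running and where its packages live.

    If the "executable" or "site_packages" paths differ between the install
    step and the serving endpoint, imports will fail even though the
    install itself succeeded.
    """
    return {
        "executable": sys.executable,
        "version": "%d.%d.%d" % sys.version_info[:3],
        "site_packages": sysconfig.get_paths()["purelib"],
    }

if __name__ == "__main__":
    for key, value in interpreter_info().items():
        print(f"{key}: {value}")
```

Comparing the two outputs makes it obvious whether 3.8.10 and the endpoint's interpreter are in fact separate installations.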
Ok, perfect, that was what I was looking for. Thank you so much!
Do you have any idea what might cause this?