Hi There, Another Triton-Related Question: Are We Able To Deploy
Hi there!
Technically there should be nothing stopping you from deploying a Python-backend model. I just checked the source code, and ClearML essentially just downloads the model artifact and renames it based on the inferred model type.
As far as I'm aware (could def be wrong here!), the Triton Python backend essentially requires a folder containing e.g. a model.py file. I propose the following steps:
- Given the code above, if you package the model.py file as a folder in ClearML, clearml-serving will detect this and simply extract the folder in the right place for you. Then you have to adjust the config.pbtxt using the command line arguments to properly load the Python file.
- If this does not work, an extra if/else check should be added in the code above, also checking for "python" in the framework, similar to e.g. PyTorch or ONNX.
- However it is done, once the Python file is in the right position and the config.pbtxt is properly set up, Triton should just take it from there and everything should work as expected.
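To make the steps above more concrete, here is a minimal sketch of what such a model.py could look like. This is just an illustration, not something from the ClearML repo: the tensor names (INPUT0/OUTPUT0) are placeholders, and triton_python_backend_utils only exists inside the Triton container, so the import is guarded here.

```python
# Minimal sketch of a Triton Python-backend model.py (hypothetical example).
# triton_python_backend_utils is provided by the Triton server at runtime,
# so we guard the import for anyone reading this outside the container.
try:
    import triton_python_backend_utils as pb_utils
except ImportError:  # not running inside Triton
    pb_utils = None


class TritonPythonModel:
    """Triton looks for a class with exactly this name in model.py."""

    def initialize(self, args):
        # args is a dict with keys like "model_name" and "model_config"
        # (the config.pbtxt contents as a JSON string).
        self.model_name = args.get("model_name", "unknown")

    def execute(self, requests):
        # Triton batches calls: one InferenceResponse per incoming request.
        # This sketch just echoes INPUT0 back as OUTPUT0.
        responses = []
        for request in requests:
            in0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            out0 = pb_utils.Tensor("OUTPUT0", in0.as_numpy())
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[out0])
            )
        return responses

    def finalize(self):
        # Called once on model unload; release any resources here.
        pass
```
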
Could you try this approach? If this works, it would be an interesting example to add to the repo! Thanks 😄
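For reference, this is roughly what the Triton model repository layout and a minimal config.pbtxt for the Python backend look like (the model name, data types, and dims below are placeholders you'd adapt to your model):

```
# Expected layout inside the Triton model repository:
#
#   models/
#   └── my_python_model/
#       ├── config.pbtxt
#       └── 1/
#           └── model.py

name: "my_python_model"
backend: "python"
input [
  {
    name: "INPUT0"
    data_type: TYPE_FP32
    dims: [ -1 ]
  }
]
output [
  {
    name: "OUTPUT0"
    data_type: TYPE_FP32
    dims: [ -1 ]
  }
]
```
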