What do you mean by "convert it to TorchScript"? @<1523701205467926528:profile|AgitatedDove14>
This line 🙂
None
Notice that Triton (and therefore clearml-serving) needs the PyTorch model to be converted into TorchScript so that the Triton backend can load it
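For example, a minimal conversion sketch (torchvision's resnet18, the 1x3x224x224 dummy input, and the output filename are placeholders; for YOLOv8, Ultralytics' own export, e.g. YOLO("yolov8n.pt").export(format="torchscript"), produces the same kind of file):

import torch
import torchvision

# Minimal sketch: trace a model into TorchScript. torchvision's resnet18 is
# only a stand-in here; replace it with your own nn.Module.
model = torchvision.models.resnet18().eval()

example_input = torch.rand(1, 3, 224, 224)      # dummy input with the expected shape
traced = torch.jit.trace(model, example_input)  # record the forward pass as TorchScript
traced.save("model.torchscript.pt")             # this file is what Triton loads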
@<1523701205467926528:profile|AgitatedDove14> Thanks for the reply. However, I am not sure what values I should specify for the input and output layers for YOLOv8.
If I load the model into Netron, I don't see any dimensions for the input layer shape.
On the output layer, I can see the shape is [136, 64, 80, 80]. Is that correct?
Hi @<1593413673383104512:profile|MiniatureDragonfly17>
These are the names of the specific model input/output layers.
The way Triton analyses a PyTorch model, the input layers are usually named input__0, then input__1, and so on, and the outputs output__0, output__1, and so on for the results.
You can see an example here:
None
--input-size 1 28 28 --input-name "INPUT__0" --input-type float32 --output-size -1 10 --output-name "OUTPUT__0" --output-type float32
"... I can see the shape is [136, 64, 80, 80]. Is that correct?"
Yes, that's correct. As for the name, just try input__0
Notice that you also need to convert it to TorchScript
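Once it is converted, a quick sanity check like this (a sketch; the filename and the dummy input size are placeholders) prints the shapes you would pass to --input-size / --output-size:

import torch

# Load the exported TorchScript file and run a dummy input through it to
# confirm the input/output shapes before registering the serving endpoint.
ts_model = torch.jit.load("model.torchscript.pt")
dummy = torch.rand(1, 3, 640, 640)  # use your model's expected input shape
out = ts_model(dummy)
# some models return a single tensor, others a tuple/list of tensors
print(out.shape if torch.is_tensor(out) else [o.shape for o in out])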