i can't get any sort of response from curl
```
platform: "tensorflow_savedmodel"
input [ { name: "dense_input" data_type: TYPE_FP32 dims: [-1, 784] } ]
output [ { name: "activation_2" data_type: TYPE_FP32 dims: [-1, 10] } ]
```
So far I have taken one mnist image, and done the following:
```python
from PIL import Image
import numpy as np

def preprocess(img, format, dtype, h, w, scaling):
    sample_img = img.convert('L')
    resized_img = sample_img.resize((1, w*h), Image.BILINEAR)
    resized = np.array(resized_img)
    resized = resized.astype(dtype)
    return resized

# png img file
img = Image.open('./7.png')
# preprocessed img, FP32 formatted numpy array
img = preprocess(img, format, "float32", 28, 28, None)
# to bytes
img.tofile('7.binaryimage', format='binary')
```
and then:
```
curl -X POST 192.168.34.174:8000/v2/models/keras_mnist/versions/1 \
  -H "Content-Type: application/octet-stream" \
  -H 'NV-InferRequest: batch_size: 1 input { name: "dense_input" dims: [-1, 784] } output { name: "activation_2" cls { count: 1 } }' \
  --data-binary "@7.binaryimage"
```
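For comparison, here's a minimal preprocessing sketch (an assumption about what the config expects, not the confirmed fix) that resizes to 28x28 spatially first and then flattens to the 1x784 float32 layout declared above; a synthetic gray image stands in for the `7.png` file:

```python
from PIL import Image
import numpy as np

def preprocess(img, h=28, w=28):
    # Grayscale, resize to the spatial size first, THEN flatten to 1 x (h*w).
    # This differs from resize((1, w*h)), which squashes the image into a
    # 1-pixel-tall strip instead of keeping the 28x28 layout.
    img = img.convert('L')
    img = img.resize((w, h), Image.BILINEAR)
    return np.asarray(img, dtype=np.float32).reshape(1, h * w)

# Stand-in for Image.open('./7.png') so the snippet is self-contained
img = Image.new('L', (100, 100), color=128)
arr = preprocess(img)
print(arr.shape, arr.dtype, arr.nbytes)  # (1, 784) float32 3136
```

The resulting array matches the `dims: [-1, 784]` / `TYPE_FP32` declaration: one row of 784 float32 values.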
it's from the github issue you sent me, but i don't know what the "application/octet-stream" part is, or the "NV-InferRequest: ..." header
`img.tofile('7.binaryimage', format='binary')`
can this be the issue?
i'm probably sending the request all wrong + i'm not sure what input format the model expects
ElegantCoyote26 what is the model input layer definition? That determines the data format to pass to the serving endpoint
FP32 seems to be 32-bit floating point, so my preprocessing seems wrong
Hmm, it seems to fit the code, 1x784 with float32, no?
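One quick way to sanity-check that: a 1x784 float32 tensor is 784 * 4 = 3136 bytes, so the binary file written by `tofile` should be exactly that size. A minimal check (the temp-file path is just for illustration, mirroring the `7.binaryimage` name from the snippet above):

```python
import os
import tempfile

import numpy as np

# Same shape/dtype the config declares: 1 x 784, float32
arr = np.zeros((1, 784), dtype=np.float32)

path = os.path.join(tempfile.gettempdir(), '7.binaryimage')
arr.tofile(path)  # writes raw float32 bytes, no header

size = os.path.getsize(path)
print(size)  # 3136 = 784 values * 4 bytes per float32
```

If the file on disk is a different size, the request body won't match what the server expects for that input definition.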
ElegantCoyote26 , Hi 🙂
Can you provide an example of what you're trying to do?
i'm just interested in actually running a prediction with the serving engine and all
well, i have run the keras mnist example that is in the clearml-serving README. Now I'm just trying to send a request to make a prediction via curl
i'm also not sure what this is: `-H "Content-Type: application/octet-stream" -H 'NV-InferRequest: batch_size: 1 input { name: "dense_input" dims: [-1, 784] } output { name: "activation_2" cls { count: 1 } }'`
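`-H` just adds an HTTP header to the curl request: `Content-Type: application/octet-stream` tells the server the body is raw bytes rather than JSON, and `NV-InferRequest` is the header NVIDIA's inference server uses to carry the request metadata (batch size, input/output names and dims) in text form. The same request built with stdlib `urllib`, without actually sending it (host, port, and model path copied from the curl command above; the payload here is a zero-filled stand-in for the real `7.binaryimage` bytes):

```python
from urllib.request import Request

body = b'\x00' * 3136  # stand-in for the raw float32 bytes of 7.binaryimage

req = Request(
    'http://192.168.34.174:8000/v2/models/keras_mnist/versions/1',
    data=body,
    headers={
        # raw binary body, not JSON or form data
        'Content-Type': 'application/octet-stream',
        # request metadata in the server's text format
        'NV-InferRequest': ('batch_size: 1 '
                            'input { name: "dense_input" dims: [-1, 784] } '
                            'output { name: "activation_2" cls { count: 1 } }'),
    },
    method='POST',
)
print(req.get_method(), req.get_header('Content-type'))
```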