Question
I'm serving a pre-trained Inception model, and I've followed the official tutorials to serve it up to this point. I'm currently getting gRPC error code 3 (INVALID_ARGUMENT), as follows:
{ Error: contents must be scalar, got shape [305]
[[Node: map/while/DecodeJpeg = DecodeJpeg[_output_shapes=[[?,?,3]], acceptable_fraction=1, channels=3, dct_method="", fancy_upscaling=true, ratio=1, try_recover_truncated=false, _device="/job:localhost/replica:0/task:0/device:CPU:0"](map/while/TensorArrayReadV3)]]
at /server/node_modules/grpc/src/client.js:554:15 code: 3, metadata: Metadata { _internal_repr: {} } }
I'm using prediction_service.proto as-is from TensorFlow Serving's API. Here's my Node.js file where I define the function:
const grpc = require('grpc');

const PROTO_PATH = "./pb/prediction_service.proto";
const TensorflowServing = grpc.load(PROTO_PATH).tensorflow.serving;

// TF_TEST is the host:port address of the TensorFlow Serving instance
const testClient = new TensorflowServing.PredictionService(
    TF_TEST, grpc.credentials.createInsecure()
);
function getTestModelMsg(val) {
    return {
        model_spec: { name: "inception", signature_name: "predict_images", version: 1 },
        inputs: {
            images: {
                dtype: "DT_STRING",
                tensor_shape: {
                    dim: [{ size: 220 }, { size: 305 }],
                    unknown_rank: false
                },
                string_val: val
            }
        }
    }
}
function predictTest(array, callback) {
    testClient.predict(getTestModelMsg(array), (error, response) => {
        if (error)
            return callback(error);
        callback(null, response.outputs)
    })
}
And I'm passing in the image as a binary string as follows:
fs.readFile('./test/Xiang_Xiang_panda.jpg', (err, data) => {
    if (err) {
        return res.json({ message: "Not found" });
    }
    predictTest(data.toString('binary'), (error, outputs) => {
        if (error) {
            console.error(error);
            return res.status(500).json({ error });
        }
        res.status(200).json({ outputs });
    })
})
I've been stuck on this for a while, so I'd really appreciate it if anyone could help me out here! Any help would be great! Thanks in advance! :)
Answer 1:
Okay, so I finally managed to crack this. Posting it as an answer here in case someone faces this exact same problem.
So the Inception model expects a base64-encoded image:
fs.readFile('./test/Xiang_Xiang_panda.jpg', (err, data) => {
    if (err) {
        return res.json({ message: "Not found" });
    }
    predictTest(data.toString('base64'), (error, outputs) => {
        if (error) {
            console.error(error);
            return res.status(500).json({ error });
        }
        res.status(200).json({ outputs });
    })
})
Then, looking at inception_client.py from TensorFlow Serving's examples, I found that the tensor actually has shape=[1]. So this makes getTestModelMsg:
function getTestModelMsg(val) {
    return {
        model_spec: { name: "inception", signature_name: "serving_default", version: 1 },
        inputs: {
            images: {
                dtype: "DT_STRING",
                tensor_shape: {
                    dim: [{ size: 1 }],
                    unknown_rank: false
                },
                string_val: val
            }
        }
    }
}
Hope that helps someone. Good luck. :)
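As a side note, string_val is a repeated field in TensorProto, so the same message shape extends naturally to a batch: one dim entry whose size matches the number of images, and one string_val entry per image. A sketch under that assumption (getBatchModelMsg is an illustrative helper name, not part of the original code, and whether the server accepts a batch larger than 1 depends on how the model was exported):

```javascript
// Sketch: build a PredictRequest payload for a batch of
// base64-encoded images. getBatchModelMsg is a hypothetical name.
function getBatchModelMsg(base64Images) {
    return {
        model_spec: { name: "inception", signature_name: "serving_default", version: 1 },
        inputs: {
            images: {
                dtype: "DT_STRING",
                tensor_shape: {
                    // One batch dimension sized to the number of images.
                    dim: [{ size: base64Images.length }],
                    unknown_rank: false
                },
                // string_val is a repeated field: one entry per image.
                string_val: base64Images
            }
        }
    };
}

const msg = getBatchModelMsg(["aW1nMQ==", "aW1nMg=="]);
console.log(msg.inputs.images.tensor_shape.dim[0].size); // 2
```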
Source: https://stackoverflow.com/questions/47361889/nodejs-tensorflow-serving-client-error-3