how to input multi features for tensorflow model inference

Submitted by 和自甴很熟 on 2021-01-29 12:59:09

Question


I'm trying to test model serving.
I'm following this example: "https://www.tensorflow.org/beta/guide/saved_model"

This example works fine. But in my case, I have multiple input features.

loaded = tf.saved_model.load(export_path)
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)
=> ((), {'input1': TensorSpec(shape=(None, 1), dtype=tf.int32, name='input1'), 'input2': TensorSpec(shape=(None, 1), dtype=tf.int32, name='input2')})

In the example, for a single input feature, you just pass the feature like this:

infer(tf.constant(x))

In my case, with multiple input features, how do I pass them?
I'm using TensorFlow 2.0 beta and Python 3.5.


Answer 1:


I solved this problem.
In the single-input-feature model, infer._num_positional_args is assigned 1,
but in the multi-input-feature model infer._num_positional_args is assigned 0. I don't know why.
I solved it like this:

infer._num_positional_args = 2
infer(tf.constant(x1), tf.constant(x2))
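
Alternatively (a minimal sketch, not part of the original answer): a signature's concrete function can also be called with keyword arguments matching the input names shown by structured_input_signature, which avoids setting the private _num_positional_args attribute:

# x1 and x2 are assumed to be int32 arrays of shape (batch, 1),
# matching the TensorSpecs reported above.
result = infer(input1=tf.constant(x1), input2=tf.constant(x2))
print(result)  # dict mapping output names to tensors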

To query the model over TensorFlow Serving's REST API with requests:

import json
import requests

# x1 and x2 are the raw values for input1 and input2
data = json.dumps({"signature_name": "serving_default", "instances": [{'input1': [x1], 'input2': [x2]}]})
headers = {"content-type": "application/json"}
json_response = requests.post('http://localhost:8501/v1/models/model:predict', data=data, headers=headers)
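
As a small follow-up (an assumption about the standard TensorFlow Serving REST response format, not shown in the original answer), the predictions can be read from the JSON body:

# TensorFlow Serving returns a JSON object with a "predictions" field
predictions = json.loads(json_response.text)["predictions"]
print(predictions)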

To test with saved_model_cli:

!saved_model_cli run --dir $export_path --tag_set serve --signature_def serving_default \
--input_exprs 'input1=[[x1]];input2=[[x2]]'


Source: https://stackoverflow.com/questions/57457532/how-to-input-multi-features-for-tensorflow-model-inference
