TensorFlow Serving: REST API returns "Malformed request" error

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-06 08:07:13

Question


My TensorFlow Serving server (run with Docker) responds to my GET (and POST) requests with this:

{ "error": "Malformed request: POST /v1/models/saved_model/" }

Precisely the same problem was already reported but never solved (supposedly, this is a StackOverflow kind of question, not a GitHub issue):

https://github.com/tensorflow/serving/issues/1085

https://github.com/tensorflow/serving/issues/1095

Any ideas? Thank you very much.


Answer 1:


I verified that this does not work with releases before 1.12 and does indeed work with 1.12 and later.

> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving
> curl http://localhost:9009/v1/models/55
   { "error": "Malformed request: GET /v1/models/55" }

Now try with 1.12:

> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving:1.12.0
> curl http://localhost:9009/v1/models/55
{
 "model_version_status": [
  {
   "version": "1541703514",
   "state": "AVAILABLE",
   "status": {
    "error_code": "OK",
    "error_message": ""
   }
  }
 ]
}
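
The same status check can also be made from Python instead of curl. This is a minimal sketch, assuming the container above (REST port published on localhost:9009, model name 55) and TensorFlow Serving 1.12 or later:

# Minimal sketch: query the TensorFlow Serving model-status endpoint.
# Assumes the docker run command above: REST port on localhost:9009, model "55".
import json
import urllib.request

STATUS_URL = "http://localhost:9009/v1/models/55"

with urllib.request.urlopen(STATUS_URL) as resp:
    status = json.load(resp)

# Each entry describes one loaded version of the model.
for version in status["model_version_status"]:
    print(version["version"], version["state"], version["status"]["error_code"])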



Answer 2:


It depends on your model, but this is what my request body looks like:

{"inputs": {"text": ["Hello"]}}

I used Postman to help me out, so that the request was actually sent as JSON.

This is for the predict API, so the URL ends in ":predict". Again, that depends on which API you're trying to use.
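
For reference, here is a minimal Python sketch of the same predict call; the model name (saved_model), REST port (8501) and input key (text) are assumptions here, so adjust them to match your exported signature:

# Minimal sketch: POST a JSON body to the predict endpoint.
# Model name, port and input key are assumptions -- match them to your model.
import json
import urllib.request

PREDICT_URL = "http://localhost:8501/v1/models/saved_model:predict"
body = {"inputs": {"text": ["Hello"]}}

req = urllib.request.Request(
    PREDICT_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # tell the server it is JSON
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))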




Answer 3:


There were two issues with my approach:

1) The status check request wasn't supported by my tensorflow_model_server build (see https://github.com/tensorflow/serving/issues/1085 for details)

2) More importantly, when using curl on Windows you must escape the quotation marks inside the JSON body. So instead of:

curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{"instances":[{"features":[1,1,1,1,1,1,1,1,1,1]}]}"

I should have used this:

curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{\"instances\":[{\"features\":[1,1,1,1,1,1,1,1,1,1]}]}"
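
Another way to sidestep the escaping problem entirely is to write the body to a file and let curl read it. A small sketch (the file name request.json is arbitrary):

# Minimal sketch: write the JSON body to a file so no shell escaping is needed.
import json

body = {"instances": [{"features": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]}]}

with open("request.json", "w") as f:
    json.dump(body, f)

# Then, in any shell (Windows or not):
#   curl -X POST http://localhost:8501/v1/models/saved_model:predict -d @request.json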



Answer 4:


The model status API is only supported on the master branch. There is no TF Serving release that supports it yet (the API is slated for the upcoming 1.12 release). You can use the nightly Docker image (tensorflow/serving:nightly) to test master branch builds.

This solution was given by netf in issue #1128 in tensorflow/serving. I already tried it; it works and I can get the model status. (A screenshot of the model status response was attached here as a demo.)

Hope this helps. If you're not clear on how to use the master branch builds, you can contact me and I can give you instructions.

Email: mizeshuang@gmail.com



Source: https://stackoverflow.com/questions/52671138/tensorflow-serving-rest-api-returns-malformed-request-error
