TensorFlow Serving (run with Docker) responds to my GET (and POST) requests with this:
{ "error": "Malformed request: POST /v1/models/saved_model/" }
Precisely the same problem has already been reported but never solved (supposedly this is a Stack Overflow kind of question, not a GitHub issue):
https://github.com/tensorflow/serving/issues/1085
https://github.com/tensorflow/serving/issues/1095
Any ideas? Thank you very much.
I verified that this does not work pre-1.12 and does indeed work with 1.12 and later.
> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving
> curl http://localhost:9009/v1/models/55
{ "error": "Malformed request: GET /v1/models/55" }
Now try with 1.12:
> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving:1.12.0
> curl http://localhost:9009/v1/models/55
{
  "model_version_status": [
    {
      "version": "1541703514",
      "state": "AVAILABLE",
      "status": {
        "error_code": "OK",
        "error_message": ""
      }
    }
  ]
}
It depends on your model, but this is what my request body looks like:
{"inputs": {"text": ["Hello"]}}
I used Postman to help me out, so that it knew the body was JSON.
This is for the predict API, so the URL ends in ":predict". Again, that depends on which API you're trying to use.
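Put together, a full predict call with a body like that would look roughly like this (the model name "my_model", the port, and the input key are placeholders; substitute whatever your server and signature actually use):
> curl -X POST http://localhost:8501/v1/models/my_model:predict -d '{"inputs": {"text": ["Hello"]}}'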
There were two issues with my approach:
1) The status check request wasn't supported by my version of tensorflow_model_server (see https://github.com/tensorflow/serving/issues/1085 for details).
2) More importantly, when using curl on Windows you must escape the quotation marks inside the JSON body. So instead of:
curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{"instances":[{"features":[1,1,1,1,1,1,1,1,1,1]}]}"
I should have used this:
curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{\"instances\":[{\"features\":[1,1,1,1,1,1,1,1,1,1]}]}"
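Alternatively, to avoid the escaping issue altogether, you can put the JSON body into a file and pass it to curl with the @ syntax (here request.json is just an illustrative file name containing the body above):
curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d @request.json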
The model status API is only supported on the master branch. There is no TF Serving release that supports it yet (the API is slated for the upcoming 1.12 release). You can use the nightly Docker image (tensorflow/serving:nightly) to test master-branch builds.
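For example, mirroring the Docker commands earlier in this thread, something along these lines should run a master-branch build and let you query the status endpoint (the ports, model path, and model name are taken from the example above and are only illustrative):
> docker run -it -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving:nightly
> curl http://localhost:9009/v1/models/55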
This solution was given by netf in tensorflow/serving issue #1128. I already tried it, and I can now get the model status.
[Screenshot: getting the model status]
Hope this helps.
If the master-branch builds are unclear to you, you can contact me and I can give you instructions.
Email: mizeshuang@gmail.com
Source: https://stackoverflow.com/questions/52671138/tensorflow-serving-rest-api-returns-malformed-request-error