How to properly serve an object detection model from Tensorflow Object Detection API?

清歌不尽 · asked 2021-02-15 23:06

I am using the Tensorflow Object Detection API (github.com/tensorflow/models/tree/master/object_detection) for an object detection task. Right now I am having a problem serving the trained model.

4 Answers
  • 2021-02-15 23:46

    For the error

    grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output ToFloat:0")

    Just upgrade tf_models to the latest version and re-export the model.

    See https://github.com/tensorflow/tensorflow/issues/11863
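    Re-exporting is done with the exporter script shipped in the Object Detection API; the invocation looks roughly like this (the config, checkpoint, and output paths are placeholders you must replace with your own):

    ```shell
    # Run from the tensorflow/models research directory (TF 1.x era).
    python object_detection/export_inference_graph.py \
        --input_type image_tensor \
        --pipeline_config_path path/to/pipeline.config \
        --trained_checkpoint_prefix path/to/model.ckpt \
        --output_directory path/to/exported_model
    ```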

  • 2021-02-15 23:50

    I was struggling with the exact same problem. I was trying to host the pre-trained SSDMobileNet-COCO checkpoint from the Tensorflow Object Detection API model zoo.

    Turns out I was using an old commit of tensorflow/models, which happens to be the default submodule of serving.

    I simply pulled the latest commit with:

        cd serving/tf_models
        git pull origin master
        git checkout master

    After that, I built the model server again:

    bazel build //tensorflow_serving/model_servers:tensorflow_model_server

    The error went away and I was able to get accurate predictions.
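    Once the build finishes, starting the server against the exported model looks something like this (the port, model name, and base path are placeholders, not values from the original answer):

    ```shell
    bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
        --port=9000 \
        --model_name=gan \
        --model_base_path=/path/to/exported_model
    ```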

  • 2021-02-15 23:59
    1. Your idea is fine. It's OK to have that warning.

    2. The issue is that the input needs to be converted to uint8 as the model expects. Here is the code snippet that worked for me.

    import numpy as np
    import tensorflow as tf
    from PIL import Image
    from tensorflow_serving.apis import predict_pb2
    from tensorflow.python.saved_model import signature_constants

    request = predict_pb2.PredictRequest()
    request.model_spec.name = 'gan'
    request.model_spec.signature_name = (
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY)

    image = Image.open('any.jpg')
    image_np = load_image_into_numpy_array(image)
    image_np_expanded = np.expand_dims(image_np, axis=0)

    # Pass the batch explicitly as uint8, matching the model's input dtype.
    request.inputs['inputs'].CopyFrom(
        tf.contrib.util.make_tensor_proto(image_np_expanded,
            shape=image_np_expanded.shape, dtype='uint8'))
    

    The important part for you is shape=image_np_expanded.shape, dtype='uint8'. Also make sure to pull the latest update for serving.
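    The snippet above calls load_image_into_numpy_array without defining it; a common definition, matching the helper used in the Object Detection API tutorial notebook, is:

    ```python
    import numpy as np

    def load_image_into_numpy_array(image):
        # Convert a PIL RGB image into a (height, width, 3) uint8 array,
        # the dtype the detection model's serving signature expects.
        (im_width, im_height) = image.size
        return np.array(image.getdata()).reshape(
            (im_height, im_width, 3)).astype(np.uint8)
    ```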

  • 2021-02-16 00:05

    The current exporter code doesn't populate the signature field properly, so serving with model server doesn't work. Apologies for that. A new version with better support for exporting the model is coming; it includes some important fixes and improvements needed for serving, especially serving on Cloud ML Engine. See the github issue if you want to try an early version of it.

    The "The specified SavedModel has no variables; no checkpoints were restored." message is expected, for exactly the reason you stated: all variables are converted into constants in the graph. For the "FeedInputs: unable to find feed output ToFloat:0" error, make sure you use TF 1.2 when building the model server.
