How to make a model ready for the TensorFlow Serving REST interface with a base64-encoded image?

Submitted by 孤者浪人 on 2019-12-06 07:26:22

This is not quite an answer, but I don't have enough reputation to comment, so...

This is the most helpful info I've found on the issue so far:

https://github.com/tensorflow/serving/issues/994#issuecomment-410165268

But I still haven't been able to figure it out completely, so please post an update if you get yours working.

As you mentioned, JSON is a very inefficient approach — the payload normally exceeds the original file size. You need to convert the model so it can process image bytes written to a string using Base64 encoding:

{"b64": base64_encoded_string}

This conversion will reduce the prediction time and the bandwidth used to transfer the image from the prediction client to your infrastructure.
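As a sketch of what the client sends, the request body for the REST predict endpoint wraps the Base64 string in the format above. The input name "image_bytes" is an assumption that must match whatever name your serving signature exposes, and the fake JPEG bytes are just a placeholder:

```python
import base64
import json

def build_predict_request(jpeg_bytes):
    """Wrap raw JPEG bytes in the TF Serving REST JSON format."""
    b64 = base64.b64encode(jpeg_bytes).decode('utf-8')
    # The {"b64": ...} wrapper tells TF Serving to base64-decode the
    # value back into raw bytes before feeding it to the model.
    return json.dumps({'instances': [{'image_bytes': {'b64': b64}}]})

payload = build_predict_request(b'\xff\xd8\xff\xe0 fake jpeg bytes')
```

Compared to sending a float array per pixel, this keeps the payload roughly 4/3 the size of the original file.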

I recently used a transfer-learning model built with TF Hub and Keras that took JSON as input, which, as you mentioned, is not optimal for prediction. The following code adds a new serving function that can process Base64-encoded images.

First, convert the Keras model to a TF Estimator:

import os
import shutil

import tensorflow as tf

h5_model_path = os.path.join('models/h5/best_model.h5')
tf_model_path = os.path.join('models/tf')
estimator = tf.keras.estimator.model_to_estimator(
    keras_model_path=h5_model_path,
    model_dir=tf_model_path)

def serving_input_receiver_fn():
    def prepare_image(image_str_tensor):
        # CHANNELS and image_preprocessing come from your own training
        # code (e.g. CHANNELS = 3 and a resize/normalize function).
        image = tf.image.decode_jpeg(image_str_tensor, channels=CHANNELS)
        return image_preprocessing(image)

    # A batch of raw JPEG byte strings; TF Serving unwraps the
    # {"b64": ...} JSON field into these bytes automatically.
    input_ph = tf.placeholder(tf.string, shape=[None])
    images_tensor = tf.map_fn(
        prepare_image, input_ph, back_prop=False, dtype=tf.uint8)
    images_tensor = tf.image.convert_image_dtype(images_tensor, dtype=tf.float32)

    return tf.estimator.export.ServingInputReceiver(
        {'input': images_tensor},     # tensor fed into the model graph
        {'image_bytes': input_ph})    # input name exposed in the REST API

# 'version' is a model-version string you define yourself, e.g. version = '1'.
export_path = os.path.join('/tmp/models/json_b64', version)
if os.path.exists(export_path):  # clean up old exports with this version
    shutil.rmtree(export_path)
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn)
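Once the model is served, a client only needs to read the file, Base64-encode it, and POST the JSON body. A minimal stdlib sketch follows; the endpoint URL and model name "json_b64" are assumptions matching the export path above, so adjust them for your deployment:

```python
import base64
import json
import urllib.request

def make_predict_request(image_path,
                         url='http://localhost:8501/v1/models/json_b64:predict'):
    """Build (but do not send) a TF Serving REST predict request.

    The 'image_bytes' key must match the name exposed by the
    serving input receiver.
    """
    with open(image_path, 'rb') as f:
        b64 = base64.b64encode(f.read()).decode('utf-8')
    body = json.dumps(
        {'instances': [{'image_bytes': {'b64': b64}}]}).encode('utf-8')
    return urllib.request.Request(
        url, data=body, headers={'Content-Type': 'application/json'})

# To actually get predictions against a running server:
#   with urllib.request.urlopen(make_predict_request('test.jpg')) as resp:
#       predictions = json.load(resp)['predictions']
```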

A good example here
