How to serve a retrained Inception model using TensorFlow Serving?

Backend · Unresolved · 3 answers · 821 views
攒了一身酷 · 2020-12-14 13:11

So I have trained an Inception model to recognize flowers according to this guide: https://www.tensorflow.org/versions/r0.8/how_tos/image_retraining/index.html

3 Answers
  • 2020-12-14 13:37

    To serve the graph after you have trained it, you need to export it using this API: https://www.tensorflow.org/versions/r0.8/api_docs/python/train.html#export_meta_graph

    That API generates the MetaGraphDef needed by the serving code (this is what produces the .meta file you are asking about).

    You also need to save a checkpoint using Saver.save() from the Saver class: https://www.tensorflow.org/versions/r0.8/api_docs/python/train.html#Saver

    Once you have done this, you will have both the MetaGraphDef and the checkpoint files that are needed to restore the graph.
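
    A minimal sketch of that flow — the export directory and the dummy variable are hypothetical stand-ins for your retrained graph; the TF 1.x API is used here via tf.compat.v1 so it also runs on TF 2 installs (on r0.8-era installs, just `import tensorflow as tf`):

    import os
    import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on 1.x

    tf.disable_eager_execution()  # needed only on TF 2; no-op semantics for 1.x code

    # Hypothetical export directory; adjust to your setup.
    EXPORT_DIR = '/tmp/retrained_inception'

    # Assume the retrained graph (with its variables) is already built in the
    # default graph; this dummy variable stands in for it.
    weights = tf.Variable(tf.zeros([2]), name='weights')

    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Saver.save() writes the checkpoint (the variable values) and, by
        # default, also a .meta file containing the MetaGraphDef.
        ckpt_prefix = saver.save(sess, os.path.join(EXPORT_DIR, 'model.ckpt'))
        # export_meta_graph() can also be called explicitly, e.g. to write
        # the MetaGraphDef somewhere else.
        tf.train.export_meta_graph(filename=ckpt_prefix + '.meta')

    After this runs, EXPORT_DIR contains the checkpoint files (model.ckpt.index, model.ckpt.data-*) plus model.ckpt.meta — exactly the pair the serving code needs to restore the graph.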

  • 2020-12-14 13:42

    You have to export the model. I have a PR that exports the model during retraining. The gist of it is below:

    import tensorflow as tf
    
    def export_model(sess, architecture, saved_model_dir):
      if architecture == 'inception_v3':
        input_tensor = 'DecodeJpeg/contents:0'
      elif architecture.startswith('mobilenet_'):
        input_tensor = 'input:0'
      else:
        raise ValueError('Unknown architecture', architecture)
      in_image = sess.graph.get_tensor_by_name(input_tensor)
      inputs = {'image': tf.saved_model.utils.build_tensor_info(in_image)}
    
      out_classes = sess.graph.get_tensor_by_name('final_result:0')
      outputs = {'prediction': tf.saved_model.utils.build_tensor_info(out_classes)}
    
      signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs=inputs,
        outputs=outputs,
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
      )
    
      legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')
    
      # Save out the SavedModel.
      builder = tf.saved_model.builder.SavedModelBuilder(saved_model_dir)
      builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
          tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        },
        legacy_init_op=legacy_init_op)
      builder.save()
    

    The above will create a variables directory and a saved_model.pb file. If you put them under a parent directory named for the version number (e.g. 1/), you can then launch TensorFlow Serving via:

    tensorflow_model_server --port=9000 --model_name=inception --model_base_path=/path/to/saved_models/
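
    Once the server is up, a client can send it a gRPC PredictRequest. A sketch, assuming the grpcio and tensorflow-serving-api packages; the helper names, host, and flower.jpg are hypothetical, while 'inception' matches --model_name above and 'serving_default' is the DEFAULT_SERVING_SIGNATURE_DEF_KEY used in the export code:

    import grpc
    import tensorflow as tf
    from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

    def make_request(image_bytes, model_name='inception'):
        """Build a PredictRequest for the retrained Inception SavedModel."""
        request = predict_pb2.PredictRequest()
        request.model_spec.name = model_name  # must match --model_name
        request.model_spec.signature_name = 'serving_default'
        # The 'image' input maps to DecodeJpeg/contents:0, a scalar string
        # tensor holding the raw JPEG bytes.
        request.inputs['image'].CopyFrom(
            tf.make_tensor_proto(image_bytes, shape=[]))
        return request

    def classify(image_path, host='localhost:9000'):
        channel = grpc.insecure_channel(host)
        stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
        with open(image_path, 'rb') as f:
            request = make_request(f.read())
        return stub.Predict(request, 10.0)  # 10-second timeout

    Calling classify('flower.jpg') against a running server returns a PredictResponse whose outputs['prediction'] holds the final_result scores.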
    
  • 2020-12-14 13:45

    Check out this gist for how to load your .pb output graph in a Session:

    https://github.com/eldor4do/Tensorflow-Examples/blob/master/retraining-example.py
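
    The core of that approach — parsing the frozen GraphDef (the output_graph.pb written by the retraining script) and running it in a Session — looks roughly like this; the paths are hypothetical, and tf.compat.v1 is used so the TF 1.x calls also run on TF 2 installs:

    import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on 1.x

    def load_graph(pb_path):
        """Parse a frozen GraphDef file and import it into a fresh Graph."""
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(pb_path, 'rb') as f:
            graph_def.ParseFromString(f.read())
        graph = tf.Graph()
        with graph.as_default():
            tf.import_graph_def(graph_def, name='')  # name='' keeps tensor names
        return graph

    # Usage (hypothetical paths; tensor names are the ones from the
    # retraining guide):
    # graph = load_graph('/tmp/output_graph.pb')
    # with tf.Session(graph=graph) as sess:
    #     preds = sess.run('final_result:0',
    #                      {'DecodeJpeg/contents:0': open('flower.jpg', 'rb').read()})

    Note this loads a frozen GraphDef for inference in-process; it is not the SavedModel format that tensorflow_model_server expects.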
