How to keep tensorflow session open between predictions? Loading from SavedModel

Backend · Unresolved · 3 answers · 737 views
Asked by 庸人自扰, 2021-02-05 18:00

I trained a TensorFlow model that I'd like to run predictions with from numpy arrays. This is for image processing within videos; I will pass the images to the model as they happen.

3 Answers
  • 2021-02-05 18:18

    Your code does not work because you open the session inside your __init__ function and then close it, so no session exists once __init__ has finished.

    If you want to make many predictions after your model has been trained, I recommend that you not reinvent the wheel and instead use the tool the TF developers created for exactly this purpose: TF Serving.

    TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.

    They have a lot of tutorials, starting from very basic ones, and spending a day learning the tool will save you months later.
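
    For reference, TF Serving's REST API expects a JSON body of the form {"instances": [...]} posted to /v1/models/<name>:predict. Here is a minimal sketch of building such a request; the server URL and model name are assumptions, so adjust them to your own deployment:

    ```python
    import json

    # Assumed deployment details -- substitute your own host and model name.
    SERVER_URL = "http://localhost:8501/v1/models/my_model:predict"

    def make_predict_request(instances):
        """Build the JSON body TensorFlow Serving's REST predict API
        expects: {"instances": [...]}, one entry per input example."""
        return json.dumps({"instances": instances})

    # With a running server, sending the request would look like:
    #   import requests
    #   response = requests.post(SERVER_URL, data=make_predict_request(batch))
    #   predictions = response.json()["predictions"]
    ```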

  • 2021-02-05 18:22

    Others have explained why you can't put your session in a with statement in the constructor.

    The reason you see different behavior when using the context manager vs. not is because tf.saved_model.loader.load has some weird interactions between the default graph and the graph that is part of the session.

    The solution is simple; don't pass a graph to tf.Session if you're not using it in a with block:

    sess = tf.Session()
    tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
    

    Here's some example code for a class to do predictions:

    import glob
    import tensorflow as tf

    class Model(object):
    
      def __init__(self, model_path):
        # Note, if you don't want to leak this, you'll want to turn Model into
        # a context manager. In practice, you probably don't have to worry
        # about it.
        self.session = tf.Session()
    
        tf.saved_model.loader.load(
            self.session,
            [tf.saved_model.tag_constants.SERVING],
            model_path)
    
        self.softmax_tensor = self.session.graph.get_tensor_by_name('final_ops/softmax:0')
    
      def predict(self, images):
    predictions = self.session.run(self.softmax_tensor, {'Placeholder:0': images})
        # TODO: convert to human-friendly labels
        return predictions
    
    
    images = [tf.gfile.FastGFile(f, 'rb').read() for f in glob.glob("*.jpg")]
    model = Model('model_path')
    print(model.predict(images))
    
    # Alternatively (uses less memory, but has lower throughput):
    for f in glob.glob("*.jpg"):
      print(model.predict([tf.gfile.FastGFile(f, 'rb').read()]))
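
    If you do want to avoid leaking the session, as the comment in __init__ notes, one option is to turn the class into a context manager. The sketch below shows only the pattern: ManagedModel is a hypothetical name, and the session factory is injected so the lifecycle can be demonstrated without a SavedModel on disk -- with TensorFlow you would pass tf.Session and call tf.saved_model.loader.load inside __enter__:

    ```python
    class ManagedModel(object):
        """Context-manager variant of the Model class above (hypothetical
        name). The session is opened in __enter__ and always closed in
        __exit__, even if prediction raises."""

        def __init__(self, model_path, session_factory):
            self._model_path = model_path
            self._session_factory = session_factory  # e.g. tf.Session
            self.session = None

        def __enter__(self):
            self.session = self._session_factory()
            # With TensorFlow, the model would be restored here:
            # tf.saved_model.loader.load(
            #     self.session, [tf.saved_model.tag_constants.SERVING],
            #     self._model_path)
            return self

        def __exit__(self, exc_type, exc_value, traceback):
            self.session.close()
            return False  # don't suppress exceptions from the with block
    ```

    Usage mirrors the plain class: `with ManagedModel('model_path', tf.Session) as model: ... model.predict(images) ...`, and the session is closed when the block exits.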
    
  • 2021-02-05 18:23

    Your code opens the session in a with statement, so the session is closed as soon as __init__ exits that scope.

    def __init__(self): 
      with tf.Session(graph=tf.Graph()) as self.sess:
    tf.saved_model.loader.load(self.sess, [tf.saved_model.tag_constants.SERVING], "model")
    

    The following should work for you if you have everything else working properly.

    def __init__(self):   
      self.sess=tf.Session(graph=tf.Graph())
    tf.saved_model.loader.load(self.sess, [tf.saved_model.tag_constants.SERVING], "model")
    

    When I do something like this, I also usually add the option of passing the session into the class as a parameter; then, when I instantiate the class, I pass in a session created in a with block.
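
    That option can be sketched like this (an illustrative constructor, not the poster's exact code): the class uses a caller-supplied session if one is given, and only closes sessions it opened itself. `session_factory` stands in for tf.Session so the ownership logic can be shown on its own:

    ```python
    class Model(object):
        """Sketch of accepting an externally managed session. If the
        caller passes one (e.g. opened in a `with` block), the caller
        owns its lifetime; otherwise the class opens, and is responsible
        for, its own session via session_factory."""

        def __init__(self, session=None, session_factory=None):
            self._owns_session = session is None
            self.sess = session if session is not None else session_factory()
            # With TensorFlow, the model would be restored here:
            # tf.saved_model.loader.load(
            #     self.sess, [tf.saved_model.tag_constants.SERVING], "model")

        def close(self):
            # Only close what we opened; a borrowed session is the
            # caller's responsibility.
            if self._owns_session:
                self.sess.close()
    ```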
