Why is TF Keras inference way slower than Numpy operations?

挽巷 2021-02-05 14:38

I'm working on a reinforcement learning model implemented with Keras and TensorFlow. It requires frequent calls to model.predict() on single inputs.

While testing inference […]

3 Answers
  • 2021-02-05 14:43

    The memory leak issue still seems to persist in Keras. The following lines of code mentioned in that issue did the trick for me:

    from tensorflow.keras import backend as K  # or `from keras import backend as K` for standalone Keras
    import gc
    
    model = ...  # build or load your model
    # ... run your computations ...
    del model          # drop the Python reference
    K.clear_session()  # reset Keras' global graph state
    gc.collect()       # reclaim the freed objects now
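    If the model is rebuilt inside a loop (common in reinforcement-learning sweeps), the cleanup above can be applied once per iteration. A minimal sketch under TF 2.x with tf.keras; the tiny Dense model is illustrative only, not part of the original answer:

    ```python
    import gc
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import backend as K

    for trial in range(3):
        # Illustrative throwaway model; substitute your own architecture.
        model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])
        out = model.predict(np.zeros((1, 4), dtype="float32"), verbose=0)

        del model          # drop the Python reference
        K.clear_session()  # reset Keras' global graph state
        gc.collect()       # reclaim the freed objects now
    ```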
    
  • 2021-02-05 14:45

    A little late, but maybe useful for someone:

    Replace model.predict(X) with model.predict(X, batch_size=len(X))

    That should do it.
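    Much of predict()'s cost on single inputs is fixed per-call overhead, so batching amortizes it, and calling the model directly skips that machinery entirely. A hedged sketch under TF 2.x; the model and shapes are made up for illustration:

    ```python
    import numpy as np
    import tensorflow as tf

    model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])
    X = np.random.rand(32, 8).astype("float32")

    # Slow: one predict() dispatch per sample.
    slow = np.vstack([model.predict(x[None, :], verbose=0) for x in X])

    # Faster: a single call over the whole batch.
    fast = model.predict(X, batch_size=len(X), verbose=0)

    # Often fastest for small inputs: bypass predict() entirely.
    direct = model(X, training=False).numpy()
    ```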

  • 2021-02-05 14:51

    Are you running your Keras model (with TensorFlow backend) in a loop? If so, Keras has a memory leak issue identified here: LINK

    In this case you have to import the following:

    # Note: this targets standalone Keras on the TensorFlow 1.x backend;
    # tensorflow_backend and _SESSION are private APIs, removed in TF 2.x.
    import keras.backend.tensorflow_backend
    import tensorflow as tf
    
    from keras.backend import clear_session

    Finally, you have to put the following at the end of every iteration of a loop after you're done doing your computations:

    clear_session()
    if keras.backend.tensorflow_backend._SESSION:
        tf.reset_default_graph()                           # discard the accumulated graph
        keras.backend.tensorflow_backend._SESSION.close()  # release session resources
        keras.backend.tensorflow_backend._SESSION = None
    

    This should free memory at the end of every iteration and keep the process from slowing down over time. I hope this helps.
