Keras (TensorFlow, CPU): Training Sequential models in a loop eats memory

半阙折子戏 · 2020-12-30 10:37

I am training on the order of 1000 Sequential models in a loop. On every iteration my program leaks memory until it runs out and raises an OOM exception.

I already asked a similar question.

1 Answer
  • 2020-12-30 11:04

    The memory leak stems from Keras and TensorFlow using a single "default graph" to store the network structure, which increases in size with each iteration of the inner for loop.

    Calling K.clear_session() frees some of the (backend) state associated with the default graph between iterations, but an additional call to tf.reset_default_graph() is needed to clear the Python state.
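
    A minimal sketch of that fix, assuming a TF 1.x-style graph workflow (on TF 2.x the reset call lives under tf.compat.v1). The make_model function and the synthetic x/y data below are placeholders, not part of the original question:

        import numpy as np
        import tensorflow as tf
        from tensorflow import keras
        from tensorflow.keras import backend as K

        def make_model():
            # Placeholder architecture; the question's actual model is not shown.
            model = keras.Sequential([
                keras.layers.Dense(32, activation="relu", input_shape=(10,)),
                keras.layers.Dense(1),
            ])
            model.compile(optimizer="adam", loss="mse")
            return model

        x = np.random.rand(100, 10)  # synthetic stand-in data
        y = np.random.rand(100, 1)

        for i in range(1000):
            nn = make_model()                   # adds new nodes to the default graph
            nn.fit(x, y, epochs=1, verbose=0)
            K.clear_session()                   # frees the backend (session) state
            tf.compat.v1.reset_default_graph()  # clears the Python-side graph state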

    Note that there might be a more efficient solution: since nn does not depend on either of the loop variables, you can define it outside the loop, and reuse the same instance inside the loop. If you do that, there is no need to clear the session or reset the default graph, and performance should increase because you benefit from caching between iterations.
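
    A sketch of the reuse approach, continuing with make_model and the x/y data from the previous snippet. One detail the answer leaves implicit: if each run must start from freshly initialized weights, the initial weights can be captured once and restored at the top of every iteration (that re-initialization step is an assumption, not stated above):

        nn = make_model()                    # build the graph once, outside the loop
        initial_weights = nn.get_weights()   # snapshot of the freshly initialized parameters

        for i in range(1000):
            nn.set_weights(initial_weights)  # reset parameters for a fresh training run
            nn.fit(x, y, epochs=1, verbose=0)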
