How to prevent tensorflow from allocating the totality of a GPU memory?

南旧 2020-11-22 04:26

I work in an environment in which computational resources are shared, i.e., we have a few server machines equipped with a few Nvidia Titan X GPUs each.

For small to moderately sized models, the 12 GB of a Titan X is usually enough for two or three people to run training concurrently on the same GPU. The problem is that TensorFlow, by default, allocates the full amount of available GPU memory when it launches, even for a tiny model. Is there a way to make TensorFlow allocate only, say, 4 GB of GPU memory, if one knows that this is enough for a given model?

16 Answers
  •  有刺的猬
    2020-11-22 04:48

    For TensorFlow 2.0 and 2.1 (docs):

    import tensorflow as tf
    tf.config.gpu.set_per_process_memory_growth(True)
    

    For TensorFlow 2.2+ (docs):

    import tensorflow as tf
    gpus = tf.config.experimental.list_physical_devices('GPU')
    for gpu in gpus:
      # Must be called before any GPU has been initialized,
      # otherwise TensorFlow raises a RuntimeError.
      tf.config.experimental.set_memory_growth(gpu, True)
    

    The docs also list some more methods:

    • Set the environment variable TF_FORCE_GPU_ALLOW_GROWTH to true (sketched below).
    • Use tf.config.experimental.set_virtual_device_configuration to set a hard memory limit on a virtual GPU device (also sketched below).
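
    A minimal sketch of the environment-variable approach, assuming you set it from inside the script; you could equally export it in the shell before launching. It has to take effect before TensorFlow initializes the GPU, so it is set before the import:

    import os
    # Must be set before TensorFlow touches the GPU.
    os.environ['TF_FORCE_GPU_ALLOW_GROWTH'] = 'true'

    import tensorflow as tf  # GPU memory now grows on demand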
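
    And a minimal sketch of the hard-limit approach; the 4096 MB cap is just an example value, pick whatever fits your model:

    import tensorflow as tf

    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
      # Expose the first physical GPU as a single virtual device capped at 4 GB.
      # Must run before the GPU is initialized.
      tf.config.experimental.set_virtual_device_configuration(
          gpus[0],
          [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=4096)])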
