How to prevent tensorflow from allocating the totality of a GPU memory?

南旧 2020-11-22 04:26

I work in an environment in which computational resources are shared, i.e., we have a few server machines equipped with a few Nvidia Titan X GPUs each.

For small to m

16 answers
  • 2020-11-22 04:48

    For TensorFlow 2.0 and 2.1 (docs):

    import tensorflow as tf
    tf.config.gpu.set_per_process_memory_growth(True)
    

    For TensorFlow 2.2+ (docs):

    import tensorflow as tf
    gpus = tf.config.experimental.list_physical_devices('GPU')
    for gpu in gpus:
      tf.config.experimental.set_memory_growth(gpu, True)
    

    The docs also list some more methods:

    • Set environment variable TF_FORCE_GPU_ALLOW_GROWTH to true.
    • Use tf.config.experimental.set_virtual_device_configuration to set a hard limit on a Virtual GPU device.
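The second method above can be sketched as follows (a sketch, not from the original answer; the 2048 MB limit is an arbitrary example value, and the API shown is the TF 2.x experimental one):

```python
import tensorflow as tf

# Cap the first GPU at 2 GB by creating a single virtual device
# with a hard memory_limit (in MB). Must run before any op
# initializes the GPU.
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    tf.config.experimental.set_virtual_device_configuration(
        gpus[0],
        [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=2048)])
```

Unlike memory growth, this gives each process a fixed budget, which is often preferable on shared machines.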
  • 2020-11-22 04:49
    config = tf.ConfigProto()
    config.gpu_options.allow_growth=True
    sess = tf.Session(config=config)
    

    https://github.com/tensorflow/tensorflow/issues/1578

  • 2020-11-22 04:51

    You can use

    TF_FORCE_GPU_ALLOW_GROWTH=true
    

    in your environment variables.
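For example, set it in the shell before launching a script (`train.py` is a hypothetical script name):

```shell
# Ask TensorFlow's GPU allocator to grow allocations on demand
# instead of grabbing nearly all GPU memory at start-up.
export TF_FORCE_GPU_ALLOW_GROWTH=true
python train.py
```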

    In tensorflow code:

    bool GPUBFCAllocator::GetAllowGrowthValue(const GPUOptions& gpu_options) {
      const char* force_allow_growth_string =
          std::getenv("TF_FORCE_GPU_ALLOW_GROWTH");
      if (force_allow_growth_string == nullptr) {
        return gpu_options.allow_growth();
      }
      // ... (excerpt; the rest of the function parses the variable's value)
    }
    
  • 2020-11-22 04:51

    Shameless plug: if you install the GPU-supported TensorFlow build, the session will first allocate memory on all GPUs, whether you tell it to use only the CPU or the GPU. I would add the tip that even if you set the graph to use the CPU only, you should set the same configuration (as answered above) to prevent unwanted GPU occupation.

    In an interactive interface like IPython or Jupyter, you should also set this configuration; otherwise it will allocate all the memory and leave almost none for others. This is sometimes hard to notice.
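In TF 2.x, one way to guarantee that a CPU-only run never touches GPU memory (a sketch, not from the original answer; assumes TF 2.1 or later) is to hide the GPUs from the process before any op executes:

```python
import tensorflow as tf

# Make no GPUs visible to this process; TensorFlow then never
# initializes them, so no GPU memory is allocated at all.
tf.config.set_visible_devices([], 'GPU')
```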

  • 2020-11-22 04:52

    For Tensorflow version 2.0 and 2.1 use the following snippet:

     import tensorflow as tf
     gpu_devices = tf.config.experimental.list_physical_devices('GPU')
     tf.config.experimental.set_memory_growth(gpu_devices[0], True)
    

    For prior versions, the following snippet used to work for me:

    import tensorflow as tf
    tf_config=tf.ConfigProto()
    tf_config.gpu_options.allow_growth=True
    sess = tf.Session(config=tf_config)
    
  • 2020-11-22 04:52

    If you're using TensorFlow 2, try the following:

    config = tf.compat.v1.ConfigProto()
    config.gpu_options.allow_growth = True
    session = tf.compat.v1.Session(config=config)
    
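A related `compat.v1` knob caps the fraction of each GPU's memory the process may claim up front (a sketch, not from the original answer; 0.5 is an arbitrary example value):

```python
import tensorflow as tf

config = tf.compat.v1.ConfigProto()
# Claim at most half of each visible GPU's memory.
config.gpu_options.per_process_gpu_memory_fraction = 0.5
session = tf.compat.v1.Session(config=config)
```

This trades the flexibility of `allow_growth` for a predictable per-process budget.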