How to prevent tensorflow from allocating the totality of a GPU memory?

南旧 2020-11-22 04:26

I work in an environment in which computational resources are shared, i.e., we have a few server machines equipped with a few Nvidia Titan X GPUs each.

For small to moderately sized models, the 12 GB of the Titan X is usually enough for 2-3 people to run training concurrently on the same GPU. The problem is that TensorFlow, by default, allocates the full amount of available GPU memory when it is launched, even for a small model. Is there a way to make TensorFlow only allocate, say, 4 GB of GPU memory, if one knows that this is enough for the given model?

16 Answers
  •  时光说笑
    2020-11-22 04:56

    Tensorflow 2.0 Beta and (probably) beyond

    The API has changed again. It can now be found at:

    tf.config.experimental.set_memory_growth(
        device,
        enable
    )
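
    A minimal usage sketch, assuming TensorFlow 2.x with at least one visible GPU (memory growth has to be set before any GPU is initialized):

    import tensorflow as tf

    # Ask TensorFlow to allocate GPU memory on demand instead of grabbing
    # all of it up front. This must run before any GPU has been initialized.
    gpus = tf.config.experimental.list_physical_devices('GPU')
    for gpu in gpus:
        try:
            tf.config.experimental.set_memory_growth(gpu, True)
        except RuntimeError as e:
            # Memory growth cannot be changed once the GPU is initialized.
            print(e)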
    

    Aliases:

    • tf.compat.v1.config.experimental.set_memory_growth
    • tf.compat.v2.config.experimental.set_memory_growth

    References:

    • https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/config/experimental/set_memory_growth
    • https://www.tensorflow.org/guide/gpu#limiting_gpu_memory_growth

    See also: Tensorflow - Use a GPU: https://www.tensorflow.org/guide/gpu

    For Tensorflow 2.0 Alpha, see: this answer
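
    The guide linked above also covers capping TensorFlow at a fixed amount of GPU memory via a virtual device configuration, which is closer to what the question asks for. A rough sketch, assuming a 4096 MB limit on the first GPU (the limit and the device index are placeholders):

    import tensorflow as tf

    gpus = tf.config.experimental.list_physical_devices('GPU')
    if gpus:
        try:
            # Let TensorFlow use at most 4096 MB of the first GPU.
            tf.config.experimental.set_virtual_device_configuration(
                gpus[0],
                [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=4096)])
        except RuntimeError as e:
            # Virtual devices must be configured before GPUs are initialized.
            print(e)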
