tensorflow: what's the difference between tf.nn.dropout and tf.layers.dropout

Front-end · Unresolved · 4 answers · 1936 views
面向向阳花 2021-02-01 05:34

I'm quite confused about whether to use tf.nn.dropout or tf.layers.dropout.

Many MNIST CNN examples seem to use tf.nn.dropout, with keep_prob as one of the parameters.

4 Answers
  •  无人共我
    2021-02-01 05:46

    A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper for tf.nn.dropout.

    The only differences between the two functions are:

    1. tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept",
      while tf.layers.dropout has the parameter rate: "The dropout rate".
      Thus, keep_prob = 1 - rate, as defined here.
    2. tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)." tf.nn.dropout has no such switch; it always applies dropout.
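    The two parameterizations can be illustrated with a minimal NumPy sketch (a hypothetical `dropout` helper, not the actual TensorFlow implementation): surviving elements are rescaled by 1/keep_prob so the expected activation is unchanged, and a `training` flag mirrors the tf.layers.dropout behavior of returning the input untouched at inference time.

    ```python
    import numpy as np

    def dropout(x, rate=0.5, training=True, seed=None):
        """Toy dropout (illustrative only, not the TensorFlow API):
        zeroes each element with probability `rate` and rescales the
        survivors by 1 / keep_prob, where keep_prob = 1 - rate."""
        if not training or rate == 0.0:
            return x  # inference mode: input returned untouched
        keep_prob = 1.0 - rate  # tf.nn.dropout's parameterization
        rng = np.random.default_rng(seed)
        mask = rng.random(x.shape) < keep_prob
        return np.where(mask, x / keep_prob, 0.0)

    x = np.ones((4, 4))
    print(dropout(x, rate=0.3, training=False))  # unchanged: inference mode
    print(dropout(x, rate=0.3, seed=0))          # zeros and values of 1/0.7
    ```

    With tf.nn.dropout you would pass keep_prob=0.7 to get the same behavior as rate=0.3 here, and you would have to gate the call yourself (e.g. with a placeholder) to disable it at inference time.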
