tensorflow: what's the difference between tf.nn.dropout and tf.layers.dropout

Asked by 面向向阳花 · 2021-02-01 05:34 · 4 answers · 1932 views

I'm quite confused about whether to use tf.nn.dropout or tf.layers.dropout.

Many MNIST CNN examples seem to use tf.nn.dropout, with keep_prob as one of the parameters.

4 Answers
  • 2021-02-01 05:46

    A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper for tf.nn.dropout.

    The only differences between the two functions are:

    1. tf.nn.dropout has the parameter keep_prob: "Probability that each element is kept",
      while tf.layers.dropout has the parameter rate: "The dropout rate".
      Thus, keep_prob = 1 - rate.
    2. tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)."
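
    That wrapper relationship can be sketched in plain NumPy (these are made-up function names mimicking the two TF APIs, not the real implementations, which build graph ops and handle seeds):

    ```python
    import numpy as np

    def nn_dropout(x, keep_prob, rng=np.random):
        # tf.nn.dropout semantics: keep each element with probability
        # keep_prob, and scale survivors by 1/keep_prob ("inverted dropout").
        mask = rng.uniform(size=x.shape) < keep_prob
        return np.where(mask, x / keep_prob, 0.0)

    def layers_dropout(x, rate=0.5, training=False, rng=np.random):
        # tf.layers.dropout semantics: `rate` is the fraction dropped,
        # so it forwards keep_prob = 1 - rate, and is a no-op at inference.
        if not training:
            return x
        return nn_dropout(x, 1.0 - rate, rng)

    x = np.ones((4, 4))
    print(layers_dropout(x, rate=0.3, training=False))  # identical to x
    print(layers_dropout(x, rate=0.3, training=True))   # each element is 0 or 1/0.7
    ```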
  • 2021-02-01 05:51

    In the training phase they are identical (as long as the "drop rate" and "keep rate" are consistent, i.e. keep_prob = 1 - rate). In the evaluation (test) phase, however, they behave completely differently: tf.nn.dropout will still drop elements at random (unless you also change keep_prob to 1), while tf.layers.dropout with training=False drops nothing and passes the input through untouched. In most cases it makes sense to use tf.layers.dropout.

  • 2021-02-01 06:00

    The idea is the same, the parameters are slightly different. In nn.dropout, keep_prob is the probability that each element is kept. In layers.dropout rate=0.1 would drop out 10% of input units.

    So keep_prob = 1 - rate. layers.dropout also accepts a training parameter.

    In general, just read carefully documentation about the functions you care about and you will see the differences.

  • 2021-02-01 06:04

    In addition to the answers from @nikpod and @Salvador Dali:

    tf.nn.dropout scales the kept activations (not the weights) by 1./keep_prob during the training phase, while tf.layers.dropout scales them by 1./(1 - rate), which is the same factor.

    During evaluation, you could set keep_prob to 1, which is equivalent to setting training to False.
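
    The point of that 1./keep_prob scaling is that the expected activation stays unchanged, so nothing needs to be rescaled at inference time. This is easy to check numerically (a NumPy sketch, not the actual TF op):

    ```python
    import numpy as np

    rng = np.random.RandomState(42)
    x = np.ones(1_000_000)
    keep_prob = 0.8

    # Inverted dropout: E[output] = keep_prob * (x / keep_prob) = x,
    # so the mean of the dropped-and-rescaled tensor matches the input.
    mask = rng.uniform(size=x.shape) < keep_prob
    out = np.where(mask, x / keep_prob, 0.0)

    print(out.mean())  # close to 1.0
    ```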
