tensorflow: what's the difference between tf.nn.dropout and tf.layers.dropout

面向向阳花 2021-02-01 05:34

I'm quite confused about whether to use tf.nn.dropout or tf.layers.dropout.

Many MNIST CNN examples seem to use tf.nn.dropout, with keep_prob as one of its parameters.

4 Answers
  •  梦毁少年i
    2021-02-01 06:00

    The idea is the same; only the parameters differ slightly. In nn.dropout, keep_prob is the probability that each element is kept. In layers.dropout, rate is the fraction of input units to drop, so rate=0.1 would drop out 10% of them.

    So keep_prob = 1 - rate. Also, layers.dropout takes a training parameter: when training is False, the layer is a no-op and passes its input through unchanged.

    In general, just read the documentation of the functions you care about carefully and you will see the differences.
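    To make the keep_prob = 1 - rate relationship concrete, here is a minimal NumPy sketch of the two parameterizations (not TensorFlow's actual implementation; the function names are illustrative). Both keep each element with the same probability and scale the survivors by 1/keep_prob so the expected value of the output matches the input, which is what inverted dropout does:

    ```python
    import numpy as np

    def dropout_keep_prob(x, keep_prob, rng):
        # tf.nn.dropout-style: keep each element with probability keep_prob,
        # scaling kept elements by 1/keep_prob (inverted dropout).
        mask = rng.random(x.shape) < keep_prob
        return np.where(mask, x / keep_prob, 0.0)

    def dropout_rate(x, rate, rng, training=True):
        # tf.layers.dropout-style: rate is the fraction dropped, and the
        # op is an identity at inference time (training=False).
        if not training:
            return x
        return dropout_keep_prob(x, 1.0 - rate, rng)
    ```

    With the same random state, dropout_keep_prob(x, 0.9, ...) and dropout_rate(x, 0.1, ...) produce identical outputs, which is exactly the keep_prob = 1 - rate correspondence.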
