How to disable dropout during prediction in Keras?


I am using dropout in a neural network model in Keras. A small piece of the code looks like this:

model.add(Dropout(0.5))
model.add(Dense(classes))

For testing (prediction), is dropout still applied, and how can I disable it?

4 Answers
  • 2021-01-03 23:05

    Keras does this by default. In Keras, dropout is disabled in test mode. You can look at the code here and see that the dropped input is used during training and the actual input during testing.

    As far as I know, you have to build your own prediction function from the layers and specify the training flag to predict with dropout (it's not possible to specify a training flag for the built-in predict functions). This is a problem if you want to build GANs, which use the intermediate output for training and also train the network as a whole, because of the divergence between generated training images and generated test images.
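    In newer versions (tf.keras in TensorFlow 2.x) you can also force dropout at prediction time by calling the model directly instead of using predict. A minimal sketch, assuming model is a compiled model containing Dropout layers:

    import numpy as np

    # Hypothetical input batch; adjust the shape to your model.
    x = np.random.rand(1, 20).astype("float32")

    # Calling the model directly lets you pass the training flag:
    # training=True keeps the Dropout layers active.
    stochastic_out = model(x, training=True)

    # training=False (what predict uses) disables dropout.
    deterministic_out = model(x, training=False)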

  • 2021-01-03 23:06

    Dropout removes certain neurons from play, and to compensate for that one of two approaches is usually taken:

    1. scaling the activation at test time
    2. inverting the dropout during the training phase

    Keras uses the second form of correction, as you can see here.
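    A minimal NumPy sketch of inverted dropout, as an illustration of the idea rather than Keras' actual implementation:

    import numpy as np

    def inverted_dropout(x, rate, training=True):
        # At test time dropout is a no-op: activations pass through
        # unchanged, which is what predict does in Keras.
        if not training:
            return x
        keep_prob = 1.0 - rate
        mask = np.random.binomial(1, keep_prob, size=x.shape)
        # Scaling by 1/keep_prob at train time keeps the expected
        # activation unchanged, so no correction is needed at test time.
        return x * mask / keep_prob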

  • 2021-01-03 23:16

    You can re-enable dropout at prediction time in an already trained model (one that contains Dropout layers) by building a backend function that takes the learning phase as an extra input:

    from keras import backend as K

    # The learning phase controls dropout: 1 = training mode (dropout
    # active), 0 = test mode (dropout disabled).
    f = K.function([model.layers[0].input, K.learning_phase()],
                   [model.layers[-1].output])

    This way you don't have to retrain the model!
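    A usage sketch, assuming x is a NumPy batch matching the model's input shape:

    # Pass 1 as the learning phase to predict with dropout active,
    # or 0 for the usual deterministic prediction.
    stochastic_pred = f([x, 1])[0]
    deterministic_pred = f([x, 0])[0]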

  • 2021-01-03 23:20

    As previously stated, dropout in Keras happens only at train time (with proportionate weight adjustment during training such that learned weights are appropriate for prediction when dropout is disabled).

    This is not ideal for cases in which we wish to use a dropout network as a probabilistic predictor (one that produces a distribution when asked to predict the same inputs repeatedly). In other words, Keras' Dropout layer is designed to give you regularization at train time, but the "mean function" of the learned distribution when predicting.

    If you want to retain dropout for prediction, you can easily implement a permanent dropout ("PermaDropout") layer (this was based on suggestions made by F. Chollet on the GitHub discussion area for Keras):

    from keras.layers.core import Lambda
    from keras import backend as K
    
    # K.dropout applies dropout unconditionally (there is no learning-phase
    # switch here), so the layer stays active at prediction time too.
    def PermaDropout(rate):
        return Lambda(lambda x: K.dropout(x, level=rate))
    

    By replacing any dropout layer in a Keras model with "PermaDropout", you'll get the probabilistic behavior in prediction as well.

    # define the LSTM model
    from keras.models import Sequential
    from keras.layers import LSTM, Dense
    
    n_vocab = text_to_train.n_vocab
    
    model = Sequential()
    model.add(LSTM(n_vocab*4, 
              input_shape=input_shape, 
              return_sequences=True))
    # Replace Dropout with PermaDropout
    # model.add(Dropout(0.3))
    model.add(PermaDropout(0.3))
    model.add(LSTM(n_vocab*2))
    # Replace Dropout with PermaDropout
    # model.add(Dropout(0.3))
    model.add(PermaDropout(0.3))
    #model.add(Dense(n_vocab*2))
    model.add(Dense(n_vocab, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
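
    With PermaDropout in place, repeated predictions on the same input differ, which gives a Monte Carlo estimate of the predictive distribution. A sketch, assuming x is a batch matching the model's input shape:

    import numpy as np

    # Each predict call samples a fresh dropout mask, so the spread
    # across repetitions reflects the model's uncertainty.
    preds = np.stack([model.predict(x) for _ in range(30)])
    mean_pred = preds.mean(axis=0)
    uncertainty = preds.std(axis=0)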
    