Question
I am doing a kind of transfer learning. What I have done is: first, train the model on a big dataset and save the weights; then, train the model on my own dataset with the layers frozen. But I saw some overfitting, so I tried to change the model's dropout and then load the saved weights, since the numbers change as the dropout changes. I am finding it difficult to change the dropout.
My question, put directly: is it possible to change the model's dropout while loading the weights?
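Roughly, my freezing step looks like the sketch below (the file name and the number of frozen layers are just placeholders, not my real code):
model.load_weights("pretrained_weights.h5")  # weights from the big dataset

for layer in model.layers[:-2]:  # freeze all but the last layers
    layer.trainable = False

model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])  # recompile so the freeze takes effect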
My scenario 1 is like this:
- model defined.
- train the model.
- save weights.
...
- redefine the dropout; nothing else in the model is changed.
- load the weights. I got the error.
My 2nd scenario is like this:
- model1 defined.
- train the model.
- save weights.
- load the model1 weights into model1.
....
- model2 defined by changing the dropouts.
- try to set the weights of model1 on model2 using a for loop, except for the dropout layers. I got an error.
This is the error I got.
File "/home/sathiyakugan/PycharmProjects/internal-apps/apps/support-tools/EscalationApp/LSTM_Attention_IMDB_New_open.py", line 343, in <module>
NewModel.layers[i].set_weights(layer.get_weights())
File "/home/sathiyakugan/PycharmProjects/Python/venv/lib/python3.5/site-packages/keras/engine/base_layer.py", line 1062, in set_weights
str(weights)[:50] + '...')
ValueError: You called `set_weights(weights)` on layer "lstm_5" with a weight list of length 1, but the layer was expecting 3 weights. Provided weights: [array([[ 0. , 0. , 0. , ..., 0....
What is the right way to go about this? Since I am new to Keras, I am struggling to get further.
Answer 1:
I recommend loading the weights with model.load_weights("weights_file.h5") and then trying the following:
for layer in model.layers:
    if hasattr(layer, 'rate'):
        layer.rate = 0.5
Since only the Dropout layers have the attribute rate, when you find a layer with this attribute you can change it. Here I use 0.5 as the dropout probability; you can put whatever value you want.
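For example, a minimal self-contained sketch of this recipe (the tiny architecture and the commented-out file name are placeholders, not your actual model):
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential([
    Dense(64, activation='relu', input_shape=(100,)),
    Dropout(0.2),
    Dense(10, activation='softmax'),
])

# model.load_weights("weights_file.h5")  # load the previously saved weights

for layer in model.layers:
    if hasattr(layer, 'rate'):  # only Dropout layers have `rate`
        layer.rate = 0.5        # the new dropout probability

for layer in model.layers:
    if hasattr(layer, 'rate'):
        print(layer.name, layer.rate)  # confirm the rates were updated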
Edit: if you are setting the weights layer by layer, you can fold the above if into your for loop through the layers.
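In code, the combined loop could look like the sketch below. It assumes model1 and NewModel (your variable names) have the same layers in the same order and differ only in their dropout rates; if the layers do not line up one to one, the per-layer copy fails with a mismatch error like the one in your traceback:
for old_layer, new_layer in zip(model1.layers, NewModel.layers):
    if hasattr(new_layer, 'rate'):
        new_layer.rate = 0.5  # the new dropout probability
    # Dropout layers carry no weights (get_weights() returns an empty
    # list), so this copy is a harmless no-op for them.
    new_layer.set_weights(old_layer.get_weights())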
IMPORTANT: after this you have to compile the model again:
from keras.optimizers import SGD
model.compile(optimizer=SGD(lr=1e-3, momentum=0.9), loss='categorical_crossentropy', metrics=['accuracy'])
Again, the parameters passed here are just for example purposes, so change them according to your problem.
Source: https://stackoverflow.com/questions/52200599/add-dropout-after-loading-the-weights-in-keras