Transfer learning from the pre-trained NASNet network: how do I decide how many layers to freeze?

夕颜 2021-01-07 01:18

In order to train a model for image classification (using Keras or TensorFlow), I want to retrain a certain number of layers of NASNetMobile using my own dataset of images. How do I decide how many layers to freeze?

1 Answer
  • 2021-01-07 02:01

    Considering where the auxiliary branch begins, I'd try freezing the layers before activation_166 and fine-tuning everything from that point on. Something like this:

    # Note: this NASNetLarge signature (dropout, use_auxiliary_branch) comes from a
    # custom Keras NASNet implementation, not keras.applications.NASNetLarge.
    model = NASNetLarge((img_rows, img_cols, img_channels), dropout=0.5,
                        use_auxiliary_branch=True, include_top=True,
                        weights=None, classes=nb_classes)
    
    # Load the pre-trained weights, skipping layers whose shapes don't match
    # (e.g. the new classification head).
    model.load_weights('weights/NASNet-large.h5', by_name=True, skip_mismatch=True)
    
    # Freeze everything before activation_166, fine-tune everything after it
    model.trainable = True
    set_trainable = False
    for layer in model.layers:
        if layer.name == 'activation_166':
            set_trainable = True
        layer.trainable = set_trainable
        print("layer {} is {}".format(layer.name, '+++trainable' if layer.trainable else '---frozen'))
    