How to switch Off/On an LSTM layer?

Submitted by 时间秒杀一切 on 2019-12-24 06:34:49

Question


I am looking for a way to make the addition and removal of an LSTM layer event-driven, so that a layer can be added or removed when a function triggers it. For example (hypothetically): add an LSTM layer if a = 2 and remove an LSTM layer if a = 3.

Here a is supposed to be the return value of a Python function, and the LSTM layer should be added or removed depending on that value. I want to attach a switch to the layer so that it can be turned on or off by the Python function.

Is it possible?

Currently, I have to hard-code the layers. For example:

from keras.models import Sequential
from keras.layers import LSTM, Dropout

# Initialising the RNN
regressor = Sequential()

# Adding the first LSTM layer and some Dropout regularization
regressor.add(LSTM(units = 60, return_sequences = True,
                   input_shape = (X_train.shape[1], X_train.shape[2])))
#regressor.add(Dropout(0.1))

# Adding the 2nd LSTM layer and some Dropout regularization
regressor.add(LSTM(units = 60, return_sequences = True))
regressor.add(Dropout(0.1))

My goal is to add and remove these layers at runtime. Any help is appreciated!!
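
For illustration, here is a minimal sketch of the kind of conditional construction I have in mind (the build_regressor helper and the trigger value a are hypothetical names, not part of my current code):

from keras.models import Sequential
from keras.layers import LSTM, Dropout

def build_regressor(a, input_shape):
    # Rebuild the model; include the second LSTM block only when triggered
    regressor = Sequential()
    regressor.add(LSTM(units=60, return_sequences=True, input_shape=input_shape))

    # Hypothetical trigger: add the extra LSTM layer only when a == 2
    if a == 2:
        regressor.add(LSTM(units=60, return_sequences=True))
        regressor.add(Dropout(0.1))

    return regressor

This rebuilds the model from scratch on each trigger, which is the straightforward way to vary the architecture, but it discards the trained weights of the removed layer.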


Answer 1:


I found the answer and am posting it in case anyone else is looking for a solution. This can be done with Keras's layer-freezing functionality. Basically, you pass the boolean trainable argument to the layer constructor to make the layer non-trainable. For example:

    frozen_layer = Dense(32, trainable=False)

Additionally, you can set the trainable property of a layer to True or False after instantiation. For the change to take effect, call compile() on your model after modifying the trainable property. For example:

    from keras.layers import Input, Dense
    from keras.models import Model

    x = Input(shape=(32,))
    layer = Dense(32)
    layer.trainable = False
    y = layer(x)

    frozen_model = Model(x, y)
    # In the model below, the weights of `layer` will not be updated during training
    frozen_model.compile(optimizer='rmsprop', loss='mse')

    layer.trainable = True
    trainable_model = Model(x, y)
    # In this model, the weights of `layer` will be updated during training
    # (which also affects frozen_model, since it uses the same layer instance)
    trainable_model.compile(optimizer='rmsprop', loss='mse')

    frozen_model.fit(data, labels)     # does NOT update the weights of `layer`
    trainable_model.fit(data, labels)  # updates the weights of `layer`
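
Applying the same idea to the LSTM model from the question, here is a minimal sketch (the switch_lstm helper and the trigger value a are hypothetical, and X_train is assumed to be defined as in the question). Note that compile() must be called again each time trainable changes:

    from keras.models import Sequential
    from keras.layers import LSTM, Dropout

    # Keep a handle on the layer you want to switch on/off
    switchable_lstm = LSTM(units=60, return_sequences=True)

    regressor = Sequential()
    regressor.add(LSTM(units=60, return_sequences=True,
                       input_shape=(X_train.shape[1], X_train.shape[2])))
    regressor.add(switchable_lstm)
    regressor.add(Dropout(0.1))

    def switch_lstm(a):
        # Hypothetical trigger: unfreeze the layer for a == 2, freeze it for a == 3
        switchable_lstm.trainable = (a == 2)
        # Recompile so the new trainable setting takes effect
        regressor.compile(optimizer='rmsprop', loss='mse')

Keep in mind that freezing stops the layer's weights from being updated, but the layer still transforms its inputs during the forward pass.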

Hope this helps!!



Source: https://stackoverflow.com/questions/54284898/how-to-switch-off-on-an-lstm-layer
