Question
I would like to develop a convolutional network architecture whose first layer (Conv1D in this case) contains a prespecified set of fixed, untrainable filters alongside several trainable filters that the model can learn. Is this possible, and how would it be done?
My intuition is that I can make two separate Conv1D layers - one trainable and one untrainable - and then somehow concatenate them, but I'm not sure what this would look like in code. Also, for the untrainable filters, how do I prespecify the weights?
Answer 1:
This is quite easy with the functional API:
from keras.layers import Input, Conv1D, Concatenate

inp = Input(...)                               # "in" is a reserved word in Python; pass shape=(steps, channels)
convA = Conv1D(filters1, kernel_size1, ...)    # trainable filters
convB = Conv1D(filters2, kernel_size2, ...)    # fixed filters
convB.trainable = False                        # freeze before compiling the model

conv1 = convA(inp)
conv2 = convB(inp)                             # the layer must be built (called) before set_weights
convB.set_weights(some_weight_array)           # list of arrays matching the layer's weight shapes

convFinal = Concatenate(axis=-1)([conv1, conv2])
I haven't tried the code, but it should work once you fill in the small details.
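As for what some_weight_array might look like: set_weights expects a list of NumPy arrays whose shapes match the layer's weights; for a Conv1D with a bias, that is a kernel of shape (kernel_size, input_channels, filters) plus a bias of shape (filters,). A minimal sketch, assuming a single input channel and 4 fixed filters of width 5 (hypothetical values, not from the original answer):

import numpy as np

# kernel: (kernel_size, input_channels, filters); bias: (filters,)
fixed_kernel = np.zeros((5, 1, 4))
fixed_kernel[:, 0, 0] = 1.0 / 5                # filter 0: a simple moving average
fixed_bias = np.zeros((4,))
some_weight_array = [fixed_kernel, fixed_bias]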
Answer 2:
All Keras layers have a set_weights method (https://keras.io/layers/about-keras-layers/). You can freeze the Conv1D layer with trainable=False (https://keras.io/getting-started/faq/#how-can-i-freeze-keras-layers). Then concatenate the trainable Conv1D and the non-trainable Conv1D using the Concatenate layer (https://keras.io/layers/merge/).
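Putting those three steps together, a minimal end-to-end sketch; the input shape, filter counts, and the random placeholder weights below are assumptions for illustration, not part of the original answer:

import numpy as np
from keras.layers import Input, Conv1D, Concatenate
from keras.models import Model

inp = Input(shape=(100, 1))                     # assumed: sequences of length 100, 1 channel
trainable_conv = Conv1D(8, 5, padding='same')(inp)

fixed_conv_layer = Conv1D(4, 5, padding='same')
fixed_conv_layer.trainable = False              # freeze before compiling
fixed_conv = fixed_conv_layer(inp)              # calling the layer builds its weights

# prespecify the fixed filters: kernel (kernel_size, channels, filters) + bias (filters,)
kernel = np.random.randn(5, 1, 4)               # placeholder for your fixed filters
bias = np.zeros((4,))
fixed_conv_layer.set_weights([kernel, bias])

merged = Concatenate(axis=-1)([trainable_conv, fixed_conv])
model = Model(inputs=inp, outputs=merged)       # add further layers and compile as usual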
Source: https://stackoverflow.com/questions/50178499/specify-some-untrainable-filters-for-keras-convolutional-network