How to have parallel convolutional layers in keras?

前端 未结 2 407
囚心锁ツ
囚心锁ツ 2020-12-23 18:05

I am a little new to neural networks and keras. I have some images with size 6*7 and the size of the filter is 15. I want to have several filters and train a convolutional l

相关标签:
2条回答
  • 2020-12-23 18:26

    Here is an example of designing a network of parallel convolution and sub sampling layers in keras version 2. I hope this resolves your problem.

    rows, cols = 100, 15
    def create_convnet(img_path='network_image.png'):
        input_shape = Input(shape=(rows, cols, 1))
    
        tower_1 = Conv2D(20, (100, 5), padding='same', activation='relu')(input_shape)
        tower_1 = MaxPooling2D((1, 11), strides=(1, 1), padding='same')(tower_1)
    
        tower_2 = Conv2D(20, (100, 7), padding='same', activation='relu')(input_shape)
        tower_2 = MaxPooling2D((1, 9), strides=(1, 1), padding='same')(tower_2)
    
        tower_3 = Conv2D(20, (100, 10), padding='same', activation='relu')(input_shape)
        tower_3 = MaxPooling2D((1, 6), strides=(1, 1), padding='same')(tower_3)
    
        merged = keras.layers.concatenate([tower_1, tower_2, tower_3], axis=1)
        merged = Flatten()(merged)
    
        out = Dense(200, activation='relu')(merged)
        out = Dense(num_classes, activation='softmax')(out)
    
        model = Model(input_shape, out)
        plot_model(model, to_file=img_path)
        return model
    

    The image of this network will look like

    0 讨论(0)
  • 2020-12-23 18:31

    My approach is to create other model that defines all parallel convolution and pulling operations and concat all parallel result tensors to single output tensor. Now you can add this parallel model graph in your sequential model just like layer. Here is my solution, hope it solves your problem.

    # variable initialization 
    from keras import Input, Model, Sequential
    from keras.layers import Conv2D, MaxPooling2D, Concatenate, Activation, Dropout, Flatten, Dense
    
    nb_filters =100
    kernel_size= {}
    kernel_size[0]= [3,3]
    kernel_size[1]= [4,4]
    kernel_size[2]= [5,5]
    input_shape=(32, 32, 3)
    pool_size = (2,2)
    nb_classes =2
    no_parallel_filters = 3
    
    # create seperate model graph for parallel processing with different filter sizes
    # apply 'same' padding so that ll produce o/p tensor of same size for concatination
    # cancat all paralle output
    
    inp = Input(shape=input_shape)
    convs = []
    for k_no in range(len(kernel_size)):
        conv = Conv2D(nb_filters, kernel_size[k_no][0], kernel_size[k_no][1],
                        border_mode='same',
                             activation='relu',
                        input_shape=input_shape)(inp)
        pool = MaxPooling2D(pool_size=pool_size)(conv)
        convs.append(pool)
    
    if len(kernel_size) > 1:
        out = Concatenate()(convs)
    else:
        out = convs[0]
    
    conv_model = Model(input=inp, output=out)
    
    # add created model grapg in sequential model
    
    model = Sequential()
    model.add(conv_model)        # add model just like layer
    model.add(Conv2D(nb_filters, kernel_size[1][0], kernel_size[1][0]))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=pool_size))
    model.add(Dropout(0.25))
    model.add(Flatten(input_shape=input_shape))
    model.add(Dense(128))
    model.add(Activation('relu'))
    model.add(Dense(128))
    model.add(Activation('relu'))
    model.add(Dropout(0.5))
    model.add(Dense(nb_classes))
    model.add(Activation('tanh'))
    

    For more information refer similar question: Combining the outputs of multiple models into one model

    0 讨论(0)
提交回复
热议问题