AttributeError: 'NoneType' object has no attribute '_inbound_nodes' while trying to add multiple keras Dense layers

Backend · Open · 2 answers · 892 views
北荒 2021-01-24 06:48

The inputs are 3 independent channels of 1000 features each. I'm trying to pass each channel through an independent NN path, then concatenate them into a flat layer, and then apply a fully connected network (FCN).
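For concreteness, the data layout can be sketched in NumPy (the batch size of 8 is an illustrative assumption, not from the question):

```python
import numpy as np

# Hypothetical batch: 8 samples, each with 3 channels of 1000 features
batch = np.random.rand(8, 3, 1000)

# Goal: route each channel through its own sub-network before merging
channel_0 = batch[:, 0]  # shape (8, 1000)
channel_1 = batch[:, 1]
channel_2 = batch[:, 2]
print(channel_0.shape)
```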

2 Answers
  • 2021-01-24 07:39

    Thanks to @CAta.RAy

    I solved it in this way:

    import numpy as np
    from keras import layers
    from keras.layers import Input, Add, Dense, Dropout, Lambda
    from keras.layers import Flatten
    from keras.optimizers import Adam
    from keras.models import Model
    import keras.backend as K
    
    def tst_1(): 
        inputs = Input((3, 1000))
    
        # Slice each channel out with a Lambda layer so the op stays in the Keras graph
        x1 = Lambda(lambda x: x[:, 0])(inputs)
        dense10 = Dense(224, activation='relu')(x1)
        dense11 = Dense(112, activation='relu')(dense10)
        dense12 = Dense(56, activation='relu')(dense11)
    
        x2 = Lambda(lambda x: x[:, 1])(inputs)
        dense20 = Dense(224, activation='relu')(x2)
        dense21 = Dense(112, activation='relu')(dense20)
        dense22 = Dense(56, activation='relu')(dense21)
    
        x3 = Lambda(lambda x: x[:, 2])(inputs)
        dense30 = Dense(224, activation='relu')(x3)
        dense31 = Dense(112, activation='relu')(dense30)
        dense32 = Dense(56, activation='relu')(dense31)
    
        # Add() sums the three 56-unit branch outputs element-wise;
        # use Concatenate() instead if you want the features side by side
        flat = Add()([dense12, dense22, dense32])
    
        dense1 = Dense(224, activation='relu')(flat)
        drop1 = Dropout(0.5)(dense1)
        dense2 = Dense(112, activation='relu')(drop1)
        drop2 = Dropout(0.5)(dense2)
        dense3 = Dense(32, activation='relu')(drop2)
        densef = Dense(1, activation='sigmoid')(dense3)
    
        model = Model(inputs = inputs, outputs = densef)
    
        return model
    
    Net = tst_1()
    Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])
    
    Net.summary()
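Note that the question asked for concatenation, while this solution merges the branches with Add(), which sums them element-wise. The shape difference between the two merge strategies can be checked with a NumPy sketch (the batch size of 4 is an illustrative assumption):

```python
import numpy as np

# Three hypothetical branch outputs: batch of 4, 56 units each
d1, d2, d3 = (np.random.rand(4, 56) for _ in range(3))

added = d1 + d2 + d3  # what Add() does: shape stays (4, 56)
concatenated = np.concatenate([d1, d2, d3], axis=-1)  # what Concatenate() does: (4, 168)
print(added.shape, concatenated.shape)
```

Either merge produces a valid graph; the choice only changes how many features the downstream Dense layers see.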
    
  • 2021-01-24 07:47

    The problem is that splitting the input data with plain tensor indexing such as inputs[0,:,1] is not done through a Keras layer, so the resulting tensor is not tracked by the graph (hence the missing _inbound_nodes).

    You need to create a Lambda layer to be able to accomplish this.
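The slice itself is ordinary tensor indexing; what matters is that it runs inside a layer. In NumPy terms, with the (channels, 1000, 1) input shape used below and an assumed batch size of 8:

```python
import numpy as np

# Input shaped (batch, channels, features, 1), matching Input(shape=(3, 1000, 1))
x = np.random.rand(8, 3, 1000, 1)

# Extract channel 1 the same way the Lambda layer does
channel_1 = x[:, 1, :, :]  # shape (8, 1000, 1)
print(channel_1.shape)
```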

    The following code:

    from keras import layers
    from keras.layers import Input, Add, Dense, Dropout, Lambda, Concatenate
    from keras.layers import Flatten
    from keras.optimizers import Adam
    from keras.models import Model
    import keras.backend as K
    
    
    def tst_1(): 
    
        num_channels = 3
        inputs = Input(shape=(num_channels, 1000, 1))
    
        branch_outputs = []
        for i in range(num_channels):
            # Slice the ith channel; bind i via a default argument so each
            # lambda keeps its own channel index if the function is re-invoked later
            out = Lambda(lambda x, k=i: x[:, k, :, :], name="Lambda_" + str(i))(inputs)
    
            # Per-channel layers (replace with your actual sub-models):
            out = Dense(224, activation='relu', name="Dense_224_" + str(i))(out)
            out = Dense(112, activation='relu', name="Dense_112_" + str(i))(out)
            out = Dense(56, activation='relu', name="Dense_56_" + str(i))(out)
            branch_outputs.append(out)
    
        # Concatenating together the per-channel results:
        out = Concatenate()(branch_outputs)
    
    
        dense1 = Dense(224, activation='relu')(out)
        drop1 = Dropout(0.5)(dense1)
        dense2 = Dense(112, activation='relu')(drop1)
        drop2 = Dropout(0.5)(dense2)
        dense3 = Dense(32, activation='relu')(drop2)
        densef = Dense(1, activation='sigmoid')(dense3)
    
        model = Model(inputs = inputs, outputs = densef)
    
        return model
    
    Net = tst_1()
    Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])
    
    Net.summary()
    

    correctly creates the network you want.
