Question
I am trying to concatenate 4 different layers into one layer to input into the next part of my model. I am using the Keras functional API and the code is shown below.
# Concat left side 4 inputs and right side 4 inputs
print(lc,l1_conv_net,l2_conv_net,l3_conv_net)
left_combined = merge.Concatenate()([lc, l1_conv_net, l2_conv_net, l3_conv_net])
This error occurs, which says that my input shapes are not the same. However, I also printed the input shapes, and they seem to be the same except along the concat axis (which is shape[1], since shape[0]=? is the number of examples in the batch).
Tensor("input_1:0", shape=(?, 6), dtype=float32) Tensor("add_3/add_1:0", shape=(?, 100), dtype=float32) Tensor("add_6/add_1:0", shape=(?, 100), dtype=float32) Tensor("add_9/add_1:0", shape=(?, 100), dtype=float32)
ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 6), (None, 7, 62), (None, 23, 62), (None, 2, 62)]
Coincidentally, the shapes (None, 7, 62), (None, 23, 62), (None, 2, 62) are the input tensor shapes for another custom Keras layer, which produces l1_conv_net as shown below:
l1_conv_net = build_graph_conv_net_fp_only([l1x, l1y, l1z],
                                           conv_layer_sizes=self.conv_width,
                                           fp_layer_size=self.fp_length,
                                           conv_activation='relu',
                                           fp_activation='softmax')
So the print statement says that the shapes are (?, 6), (?, 100), (?, 100), (?, 100), but the Keras merge function reads them as [(None, 6), (None, 7, 62), (None, 23, 62), (None, 2, 62)]? Why is this so?
Thank you!
Answer 1:
So... if the message says these are the shapes you're using, then you can't concatenate them:
[(None, 6), (None, 7, 62), (None, 23, 62), (None, 2, 62)]
You can try to concatenate the last three:
left_combined = keras.layers.Concatenate(axis=1)([l1_conv_net, l2_conv_net, l3_conv_net])
Don't print the tensors; print K.int_shape(tensor) to see the actual shapes. (By the way, something is really going wrong with what you posted, because the shapes of the tensors are too weird. The Keras shapes would make sense if you were using 1D convolutions or RNNs.)
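For example, a minimal sketch of that check, assuming the TensorFlow backend; the Dense layer here is just a hypothetical stand-in for the conv nets in the question:

from keras import backend as K
from keras.layers import Input, Dense

x = Input(shape=(6,))          # stand-in for lc
h = Dense(100)(x)              # stand-in for one of the conv nets

print(h)               # backend tensor, e.g. Tensor("dense_1/...", shape=(?, 100), dtype=float32)
print(K.int_shape(h))  # the static shape Keras tracks, e.g. (None, 100)

Comparing the two outputs for each of your four tensors should show which one actually has the unexpected 3D shape.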
If your backend is not TensorFlow, you may have wrong output_shape parameters in custom or Lambda layers somewhere.
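A sketch of what passing output_shape explicitly looks like; the K.sum op and the shapes are illustrative assumptions, not taken from the question:

from keras import backend as K
from keras.layers import Input, Lambda

x = Input(shape=(7, 62))

# Without output_shape, a non-TensorFlow backend may not infer the result
# shape of an arbitrary backend op, and a wrong guess propagates to
# downstream layers such as Concatenate.
summed = Lambda(lambda t: K.sum(t, axis=1),
                output_shape=(62,))(x)   # per-sample output shape, batch axis excluded

print(K.int_shape(summed))  # (None, 62)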
Answer 2:
Keras Concatenate has some restrictions. The number of dimensions has to be the same, which is why your first tensor fails. You can fix that quite quickly by reshaping it to (None, 1, 6), though note that the last axis (6 vs. 62) still has to match the other inputs for a concatenation along axis 1 to go through. If you are merging along the first axis, all None dimensions have to be the same in the calculations. Looking at the source code, it appears that having an axis as None is not a problem in itself.
So reshape the first tensor, and check whether the None axis will always be the same for all inputs.
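For illustration, a small sketch using a Reshape layer to add the missing axis; the shapes are toy values chosen so the last axis matches, which is not the case for the (None, 6) and (None, ..., 62) tensors in the question:

from keras import backend as K
from keras.layers import Input, Reshape, Concatenate

a = Input(shape=(6,))        # 2D input, like lc in the question
b = Input(shape=(3, 6))      # hypothetical 3D tensor with a matching last axis

a3d = Reshape((1, 6))(a)     # (None, 6) -> (None, 1, 6): same data, one extra axis

merged = Concatenate(axis=1)([a3d, b])
print(K.int_shape(merged))   # (None, 4, 6)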
Answer 3:
I'm not a big expert, but in my case defining the input as Input(shape=(1, 1)) instead of Input(shape=(1,)) added the required dimension and the merge was accepted... just try adding a dimension of length 1.
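A small sketch of that idea; the second input and all shapes are hypothetical:

from keras import backend as K
from keras.layers import Input, Concatenate

# Declare the extra length-1 axis directly on the Input
a = Input(shape=(1, 1))      # instead of Input(shape=(1,))
b = Input(shape=(5, 1))      # hypothetical second branch with the same last axis

merged = Concatenate(axis=1)([a, b])
print(K.int_shape(merged))   # (None, 6, 1)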
Source: https://stackoverflow.com/questions/51611946/keras-merge-concatenate-failed-because-of-different-input-shape-even-though-inpu