I am trying to concatenate my input with a constant tensor in the Keras 2 functional API. In my real problem the constants depend on some parameters in the setup, but I think the example below reproduces the error I get.
from keras.layers import *
from keras.models import *
from keras import backend as K
import numpy as np
a = Input(shape=(10, 5))
a1 = Input(tensor=K.variable(np.ones((10, 5))))
x = [a, a1] # x = [a, a] works fine
b = concatenate(x, 1)
x += [b] # This changes b._keras_history[0].input
b = concatenate(x, 1)
model = Model(a, b)
The error I get is:
ValueError Traceback (most recent call last)
~/miniconda3/envs/ds_tools/lib/python3.6/site-packages/keras/engine/topology.py in assert_input_compatibility(self, inputs)
418 try:
--> 419 K.is_keras_tensor(x)
420 except ValueError:
~/miniconda3/envs/ds_tools/lib/python3.6/site-packages/keras/backend/theano_backend.py in is_keras_tensor(x)
198 T.sharedvar.TensorSharedVariable)):
--> 199 raise ValueError('Unexpectedly found an instance of type `' + str(type(x)) + '`. '
200 'Expected a symbolic tensor instance.')
ValueError: Unexpectedly found an instance of type `<class 'theano.gpuarray.type.GpuArraySharedVariable'>`. Expected a symbolic tensor instance.
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
<ipython-input-2-53314338ab8e> in <module>()
5 a1 = Input(tensor=K.variable(np.ones((10, 5))))
6 x = [a, a1]
----> 7 b = concatenate(x, 1)
8 x += [b] # This changes b._keras_history[0].input
9 b = concatenate(x, 1)
~/miniconda3/envs/ds_tools/lib/python3.6/site-packages/keras/layers/merge.py in concatenate(inputs, axis, **kwargs)
506 A tensor, the concatenation of the inputs alongside axis `axis`.
507 """
--> 508 return Concatenate(axis=axis, **kwargs)(inputs)
509
510
~/miniconda3/envs/ds_tools/lib/python3.6/site-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
550 # Raise exceptions in case the input is not compatible
551 # with the input_spec specified in the layer constructor.
--> 552 self.assert_input_compatibility(inputs)
553
554 # Collect input shapes to build layer.
~/miniconda3/envs/ds_tools/lib/python3.6/site-packages/keras/engine/topology.py in assert_input_compatibility(self, inputs)
423 'Received type: ' +
424 str(type(x)) + '. Full input: ' +
--> 425 str(inputs) + '. All inputs to the layer '
426 'should be tensors.')
427
ValueError: Layer concatenate_2 was called with an input that isn't a symbolic tensor. Received type: <class 'theano.gpuarray.type.GpuArraySharedVariable'>. Full input: [concatenate_1/input_3, concatenate_1/variable]. All inputs to the layer should be tensors.
I am running Keras version 2.0.5 with the Theano backend, Theano version 0.10.0dev1. Any ideas on what is going wrong, or a more correct way to accomplish the concatenation?
Dimensions in Keras work like this:
- When you define shapes in layers while building your model, you never include the batch size.
- Internally, when using backend functions, in loss functions, and in any tensor operation, the batch dimension is the first one.
- Keras shows a None to represent the batch size in summaries, error messages, and so on.
That means that:
- a's shape is (None, 10, 5)
- a1's shape is just (10, 5). You cannot concatenate them.
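To see the mismatch directly, here is a minimal sketch (not from the original post) that inspects the static shapes Keras tracks for both tensors; with the versions above it should print something like the commented values:
from keras.layers import Input
from keras import backend as K
import numpy as np
a = Input(shape=(10, 5))
a1 = Input(tensor=K.variable(np.ones((10, 5))))
print(K.int_shape(a))   # (None, 10, 5) -- batch dimension present as None
print(K.int_shape(a1))  # (10, 5) -- no batch dimension, so the shapes don't line up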
There are a few workarounds you can try, such as creating a1 with shape (1, 10, 5) and then repeating its values along the batch dimension:
constant = K.variable(np.ones((1, 10, 5)))
constant = K.repeat_elements(constant, rep=batch_size, axis=0)
I was totally unable to use Input(tensor=...) because the constant's dimensions are fixed while the input's batch dimension is None, so I worked around it with a Lambda layer:
b = Lambda(lambda x: K.concatenate([x, constant], axis=1), output_shape=(20, 5))(a)
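For completeness, here is a minimal end-to-end sketch of this Lambda workaround; the batch_size value is hypothetical and gets baked into the repeated constant, so the resulting model only accepts batches of exactly that size:
import numpy as np
from keras import backend as K
from keras.layers import Input, Lambda
from keras.models import Model

batch_size = 4  # hypothetical: must match the batch size used at run time
constant = K.variable(np.ones((1, 10, 5)))
constant = K.repeat_elements(constant, rep=batch_size, axis=0)  # shape (4, 10, 5)

a = Input(shape=(10, 5))
# Concatenate along axis 1: (4, 10, 5) with (4, 10, 5) -> (4, 20, 5)
b = Lambda(lambda x: K.concatenate([x, constant], axis=1), output_shape=(20, 5))(a)
model = Model(a, b)

out = model.predict(np.zeros((batch_size, 10, 5)), batch_size=batch_size)
print(out.shape)  # (4, 20, 5)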
But I can't understand at all what you want to achieve with x += [b] and the rest.
Source: https://stackoverflow.com/questions/44740744/concatenate-input-with-constant-vector-in-keras