Question
I am having an issue declaring my model. My inputs are x_input and y_input, and my outputs are predictions. As follows:
model = Model(inputs = [x_input, y_input], outputs = predictions )
My inputs (x, y) are both embedded, then matrix-multiplied together with a batch dot. As follows:
# Build X Branch
x_input = Input(shape = (maxlen_x,), dtype = 'int32' )
x_embed = Embedding( maxvocab_x + 1, 16, input_length = maxlen_x )
XE = x_embed(x_input)
# Result: Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
# Where 31 happens to be my maxlen_x
Similarly for the y branch...
# Build Y Branch
y_input = Input(shape = (maxlen_y,), dtype = 'int32' )
y_embed = Embedding( maxvocab_y + 1, 16, input_length = maxlen_y )
YE = y_embed(y_input)
# Result: Tensor("embedding_1/Gather:0", shape=(?, 13, 16), dtype=float32)
# Where 13 happens to be my maxlen_y
I then do a batch dot between the two. (Simply dotting the data from each instance)
from keras import backend as K
dot_merged = K.batch_dot(XE, YE, axes=[2, 2])  # Contract axis 2 (the embedding dimension) of both inputs
# Result: Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)
I then flattened the last two dimensions of the tensor.
import numpy as np
dim = np.prod(list(dot_merged.shape)[1:])
flattened = K.reshape(dot_merged, (-1, int(dim)))
Ultimately, I fed this flattened data into a simple logistic regressor.
predictions = Dense(1,activation='sigmoid')(flattened)
And, my predictions are, of course, my output for the model.
I will list the output of each layer by the output shape of the tensor.
Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
Tensor("embedding_2/Gather:0", shape=(?, 13, 16), dtype=float32)
Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)
Tensor("Reshape:0", shape=(?, 403), dtype=float32)
Tensor("dense_1/Sigmoid:0", shape=(?, 1), dtype=float32)
I get the following error, specifically.
Traceback (most recent call last):
File "Model.py", line 53, in <module>
model = Model(inputs = [dx_input, rx_input], outputs = [predictions] )
File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 88, in wrapper
return func(*args, **kwargs)
File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1705, in __init__
build_map_of_graph(x, finished_nodes, nodes_in_progress)
File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1695, in build_map_of_graph
layer, node_index, tensor_index)
File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1665, in build_map_of_graph
layer, node_index, tensor_index = tensor._keras_history
AttributeError: 'Tensor' object has no attribute '_keras_history'
Voilà. Where did I go wrong? Thanks for any help ahead of time!
-Anthony
Answer 1:
Did you try wrapping the backend functions in a Lambda layer?
I think a Keras layer's __call__() method performs some bookkeeping (such as attaching the _keras_history metadata your traceback complains about) that is required for a Keras Model to be built properly. That bookkeeping is skipped when you call the backend functions directly, so the resulting tensors are invisible to the graph-building step in Model.__init__.
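A minimal sketch of that fix, assuming your setup from the question (the maxlen/maxvocab values are placeholders; this uses the tf.keras API, so the imports differ slightly from standalone Keras):

```python
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Dense, Embedding, Input, Lambda
from tensorflow.keras.models import Model

# Placeholder dimensions matching the shapes in the question.
maxlen_x, maxlen_y = 31, 13
maxvocab_x, maxvocab_y = 100, 100

x_input = Input(shape=(maxlen_x,), dtype='int32')
y_input = Input(shape=(maxlen_y,), dtype='int32')
XE = Embedding(maxvocab_x + 1, 16)(x_input)  # (?, 31, 16)
YE = Embedding(maxvocab_y + 1, 16)(y_input)  # (?, 13, 16)

# Wrap each backend call in a Lambda layer so Keras records it as a
# graph node (with _keras_history) instead of a raw backend tensor.
dot_merged = Lambda(
    lambda t: K.batch_dot(t[0], t[1], axes=[2, 2]))([XE, YE])  # (?, 31, 13)
flattened = Lambda(
    lambda t: K.reshape(t, (-1, maxlen_x * maxlen_y)))(dot_merged)  # (?, 403)

predictions = Dense(1, activation='sigmoid')(flattened)
model = Model(inputs=[x_input, y_input], outputs=predictions)
```

Alternatively, the built-in Dot(axes=2) and Flatten() layers express the same two operations without any backend calls at all.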
Source: https://stackoverflow.com/questions/45309236/keras-backend-modeling-issue