ValueError: Data cardinality is ambiguous

Submitted by 跟風遠走 on 2021-01-27 13:09:35

Question


I'm trying to train LSTM network on data taken from a DataFrame.

Here's the code:

x_lstm = x.to_numpy().reshape(1, x.shape[0], x.shape[1])

model = keras.models.Sequential([
    keras.layers.LSTM(x.shape[1], return_sequences=True, input_shape=(x_lstm.shape[1],x_lstm.shape[2])),
    keras.layers.LSTM(NORMAL_LAYER_SIZE, return_sequences=True),
    keras.layers.LSTM(NORMAL_LAYER_SIZE),
    keras.layers.Dense(y.shape[1])
])

optimizer=keras.optimizers.Adadelta()

model.compile(loss="mse", optimizer=optimizer)
for i in range(150):
    history = model.fit(x_lstm, y)
    save_model(model,'tmp.rnn')

This fails with

ValueError: Data cardinality is ambiguous:
  x sizes: 1
  y sizes: 99
Please provide data which shares the same first dimension.

When I change model to

model = keras.models.Sequential([
    keras.layers.LSTM(x.shape[1], return_sequences=True, input_shape=x_lstm.shape),
    keras.layers.LSTM(NORMAL_LAYER_SIZE, return_sequences=True),
    keras.layers.LSTM(NORMAL_LAYER_SIZE),
    keras.layers.Dense(y.shape[1])
])

it fails with the following error:

Input 0 of layer lstm_9 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 1, 99, 1200]

How do I get this to work?

x has shape (99, 1200) (99 items with 1200 features each; this is just a sample of a larger dataset), and y has shape (99, 1).


Answer 1:


As the error suggests, the first dimensions of X and y differ. The first dimension is the batch size, and it must be the same for both.

Please ensure that y also has a shape of the form (1, something).
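
For example, a quick shape check before calling fit makes the mismatch explicit (a minimal sketch using placeholder arrays with the shapes from your question; x_lstm and y here are just stand-ins):

import numpy as np

# Placeholder arrays shaped like the data in the question.
x_lstm = np.zeros((1, 99, 1200))   # one sequence: batch=1, 99 timesteps, 1200 features
y = np.zeros((99, 1))              # 99 targets: batch=99

# Keras compares the first (batch) dimension of every input and target;
# when they differ, it raises "Data cardinality is ambiguous".
assert x_lstm.shape[0] == y.shape[0], (
    f"Batch sizes differ: x={x_lstm.shape[0]}, y={y.shape[0]}"
)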

I could reproduce your error with the code shown below:

from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM
import tensorflow as tf
import numpy as np


# define sequences
sequences = [
    [1, 2, 3, 4],
    [1, 2, 3],
    [1]
]

# pad sequence
padded = pad_sequences(sequences)
X = np.expand_dims(padded, axis=0)
print(X.shape) # (1, 3, 4)

y = np.array([1,0,1])
#y = y.reshape(1,-1)
print(y.shape) # (3,)

model = Sequential()
model.add(LSTM(4, return_sequences=False, input_shape=(None, X.shape[2])))
model.add(Dense(1, activation='sigmoid'))

model.compile(
    loss='mean_squared_error',
    optimizer=tf.keras.optimizers.Adam(0.001))

model.fit(x = X, y = y)

If we observe the print statements:

Shape of X is  (1, 3, 4)
Shape of y is (3,)

This error can be fixed by uncommenting the line y = y.reshape(1,-1), which makes the first dimension (batch size) equal to 1 for both X and y.

Now, the working code is shown below, along with the output:

from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM
import tensorflow as tf
import numpy as np


# define sequences
sequences = [
    [1, 2, 3, 4],
    [1, 2, 3],
    [1]
]

# pad sequence
padded = pad_sequences(sequences)
X = np.expand_dims(padded, axis=0)
print('Shape of X is ', X.shape) # (1, 3, 4)

y = np.array([1,0,1])
y = y.reshape(1,-1)
print('Shape of y is', y.shape) # (1, 3)

model = Sequential()
model.add(LSTM(4, return_sequences=False, input_shape=(None, X.shape[2])))
model.add(Dense(1, activation='sigmoid'))

model.compile(
    loss='mean_squared_error',
    optimizer=tf.keras.optimizers.Adam(0.001))

model.fit(x = X, y = y)

The output of the above code is:

Shape of X is  (1, 3, 4)
Shape of y is (1, 3)
1/1 [==============================] - 0s 1ms/step - loss: 0.2588
<tensorflow.python.keras.callbacks.History at 0x7f5b0d78f4a8>

Hope this helps. Happy Learning!
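
Applied to the shapes in your question (a sketch with placeholder arrays; x stands in for the (99, 1200) feature matrix and y for the (99, 1) targets described above), giving y the same leading batch dimension of 1 as x_lstm resolves the cardinality error:

import numpy as np

# Placeholder arrays with the shapes from the question.
x = np.random.rand(99, 1200)   # 99 rows, 1200 features
y = np.random.rand(99, 1)      # 99 targets

# One sequence of 99 timesteps, as in the question ...
x_lstm = x.reshape(1, x.shape[0], x.shape[1])   # (1, 99, 1200)
# ... so the targets need the same leading batch dimension of 1.
y_lstm = y.reshape(1, -1)                        # (1, 99)

print(x_lstm.shape[0] == y_lstm.shape[0])        # True -> cardinality matches

Note that the model's final Dense layer would then also need to produce 99 outputs for its predictions to line up element-wise with the (1, 99) target.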



Source: https://stackoverflow.com/questions/62253289/valueerror-data-cardinality-is-ambiguous
