Keras GRUCell missing 1 required positional argument: 'states'

Submitted by 女生的网名这么多〃 on 2019-12-11 10:48:56

Question


I am trying to build a 3-layer RNN with Keras. Part of the code is shown here:

    model = Sequential()
    model.add(Embedding(input_dim = 91, output_dim = 128, input_length =max_length))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(TimeDistributed(Dense(target.shape[2])))

Then I met this error:

call() missing 1 required positional argument: 'states'

The error details are as follows:

~/anaconda3/envs/hw3/lib/python3.5/site-packages/keras/models.py in add(self, layer)
487                           output_shapes=[self.outputs[0]._keras_shape])
488         else:
--> 489             output_tensor = layer(self.outputs[0])
490             if isinstance(output_tensor, list):
491                 raise TypeError('All layers in a Sequential model '

 ~/anaconda3/envs/hw3/lib/python3.5/site-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
601 
602             # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 603             output = self.call(inputs, **kwargs)
604             output_mask = self.compute_mask(inputs, previous_mask)
605 

Answer 1:


  1. Don't use the cell classes (i.e. GRUCell or LSTMCell) directly in a Keras model. They are computation cells that are wrapped by the corresponding layers. Use the layer classes (i.e. GRU or LSTM) instead:

    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    

    The LSTM and GRU layers use their corresponding cells to perform the computation over all the timesteps. Read this SO answer to learn more about the difference.
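If you do want to use a cell class directly, it has to be wrapped in the generic `keras.layers.RNN` layer, which drives the cell across the time dimension. A minimal sketch (the input and unit sizes here are hypothetical):

```python
import numpy as np
from tensorflow.keras.layers import RNN, GRUCell

# A cell defines the computation for a single timestep; the RNN layer
# loops it over the time axis. RNN(GRUCell(32)) behaves like GRU(32).
layer = RNN(GRUCell(32))

# Batch of 2 sequences, 5 timesteps, 8 features (arbitrary sizes).
out = layer(np.zeros((2, 5, 8), dtype="float32"))
print(out.shape)  # one output vector per sequence
```

This is why calling `GRUCell` as if it were a layer fails: its `call()` expects the previous `states` as an argument, which the `RNN` wrapper normally supplies.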

  2. When you stack multiple RNN layers on top of each other, you need to set their return_sequences argument to True so that each layer produces an output for every timestep, which is then consumed by the next RNN layer. Note that you may or may not do this on the last RNN layer (it depends on your architecture and the problem you are trying to solve):

    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias, return_sequences=True))
    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias, return_sequences=True))
    model.add(GRU(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
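Putting both fixes together, a complete, runnable sketch of the corrected model might look like this. The concrete values (vocabulary size 91, 128-dim embeddings, `max_length`, `neurons`, dropout rate, and the number of targets) are placeholders standing in for the asker's variables, and the last GRU keeps `return_sequences=True` because the model ends in a `TimeDistributed(Dense)` that needs a per-timestep output:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, GRU, TimeDistributed, Dense

max_length = 20   # placeholder for the asker's max_length
neurons = 64      # placeholder for self.neurons
n_targets = 10    # placeholder for target.shape[2]

model = Sequential()
model.add(Input(shape=(max_length,)))
model.add(Embedding(input_dim=91, output_dim=128))
# All stacked GRU layers return the full sequence so the next layer
# receives one output per timestep.
model.add(GRU(units=neurons, dropout=0.2, return_sequences=True))
model.add(GRU(units=neurons, dropout=0.2, return_sequences=True))
# TimeDistributed(Dense) also needs per-timestep output, so the last
# GRU returns sequences here as well.
model.add(GRU(units=neurons, dropout=0.2, return_sequences=True))
model.add(TimeDistributed(Dense(n_targets)))

model.summary()
```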
    


Source: https://stackoverflow.com/questions/51254706/keras-grucell-missing-1-required-positional-argument-states
