Mixing feedforward layers and recurrent layers in TensorFlow?


Question


Has anyone been able to mix feedforward layers and recurrent layers in Tensorflow?

For example: input->conv->GRU->linear->output

I can imagine one could define a custom cell with feedforward layers and no state, which could then be stacked using the MultiRNNCell function, something like:

cell = tf.nn.rnn_cell.MultiRNNCell([conv_cell, GRU_cell, linear_cell])

This would make life a whole lot easier...


Answer 1:


Can't you just do the following:

# static unroll with the old tf.nn.rnn API; it returns one output per time step
rnnouts, _ = rnn(grucell, inputs)
# apply the same linear layer to every time step's output
linearout = [tf.matmul(rnnout, weights) + bias for rnnout in rnnouts]

etc.
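For a concrete end-to-end version of the asker's input->conv->GRU->linear->output pipeline, here is a minimal sketch using the TF 1.x graph API (all shapes and layer sizes are illustrative, not from the original answer):

import tensorflow as tf

# Illustrative input: batch of 32 sequences, 20 time steps, 64 features each.
inputs = tf.placeholder(tf.float32, [32, 20, 64])

# Feedforward part: a 1-D convolution applied along the time axis.
conv = tf.layers.conv1d(inputs, filters=32, kernel_size=3,
                        padding='same', activation=tf.nn.relu)

# Recurrent part: dynamic_rnn unrolls the GRU over the time axis,
# replacing the static rnn() call above.
gru_cell = tf.nn.rnn_cell.GRUCell(num_units=128)
rnn_outputs, _ = tf.nn.dynamic_rnn(gru_cell, conv, dtype=tf.float32)

# Linear readout applied independently at every time step.
logits = tf.layers.dense(rnn_outputs, units=10)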




Answer 2:


This tutorial gives an example of how to use convolutional layers together with recurrent ones. For example, with the last convolutional layers defined like this:

...
l_conv4_a = conv_pre(l_pool3, 16, (5, 5), scope="l_conv4_a")  # conv_pre/pool/flatten are helpers from the tutorial
l_pool4 = pool(l_conv4_a, scope="l_pool4")
l_flatten = flatten(l_pool4, scope="flatten")

and having defined an RNN cell:

_, shape_state = tf.nn.dynamic_rnn(cell=shape_cell,
    inputs=tf.expand_dims(batch_norm(x_shape_pl), 2), dtype=tf.float32, scope="shape_rnn")

You can concatenate both outputs and use the result as the input to the next layer:

features = tf.concat(values=[x_margin_pl, shape_state, x_texture_pl, l_flatten], axis=1, name="features")
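The concatenated features can then feed any ordinary feedforward layer; for example, a dense layer (layer size illustrative, not from the original answer):

l_dense = tf.layers.dense(features, units=128, activation=tf.nn.relu, name="l_dense")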

Or you can just use the output of the CNN layer as the input to the RNN cell:

_, shape_state = tf.nn.dynamic_rnn(cell=shape_cell,
    inputs=l_flatten, dtype=tf.float32, scope="shape_rnn")
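Note that tf.nn.dynamic_rnn expects a 3-D [batch, time, features] tensor, so if l_flatten is a flat [batch, features] matrix it will need a time axis first. A minimal sketch, under my assumption that the CNN features are treated as a length-1 sequence:

rnn_in = tf.expand_dims(l_flatten, axis=1)  # [batch, 1, features]
_, shape_state = tf.nn.dynamic_rnn(cell=shape_cell, inputs=rnn_in,
                                   dtype=tf.float32, scope="shape_rnn")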



Answer 3:


This is what I have so far; improvements welcome:

import tensorflow as tf
from tensorflow.python.ops import rnn_cell_impl


class LayerCell(rnn_cell_impl.RNNCell):

    def __init__(self, tf_layer, **kwargs):
        ''' :param tf_layer: a tensorflow layer class, e.g. tf.layers.Conv2D or
            tf.keras.layers.Conv2D. NOT the functional tf.layers.conv2d!
            All other layer parameters can be passed as keyword arguments,
            e.g. filters=8.'''
        self.layer_fn = tf_layer(**kwargs)

    def __call__(self, inputs, state, scope=None):
        ''' Every `RNNCell` must implement `call` with the signature
            `(output, next_state) = call(input, state)`. The optional third
            argument, `scope`, is allowed for backwards compatibility and
            should be left off for new subclasses.'''
        # Stateless wrapper: apply the layer and pass the state through untouched.
        return (self.layer_fn(inputs), state)

    def __str__(self):
        return "Cell wrapper of " + str(self.layer_fn)

    def __getattr__(self, attr):
        '''Delegate unknown attributes to the wrapped layer; credits to
        https://stackoverflow.com/questions/1382871/dynamically-attaching-a-method-to-an-existing-python-object-generated-with-swig/1383646#1383646'''
        return getattr(self.layer_fn, attr)

    @property
    def state_size(self):
        """size(s) of state(s) used by this cell.

        It can be represented by an Integer, a TensorShape or a tuple of
        Integers or TensorShapes.
        """
        # The wrapped layer is stateless, so report a zero-size state.
        return (0,)

    @property
    def output_size(self):
        """Integer or TensorShape: size of outputs produced by this cell."""
        # Use with caution: output_shape is only defined once the layer is built.
        return self.layer_fn.output_shape

(Naturally, don't use this wrapper with recurrent layers, because their state-keeping would be destroyed.)

Seems to work with: tf.layers.Conv2D, tf.keras.layers.Conv2D, tf.keras.layers.Activation, tf.layers.BatchNormalization

Does NOT work with: tf.keras.layers.BatchNormalization. At least, it failed for me when used inside a tf.while_loop, complaining about combining variables from different frames, similar to here. Maybe Keras uses tf.Variable() instead of tf.get_variable()...?


Usage:

import numpy as np

cell0 = tf.contrib.rnn.ConvLSTMCell(conv_ndims=2, input_shape=[40, 40, 3],
                                    output_channels=16, kernel_shape=[5, 5])
cell1 = LayerCell(tf.keras.layers.Conv2D, filters=8, kernel_size=[5, 5],
                  strides=(1, 1), padding='same')
cell2 = LayerCell(tf.layers.BatchNormalization, axis=-1)

inputs = np.random.rand(10, 40, 40, 3).astype(np.float32)
multicell = tf.contrib.rnn.MultiRNNCell([cell0, cell1, cell2])
state = multicell.zero_state(batch_size=10, dtype=tf.float32)

# The cell call returns an (output, next_state) pair like any RNNCell.
output, next_state = multicell(inputs, state)
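A minimal sketch of actually evaluating the stack in a session (the expected shape is my inference from the 8 filters in cell1):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(output).shape)  # expect (10, 40, 40, 8)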


Source: https://stackoverflow.com/questions/36430601/mixing-feed-forward-layers-and-recurrent-layers-in-tensorflow
