How to use a multilayered bidirectional LSTM in TensorFlow?

囚心锁ツ 2021-02-06 05:02

I want to know how to use a multilayered bidirectional LSTM in TensorFlow.

I have already implemented a single-layer bidirectional LSTM, but I want to compare that model with a multi-layered one.

4 Answers
  •  梦谈多话
    2021-02-06 05:37

    This is essentially the same as the first answer, but with a slight variation in how the variable scopes are named and with dropout wrappers added. It also avoids the variable-scope error that the first answer's code raises. (Note that this uses the TF 1.x tf.contrib API.)

    import tensorflow as tf

    def bidirectional_lstm(input_data, num_layers, rnn_size, keep_prob):

        output = input_data
        for layer in range(num_layers):
            with tf.variable_scope('encoder_{}'.format(layer), reuse=tf.AUTO_REUSE):

                # By giving a different variable scope to each layer, I've ensured that
                # the weights are not shared among the layers. If you want to share the
                # weights, you can do that by giving the variable_scope as "encoder", but
                # first make sure that reuse is set to tf.AUTO_REUSE.

                cell_fw = tf.contrib.rnn.LSTMCell(
                    rnn_size, initializer=tf.truncated_normal_initializer(-0.1, 0.1, seed=2))
                cell_fw = tf.contrib.rnn.DropoutWrapper(cell_fw, input_keep_prob=keep_prob)

                cell_bw = tf.contrib.rnn.LSTMCell(
                    rnn_size, initializer=tf.truncated_normal_initializer(-0.1, 0.1, seed=2))
                cell_bw = tf.contrib.rnn.DropoutWrapper(cell_bw, input_keep_prob=keep_prob)

                outputs, states = tf.nn.bidirectional_dynamic_rnn(cell_fw,
                                                                  cell_bw,
                                                                  output,
                                                                  dtype=tf.float32)

                # Concatenate the forward and backward outputs along the feature
                # axis, so each layer's output has width 2 * rnn_size.
                output = tf.concat(outputs, 2)

        return output
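
    For reference, a minimal usage sketch under TF 1.x, assuming the function above is in scope. The dimensions and hyperparameter values below (sequence length, embedding width, rnn_size) are illustrative assumptions, not part of the original answer.

    max_time, embedding_dim = 50, 128  # illustrative values, not from the answer
    inputs = tf.placeholder(tf.float32, [None, max_time, embedding_dim])
    keep_prob = tf.placeholder_with_default(1.0, shape=())

    encoded = bidirectional_lstm(inputs, num_layers=2, rnn_size=256,
                                 keep_prob=keep_prob)
    # encoded has shape [batch_size, max_time, 2 * rnn_size], because each
    # layer concatenates the forward and backward outputs on the last axis.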
    
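    And a hedged sketch of the weight-sharing variant mentioned in the comments above (my sketch, not part of the original answer): give every layer the single scope name "encoder" with reuse=tf.AUTO_REUSE. One caveat to verify: since the LSTM kernel's shape depends on the input width, variables can only be reused across layers whose inputs have the same width, so the first layer's input must already be 2 * rnn_size wide.

    def shared_bidirectional_lstm(input_data, num_layers, rnn_size, keep_prob):
        # All layers use the same scope, so the LSTM variables created by the
        # first layer are reused by every later layer (weight sharing).
        output = input_data
        for _ in range(num_layers):
            with tf.variable_scope('encoder', reuse=tf.AUTO_REUSE):
                cell_fw = tf.contrib.rnn.LSTMCell(rnn_size)
                cell_fw = tf.contrib.rnn.DropoutWrapper(cell_fw, input_keep_prob=keep_prob)
                cell_bw = tf.contrib.rnn.LSTMCell(rnn_size)
                cell_bw = tf.contrib.rnn.DropoutWrapper(cell_bw, input_keep_prob=keep_prob)
                outputs, _states = tf.nn.bidirectional_dynamic_rnn(cell_fw, cell_bw,
                                                                   output, dtype=tf.float32)
                output = tf.concat(outputs, 2)
        return output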
