LSTM Batches vs Timesteps

Submitted by 限于喜欢 on 2021-02-08 05:41:06

Question


I've followed the TensorFlow RNN tutorial to create an LSTM model. However, in the process, I've grown confused about the difference, if any, between 'batches' and 'timesteps', and I'd appreciate help clarifying this matter.

The tutorial code (see below) essentially creates 'batches' based on a designated number of steps:

    outputs = []
    with tf.variable_scope("RNN"):
        for time_step in range(num_steps):
            if time_step > 0: tf.get_variable_scope().reuse_variables()
            (cell_output, state) = cell(inputs[:, time_step, :], state)
            outputs.append(cell_output)

However, the following appears to do the same:

    for epoch in range(5):
        print('----- Epoch', epoch, '-----')
        total_loss = 0
        for i in range(inputs_cnt // BATCH_SIZE):
            inputs_batch = train_inputs[i * BATCH_SIZE: (i + 1) * BATCH_SIZE]
            orders_batch = train_orders[i * BATCH_SIZE: (i + 1) * BATCH_SIZE]
            feed_dict = {story: inputs_batch, order: orders_batch}

            logits, xent, loss = sess.run([...], feed_dict=feed_dict)

Answer 1:


Assuming you are working with text, BATCH_SIZE would be the number of sentences that you are processing in parallel and num_steps would be the maximum number of words in any sentence. These are different dimensions of your input to the LSTM.
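To make the two dimensions concrete, here is a minimal sketch using NumPy in place of TensorFlow tensors. The sizes (BATCH_SIZE, NUM_STEPS, EMBED_DIM) are illustrative assumptions, not values from the tutorial; the point is that the tutorial's `inputs[:, time_step, :]` slice walks the *time* axis, while the batch axis stays intact:

```python
import numpy as np

# Illustrative sizes (assumptions, not from the tutorial):
BATCH_SIZE = 4   # sentences processed in parallel
NUM_STEPS = 7    # timesteps: max number of words in any sentence
EMBED_DIM = 3    # features per word (e.g. embedding size)

# One batch of LSTM input has shape [batch, time, features].
inputs = np.zeros((BATCH_SIZE, NUM_STEPS, EMBED_DIM))

# The tutorial's unrolling loop iterates over the time axis.
# Each slice contains the t-th word of every sentence in the
# batch, processed by the cell simultaneously.
for t in range(NUM_STEPS):
    step_input = inputs[:, t, :]
    assert step_input.shape == (BATCH_SIZE, EMBED_DIM)
```

By contrast, the second snippet in the question slices along the *batch* axis: it picks which BATCH_SIZE examples to feed per training step, leaving the time dimension inside each example untouched.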



Source: https://stackoverflow.com/questions/42010966/lstm-batches-vs-timesteps
