gated-recurrent-unit

Tensorflow RNN input size

大兔子大兔子 submitted on 2019-12-06 06:25:39
I am trying to use TensorFlow to create a recurrent neural network. My code is something like this:

```python
import tensorflow as tf

rnn_cell = tf.nn.rnn_cell.GRUCell(3)
inputs = [tf.constant([[0, 1]], dtype=tf.float32),
          tf.constant([[2, 3]], dtype=tf.float32)]
outputs, end = tf.nn.rnn(rnn_cell, inputs, dtype=tf.float32)
```

Now, everything runs just fine. However, I am rather confused by what is actually going on. The output dimensions are always batch size x the size of the RNN cell's hidden state; how can they be completely independent of the input size? If my understanding is correct, the inputs…
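The reason the output size depends only on the cell's `num_units` is that every input projection inside the GRU maps to the hidden size, and the emitted output *is* the hidden state. A minimal NumPy sketch of one GRU step (weights random here, purely for shape illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wc, Uc):
    """One GRU step: x is (batch, input_size), h is (batch, num_units)."""
    z = sigmoid(x @ Wz + h @ Uz)        # update gate,     (batch, num_units)
    r = sigmoid(x @ Wr + h @ Ur)        # reset gate,      (batch, num_units)
    c = np.tanh(x @ Wc + (r * h) @ Uc)  # candidate state, (batch, num_units)
    return (1 - z) * h + z * c          # new hidden state == the output

rng = np.random.default_rng(0)
input_size, num_units = 2, 3            # matches GRUCell(3) fed 2-dim inputs
W = [rng.normal(size=(input_size, num_units)) for _ in range(3)]
U = [rng.normal(size=(num_units, num_units)) for _ in range(3)]

x = np.array([[0.0, 1.0]])              # (batch=1, input_size=2)
h = np.zeros((1, num_units))            # initial state
h = gru_step(x, h, W[0], U[0], W[1], U[1], W[2], U[2])
print(h.shape)                          # (1, 3): batch x num_units, not input_size
```

The input size only affects the shapes of the input-to-hidden matrices (`Wz`, `Wr`, `Wc`); it never appears in the output.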

Mixing feed forward layers and recurrent layers in Tensorflow?

笑着哭i submitted on 2019-12-05 03:32:24
Has anyone been able to mix feedforward layers and recurrent layers in TensorFlow? For example: input -> conv -> GRU -> linear -> output. I can imagine one could define their own cell with feedforward layers and no state, which could then be stacked using the MultiRNNCell function, something like:

```python
cell = tf.nn.rnn_cell.MultiRNNCell([conv_cell, GRU_cell, linear_cell])
```

This would make life a whole lot easier... Can't you just do the following:

```python
rnnouts, _ = rnn(grucell, inputs)
linearout = [tf.matmul(rnnout, weights) + bias for rnnout in rnnouts]
```

etc. This tutorial gives an example of how to use convolutional…
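The per-timestep linear layer from the list comprehension above can be sketched shape-wise in NumPy; the names `rnnouts`, `weights`, and `bias` mirror the snippet, and the random values are placeholders standing in for real GRU outputs and learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
num_steps, batch, hidden, out_dim = 5, 4, 8, 2

# stand-in for the list of per-step GRU outputs returned by rnn(...)
rnnouts = [rng.normal(size=(batch, hidden)) for _ in range(num_steps)]

weights = rng.normal(size=(hidden, out_dim))
bias = np.zeros(out_dim)

# apply the same linear layer independently at every time step
linearout = [rnnout @ weights + bias for rnnout in rnnouts]
print(len(linearout), linearout[0].shape)   # 5 (4, 2)
```

Because the linear layer carries no state, applying it step-by-step like this is equivalent to stacking it "after" the recurrent layer, which is why no custom stateless cell is strictly needed for the output side.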

stock prediction : GRU model predicting same given values instead of future stock price

£可爱£侵袭症+ submitted on 2019-12-02 01:16:46
I was just testing this model from a Kaggle post. The model is supposed to predict 1 day ahead from a given set of past stock prices. After tweaking a few parameters I got a surprisingly good result, as you can see; the mean squared error was 5.193. So overall it looks good at predicting future stocks, right? Well, it turned out to be horrible when I took a closer look at the results. As you can see, the model is predicting the last value of the given stocks, which is our current last price. So I shifted the predictions one step back... and now you can clearly see that the model is predicting one step backward, or the last…
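The failure mode described here, where a model that looks accurate is really just echoing the last observed price, can be reproduced with a simple persistence baseline on synthetic data (the random-walk series below is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic "stock price": a random walk with small daily moves
prices = 100 + np.cumsum(rng.normal(scale=0.5, size=200))

# persistence baseline: predict that tomorrow's price equals today's
persistence_pred = prices[:-1]
actual_next = prices[1:]
mse = np.mean((actual_next - persistence_pred) ** 2)

# compare against a predict-the-mean baseline
mean_mse = np.mean((actual_next - prices.mean()) ** 2)
print(mse, mean_mse)   # persistence looks far better, yet learned nothing
```

Because consecutive prices barely move, copying the last input scores a tiny MSE, which is why a low error alone never proves a stock model has forecasting skill; comparing against the persistence baseline (or plotting predictions shifted one step back, as the poster did) exposes it.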

ValueError: The two structures don't have the same number of elements

会有一股神秘感。 submitted on 2019-11-30 18:39:13
```python
with tf.variable_scope('forward'):
    cell_img_fwd = tf.nn.rnn_cell.GRUCell(hidden_state_size, hidden_state_size)
    img_init_state_fwd = rnn_img_mapped[:, 0, :]
    img_init_state_fwd = tf.multiply(
        img_init_state_fwd,
        tf.zeros([batch_size, hidden_state_size]))
    rnn_outputs2, final_state2 = tf.nn.dynamic_rnn(
        cell_img_fwd,
        rnn_img_mapped,
        initial_state=img_init_state_fwd,
        dtype=tf.float32)
```

This is my code for a GRU with input of dimension 100x196x50; it should be unrolled along the second dimension (that is, 196). hidden_state_size is 50, batch_size is 100. However, I get the following error: ValueError:…
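For reference, the shape contract this code relies on can be sketched in plain NumPy: with batch-major input, `dynamic_rnn` takes `(batch, time, features)`, unrolls along dimension 1, and the initial state must be `(batch, num_units)`. This toy unroll, with an identity step function standing in for the GRU, just checks those shapes against the dimensions from the question:

```python
import numpy as np

def unroll(step_fn, inputs, init_state):
    """Unroll step_fn along the time axis (dimension 1) of inputs."""
    batch, time_steps, _ = inputs.shape
    assert init_state.shape[0] == batch, "state must be (batch, num_units)"
    h = init_state
    outputs = []
    for t in range(time_steps):
        h = step_fn(inputs[:, t, :], h)
        outputs.append(h)
    return np.stack(outputs, axis=1), h   # (batch, time, units), (batch, units)

batch_size, seq_len, hidden_state_size = 100, 196, 50
rnn_img_mapped = np.zeros((batch_size, seq_len, hidden_state_size))
img_init_state_fwd = np.zeros((batch_size, hidden_state_size))

# identity "cell": ignores the state update but keeps the shapes honest
outs, final = unroll(lambda x, h: x, rnn_img_mapped, img_init_state_fwd)
print(outs.shape, final.shape)   # (100, 196, 50) (100, 50)
```

If every tensor fed to the real `dynamic_rnn` matches these shapes, the "two structures don't have the same number of elements" error usually points at the cell construction instead; note that the second positional argument to the old `GRUCell` constructor was not a second size parameter, so passing `hidden_state_size` twice is suspect.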
