multi-layer

How to use multilayered bidirectional LSTM in Tensorflow?

Submitted by 放肆的年华 on 2019-12-21 01:09:22

Question: I want to know how to use a multilayered bidirectional LSTM in TensorFlow. I have already implemented a bidirectional LSTM, but I want to compare that model with a multi-layer version. What code should I add to this part?

    x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))
    #print(x[0].get_shape())
    # Define lstm cells with tensorflow
    # Forward direction cell
    lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
    # Backward direction cell
    lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
    ...
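Since the exact TensorFlow API for stacking varies by version, here is a version-independent sketch, in plain Python, of the wiring a multilayer bidirectional RNN needs: each layer runs a forward pass and a backward pass over the sequence, concatenates the two outputs at every time step, and feeds the (twice as wide) result to the next layer. `toy_cell` is an illustrative stand-in for an LSTM cell, not a real one.

```python
def toy_cell(state, x):
    # Stand-in recurrent cell: the new state is a running element-wise sum.
    new_state = [s + v for s, v in zip(state, x)]
    return new_state, list(new_state)  # (next state, output)

def run_direction(seq, reverse=False):
    state = [0.0] * len(seq[0])
    outputs = []
    for x in (reversed(seq) if reverse else seq):
        state, out = toy_cell(state, x)
        outputs.append(out)
    if reverse:
        outputs.reverse()  # re-align with forward time order
    return outputs

def bidirectional_layer(seq):
    fw = run_direction(seq)
    bw = run_direction(seq, reverse=True)
    # Concatenate forward and backward outputs at each time step,
    # so the next layer sees inputs that are twice as wide.
    return [f + b for f, b in zip(fw, bw)]

def stacked_bidirectional(seq, num_layers):
    for _ in range(num_layers):
        seq = bidirectional_layer(seq)
    return seq

seq = [[1.0], [2.0], [3.0]]          # 3 time steps, 1 feature each
out = stacked_bidirectional(seq, num_layers=2)
print(out)                           # 3 time steps, width 2**2 = 4
```

This is the same pattern that TF 1.x's contrib helpers (e.g. stack_bidirectional_rnn) apply with real LSTM cells: the key point is that each deeper layer must accept an input that is twice the hidden size, because of the per-step concatenation.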

MLP in tensorflow for regression… not converging

Submitted by 落花浮王杯 on 2019-12-10 11:37:47

Question: Hello, this is my first time working with TensorFlow. I am trying to adapt the example here (TensorFlow-Examples) to use this code for regression on the Boston housing dataset. Basically, I only changed the cost function, the dataset, the number of inputs, and the number of targets, but when I run it the MLP doesn't converge (I use a very low learning rate). I tested it with the Adam optimizer and with gradient descent optimization, but I get the same behavior. I would appreciate your suggestions and ideas! Observation: when I ran this program without the modifications described above, the cost function value always decreased.
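A frequent cause of exactly this symptom on the Boston data is feeding the network unscaled features (columns such as TAX are in the hundreds), which makes the gradient steps far too large or too small at any ordinary learning rate. A minimal plain-Python sketch (no TensorFlow; the names and data are illustrative) of standardizing the input so that ordinary gradient descent converges:

```python
def standardize(xs):
    # zero mean, unit variance
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [(x - mean) / var ** 0.5 for x in xs]

def fit_line(xs, ys, lr=0.1, steps=500):
    # plain gradient descent on mean squared error
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Raw feature on a large scale, like Boston's TAX column:
xs = [200.0, 400.0, 600.0, 800.0]
ys = [10.0, 20.0, 30.0, 40.0]
w, b = fit_line(standardize(xs), ys)
print(w, b)   # converges near the least-squares solution
```

With the raw xs (hundreds) the same loop at lr=0.1 would diverge, because the gradient scales with the square of the feature magnitude. Standardizing the inputs (and, for regression, often the targets too) is usually the first thing to try before tuning the optimizer.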

MLP in tensorflow for regression… not converging

Submitted by 这一生的挚爱 on 2019-12-07 11:43:26

Hello, this is my first time working with TensorFlow. I am trying to adapt the example here (TensorFlow-Examples) to use this code for regression on the Boston housing dataset. Basically, I only changed the cost function, the dataset, the number of inputs, and the number of targets, but when I run it the MLP doesn't converge (I use a very low learning rate). I tested it with the Adam optimizer and with gradient descent optimization, but I get the same behavior. I would appreciate your suggestions and ideas! Observation: when I ran this program without the modifications described above, the cost function value always decreased. Here ...

Tensorflow same code but get different result from CPU device to GPU device

Submitted by 妖精的绣舞 on 2019-12-06 11:21:31

Question: I am trying to implement a program to test TensorFlow performance on a GPU device. The test data is MNIST, with supervised training using a multilayer perceptron (neural network). I followed this simple example, but I changed the number of batch-gradient steps to 10000:

    for i in range(10000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
        if i % 500 == 0:
            print(i)

Eventually, when I check the prediction accuracy using this code ...
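Small CPU/GPU discrepancies like this are usually not a bug: floating-point addition is not associative, and GPU kernels reduce sums in a different (and often non-deterministic) order than a sequential CPU loop, so every training step rounds slightly differently, and 10000 steps amplify the drift. A minimal plain-Python illustration of the underlying effect:

```python
# The same three numbers, summed in two different groupings, as a
# sequential loop and a parallel reduction might do:
a = (0.1 + 0.2) + 0.3   # one reduction order
b = 0.1 + (0.2 + 0.3)   # another order over the same numbers

print(a == b)           # False: the two orders round differently
print(abs(a - b))       # tiny, on the order of 1e-16
```

Two other common sources of run-to-run differences are fresh random weight initialization (fix the seed before comparing devices) and non-deterministic GPU kernels; if the accuracies differ only slightly, this is expected behavior rather than an error.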

Cannot stack LSTM with MultiRNNCell and dynamic_rnn

Submitted by 岁酱吖の on 2019-12-04 12:17:19

Question: I am trying to build a multivariate time-series prediction model. I followed this tutorial for temperature prediction: http://nbviewer.jupyter.org/github/addfor/tutorials/blob/master/machine_learning/ml16v04_forecasting_with_LSTM.ipynb I want to extend the model to a multilayer LSTM using the following code:

    cell = tf.contrib.rnn.LSTMCell(hidden, state_is_tuple=True)
    cell = tf.contrib.rnn.MultiRNNCell([cell] * num_layers, state_is_tuple=True)
    output, _ = tf.nn.dynamic_rnn(cell=cell, inputs=features, dtype=tf.float32)
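A likely cause of the "Dimensions must be equal" error here is `[cell] * num_layers`: it places the same cell object at every layer, so all layers share one kernel sized for the first layer's input width, while layer 2 actually receives a hidden-sized input. The usual TF 1.x fix is to build a fresh cell per layer, e.g. MultiRNNCell([tf.contrib.rnn.LSTMCell(hidden, state_is_tuple=True) for _ in range(num_layers)], state_is_tuple=True). A plain-Python shape sketch (the cell_weight_shape helper is illustrative, not a real API) of why the shared kernel clashes:

```python
input_size, hidden = 3, 8   # example widths

def cell_weight_shape(in_size, hidden):
    # An LSTM-like cell's kernel maps [input, state] -> the four gates.
    return (in_size + hidden, 4 * hidden)

# Broken wiring: ONE shared kernel shape reused by every layer.
shared = cell_weight_shape(input_size, hidden)
layer2_input = hidden                       # output width of layer 1
# Layer 2 would need a (hidden + hidden)-row kernel, but got the shared one:
assert shared[0] != layer2_input + hidden   # -> "Dimensions must be equal"

# Correct wiring: a separate kernel (i.e. a fresh cell) per layer.
shapes = [cell_weight_shape(input_size, hidden)] + \
         [cell_weight_shape(hidden, hidden) for _ in range(2)]
print(shapes)   # first layer differs from the deeper layers
```

The mismatch disappears only when each layer owns a cell whose kernel matches that layer's actual input width, which is what the per-layer list comprehension achieves.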

Cannot stack LSTM with MultiRNNCell and dynamic_rnn

Submitted by 橙三吉。 on 2019-12-03 07:42:37

I am trying to build a multivariate time-series prediction model. I followed this tutorial for temperature prediction: http://nbviewer.jupyter.org/github/addfor/tutorials/blob/master/machine_learning/ml16v04_forecasting_with_LSTM.ipynb I want to extend the model to a multilayer LSTM using the following code:

    cell = tf.contrib.rnn.LSTMCell(hidden, state_is_tuple=True)
    cell = tf.contrib.rnn.MultiRNNCell([cell] * num_layers, state_is_tuple=True)
    output, _ = tf.nn.dynamic_rnn(cell=cell, inputs=features, dtype=tf.float32)

but I get an error saying:

    ValueError: Dimensions must be equal, ...

How to use multilayered bidirectional LSTM in Tensorflow?

Submitted by 旧城冷巷雨未停 on 2019-12-03 06:51:06

I want to know how to use a multilayered bidirectional LSTM in TensorFlow. I have already implemented a bidirectional LSTM, but I want to compare that model with a multi-layer version. What code should I add to this part?

    x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))
    #print(x[0].get_shape())
    # Define lstm cells with tensorflow
    # Forward direction cell
    lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
    # Backward direction cell
    lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
    # Get lstm cell output
    try:
        outputs, _, _ = rnn.static_bidirectional_rnn(lstm ...