I am experimenting with some simple models in TensorFlow, including one that looks very similar to the first MNIST for ML Beginners example, but with a somewhat larger dimensionality.
I was having a similar problem. (No problems training with the GradientDescent optimizer, but an error was raised when using the Adam optimizer, or any other optimizer with its own variables.)
Changing to an interactive session solved this problem for me.
sess = tf.Session()
into
sess = tf.InteractiveSession()
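For reference, a minimal sketch of the InteractiveSession pattern (the two variables here are placeholders, not the asker's model):

import tensorflow as tf

sess = tf.InteractiveSession()

# Placeholder model: two variables, as in the MNIST beginners example.
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# An InteractiveSession installs itself as the default session, so ops
# can be run with .run()/.eval() without passing sess around explicitly.
tf.global_variables_initializer().run()
print(b.eval())  # works: the variables are initialized

sess.close()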
You need to run tf.global_variables_initializer() in your session, like this:
init = tf.global_variables_initializer()
sess.run(init)
A full example is available in this great tutorial: https://www.tensorflow.org/get_started/mnist/mechanics
FailedPreconditionError: Attempting to use uninitialized value is one of the most frequent TensorFlow errors. From the official documentation of FailedPreconditionError:
This exception is most commonly raised when running an operation that reads a tf.Variable before it has been initialized.
In your case the error even tells you which variable was not initialized: Attempting to use uninitialized value Variable_1. One of the TF tutorials explains a lot about variables: their creation, initialization, saving, and loading.
Basically, to initialize a variable you have 3 options:
initialize all variables with tf.global_variables_initializer()
initialize only the variables you care about with tf.variables_initializer(list_of_variables)
initialize a single variable with var_name.initializer
I almost always use the first approach. Remember you should put it inside a session run. So you will get something like this:
with tf.Session() as sess:
sess.run(tf.global_variables_initializer())
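For completeness, here is a small sketch of the other two options (the variables a and b are just examples):

import tensorflow as tf

a = tf.Variable(tf.zeros([10]), name="a")
b = tf.Variable(tf.zeros([10]), name="b")

with tf.Session() as sess:
    # Option 2: initialize an explicit list of variables.
    sess.run(tf.variables_initializer([a, b]))
    # Option 3: initialize a single variable via its own initializer op.
    sess.run(b.initializer)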
If you are curious about variables, you can read the documentation on how to report_uninitialized_variables and check is_variable_initialized.
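For example, a quick debugging sketch (the variable name is illustrative):

import tensorflow as tf

v = tf.Variable(0, name="v")

with tf.Session() as sess:
    # Names of all variables that are still uninitialized,
    # returned as a 1-D tensor of (byte) strings.
    print(sess.run(tf.report_uninitialized_variables()))  # [b'v']
    # Boolean check for a single variable.
    print(sess.run(tf.is_variable_initialized(v)))  # False
    sess.run(v.initializer)
    print(sess.run(tf.is_variable_initialized(v)))  # True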
The AdamOptimizer class creates additional variables, called "slots", to hold values for the "m" and "v" accumulators.
See the source here if you're curious; it's actually quite readable: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adam.py#L39 . Other optimizers, such as Momentum and Adagrad, use slots too.
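To see the slots concretely, here is a small sketch using the optimizer's get_slot_names()/get_slot() API (the variable and loss are made up for illustration):

import tensorflow as tf

x = tf.Variable(tf.zeros([3]), name="x")
loss = tf.reduce_sum(tf.square(x - 1.0))

opt = tf.train.AdamOptimizer(1e-4)
train_op = opt.minimize(loss)

# minimize() created one "m" and one "v" slot per trainable variable.
print(opt.get_slot_names())    # ['m', 'v']
m = opt.get_slot(x, 'm')       # the slot is itself a tf.Variable
print(m.name)                  # e.g. 'x/Adam:0'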
These variables must be initialized before you can train a model.
The normal way to initialize variables is to call tf.initialize_all_variables(), which adds ops to initialize the variables present in the graph at the time it is called.
(Aside: despite what its name suggests, initialize_all_variables() does not initialize anything itself; it only adds ops that will initialize the variables when run. In later TensorFlow releases it was renamed tf.global_variables_initializer().)
What you must do is call initialize_all_variables() after you have added the optimizer:
...build your model...
# Add the optimizer
train_op = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
# Add the ops to initialize variables. These will include
# the optimizer slots added by AdamOptimizer().
init_op = tf.initialize_all_variables()
# launch the graph in a session
sess = tf.Session()
# Actually initialize the variables
sess.run(init_op)
# now train your model
for ...:
sess.run(train_op)
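As an aside (my addition, not part of the answer above): if your model variables were already initialized before you added the optimizer, you can initialize just the still-uninitialized variables instead of everything:

# Find the variables that are still uninitialized (e.g. the Adam slots)
# and initialize only those, leaving already-initialized values untouched.
# Note: report_uninitialized_variables() returns byte strings in Python 3.
uninit_names = set(sess.run(tf.report_uninitialized_variables()))
uninit_vars = [v for v in tf.global_variables()
               if v.name.split(':')[0].encode() in uninit_names]
sess.run(tf.variables_initializer(uninit_vars))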
Run init after constructing the AdamOptimizer, and don't define or run init before that:
sess.run(tf.initialize_all_variables())
or
sess.run(tf.global_variables_initializer())