Question
import tensorflow as tf

x = tf.constant(35, name='x')
y = tf.Variable(x + 5, name='y')
# model = tf.global_variables_initializer()

with tf.Session() as session:
    print("x = ", session.run(x))
    # session.run(model)
    print("y = ", session.run(y))
I was not able to understand when global_variables_initializer() is actually required. In the code above, if I uncomment the two commented lines, the code runs and prints both values; run as-is, it crashes.
My question is: which variables does it initialize? x is a constant, which needs no initialization, and y is a variable that is never explicitly initialized but is defined by an arithmetic operation.
Answer 1:
tf.global_variables_initializer is a shortcut to initialize all global variables. It is not strictly required; you can initialize your variables in other ways, and in simple scripts you sometimes do not need to initialize them at all.
Everything except variables (i.e. constants and placeholders) requires no initialization, but every used variable must be initialized, even one holding a constant value. The following code raises an error, although z is just a 0-d tensor holding a single number.
import tensorflow as tf

z = tf.Variable(4)

with tf.Session() as session:
    print(session.run(z))  # fails: z was never initialized
I highlighted the word used because, if your graph merely contains variables that no run depends on, you do not need to initialize them.
For example, the following code executes without any problems even though it defines two variables and an operation that depends on them, because the run never touches them.
import tensorflow as tf

x = tf.constant(35, name='x')
y = tf.Variable(x + 5, name='y')
z = tf.Variable(4)
a = y + z

with tf.Session() as session:
    print("x = ", session.run(x))  # only x is evaluated, so no initialization is needed
Answer 2:
From the docs (emphasis mine):
Calling tf.Variable() adds several ops to the graph:
- A variable op that holds the variable value.
- An initializer op that sets the variable to its initial value. This is actually a tf.assign op.
- The ops for the initial value, such as the zeros op for the biases variable in the example, are also added to the graph.
Later,
Variable initializers must be run explicitly before other ops in your model can be run. The easiest way to do that is to add an op that runs all the variable initializers, and run that op before using the model.
In short, global_variables_initializer is never required as such; Variable initialization is. Whenever you have Variables in your code, you must initialize them before they are used. The global_variables_initializer helper initializes all Variables that have been declared so far, and is therefore just a very convenient way to do it.
Answer 3:
It's only a requirement when your session run evaluates something that depends on a declared tf.Variable (a tf.placeholder needs no initialization; it is fed values through feed_dict). Personally, I make it a habit of always running tf.global_variables_initializer(). It has almost become part of the boilerplate code when running tensorflow models:
with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    # run model etc...
Answer 4:
The tf.global_variables_initializer op just initializes all variables that tf.global_variables() would list. This is especially useful in a distributed environment, where the graph may be spread across different computing nodes in a cluster.
In such a case, tf.global_variables_initializer(), which is just an alias for tf.variables_initializer(tf.global_variables()), initializes the variables on all the computing nodes where the graph is placed.
Source: https://stackoverflow.com/questions/44299666/when-global-variables-initializer-is-actually-required