I am getting a placeholder error.
I do not know what it means, because I am supplying the feed mappings correctly in sess.run(..., {_y: y, _X: X})
... I provide here a fully
The tf.merge_all_summaries() function is convenient, but also somewhat dangerous: it merges all summaries in the default graph, which includes any summaries from previous (apparently unconnected) invocations of code that also added summary nodes to the default graph. If old summary nodes depend on an old placeholder, you will get errors like the one shown in your question (and in previous questions as well).
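To see why this bites, here is a minimal sketch of the failure mode, using the 0.x-era summary API (the placeholder and summary names here are made up for illustration):

import tensorflow as tf

# First piece of code run in this process: adds a summary that
# depends on a placeholder to the default graph.
old_x = tf.placeholder(tf.float32, name="old_x")
tf.scalar_summary("old_mean", tf.reduce_mean(old_x))

# A second, apparently unconnected piece of code in the same process.
new_x = tf.placeholder(tf.float32, name="new_x")
tf.scalar_summary("new_mean", tf.reduce_mean(new_x))

merged = tf.merge_all_summaries()  # merges BOTH summaries

sess = tf.Session()
# Fails with a placeholder error: `merged` still depends on `old_x`,
# which is not fed here.
sess.run(merged, {new_x: [1.0, 2.0, 3.0]})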
There are two independent workarounds:
1. Ensure that you explicitly collect the summaries that you wish to compute. This is as simple as using an explicit tf.merge_summary() op in your example:
accuracy_summary = tf.scalar_summary("accuracy", accuracy)
loss_summary = tf.scalar_summary("loss", C)
# Merge only these two summaries, ignoring any stale ones in the graph.
merged = tf.merge_summary([accuracy_summary, loss_summary])
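With merged constructed explicitly, running it only requires feeding the placeholders it actually depends on. A brief usage sketch, assuming the sess, writer, and step variables from your training loop:

summary_str, acc = sess.run([merged, accuracy], {_X: X, _y: y})
writer.add_summary(summary_str, step)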
2. Ensure that each time you create a new set of summaries, you do so in a new graph. The recommended style is to use an explicit default graph:
with tf.Graph().as_default():
    # Build the model and create the session in this scope.
    #
    # Only summary nodes created in this scope will be returned by a call to
    # `tf.merge_all_summaries()`.
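For example, here is a self-contained sketch of that pattern, with a toy linear model standing in for yours (the model, shapes, and names are illustrative, not taken from your code):

import numpy as np
import tensorflow as tf

with tf.Graph().as_default():
    # Toy stand-in for the real model.
    _X = tf.placeholder(tf.float32, [None, 2], name="X")
    _y = tf.placeholder(tf.float32, [None, 1], name="y")
    W = tf.Variable(tf.zeros([2, 1]))
    C = tf.reduce_mean(tf.square(tf.matmul(_X, W) - _y))  # loss

    tf.scalar_summary("loss", C)
    merged = tf.merge_all_summaries()  # sees only this graph's summaries

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        X = np.random.rand(4, 2).astype(np.float32)
        y = np.random.rand(4, 1).astype(np.float32)
        summary_str = sess.run(merged, {_X: X, _y: y})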
Alternatively, if you are using the latest open-source version of TensorFlow (or the forthcoming 0.7.0 release), you can call tf.reset_default_graph() to reset the state of the graph and remove any old summary nodes.
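A minimal sketch of that approach (again with made-up names):

import tensorflow as tf

tf.reset_default_graph()  # drops every node, including stale summary ops

# Rebuild the model from scratch; tensors and sessions created before
# the reset are invalid and must not be reused.
_X = tf.placeholder(tf.float32, name="X")
tf.scalar_summary("mean", tf.reduce_mean(_X))
merged = tf.merge_all_summaries()  # now sees only the fresh summaries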