I'm using tf.train.shuffle_batch() to create batches of input images. It includes a min_after_dequeue parameter that makes sure there's a specified number of elements inside the queue after each dequeue. Will closing the queue stop the dequeuing threads from blocking on that minimum?
You are correct: running the RandomShuffleQueue.close() operation will stop the dequeuing threads from blocking when there are fewer than min_after_dequeue elements in the queue.
The tf.train.shuffle_batch() function creates a tf.train.QueueRunner that performs operations on the queue in a background thread. If you start it as follows, passing a tf.train.Coordinator, you will be able to close the queue cleanly (based on the example here):
sess = tf.Session()
coord = tf.train.Coordinator()
# start_queue_runners() returns the list of threads it starts;
# keep it so we can join them later.
threads = tf.train.start_queue_runners(sess, coord=coord)
try:
    while not coord.should_stop():
        sess.run(train_op)
except tf.errors.OutOfRangeError:
    pass  # Raised when the queue has been closed and is empty.
finally:
    # When done, ask the threads to stop.
    coord.request_stop()
# And wait for them to actually do it.
coord.join(threads)
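To make the min_after_dequeue behavior concrete, here is a hedged, non-TensorFlow sketch: a toy queue (the class name and structure are my own, not a TF API) whose dequeue() blocks while the element count would drop to or below min_after_dequeue, and whose close() lifts that floor so the remaining elements can be drained, mirroring what RandomShuffleQueue.close() does.

```python
import threading

class ToyShuffleQueue:
    """Illustrative stand-in for RandomShuffleQueue's blocking semantics."""

    def __init__(self, min_after_dequeue):
        self.min_after_dequeue = min_after_dequeue
        self.items = []
        self.closed = False
        self.cond = threading.Condition()

    def enqueue(self, item):
        with self.cond:
            self.items.append(item)
            self.cond.notify_all()

    def close(self):
        # After close(), dequeue() no longer enforces the minimum.
        with self.cond:
            self.closed = True
            self.cond.notify_all()

    def dequeue(self):
        with self.cond:
            # Block until more than min_after_dequeue elements remain,
            # or until the queue has been closed.
            while not self.closed and len(self.items) <= self.min_after_dequeue:
                self.cond.wait()
            if not self.items:
                raise IndexError("queue is closed and empty")
            return self.items.pop(0)

q = ToyShuffleQueue(min_after_dequeue=2)
for i in range(5):
    q.enqueue(i)

drained = [q.dequeue() for _ in range(3)]  # leaves exactly min_after_dequeue
q.close()                                  # lifts the floor
drained += [q.dequeue(), q.dequeue()]      # drains the last two elements
print(drained)  # → [0, 1, 2, 3, 4]
```

Without the close() call, the fourth dequeue() would block forever, which is exactly the situation the Coordinator-based shutdown above avoids.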