TensorFlow: How can I evaluate a validation data queue multiple times during training?


tl;dr

How can I evaluate a validation set after every K training iterations, using separate queues for training and validation data, without resorting to separate sessions?

1 Answer

    I'm currently facing a similar problem. So far I have avoided queues entirely and just fed in the data via feed_dict, but I'm obviously losing some performance by not using queues and parallelism (although I'm still happy with the current speed, as I did the same in Theano earlier). Now I want to redesign this to use queues, and I stumbled upon this problem. There are a few related issues: this, this, and this.
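
    For context, a queue-free setup like that might look like the following minimal sketch (the toy linear model, shapes, data, and the K=100 evaluation interval are just assumptions for illustration, not from the original answer):

    ```python
    import numpy as np
    import tensorflow as tf

    # Toy linear model; placeholder names, shapes and data are illustrative.
    x = tf.placeholder(tf.float32, shape=[None, 10])
    y = tf.placeholder(tf.float32, shape=[None, 1])
    w = tf.Variable(tf.zeros([10, 1]))
    b = tf.Variable(tf.zeros([1]))
    loss = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))
    train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

    train_x = np.random.rand(128, 10).astype(np.float32)
    train_y = np.random.rand(128, 1).astype(np.float32)
    val_x = np.random.rand(32, 10).astype(np.float32)
    val_y = np.random.rand(32, 1).astype(np.float32)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(1000):
            # Same placeholders serve both training and validation batches.
            sess.run(train_op, feed_dict={x: train_x, y: train_y})
            if step % 100 == 0:  # evaluate the validation set every K steps
                print(sess.run(loss, feed_dict={x: val_x, y: val_y}))
    ```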

    I'm currently thinking about doing it this way:

    • In training, I want to use a RandomShuffleQueue, which makes it even more complicated. I think I will just ignore that problem: once the reader thread that enqueues tensors into the queue finishes, I will let the training stop, so I lose the remaining up-to-capacity items for this epoch and just use them for the next epoch. Maybe to make it deterministic, I would check in the train thread that I keep reading from the queue until only min_after_dequeue items are left.

    • In evaluation, I want to use the same graph and the same session. I can use tf.cond to read from another, separate queue instead of the RandomShuffleQueue. Or I could use feed_dict in evaluation. If I used a separate queue, it would be a FIFOQueue, and I would carefully track that I do the right number of steps. I could also enqueue a dummy tensor into the queue as an end_of_epoch flag, so the eval thread knows when to stop. A sketch of this two-queue / tf.cond setup follows below.
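
    A minimal sketch of that two-queue / tf.cond idea; the shapes, capacities, and the `use_validation` flag are assumptions made for illustration, not the original poster's code:

    ```python
    import tensorflow as tf

    example_shape = [10]
    batch_size = 32

    # Shuffled queue for training, plain FIFO queue for validation.
    train_queue = tf.RandomShuffleQueue(
        capacity=1000, min_after_dequeue=100,
        dtypes=[tf.float32], shapes=[example_shape])
    val_queue = tf.FIFOQueue(
        capacity=200, dtypes=[tf.float32], shapes=[example_shape])

    # Reader threads would feed these enqueue ops (e.g. via tf.train.QueueRunner).
    new_examples = tf.placeholder(tf.float32, shape=[None] + example_shape)
    enqueue_train = train_queue.enqueue_many(new_examples)
    enqueue_val = val_queue.enqueue_many(new_examples)

    # A boolean placeholder selects the queue per session.run call.  The dequeue
    # ops are created inside the branch functions so that only the chosen branch
    # actually consumes items from its queue.
    use_validation = tf.placeholder_with_default(False, shape=[])
    batch = tf.cond(use_validation,
                    lambda: val_queue.dequeue_many(batch_size),
                    lambda: train_queue.dequeue_many(batch_size))

    loss = tf.reduce_mean(batch)  # stand-in for the real model and loss

    # In the session:
    #   sess.run(loss)                                    -> training batch
    #   sess.run(loss, feed_dict={use_validation: True})  -> validation batch
    ```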


    In TensorFlow 1.2, there will be the tf.contrib.data interface (issue comment, documentation overview, API documentation), which provides the tf.contrib.data.Dataset API. It also supports shuffling (similar to tf.RandomShuffleQueue), batching, and looping over multiple epochs. In addition, you access the data by creating an iterator over it, and you can reset the iterator. Some related StackOverflow questions are here and here.
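
    A sketch of that approach, assuming the tf.contrib.data API of TensorFlow 1.2 with a reinitializable iterator (the toy numpy arrays, sizes, and the K=100 evaluation interval are illustrative assumptions):

    ```python
    import numpy as np
    import tensorflow as tf

    train_np = np.random.rand(1000, 10).astype(np.float32)
    val_np = np.random.rand(200, 10).astype(np.float32)

    train_ds = (tf.contrib.data.Dataset.from_tensor_slices(train_np)
                .shuffle(buffer_size=1000)  # plays the role of RandomShuffleQueue
                .batch(32)
                .repeat())                  # loop over multiple epochs
    val_ds = tf.contrib.data.Dataset.from_tensor_slices(val_np).batch(32)

    # One reinitializable iterator feeds the same model graph from either dataset.
    iterator = tf.contrib.data.Iterator.from_structure(
        train_ds.output_types, train_ds.output_shapes)
    next_batch = iterator.get_next()
    train_init = iterator.make_initializer(train_ds)
    val_init = iterator.make_initializer(val_ds)

    loss = tf.reduce_mean(next_batch)  # stand-in for the real model and loss

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train_init)
        for step in range(1, 1001):
            sess.run(loss)                   # training step
            if step % 100 == 0:              # every K steps: evaluate
                sess.run(val_init)           # reset the iterator to validation
                while True:
                    try:
                        sess.run(loss)
                    except tf.errors.OutOfRangeError:
                        break
                # Re-running train_init restarts the (shuffled, repeated)
                # training dataset, which is acceptable since it loops forever.
                sess.run(train_init)
    ```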
