Convolutional Neural Network - Dropout kills performance
I'm building a convolutional neural network with TensorFlow (I'm new to both) to recognize letters. I'm seeing very strange behavior with the dropout layer: if I leave it out (i.e. keep_proba at 1), the network performs quite well and learns (see the TensorBoard screenshots of accuracy and loss below, with training in blue and testing in orange). However, when I enable the dropout layer during the training phase (I tried keep_proba at 0.8 and 0.5), the network learns nothing: the loss falls quickly to around 3 or
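For context, here is a minimal sketch of how this kind of setup is commonly wired in TF 1.x, with keep_proba fed as a placeholder (the architecture, layer sizes, and names below are my own illustrative assumptions, not the actual code from the question):

```python
import tensorflow as tf

# Placeholder so the same graph can run with dropout on (training)
# and off (testing). "keep_proba" mirrors the name used in the question.
keep_proba = tf.placeholder(tf.float32, name="keep_proba")

x = tf.placeholder(tf.float32, [None, 28, 28, 1])  # hypothetical input shape
conv = tf.layers.conv2d(x, filters=32, kernel_size=5, activation=tf.nn.relu)
pool = tf.layers.max_pooling2d(conv, pool_size=2, strides=2)
flat = tf.layers.flatten(pool)
fc = tf.layers.dense(flat, 1024, activation=tf.nn.relu)

# Dropout on the fully connected layer; tf.nn.dropout rescales the kept
# activations by 1/keep_proba so their expected sum is unchanged.
dropped = tf.nn.dropout(fc, keep_prob=keep_proba)
logits = tf.layers.dense(dropped, 26)  # 26 letter classes (assumption)

# Training step:   sess.run(train_op,  feed_dict={x: batch, keep_proba: 0.5})
# Evaluation step: sess.run(accuracy, feed_dict={x: test,  keep_proba: 1.0})
```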