Data Augmentation hurts accuracy Keras

Submitted by 北战南征 on 2019-12-13 03:57:55

Question


I'm trying to adapt Deep Learning with Python section 5.3 Feature extraction with Data Augmentation to a 3-class problem with resnet50 (imagenet weights).

Full code at https://github.com/morenoh149/plantdisease

from keras import models
from keras import layers
from keras.applications.resnet50 import ResNet50
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator

input_shape = (224, 224, 3)
target_size = (224, 224)
batch_size = 20

# Pre-trained ResNet50 convolutional base (ImageNet weights, no classifier head)
conv_base = ResNet50(weights='imagenet', input_shape=input_shape, include_top=False)

model = models.Sequential()
model.add(conv_base)
model.add(layers.Flatten())
model.add(layers.Dense(256, activation='relu'))
model.add(layers.Dense(3, activation='softmax'))

# Freeze the convolutional base so only the new Dense layers are trained
conv_base.trainable = False

# Both generators only rescale pixel values; no augmentation transforms are applied here
train_datagen = ImageDataGenerator(rescale=1./255)
test_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    'input/train',
    target_size=target_size,
    batch_size=batch_size,
    class_mode='categorical')
validation_generator = test_datagen.flow_from_directory(
    'input/validation',
    target_size=target_size,
    batch_size=batch_size,
    class_mode='categorical')
model.compile(loss='categorical_crossentropy',
             optimizer=optimizers.RMSprop(lr=2e-5),
             metrics=['acc'])
history = model.fit_generator(
    train_generator,
    steps_per_epoch=96,
    epochs=30,
    verbose=2,
    validation_data=validation_generator,
    validation_steps=48)

Questions:

  • The book doesn't go into much detail on ImageDataGenerator or on choosing steps_per_epoch and validation_steps. What should these values be? I have 3 classes with 1,000 images each, split 60/20/20 into train/validation/test.
  • Without data augmentation I was able to get a validation accuracy of 60%. Above I've simplified the ImageDataGenerator to only rescale, yet this model reaches only 30% validation accuracy. Why?
  • What changes do I need to make to the data-augmented version of this script to match the accuracy obtained without augmentation?

UPDATE: This may be an issue with Keras itself:

  • https://github.com/keras-team/keras/issues/9214
  • https://github.com/keras-team/keras/pull/9965

Answer 1:


To answer your first question: steps_per_epoch is the number of batches the training generator should yield before an epoch is considered finished. If you have 600 training images and a batch size of 20, that would be 30 steps per epoch, and so on. validation_steps applies the same logic to the validation data generator, which is run at the end of each epoch.

In general, steps_per_epoch is the size of your dataset divided by the batch size.
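
As a concrete illustration (a minimal sketch, assuming the 60/20/20 split described in the question, i.e. 1,800 training and 600 validation images, and reusing model and the generators from the question's script), the two values could be derived from the dataset size rather than hard-coded:

# Derive step counts from the dataset size described in the question
train_images = 3 * 1000 * 60 // 100       # 1800 training images (60% of 3 x 1000)
validation_images = 3 * 1000 * 20 // 100  # 600 validation images (20% of 3 x 1000)
batch_size = 20

steps_per_epoch = train_images // batch_size        # 1800 // 20 = 90
validation_steps = validation_images // batch_size  # 600 // 20 = 30

history = model.fit_generator(
    train_generator,
    steps_per_epoch=steps_per_epoch,      # 90 instead of the hard-coded 96
    epochs=30,
    verbose=2,
    validation_data=validation_generator,
    validation_steps=validation_steps)    # 30 instead of the hard-coded 48

With these numbers each epoch covers the training set exactly once and each validation pass covers the validation set exactly once, instead of the generators wrapping around as they would with steps_per_epoch=96 and validation_steps=48.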



Source: https://stackoverflow.com/questions/51692071/data-augmentation-hurts-accuracy-keras
