Keras - How are batches and epochs used in fit_generator()?

Asked by 梦谈多话 · 2020-12-31 02:34

I have a video of 8000 frames, and I'd like to train a Keras model on batches of 200 frames each. I have a frame generator that loops through the video frame-by-frame and a

3 Answers
  • 2020-12-31 02:56

    After the first epoch is complete (after the model logs batches 0-24), the generator picks up where it left off

    This is an accurate description of what happens. If you want to reset or rewind the generator, you'll have to do so yourself, inside the generator. Note that Keras's behavior is quite useful in many situations: for example, you can end an epoch after seeing half the data and then run an epoch on the other half, which would be impossible if the generator state were reset. This can be useful for monitoring the validation metrics more closely.
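    A quick way to see this state persistence is a pure-Python sketch (dummy data, with `next()` standing in for Keras's draws):

    ```python
    def batch_generator(data, batch_size):
        # Infinite generator: its state (the index i) persists between draws.
        i = 0
        while True:
            yield data[i:i + batch_size]
            i = (i + batch_size) % len(data)

    data = list(range(8))              # stand-in for 8 samples
    gen = batch_generator(data, 2)

    # "Epoch 1": draw steps_per_epoch batches (here 2 of the 4 possible)
    epoch1 = [next(gen) for _ in range(2)]   # [[0, 1], [2, 3]]
    # "Epoch 2": the generator resumes where it left off instead of resetting
    epoch2 = [next(gen) for _ in range(2)]   # [[4, 5], [6, 7]]
    ```

    This is exactly the "half the data per epoch" trick from above: with `steps_per_epoch` set to half the number of batches, consecutive epochs see disjoint halves.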

  • 2020-12-31 03:05

    You can make your generator restart from the beginning by wrapping its body in a while 1: loop; that's how I proceed. Once it has run through the data it starts over, so it can yield batched data for every epoch.
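    A minimal sketch of that pattern (hypothetical `frame_batches` helper, dummy frame list):

    ```python
    def frame_batches(frames, batch_size):
        # The outer `while 1` makes the generator infinite: after the last
        # batch of a pass it starts over from the first frame, so Keras can
        # keep drawing batches for as many epochs as it likes.
        while 1:
            for start in range(0, len(frames), batch_size):
                yield frames[start:start + batch_size]

    gen = frame_batches(list(range(8)), 3)
    batches = [next(gen) for _ in range(4)]
    # The first three draws cover all 8 frames; the fourth starts over:
    # [[0, 1, 2], [3, 4, 5], [6, 7], [0, 1, 2]]
    ```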

  • 2020-12-31 03:15

    Because the generator is a completely separate function, it simply continues its infinite loop whenever it is called again.

    What I can't explain is that fit_generator() keeps calling the generator until it has enough samples. I can't find a batch_size variable, but there must be some criterion that sets an internal variable defining the size.

    I checked this by printing a message on each loop iteration:

    def generator():
        while 1:
            for i in range(0, len(x_v) - 1):
                if i != predict_batch_nr:
                    print("\n -> usting Datasett ", i + 1, " of ", len(x_v))
                    x = x_v[i]  # x_v holds batches of different lengths
                    y = y_v[i]  # y_v holds batches of different lengths
                    yield x, y

    model.fit_generator(generator(), steps_per_epoch=5000, epochs=20, verbose=1)
    

    Example output is:

    4914/5000 [============================>.] - ETA: 13s - loss: 2442.8587
    usting Datasett  77  of  92
    4915/5000 [============================>.] - ETA: 12s - loss: 2442.3785
    -> usting Datasett  78  of  92
    -> usting Datasett  79  of  92
    -> usting Datasett  80  of  92
    4918/5000 [============================>.] - ETA: 12s - loss: 2442.2111
    -> usting Datasett  81  of  92
    -> usting Datasett  82  of  92
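
    The output above matches how fit_generator() counts: there is no batch_size argument, because each value the generator yields is treated as exactly one batch, and steps_per_epoch fixes how many yields make up an epoch. A toy illustration of that accounting (hypothetical names, no Keras involved):

    ```python
    def batches():
        # Every yield counts as exactly one batch, whatever its size.
        n = 0
        while True:
            n += 1
            yield n

    gen = batches()
    steps_per_epoch = 5

    # Keras ends an epoch after steps_per_epoch draws, nothing more:
    epoch1 = [next(gen) for _ in range(steps_per_epoch)]   # batches 1..5
    epoch2 = [next(gen) for _ in range(steps_per_epoch)]   # batches 6..10
    ```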
    