I have the following code, which I want to rewrite to work on a large-scale dataset. I am using a Python generator to fit the model on data yielded batch-by-batch.
def
As the question you linked indicates, Keras generators have to iterate indefinitely, so they can keep yielding batches to training for as long as needed. There is more background on this in this GitHub issue.
For that, you must make a small modification to your generator, like:
import numpy as np

def subtract_mean_gen(x_source, y_source, avg_image, batch):
    batch_list_x = []
    batch_list_y = []
    while True:  # run forever, so you can generate elements indefinitely
        for line, y in zip(x_source, y_source):
            x = line.astype('float32')
            x = x - avg_image  # subtract the mean image from each sample
            batch_list_x.append(x)
            batch_list_y.append(y)
            if len(batch_list_x) == batch:
                yield (np.array(batch_list_x), np.array(batch_list_y))
                batch_list_x = []
                batch_list_y = []
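For completeness, here is a minimal sketch of how such an endless generator is typically wired into training. The model, x_train, y_train and avg_image names, the batch size, and the epoch count are assumptions for illustration, not taken from your code; with the Keras 2 API, steps_per_epoch tells Keras how many batches make up one epoch, since the generator itself never stops.

batch = 32  # assumed batch size, adjust to your setup

# The generator loops forever, so Keras must be told how many batches
# constitute one epoch via steps_per_epoch.
train_gen = subtract_mean_gen(x_train, y_train, avg_image, batch)
model.fit_generator(train_gen,
                    steps_per_epoch=len(x_train) // batch,
                    epochs=10)

If you also feed validation data from a generator built the same way, pass it together with validation_steps so Keras knows when one validation pass is complete.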