Question
I recently started learning Deeplearning4j and I don't understand how the concepts of epochs and iterations are actually implemented. The online documentation says:
an epoch is a complete pass through a given dataset ...
Not to be confused with an iteration, which is simply one update of the neural net model’s parameters.
I ran training using a MultipleEpochsIterator. For the first run I set nEpochs = 1, miniBatchSize = 1, and a dataset of 1000 samples, so I expected the training to finish after 1 epoch and 1000 iterations, but after more than 100,000 iterations it was still running.
int nEpochs = 1;
int miniBatchSize = 1;
MyDataSetFetcher fetcher = new MyDataSetFetcher(xDataDir, tDataDir, xSamples, tSamples);
// The same batch size set here was also set in the model configuration
BaseDatasetIterator baseIterator = new BaseDatasetIterator(miniBatchSize, sampleSize, fetcher);
// Wraps the base iterator so the dataset is traversed nEpochs times
MultipleEpochsIterator iterator = new MultipleEpochsIterator(nEpochs, baseIterator);
model.fit(iterator);
Then I ran more tests with different batch sizes, but that didn't change the frequency of the log lines printed by the IterationListener. My expectation was that if I increase the batch size to 100, then with 1000 samples I would get just 10 parameter updates and therefore just 10 iterations per epoch, but the logs and the timestamp intervals are more or less the same.
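For reference, the relationship I was expecting between dataset size, mini-batch size, and iterations per epoch is just this arithmetic (a minimal sketch with the numbers from the scenario above; it is not output produced by DL4J):

```java
public class IterationMath {
    public static void main(String[] args) {
        int numSamples = 1000;   // dataset size from the question
        int batchSize  = 100;    // increased mini-batch size
        int nEpochs    = 1;

        // One iteration = one parameter update = one mini-batch processed
        int iterationsPerEpoch = (int) Math.ceil((double) numSamples / batchSize);
        int totalIterations    = iterationsPerEpoch * nEpochs;

        System.out.println("Iterations per epoch: " + iterationsPerEpoch); // 10
        System.out.println("Total iterations:     " + totalIterations);    // 10
    }
}
```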
By the way, there is a similar question, but its answer does not actually address mine; I would like to understand the actual details better: Deeplearning4j: Iterations, Epochs, and ScoreIterationListener
Answer 1:
None of this will matter after 1.x (which is already out in alpha); we got rid of iterations long ago.
Originally the iteration count was meant as shortcut syntax so folks wouldn't have to write for loops.
Just focus on for loops over epochs now.
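The kind of epoch-based loop the answer refers to might look like this (a minimal sketch, assuming `model` is a configured `MultiLayerNetwork` and `trainIter` is a `DataSetIterator`; the names are illustrative, not taken from the question):

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class TrainWithEpochs {
    // Trains the network for nEpochs complete passes over the data,
    // without relying on MultipleEpochsIterator or an iteration count.
    static void train(MultiLayerNetwork model, DataSetIterator trainIter, int nEpochs) {
        for (int epoch = 0; epoch < nEpochs; epoch++) {
            trainIter.reset();     // start each epoch at the beginning of the data
            model.fit(trainIter);  // one complete pass = one epoch
        }
    }
}
```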
Source: https://stackoverflow.com/questions/50273031/epochs-and-iterations-in-deeplearning4j