
I would like to know whether setting the Keras model.fit steps_per_epoch argument to a fixed value smaller than (number of samples in the dataset / batch size) will always use the same samples (from sample 0 to sample (steps_per_epoch * batch_size) - 1), thereby reducing the effective size of the training set, or whether it draws steps_per_epoch random batches of size batch_size (sampled without replacement) from the training set in every epoch.

Regards,

Alex


1 Answer


You can easily see the elements inside each of your batches during training by adding a print() line inside a custom batch generator. You'll see that the batches are randomly permuted between epochs (though the samples inside each batch are not, if shuffle is set to False).
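One way to do that (a minimal sketch; the class name PrintingGenerator is my own, purely for illustration) is a small keras.utils.Sequence whose __getitem__ prints the sample indices it serves:

```python
from tensorflow import keras

# Minimal illustrative generator: prints which samples end up in each batch,
# so you can watch the order in which batches are requested during fit().
class PrintingGenerator(keras.utils.Sequence):
    def __init__(self, x, y, batch_size):
        super().__init__()
        self.x, self.y, self.batch_size = x, y, batch_size

    def __len__(self):
        # Number of batches this generator can provide per epoch
        return len(self.x) // self.batch_size

    def __getitem__(self, idx):
        start = idx * self.batch_size
        end = start + self.batch_size
        print(f"batch {idx}: samples {list(range(start, end))}")
        return self.x[start:end], self.y[start:end]
```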

Consider, for example, a dataset of 10 samples, each identified by a unique number. If, say, batch_size = 2, shuffle = False and steps_per_epoch = 10 // 2 = 5, then during the first epoch the batches are called in order (batch_1 = [1,2], batch_2 = [3,4], ..., batch_5 = [9,10]). After this epoch, however, the batches are randomly permuted: during the second epoch we might get, for example, batch_4 = [7,8], then batch_1, then batch_3, and so on until the end of training.
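You can reproduce that example with the PrintingGenerator sketched above (the data and the throwaway model below are just assumptions for illustration):

```python
import numpy as np
from tensorflow import keras

# 10 samples, each identifiable by its index, as in the example above.
x = np.arange(10, dtype="float32").reshape(-1, 1)
y = np.zeros(10, dtype="float32")

# A throwaway model, only there so that fit() can run.
model = keras.Sequential([keras.Input(shape=(1,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

gen = PrintingGenerator(x, y, batch_size=2)  # 5 batches available per epoch

# shuffle=False keeps the contents of each batch fixed; the print() output
# reveals the order in which the 5 batches are requested in each epoch.
model.fit(gen, steps_per_epoch=5, epochs=3, shuffle=False, verbose=0)
```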

Thus, even if the steps_per_epoch argument is less than the recommended size_of_dataset // batch_size, the batches will still be randomly permuted, so your model will still see the entirety of your dataset over the course of training. Note that this question is trivial if shuffle = True, since in that case the elements inside the batches are randomly drawn from the entire dataset at each epoch.
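Keeping the same hypothetical setup, you can drop steps_per_epoch below the 5 available batches and verify this directly from the printed indices:

```python
# Same data, model and generator as above, but only 3 of the 5 available
# batches are consumed per epoch. Over several epochs, the printed indices
# show which samples are actually visited and in what order.
model.fit(gen, steps_per_epoch=3, epochs=4, shuffle=False, verbose=0)
```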


Lastly, I just wanted to share this answer with you, which describes an interesting use of a reduced steps_per_epoch.