What do you mean by an epoch in the context of deep learning?

1 Answer

An epoch is a term used in deep learning that refers to one complete pass of the learning algorithm over the full training dataset. The dataset is commonly split into batches (especially when the amount of data is very large), and the term "iteration" refers to running one batch through the model.

If the batch size is the entire training dataset, then the number of iterations equals the number of epochs. For practical reasons (memory limits, gradient noise), this is rarely the case, and most models are trained over several epochs.

These quantities are linked by a general relation (assuming the dataset size is divisible by the batch size):

d * e = i * b

where:

d is the dataset size

e is the number of epochs

i is the total number of iterations

b is the batch size
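As a minimal sketch of this relation (the helper name and the sample numbers below are my own, chosen for illustration), the iteration count can be computed directly. When d is not divisible by b, the last batch of each epoch is smaller, so iterations per epoch round up:

```python
import math

def iterations_per_epoch(dataset_size: int, batch_size: int) -> int:
    # Each epoch processes the full dataset once, one batch at a time.
    # The final batch may be partial, hence the ceiling division.
    return math.ceil(dataset_size / batch_size)

# Example: 10,000 samples, batch size 32, trained for 5 epochs.
d, b, e = 10_000, 32, 5
per_epoch = iterations_per_epoch(d, b)   # 313 iterations per epoch
total_iterations = per_epoch * e         # 1565 iterations overall
print(per_epoch, total_iterations)
```

With an exact split (say d = 10,000 and b = 100), the formula d * e = i * b holds with equality: i = (d / b) * e.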
