Explain the following variants of Gradient Descent: Stochastic, Batch, and Mini-batch.

1 Answer


Below are the variants of Gradient Descent: Stochastic, Batch, and Mini-batch.

1. Stochastic Gradient Descent

Stochastic gradient descent computes the gradient and updates the parameters using only a single training example per step, so each pass over the data produces as many updates as there are examples.
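As a minimal sketch of this update rule, assuming a NumPy least-squares setup (the data, learning rate, and variable names here are illustrative, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 training examples, 3 features (assumed data)
y = X @ np.array([1.0, -2.0, 0.5])     # targets from a known linear model
w = np.zeros(3)                        # parameters to learn
lr = 0.1                               # illustrative learning rate

# Stochastic gradient descent: one parameter update per training example.
for i in rng.permutation(len(X)):
    grad = (X[i] @ w - y[i]) * X[i]    # squared-error gradient for one sample
    w -= lr * grad
```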

2. Batch Gradient Descent

Batch gradient descent computes the gradient over the whole training dataset and performs just one parameter update per iteration.
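A corresponding sketch, under the same illustrative least-squares assumptions, where each iteration uses the gradient averaged over the full dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # full training set (assumed data)
y = X @ np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
lr = 0.1

# Batch gradient descent: one update per iteration, gradient over all examples.
for _ in range(200):
    grad = X.T @ (X @ w - y) / len(X)  # average squared-error gradient
    w -= lr * grad
```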

3. Mini-batch Gradient Descent

Mini-batch gradient descent is a variation of stochastic gradient descent: instead of a single training example, a mini-batch of samples is used for each update. It is one of the most popular optimization approaches in deep learning, since it balances the stable gradients of batch gradient descent against the cheap, frequent updates of stochastic gradient descent.
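And a sketch of the mini-batch variant under the same assumptions; the `batch_size` value is an illustrative choice, not from the original answer:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # assumed data, as in the sketches above
y = X @ np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
lr, batch_size = 0.1, 16

# Mini-batch gradient descent: one update per shuffled batch of examples.
for epoch in range(20):
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * grad
```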
