Explain Batch Gradient Descent.

1 Answer


Batch Gradient Descent: Batch Gradient Descent computes the gradient over the entire training set at every step, so each update is very slow on large training sets and the method becomes computationally expensive. Because the full-dataset gradient is exact (not a noisy estimate), it converges steadily and works well when the error surface is convex or reasonably smooth. Batch Gradient Descent also scales well as the number of features grows.
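A minimal sketch of the idea for linear regression, using NumPy (the data, learning rate, and iteration count are illustrative assumptions, not part of the original answer):

```python
import numpy as np

# Synthetic data: y = X @ true_w + noise (illustrative assumption)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(2)        # parameters to learn
lr = 0.1               # learning rate (arbitrary choice)

for _ in range(500):
    # "Batch": the gradient uses the ENTIRE training set at every step
    grad = (2.0 / len(X)) * X.T @ (X @ w - y)
    w -= lr * grad

print(w)  # should be close to true_w
```

Note that the line computing `grad` touches all 100 rows of `X` on every iteration; with millions of rows, this single step is what makes Batch Gradient Descent expensive compared to stochastic or mini-batch variants.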
