asked in Machine Learning
Do gradient descent methods always converge to the same point?

1 Answer

0 votes

 No, they do not. On a non-convex objective, gradient descent can settle into a local minimum (or other stationary point) rather than the global optimum. Which point it reaches depends on the shape of the loss surface, the starting point, and the learning rate.
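A minimal sketch of this behavior: the function and learning rate below are illustrative choices, not from the original answer. Plain gradient descent on the non-convex function f(x) = x⁴ − 3x² + x ends up in a different minimum depending on the starting point.

```python
def grad(x):
    # Derivative of f(x) = x**4 - 3*x**2 + x, a "double well"
    # with two distinct local minima separated by a barrier.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=500):
    """Run fixed-step gradient descent from starting point x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

left = gradient_descent(-2.0)   # starts left of the barrier
right = gradient_descent(2.0)   # starts right of the barrier
print(left, right)              # two different stationary points
```

Both runs stop where the gradient is essentially zero, yet they land in different minima, which is exactly why the starting conditions matter.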
