Q:
Do gradient descent methods always converge to the same point?

A:

No, they do not. Gradient descent is only guaranteed to reach the global optimum when the loss function is convex; on a non-convex loss it can settle in a local minimum (or another local optimum) instead of the global one. Which point it reaches depends on the data and the starting conditions, i.e. the initialization, as the sketch below illustrates.
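A minimal sketch in plain Python (the function f, learning rate, and step count are illustrative choices, not from the original answer): running the same gradient descent procedure on a non-convex function from two different starting points ends in two different minima.

```python
def f(x):
    # A non-convex function with two local minima
    # (near x ~ -1.30 and x ~ 1.13).
    return x**4 - 3 * x**2 + x

def grad_f(x):
    # Analytic derivative of f.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=500):
    # Plain gradient descent: repeatedly step against the gradient.
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

for start in (-2.0, 2.0):
    x_final = gradient_descent(start)
    print(f"start={start:+.1f} -> x={x_final:+.4f}, f(x)={f(x_final):+.4f}")
# The two runs settle in different minima, and only the run started
# at -2.0 finds the global one, so the final point depends on the
# initialization.
```

Both runs use the same learning rate and step count; only the starting point differs, which is exactly why the answer is "no" for non-convex losses.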
