Nov 28, 2019 in Machine Learning

Q: Do gradient descent methods always converge to the same point?

1 Answer

Nov 28, 2019

No, they do not. On a non-convex objective, gradient descent can settle into a local minimum (or another local optimum) rather than the global optimum. Which point it reaches depends on the data and on the starting conditions (the initialization).
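A minimal sketch of this effect, using an illustrative one-dimensional function f(x) = x⁴/4 − x²/2, which has two local minima at x = −1 and x = +1 (the function, step size, and helper name are assumptions for demonstration, not part of the original answer):

```python
def gradient_descent(x0, lr=0.1, steps=1000):
    """Run plain gradient descent on f(x) = x**4/4 - x**2/2 from x0."""
    x = x0
    for _ in range(steps):
        grad = x**3 - x  # derivative of f(x) = x**4/4 - x**2/2
        x -= lr * grad
    return x

# Two different starting points converge to two different minima:
print(gradient_descent(2.0))   # converges near +1
print(gradient_descent(-2.0))  # converges near -1
```

Starting at x = 2 the method slides into the minimum at +1, while starting at x = −2 it ends up at −1: same algorithm, same function, different answers depending only on initialization.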

