Nov 28, 2019 in Machine Learning
Q: Do gradient descent methods always converge to the same point?

1 Answer

0 votes
Nov 28, 2019

No, they do not, because gradient descent can converge to a local minimum (or get stuck near a saddle point) instead of the global minimum. For a convex loss function, every local minimum is the global minimum, so all runs reach the same point; for a non-convex function, the point you end up at depends on the data, the initialization (starting point), and the learning rate, as the sketch below illustrates.
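
To see this concretely, here is a minimal Python sketch of plain gradient descent on a toy non-convex function with two local minima. The function f, the learning rate, and the step count are illustrative choices made for this example, not anything from the question itself:

```python
import numpy as np

def f(x):
    # Toy non-convex function with two local minima
    return x**4 - 3 * x**2 + x

def grad_f(x):
    # Derivative of f
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=1000):
    # Plain gradient descent from starting point x0
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# Two different starting points end up in two different minima
print(gradient_descent(-2.0))  # ~ -1.30, one local minimum
print(gradient_descent(2.0))   # ~  1.13, a different local minimum
```

Both runs use the same function and the same learning rate; only the starting point differs, yet they converge to different points. This is why, in practice, people run training with several random initializations or use techniques such as momentum or random restarts when the loss surface is non-convex.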
