Do gradient descent methods always converge to the same point?

1 Answer

No, they do not. On a non-convex loss surface, gradient descent can settle into a local minimum (a local optimum) rather than the global optimum. Which point it ends up at depends on the shape of the loss function, the data, and the starting conditions, such as the initialization and the learning rate.
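To see this concretely, here is a minimal sketch in plain Python: the same gradient descent routine, run with the same learning rate, lands in different minima purely because of the starting point. The function f(x) = x^4 - 3x^2 + x is a hypothetical example chosen for illustration, not something from the original question.

```python
# Minimal sketch (plain Python, no ML framework): gradient descent on the
# non-convex function f(x) = x**4 - 3*x**2 + x, which has a global minimum
# near x ~ -1.30 and a shallower local minimum near x ~ +1.13.

def grad(x):
    """Derivative of f(x) = x**4 - 3*x**2 + x."""
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=2000):
    """Run plain gradient descent from starting point x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Same algorithm, same learning rate -- only the starting point differs.
for x0 in (-2.0, 2.0):
    print(f"start = {x0:+.1f}  ->  converges to x = {gradient_descent(x0):+.4f}")

# Starting at -2.0 ends near the global minimum (~ -1.30);
# starting at +2.0 gets stuck in the local minimum (~ +1.13).
```

For convex functions (e.g. linear or logistic regression with a convex loss), gradient descent with a suitably small learning rate does converge to the single global optimum regardless of initialization; the dependence on starting conditions is a feature of non-convex problems such as neural network training.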
