Deepika Deepika

Job Interview Skills
English
2 years ago

Do gradient descent methods always converge to the same point?

Abhishek Mishra

2 years ago

No, gradient descent methods do not always converge to the same point. On a non-convex objective they can end up at different local minima (or other stationary points, such as saddle points), depending on the data, the initial values of the parameters, and the learning rate. Only when the objective is convex is the minimum unique, so that every run converges to the same point.
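A minimal sketch (not part of the original answer) that illustrates this: gradient descent on the non-convex function f(x) = x^4 - 4x^2 + x, which has two local minima. The function, step size, and starting points are chosen here purely for illustration.

```python
def grad(x):
    # Derivative of f(x) = x**4 - 4*x**2 + x
    return 4 * x**3 - 8 * x + 1

def gradient_descent(x0, lr=0.01, steps=1000):
    """Plain gradient descent from starting point x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Two different initializations land in two different local minima.
print(gradient_descent(-2.0))  # converges near x ≈ -1.47 (left minimum)
print(gradient_descent(+2.0))  # converges near x ≈ +1.35 (right minimum)
```

Both runs use the same algorithm and the same learning rate; only the initial value differs, yet the final points differ, which is exactly why gradient descent on non-convex problems is usually run from several random initializations.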
