What is the difference between Gradient Descent and Newton's Gradient Descent?


I understand what Gradient Descent does. Basically it tries to move towards the local optimal solution by slowly moving down the curve. I am trying to understand what the actual difference is between it and Newton's method.

4 Answers
  •  粉色の甜心
    2021-01-29 23:26

    If you simply compare Gradient Descent and Newton's method, the purposes of the two methods are different.

    Gradient Descent is used to find (approximately) a local maximum or minimum of a function, i.e. an x that minimizes or maximizes f(x). Newton's method, by contrast, is used to find (approximately) a root of a function, i.e. an x such that f(x) = 0.
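    A minimal sketch of that distinction (the function names, the quadratic f(x) = (x-3)², and the root target x² - 2 = 0 are my own illustrative choices, not from the question):

    ```python
    def gradient_descent(grad, x, lr=0.1, steps=100):
        """Minimize f by repeatedly stepping against its gradient."""
        for _ in range(steps):
            x -= lr * grad(x)
        return x

    def newton_root(f, df, x, steps=20):
        """Find a root of f (an x with f(x) = 0) via Newton's iteration."""
        for _ in range(steps):
            x -= f(x) / df(x)
        return x

    # GD minimizes f(x) = (x - 3)^2 using its gradient 2(x - 3):
    x_min = gradient_descent(lambda x: 2 * (x - 3), 0.0)   # → close to 3

    # Newton's method finds a root of f(x) = x^2 - 2:
    root = newton_root(lambda x: x * x - 2, lambda x: 2 * x, 1.0)  # → close to sqrt(2)
    ```

    Note that GD only ever consumes the gradient of the objective, while Newton's iteration consumes the function whose zero is sought together with its derivative.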

    In this sense, they solve different problems. However, Newton's method can also be used in the context of optimization (the realm GD operates in), because finding a maximum or minimum can be approached by solving f'(x) = 0, which is exactly what Newton's method is designed for.

    In conclusion, two routes lead to an optimum: 1) run GD directly, or 2) solve f'(x) = 0, and Newton's method is simply one way to solve that second problem.
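    Route 2 amounts to running Newton's iteration on the derivative, which needs the second derivative as well. A sketch under my own choice of test function, f(x) = x - log(x), whose minimum is at x = 1 (f'(x) = 1 - 1/x, f''(x) = 1/x²):

    ```python
    def newton_minimize(d1, d2, x, steps=20):
        """Minimize f by applying Newton's root-finding to f'(x) = 0:
        each step is x -= f'(x) / f''(x)."""
        for _ in range(steps):
            x -= d1(x) / d2(x)
        return x

    # Minimum of f(x) = x - log(x) via its first and second derivatives:
    x_star = newton_minimize(lambda x: 1 - 1 / x,
                             lambda x: 1 / x ** 2,
                             0.5)   # → close to 1
    ```

    The price of the extra derivative is repaid in speed: near the optimum Newton converges quadratically, while plain GD with a fixed learning rate converges only linearly.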
