Why use gradient descent when we can solve linear regression analytically?

走了就别回头了 2021-01-29 22:16

What is the benefit of using Gradient Descent for linear regression? It looks like we can solve the problem (finding the theta0..n that minimize the cost function) analytically.

4 Answers
  •  长发绾君心
    2021-01-29 23:11

    When you use the normal equations for solving the cost function analytically you have to compute:

    theta = (X^T X)^(-1) X^T y

    Where X is your matrix of input observations and y your output vector. The problem with this operation is the time complexity of inverting the n×n matrix X^T X (n being the number of features), which is O(n^3); as n increases it can take a very long time to finish.
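
    The normal-equation solve can be sketched in NumPy as below; the synthetic X, y, and true_theta here are assumptions made for illustration, not from the question. Solving the linear system directly is preferred over forming an explicit inverse:

```python
import numpy as np

# Illustrative sketch of the normal equations (synthetic data, assumed shapes).
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.standard_normal((100, 2))])  # bias column + 2 features
true_theta = np.array([1.0, 2.0, -3.0])  # hypothetical ground-truth parameters
y = X @ true_theta

# Solve (X^T X) theta = X^T y; np.linalg.solve avoids explicitly inverting X^T X.
theta = np.linalg.solve(X.T @ X, X.T @ y)
```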

    When n is low (say n < 1000 or n < 10000) the normal equations can be the better option for calculating theta; for larger n, however, Gradient Descent is much faster, so the main reason to use it is computation time :)
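
    For comparison, here is a minimal batch-gradient-descent sketch for the same least-squares problem; the learning rate, iteration count, and synthetic data are illustrative assumptions, not tuned values:

```python
import numpy as np

# Same synthetic problem as the normal-equation case (assumed data).
rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.standard_normal((100, 2))])
true_theta = np.array([1.0, 2.0, -3.0])  # hypothetical ground-truth parameters
y = X @ true_theta

m = len(y)
theta = np.zeros(3)
alpha = 0.1          # learning rate (illustrative choice)
for _ in range(2000):
    # Gradient of the cost (1/2m) * ||X theta - y||^2 with respect to theta.
    grad = (X.T @ (X @ theta - y)) / m
    theta -= alpha * grad
```

Each iteration costs only matrix-vector products, so no n×n matrix ever has to be inverted, which is the advantage the answer describes for large n.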
