Regularized logistic regression with vectorization

Submitted by 谁说我不能喝 on 2021-01-07 02:41:45

Question


I'm trying to implement a vectorized version of regularized logistic regression. I found a post that explains the regularized version, but I don't understand it.

To make it easy, I will copy the code below:

hx = sigmoid(X * theta);   % hypothesis h_theta(x) for every example, (m x 1)
m = size(X, 1);            % number of examples; note length(X) returns the largest dimension, which only equals m when m >= n+1
J = (sum(-y' * log(hx) - (1 - y') * log(1 - hx)) / m) + lambda * sum(theta(2:end).^2) / (2*m);
grad = ((hx - y)' * X / m)' + lambda .* theta .* [0; ones(length(theta)-1, 1)] ./ m;   % the [0; ones(...)] mask leaves theta_0 unregularized
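
To see this run end to end, here is a minimal self-contained sketch; the sigmoid definition and all data values here are made up purely for illustration:

sigmoid = @(z) 1 ./ (1 + exp(-z));   % logistic function

m = 5; n = 2;                        % made-up sizes: 5 examples, 2 features
X = [ones(m, 1), randn(m, n)];       % design matrix with intercept column, (m x (n+1))
y = [0; 1; 0; 1; 1];                 % made-up labels, (m x 1)
theta = zeros(n + 1, 1);             % parameters, ((n+1) x 1)
lambda = 1;

hx = sigmoid(X * theta);
J = (sum(-y' * log(hx) - (1 - y') * log(1 - hx)) / m) ...
    + lambda * sum(theta(2:end).^2) / (2*m);
grad = ((hx - y)' * X / m)' + lambda .* theta .* [0; ones(length(theta)-1, 1)] ./ m;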

I understand the first part of the cost equation. If I'm correct, it could be represented as:

J = ((-y' * log(hx)) - ((1-y)' * log(1-hx)))/m; 

The problem is the regularization term. Let's look at it in more detail:

Dimensions:

X = (m x (n+1))
theta = ((n+1) x 1)

I don't understand why he leaves the first element of theta (theta_0) out of the equation, when in theory the regularization term is:

$\frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$

and it has to take into account all the thetas.

For gradient descent, I think this equation is equivalent:

L = eye(length(theta));   % identity matrix, ((n+1) x (n+1))
L(1, 1) = 0;              % zero the entry for theta_0 so the intercept is not regularized

grad = (1/m) * X' * (hx - y) + (lambda/m) * (L * theta);

Answer 1:


I'm also new here...

In Matlab, indices begin at 1, while in the mathematical notation they begin at 0: the theta in the formula you mentioned is indexed theta_0, theta_1, ..., theta_n, so theta(1) in Matlab corresponds to theta_0 in the math.

So, in theory, the first element of theta does need to be left out of the equation: the regularization sum runs from j = 1 to n, which excludes theta_0, and in Matlab that subvector is theta(2:end).
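
To make the off-by-one concrete, here is a small illustration (the values are made up):

theta = [0.5; 1.2; -0.3];   % Matlab indexing: theta(1), theta(2), theta(3)
                            % math notation:   theta_0,  theta_1,  theta_2
theta(2:end)                % -> [1.2; -0.3], i.e. theta_1..theta_n, the part that gets regularized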

And as for your second question, you're right! It is an equivalent, cleaner equation.
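
A quick numerical check supports this. The following sketch uses made-up data and an inline sigmoid definition, and compares the two gradient expressions:

sigmoid = @(z) 1 ./ (1 + exp(-z));
m = 4; n = 3;
X = [ones(m, 1), randn(m, n)];
y = [1; 0; 1; 0];
theta = randn(n + 1, 1);
lambda = 0.5;
hx = sigmoid(X * theta);

% masked-vector form from the original post
grad1 = ((hx - y)' * X / m)' + lambda .* theta .* [0; ones(length(theta)-1, 1)] ./ m;

% L-matrix form proposed in the question
L = eye(length(theta));
L(1, 1) = 0;
grad2 = (1/m) * X' * (hx - y) + (lambda/m) * (L * theta);

max(abs(grad1 - grad2))   % -> effectively zero (on the order of machine precision)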



Source: https://stackoverflow.com/questions/64030007/regularized-logistic-regresion-with-vectorization
