I'm trying to implement a simple regularized logistic regression algorithm in Julia. I'd like to use the Optim.jl library to minimize my cost function, but I can't get it to work.
Below are my cost and gradient computation functions for logistic regression, written with closures and currying (a version for those who are used to a function that returns both the cost and the gradient):
function cost_gradient(θ, X, y, λ)
    m = length(y)
    # Cost closure: regularized cross-entropy; θ[1] (the bias term) is not regularized
    cost = (θ::Array) -> begin
        h = sigmoid(X * θ)
        (1 / m) * sum(-y .* log.(h) .- (1 .- y) .* log.(1 .- h)) + λ / (2 * m) * sum(θ[2:end] .^ 2)
    end
    # In-place gradient closure; current Optim.jl expects the signature g!(storage, θ)
    gradient! = (storage::Array, θ::Array) -> begin
        h = sigmoid(X * θ)
        storage .= (1 / m) * (X' * (h .- y)) .+ (λ / m) * [0; θ[2:end]]
    end
    return cost, gradient!
end
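Both closures recompute sigmoid(X * θ) on every call. If that bothers you, Optim.jl also accepts a single fused objective via Optim.only_fg!, which evaluates the cost and gradient in one pass. Below is a minimal sketch; the names make_fg! and fg! are mine, and it assumes Optim is loaded and that X, y, λ, initialθ, and ITERATIONS are set up as in the usage section further down:
function make_fg!(X, y, λ)
    m = length(y)
    # Optim calls this as fg!(F, G, θ): fill G in place if requested, return the cost if requested
    return (F, G, θ) -> begin
        h = sigmoid(X * θ)
        if G !== nothing
            G .= (1 / m) * (X' * (h .- y)) .+ (λ / m) * [0; θ[2:end]]
        end
        if F !== nothing
            return (1 / m) * sum(-y .* log.(h) .- (1 .- y) .* log.(1 .- h)) + λ / (2 * m) * sum(θ[2:end] .^ 2)
        end
        nothing
    end
end

fg! = make_fg!(X, y, λ)
res = optimize(Optim.only_fg!(fg!), initialθ, ConjugateGradient(), Optim.Options(iterations = ITERATIONS))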
Sigmoid function implementation:
sigmoid(z) = 1.0 ./ (1.0 .+ exp.(-z)) # works elementwise on scalars and arrays
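A quick sanity check of the values this formula produces:
sigmoid(0.0)            # 0.5
sigmoid([-5.0, 0.0, 5.0]) # ≈ [0.0067, 0.5, 0.9933]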
To use cost_gradient with Optim.jl, do the following:
using Optim
#...
# Prerequisites:
# X size is (m,d), where d is the number of training set features
# y is a length-m vector of 0/1 labels (use vec(y) if it is an (m,1) matrix)
# λ is the regularization parameter, e.g. 1.5
# ITERATIONS is the maximum number of iterations, e.g. 1000
X = [ones(size(X, 1)) X]      # add the x_0 = 1.0 column; now X size is (m,d+1)
initialθ = zeros(size(X, 2))  # initialθ is a length-(d+1) vector
cost, gradient! = cost_gradient(initialθ, X, y, λ)
res = optimize(cost, gradient!, initialθ, ConjugateGradient(), Optim.Options(iterations = ITERATIONS))
θ = Optim.minimizer(res)
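If the optimizer seems to misbehave, it helps to inspect the result object before using θ; these accessors are part of Optim.jl's result API:
Optim.converged(res)   # true if a convergence criterion was met
Optim.iterations(res)  # number of iterations actually performed
Optim.minimum(res)     # final value of the cost function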
Now you can easily predict (e.g., validate on the training set):
predictions = sigmoid(X * θ) #X size is (m,d+1)
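To turn those probabilities into hard 0/1 predictions and get a rough training accuracy, one option is the sketch below (it assumes y holds 0/1 labels; the 0.5 threshold is just a common default):
using Statistics  # for mean
predicted_labels = predictions .>= 0.5          # Bool vector: true means class 1
train_accuracy = mean(predicted_labels .== y)   # fraction of correctly classified examples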
Try this approach, or compare it against your own implementation.