For a linear map c = a.dot(b), the Jacobian of c with respect to b is just the matrix a:

import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
b = np.array([[1, 2, 3]]).T

c = a.dot(b)   # c = a @ b, a vector-valued function of b
jacobian = a   # each partial dc_i/db_j is simply a[i, j]
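As a quick sanity check (a minimal sketch using only NumPy, not part of the original snippet), finite differences on c = a.dot(b) recover a as the Jacobian, since the map is linear:

def f(v):
    return a.dot(v)  # linear map, so its Jacobian is a everywhere

eps = 1e-6
cols = []
for i in range(3):
    db = np.zeros((3, 1))
    db[i] = eps
    cols.append((f(b + db) - f(b)) / eps)  # i-th Jacobian column
numeric_jacobian = np.hstack(cols)
assert np.allclose(numeric_jacobian, a)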
You can use the Harvard autograd library (link), where grad and jacobian take a function as their argument:
import autograd.numpy as np
from autograd import grad, jacobian

x = np.array([5, 3], dtype=float)

def cost(x):
    return x[0]**2 / x[1] - np.log(x[1])

gradient_cost = grad(cost)
jacobian_cost = jacobian(cost)

gradient_cost(x)                    # gradient of the scalar cost at x
jacobian_cost(np.array([x, x, x]))  # Jacobian at a stacked (3, 2) input, where cost is vector-valued
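For reference (a sketch added here, not part of the original answer), the autograd gradient can be checked against the hand-derived partials d(cost)/dx0 = 2*x0/x1 and d(cost)/dx1 = -x0**2/x1**2 - 1/x1:

analytic = np.array([2*x[0]/x[1], -x[0]**2/x[1]**2 - 1/x[1]])
assert np.allclose(gradient_cost(x), analytic)  # both ≈ [3.3333, -3.1111] at x = [5, 3]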
Otherwise, you could use the jacobian method available for matrices in sympy:
from sympy import sin, cos, Matrix
from sympy.abc import rho, phi

X = Matrix([rho*cos(phi), rho*sin(phi), rho**2])  # vector-valued expression
Y = Matrix([rho, phi])                            # variables to differentiate against
X.jacobian(Y)
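To get numbers out of the symbolic result, you can substitute values or lambdify it (a sketch using standard sympy calls; the sample point rho=2, phi=0 is arbitrary):

from sympy import lambdify

J = X.jacobian(Y)
print(J.subs({rho: 2, phi: 0}))            # Matrix([[1, 0], [0, 2], [4, 0]])

J_func = lambdify((rho, phi), J, 'numpy')  # compile to a NumPy-callable function
print(J_func(2.0, 0.0))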
You may also be interested in this low-level variant (link). MATLAB provides nice documentation on its jacobian function here.