Compute the Jacobian matrix in Python

Asked 2021-02-02 01:24
import numpy as np


a = np.array([[1,2,3],
              [4,5,6],
              [7,8,9]])


b = np.array([[1,2,3]]).T

c = a.dot(b)  # the function: c(b) = a @ b

jacobian = a  # for this linear function, the Jacobian of c with respect to b is the matrix a
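
For a linear function like this, c(b) = a @ b, the Jacobian with respect to b is indeed just the constant matrix a. A minimal finite-difference check of that fact (my own sketch, plain NumPy):

def numeric_jacobian(f, x0, eps=1e-6):
    # forward-difference approximation, built column by column
    f0 = f(x0)
    return np.column_stack([(f(x0 + eps * e) - f0) / eps for e in np.eye(len(x0))])

f = lambda x: a.dot(x)
x0 = np.array([1.0, 2.0, 3.0])
print(np.allclose(numeric_jacobian(f, x0), a))  # True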


        
6 Answers
  • 2021-02-02 01:33

    You can use the Harvard autograd library (link), where grad and jacobian take a function as their argument:

    import autograd.numpy as np
    from autograd import grad, jacobian
    
    x = np.array([5,3], dtype=float)
    
    def cost(x):
        return x[0]**2 / x[1] - np.log(x[1])
    
    gradient_cost = grad(cost)
    jacobian_cost = jacobian(cost)
    
    gradient_cost(x)
    jacobian_cost(np.array([x,x,x]))
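
    Since cost above is scalar-valued, jacobian_cost carries essentially the same information as gradient_cost. For a genuinely vector-valued function the Jacobian becomes a matrix; a small sketch with the same autograd API (F is my own example function):

    def F(x):
        # maps R^2 -> R^3, so its Jacobian at a point is a 3x2 matrix
        return np.array([x[0]**2, x[0] * x[1], np.sin(x[1])])

    jacobian(F)(x)  # array of shape (3, 2) holding the partial derivatives dF_i/dx_j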
    

    Otherwise, you could use the jacobian method available for matrices in sympy:

    from sympy import sin, cos, Matrix
    from sympy.abc import rho, phi
    
    X = Matrix([rho*cos(phi), rho*sin(phi), rho**2])
    Y = Matrix([rho, phi])
    
    X.jacobian(Y)
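
    X.jacobian(Y) returns a symbolic matrix. If you also need to evaluate it numerically, one option (a sketch, using sympy.lambdify) is:

    from sympy import lambdify

    J_func = lambdify((rho, phi), X.jacobian(Y), 'numpy')
    J_func(1.0, 0.5)  # 3x2 NumPy array: the Jacobian evaluated at rho=1, phi=0.5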
    

    You may also be interested in this low-level variant (link). MATLAB provides nice documentation on its jacobian function here.

  • 2021-02-02 01:48

    Here is a Python implementation of a forward-difference approximation to the Jacobian of a vector function f(x), which is assumed to take and return a 1-D numpy array.

    import numpy as np
    
    def J(f, x, dx=1e-8):
        """Forward-difference approximation of the Jacobian of f at x."""
        n = len(x)
        func = f(x)
        jac = np.zeros((len(func), n))  # allow f: R^n -> R^m, not just square
        for j in range(n):  # iterate over columns so each column can be assigned at once
            Dxj = (abs(x[j])*dx if x[j] != 0 else dx)  # scale the step with |x[j]|
            x_plus = np.array([(xi if k != j else xi + Dxj) for k, xi in enumerate(x)])
            jac[:, j] = (f(x_plus) - func)/Dxj
        return jac
    

    It is recommended to make dx ≈ 1e-8.
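
    A quick usage sketch (f and x0 are my own test function and point), comparing the approximation with the exact Jacobian of a simple R^2 -> R^2 map:

    def f(x):
        return np.array([x[0]**2 * x[1], 5 * x[0] + np.sin(x[1])])

    x0 = np.array([1.0, 2.0])
    J(f, x0)
    # exact Jacobian at (1, 2) is [[2*x0*x1, x0**2], [5, cos(x1)]] = [[4, 1], [5, -0.4161...]]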

  • 2021-02-02 01:51

    In Python 3, you can try the sympy package:

    import sympy as sym
    
    def Jacobian(v_str, f_list):
        vars = sym.symbols(v_str)
        f = sym.sympify(f_list)
        J = sym.zeros(len(f),len(vars))
        for i, fi in enumerate(f):
            for j, s in enumerate(vars):
                J[i,j] = sym.diff(fi, s)
        return J
    
    Jacobian('u1 u2', ['2*u1 + 3*u2','2*u1 - 3*u2'])
    

    which gives:

    Matrix([[2,  3],[2, -3]])
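
    Note that SymPy's Matrix class already ships a jacobian method, so the same result can be obtained in one call (an equivalent sketch):

    import sympy as sym

    u1, u2 = sym.symbols('u1 u2')
    f = sym.Matrix(sym.sympify(['2*u1 + 3*u2', '2*u1 - 3*u2']))
    f.jacobian(sym.Matrix([u1, u2]))  # Matrix([[2, 3], [2, -3]])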
    
  • 2021-02-02 01:53

    If you want to find the Jacobian numerically for many points at once (for example, if your function accepts input of shape (n, x) and outputs shape (n, y)), here is a function. This is essentially the finite-difference answer above, but vectorised over many points. The dx may need to be scaled by the absolute value of x, as in that answer.

    import numpy as np

    def numerical_jacobian(f, xs, dx=1e-6):
        """
        f is a function that accepts input of shape (n_points, input_dim)
        and outputs shape (n_points, output_dim).

        Returns the Jacobians as an array of shape (n_points, output_dim, input_dim).
        """
        if xs.ndim == 1:
            xs = xs[np.newaxis, :]

        assert xs.ndim == 2

        ys = f(xs)

        x_dim = xs.shape[1]
        y_dim = ys.shape[1]

        jac = np.empty((xs.shape[0], y_dim, x_dim))

        for i in range(x_dim):
            x_try = xs + dx * e(x_dim, i)      # perturb the i-th input coordinate
            jac[:, :, i] = (f(x_try) - ys) / dx

        return jac

    def e(n, i):
        """Unit basis vector of length n with a 1 at index i."""
        ret = np.zeros(n)
        ret[i] = 1.0
        return ret
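
    A usage sketch (g is my own example function), computing the Jacobians of a simple (n, 2) -> (n, 2) map at two points in one call:

    def g(xs):
        # xs has shape (n_points, 2) and the output has shape (n_points, 2)
        return np.stack([xs[:, 0]**2 * xs[:, 1],
                         5 * xs[:, 0] + np.sin(xs[:, 1])], axis=1)

    xs = np.array([[1.0, 2.0],
                   [0.5, 1.0]])
    numerical_jacobian(g, xs).shape  # (2, 2, 2): one 2x2 Jacobian per input point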
    
  • 2021-02-02 01:55

    The Jacobian is only defined for vector-valued functions. You cannot work with arrays filled with constants to calculate the Jacobian; you must know the underlying function and its partial derivatives, or the numerical approximation of these. This is obvious when you consider that the (partial) derivative of a constant (with respect to something) is 0.

    In Python, you can work with symbolic math modules such as SymPy or SymEngine to calculate Jacobians of functions. Here's a simple demonstration of an example from Wikipedia:

    Using the SymEngine module:

    Python 2.7.11 (v2.7.11:6d1b6a68f775, Dec  5 2015, 20:40:30) [MSC v.1500 64 bit (AMD64)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>>
    >>> import symengine
    >>>
    >>>
    >>> vars = symengine.symbols('x y') # Define x and y variables
    >>> f = symengine.sympify(['y*x**2', '5*x + sin(y)']) # Define function
    >>> J = symengine.zeros(len(f),len(vars)) # Initialise Jacobian matrix
    >>>
    >>> # Fill Jacobian matrix with entries
    ... for i, fi in enumerate(f):
    ...     for j, s in enumerate(vars):
    ...         J[i,j] = symengine.diff(fi, s)
    ...
    >>> print J
    [2*x*y, x**2]
    [5, cos(y)]
    >>>
    >>> print symengine.Matrix.det(J)
    2*x*y*cos(y) - 5*x**2
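
    To get numbers out of the symbolic result you can substitute values entry by entry (a sketch; it assumes symengine expressions accept a dict in subs, which the Python bindings provide):

    point = {vars[0]: 1.0, vars[1]: 2.0}   # x = 1, y = 2
    [[J[i, j].subs(point) for j in range(2)] for i in range(2)]
    # the Jacobian entries evaluated at (x, y) = (1, 2)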
    
  • 2021-02-02 01:55

    While autograd is a good library, make sure to check out its successor JAX, which is very well documented (compared to autograd).

    A simple example:

    import jax.numpy as jnp
    from jax import jacfwd
    
    # Define some simple function.
    def sigmoid(x):
        return 0.5 * (jnp.tanh(x / 2) + 1)
    # Note that here I want the derivative of a "vector"-valued function (inputs @ a + b is a vector)
    # with respect to an input vector a at a0: the derivative of a vector w.r.t. another vector is a matrix, the Jacobian.
    def simpleJ(a, b, inputs): #inputs is a matrix, a & b are vectors
        return sigmoid(jnp.dot(inputs, a) + b)
    
    inputs = jnp.array([[0.52, 1.12,  0.77],
                        [0.88, -1.08, 0.15],
                        [0.52, 0.06, -1.30],
                        [0.74, -2.49, 1.39]])
    
    b = jnp.array([0.2, 0.1, 0.3, 0.2])
    a0 = jnp.array([0.1,0.7,0.7])
    
    # Isolate the variable to be differentiated from the constant parameters
    f = lambda a: simpleJ(a, b, inputs)  # f is now a function of a only, the variable to be differentiated
    
    J = jacfwd(f)
    # So far we have only built the derivative function; it still needs to be evaluated at a0.
    J(a0)
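
    jacfwd builds the Jacobian with forward-mode autodiff; JAX also provides jax.jacrev (reverse mode), which gives the same matrix and is typically cheaper when a function has many more inputs than outputs. A quick cross-check on the same example:

    from jax import jacrev

    jnp.allclose(J(a0), jacrev(f)(a0))  # True: both give the 4x3 Jacobian of f at a0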
    