Partial Derivative using Autograd

名媛妹妹 · 2021-02-11 03:24

I have a function that takes a multivariate argument x, where x = [x1, x2, x3]. Let's say my function looks like f(x, T) = np.dot(x, T) + np.exp(np.dot(x, T)), where T is a constant. How do I compute its partial derivatives using autograd?
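
In code, a minimal sketch of such a function might look like this (assuming T is a constant vector of the same length as x, which the question does not spell out):

    import autograd.numpy as np

    def f(x, T):
        # scalar-valued: x . T + exp(x . T)
        return np.dot(x, T) + np.exp(np.dot(x, T))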

1 Answer
  •  我在风中等你
    2021-02-11 04:09

    I found the following description of the grad function in the autograd source code:

    def grad(fun, argnum=0):
        """Returns a function which computes the gradient of `fun` with
        respect to positional argument number `argnum`. The returned
        function takes the same arguments as `fun`, but returns the
        gradient instead. The function `fun` should be scalar-valued.
        The gradient has the same type as the argument."""
    

    So

    import autograd.numpy as np
    from autograd import grad

    def h(x, t):
        return np.dot(x, t) + np.exp(np.dot(x, t))

    h_x = grad(h, 0)  # gradient of h with respect to x
    h_t = grad(h, 1)  # gradient of h with respect to t
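
    For example (with made-up inputs), the returned gradient functions are called just like h itself, and each result has the same shape as the argument being differentiated:

    x = np.array([1.0, 2.0, 3.0])
    t = np.array([0.1, 0.2, 0.3])
    print(h_x(x, t))  # gradient w.r.t. x, same shape as x
    print(h_t(x, t))  # gradient w.r.t. t, same shape as t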
    

    Also, make sure to use the numpy library that comes with autograd:

    import autograd.numpy as np
    

    instead of

    import numpy as np
    

    so that autograd can differentiate through the numpy functions you call.
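
    As a quick sanity check (a sketch with made-up values), the analytic gradient of h with respect to x is t * (1 + exp(np.dot(x, t))), so autograd's result can be compared against it directly:

    import autograd.numpy as np
    from autograd import grad

    def h(x, t):
        return np.dot(x, t) + np.exp(np.dot(x, t))

    h_x = grad(h, 0)

    x = np.array([1.0, 2.0, 3.0])
    t = np.array([0.1, 0.2, 0.3])

    analytic = t * (1 + np.exp(np.dot(x, t)))  # d/dx of x.t + exp(x.t)
    assert np.allclose(h_x(x, t), analytic)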
