derivative

LibGDX CatmullRomSpline Derivative Meaning?

前提是你 submitted on 2020-01-07 03:04:05
Question: When calling the derivative method on a CatmullRomSpline, what exactly does the returned value represent? It’s been a while since calculus, but if my memory serves me correctly, taking the derivative of position with respect to time gives the velocity at that point in time. But with CatmullRomSpline the “time” value is a percentage, so is the resulting derivative in pixels per percent? I printed out the derivative values (vector length) along my path and the values go as high as “989.6049”,
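For context, the derivative of a uniform Catmull-Rom segment is measured in world units (e.g. pixels) per unit of the parameter t in [0, 1], not per second, which is why the magnitudes can be large. A small NumPy sketch of the standard uniform Catmull-Rom basis (an illustration, not LibGDX's actual implementation) shows this, checking the analytic derivative against a central finite difference:

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    # standard uniform Catmull-Rom basis; t runs over [0, 1] on the segment
    t2, t3 = t * t, t * t * t
    return 0.5 * (2 * p1 + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)

def catmull_rom_deriv(p0, p1, p2, p3, t):
    # analytic d/dt of the polynomial above: world units per unit of t
    t2 = t * t
    return 0.5 * ((-p0 + p2)
                  + 2 * (2 * p0 - 5 * p1 + 4 * p2 - p3) * t
                  + 3 * (-p0 + 3 * p1 - 3 * p2 + p3) * t2)

# hypothetical control points in pixel coordinates
pts = [np.array(p, float) for p in [(0, 0), (100, 0), (200, 150), (300, 150)]]
d = catmull_rom_deriv(*pts, 0.5)

# cross-check against a central finite difference in t
h = 1e-6
fd = (catmull_rom(*pts, 0.5 + h) - catmull_rom(*pts, 0.5 - h)) / (2 * h)
print(np.linalg.norm(d), np.linalg.norm(d - fd))
```

Because one whole segment is traversed as t goes from 0 to 1, a derivative magnitude of several hundred pixels per unit t is entirely plausible; dividing by the real traversal time in seconds would convert it to pixels per second.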

Derivative of log plot in python

喜夏-厌秋 submitted on 2020-01-06 05:48:08
Question: We have the x and y values, and I take their logs with logx = np.log10(x) and logy = np.log10(y). I am trying to compute the derivative of logy with respect to logx, i.e. dlogy/dlogx. I used to do this successfully using numpy's gradient, specifically derivy = np.gradient(logy,np.gradient(logx)), but for some strange reason it no longer works, yielding the error: "Traceback (most recent call last): File "derivlog.py", line 79, in <module> grady = np.gradient(logy,np.gradient(logx)) File "
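Since NumPy 1.13, np.gradient accepts a coordinate array as its second argument rather than an array of spacings, which is likely why the old np.gradient(logy, np.gradient(logx)) idiom now raises an error. Passing the coordinates directly also handles uneven spacing correctly. A sketch with made-up power-law data:

```python
import numpy as np

# synthetic power-law data, so dlogy/dlogx should be exactly the exponent
x = np.array([1.0, 2.0, 5.0, 10.0, 30.0, 100.0])
y = x ** 2.5
logx, logy = np.log10(x), np.log10(y)

# pass the coordinate array itself (NumPy >= 1.13), not np.gradient(logx)
slope = np.gradient(logy, logx)
print(slope)  # ≈ 2.5 at every point
```

The second-order formula used for non-uniform coordinates is exact here because logy is linear in logx.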

MATLAB - tricky ode system with boolean

白昼怎懂夜的黑 submitted on 2020-01-04 14:11:22
Question: EDITED: Thanks for the upvotes; I have now added images, as well as the full M-file, although I don't think it was necessary. The key lines of the code are: xp(2)=...-((x(2)>=X2)&(xp(3)>=0)...; xp(3)=...-((x(3)>=X3)&(xp(2)>=0))...; Full code: function xp=uzdevums1(t,x) %parameter values r1 = 0.1; r2 = 1; r3 = 0.2; K1=100;K2 = 100; K3 = 100; X2=25;X3=10; speedx2 = 0.02; speedx3=0.02; %ode system xp=zeros(3,1); xp(1)=r1*(1-x(1)/(x(2)+x(3)))*x(1); xp(2)=r2*(1-x(2)/K2)*x(2)-((x(2)>=X2)&(xp(3)>=0)&xp(1)>0)*x(2)*x
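The same boolean-gating pattern can be sketched in Python with SciPy's solve_ivp. Note that the MATLAB code makes xp(2) depend on xp(3) and vice versa, which is circular; this sketch (with an assumed use of speedx2/speedx3, since the snippet is truncated) therefore gates on the ungated base rates instead:

```python
import numpy as np
from scipy.integrate import solve_ivp

# parameter values copied from the snippet
r1, r2, r3 = 0.1, 1.0, 0.2
K2, K3 = 100.0, 100.0
X2, X3 = 25.0, 10.0
speedx2, speedx3 = 0.02, 0.02

def rhs(t, x):
    # compute the ungated logistic rates first...
    g1 = r1 * (1 - x[0] / (x[1] + x[2])) * x[0]
    g2 = r2 * (1 - x[1] / K2) * x[1]
    g3 = r3 * (1 - x[2] / K3) * x[2]
    # ...then apply the boolean switches to those base rates; this breaks
    # the circular xp(2) <-> xp(3) reference in the original MATLAB code
    xp2 = g2 - float((x[1] >= X2) and (g3 >= 0) and (g1 > 0)) * speedx2 * x[1]
    xp3 = g3 - float((x[2] >= X3) and (g2 >= 0)) * speedx3 * x[2]
    return [g1, xp2, xp3]

# hypothetical initial conditions; max_step limits overshoot at the switches
sol = solve_ivp(rhs, (0.0, 50.0), [10.0, 30.0, 20.0], max_step=0.5)
print(sol.y[:, -1])
```

Because the right-hand side is discontinuous at the thresholds, a bounded step size (or event detection) keeps the integrator from stepping blindly across a switch.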

Calculating a 3D gradient with unevenly spaced points

青春壹個敷衍的年華 submitted on 2020-01-04 04:13:08
Question: I currently have a volume spanned by a few million unevenly spaced particles, and each particle has an attribute (the potential, for those who are curious) from which I want to calculate the local force (acceleration). np.gradient only works with evenly spaced data; I looked at Second order gradient in numpy, where interpolation is necessary, but I could not find a 3D spline implementation in NumPy. Some code that will produce representative data: import numpy as np from scipy.spatial
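One common approach for scattered 3D data is a moving least-squares fit: for each particle, fit a local linear model of the potential over its k nearest neighbours and read the gradient off the fitted coefficients. A sketch on synthetic data (the quadratic test potential and point count are assumptions for illustration, not the poster's data):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, (2000, 3))
phi = (pts ** 2).sum(axis=1)  # test potential x^2+y^2+z^2, so grad = 2*r

def local_gradient(points, values, k=20):
    # fit phi ≈ a + g·(r - r0) over the k nearest neighbours of each point;
    # the least-squares coefficient vector g estimates the gradient there
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    grads = np.empty_like(points)
    for i, nb in enumerate(idx):
        A = np.hstack([np.ones((k, 1)), points[nb] - points[i]])
        coef, *_ = np.linalg.lstsq(A, values[nb], rcond=None)
        grads[i] = coef[1:]
    return grads

g = local_gradient(pts, phi)
print(np.abs(g - 2 * pts).mean())  # error shrinks as sampling densifies
```

For millions of particles the per-point loop should be vectorised or batched, but the k-d-tree query itself scales well; the local fit also degrades gracefully near the domain boundary, where the neighbourhood is one-sided.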

Keras: calculating derivatives of model output wrt input returns [None]

牧云@^-^@ submitted on 2019-12-30 03:35:11
Question: I need help calculating the derivatives of a model's output with respect to its inputs in Keras. I want to add a regularization functional to the loss function, and the regularizer contains the derivative of the classifier function, so I tried to take the derivative of the model output. The model is an MLP with one hidden layer, and the dataset is MNIST. When I compile the model and take the derivative, I get [None] as the result instead of the derivative function. I have seen a similar post, but didn't get an answer there
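In TensorFlow 2, a [None] gradient usually means the differentiation target was never connected to the tape; for inputs (which are plain tensors, not Variables) the fix is tf.GradientTape with an explicit tape.watch. A minimal sketch, with an MNIST-sized MLP standing in for the poster's model:

```python
import tensorflow as tf

# a stand-in MLP like the one described: one hidden layer, 784-dim input
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10),
])

x = tf.random.normal((5, 784))
with tf.GradientTape() as tape:
    tape.watch(x)            # inputs are plain tensors, so watch explicitly
    y = model(x)
    s = tf.reduce_sum(y)     # reduce to a scalar before differentiating
grad = tape.gradient(s, x)   # a (5, 784) tensor rather than [None]
print(grad.shape)
```

In graph-mode Keras 1.x-style code the equivalent pitfall is calling K.gradients with respect to a tensor that is not on the path from input to output; the tape-based form above sidesteps that.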

Evaluate the second order differential equation in MATLAB

自作多情 submitted on 2019-12-25 03:33:08
Question: I am deriving a second-order differential equation in MATLAB. I have defined a time-dependent variable and then applied the following derivative operations: syms a b; th = sym('th(t)'); %th is a time dependent variable y = diff(a^2*cos(th)+(b/12)*sin(th)); thd = diff(th); %derivative of th wrt time ybythd = diff(y,thd); %derivative of y wrt thd p = diff(ybythd); %derivative of ybythd wrt time These operations calculate the value of p as follows: p = diff(diff((b*cos(th(t))*diff(th(t), t))/12 -
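The same chain of operations can be reproduced with SymPy, where th is declared as an undefined function of t and the intermediate derivative thd can itself be differentiated against, which keeps each step explicit:

```python
import sympy as sp

t, a, b = sp.symbols("t a b")
th = sp.Function("th")(t)      # time-dependent variable, as in the MATLAB code

y = sp.diff(a**2 * sp.cos(th) + (b / 12) * sp.sin(th), t)
thd = sp.diff(th, t)           # derivative of th w.r.t. time
ybythd = sp.diff(y, thd)       # derivative of y w.r.t. th'(t)
p = sp.diff(ybythd, t)         # then differentiate that result w.r.t. time
print(sp.simplify(p))
```

Here ybythd collapses to -a²·sin(th) + (b/12)·cos(th), so p evaluates symbolically rather than staying as a nested unevaluated diff(diff(...)) expression, which is often the sticking point in the MATLAB version.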

taking the gradient in Tensorflow, tf.gradient

人盡茶涼 submitted on 2019-12-24 18:23:32
Question: I am using this TensorFlow function to get the Jacobian of my function. I came across two problems. First, the TensorFlow documentation contradicts itself in the following two passages, if I am not mistaken: "gradients() adds ops to the graph to output the partial derivatives of ys with respect to xs. It returns a list of Tensor of length len(xs) where each tensor is the sum(dy/dx) for y in ys." versus "Returns: A list of sum(dy/dx) for each x in xs." According to my test, it
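The two quoted passages are in fact consistent: tf.gradients (and its tape equivalent, tape.gradient) returns, for each x, the sum of dy/dx over all entries of ys, so recovering a full Jacobian needs either one backward pass per output row or tape.jacobian. A small sketch contrasting the two on a toy R³ → R² function:

```python
import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0])
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    y = tf.stack([x[0] * x[1], x[1] * x[2]])  # y: R^3 -> R^2

# sum over the rows of the Jacobian (what tf.gradients also returns)
g = tape.gradient(y, x)    # [x1, x0 + x2, x1] = [2, 4, 2]
# the full 2x3 Jacobian, one row per output component
J = tape.jacobian(y, x)    # [[x1, x0, 0], [0, x2, x1]]
del tape
print(g.numpy(), J.numpy())
```

So the returned "gradient" of a vector-valued y is the row-sum of the Jacobian, not the Jacobian itself, which is a frequent source of confusion.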

Numerical differentiation via numpy FFT

我与影子孤独终老i submitted on 2019-12-24 14:27:52
Question: I am learning how to use NumPy for Fast Fourier transform differentiation. In the code below, I create a simple sine function and try to get its derivative, the cosine. The result is shown in the image: there seems to be a normalization factor which I do not understand, despite reading the documentation, and which prevents me from getting the correct results. Can you tell me how to get rid of the normalization factor, or whether I am failing in a different way? Also, please explain why the Nyquist frequency is not
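For reference, np.fft.fft and np.fft.ifft already share the 1/N normalization between them, so spectral differentiation needs no extra factor; the usual mistake is building the wavenumbers without the 2π (np.fft.fftfreq returns cycles per unit length, not radians). A sketch differentiating sin on a periodic grid:

```python
import numpy as np

N = 64
L = 2 * np.pi
x = np.arange(N) * L / N          # periodic grid, right endpoint excluded
y = np.sin(x)

# fftfreq gives cycles per unit length; multiply by 2*pi for angular wavenumbers
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi

# fft/ifft carry the 1/N normalization between them: no extra factor needed
dy = np.fft.ifft(1j * k * np.fft.fft(y)).real
print(np.abs(dy - np.cos(x)).max())  # agrees with cos(x) to machine precision
```

On the Nyquist point: for a real signal with even N, the k = ±N/2 modes alias onto a single bin, and for odd-derivative operators that bin is typically zeroed; it plays no role here because sin(x) lives entirely at k = ±1.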

(Openmdao 2.4.0) 'compute_partials' function of a Component seems to be run even when forcing 'declare_partials' to FD for this component

佐手、 submitted on 2019-12-24 10:56:21
Question: I want to solve the Sellar MDA using a Newton nonlinear solver for the Group. I have defined the disciplines with derivatives (using compute_partials), but I want to check the number of calls to each discipline's compute and compute_partials when forcing (or not forcing) the disciplines to use their analytical derivatives (via declare_partials in the problem definition). The problem is that compute_partials still seems to be called even though I force it not to be used. Here is an