cross-correlation

Is MATLAB still slower than OpenCV in C++?

与世无争的帅哥 submitted on 2019-12-05 12:52:45
According to this link and this one, it is said that OpenCV is much faster than MATLAB. The first link is from March 2012; the second is a bit later. The first one says, "Programs written in OpenCV run much faster than similar programs written in Matlab," and rates MATLAB 2/10 and OpenCV 9/10. Consider two float matrices of size 1024x1024, mat1 and mat2, that I want to correlate.

In MATLAB:
    corr2(mat1, mat2);  % 70-75 ms

In OpenCV (C++):
    Mat result(1, 1, CV_32F);
    matchTemplate(mat1, mat2, result, CV_TM_CCOEFF_NORMED);  // 145-150 ms

As far as I know, C and
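For reference, MATLAB's corr2 is simply the Pearson correlation coefficient of two equal-size matrices. A minimal NumPy sketch of the same computation (the corr2 name and the random 1024x1024 test data are illustrative, not from the post):

```python
import numpy as np

def corr2(a, b):
    """Pearson correlation of two equal-size matrices, mirroring
    MATLAB's corr2 (a NumPy stand-in written for this comparison)."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# illustrative 1024x1024 float data, matching the sizes in the question
rng = np.random.default_rng(0)
mat1 = rng.standard_normal((1024, 1024)).astype(np.float32)
mat2 = (mat1 + 0.5 * rng.standard_normal((1024, 1024))).astype(np.float32)
print(corr2(mat1, mat2))  # around 0.89 for this noise level
```

Note that matchTemplate with CV_TM_CCOEFF_NORMED computes this same normalized coefficient when the template and image are the same size, so the two calls are comparable workloads.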

numpy and statsmodels give different values when calculating correlations. How to interpret this?

给你一囗甜甜゛ submitted on 2019-12-05 08:36:16
I can't find a reason why calculating the correlation between two series A and B using numpy.correlate gives me different results than the ones I obtain using statsmodels.tsa.stattools.ccf. Here's an example of the difference I mention:

    import numpy as np
    from matplotlib import pyplot as plt
    from statsmodels.tsa.stattools import ccf

    # Calculate correlation using numpy.correlate
    def corr(x, y):
        result = np.correlate(x, y, mode='full')
        return result[result.size // 2:]

    # These are the data series I want to analyze
    A = np.array([np.absolute(x) for x in np.arange(-1, 1.1, 0.1)])
    B = np.array([x for x in
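The discrepancy the question asks about is one of normalization: numpy.correlate returns raw sliding dot products, while statsmodels' ccf first demeans both series and divides by n and both standard deviations. A sketch reproducing ccf's convention with NumPy alone (ccf_manual is a hypothetical name; linspace replaces arange to keep the grid exactly symmetric):

```python
import numpy as np

def ccf_manual(x, y):
    """Cross-correlations at lags 0, 1, 2, ... normalized the way
    statsmodels.tsa.stattools.ccf normalizes them (a sketch):
    demean both series, then divide by n and both standard deviations."""
    n = len(x)
    xd = x - x.mean()
    yd = y - y.mean()
    raw = np.correlate(xd, yd, mode='full')[n - 1:]   # lags 0..n-1
    return raw / (n * x.std() * y.std())

# linspace instead of arange(-1, 1.1, 0.1) keeps the grid exactly symmetric
B = np.linspace(-1, 1, 21)
A = np.abs(B)

print(np.correlate(A, B, mode='full')[len(A) - 1:][:3])  # raw dot products
print(ccf_manual(A, B)[:3])                              # demeaned, normalized
```

With this normalization the lag-0 value of a series against itself is exactly 1, which the raw numpy.correlate output is not.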

Calculating the blur kernel between 2 images

让人想犯罪 __ submitted on 2019-12-04 10:01:20
Question: Unlike the standard (and more challenging) de-blurring and super-resolution scenarios, I have access to both the original (sharp) image G and its blurred version B. I'm simply looking for the blur kernel h. Because B is taken with a real camera, the relation is B = G*h + N (where * denotes convolution and N is some additive noise). Naturally, this is an over-constrained problem, since h is small in size compared to G and B, and so every few pixels in the pair of images generate an equation on

Performing a phase correlation with fft in R

谁说胖子不能爱 submitted on 2019-12-03 20:13:44
I am trying to implement a 2D phase-correlation algorithm in R, using a recipe from Wikipedia ( http://en.wikipedia.org/wiki/Phase_correlation ), in order to track the movement between 2 images. These images (frames) were captured with a camera shaking in the wind, and the ultimate goal is to remove the shake in these and subsequent frames. The two example images and the R code are below:

    ## we will need the tiff library
    library(tiff)
    ## read in the tiff files
    f1 = as.matrix(readTIFF('f1.tiff', native=TRUE))
    f2 = as.matrix(readTIFF('f2.tiff', native=TRUE))
    ## take the fft of the first frame
    F1 <- fft
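The Wikipedia recipe the question follows can be sketched compactly in NumPy (the same steps as the R code: FFT both frames, normalize the cross-power spectrum to unit magnitude, inverse-FFT, read off the peak); the synthetic roll test stands in for the real frames:

```python
import numpy as np

def phase_correlation(f1, f2):
    """Wikipedia phase-correlation recipe (a NumPy sketch): normalize
    the cross-power spectrum to unit magnitude, inverse-FFT, and take
    the peak; this yields the integer (dy, dx) displacement of f1
    relative to f2.  Negative shifts appear wrapped, so fold them."""
    F1 = np.fft.fft2(f1)
    F2 = np.fft.fft2(f2)
    cross = F1 * np.conj(F2)
    r = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    if dy > f1.shape[0] // 2:
        dy -= f1.shape[0]
    if dx > f1.shape[1] // 2:
        dx -= f1.shape[1]
    return int(dy), int(dx)

# synthetic frames: the "shaken" frame is the original rolled by (3, -5)
rng = np.random.default_rng(1)
frame = rng.standard_normal((64, 64))
shaken = np.roll(frame, (3, -5), axis=(0, 1))
print(phase_correlation(shaken, frame))  # (3, -5)
```

The recovered (dy, dx) is exactly what a stabilizer would undo, frame by frame, to remove the shake.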

Image registration using python and cross-correlation

回眸只為那壹抹淺笑 submitted on 2019-12-03 12:25:23
Question: I have two images showing exactly the same content: 2D Gaussian-shaped spots. I call these two 16-bit PNG files "left.png" and "right.png". But as they were obtained through slightly different optical setups, the corresponding spots (physically the same ones) appear at slightly different positions: the right image is slightly stretched or distorted in a non-linear way. Therefore I would like to get the transformation from left to right. So for every pixel on the left side with its x- and y

Calculating the blur kernel between 2 images

浪尽此生 submitted on 2019-12-03 09:21:52
Unlike the standard (and more challenging) de-blurring and super-resolution scenarios, I have access to both the original (sharp) image G and its blurred version B. I'm simply looking for the blur kernel h. Because B is taken with a real camera, the relation is B = G*h + N (where * denotes convolution and N is some additive noise). Naturally, this is an over-constrained problem, since h is small in size compared to G and B, and so every few pixels in the pair of images generate an equation on the entries of h. But what would be the simplest way to actually implement this? My thoughts so far:
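One simple implementation, assuming the blur is well approximated by circular convolution, is to solve the least-squares problem per frequency: divide the spectra Wiener-style, with a small regularizer for frequencies where G carries little energy, then crop the result to the kernel's small support. A sketch with invented names and defaults (estimate_kernel, lam, ksize):

```python
import numpy as np

def estimate_kernel(G, B, lam=1e-3, ksize=7):
    """Per-frequency least-squares (Wiener-style) estimate of h in
    B = G*h + N.  lam regularizes frequencies where G has little
    energy; names and defaults are illustrative."""
    Gf = np.fft.fft2(G)
    Bf = np.fft.fft2(B)
    Hf = (np.conj(Gf) * Bf) / (np.abs(Gf) ** 2 + lam)
    h = np.fft.fftshift(np.fft.ifft2(Hf).real)      # put the origin mid-array
    cy, cx = h.shape[0] // 2, h.shape[1] // 2
    k = ksize // 2
    return h[cy - k:cy + k + 1, cx - k:cx + k + 1]  # crop to small support

# synthetic check: blur a random sharp image with a known 3x3 box kernel
rng = np.random.default_rng(2)
G = rng.standard_normal((64, 64))
h_true = np.zeros((64, 64))
h_true[:3, :3] = 1.0 / 9.0                          # box kernel at the origin
B = np.fft.ifft2(np.fft.fft2(G) * np.fft.fft2(h_true)).real
h_est = estimate_kernel(G, B)
print(round(h_est.sum(), 3))                        # close to 1.0
```

With real camera noise, lam would be raised toward the noise-to-signal power ratio, which is exactly the classical Wiener trade-off.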

How to use the cross-spectral density to calculate the phase shift of two related signals

时光毁灭记忆、已成空白 submitted on 2019-12-03 06:02:45
Question: I have two signals, and I expect that one responds to the other, but with a certain phase shift. I would like to calculate the coherence, or the normalized cross-spectral density, to estimate whether there is any causality between the input and output, and to find out at which frequencies this coherence appears. See for example this image (from here), which seems to show high coherence at frequency 10. Now I know that I can calculate the phase shift of two signals using the cross
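Once the cross-spectral density is in hand, the phase shift at each frequency is just the complex argument of the CSD. A minimal NumPy sketch with synthetic 10 Hz signals (scipy.signal.csd would give the Welch-averaged version of the same quantity; fs and the pi/4 shift are made up for the demo):

```python
import numpy as np

fs = 1000.0                          # sample rate (Hz); assumed for the demo
n = 2000
t = np.arange(n) / fs
shift = np.pi / 4                    # true phase lag of y behind x
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t - shift)

# cross-spectral density via plain FFTs; scipy.signal.csd computes the
# Welch-averaged version of this same quantity
X = np.fft.rfft(x)
Y = np.fft.rfft(y)
Pxy = X * np.conj(Y)
freqs = np.fft.rfftfreq(n, 1.0 / fs)

k = np.argmax(np.abs(Pxy))           # dominant shared frequency bin
print(freqs[k])                      # 10.0 (Hz)
print(np.angle(Pxy[k]))              # ~0.785 = pi/4, the phase shift
```

In practice one reads the phase only at frequencies where the coherence is high; elsewhere the angle of the CSD is dominated by noise.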

Image registration using python and cross-correlation

廉价感情. submitted on 2019-12-03 03:43:00
I have two images showing exactly the same content: 2D Gaussian-shaped spots. I call these two 16-bit PNG files "left.png" and "right.png". But as they were obtained through slightly different optical setups, the corresponding spots (physically the same ones) appear at slightly different positions: the right image is slightly stretched or distorted in a non-linear way. Therefore I would like to get the transformation from left to right. So for every pixel on the left side with its x- and y-coordinate, I want a function giving me the components of the displacement vector that points to the
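Since the distortion is non-linear, a common first step is to estimate a coarse displacement field: split both images into tiles, cross-correlate corresponding tiles, and take each correlation peak as that tile's displacement vector, which can then be interpolated per pixel. A sketch under the assumption of small, locally rigid shifts (local_shifts and the rigid 2-pixel toy case are illustrative):

```python
import numpy as np

def local_shifts(left, right, win=32):
    """Coarse displacement field (a sketch): cross-correlate matching
    win x win tiles via FFT; each correlation peak is the (dy, dx)
    that maps the left tile onto the right tile."""
    H, W = left.shape
    field = {}
    for y in range(0, H - win + 1, win):
        for x in range(0, W - win + 1, win):
            a = left[y:y + win, x:x + win]
            b = right[y:y + win, x:x + win]
            a = a - a.mean()
            b = b - b.mean()
            c = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
            dy, dx = np.unravel_index(np.argmax(c), c.shape)
            if dy > win // 2:
                dy -= win
            if dx > win // 2:
                dx -= win
            field[(y, x)] = (int(dy), int(dx))
    return field

# toy case: a rigid 2-pixel horizontal shift stands in for the (locally
# smooth) non-linear distortion described in the question
rng = np.random.default_rng(3)
left = rng.standard_normal((64, 64))
right = np.roll(left, (0, 2), axis=(0, 1))
print(local_shifts(left, right))  # every tile reports (0, 2)
```

For a genuinely non-linear warp, each tile would report a different vector, and fitting a smooth function (e.g. a spline) through the per-tile vectors gives the per-pixel displacement the question asks for.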

A question on cross-correlation & correlation coefficient [duplicate]

半腔热情 submitted on 2019-12-03 03:21:35
Question: This question already has answers here (closed 8 years ago). Possible duplicate: Matlab Cross correlation vs Correlation Coefficient question. When I cross-correlate two data sets a and b (each 73 points long) in MATLAB and graph the result, it appears as a triangle with 145 points. I'm confused about the difference between the correlation coefficient, which ranges over +/- 1, and the triangle-like graph I get when I plot the cross-correlation output.
Answer 1: I seriously think you need to read up more on cross-correlation functions
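The triangle with 145 points is expected: cross-correlating two length-73 sequences produces 2*73 - 1 = 145 lags, and the number of overlapping samples shrinks linearly away from zero lag, which gives the triangular envelope. The single correlation coefficient is just the zero-lag value after normalization, as this NumPy sketch shows (the sine/cosine data are illustrative):

```python
import numpy as np

# two 73-point series, standing in for the data sets a and b
t = np.linspace(0, 4 * np.pi, 73)
a = np.sin(t)
b = np.cos(t)

# full cross-correlation: 2*73 - 1 = 145 lags, like MATLAB's xcorr(a, b)
xc = np.correlate(a - a.mean(), b - b.mean(), mode='full')
print(len(xc))  # 145

# the correlation coefficient is the zero-lag value, normalized by the
# standard deviations -- the single number corrcoef would report
r = xc[len(a) - 1] / (len(a) * a.std() * b.std())
print(r, np.corrcoef(a, b)[0, 1])  # the two agree
```

So the plot and the coefficient are not in conflict: the coefficient is one sample of the (normalized) curve, taken at lag zero.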

Cross-correlation (time-lag-correlation) with pandas?

感情迁移 submitted on 2019-12-03 02:44:40
Question: I have various time series that I want to correlate with each other, or rather cross-correlate, to find out at which time lag the correlation factor is greatest. I found various questions and answers/links discussing how to do it with numpy, but those would mean that I have to turn my dataframes into numpy arrays. And since my time series often cover different periods, I am afraid I will run into chaos.
Edit: The issue I am having with all the numpy/scipy methods is that they
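One way to stay inside pandas, so that index alignment and NaNs are handled automatically, is to shift one series by each candidate lag and use Series.corr. A sketch with an invented helper (lagged_corr) and synthetic daily data containing a built-in 10-day lag:

```python
import numpy as np
import pandas as pd

def lagged_corr(s1, s2, lags):
    """Cross-correlation by time lag (an illustrative helper): shift
    one series and let pandas' index alignment handle the differing
    periods and the NaNs that shifting introduces."""
    return pd.Series({lag: s1.corr(s2.shift(lag)) for lag in lags})

# synthetic daily series covering different periods, with s2 carrying
# s1's underlying signal delayed by 10 days
rng = np.random.default_rng(4)
base = rng.standard_normal(120)
s1 = pd.Series(base[:100],
               index=pd.date_range('2019-01-01', periods=100, freq='D'))
s2 = pd.Series(base[20:120],
               index=pd.date_range('2019-01-11', periods=100, freq='D'))

lc = lagged_corr(s1, s2, range(0, 21))
print(lc.idxmax(), lc.max())  # best lag is 10, correlation ~1.0
```

Because Series.corr only uses timestamps present in both (shifted) series, the differing coverage periods cause no chaos: the overlap shrinks with the lag, but nothing has to be converted to raw numpy arrays.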