Question
I found simple Python code for image registration here.
For the simple case of translation we have:
import numpy
from numpy.fft import fft2, ifft2

def translation(im0, im1):
    """Return translation vector to register images."""
    shape = im0.shape
    f0 = fft2(im0)
    f1 = fft2(im1)
    # peak of the inverse FFT of the normalized cross-power spectrum marks the shift
    ir = abs(ifft2((f0 * f1.conjugate()) / (abs(f0) * abs(f1))))
    t0, t1 = numpy.unravel_index(numpy.argmax(ir), shape)
    if t0 > shape[0] // 2:
        t0 -= shape[0]
    if t1 > shape[1] // 2:
        t1 -= shape[1]
    return [t0, t1]
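As a quick self-test I would run something like this (my own sketch, not from the original code; it assumes the translation() function above is in scope and just uses a random image circularly shifted with numpy.roll):

import numpy

rng = numpy.random.default_rng(0)
im0 = rng.random((128, 128))
im1 = numpy.roll(im0, shift=(5, -7), axis=(0, 1))  # im1 is im0 circularly shifted by +5 rows, -7 columns

t0, t1 = translation(im0, im1)
print(t0, t1)  # expected output: -5 7, i.e. the shift that maps im1 back onto im0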
What I don't understand is this part:
if t0 > shape[0] // 2:
    t0 -= shape[0]
if t1 > shape[1] // 2:
    t1 -= shape[1]
Also, it sometimes gives the wrong shift, so it seems the output t0, t1 depends on some cases? Maybe it is because the images only partially overlap?
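For reference, a small 1-D illustration of what that wrap-around correction does (my own sketch): argmax returns an index in [0, N), and on a circular grid an index above N // 2 actually encodes a negative shift, so subtracting the image size converts it.

N = 128
raw_peak = 123                                   # e.g. argmax of the correlation surface along one axis
shift = raw_peak - N if raw_peak > N // 2 else raw_peak
print(shift)                                     # -5: index 123 on a 128-point circular grid means a shift of -5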
EDIT:
Also, here are my tests using other tools:
For the lion image from Wikipedia (pure shift):
http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/im1.png http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/im2.png
ImageJ gives (second stack relative to first stack) x= -20 y= -23 R= 0.8126828943265368 (good)
phaseCorrelate gives x= 20.19 y= 22.56 (does it give the shift of the first image relative to the second, or is something wrong?)
no Hann window x= 20.23 y= 22.43
python code x= -22 y= -14
Test of template matching for the lion and the lion head cropped at (1,1):
http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/im2.png http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/temp_1_1.png
ImageJ gives (second stack relative to first stack) x= 0 y= 1 R= 0.7905318337522524 (off by 1 px)
phaseCorrelate gives x= -0.4 y= -2.45 (not accurate, and again in the opposite direction)
no Hann window x= -0.88 y= -0.86
Same, but cropped at (18,23):
http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/im2.png http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/temp_18_23.png
ImageJ gives (second stack relative to first stack) x= 17 y= 23 R= 0.8119669906973865 (off by 1 px)
phaseCorrelate gives x= -18 y= -23 (good, but in the opposite direction)
no Hann window x= -18 y= -22.98
Test image divided into 2 images with % overlap (no noise, no distortion):
http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/1.png http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/2.png
ImageJ gives (second stack relative to first stack) x= 744 y= 0 R= 0.9999999999999999
phaseCorrelate gives x= -743.48 y= 0 (opposite direction)
no Hann window x= -743.49 y= 0
Test on real data:
http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/1_.PNG http://dl.dropbox.com/u/8841028/FFT%20template%20matching/test%20images/2_.PNG
ImageJ gives (second stack relative to first stack) x= 878 y= -3 R= 0.9667271264277764
phaseCorrelate gives x= 34.47 y= -35.5 (wrong)
no Hann window x= 146.32 y= 3.06 (wrong)
OpenCV 2.4.3 (prebuilt) code that I used:
#include "stdafx.h"
#include <opencv.hpp>
using namespace cv;
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
Mat im1= imread("1.PNG",0);
Mat im2= imread("2.PNG",0);
Mat r1;
im1.convertTo(r1,CV_64F);
Mat r2;
im2.convertTo(r2,CV_64F);
Point2d phaseShift;
if(r1.cols!=r2.cols||r1.rows!=r2.rows)
{
int n_cols= max(r1.cols,r2.cols);
int n_rows= max(r1.rows,r2.rows);
Mat r1_pad;
copyMakeBorder(r1,r1_pad,0,n_rows-r1.rows,0,n_cols-r1.cols, BORDER_CONSTANT, Scalar::all(0));
Mat r2_pad;
copyMakeBorder(r2,r2_pad,0,n_rows-r2.rows,0,n_cols-r2.cols, BORDER_CONSTANT, Scalar::all(0));
Mat hann;
createHanningWindow(hann, r1_pad.size(), CV_64F);
phaseShift = phaseCorrelate(r1_pad, r2_pad, hann);
}
else
{
Mat hann;
createHanningWindow(hann, r1.size(), CV_64F);
phaseShift = phaseCorrelate(r1, r2, hann);
}
return 0;
}
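For completeness, a rough Python equivalent of the C++ code above (my own sketch; it assumes OpenCV 3.x/4.x Python bindings, where phaseCorrelate also returns a response value, and reuses the "1.PNG" / "2.PNG" file names from the C++ snippet):

import cv2
import numpy

im1 = cv2.imread("1.PNG", cv2.IMREAD_GRAYSCALE).astype(numpy.float64)
im2 = cv2.imread("2.PNG", cv2.IMREAD_GRAYSCALE).astype(numpy.float64)

# Pad both images to a common size (bottom/right, zero fill); a no-op if the sizes already match.
rows = max(im1.shape[0], im2.shape[0])
cols = max(im1.shape[1], im2.shape[1])
im1 = cv2.copyMakeBorder(im1, 0, rows - im1.shape[0], 0, cols - im1.shape[1],
                         cv2.BORDER_CONSTANT, value=0)
im2 = cv2.copyMakeBorder(im2, 0, rows - im2.shape[0], 0, cols - im2.shape[1],
                         cv2.BORDER_CONSTANT, value=0)

# Hanning window to reduce edge effects; winSize is given as (width, height).
hann = cv2.createHanningWindow((cols, rows), cv2.CV_64F)

# In OpenCV 3.x/4.x the Python binding returns ((dx, dy), response).
(dx, dy), response = cv2.phaseCorrelate(im1, im2, hann)
print(dx, dy, response)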
Source: https://stackoverflow.com/questions/16294700/fft-based-image-registration-in-python