I have images of the same scene focused at different distances. I want to align the images so that every feature is in exactly the same position in every image. How can I do this?
First you have to determine the transformation between your images. It can be as simple as a shift or as complex as a non-linear warping. Refocusing can not only shift but also slightly scale and even shear the images; in that case you have to use a homography to match them.
Second, the simplest thing to try is to manually select at least 4 pairs of corresponding points, record their coordinates and feed them into the findHomography() function. The resulting 3x3 transformation can then be used in warpPerspective() to align the images. If the outcome is good, you can automate the search for correspondences by using, say, SURF keypoints and matching their descriptors.
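For example, a minimal sketch in Python of the manual variant, where the file names and point coordinates are placeholders you would replace with your own measurements:

```python
import cv2
import numpy as np

# Reference image and the image to be aligned (file names are placeholders).
ref = cv2.imread("focus_near.jpg")
moving = cv2.imread("focus_far.jpg")

# Manually measured pixel coordinates of the same four features in both
# images (these numbers are illustrative only).
pts_moving = np.float32([[120, 85], [940, 70], [960, 610], [110, 630]])
pts_ref    = np.float32([[118, 88], [936, 74], [955, 612], [108, 628]])

# Estimate the 3x3 homography from the point pairs.
H, _ = cv2.findHomography(pts_moving, pts_ref)

# Warp the moving image into the reference frame.
aligned = cv2.warpPerspective(moving, H, (ref.shape[1], ref.shape[0]))
cv2.imwrite("aligned.jpg", aligned)
```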
Finally, if the result is still unsatisfactory, you need a more general transformation than a homography. I would try a piecewise affine model, matching pieces of the images separately (see the sketch below). In any case, more information about your input and your final goal would help to solve the problem properly.
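A rough sketch of such a piecewise approach could look like this: split both images into a grid of tiles, estimate a separate similarity transform per tile from ORB matches, and warp each tile independently. The grid size, detector and fallback behaviour below are all my own assumptions, and the hard tile boundaries will leave visible seams unless you blend them:

```python
import cv2
import numpy as np

def piecewise_affine_align(moving, ref, grid=(4, 4)):
    """Rough sketch: estimate one similarity transform per tile from ORB
    matches and warp each tile of `moving` onto `ref` separately.
    Tiles with too few matches are copied unchanged."""
    h, w = ref.shape[:2]
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    mov_gray = cv2.cvtColor(moving, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(500)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    out = np.zeros_like(ref)
    for i in range(grid[0]):
        for j in range(grid[1]):
            y0, y1 = i * h // grid[0], (i + 1) * h // grid[0]
            x0, x1 = j * w // grid[1], (j + 1) * w // grid[1]
            kp_m, d_m = orb.detectAndCompute(mov_gray[y0:y1, x0:x1], None)
            kp_r, d_r = orb.detectAndCompute(ref_gray[y0:y1, x0:x1], None)
            A = None
            if d_m is not None and d_r is not None:
                matches = matcher.match(d_m, d_r)
                if len(matches) >= 3:
                    src = np.float32([kp_m[m.queryIdx].pt for m in matches])
                    dst = np.float32([kp_r[m.trainIdx].pt for m in matches])
                    A, _ = cv2.estimateAffinePartial2D(src, dst)
            tile = moving[y0:y1, x0:x1]
            if A is None:
                out[y0:y1, x0:x1] = tile  # fall back: copy the tile as-is
            else:
                out[y0:y1, x0:x1] = cv2.warpAffine(tile, A, (x1 - x0, y1 - y0))
    return out
```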
The basic idea is to estimate an in-plane rotation of the images and a 2D translation in the X and Y directions, based on their matched feature points.
Check out Image Alignment Algorithms where you can find two approaches to do this using OpenCV (with code).
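As a sketch of that idea, assuming you already have matched point coordinates from some detector (the arrays below are placeholders), cv2.estimateAffinePartial2D recovers a rotation, translation and uniform scale, which you can then apply with warpAffine:

```python
import cv2
import numpy as np

# Matched feature coordinates (Nx2 float32) from any detector/matcher;
# the values below are illustrative placeholders.
pts_moving = np.float32([[120, 85], [940, 70], [960, 610], [110, 630]])
pts_ref    = np.float32([[118, 88], [936, 74], [955, 612], [108, 628]])

# Estimate rotation + translation (+ uniform scale) with RANSAC.
M, inliers = cv2.estimateAffinePartial2D(pts_moving, pts_ref,
                                         method=cv2.RANSAC)

# Recovered in-plane rotation angle (degrees) and translation.
angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
print("rotation:", angle, "translation:", M[0, 2], M[1, 2])

moving = cv2.imread("focus_far.jpg")  # placeholder file name
aligned = cv2.warpAffine(moving, M, (moving.shape[1], moving.shape[0]))
```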
You can use the image stitching or feature matching algorithms of OpenCV.
The main steps of feature matching are:
- detect keypoints in both images (e.g. with ORB, SIFT or SURF);
- compute a descriptor for each keypoint;
- match the descriptors between the two images;
- filter out bad matches (e.g. distance threshold or RANSAC);
- estimate the transformation (e.g. a homography) from the remaining matches;
- warp one image onto the other.
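Putting those steps together, a minimal sketch in Python using ORB (SURF lives in the non-free contrib module); the file names and thresholds are placeholders:

```python
import cv2
import numpy as np

ref = cv2.imread("focus_near.jpg")
moving = cv2.imread("focus_far.jpg")

# Detect keypoints and compute descriptors in both images.
orb = cv2.ORB_create(2000)
kp_ref, des_ref = orb.detectAndCompute(cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY), None)
kp_mov, des_mov = orb.detectAndCompute(cv2.cvtColor(moving, cv2.COLOR_BGR2GRAY), None)

# Match descriptors and keep the best ones by distance.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_mov, des_ref), key=lambda m: m.distance)
good = matches[:max(4, len(matches) // 5)]

# Estimate the homography with RANSAC to reject remaining outliers.
src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the moving image onto the reference image.
aligned = cv2.warpPerspective(moving, H, (ref.shape[1], ref.shape[0]))
cv2.imwrite("aligned_features.jpg", aligned)
```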