OpenCV: How to calculate essential matrix from feature matches between two images from different cameras using 5-point algorithm?

Asked 2021-01-27 03:16 by 悲&欢浪女 · 1 answer · 1662 views

Basically I want do to the same thing as this function here: https://docs.opencv.org/master/d9/d0c/group__calib3d.html#ga13f7e34de8fa516a686a56af1196247f

However, the parameter list only accepts a single camera matrix, while my two images were taken by two different cameras.

1 Answer
  • 2021-01-27 03:44

    OpenCV has no standard function for calculating the essential matrix between two different cameras, but it is easy to implement yourself: add a function to five-point.cpp and recompile OpenCV. I simply added an overload of cv::findEssentialMat with an additional parameter for the second camera matrix.

    cv::Mat cv::findEssentialMat(InputArray _points1, InputArray _points2, InputArray _cameraMatrix1, InputArray _cameraMatrix2, int method, double prob, double threshold, OutputArray _mask)
    {
        CV_INSTRUMENT_REGION();
    
        Mat points1, points2, cameraMatrix1, cameraMatrix2;
        _points1.getMat().convertTo(points1, CV_64F);
        _points2.getMat().convertTo(points2, CV_64F);
        _cameraMatrix1.getMat().convertTo(cameraMatrix1, CV_64F);
        _cameraMatrix2.getMat().convertTo(cameraMatrix2, CV_64F);
    
        int npoints = points1.checkVector(2);
        CV_Assert(npoints >= 0 && points2.checkVector(2) == npoints &&
            points1.type() == points2.type());
    
        CV_Assert(cameraMatrix1.rows == 3 && cameraMatrix1.cols == 3 && cameraMatrix1.channels() == 1);
        CV_Assert(cameraMatrix2.rows == 3 && cameraMatrix2.cols == 3 && cameraMatrix2.channels() == 1);
    
        if (points1.channels() > 1)
        {
            points1 = points1.reshape(1, npoints);
            points2 = points2.reshape(1, npoints);
        }
    
        double fx1 = cameraMatrix1.at<double>(0, 0);
        double fy1 = cameraMatrix1.at<double>(1, 1);
        double cx1 = cameraMatrix1.at<double>(0, 2);
        double cy1 = cameraMatrix1.at<double>(1, 2);
        double fx2 = cameraMatrix2.at<double>(0, 0);
        double fy2 = cameraMatrix2.at<double>(1, 1);
        double cx2 = cameraMatrix2.at<double>(0, 2);
        double cy2 = cameraMatrix2.at<double>(1, 2);
    
        // Normalize each point set with its own camera's intrinsics:
        // x = (u - cx) / fx, y = (v - cy) / fy
        points1.col(0) = (points1.col(0) - cx1) / fx1;
        points2.col(0) = (points2.col(0) - cx2) / fx2;
        points1.col(1) = (points1.col(1) - cy1) / fy1;
        points2.col(1) = (points2.col(1) - cy2) / fy2;
    
        // Reshape data to fit opencv ransac function
        points1 = points1.reshape(2, npoints);
        points2 = points2.reshape(2, npoints);
    
        // Convert the pixel-unit threshold into normalized coordinates.
        // (This uses only camera 1's focal length; averaging the focal
        // lengths of both cameras would be a reasonable alternative.)
        threshold /= (fx1 + fy1) / 2;
    
        Mat E;
        if (method == RANSAC)
            createRANSACPointSetRegistrator(makePtr<EMEstimatorCallback>(), 5, threshold, prob)->run(points1, points2, E, _mask);
        else
            createLMeDSPointSetRegistrator(makePtr<EMEstimatorCallback>(), 5, prob)->run(points1, points2, E, _mask);
    
        return E;
    }
    

    Then you have to add the function declaration to calib3d.hpp, recompile and reinstall your OpenCV version.
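
    The essential step in the function above is converting pixel coordinates into normalized camera coordinates using each camera's own intrinsics. A minimal standalone sketch of just that normalization (the intrinsic values below are made up for illustration, and no OpenCV is needed):

    ```cpp
    #include <cassert>
    #include <cmath>

    // Pixel -> normalized camera coordinates, the same per-column
    // arithmetic as in the function above, applied per camera:
    //   x = (u - cx) / fx,  y = (v - cy) / fy
    struct Intrinsics { double fx, fy, cx, cy; };
    struct Point2d { double x, y; };

    Point2d normalize(const Point2d& p, const Intrinsics& K) {
        return { (p.x - K.cx) / K.fx, (p.y - K.cy) / K.fy };
    }

    int main() {
        // Hypothetical intrinsics for two different cameras.
        Intrinsics K1{800.0, 800.0, 320.0, 240.0};
        Intrinsics K2{600.0, 650.0, 400.0, 300.0};

        // The principal point of camera 1 maps to (0, 0), the optical axis.
        Point2d n1 = normalize(Point2d{320.0, 240.0}, K1);
        assert(n1.x == 0.0 && n1.y == 0.0);

        // A pixel in camera 2 is normalized with camera 2's intrinsics.
        Point2d n2 = normalize(Point2d{700.0, 625.0}, K2);
        assert(std::abs(n2.x - 0.5) < 1e-12);
        assert(std::abs(n2.y - 0.5) < 1e-12);
        return 0;
    }
    ```

    Because each point set is normalized with its own intrinsics, the five-point solver afterwards works exactly as in the single-camera case.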

    Additional question: If I have calculated the essential matrix can I just use it as parameter in the method computeCorrespondEpilines() instead of the fundamental matrix, assuming images are already rectified?

    Yes, I think this should work, with one caveat: the essential matrix relates normalized camera coordinates, not pixel coordinates. So the points you pass to computeCorrespondEpilines() must be normalized with the corresponding camera intrinsics first, and the returned epilines are likewise in normalized coordinates.
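
    The epiline relation itself is just a matrix-vector product: the epiline in image 2 for a normalized point x1 is l2 = E * x1, and a true correspondence x2 satisfies l2 · x2 = 0. A small sketch using an E constructed from a pure x-translation (R = I, t = (1,0,0), so E = [t]_x); the point values are made up:

    ```cpp
    #include <array>
    #include <cassert>
    #include <cmath>

    using Vec3 = std::array<double, 3>;
    using Mat3 = std::array<Vec3, 3>;

    // l = M * v
    Vec3 multiply(const Mat3& M, const Vec3& v) {
        Vec3 r{};
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                r[i] += M[i][j] * v[j];
        return r;
    }

    double dot(const Vec3& a, const Vec3& b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    int main() {
        // E = [t]_x for t = (1, 0, 0): rows (0,0,0), (0,0,-1), (0,1,0).
        Mat3 E{{Vec3{0, 0, 0}, Vec3{0, 0, -1}, Vec3{0, 1, 0}}};

        Vec3 x1{0.2, 0.3, 1.0};      // normalized homogeneous point, image 1
        Vec3 l2 = multiply(E, x1);   // epiline in image 2: (0, -1, 0.3)

        // Pure x-translation gives horizontal epilines (as in rectified
        // stereo): any match with the same y-coordinate lies on the line.
        Vec3 x2{0.5, 0.3, 1.0};
        assert(std::abs(dot(l2, x2)) < 1e-12);
        return 0;
    }
    ```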
