camera-calibration

Calibration of images to obtain a top-view for points that lie on the same plane

Submitted by 帅比萌擦擦* on 2019-12-04 19:21:53
Question: Calibration: I have calibrated the camera using this vision toolbox in Matlab, with checkerboard images. After calibration I get cameraParams, which contains the Camera Extrinsics (RotationMatrices: [3x3x18 double], TranslationVectors: [18x3 double]) and the Camera Intrinsics (IntrinsicMatrix: [3x3 double], FocalLength: [1.0446e+03 1.0428e+03], PrincipalPoint: [604.1474 359.7477], Skew: 3.5436). Aim: I have recorded trajectories of some objects in motion using this camera. Each object
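For points known to lie on a single world plane (for example the Z = 0 plane of one checkerboard pose), the image-to-plane mapping reduces to a homography built from the intrinsics and that pose. A minimal NumPy sketch of the idea, assuming K, R and t hold the intrinsic matrix, rotation matrix and translation vector of the chosen pose; note that MATLAB's RotationMatrices and IntrinsicMatrix follow a row-vector convention, so a transpose may be needed before using them this way:

```python
import numpy as np

def image_to_plane(uv, K, R, t):
    """Map a pixel (u, v) to (X, Y) on the world plane Z = 0.

    For a point on Z = 0:  s * [u, v, 1]^T = K @ [r1 r2 t] @ [X, Y, 1]^T,
    so inverting that homography recovers the plane (top-view) coordinates.
    """
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))   # 3x3 plane-to-image homography
    xyw = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    return xyw[:2] / xyw[2]
```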

OpenCV camera calibration of an image crop (ROI submatrix)

Submitted by 放肆的年华 on 2019-12-04 12:59:14
I have a bit of a problem working with OpenCV's undistort function. I am working with a camera that uses a wide-angle lens; let's say my access to it is problematic, as it is already installed. The problem basically boils down to this: I have successfully measured all the lens parameters and can undistort a full-frame image with no problem. The issue is that I am actually working in a sort of line-scan mode: we use only a cut-out from the middle of the sensor, about 100 px tall. Images for illustration: Now, if I apply undistort to the ROI (region of interest) of the image in question, it naturally
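A common way to handle a crop like this (a sketch, not necessarily what the poster ended up doing) is to keep the full-frame distortion coefficients and only shift the principal point by the crop offset, since the distortion model is expressed relative to the principal point rather than to the image border. Here x0 and y0 are hypothetical pixel offsets of the 100-px strip within the full frame:

```python
import cv2
import numpy as np

def undistort_crop(roi_img, K, dist, x0, y0):
    """Undistort a sub-image cut out of the full sensor frame.

    K and dist are the full-frame intrinsics and distortion coefficients;
    (x0, y0) is the top-left corner of the crop in full-frame pixels.
    """
    K_roi = K.astype(np.float64).copy()
    K_roi[0, 2] -= x0                     # shift principal point into ROI coordinates
    K_roi[1, 2] -= y0
    # Using K_roi as the new camera matrix keeps the output in the cropped frame
    return cv2.undistort(roi_img, K_roi, dist, None, K_roi)
```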

camera calibration MATLAB toolbox

Submitted by 若如初见. on 2019-12-04 11:53:31
Question: I have to perform re-projection of my 3D points (I already have the data from Bundler). I am using the Camera Calibration Toolbox in MATLAB to get the intrinsic camera parameters. From 27 chessboard images taken from different angles I got the following output. Calibration results after optimization (with uncertainties): Focal Length: fc = [ 2104.11696 2101.75357 ] ± [ 23.13283 22.92478 ]; Principal point: cc = [ 969.15779 771.30555 ] ± [ 21.98972 15.25166 ]; Skew: alpha_c = [ 0.00000 ] ± [
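The re-projection itself is x = K [R | t] X plus the distortion model. As a rough equivalent in OpenCV/Python (a sketch; the rvec/tvec pose, the random points and the zero distortion vector are placeholders, not the poster's values), using the focal length and principal point reported above:

```python
import cv2
import numpy as np

fc = (2104.11696, 2101.75357)
cc = (969.15779, 771.30555)
K = np.array([[fc[0], 0.0,   cc[0]],      # skew alpha_c is 0, so K[0, 1] = 0
              [0.0,   fc[1], cc[1]],
              [0.0,   0.0,   1.0]])

pts3d = np.random.rand(10, 3).astype(np.float32)  # stand-in for the Bundler points
rvec = np.zeros(3)                                # camera pose (placeholder)
tvec = np.zeros(3)
dist = np.zeros(5)                                # put the kc coefficients here

reprojected, _ = cv2.projectPoints(pts3d, rvec, tvec, K, dist)
```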

Pixel coordinates to 3D line (opencv)

Submitted by 隐身守侯 on 2019-12-04 09:57:09
I have an image displayed on screen which is undistorted via cvInitUndistortMap & cvRemap (having done camera calibration), and the user clicks on a feature in the image. So I have the (u,v) pixel coordinates of the feature, and I also have the intrinsic matrix and the distortion matrix. What I'm looking for is the equation of the 3D line in camera/real-world coordinates on which the feature the user clicked must lie. I already have the perpendicular distance between the camera's image plane and the feature, so I can combine that with the aforementioned equation to give me the (X,Y,Z)
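The usual construction is to undistort the clicked pixel, back-project it through the inverse of K to get a ray direction in camera coordinates, and intersect that ray with whatever depth information is available. A sketch, assuming K and dist hold the intrinsic matrix and distortion coefficients:

```python
import cv2
import numpy as np

def pixel_to_ray(u, v, K, dist):
    """Return a unit direction vector (camera coordinates) for the ray through pixel (u, v)."""
    pts = np.array([[[u, v]]], dtype=np.float64)
    # undistortPoints returns normalized image coordinates (x, y), i.e. K^-1 already applied
    x, y = cv2.undistortPoints(pts, K, dist).reshape(2)
    d = np.array([x, y, 1.0])
    return d / np.linalg.norm(d)

# The 3D line is X(lambda) = lambda * d; with a known depth Z along the optical
# axis, the point is simply Z * [x, y, 1] before normalization.
```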

How can I undistort an image in Matlab using the known camera parameters?

Submitted by 这一生的挚爱 on 2019-12-04 09:25:41
Question: This is easy to do in OpenCV; however, I would like a native Matlab implementation that is fairly efficient and can be easily changed. The method should be able to take the camera parameters as specified in the above link. Answer 1: You can now do that as of release R2013b, using the Computer Vision System Toolbox. There is a GUI app called Camera Calibrator and a function undistortImage. Answer 2: The simplest and most common way of doing undistort (also called unwarp or compensating for lens distortion
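For comparison, the inverse-mapping idea that Answer 2 starts to describe (for each output pixel, sample the distorted image at the location the forward distortion model predicts) looks roughly like this in OpenCV/Python; this is a sketch of the concept, not the toolbox's undistortImage implementation:

```python
import cv2

def undistort_by_inverse_map(img, K, dist):
    """Build a per-pixel lookup table with the forward distortion model, then remap."""
    h, w = img.shape[:2]
    map_x, map_y = cv2.initUndistortRectifyMap(K, dist, None, K, (w, h), cv2.CV_32FC1)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)
```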

OpenCV stereoRectify distorts image

Submitted by 我的未来我决定 on 2019-12-04 09:01:34
Question: We have an ELP 1.0 Megapixel Dual Lens USB stereo camera and we are trying to calibrate it using OpenCV 3.1 in C++. However, the result of the calibration is totally unusable, because calling stereoRectify completely twists the image. This is what we do: find the calibration (chessboard) pattern in both cameras; the chessboard size is 5x7 and the result is almost the same regardless of the number of images taken: findChessboardCorners(img[k], boardSize, corners, CALIB_CB_ADAPTIVE_THRESH | CALIB_CB_NORMALIZE
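The question is in C++, but the typical calibrate-then-rectify sequence is sketched below in Python for consistency with the other examples on this page. The alpha argument of stereoRectify is worth checking in cases like this: alpha=1 keeps all source pixels and can look heavily warped when the calibration is shaky, while alpha=0 crops to valid pixels only.

```python
import cv2

def calibrate_and_rectify(objpoints, imgp_left, imgp_right, K1, d1, K2, d2, image_size):
    """Stereo-calibrate a pair whose per-camera intrinsics are already known, then rectify."""
    rms, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        objpoints, imgp_left, imgp_right, K1, d1, K2, d2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
        K1, d1, K2, d2, image_size, R, T,
        flags=cv2.CALIB_ZERO_DISPARITY, alpha=0)
    return R1, R2, P1, P2, Q
```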

OpenCV Stereo Matching/Calibration

Submitted by ╄→гoц情女王★ on 2019-12-04 08:33:20
Question: I initially posted this on the OpenCV forums but unfortunately didn't get many views or replies, so I'm posting here in the hope that someone can suggest a direction. I am using the Bumblebee XB3 stereo camera, which has 3 lenses. I've spent about three weeks reading forums, tutorials, the Learning OpenCV book and the actual OpenCV documentation on the stereo calibration and stereo matching functionality. In summary, my issue is that I have a good disparity map
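For reference, a typical semi-global block matching setup for computing a disparity map from an already-rectified grayscale pair; the parameter values below are illustrative defaults, not the poster's settings:

```python
import cv2

def compute_disparity(rect_left, rect_right, num_disp=128, block_size=5):
    """Disparity from a rectified grayscale pair using StereoSGBM."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disp,          # must be a multiple of 16
        blockSize=block_size,
        P1=8 * block_size ** 2,
        P2=32 * block_size ** 2,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2)
    # compute() returns fixed-point disparities scaled by 16
    return matcher.compute(rect_left, rect_right).astype('float32') / 16.0
```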

3D reconstruction from two calibrated cameras - where is the error in this pipeline?

Submitted by 陌路散爱 on 2019-12-04 08:05:37
Question: There are many posts about 3D reconstruction from stereo views with known internal calibration, some of which are excellent. I have read a lot of them, and based on what I have read I am trying to compute my own 3D scene reconstruction with the pipeline / algorithm below. I'll set out the method, then ask specific questions at the bottom. 0. Calibrate your cameras: this means retrieving the camera calibration matrices K1 and K2 for Camera 1 and Camera 2. These are 3x3 matrices encapsulating each
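The core of such a pipeline (matched points, fundamental matrix, essential matrix, relative pose, triangulation) can be sketched as follows; this assumes Nx2 float arrays of matched pixel coordinates and, for the recoverPose step, identical intrinsics in both cameras:

```python
import cv2
import numpy as np

def two_view_reconstruction(pts1, pts2, K1, K2):
    """Sketch of a two-view reconstruction: F -> E -> (R, t) -> triangulated points."""
    F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
    E = K2.T @ F @ K1                        # essential matrix from known intrinsics
    # recoverPose takes a single camera matrix, so this assumes K1 == K2
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K1)
    P1 = K1 @ np.hstack((np.eye(3), np.zeros((3, 1))))
    P2 = K2 @ np.hstack((R, t))
    X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4xN homogeneous
    return (X[:3] / X[3]).T                  # Nx3 points, defined up to scale
```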

Creating stereoParameters class in Matlab: what coordinate system should be used for relative camera rotation parameter?

Submitted by 北战南征 on 2019-12-04 06:04:37
stereoParameters takes two extrinsic parameters: RotationOfCamera2 and TranslationOfCamera2. The problem is that the documentation is not very detailed about what RotationOfCamera2 really means; it only says: "Rotation of camera 2 relative to camera 1, specified as a 3-by-3 matrix." What is the coordinate system in this case? A rotation matrix can be specified in any coordinate system. What exactly does "the coordinate system of Camera 1" mean? What are its x, y, z axes? In other words, if I calculate the Essential Matrix, how can I get the corresponding RotationOfCamera2 and
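A frequently cited answer to this (stated here as an assumption to verify against the documentation for your release) is that the MATLAB toolbox uses a row-vector convention, x2 = x1 * R + t, while a pose recovered from the essential matrix in the usual column-vector convention satisfies X2 = R * X1 + t, so the MATLAB matrices are the transposes of the OpenCV ones. A Python sketch of the conversion:

```python
import cv2

def matlab_pose_from_essential(E, pts1, pts2, K):
    """Decompose E and convert the pose to stereoParameters-style inputs.

    Assumption: MATLAB's RotationOfCamera2 / TranslationOfCamera2 follow the
    row-vector convention, i.e. they are the transposes of OpenCV's R and t.
    """
    _, R_cv, t_cv, _ = cv2.recoverPose(E, pts1, pts2, K)
    rotation_of_camera2 = R_cv.T
    translation_of_camera2 = t_cv.ravel()   # unit length: scale is not recoverable from E
    return rotation_of_camera2, translation_of_camera2
```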

Stereo Calibration Opencv Python and Disparity Map

Submitted by Deadly on 2019-12-04 03:49:12
Question: I am interested in finding the disparity map of a scene. To start with, I did stereo calibration using the following code (I wrote it myself with a little help from Google, after failing to find any helpful tutorials written in Python for OpenCV 2.4.10). I took images of a chessboard simultaneously with both cameras and saved them as left*.jpg and right*.jpg.

```python
import numpy as np
import cv2
import glob

# termination criteria
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX
```
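For context, a script of this kind usually continues by collecting chessboard corners from matching left/right pairs before calling the calibration functions. The sketch below shows that pattern; the board size, window sizes and criteria values are illustrative and not taken from the poster's code:

```python
import glob

import cv2
import numpy as np

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
pattern = (9, 6)                              # inner-corner count of the board (illustrative)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

objpoints, imgp_left, imgp_right = [], [], []
for fl, fr in zip(sorted(glob.glob('left*.jpg')), sorted(glob.glob('right*.jpg'))):
    gray_l = cv2.imread(fl, cv2.IMREAD_GRAYSCALE)
    gray_r = cv2.imread(fr, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gray_l, pattern)
    ok_r, corners_r = cv2.findChessboardCorners(gray_r, pattern)
    if ok_l and ok_r:                         # keep only pairs where both views see the board
        objpoints.append(objp)
        imgp_left.append(cv2.cornerSubPix(gray_l, corners_l, (11, 11), (-1, -1), criteria))
        imgp_right.append(cv2.cornerSubPix(gray_r, corners_r, (11, 11), (-1, -1), criteria))
```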