I'm currently working on an augmented reality application. The targeted device being an optical see-through HMD, I need to calibrate its display to achieve a correct registration.
Is it possible, and if so, how can I get these three parameters from the projection matrix I have?
The projection matrix and the view matrix describe completely different transformations. While the projection matrix describes the mapping from the 3D points of a scene to the 2D points of the viewport, the view matrix describes the position and direction from which the scene is viewed. The view matrix is defined by the camera position, the direction to the target of the view, and the up vector of the camera.
(see Transform the modelMatrix)
This means it is not possible to get the view matrix from the projection matrix. The view matrix is defined by the camera setup alone.
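As a minimal sketch of how a view matrix is typically built from exactly these three inputs (assuming the GLM library, which your question does not mention, so treat the names and values as placeholders):

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>

    // Hypothetical camera setup; the values are placeholders.
    glm::mat4 MakeViewMatrix()
    {
        glm::vec3 cameraPos(0.0f, 0.0f, 5.0f); // position of the camera in world space
        glm::vec3 target   (0.0f, 0.0f, 0.0f); // point the camera looks at
        glm::vec3 up       (0.0f, 1.0f, 0.0f); // up vector of the camera

        // glm::lookAt builds the view matrix from exactly these three inputs;
        // none of them can be recovered from the projection matrix.
        return glm::lookAt(cameraPos, target, up);
    }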
If the projection is perspective, then it will be possible to get the field of view angle and the aspect ratio from the projection matrix.
The Perspective Projection Matrix looks like this:
r = right, l = left, b = bottom, t = top, n = near, f = far
2*n/(r-l)      0              0               0
0              2*n/(t-b)      0               0
(r+l)/(r-l)    (t+b)/(t-b)    -(f+n)/(f-n)   -1
0              0              -2*f*n/(f-n)    0
It follows:
aspect = w / h
tanFov = tan( fov_y * 0.5 );
p[0][0] = 2*n/(r-l) = 1.0 / (tanFov * aspect)
p[1][1] = 2*n/(t-b) = 1.0 / tanFov
The field of view angle along the Y-axis in degrees:
fov = 2.0*atan( 1.0/prjMatrix[1][1] ) * 180.0 / PI;
The aspect ratio:
aspect = prjMatrix[1][1] / prjMatrix[0][0];
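Putting both formulas into code, a small sketch (plain C++; it assumes the matrix is stored in column-major order, as OpenGL and GLM do, so that prjMatrix[1][1] addresses the element shown above) could look like this:

    #include <cmath>

    // Recover the vertical field of view (in degrees) and the aspect ratio
    // from a symmetric perspective projection matrix.
    // prjMatrix[col][row] is assumed to be column-major (OpenGL convention).
    void DecomposePerspective(const float prjMatrix[4][4], float &fov_y_deg, float &aspect)
    {
        const float PI = 3.14159265358979f;
        fov_y_deg = 2.0f * std::atan(1.0f / prjMatrix[1][1]) * 180.0f / PI;
        aspect    = prjMatrix[1][1] / prjMatrix[0][0];
    }

Calling this on a matrix that was built from a field of view and an aspect ratio (e.g. by glm::perspective) should give back those two values; the near and far planes, and anything about the camera pose, stay out of reach.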
See also the answers to the following questions:
How to render depth linearly in modern OpenGL with gl_FragCoord.z in fragment shader?
How to recover view space position given view space depth value and ndc xy