I'm having trouble converting my OpenGL scene coordinates to screen coordinates.
I thought I needed to multiply my coordinates by the modelview matrix and then by the projection matrix to get the NDC, but I'm getting weird coordinates. Here is my code:
GLKVector3 coor = GLKVector3Make(point.x, point.y, 0);
GLKMatrix4 modelview = GLKMatrix4MakeWithArray(glProjectionMatrix);
GLKMatrix4 projetion = GLKMatrix4MakeWithArray(modelViewMatrix.data);
GLKVector3 eyeCoor = GLKMatrix4MultiplyVector3(modelview, coor);
GLKVector3 ndcCoor = GLKMatrix4MultiplyVector3(projetion,eyeCoor);
CGPoint p = CGPointMake(ndcCoor.x, ndcCoor.y);
Any ideas?
Your overall approach is valid, but you should use 4D vectors for these homogeneous transforms, so the perspective divide by w can be performed. Also note that your matrices appear to be built from the wrong source arrays: modelview is constructed from glProjectionMatrix and projection from modelViewMatrix.data.
So,
GLKVector4 coor = GLKVector4Make(point.x, point.y, 0, 1);
// Build each matrix from the matching source array (they were swapped in your code).
GLKMatrix4 modelview = GLKMatrix4MakeWithArray(modelViewMatrix.data);
GLKMatrix4 projection = GLKMatrix4MakeWithArray(glProjectionMatrix);
GLKVector4 eyeCoor = GLKMatrix4MultiplyVector4(modelview, coor);
GLKVector4 clipCoor = GLKMatrix4MultiplyVector4(projection, eyeCoor);
// Perspective divide: clip coordinates become normalized device coordinates.
float XScr = clipCoor.x / clipCoor.w;
float YScr = clipCoor.y / clipCoor.w;
CGPoint p = CGPointMake(XScr, YScr);
If you want XScr and YScr to be in the [0, 1] range, then add the conversion:
XScr = (XScr + 1.0f) * 0.5f;
YScr = (YScr + 1.0f) * 0.5f;
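To go from the [0, 1] range to actual pixel coordinates, here is a minimal sketch; viewportWidth and viewportHeight are placeholders for whatever you passed to glViewport:
float pixelX = XScr * viewportWidth;   // XScr, YScr already mapped to [0, 1]
float pixelY = YScr * viewportHeight;  // origin at the lower left (OpenGL convention)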
Even easier: use the GLKit Math function GLKMathProject.
GLKVector3 GLKMathProject (
GLKVector3 object,
GLKMatrix4 model,
GLKMatrix4 projection,
int *viewport
);
So, in your case, e.g.
int viewport[] = {0, 0, 320, 480};
GLKVector3 objectPoint = GLKVector3Make(point.x, point.y, 0); // GLKMathProject takes a 3D vector
GLKVector3 windowVector = GLKMathProject(objectPoint, modelview, projection, viewport);
CGPoint p = CGPointMake(windowVector.x, windowVector.y);
Note that the origin is at the lower left, so if you're using UIKit coordinates, where the origin is at the upper left, flip the y coordinate, e.g.
CGPoint p = CGPointMake(windowVector.x, window.bounds.size.height - windowVector.y);
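Putting it together, a minimal sketch of a helper that wraps GLKMathProject and the y flip; the function name is hypothetical, and the matrices and viewport are assumed to match whatever your rendering setup uses:
#import <GLKit/GLKit.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: projects a scene-space point to UIKit screen coordinates.
// Pass the same modelview/projection matrices and viewport used for rendering.
static CGPoint ScreenPointForScenePoint(GLKVector3 scenePoint,
                                        GLKMatrix4 modelview,
                                        GLKMatrix4 projection,
                                        int viewport[4])
{
    GLKVector3 windowVector = GLKMathProject(scenePoint, modelview, projection, viewport);
    // GLKMathProject returns window coordinates with the origin at the lower left;
    // flip y for UIKit's upper-left origin (viewport[3] is the viewport height).
    return CGPointMake(windowVector.x, viewport[3] - windowVector.y);
}
For example: CGPoint p = ScreenPointForScenePoint(GLKVector3Make(point.x, point.y, 0), modelview, projection, viewport);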
Source: https://stackoverflow.com/questions/10796059/opengl-scene-coordinates-to-screen-coordinates