I'm attempting to implement an arcball style camera. I use glm::lookAt to keep the camera pointed at a target, and then move it around the surface of a sphere using azimuth/inclination angles to rotate the view.
I'm running into an issue where the view gets flipped upside down when the azimuth approaches 90 degrees.
Here's the relevant code:
Get projection and view matrices. Runs in the main loop
void Visual::updateModelViewProjection()
{
    model = glm::mat4();
    projection = glm::mat4();
    view = glm::mat4();

    projection = glm::perspective
        (
        (float)glm::radians(camera.Zoom),
        (float)width / height, // aspect ratio
        0.1f,                  // near clipping plane
        10000.0f               // far clipping plane
        );

    view = glm::lookAt(camera.Position, camera.Target, camera.Up);
}
Mouse move event, for camera rotation
void Visual::cursor_position_callback(GLFWwindow* window, double xpos, double ypos)
{
    if (leftMousePressed)
    {
        ...
    }

    if (rightMousePressed)
    {
        GLfloat xoffset = (xpos - cursorPrevX) / 4.0;
        GLfloat yoffset = (cursorPrevY - ypos) / 4.0;

        camera.inclination += yoffset;
        camera.azimuth -= xoffset;

        if (camera.inclination > 89.0f)
            camera.inclination = 89.0f;
        if (camera.inclination < 1.0f)
            camera.inclination = 1.0f;

        if (camera.azimuth > 359.0f)
            camera.azimuth = 359.0f;
        if (camera.azimuth < 1.0f)
            camera.azimuth = 1.0f;

        float radius = glm::distance(camera.Position, camera.Target);

        camera.Position[0] = camera.Target[0] + radius * cos(glm::radians(camera.azimuth)) * sin(glm::radians(camera.inclination));
        camera.Position[1] = camera.Target[1] + radius * sin(glm::radians(camera.azimuth)) * sin(glm::radians(camera.inclination));
        camera.Position[2] = camera.Target[2] + radius * cos(glm::radians(camera.inclination));

        camera.updateCameraVectors();
    }

    cursorPrevX = xpos;
    cursorPrevY = ypos;
}
Calculate camera orientation vectors
void updateCameraVectors()
{
    Front = glm::normalize(Target - Position);
    Right = glm::rotate(glm::normalize(glm::cross(Front, {0.0, 1.0, 0.0})), glm::radians(90.0f), Front);
    Up = glm::normalize(glm::cross(Front, Right));
}
I'm pretty sure it's related to the way I calculate my camera's right vector, but I cannot figure out how to compensate.
Has anyone run into this before? Any suggestions?
It's a common mistake to use lookAt for rotating the camera. You should not. The backward/right/up directions are the columns of your view matrix. If you already have them, then you don't even need lookAt, which tries to redo some of your calculations. On the other hand, lookAt doesn't help you find those vectors in the first place.
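To make that relationship concrete, here is a minimal sketch (untested; CameraBasis and extractCameraBasis are hypothetical names, and it assumes GLM's column-major matrices, a Y-up lookAt, and the glm::column accessor from <glm/gtc/matrix_access.hpp>): inverting a lookAt view matrix gives the camera-to-world transform, and its columns are exactly the right/up/backward/position vectors.

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp> // glm::lookAt
#include <glm/gtc/matrix_access.hpp>    // glm::column

struct CameraBasis { glm::vec3 right, up, backward, position; };

// Hypothetical helper: recover the camera basis from the inverse of a
// lookAt view matrix instead of recomputing it with cross products.
CameraBasis extractCameraBasis(const glm::vec3& position, const glm::vec3& target)
{
    glm::mat4 view = glm::lookAt(position, target, glm::vec3(0.f, 1.f, 0.f)); // Y-up convention
    glm::mat4 cameraToWorld = glm::inverse(view);                             // camera-to-world transform

    CameraBasis b;
    b.right    = glm::vec3(glm::column(cameraToWorld, 0));
    b.up       = glm::vec3(glm::column(cameraToWorld, 1));
    b.backward = glm::vec3(glm::column(cameraToWorld, 2)); // the camera looks along -backward
    b.position = glm::vec3(glm::column(cameraToWorld, 3)); // equals `position`
    return b;
}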
Instead build the view matrix first as a composition of translations and rotations, and then extract those vectors from its columns:
void Visual::cursor_position_callback(GLFWwindow* window, double xpos, double ypos)
{
    ...
    if (rightMousePressed)
    {
        GLfloat xoffset = (xpos - cursorPrevX) / 4.0;
        GLfloat yoffset = (cursorPrevY - ypos) / 4.0;

        // std::clamp needs <algorithm> (C++17); fmodf needs <cmath>
        camera.inclination = std::clamp(camera.inclination + yoffset, -90.f, 90.f);
        camera.azimuth = fmodf(camera.azimuth + xoffset, 360.f);

        view = glm::mat4(1.f); // start from the identity matrix
        view = glm::translate(view, glm::vec3(0.f, 0.f, camera.radius)); // add camera.radius to control the distance-from-target
        view = glm::rotate(view, glm::radians(camera.inclination + 90.f), glm::vec3(1.f, 0.f, 0.f));
        view = glm::rotate(view, glm::radians(camera.azimuth), glm::vec3(0.f, 0.f, 1.f));
        view = glm::translate(view, camera.Target);

        // glm::column needs <glm/gtc/matrix_access.hpp>; convert explicitly to vec3
        camera.Right    = glm::vec3(glm::column(view, 0));
        camera.Up       = glm::vec3(glm::column(view, 1));
        camera.Front    = -glm::vec3(glm::column(view, 2)); // minus because the OpenGL camera looks towards negative Z
        camera.Position = glm::vec3(glm::column(view, 3));

        view = glm::inverse(view);
    }
    ...
}
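A note on the order of operations: glm::translate(m, v) and glm::rotate(m, angle, axis) post-multiply, so the cumulative calls above compose left to right into one explicit product. The sketch below (untested; buildOrbitMatrix is a hypothetical name) builds the same matrix as the code above before the final glm::inverse call:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Same matrix as the cumulative translate/rotate calls above, written as
// an explicit product of elementary transforms.
glm::mat4 buildOrbitMatrix(float radius, float inclination, float azimuth, const glm::vec3& target)
{
    return glm::translate(glm::mat4(1.f), glm::vec3(0.f, 0.f, radius))
         * glm::rotate(glm::mat4(1.f), glm::radians(inclination + 90.f), glm::vec3(1.f, 0.f, 0.f))
         * glm::rotate(glm::mat4(1.f), glm::radians(azimuth), glm::vec3(0.f, 0.f, 1.f))
         * glm::translate(glm::mat4(1.f), target);
}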
Then remove the code that calculates view and the direction vectors from updateModelViewProjection and updateCameraVectors.
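For example, updateModelViewProjection would then only set up the projection and leave view to the cursor callback (a rough sketch, untested, reusing the member names from the question):

void Visual::updateModelViewProjection()
{
    model = glm::mat4(1.f);

    projection = glm::perspective(
        (float)glm::radians(camera.Zoom),
        (float)width / height, // aspect ratio
        0.1f,                  // near clipping plane
        10000.0f);             // far clipping plane

    // `view` is now built (and inverted) in cursor_position_callback,
    // so it is no longer touched here.
}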
Disclaimer: all of this code is untested. You might need to fix a minus sign somewhere or the order of operations, or the conventions might mismatch (Z is up vs. Y is up, etc.).
Source: https://stackoverflow.com/questions/40195569/arcball-camera-inverting-at-90-deg-azimuth