I want to get the Android sensors to work with OpenGL so I can rotate the OpenGL camera to wherever the phone is pointing.
To elaborate: if the player is looking at
You're supplying null for the inclination matrix - that's not correct:
SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomag);
There are lots of examples of how to use SensorManager's getRotationMatrix(...):
float[] R = new float[16]; // rotation matrix
float[] I = new float[16]; // inclination matrix
if (SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticValues)) {
    float[] anglesInRadians = new float[3];
    SensorManager.getOrientation(R, anglesInRadians);
    // anglesInRadians now holds azimuth, pitch and roll
    ...
}
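If you do fill in the inclination matrix, you can also pass it to SensorManager.getInclination(...) to read the geomagnetic dip angle. A minimal sketch, assuming I was filled by the getRotationMatrix(...) call above:

// I comes from the successful getRotationMatrix(...) call above
float inclinationRadians = SensorManager.getInclination(I); // angle of the magnetic
                                                            // field relative to the
                                                            // horizontal plane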
Also "rotate the opengl's camera to wherever the phone is pointing to" is rather ambiguous. For instance, if you meant some sort of a augmented reality approach, then you should be mapping AXIS_X
to AXIS_Z
. Note that remapping might not even be necessary, e.g. when you already fix your activity to landscape mode. More details here.
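A minimal sketch of that remapping, assuming rotationMatrix was filled by getRotationMatrix(...) as above (remapped and anglesInRadians are just illustrative names):

// Remap so the axis looking out of the back of the device (the camera's view
// direction in an AR setup) becomes the reference axis for getOrientation()
float[] remapped = new float[16];
SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X,
        SensorManager.AXIS_Z, remapped);
float[] anglesInRadians = new float[3];
SensorManager.getOrientation(remapped, anglesInRadians);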
There is also some example code around that combines sensor data and OpenGL ES.
Here is the way I did the same thing (my app was also in landscape). First I get the orientation values (similar to the way you do):
final float pi = (float) Math.PI;
final float rad2deg = 180 / pi;

public static float x; // pitch
public static float y; // azimuth
public static float z; // roll

float[] gravity = null;
float[] geomag = null;
float[] inOrientMatrix = new float[16];
float[] outOrientMatrix = new float[16];
float[] orientation = new float[3];

public static GLSurfaceView glView;

// (...)

public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            gravity = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomag = event.values.clone();
            break;
    }
    // Only compute the orientation once both sensor readings have arrived
    if (gravity != null && geomag != null) {
        if (SensorManager.getRotationMatrix(inOrientMatrix, null, gravity, geomag)) {
            SensorManager.remapCoordinateSystem(inOrientMatrix, SensorManager.AXIS_X,
                    SensorManager.AXIS_Z, outOrientMatrix);
            SensorManager.getOrientation(outOrientMatrix, orientation);
            x = orientation[1] * rad2deg; // pitch
            y = orientation[0] * rad2deg; // azimuth
            z = orientation[2] * rad2deg; // roll
            glView.requestRender();
        }
    }
}
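For this to work, the two sensor listeners of course have to be registered, and requestRender() only has an effect when the surface view renders on demand. A minimal sketch of that setup, assuming it lives in the activity's onCreate()/onResume() and the activity implements SensorEventListener:

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_GAME);
sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_GAME);
// requestRender() only triggers a frame in RENDERMODE_WHEN_DIRTY
glView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);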
Then in my renderer, in onDrawFrame(GL10 gl), I do:
gl.glLoadIdentity();
// Camera at the origin, looking down the negative z axis, with y as the up vector
GLU.gluLookAt(gl, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, -1.0f, 0.0f, 1.0f, 0.0f);
gl.glRotatef(MainActivity.x, 1.0f, 0.0f, 0.0f);
gl.glRotatef(MainActivity.y, 0.0f, 1.0f, 0.0f);
gl.glRotatef(MainActivity.z, 0.0f, 0.0f, 1.0f);
// In case you have transformations, e.g. the position of the object, you can do them here:
//gl.glTranslatef(0.0f, 0.0f, -DISTANCE_NORTH);
//gl.glTranslatef(0.0f, DISTANCE_EAST, 0.0f);
gl.glPushMatrix();
// object draw
In other words, I rotate the whole world around me. A better way would be to change the direction of the camera with gluLookAt, keeping the eye at (0, 0, 0) and (0, 1, 0) as the up vector, but I simply couldn't get my center_view x, y, z values right.
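If you do want to try the gluLookAt route, one way to get a center point is to turn azimuth and pitch (in radians, before the degree conversion) into a unit direction vector. A rough sketch with illustrative names azimuthRad and pitchRad; sign conventions depend on how you remapped the axes, so you may need to flip components:

// Build a unit "forward" vector from azimuth (rotation about y) and pitch (rotation about x);
// with azimuth = pitch = 0 this looks straight down the negative z axis
float cx = (float) (-Math.cos(pitchRad) * Math.sin(azimuthRad));
float cy = (float) Math.sin(pitchRad);
float cz = (float) (-Math.cos(pitchRad) * Math.cos(azimuthRad));
GLU.gluLookAt(gl, 0.0f, 0.0f, 0.0f, cx, cy, cz, 0.0f, 1.0f, 0.0f);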
Hope this helps...