I'm making a map app, including the location arrow that shows you which way you're facing, like so:
This looks pretty tricky. I'm developing a PNS for Android and am facing a somewhat similar problem that I haven't cracked yet: how to get the rotation between the accelerometer's axes and the motion vector?
Thing is, it looks to me impossible to find which direction the user is facing (as opposed to the device) if he's not moving. The human has no sensor on his body, so what if the device stayed in exactly the same position but the user rotated by 90°? I don't see any way to detect that.
What I can suggest (and what actually fits my problem) is that you could (I don't know what you actually do in your code) use the motion of the user to determine his heading. Let me explain. Say you have a first position A, and the user walks to B. You can then build the vector AB and take its direction as the user's heading when he stops at B. You'd then have to limit your code to the direction he is facing when arriving at the destination.
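The A-to-B idea above can be sketched in plain Java with the standard forward-azimuth formula (on an Android device you could equally use Location.bearingTo); the class name and the sample coordinates here are made up for illustration:

```java
public class HeadingFromMotion {

    /** Initial bearing in degrees (0 = north, 90 = east) from point A to point B. */
    static double bearingDegrees(double latA, double lonA, double latB, double lonB) {
        double phi1 = Math.toRadians(latA);
        double phi2 = Math.toRadians(latB);
        double dLon = Math.toRadians(lonB - lonA);
        double y = Math.sin(dLon) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
        // atan2 returns [-180, 180]; normalize to a compass bearing in [0, 360)
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        // Hypothetical fixes: user walked from A to B, so assume he faces along AB.
        System.out.println(bearingDegrees(48.8566, 2.3522, 48.8570, 2.3522)); // 0.0 (due north)
    }
}
```

Once the user stops at B, you would keep showing this last AB bearing until a new fix arrives.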
I know this is not as good as what Google Maps gets, but do you know what Google uses for this? I mean, do they only use the accelerometer and magnetic-field sensors?
It seems that the appropriate way to get the bearing when the user is holding the phone vertically is to use something like this:
// after calling getRotationMatrix, pass the rotation matrix below:
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
If you want to handle both ways (vertical and flat) you will probably need to detect that and then only perform this remap when it is vertical.
See the SensorManager.remapCoordinateSystem API documentation.
Did you call remapCoordinateSystem? Otherwise, you only get the right facing value when the phone is held vertically. When the phone is held so the screen is level with the horizon, there is no way to get the user's facing: to get the facing you have to project the device's z axis onto the world xy plane, and that projection is zero when the device is held horizontally.
To be more precise, if you want to get the phone's facing, the phone has to be inclined at least about 25 degrees from horizontal, and you have to call remapCoordinateSystem. The following code will give you what you want for the last two pictures above.
Code
float[] rotationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, null, lastAcceleration, lastMagneticField)) {
    float[] orientMatrix = new float[3];
    float[] remapMatrix = new float[9];
    SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapMatrix);
    SensorManager.getOrientation(remapMatrix, orientMatrix);
    // orientMatrix[0] is the azimuth in radians; convert it to degrees
    orientation = orientMatrix[0] * 180 / (float) Math.PI;
}
getOrientation gives you the correct values assuming the phone is lying flat. Thus, if the phone is held vertically, you have to remap the coordinate system to get the equivalent flat position. Geometrically, you project the phone's -z axis down onto the world xy plane and then calculate the angle between this projection vector and the world y axis.
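For intuition, the math that getOrientation applies to the (remapped) rotation matrix can be reproduced off-device. This is a plain-Java sketch of the same formulas (azimuth = atan2(R[1], R[4]), etc., for a row-major 3x3 matrix); it is an illustration, not a replacement for calling SensorManager.getOrientation:

```java
public class OrientationMath {

    /** Mirrors what SensorManager.getOrientation computes from a row-major 3x3 rotation matrix. */
    static float[] getOrientation(float[] R) {
        float azimuth = (float) Math.atan2(R[1], R[4]);  // heading around the world z axis
        float pitch   = (float) Math.asin(-R[7]);        // tilt around the device x axis
        float roll    = (float) Math.atan2(-R[6], R[8]); // tilt around the device y axis
        return new float[] { azimuth, pitch, roll };
    }

    public static void main(String[] args) {
        // Device flat on a table, top edge pointing north: the identity matrix,
        // so azimuth, pitch, and roll all come out as 0.
        float[] identity = { 1, 0, 0,  0, 1, 0,  0, 0, 1 };
        float[] o = getOrientation(identity);
        System.out.println(Math.toDegrees(o[0])); // 0.0
    }
}
```

A matrix rotated 90° about the world z axis yields an azimuth of ±90°, which is exactly the projection-onto-the-xy-plane picture described above.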
You should take the pitch and determine whether the user is holding the phone close to vertical. I chose a threshold of 45 degrees of pitch up or down from flat on a table: beyond that, the coordinate system should be remapped.
float pitchDegrees = (float) Math.toDegrees(-orientation[1]);
if (pitchDegrees < 45 && pitchDegrees > -45) {
    // do something while the phone is horizontal
} else {
    // R below is the original rotation matrix
    float[] remapOut = new float[9];
    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapOut);
    // get the orientation with remapOut
    float[] remapOrientation = new float[3];
    SensorManager.getOrientation(remapOut, remapOrientation);
}
It has worked out pretty well. Let me know if anyone can suggest an improvement on this. Thanks.
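One suggestion, since you asked: raw accelerometer and magnetometer readings are noisy, which makes the arrow jitter. A common remedy is an exponential low-pass filter applied to the sensor arrays before getRotationMatrix; the class name and the 0.8 smoothing factor below are my own choices to tune, not anything from the platform:

```java
public class LowPass {
    // Closer to 1 = smoother but laggier; 0.8f is a starting guess to tune.
    static final float ALPHA = 0.8f;

    /** Blends a new sensor reading into the running filtered value, in place. */
    static float[] filter(float[] input, float[] output) {
        if (output == null) return input.clone(); // first reading: no history yet
        for (int i = 0; i < input.length; i++) {
            output[i] = ALPHA * output[i] + (1 - ALPHA) * input[i];
        }
        return output;
    }
}
```

In onSensorChanged you would run lastAcceleration = LowPass.filter(event.values, lastAcceleration), do the same for the magnetic field, and then proceed exactly as in the code above.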