How to detect an absolute rotation of an Android device

Submitted by 强颜欢笑 on 2019-12-12 12:49:01

Question


I'm looking to build a robot that navigates indoors, driven by software running on an Android device. One mandatory feature is knowing the robot's orientation in "real time".

I have one major constraint: the Android device must be mounted with its screen along the gravity axis (that is, vertical, as when taking pictures with the device's camera).

This prevents me from using azimuth, which is the most common measure for getting a reference angle. It makes no sense to use:

SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);

then

SensorManager.getOrientation(mRotationMatrix, orientation);

since orientation[0], which is supposed to provide the azimuth, is inconsistent.

I have been searching for a long time now, but I haven't found any acceptable solution.

An "acceptable solution" is to have a system response within 100ms and with a precision close to 5°. The reference is not necessarily the magnetic North, but it must be stable over the time. It can be a starting position (but there are some drift issues...)

Are there any sensor types other than TYPE_ROTATION_VECTOR better suited to this use? I tried TYPE_GYROSCOPE, but with poor results...

My Android device is the Google Tango tablet.

Thanks for any help


Answer 1:


If you only care about azimuth in any reference frame, then the information provided by Tango is what you want -- specifically, the TangoPoseData provided by Tango.OnTangoUpdateListener.onPoseAvailable (after requesting the frame pair COORDINATE_FRAME_AREA_DESCRIPTION, COORDINATE_FRAME_DEVICE) contains a quaternion expressing the device orientation relative to the area description frame of reference. A quaternion is basically a rotation operation. If you pick a desired vector in the device's frame (for instance, a vector pointing out of the screen), the quaternion will allow you to rotate that vector into the area description (world) frame of reference. That rotated vector is the direction the original vector is pointing in the world, and you can determine azimuth by taking atan2 of the two horizontal components of the vector. For more information on Tango's reference frames, see here.
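To make that concrete, here is a minimal sketch of the vector-rotation approach. It is not Tango reference code: the variable names are mine, and it assumes pose.rotation is ordered [x, y, z, w] (as in TangoPoseData) and that Z is up in the base frame, so X/Y are the horizontal components.

// Sketch only: rotate a device-frame vector into the area description frame
// and take atan2 of its horizontal components to get an azimuth.
double x = pose.rotation[0], y = pose.rotation[1], z = pose.rotation[2], w = pose.rotation[3];

// Vector pointing out of the screen, expressed in the device frame.
double vx = 0.0, vy = 0.0, vz = 1.0;

// Rotate v by the quaternion: v' = v + w*t + (q_vec x t), where t = 2*(q_vec x v).
double tx = 2.0 * (y * vz - z * vy);
double ty = 2.0 * (z * vx - x * vz);
double tz = 2.0 * (x * vy - y * vx);
double rx = vx + w * tx + (y * tz - z * ty);
double ry = vy + w * ty + (z * tx - x * tz);
double rz = vz + w * tz + (x * ty - y * tx);

// Azimuth from the two horizontal components of the rotated vector.
double azimuthDeg = Math.toDegrees(Math.atan2(ry, rx));

Which two components count as "horizontal" depends on the base frame convention, so check it against Tango's coordinate frame documentation before relying on the sign and zero direction of this angle.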

There should be a simpler answer to this question also. The magnetometer sensors should be able to provide an absolute orientation of the tablet relative to the Earth's magnetic field, which is pointed roughly north -- this is something pretty much any Android device can do. You could construct your own orientation from the three raw magnetometer sensor values, but Android provides a much more convenient way to access the fused sensors: Sensor.TYPE_ROTATION_VECTOR from SensorManager. However, the Tango tablet has a software issue that prevents its magnetometers from working; see here for more details. It's really disappointing that a device designed specifically for navigation would have no way of determining its absolute azimuth.

If SensorManager's TYPE_ROTATION_VECTOR did work on a Tango tablet, you could use SensorManager.getRotationMatrixFromVector to get a 3x3 rotation matrix from the quaternion returned in onSensorChanged, and then get the "azimuth" of the device using SensorManager.getOrientation (as you indicated in your question). However, this approach is somewhat limiting, since it hits gimbal lock when the tablet's screen is facing upward. Picking your own vector, rotating it into world coordinates with the quaternion, and then computing azimuth from two of the three components lets you be much more specific about what exactly you intend "azimuth" to mean.
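For devices where the rotation-vector sensor does work, the path hinted at in the question looks roughly like this (a sketch; the AzimuthListener class name and structure are only for illustration):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AzimuthListener implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            float[] rotationMatrix = new float[9];
            float[] orientation = new float[3];
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);
            // orientation[0] is the azimuth in radians, but it degrades near the
            // gimbal-lock orientation described above.
            double azimuthDeg = Math.toDegrees(orientation[0]);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

You would register this listener with SensorManager.registerListener against the default TYPE_ROTATION_VECTOR sensor; again, on the Tango tablet this route is blocked by the magnetometer issue mentioned above.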




Answer 2:


@Ben: Thank you for your complete answer.

This explains why my attempts to get azimuth from TYPE_ROTATION_VECTOR were unsuccessful on the Tango tablet (the same code deployed on other devices works well).

The solution you suggested is close to the one I found empirically while browsing the Tango libraries' code:

float[] position = pose.getTranslationAsFloats();
Quaternion q = new Quaternion(pose.rotation[3], pose.rotation[0], pose.rotation[1], pose.rotation[2]);
double currentRoll = Math.toDegrees(q.getRoll());
robotCommander.updateCoordonates(position[0], position[1], currentRoll);

X is position[0], Y is position[1], and currentRoll is the current orientation of the robot.

I have some issues with this solution:

  • this is a relative measurement: the first time I get the quaternion, currentRoll is set to 0, and subsequent angles are relative to that original position. When my robot gets lost, I have no absolute measurement for repositioning.

  • there is significant drift. I haven't yet tried my code with an area description file (ADF): that is the next step (and may also help with the previous point)

  • currentRoll never reaches 180° (?)

  • I'm not 100% sure my code is the correct way to use pose data.

You mentioned that "the rotated vector is the direction the original vector is pointing in the world, and you can determine azimuth by taking atan2 of the two horizontal components of the vector". Do you have any pointer or code example for this?

Thanks for any help or suggestions.



Source: https://stackoverflow.com/questions/33367926/how-to-detect-an-absolute-rotation-of-an-android-device
