Question
I am at a loss.
I am trying to evaluate the rotation vector sensor to figure out which way my device is facing. Basically, what I need is the data for an artificial horizon (direction in the sense of east, north, ... would be a nice extra but is not strictly needed.)
I don't understand any of it. None of the formulas I've tried work, or even come close.
I should be able to tell if my display is facing up or down, and at which angle it is doing that, right? (Is the phone level, at a 30 degree angle, etc.?)
At this stage I'm not even sure if my questions make any sense.
A quick and easy solution would be nice, but if there's anything out there that would help me understand what I am trying to do I'd appreciate it, too.
Answer 1:
From what you're saying I am inferring that you have a 3-axis MEMS IMU (Inertial Measurement Unit) and perhaps a digital magnetometer.
If this is not the case and you have only a 3-axis digital gyroscope, then it is not possible to know the 'absolute' orientation of the device (with respect to North, East, Down - NED), only the orientation relative to the one the device had when the program began running. To get the NED orientation, additional sensors are required, such as an accelerometer, magnetometer or GPS (amongst others). I will continue under the assumption that you have an accelerometer and a magnetometer.
Since you say that you only require an 'artificial horizon', I think you do not need any attitude tracking or filtering: you can take the IMU measurements and estimate the horizon directly. If it turns out that you do require attitude tracking or filtering, then this approach will give you some of the possible measurement models to use in a Kalman Filter or Complementary DCM filter (see the report and paper at http://www.x-io.co.uk/open-source-imu-and-ahrs-algorithms).
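If you do end up needing a filter, the basic idea can be sketched in a few lines. This is my own simplification, not the DCM or Kalman filters from the link above; the function name complementaryTilt is made up, it assumes gyro rates in rad/s, and it leans on the tiltFromAccel function defined further down:
function rpy = complementaryTilt(rpyPrev, gyro, accel, dt, alpha)
%% Bare-bones complementary filter sketch (hypothetical helper, not from
%% the x-io report). Blends integrated gyro rates with the accelerometer
%% tilt estimate. Assumes small roll/pitch so body rates can be integrated
%% directly as Euler angle rates.
% rpyPrev - previous [roll; pitch; yaw] estimate [rad]
% gyro    - 3x1 body angular rates [rad/s]
% accel   - 3x1 accelerometer reading
% dt      - sample period [s]
% alpha   - blend factor near 1, e.g. 0.98 (trust the gyro short-term)
rpyGyro  = rpyPrev + gyro * dt;          % short-term: integrate gyro rates
rpyAccel = tiltFromAccel(accel);         % long-term: gravity reference
rpy      = alpha * rpyGyro + (1 - alpha) * rpyAccel;
rpy(3)   = rpyGyro(3);                   % accel carries no yaw information
end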
The following are two measurement models implemented in MATLAB; with them you can get a noisy measurement of the roll, pitch and yaw with respect to an NED (North, East, Down) frame. Note that the second function will only work when the IMU is mostly flat; you can improve on this with one of the attitude tracking approaches mentioned above.
function rpy = tiltFromAccel(a)
%% Gets the roll and pitch wrt. gravity using the accelerometer
% a: 3x1 accelerometer reading (any consistent unit)
% returns [roll; pitch; yaw] in radians
rpy = zeros(3,1);
rpy(1) = atan2(a(2), sqrt(a(1).^2 + a(3).^2));
rpy(2) = atan2(-a(1), sqrt(a(2).^2 + a(3).^2));
rpy(3) = 0; % Yaw is unobservable from the accelerometer alone
end
function yaw = northFromMag(m)
%% Gets the yaw (heading) relative to magnetic North using the magnetometer
% m: 3x1 magnetometer reading
% This will be really noisy. Effects that are not modelled: hard iron,
% soft iron, non-orthogonality of the sensors, scaling and bias.
% This function assumes that the magnetometer is flat.
yaw = atan2(m(1),m(2));
end
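For reference, here is how you might call the two functions on raw readings. The numbers and the axis convention are made up for illustration: I assume the z axis points out of the screen and the accelerometer reports the reaction to gravity, so it reads roughly +9.81 m/s^2 on z when the phone lies flat, face up.
a   = [0; 0; 9.81];            % phone flat on a table, screen up
m   = [20; 5; -40];            % made-up magnetometer reading in uT
rpy = tiltFromAccel(a);        % [roll; pitch; yaw] in radians
yaw = northFromMag(m);         % heading in radians, only valid when flat
fprintf('roll %.1f deg, pitch %.1f deg, yaw %.1f deg\n', ...
        rpy(1)*180/pi, rpy(2)*180/pi, yaw*180/pi);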
Once you have the roll and pitch angles it is easy to perform some simple logic as you have described IF the rotations are done one at a time; if rotations are about multiple axes simultaneously then it is more complicated. This is a drawback of using Euler angles as the representation of attitude; for more information, see https://en.wikipedia.org/wiki/Euler_angles#Proper_Euler_angles.
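As a concrete example of that simple logic (my own sketch, under the same axis assumptions as above): the sign of the z accelerometer component tells you whether the screen faces up or down, and the roll and pitch magnitudes tell you how far from level the phone is, e.g. whether it is within 30 degrees of flat as in your question.
a   = [0.5; -0.3; 9.7];                  % example accelerometer reading
rpy = tiltFromAccel(a) * 180/pi;         % roll and pitch in degrees
faceUp  = a(3) > 0;                      % positive z reaction -> screen up
isLevel = abs(rpy(1)) < 30 && abs(rpy(2)) < 30;   % within 30 deg of flat
fprintf('face up: %d, roughly level: %d, roll %.1f deg, pitch %.1f deg\n', ...
        faceUp, isLevel, rpy(1), rpy(2));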
Hope this is what you're looking for.
Source: https://stackoverflow.com/questions/26808451/device-position-from-rotation-vector-with-quaternions