How to Calibrate Android Accelerometer & Reduce Noise, Eliminate Gravity

Asked by 佛祖请我去吃肉 · 2021-02-01 23:38

So, I've been struggling with this problem for some time, and haven't had any luck tapping the wisdom of the internets and related SO posts on the subject.

I am writin

3 Answers
  •  轻奢々 · 2021-02-02 00:07

    How do you deal with jitteriness? You smooth the data. Instead of using the raw sequence of values from the sensor directly, you average them on an ongoing basis, and the new sequence formed becomes the values you use. This moves each jittery value closer to the moving average. Averaging necessarily gets rid of quick variations between adjacent values, which is why people use the term low (frequency) pass filtering: data that originally may have varied a lot per sample (or unit of time) now varies more slowly.

    E.g., instead of using the values 10, 6, 7, 11, 7, 10 directly, you can average them in many ways. For example, we can compute the next value from an equal weight of the running average (i.e., of your last processed data point) and the next raw data point. Using a 50-50 mix for the numbers above, we'd get 10, 8, 7.5, 9.25, 8.125, 9.0625. This new sequence, our processed data, would be used in lieu of the noisy data. And we could of course use a different mix than 50-50.
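
    As a rough sketch (my own addition, not code from the answer), the 50-50 running average above can be written as a small exponential low-pass filter in Java. The class name, the seeding of the average with the first sample, and the ALPHA value are illustrative assumptions:

        // Minimal exponential moving average over 3-axis accelerometer samples.
        public final class LowPassFilter {
            // 0.5f reproduces the 50-50 mix from the example above; the Android
            // docs' gravity-isolation sample uses a value closer to 0.8 instead.
            private static final float ALPHA = 0.5f;

            private final float[] smoothed = new float[3];
            private boolean initialized = false;

            // Feed one raw sample (x, y, z), e.g. SensorEvent.values;
            // returns the current smoothed sample.
            public float[] step(float[] raw) {
                if (!initialized) {
                    // Seed the running average with the first raw value, matching
                    // the 10, 8, 7.5, ... sequence above.
                    System.arraycopy(raw, 0, smoothed, 0, 3);
                    initialized = true;
                } else {
                    for (int i = 0; i < 3; i++) {
                        smoothed[i] = ALPHA * smoothed[i] + (1 - ALPHA) * raw[i];
                    }
                }
                return smoothed.clone();
            }
        }

    One common use of this smoothed signal (again my assumption, in line with the gravity-isolation pattern in the Android motion sensor docs): since gravity is the slowly varying part of the raw accelerometer signal, raw[i] - smoothed[i] approximates linear acceleration with gravity removed.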

    As an analogy, imagine you are reporting where a certain person is located using only your eyesight. You have a good view of the wider landscape, but the person is engulfed in fog. You will see pieces of the body that catch your attention -- a moving left hand, a right foot, a shine off eyeglasses, etc. -- that are jittery, BUT each value is fairly close to the true center of mass. If we run some sort of running average, we'd get values that approach the center of mass of the target as it moves through the fog, and that are in effect more accurate than the values we (the sensor) reported, which were made noisy by the fog.

    Now it seems like we are losing potentially interesting data to get a boring curve. It makes sense, though. If we are trying to recreate an accurate picture of the person in the fog, the first task is to get a good, smooth approximation of the center of mass. To this we can then add data from a complementary sensor/measuring process. For example, a different person might be up close to the target. That person might provide a very accurate description of the body movements, but might be in the thick of the fog and not know where the target is ending up overall. This is the complementary position to what we first got -- the second data set gives accurate detail without a sense of the approximate location.

    The two pieces of data would be stitched together. We'd low-pass the first set (like your problem presented here) to get a general location free of noise. We'd high-pass the second set to get the detail without unwanted, misleading contributions to the general position. We use high-quality global data and high-quality local data, each set optimized in complementary ways and kept from corrupting the other (through the two filterings).

    Specifically, we'd mix gyroscope data -- data that is accurate in the local detail of the "trees" but gets lost in the forest (drifts) -- into the data discussed here (from the accelerometer), which sees the forest well but not the trees.

    To summarize, we low-pass the data from sensors that are jittery but stay close to the "center of mass". We combine this smooth base value with data that is accurate in detail but drifts, so this second set is high-pass filtered. We get the best of both worlds as we process each group of data to clean it of its incorrect aspects. For the accelerometer, we smooth/low-pass the data by running some variation of a running average on its measured values. If we were treating the gyroscope data, we'd do math that effectively keeps the detail (accepts deltas) while rejecting the accumulated error that would otherwise grow and corrupt the accelerometer's smooth curve. How? Essentially, we use the actual gyro values (not averages), but only a small number of samples (of deltas) apiece when deriving each final clean value. Using a small number of deltas keeps the overall curve mostly along the same averages tracked by the low-pass stage (the averaged accelerometer data), which forms the bulk of each final data point.
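
    This kind of fusion is usually called a complementary filter. Below is a minimal sketch of one for a single tilt angle (my own illustration, not code from the answer); the 0.98/0.02 split, the axis naming, and the atan2-based tilt estimate from the accelerometer are assumptions chosen for clarity:

        public final class ComplementaryFilter {
            // Weight of the integrated gyro path; the remaining 2% gently pulls
            // the estimate toward the accelerometer's long-term average so that
            // gyro drift cannot build up.
            private static final float GYRO_WEIGHT = 0.98f;

            private float angle = 0f; // fused tilt angle, in radians

            // gyroRate - angular velocity around the axis of interest, in rad/s
            // accelA   - accelerometer axis that swings with the tilt
            // accelB   - accelerometer axis that gravity mostly lies along
            // dt       - time since the previous sample, in seconds
            public float update(float gyroRate, float accelA, float accelB, float dt) {
                // Accelerometer: absolute but noisy tilt (the low-frequency "center of mass").
                float accelAngle = (float) Math.atan2(accelA, accelB);
                // Gyroscope: smooth short-term detail, but the integration drifts over time.
                float gyroAngle = angle + gyroRate * dt;
                // Blend: mostly the gyro delta (high pass), lightly corrected by the
                // accelerometer average (low pass).
                angle = GYRO_WEIGHT * gyroAngle + (1 - GYRO_WEIGHT) * accelAngle;
                return angle;
            }
        }

    In a SensorEventListener you would call update() each time new gyroscope and accelerometer samples arrive, computing dt from the event timestamps.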
