Question
I have a problem tracking multi-touch input (at least two simultaneous finger touches) on the following frame device.
White circles are LEDs and black circles are receivers. When the user moves fingers inside this frame, we can analyze which receivers received light from the LEDs and which did not. Based on that, we need to somehow track the movements of the fingers.
The first problem is that we have separate x and y coordinates. What is an effective way to combine them? The second problem concerns analyzing the coordinates when two fingers are close to each other. How do we distinguish between them?
I found that k-means clustering can be useful here. What other algorithms should I look at more carefully to handle this task?
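To make the first problem concrete, here is a rough sketch (Python, all names purely illustrative) of what I mean by combining the separate x and y readings: group the blocked receivers on each axis into shadow intervals and take every (x, y) combination of interval centers as a candidate point. With two fingers this already exposes the second problem, because some of the candidates are "ghost" points where no finger actually is.

```python
# A minimal sketch, assuming each frame gives two boolean arrays, one per axis,
# where True means "this receiver was blocked". All names are illustrative.
import numpy as np

def shadow_centers(blocked):
    """Group consecutive blocked receivers into shadow intervals and
    return the center index of each interval."""
    centers = []
    start = None
    for i, b in enumerate(blocked):
        if b and start is None:
            start = i
        elif not b and start is not None:
            centers.append((start + i - 1) / 2.0)
            start = None
    if start is not None:
        centers.append((start + len(blocked) - 1) / 2.0)
    return centers

def candidate_points(blocked_x, blocked_y):
    """Combine the per-axis shadow centers into (x, y) candidates.
    With two fingers this can yield up to four candidates, two of which
    are 'ghost' points that no finger actually occupies."""
    xs = shadow_centers(blocked_x)
    ys = shadow_centers(blocked_y)
    return [(x, y) for x in xs for y in ys]

# Example: receivers 3-4 and 9 blocked on x, receivers 2 and 7-8 blocked on y.
bx = np.zeros(16, dtype=bool); bx[[3, 4, 9]] = True
by = np.zeros(16, dtype=bool); by[[2, 7, 8]] = True
print(candidate_points(bx, by))   # four candidates, including two ghosts
```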
Answer 1:
As you point out in your diagram, with two fingers different finger positions can give the same sensor readings, so you may have some irreducible uncertainty, unless you find some clever way to use previous history or something.
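For what it's worth, a minimal sketch of that "use previous history" idea (Python, names purely illustrative, not from the question): keep the finger positions tracked in the previous frame and pick the candidate set that lies closest to them, which discards the ghost points most of the time.

```python
# Resolve ghost candidates by nearest-neighbour matching against the previous
# frame. Brute force over permutations is fine for two fingers.
from itertools import permutations
import math  # math.dist requires Python 3.8+

def pick_closest(candidates, previous):
    """Choose len(previous) candidates that minimize the total distance
    to the previously tracked finger positions."""
    best, best_cost = None, float("inf")
    for combo in permutations(candidates, len(previous)):
        cost = sum(math.dist(c, p) for c, p in zip(combo, previous))
        if cost < best_cost:
            best, best_cost = list(combo), cost
    return best

previous = [(3.4, 2.1), (9.2, 7.3)]
candidates = [(3.5, 2.0), (3.5, 7.5), (9.0, 2.0), (9.0, 7.5)]
print(pick_closest(candidates, previous))   # [(3.5, 2.0), (9.0, 7.5)]
```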
Do you actually need to know the position of each finger? Is this the right abstraction for this situation? Perhaps you could get a reasonable user interface if you limited yourself to one finger for precise pointing, and recognised e.g. gesture commands by some means that did not use an intermediate representation of finger positions. Can you find gestures that can be easily distinguished from each other given the raw sensor readings?
I suppose the stereotypical computer science approach to this would be to collect the sensor readings from different gestures, throw them at some sort of machine learning box, and hope for the best. You might also try drawing graphs of how the sensor readings change over time for the different gestures and looking at them to see if anything obvious stands out. If you do want to try out machine learning algorithms, http://www.cs.waikato.ac.nz/ml/weka/ might be a good start.
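If it helps, here is a rough sketch of that "machine learning box" idea in Python, with scikit-learn standing in for Weka purely for illustration; the fake data, window size, and classifier choice are all assumptions, not something from the question.

```python
# A rough sketch: each training sample is a fixed-length window of raw receiver
# readings flattened into a feature vector, labelled with a gesture name.
# scikit-learn is used here only as a stand-in for Weka.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Fake data standing in for recorded gestures: 40 windows of 10 frames,
# each frame being 32 receiver readings (16 per axis), labelled 0 or 1.
X = rng.random((40, 10 * 32))
y = rng.integers(0, 2, size=40)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# At runtime, flatten the latest window the same way and classify it.
window = rng.random((10, 32)).reshape(1, -1)
print(clf.predict(window))
```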
Source: https://stackoverflow.com/questions/8519477/tracking-multi-touch-movements-inside-the-frame-with-transmitters-and-receivers