I am developing a Mac application and I would like to know the position of the finger on the trackpad when there is a touch.
Is this possible and, if so, how?
Swift 3:
I've written an extension to NSTouch that returns the trackpad-touch position relative to an NSView:
import Cocoa

extension NSTouch {
    /**
     * Returns the position of the touch mapped into the coordinate space of `view`.
     * NOTE: normalizedPosition is the relative location on the trackpad; values range
     * from 0-1 with the origin in the lower-left corner, so it is y-flipped here
     * (the view this was written for is flipped).
     * TODO: verify the touch area by stroking it with a green rect
     */
    func pos(_ view: NSView) -> CGPoint {
        let w = view.frame.size.width
        let h = view.frame.size.height
        /* Flip the touch coordinates */
        let touchPos = CGPoint(x: normalizedPosition.x, y: 1 - normalizedPosition.y)
        let deviceSize: CGSize = self.deviceSize
        let deviceRatio = deviceSize.width / deviceSize.height /* aspect ratio of the trackpad */
        let viewRatio = w / h
        var touchArea = CGSize(width: w, height: h)
        /* Uniformly fit the trackpad's aspect ratio inside the view frame */
        if deviceRatio > viewRatio { /* device is wider than view */
            touchArea.width = w
            touchArea.height = w / deviceRatio
        } else if deviceRatio < viewRatio { /* view is wider than device */
            touchArea.height = h
            touchArea.width = h * deviceRatio
        } /* else the ratios are the same */
        /* Center the touchArea within the view */
        let touchAreaPos = CGPoint(x: (w - touchArea.width) / 2, y: (h - touchArea.height) / 2)
        return CGPoint(x: touchPos.x * touchArea.width + touchAreaPos.x,
                       y: touchPos.y * touchArea.height + touchAreaPos.y)
    }
}
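For reference, here is a minimal, untested sketch of calling the extension from an NSView subclass (the TouchView class name and the logging are just for illustration):

class TouchView: NSView {
    override func awakeFromNib() {
        super.awakeFromNib()
        acceptsTouchEvents = true /* opt in to trackpad touch events */
    }
    override func touchesBegan(with event: NSEvent) {
        for touch in event.touches(matching: .began, in: self) {
            let p = touch.pos(self) /* touch position mapped into this view */
            Swift.print("touch began at \(p)")
        }
    }
}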
Here is an article I wrote about my GestureHUD class for macOS, with a link to a ready-made extension as well: http://eon.codes/blog/2017/03/15/Gesture-HUD/
I don't know if there's an ObjC interface, but you might find the C HID Class Device Interface interesting.
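If you want to explore that lower-level route, a rough, untested Swift sketch of watching digitizer input values through the IOHIDManager C API might look like the snippet below (run as a command-line tool). The digitizer/touch-pad usage matching and the plain logging are assumptions; on recent macOS versions this also needs the Input Monitoring permission, and Apple's built-in trackpads don't necessarily expose per-finger coordinates this way.

import Foundation
import IOKit.hid

/* Match HID devices that report as digitizer / touch pad. */
let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(0))
let matching: [String: Any] = [
    kIOHIDDeviceUsagePageKey: kHIDPage_Digitizer,
    kIOHIDDeviceUsageKey: kHIDUsage_Dig_TouchPad
]
IOHIDManagerSetDeviceMatching(manager, matching as CFDictionary)

/* Log every input value; a real implementation would filter for X/Y usages. */
IOHIDManagerRegisterInputValueCallback(manager, { _, _, _, value in
    let element = IOHIDValueGetElement(value)
    print("usage page:", IOHIDElementGetUsagePage(element),
          "usage:", IOHIDElementGetUsage(element),
          "value:", IOHIDValueGetIntegerValue(value))
}, nil)

IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode.rawValue)
_ = IOHIDManagerOpen(manager, IOOptionBits(0))
CFRunLoopRun()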
At the Cocoa (Obj-C) level, try the following, although remember that many users are still using mouse control.
http://developer.apple.com/mac/library/documentation/cocoa/conceptual/EventOverview/HandlingTouchEvents/HandlingTouchEvents.html
Your view needs to be set to accept touches ([self setAcceptsTouchEvents:YES]). When you get a touch event like -touchesBeganWithEvent:, you can figure out where the finger lies by looking at its normalizedPosition (range is [0.0, 1.0] x [0.0, 1.0]) in light of its deviceSize in big points (there are 72 bp per inch). The lower-left corner of the trackpad is treated as the zero origin.
So, for example:
- (id)initWithFrame:(NSRect)frameRect {
    self = [super initWithFrame:frameRect];
    if (!self) return nil;

    /* You need to set this to receive any touch event messages. */
    [self setAcceptsTouchEvents:YES];

    /* You only need to set this if you actually want resting touches.
     * If you don't, a touch will "end" when it starts resting and
     * "begin" again if it starts moving again. */
    [self setWantsRestingTouches:YES];

    return self;
}
/* One of many touch event handling methods. */
- (void)touchesBeganWithEvent:(NSEvent *)ev {
    NSSet *touches = [ev touchesMatchingPhase:NSTouchPhaseBegan inView:self];

    for (NSTouch *touch in touches) {
        /* Once you have a touch, getting the position is dead simple. */
        NSPoint fraction = touch.normalizedPosition;
        NSSize whole = touch.deviceSize;
        NSPoint wholeInches = {whole.width / 72.0, whole.height / 72.0};
        NSPoint pos = wholeInches;
        pos.x *= fraction.x;
        pos.y *= fraction.y;
        NSLog(@"%s: Finger is touching %g inches right and %g inches up "
              @"from lower left corner of trackpad.", __func__, pos.x, pos.y);
    }
}
(Treat this code as an illustration, not as tried and true, battle-worn sample code; I just wrote it directly into the comment box.)