In some apps, it makes sense for the app to directly handle keyboard shortcuts which are otherwise bound to system-wide combinations. For example, ⌘-Space (normally Spotlight) or ⌘-Tab (normally the application switcher).
I solved this ages ago but I only just noticed I never posted it here. The answer ended up involving CGSSetGlobalHotKeyOperatingMode(). This is not a public API, but there are a number of Mac App Store apps which use it by obfuscating the function name and looking it up dynamically. Apple doesn't seem to mind. The API is pretty straightforward to use, and there's plenty of open example source code floating about.
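For reference, the dynamic lookup usually looks something like the sketch below. The CGSConnectionID typedef, the mode values, and the helper name are assumptions based on commonly circulated descriptions of this private API, not anything documented by Apple, so treat it as illustrative only.

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>   // CGError; pulls in CoreGraphics
#import <dlfcn.h>

typedef int CGSConnectionID;                           // assumption: commonly reported typedef
typedef CGError (*CGSSetGlobalHotKeyOperatingModeFn)(CGSConnectionID cid, int mode);
typedef CGSConnectionID (*CGSMainConnectionIDFn)(void);

static void SetSystemHotKeysEnabled(BOOL enabled) {
    // Assemble the symbol names at runtime so they never appear verbatim in the binary.
    NSString *setName  = [@[@"CGSSetGlobalHotKey", @"OperatingMode"] componentsJoinedByString:@""];
    NSString *connName = [@[@"CGSMain", @"ConnectionID"] componentsJoinedByString:@""];

    // Assumes the symbols are exported by the CoreGraphics framework already linked into the app.
    CGSSetGlobalHotKeyOperatingModeFn setMode =
        (CGSSetGlobalHotKeyOperatingModeFn)dlsym(RTLD_DEFAULT, setName.UTF8String);
    CGSMainConnectionIDFn mainConnection =
        (CGSMainConnectionIDFn)dlsym(RTLD_DEFAULT, connName.UTF8String);
    if (setMode && mainConnection) {
        // Commonly reported mode values: 0 = system hot keys enabled, 1 = disabled.
        setMode(mainConnection(), enabled ? 0 : 1);
    }
}

You would call SetSystemHotKeysEnabled(NO) while your window has focus and SetSystemHotKeysEnabled(YES) when it resigns, so the system-wide shortcuts come back as soon as the user leaves your app.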
Ok, so the Cocoa event methods and Quartz event taps are out because they either require root or accessibility access, or do not catch events before the dock does.
Carbon's PushSymbolicHotKeyMode is out because, per the docs, it requires accessibility access.
Carbon's RegisterEventHotKey is probably out because Apple doesn't seem to allow it (see my link in a comment on the question). In any case, I tested it, and it can't be used to catch Command+Tab.
I made a quick proof-of-concept of how this can work, but YMMV:
Use the KeyboardWatcher example class from this answer; you will need to link IOKit. Handle_DeviceEventCallback will give you the keys that are pressed, and you can obviously modify this to your needs.
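If the linked KeyboardWatcher class isn't to hand, here is a minimal sketch of the same IOKit idea: an IOHIDManager matching keyboards, with an input-value callback. The function names are placeholders, not the exact code from that answer.

#import <Foundation/Foundation.h>
#import <IOKit/hid/IOHIDLib.h>

static void Handle_DeviceEventCallback(void *context, IOReturn result,
                                       void *sender, IOHIDValueRef value) {
    IOHIDElementRef element = IOHIDValueGetElement(value);
    if (IOHIDElementGetUsagePage(element) != kHIDPage_KeyboardOrKeypad) return;
    uint32_t usage = IOHIDElementGetUsage(element);       // HID usage code of the key
    BOOL isDown = IOHIDValueGetIntegerValue(value) != 0;  // pressed vs. released
    NSLog(@"key usage 0x%x %@", usage, isDown ? @"down" : @"up");
}

static IOHIDManagerRef StartKeyboardWatcher(void) {
    IOHIDManagerRef manager = IOHIDManagerCreate(kCFAllocatorDefault, kIOHIDOptionsTypeNone);
    // Match only keyboard devices.
    NSDictionary *keyboards = @{ @kIOHIDDeviceUsagePageKey : @(kHIDPage_GenericDesktop),
                                 @kIOHIDDeviceUsageKey     : @(kHIDUsage_GD_Keyboard) };
    IOHIDManagerSetDeviceMatching(manager, (__bridge CFDictionaryRef)keyboards);
    IOHIDManagerRegisterInputValueCallback(manager, Handle_DeviceEventCallback, NULL);
    IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetMain(), kCFRunLoopDefaultMode);
    IOHIDManagerOpen(manager, kIOHIDOptionsTypeNone);
    return manager;
}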
Then use SetSystemUIMode to block the task switcher and Spotlight; you will need to link Carbon:
SetSystemUIMode(kUIModeContentSuppressed, kUIOptionDisableProcessSwitch);
Note that this will only work while your app is in the foreground (probably what you want). I set this on my view using a tracking rectangle, so it only takes effect when the mouse is over my view (like in Remotix):
#import <Carbon/Carbon.h>   // for SetSystemUIMode()

- (void)viewDidLoad {
    [super viewDidLoad];
    // Only change the UI mode while the mouse is over this view.
    NSTrackingArea *trackingArea =
        [[NSTrackingArea alloc] initWithRect:[self.view bounds]
                                     options:(NSTrackingMouseEnteredAndExited | NSTrackingActiveAlways)
                                       owner:self
                                    userInfo:nil];
    [self.view addTrackingArea:trackingArea];
}

- (void)mouseEntered:(NSEvent *)theEvent {
    // Suppress Cmd-Tab (process switcher), Spotlight, and related system UI.
    SetSystemUIMode(kUIModeContentSuppressed, kUIOptionDisableProcessSwitch);
}

- (void)mouseExited:(NSEvent *)theEvent {
    // Restore normal system behaviour when the mouse leaves the view.
    SetSystemUIMode(kUIModeNormal, 0);
}
Remotix seems to link Carbon and IOKit, but I can't see if they have the USB entitlement (I tried the demo, not the App Store version). It's possible they are doing something like this.
The normal way to achieve this is by installing a Quartz event tap. However, to receive events targeted at other applications, you need (as you say) either to be root or to have accessibility access enabled for your app.
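For illustration, a session-level tap for key-down events looks roughly like this (the callback name is just a placeholder). It will only see other applications' events if the process is root or has been granted accessibility access; creating the tap will typically fail without that access.

#import <ApplicationServices/ApplicationServices.h>

static CGEventRef MyTapCallback(CGEventTapProxy proxy, CGEventType type,
                                CGEventRef event, void *userInfo) {
    // Inspect the key event here; return NULL instead to swallow it.
    return event;
}

static void InstallKeyTap(void) {
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionDefault,
                                         CGEventMaskBit(kCGEventKeyDown),
                                         MyTapCallback,
                                         NULL);
    if (!tap) return;   // no root privileges / no accessibility access
    CFRunLoopSourceRef source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
}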
It does not seem possible to use an event tap under the current sandboxing rules. This is confirmed in the developer forum. The link is login-only, but to quote from the thread:
Is there any chance to handle events coming from the media keys and prevent iTunes from launching? Before sandboxing it was possible by creating a CGEventTap, but now the sandbox denies hid-control.
No, this is not currently possible within App Sandbox.
I'm not sure of another way to do this, and I'd be interested to know which apps in the App Store can.
VMware Fusion is clearly not sandboxed, and Apple's own apps are exempt from the rules. Remember that sandboxing is only enforced for apps added to the App Store after it was introduced in 2012; apps added before that date are not required to be sandboxed. See this answer.
For others looking for a solution for a full-screen app, or if you're willing to take over the full screen, you can use CGDisplayCapture. It will cause your app to capture all keyboard input and prevent even Spotlight and app switching from being invoked from the keyboard.
import CoreGraphics   // CGDisplayCapture and CGDisplayRelease live in CoreGraphics

// Disable keyboard events for all apps except yours by capturing the main display.
CGDisplayCapture(CGMainDisplayID())

// Re-enable keyboard events for other apps by releasing the display.
CGDisplayRelease(CGMainDisplayID())
Note: until the display is released, the app will not receive window/application activate/resign events, so perhaps you can use mouse tracking to release the display while your app is active. Also, even the screensaver/lock screen will be affected, so make sure to deactivate the capture as needed.