Question
I want to know when a user has touched anywhere on the screen of my app.
I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, as I am also reloading a table automatically, so it gets triggered when that occurs.
I have also tried a gesture recognizer with the following code, but this only recognises touches on the view itself, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recogniser or extra code for this to every button and segmented control I have on the screen.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];
- (void)tapOnView:(UITapGestureRecognizer *)sender
{
    // do something
}
I have also tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.
I was wondering if there is any way I could achieve this. I was hoping that I might be able to recognise the type of event from within nextResponder, and then detect whether it came from a button, for example.
EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness to its original level once the app is touched. I need this feature to occur on only one of my view controllers.
Answer 1:
As mentioned by Ian MacDonald, using hitTest:withEvent: is a great solution for detecting user interaction on an app-wide scale, including when buttons, text fields, etc. are selected.
My solution was to subclass UIWindow and implement hitTest:withEvent:.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // do your stuff here
    // return nil if you want to prevent interaction with UI elements
    return [super hitTest:point withEvent:event];
}
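For completeness, here is a minimal sketch of how such a subclass could be wired up; the class name TouchSensingWindow and the notification name are assumptions for illustration, not part of the original answer:

#import <UIKit/UIKit.h>

// Assumed notification name; pick whatever fits your app.
static NSString * const kUserTouchedScreenNotification = @"kUserTouchedScreenNotification";

@interface TouchSensingWindow : UIWindow
@end

@implementation TouchSensingWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Every touch delivered to this window is hit-tested here first, so this is
    // one single place to notice "the user touched something somewhere".
    [[NSNotificationCenter defaultCenter] postNotificationName:kUserTouchedScreenNotification
                                                        object:self];
    // Returning the normal result keeps buttons, table views, etc. working as before.
    return [super hitTest:point withEvent:event];
}

@end

Make the app use this window (for example by returning an instance of it from the window property in the app delegate), and have the view controller that handles dimming observe the notification. Since hitTest:withEvent: can be called more than once per touch, treat the notification as a "reset the dim timer" signal rather than a one-shot event.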
Answer 2:
You could attach your UITapGestureRecognizer to your [[UIApplication sharedApplication] keyWindow].
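A rough sketch of that first suggestion (the tapAnywhere: selector is a made-up name here); cancelsTouchesInView is turned off so the controls underneath still receive their touches:

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(tapAnywhere:)];
tap.cancelsTouchesInView = NO; // let buttons, cells, etc. receive their touches as usual
[[[UIApplication sharedApplication] keyWindow] addGestureRecognizer:tap];

Note that a control's default action may still prevent a window-level recognizer from firing for taps on that control, which is one reason the hitTest: approaches are more reliable.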
Alternatively, you could override hitTest: of your root UIView.
Is there a particular task you are hoping to accomplish? There may be a better way than assigning an "anywhere" gesture.
Edit: Use hitTest:.
@interface PassthroughView : UIView
@property (readonly) id target;
@property (readonly) SEL selector;
@end

@implementation PassthroughView

- (void)setTarget:(id)target selector:(SEL)selector {
    _target = target;
    _selector = selector;
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Report the touch, then return nil so the touch falls through
    // to whatever view is underneath this one.
    [_target performSelector:_selector];
    return nil;
}

@end
@implementation YourUIViewController {
    PassthroughView *anytouchView;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // Add this at the end so it's above all other views.
    anytouchView = [[PassthroughView alloc] initWithFrame:self.view.bounds];
    [anytouchView setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
    [anytouchView setTarget:self selector:@selector(undim)];
    [anytouchView setHidden:YES];
    [self.view addSubview:anytouchView];
}

- (void)undim {
    [anytouchView setHidden:YES];
}

- (void)dim {
    [anytouchView setHidden:NO];
}

@end
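If dim and undim should also cover the brightness requirement from the question, they could look something like this sketch (the savedBrightness ivar and the 0.1 dim level are assumptions, not part of the original answer):

// Assumed ivar: CGFloat savedBrightness;
- (void)dim {
    savedBrightness = [UIScreen mainScreen].brightness; // remember the user's level
    [UIScreen mainScreen].brightness = 0.1;             // assumed dim level
    [anytouchView setHidden:NO];                        // start catching the next touch
}

- (void)undim {
    [UIScreen mainScreen].brightness = savedBrightness; // restore the saved level
    [anytouchView setHidden:YES];                       // stop intercepting touches
}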
Answer 3:
Your edit adds more clarity to your question.
The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness back to the original level once the app is touched.
Since you are controlling the screen brightness, you can present a transparent view controller on top of your root controller before dimming the screen. It does only one job: listen for a tap with a tap gesture recognizer. On that tap, dismiss the view controller and restore the brightness to its previous level.
By doing so you don't have to worry about buttons being tapped, as they will sit below the transparent view controller. And since it is a whole new view controller on top of the stack, you don't have to modify your existing code either. A rough sketch of this idea follows.
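A minimal sketch of that overlay, using hypothetical names (DimOverlayViewController, onTap) and an assumed dim level that are not from the original answer:

// Hypothetical overlay controller: catches one tap anywhere and reports it.
@interface DimOverlayViewController : UIViewController
@property (nonatomic, copy) void (^onTap)(void); // called when the user touches the screen
@end

@implementation DimOverlayViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor clearColor]; // fully transparent, still receives touches
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)sender {
    if (self.onTap) self.onTap();
    [self dismissViewControllerAnimated:NO completion:nil];
}

@end

// Presenting it just before dimming, from the view controller that needs this feature:
DimOverlayViewController *overlay = [[DimOverlayViewController alloc] init];
overlay.modalPresentationStyle = UIModalPresentationOverFullScreen; // keep the UI visible underneath
overlay.onTap = ^{
    [UIScreen mainScreen].brightness = 1.0; // or whatever level you saved before dimming
};
[self presentViewController:overlay animated:NO completion:^{
    [UIScreen mainScreen].brightness = 0.1; // assumed dim level
}];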
Answer 4:
OK, I have had a similar problem before.
As I remember, I subclassed UIWindow for full-screen detection and made it the first responder.
Then I overrode the touch handling in the subclass.
You can also use code to identify the control that has been touched:
#import <QuartzCore/QuartzCore.h>

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.view setMultipleTouchEnabled:YES];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Enumerate over all the touches
    [touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
        // Get a single touch and its location
        UITouch *touch = obj;
        CGPoint touchPoint = [touch locationInView:self.view];
        ...
    }];
}
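The elided part is left as in the original; purely as an illustration of identifying the touched control, one might do something like this inside the enumeration block:

// Which view did the touch land on? (illustrative, not from the original answer)
UIView *touchedView = touch.view;                        // the view the touch was delivered to
// or: [self.view hitTest:touchPoint withEvent:event];   // deepest descendant under the point
if ([touchedView isKindOfClass:[UIButton class]]) {
    // The touch landed on a button; handle or ignore it as needed.
}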
To disable the locking of the screen I used the code below:
[[UIApplication sharedApplication] setIdleTimerDisabled:YES];
I used the following calls to dim or restore the screen brightness:
[[UIScreen mainScreen] setBrightness:0.0f]; //and
[[UIScreen mainScreen] setBrightness:1.0f];
Source: https://stackoverflow.com/questions/26364799/detecting-a-touch-anywhere-on-the-screen