Say I have this code:
#import <UIKit/UIKit.h>

@interface MyView : UIView
@end

@implementation MyView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // ... handle the moved touches here ...
}

@end
May I suggest using this approach instead:
UITapGestureRecognizer *tapRecognizer =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(tapDetected:)];
tapRecognizer.numberOfTapsRequired = 1;
tapRecognizer.numberOfTouchesRequired = 1;
[self.view addGestureRecognizer:tapRecognizer];
You can do this from the superview; there's no need to make custom UI elements.
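For completeness, the action method named in the @selector above could be sketched like this (the method body and the logging are my assumptions, not part of the original answer; only the name tapDetected: comes from the snippet):

```objc
// Hypothetical action method matching @selector(tapDetected:) above.
- (void)tapDetected:(UITapGestureRecognizer *)recognizer {
    // locationInView: reports the tap position in the given view's coordinates.
    CGPoint tapPoint = [recognizer locationInView:recognizer.view];
    NSLog(@"Tap detected at %@", NSStringFromCGPoint(tapPoint));
}
```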
Thanks for the suggestions and informative replies. I ended up just using the explanation shown on this page (under "Trick 1: Emulating Photos app swiping/zooming/scrolling with a single UIScrollView").
Instinct would say "subclass UIButton", but UIButton is actually a class cluster (i.e., the actual type of the object transparently changes depending on what kind of button you want to make), which makes it supremely difficult to subclass. Short of rewriting UIButton, there's not much you can do with that approach.
When I needed to solve an almost identical problem, I came up with a view that uses composition to display a proxy UIView, but still allow for touch interception. Here's how it works:
@interface MyView : UIView {
    UIView *visibleView;
}

- (id)initWithFrame:(CGRect)frame controlClass:(Class)controlClass;

@end

@implementation MyView

- (id)initWithFrame:(CGRect)frame controlClass:(Class)controlClass {
    if ((self = [super initWithFrame:frame])) {
        CGRect visibleViewFrame = frame;
        visibleViewFrame.origin = CGPointZero;
        // Instantiate the proxied control (e.g. UIButton) as a subview.
        visibleView = [[controlClass alloc] initWithFrame:visibleViewFrame];
        // Disable interaction so touches fall through to this container.
        [visibleView setUserInteractionEnabled:NO];
        [self addSubview:visibleView];
        [self setUserInteractionEnabled:YES];
        [self setBackgroundColor:[UIColor clearColor]];
    }
    return self;
}

- (void)dealloc {
    [visibleView removeFromSuperview];
    [visibleView release], visibleView = nil;
    [super dealloc];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // someCondition stands in for whatever test decides whether to forward.
    if (someCondition == YES) {
        [visibleView touchesBegan:touches withEvent:event];
    }
}

// ...and similar overrides for touchesMoved:, touchesEnded:, etc.

@end
The basic idea is that you're embedding the button (or whatever) into a container class (similar to what you described in your question). However, you're disabling interaction on the button, which means that when a user taps the button, the tap event will "fall through" the button and into the button's containing superview (in this case, your MyView instance). From there, you can process the touch appropriately. If you want the button to "respond" to the touch, just allow it to do so by sending the appropriate UIResponder message. (You might need to re-enable userInteractionEnabled on the visibleView first, though. I'm unsure of that.)
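One way to sketch that last step: if the proxied view is a UIControl subclass, you can fire its registered actions directly. sendActionsForControlEvents: is a standard UIControl method, but using it here is my assumption, not something the original answer specifies:

```objc
// Hypothetical forwarding from MyView's own touch handler.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([visibleView isKindOfClass:[UIControl class]]) {
        // Fires any target/action pairs registered for Touch Up Inside,
        // without needing to re-enable userInteractionEnabled.
        [(UIControl *)visibleView
            sendActionsForControlEvents:UIControlEventTouchUpInside];
    }
}
```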
As I stated above, I have used this approach (not this exact implementation, but this pattern) quite successfully, when I needed to have a button on the interface, but also capture the UITouch object itself.
I believe you would have to subclass UIButton in this case and override the responses to UIResponder's touch and motion events. You could then forward them to whatever object you wanted, assuming the UIButton subclass could reach it.
In this case you would pass them to the superview of the UIButton.
@interface MyButton : UIButton
@end

@implementation MyButton

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self superview] touchesMoved:touches withEvent:event];
}

@end
See the UIResponder documentation.
I'd be interested in seeing other solutions, but the easiest way I know of is to override either of these two UIView methods:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
These methods get called to determine whether the touch is within the bounds of the view or any of its subviews, so they're a good place to intercept the touch and then pass it on. Just do whatever you want, and then
return [super hitTest:point withEvent:event];
or
return [super pointInside:point withEvent:event];
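Putting that together, a minimal sketch of such a subclass might look like this (the class name InterceptingView and the NSLog body are my additions for illustration; the pattern of deferring to super is from the answer above):

```objc
#import <UIKit/UIKit.h>

// Hypothetical UIView subclass that observes touches without consuming them.
@interface InterceptingView : UIView
@end

@implementation InterceptingView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Inspect the touch location before normal hit-testing proceeds.
    NSLog(@"hitTest at %@", NSStringFromCGPoint(point));
    // Defer to the default behavior so subviews still receive the touch.
    return [super hitTest:point withEvent:event];
}

@end
```

Because [super hitTest:point withEvent:event] is still returned, the rest of the responder chain behaves exactly as before; the override only adds a side effect.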