UIButton function with target nil doesn't get called? - ios

I've created lots of my own "CustomUIButton" instances in a for loop in my view controller.
In this "CustomUIButton" class I've implemented a UIGestureRecognizer like this:
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // custom things.
        UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)];
        longPress.minimumPressDuration = 1.0;
        [self addGestureRecognizer:longPress];
        [longPress release];
    }
    return self;
}
- (void)handleLongPress:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Long press Ended");
    }
    else {
        NSLog(@"Long press detected.");
        // Do something
    }
}
If I init the target with self, the handleLongPress: method in this class gets called, which is fine. If I init the target with nil, it should check the parent view controller, right?
Any idea why a method with the same name in my view controller won't be called? (For this test I commented out the long-press method in the button class.)

In the docs for UIGestureRecognizer's initWithTarget:action: method, for the target parameter it says:
An object that is the recipient of action messages sent by the receiver when it recognizes a gesture. nil is not a valid value.
Note the last sentence.
The docs also say this which should explain why it doesn't work:
A gesture recognizer does not participate in the view's responder chain.
You must specify a value for target.
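Since nil is not a valid target and the recognizer never consults the responder chain, pass the object you want notified in explicitly when you create the button. A minimal sketch, assuming the view controller creates the buttons; the initWithFrame:longPressTarget: initializer and its naming are hypothetical additions to the CustomUIButton class:
// In CustomUIButton: forward the long press to whatever target was handed in.
- (id)initWithFrame:(CGRect)frame longPressTarget:(id)target
{
    self = [super initWithFrame:frame];
    if (self) {
        UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:target action:@selector(handleLongPress:)];
        longPress.minimumPressDuration = 1.0;
        [self addGestureRecognizer:longPress];
        [longPress release];
    }
    return self;
}
// In the view controller's for loop: the controller becomes the target,
// so its own handleLongPress: is the one that gets called.
CustomUIButton *button = [[CustomUIButton alloc] initWithFrame:frame longPressTarget:self];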

Related

Handling a gesture that needs to address multiple selectors?

I'm dealing with some very confusing code on a project. A gesture is used to trigger a method, and that method fires off a selector to a parent view to trigger another method; after some tweaks this is working fine.
The issue is that the subclass is used in a couple of different parent views, so using superview to find a selector causes a crash: the selector exists in one use, but not in the second.
How can I handle this so that it calls different selectors based on its parent view? The current setup seems pretty hacky and obviously doesn't work as it needs to. Some code below:
The reused view inits with this gesture:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(bandBypassWasPressed:)];
[tap setDelegate:self];
[tap setCancelsTouchesInView:NO];
[self addGestureRecognizer:tap];
which calls this method on itself:
- (IBAction)bandBypassWasPressed:(UITapGestureRecognizer *)sender {
    if (CGRectContainsPoint(self.bounds, [sender locationInView:self])) {
        [self.superview performSelector:@selector(bandViewOn:) withObject:self];
        [self setNeedsDisplay];
    }
}
The issue is that bandViewOn: only exists in the superview in one use of this subview, not in the other, so the call crashes the app because there is no method with that name. There is a different method I want it to call depending on its superview, which is:
- (void)lowBandBypass:(NSInteger)on {
    NSLog(@"lowBandBypass CALLED");
    _eqData.filter[1].bypass = on;
    _lowBand.on = on;
    [_lowBand setNeedsDisplay];
}
How can I handle this and resolve this odd issue?
Cheers, and I appreciate it's a bit complex!
You can use respondsToSelector: to check that the superview implements the method before calling it.
if (CGRectContainsPoint(self.bounds, [sender locationInView:self])) {
    if ([self.superview respondsToSelector:@selector(bandViewOn:)]) {
        [self.superview performSelector:@selector(bandViewOn:) withObject:self];
    }
    [self setNeedsDisplay];
}
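If each parent view should get its own message, the same check extends naturally: probe each selector and call the one the current superview actually implements. A minimal sketch; lowBandViewOn: is a hypothetical object-taking wrapper the second superview would add around lowBandBypass:, since performSelector:withObject: cannot pass a bare NSInteger:
- (IBAction)bandBypassWasPressed:(UITapGestureRecognizer *)sender {
    if (CGRectContainsPoint(self.bounds, [sender locationInView:self])) {
        if ([self.superview respondsToSelector:@selector(bandViewOn:)]) {
            // First parent view: the original path.
            [self.superview performSelector:@selector(bandViewOn:) withObject:self];
        } else if ([self.superview respondsToSelector:@selector(lowBandViewOn:)]) {
            // Second parent view: hypothetical wrapper that forwards to lowBandBypass:.
            [self.superview performSelector:@selector(lowBandViewOn:) withObject:self];
        }
        [self setNeedsDisplay];
    }
}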

IBAction not fired sometimes after UIGestureRecognizer handler

I'm trying to track hits on UI elements (tap and long press) using UIGestureRecognizer. After the hit is tracked (say, logged via NSLog) the UI element should do its job.
I'm creating gesture recognizers like this:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(OnGesture:)];
tap.cancelsTouchesInView = NO;
tap.delegate = self;
[view addGestureRecognizer:tap];
UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(OnGesture:)];
longPress.cancelsTouchesInView = NO;
longPress.delegate = self;
[view addGestureRecognizer:longPress];
I've implemented these gesture recognizer delegate methods:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)_recognizer shouldReceiveTouch:(UITouch *)_touch
{
    return YES;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)_recognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)_otherRecognizer
{
    return YES;
}
Inside the gesture recognizer handler, I'm trying to find the exact subview of the tap by using the hitTest method.
- (void)OnGesture:(UIGestureRecognizer *)_recognizer
{
    if (_recognizer.state == UIGestureRecognizerStateEnded)
    {
        if ([_recognizer isKindOfClass:[UITapGestureRecognizer class]]
            || [_recognizer isKindOfClass:[UILongPressGestureRecognizer class]])
        {
            CGPoint location = [_recognizer locationOfTouch:0 inView:_recognizer.view];
            // my problem occurs here:
            //---------------------------------------------------------------------------
            UIView *hitView = [_recognizer.view hitTest:location withEvent:nil];
            //---------------------------------------------------------------------------
            NSLog(@"Hit on view: %@", hitView);
        }
    }
}
So my problem is:
Sometimes (about 1 in 10 cases) when I press the UIButton, the OnGesture method fires, but the IBAction for that button's "Touch Up Inside" event does not.
But when I comment out the hitTest call:
//UIView* hitView = [_recognizer.view hitTest:location withEvent:nil];
the bug stops being reproducible. IBAction always gets called.
Why is this happening? How can I fix this?
P.S. there could be some typos in the sample code above.
According to the docs for hitTest:withEvent:, for it to return a view:
This method ignores view objects that are hidden, that have disabled user interactions, or have an alpha level less than 0.01. This method does not take the view’s content into account when determining a hit. Thus, a view can still be returned even if the specified point is in a transparent portion of that view’s content.
So you might want to do self.someSubview.userInteractionEnabled = YES;
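A minimal sketch of that suggestion; someSubview stands in for whatever button or control hitTest:withEvent: is failing to return:
// Make sure the view is not excluded by the rules quoted above:
// hidden, interaction-disabled, or nearly transparent.
self.someSubview.userInteractionEnabled = YES;
self.someSubview.hidden = NO;
self.someSubview.alpha = 1.0;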

Init touch by UITapGestureRecognizer's -locationInView

I use a UITapGestureRecognizer on a tableView to call -endEditing after -textFieldDidBeginEditing. First I want -endEditing to run, and then the touch should still be delivered to the tableView's element. How can I do this the right way?
#pragma mark - Text Field Delegate

- (void)textFieldDidBeginEditing:(UITextField *)textField {
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(finishEditing:)];
    [self.tableView addGestureRecognizer:tapRecognizer];
}

- (void)finishEditing:(UITapGestureRecognizer *)tapRecognizer {
    [self.view endEditing:YES];
}

- (void)textFieldDidEndEditing:(UITextField *)textField {
    [self saveName];
}
Setting cancelsTouchesInView to false will pass the touches through to the view.
When this property is true (the default) and the receiver recognizes its gesture, the touches of that gesture that are pending are not delivered to the view and previously delivered touches are cancelled through a touchesCancelled:withEvent: message sent to the view. If a gesture recognizer doesn't recognize its gesture or if the value of this property is false, the view receives all touches in the multi-touch sequence.
See the reference here
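Applied to the code in the question, a minimal sketch; removing the recognizer again in textFieldDidEndEditing (via a hypothetical tapRecognizer property) is an assumption so later taps on the table are not intercepted at all:
- (void)textFieldDidBeginEditing:(UITextField *)textField {
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(finishEditing:)];
    // Let the tap also reach the table view so the touched element still responds.
    tapRecognizer.cancelsTouchesInView = NO;
    [self.tableView addGestureRecognizer:tapRecognizer];
    self.tapRecognizer = tapRecognizer; // hypothetical property used to remove it later
}

- (void)finishEditing:(UITapGestureRecognizer *)tapRecognizer {
    [self.view endEditing:YES];
}

- (void)textFieldDidEndEditing:(UITextField *)textField {
    [self saveName];
    [self.tableView removeGestureRecognizer:self.tapRecognizer];
    self.tapRecognizer = nil;
}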

Detect if user has touched the screen on iOS

I googled several other questions and tutorials but couldn't find an answer for my question. I want to detect if the user has touched/tapped/held/clicked the screen. I tried touchesBegan:withEvent: but it isn't firing any events.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch view] == mapView_) {
        NSLog(@"Touches began");
    } else NSLog(@"Touches began");
}
Is there another way to detect user interaction through touching the screen?
You have to use UITapGestureRecognizer.
Conform your class to the UIGestureRecognizerDelegate protocol.
Instantiate the gesture recognizer. For example, to instantiate a UITapGestureRecognizer, we will do:
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTapFrom:)];
Here, action is the selector which will handle the gesture. Our selector handleTapFrom: will look something like:
- (void)handleTapFrom:(UITapGestureRecognizer *)recognizer
{
    // Code to handle the gesture
}
The argument to the selector is the gesture recognizer. We can use it to access its properties; for example, we can check its state, such as UIGestureRecognizerStateBegan or UIGestureRecognizerStateEnded.
Set the desired properties on the instantiated gesture recognizer. For example, for a UITapGestureRecognizer, we can set the properties numberOfTapsRequired, and numberOfTouchesRequired.
Add the gesture recognizer to the view you want to detect gestures for. In our sample code (I will be sharing that code for your reference), we will add gesture recognizers to an imageView with the following line of code:
[self.imageView addGestureRecognizer:tapGestureRecognizer];
After adding the gesture recognizer to the view, set the delegate for the gesture recognizer, i.e. the class which will handle all the gesture recognizer stuff. In our sample code, it would be like:
tapGestureRecognizer.delegate = self;
Note: Assign the delegate after adding the gesture recognizer to the view. Otherwise, the action method won’t be called.
Reference - Here
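Putting the steps together, a minimal sketch, assuming a view controller with an imageView outlet as in the answer:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Instantiate the recognizer with its target and action.
    UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTapFrom:)];

    // Configure the desired properties.
    tapGestureRecognizer.numberOfTapsRequired = 1;
    tapGestureRecognizer.numberOfTouchesRequired = 1;

    // Add the recognizer to the view to watch; UIImageView has user
    // interaction disabled by default, so enable it here.
    [self.imageView addGestureRecognizer:tapGestureRecognizer];
    self.imageView.userInteractionEnabled = YES;

    // Set the delegate after adding the recognizer to the view.
    tapGestureRecognizer.delegate = self;
}

- (void)handleTapFrom:(UITapGestureRecognizer *)recognizer
{
    NSLog(@"Tapped at %@", NSStringFromCGPoint([recognizer locationInView:self.imageView]));
}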
The solution was using UIPanGestureRecognizer.
Here's the code that solved the problems and stopped my headache:
UIPanGestureRecognizer *tap = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleTapFrom:)];
[mapView_ setMultipleTouchEnabled:YES];
[mapView_ setUserInteractionEnabled:YES];
mapView_.gestureRecognizers = @[tap];
And then the selector method:
- (void)handleTapFrom:(UIPanGestureRecognizer *)recogniser {
    NSLog(@"Pin");
}
You can subclass UIApplication if you are only interested in whether the user touched the screen or not, and implement this method:
- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    if (event.type == UIEventTypeTouches) {
        // handling code
    }
}
In this case the main.m file would look like this:
int main(int argc, char * argv[])
{
    @autoreleasepool {
        return UIApplicationMain(argc, argv, @"UIApplicationSubclass", @"AppDelegate");
    }
}
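A minimal sketch of the subclass itself, assuming it is named UIApplicationSubclass to match the main() above:
// UIApplicationSubclass.h
@interface UIApplicationSubclass : UIApplication
@end

// UIApplicationSubclass.m
@implementation UIApplicationSubclass

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    if (event.type == UIEventTypeTouches) {
        NSLog(@"User touched the screen"); // replace with the app's own handling
    }
}

@end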

On a UILongPressGestureRecognizer how do I detect which object generated the event?

I have a view with several UIButtons. I have successfully implemented a UILongPressGestureRecognizer with the following as the selector:
- (void)longPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Long Press");
    }
}
What I need to know within this method is which UIButton received the long press, since I need to do something different depending on which button it was.
Hopefully the answer is not some issue of mapping the coordinates of where the long press occurred to the bounds of the buttons - I would rather not go there.
Any suggestions?
Thanks!
This is available in gesture.view.
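For example, assuming the recognizer was added to the button itself, a minimal sketch:
- (void)longPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateEnded) {
        // gesture.view is the view the recognizer was attached to.
        UIButton *pressedButton = (UIButton *)gesture.view;
        NSLog(@"Long press on %@", pressedButton);
    }
}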
Are you adding the long-press gesture recognizer to the UIView that has the UIButtons as subviews? If so, something along the lines of @Magic Bullet Dave's approach is probably the way to go.
An alternative is to subclass UIButton and add a long-press gesture recognizer to each button. You can then get your button to do whatever you like; for example, it could send a message identifying itself to a view controller. The following snippet illustrates methods for the subclass.
- (void)setupLongPressForTarget:(id)target
{
    [self setTarget:target]; // property used to hold target (add @property and @synthesize as appropriate)
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPress:)];
    [self addGestureRecognizer:longPress];
    [longPress release];
}

- (void)longPress:(UIGestureRecognizer *)recogniser
{
    if (![recogniser isEnabled]) return; // code to prevent multiple long press messages
    [recogniser setEnabled:NO];
    [recogniser performSelector:@selector(setEnabled:) withObject:[NSNumber numberWithBool:YES] afterDelay:0.2];
    NSLog(@"long press detected on button");
    if ([[self target] respondsToSelector:@selector(longPressOnButton:)])
    {
        [[self target] longPressOnButton:self];
    }
}
In your view controller you might have code something like this:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // set up buttons (if not already done in Interface Builder)
    [buttonA setupLongPressForTarget:self];
    [buttonB setupLongPressForTarget:self];
    // finish any other set up
}
- (void)longPressOnButton:(id)sender
{
    if (sender == [self buttonA])
    {
        // handle button A long press
    }
    if (sender == [self buttonB])
    {
        // handle button B long press
    }
    // etc.
}
If your view contains multiple subviews (like lots of buttons) you can determine which one was tapped:
// Get the position of the point tapped in the window co-ordinate system
CGPoint tapPoint = [gesture locationInView:nil];
UIView *viewAtBottomOfHierarchy = [self.window hitTest:tapPoint withEvent:nil];
if ([viewAtBottomOfHierarchy isKindOfClass:[UIButton class]])
{
    // viewAtBottomOfHierarchy is the button that was long-pressed
}
