iOS: UIView losing tap recognition with UITapGestureRecognizer enabled

I am using an imported button object, Floaty (https://github.com/kciter/Floaty), which is a subclass of UIView. In one of the view controllers where I use this button (a controller that also contains a text field), the button no longer responds to taps once I add the following lines to dismiss the keyboard when the user taps away from the text field:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(self.dismissKeyboard(_:)))
self.view.addGestureRecognizer(tapGesture)
The idea for how to dismiss the keyboard came from this SO question: iOS - Dismiss keyboard when touching outside of UITextField
I thought part of the issue could be that the Floaty button needs to be brought to the front, so I added the following line of code, found in this SO question: Losing tap recognition after adding a UIScrollView under a UIButton
floaty.bringSubviewToFront(self.view)
I added this line to the end of viewDidLoad() in the view controller where the button is used.
(It may be important to note that all of the code above lives in that view controller's viewDidLoad().)
Is there another way to handle closing a keyboard when tapping away from a text field? Or is there something else that I should do to the Floaty (UIView) object to handle tap / touch events?

You can try using hit testing. With hit testing, you can decide which view handles a touch. This great article explains how to apply hit testing in iOS: http://smnh.me/hit-testing-in-ios/
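Separately from hit testing, a common lighter-weight workaround (a sketch of the usual approach, not necessarily what Floaty requires) is to stop the tap recognizer from cancelling the touches it observes, so the button underneath still receives them:

let tapGesture = UITapGestureRecognizer(target: self, action: #selector(self.dismissKeyboard(_:)))
tapGesture.cancelsTouchesInView = false   // taps are still delivered to the Floaty button and other subviews
self.view.addGestureRecognizer(tapGesture)

Note that with this setting the keyboard is also dismissed when the button itself is tapped, which may or may not be what you want.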

Related

How do I toggle a label's hidden property while a button is pressed?

I am trying to figure out how to display a label only while a button is pressed in iOS. I know how to handle touch events, but I am not sure how to incorporate a UILongPressGestureRecognizer into this.
The UIButton class, as well as many other UIControl subclasses, can have numerous actions hooked up to it.
When we are hooking up an action from interface builder to our source code file, if we open the "Event" drop down, we're presented with a long list of options:
In almost every scenario, we hook our actions up only to "Touch Up Inside". This gives the user a chance to reconsider whether they really want to press the button: if they drag their finger off the button before letting go, the action doesn't fire, because the touch up happened outside the bounds of the control.
But here, we want to actually hook our button's "touch down" event up. This is when we'll display the label.
Let's go ahead and create a "touch down" event and a "touch up inside" event:
Swift
@IBAction func buttonTouchDown(_ sender: UIButton) {
    myLabel.isHidden = false
}

@IBAction func buttonTouchEnded(_ sender: UIButton) {
    myLabel.isHidden = true
}
Objective-C
- (IBAction)buttonTouchDown:(UIButton *)sender {
    self.myLabel.hidden = NO;
}

- (IBAction)buttonTouchEnded:(UIButton *)sender {
    self.myLabel.hidden = YES;
}
So far, buttonTouchEnded is set up completely normally, and buttonTouchDown was set up by selecting "touch down" from the "Event" list.
We can always verify what our control is hooked up to by right-clicking it in Interface Builder:
But this menu is useful for more than simply checking what we've already hooked up. From here, we can hook up any of the other actions to our existing #IBAction methods simply by clicking in the circle and dragging to the existing method.
So we obviously want the label to disappear if we stop pressing the button, a normal touch up like you'd hook up any other button. The only question remaining is, what exact behavior do you want?
If you want the label to disappear only when the finger is lifted, no matter where the finger goes, then we must also hook up "touch up outside".
If you want the label to disappear when the user drags their finger off the button, then we should hook up the "touch drag exit" action.
We also probably want to hook up the "touch cancel" action, which would occur if some sort of system event (perhaps an incoming phone call) cancels the touch.
This Stack Overflow answer elaborates on the differences between the action options we have, so you can craft the behavior exactly how you need it.
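If you prefer wiring these events up in code rather than Interface Builder, here is a minimal sketch of the equivalent addTarget calls (the outlet name myButton is an assumption; the action methods are the ones defined above):

myButton.addTarget(self, action: #selector(buttonTouchDown(_:)), for: .touchDown)
myButton.addTarget(self, action: #selector(buttonTouchEnded(_:)), for: [.touchUpInside, .touchUpOutside, .touchDragExit, .touchCancel])

Because UIControl.Event is an option set, a single addTarget call can cover every "finger lifted or left the button" case at once.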
Anyway, once we decide which actions we want to hook up to which methods, bring up that right click menu and click-drag from the circles to the methods:
The easiest thing to do would be to add an action to the touchDown event and a separate action to touchUpInside and touchUpOutside.
Show the label on the touchDown action and hide it on the touchUpInside / touchUpOutside action. (and for completeness, on touchCancel, as suggested by nhgrif in his very thorough answer.)
A long press gesture recognizer won't work in this situation. You could create a custom gesture recognizer that triggers one event on touch-down and another on release, and use that. It's actually not that hard to do.
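A minimal sketch of such a recognizer (the class name TouchDownUpGestureRecognizer is made up for illustration; the target/action fires on every state change, so the handler can show the label on .began and hide it on .ended or .cancelled):

import UIKit
import UIKit.UIGestureRecognizerSubclass

class TouchDownUpGestureRecognizer: UIGestureRecognizer {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .began       // finger down: show the label in the action method
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .ended       // finger up: hide the label in the action method
    }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .cancelled   // e.g. an incoming call: also hide the label
    }
}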
EDIT
I just uploaded a demo project to GitHub called "MorphingButton" (link) that I created for another question here on Stack Overflow.
That project now shows a label on touching the app button and hides the label when you release the button.
The project is a hybrid Swift/Objective-C project that shows how to do the button morphing and label showing/hiding in both languages. It has a tab bar with a Swift tab and an Objective-C tab.

iOS - change animation when user taps screen

Let's say that I have an animation: an image moving from the left side of the screen to the right. I would like to make it a bit interactive: when the user taps the screen, I want to change the direction of the image's movement. What's the best approach to implementing this?
What I do in some cases is take the main view of the View Controller, in Storyboard, and change the class type of that UIView to UIControl.
In code that view is accessed as myViewController.view, which you can then cast:
let viewAsControl = myViewController.view as! UIControl
in Swift, or the equivalent in Objective-C.
The UIControl subclass of UIView is the hierarchical layer (class) that adds the action/target facilities to a view. For example, UIButton is a UIControl, because it generates events (actions), and it is also a UIView so it can be added as a subview.
Then, from the Connections Inspector (accessed via the far-right icon of the panel to the right of the storyboard editor), I'd select the Touch Up Inside event (or some other event) and drag it to an @IBAction-tagged function added to the view controller's source code, to receive the tap event. From that tap notification, you can cancel the current animation and add a new one, etc.
Alternatively, you can create an IBOutlet for the view if you've turned it into a UIControl in IB, and use the addTarget() method to assign an action handler for a specific event, e.g. to make it call a function in your code.
Either way, the effect is that any time the view is tapped, it generates an event for you to respond to.
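A small sketch of the programmatic variant, reversing the animation on each tap (the imageView outlet, the movingRight flag, and the handleTap(_:) name are all assumptions for illustration):

// In viewDidLoad, assuming the view's class was changed to UIControl in the storyboard:
let control = view as! UIControl
control.addTarget(self, action: #selector(handleTap(_:)), for: .touchUpInside)

// Elsewhere in the view controller:
@objc func handleTap(_ sender: UIControl) {
    // Freeze the image wherever the in-flight animation currently has it...
    let currentFrame = imageView.layer.presentation()?.frame ?? imageView.frame
    imageView.layer.removeAllAnimations()
    imageView.frame = currentFrame
    // ...then animate it back the other way.
    movingRight.toggle()
    let targetX = movingRight ? view.bounds.maxX - imageView.bounds.width : view.bounds.minX
    UIView.animate(withDuration: 2.0) {
        self.imageView.frame.origin.x = targetX
    }
}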

Attaching a custom alert view above the iOS keyboard

I'm trying to design a "proper" (that is, not hacked-together) custom alert view. The view should attach itself to the top of the keyboard, sliding up with the keyboard (if there is an alert) or staying hidden (if there is no alert).
The view should always "stick" to the keyboard... including, for instance, when the keyboard hides. In that case, the view should slide right down, out of sight, along with the keyboard.
Here's an example of what I'm trying to achieve (with an active alert):
I originally thought about subclassing UIAlertView, but it looks like that is not recommended. And, after experimenting a bit, this is clearly a tricky task. I've got an alert that shows up, but it has problems staying in sync with the keyboard, and I haven't found a way to make it "track" the keyboard's motion smoothly.
Any ideas?
You can achieve this with the inputAccessoryView of UITextField and UITextView. See the "Custom Views for Data Input" chapter in Apple's "Text Programming Guide for iOS" for more information.
For example, a very simple red bar above the keyboard can be added with the following code:
let keyboardAlertView = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 44))
keyboardAlertView.backgroundColor = .red
textField.inputAccessoryView = keyboardAlertView

How to get stepper and longpress to coexist?

I tried setting up a view with a long-press gesture and a stepper configured for continuous updates. With the long press in place, the stepper's continuous updating does not occur. For now, I've disabled the long press; I guess I don't need it. But for future reference, how would I allow both to coexist?
Just to be clear, here is the way the screen was set up when I tried this.
App was set up with a simple view controller.
A subview was added to this view (could have been a controller, but I just made it a UIView).
Several labels and stepper were added to this subview.
The steppers were wired up as outlets and actions.
A longpress recognizer was added to the main view in IB.
For completeness, a tap gesture was also added to the main view in IB.
Taps on the main view function as expected. Taps on the steppers function as expected. Longpress on the main view functions as expected. Longpress on the stepper does not.
I modified the code called by the longpress to check for the frame of the subview and not act if the touch location was within that rectangle, but that didn't make a difference. I did not try getting the longpress to fail in that situation, but I suppose I'll try that next. EDIT: OK, maybe not. There doesn't seem to be an API for that. However, there is this kludge, that I'm not going to try.
Attached is a screen shot from profiler with an inverted call tree so you can see what each item is being called by.
darkStepped: is the IBAction that is called by the stepper. If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
The stack trace reveals that the stepper's _updateCount method is dispatched through a timer.
This could be related to the fact that a stepper has an "autoIncrement" mode where, as long as you keep it pressed, it will update at a given (varying) rate. So, instead of simply calling _updateCount, the stepper sets up a timer to handle this behaviour.
Whatever the reason the timer is used, it explains why you do not see the gesture recogniser in the stack trace.
In your case, what happens is that the stepper gets the touches, handles them, and does not forward them to any gesture recognisers attached to it.
The following excerpt from the Apple docs explains this, although it does not explicitly mention a long press recogniser in relation to a UIStepper control:
Interacting with Other User Interface Controls
In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button’s parent view, and the user taps the button, then the button’s action method receives the touch event instead of the gesture recognizer. This applies only to gesture recognition that overlaps the default action for a control, which includes:
A single finger single tap on a UIButton, UISwitch, UIStepper, UISegmentedControl, and UIPageControl.
...
If you have a custom subclass of one of these controls and you want to change the default action, attach a gesture recognizer directly to the control instead of to the parent view. Then, the gesture recognizer receives the touch event first. As always, be sure to read the iOS Human Interface Guidelines to ensure that your app offers an intuitive user experience, especially when overriding the default behavior of a standard control.
So, it seems you can attach the gesture recogniser directly to the control (possibly you need to subclass UIStepper for this to work, I am not really sure how to interpret the last paragraph). Hopefully this will not disable the basic workings of the stepper (but maybe it will).
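A sketch of what "attach the gesture recognizer directly to the control" would look like (the stepper outlet and the handleLongPress(_:) action are assumed names; whether the stepper keeps its autorepeat behaviour afterwards is something you would have to verify):

let stepperLongPress = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress(_:)))
stepper.addGestureRecognizer(stepperLongPress)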
After carefully reviewing Apple's docs again, I've found the solution. I added the view controller as the delegate of the long-press gesture recognizer:
self.longPress.delegate = self;
(and, of course, adding <UIGestureRecognizerDelegate> to the interface), and then added this method to the view controller:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Determine whether the touch is inside the custom subview
    if (gestureRecognizer == self.longPress) {
        CGPoint touchLocation = [touch locationInView:self.view];
        if (CGRectContainsPoint(self.antControl.frame, touchLocation)) {
            return NO;
        }
    }
    return YES;
}
This way the gesture recognizer doesn't even get called when the longpress occurs within the frame of self.antControl, which is the subview mentioned in the question.
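For reference, the equivalent delegate method in Swift would look roughly like this (same assumed longPress and antControl properties, with the view controller conforming to UIGestureRecognizerDelegate):

func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
    // Skip the long press when the touch starts inside the stepper subview.
    if gestureRecognizer == longPress {
        let touchLocation = touch.location(in: view)
        if antControl.frame.contains(touchLocation) {
            return false
        }
    }
    return true
}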

Why does a gesture recognizer not work on a text view the first time?

I have a text view for entering text or emoji. I added a button to switch the keyboard to a custom emoji view, and I also use a tap gesture recognizer for when I want to switch back to the keyboard.
I found that I had to tap the button twice before the tap gesture recognizer worked. I thought about it and searched for a long time without finding an answer, but finally fixed it myself. I think someone else will run into the same problem, so I'm sharing the fix below.
The code that adds the tap gesture recognizer runs during initialization, but when I tap the text view the keyboard pops up and the text view's position changes, so the recognizer added at init time no longer works.
I fixed it by adding the tap gesture recognizer after a 0.3 s delay.
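A sketch of that workaround in Swift (the 0.3 s delay comes from the description above; the textView outlet and the switchBackToKeyboard(_:) action are assumed names):

// Add the tap recognizer slightly after the keyboard/layout change,
// instead of in the init path, so it is attached once the layout has settled.
DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
    let tap = UITapGestureRecognizer(target: self, action: #selector(self.switchBackToKeyboard(_:)))
    self.textView.addGestureRecognizer(tap)
}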
