iOS 7 custom keyboard UIView touch down event delayed in bottom row

Here's an odd one…
I've got a UIView xib file that looks like this:
I've connected every UIButton's touchDown and touchUpInside events to two IBAction methods:
- (IBAction)touchUpInside:(id)sender
{
    NSLog(@"touch up inside");
    if (((UIButton *)sender == _enter) || ((UIButton *)sender == _back)) {
        [(UIButton *)sender setBackgroundColor:_color2];
    }
    else {
        [(UIButton *)sender setBackgroundColor:_color1];
    }
}

- (IBAction)touchDown:(id)sender
{
    NSLog(@"touch down");
    [(UIButton *)sender setBackgroundColor:_color2];
}
Everything works except for the bottom-most row of UIButtons, and that's the odd part:
The touch down event is fired, but the button must be held for about 0.5 seconds before it changes background color, whereas the change is instantaneous for the other buttons.
It ONLY happens for the bottom-most row of UIButtons; I've tried swapping buttons 7, 8, 9 with buttons back, 0, enter like this:
I've checked in Interface Builder that all the UIButton attributes are the same, and I've tried moving the UIButton objects' order around as you can see on the left side of the picture, and I'm about out of ideas already. Basically what's odd is that the UIControl behavior differs based on its position in the parent view...
UPDATE: I made the parent UIView height large enough that there are 50 free pixels below the last row, and the UIButtons work fine now. The only reason I can think of now is that there is a UITabBar two view controller levels underneath. Even so, it doesn't make sense.

The document says:
Expect users to swipe up from the bottom of the screen to reveal Control Center. If iOS determines that a touch that begins at the bottom of the screen should reveal Control Center, it doesn’t deliver the gesture to the currently running app. If iOS determines that the touch should not reveal Control Center, the touch may be slightly delayed before it reaches the app.
One solution is here:
UIButton fails to properly register touch in bottom region of iPhone screen
But, in your case, I think you should use inputView in UIResponder.
See: https://developer.apple.com/library/ios/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/InputViews/InputViews.html
The inputView is not affected by that problem.
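For example, the custom keypad view could be loaded from its xib and installed as the inputView of the text field it drives; views presented this way live in the system's input window. This is only a minimal sketch: the "KeypadView" nib name and the self.textField outlet are illustrative, not from the original question.
// Load the custom keypad from its xib and show it instead of the system keyboard.
UIView *keypad = [[[NSBundle mainBundle] loadNibNamed:@"KeypadView"
                                                owner:self
                                              options:nil] firstObject];
keypad.frame = CGRectMake(0, 0, CGRectGetWidth(self.view.bounds), 216);

self.textField.inputView = keypad;
[self.textField reloadInputViews];   // refresh if the field is already first responder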

Related

UIScrollView scroll over background view

I have a UIScrollView (with a clear background) and behind it I have a UIImage that takes up about 1/3 of the device's height. In order to initially display the image, which is sitting behind the scroll view, I set the scroll view's contentInset to use the same height as the image. This does exactly what I want, initially showing the image, but scrolling down will eventually cover the image with the scroll view's content.
The only issue is that I added a button on top of the image. However, it cannot be touched because the UIScrollView is actually over the top of it (even though the button can be seen due to the clear background). How can I get this to work?
Edit:
The following solved the problem:
// viewDidLoad
self.scrollView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "onScrollViewTapped:"))
...
func onScrollViewTapped(recognizer: UITapGestureRecognizer)
{
    let point = recognizer.locationInView(self.view)
    if CGRectContainsPoint(self.closeButton.frame, point) {
        self.closeButton.sendActionsForControlEvents(UIControlEvents.TouchUpInside)
    }
}
Thanks for the screenshots and reference to Google maps doing what you're looking for, I can see what you're talking about now.
I noticed that the image is clickable and is scrolled over but there is no button showing on the image itself. What you can do is put a clear button in your UIScrollView that covers the image in order to make it clickable when you're able to see it. You're not going to be able to click anything under a UIScrollView as far as I can tell.
Please let me know if that works for you.
A simple solution is to reorder the views in the document outline. The higher a view appears in the outline, the lower it sits as a layer.
Two things to test:
1) Make sure the image view that contains the button has its userInteractionEnabled set to true (the default is false). Although, since the button is a subview added on top of the image view (I assume), this might not help.
2) If that doesn't help, can you instead add the button as a subview of the UIScrollView and set its position to be where the image is? This way it should stay on the image and will be hidden as the user scrolls down, but clickable since it is a child of the ScrollView (a rough sketch follows below).
Some code and/or images would help as well.
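A rough sketch of the second suggestion, assuming the top contentInset equals the image's height and using illustrative names (imageView, closeTapped:) that are not from the question:
// With contentInset.top == image height, the image shows through the region
// from y = -imageHeight up to y = 0 in the scroll view's content coordinates.
CGFloat imageHeight = CGRectGetHeight(self.imageView.bounds);

UIButton *closeButton = [UIButton buttonWithType:UIButtonTypeSystem];
closeButton.frame = CGRectMake(CGRectGetWidth(self.scrollView.bounds) - 60.0,
                               -imageHeight + 16.0, 44.0, 44.0);
[closeButton setTitle:@"Close" forState:UIControlStateNormal];
[closeButton addTarget:self
                action:@selector(closeTapped:)
      forControlEvents:UIControlEventTouchUpInside];

// As a child of the scroll view, the button receives touches and scrolls
// away together with the rest of the content.
[self.scrollView addSubview:closeButton];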
I think the way to do this is to subclass whatever objects are in your UIScrollView and override touchesBegan / touchesEnded. Then figure out which coordinates are being touched and whether they land within the bounds of your button.
e.g. in Swift this would be:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent?) {
    println("!!! touchesBegan")
    if let touch = touches.first {
        let touchObj: UITouch = touch as! UITouch
        // This locationInView should probably target the main screen view,
        // then test the coordinates against your button's bounds.
        println("touchesBegan \(touchObj.locationInView(self))")
    }
    super.touchesBegan(touches, withEvent: event!)
}
See :
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIResponder_Class/index.html#//apple_ref/occ/instm/UIResponder/touchesBegan:withEvent:
And:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UITouch_Class/index.html#//apple_ref/occ/instm/UITouch/locationInView:
You should subclass UIScrollView and override -hitTest:withEvent: like so, to make sure it only eats touches it should.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *const inherited = [super hitTest:point withEvent:event];
    if (inherited == self) return nil;
    return inherited;
}
Also make sure to set userInteractionEnabled to YES in your image view.
There are two ways you can check whether the touch event fired on the UIButton or not.
Option 1: Add a UITapGestureRecognizer to the UIScrollView. When the scroll view is tapped, the tap gesture returns the touch point relative to the UIScrollView. You need to convert that touch point into the coordinate space of the main UIView (that is, self.view) using the following method:
CGPoint originInSuperview = [superview convertPoint:CGPointZero fromView:subview];
After a successful conversion, you can check whether the touch point intersects the UIButton's frame. If it does, you can perform the action you would otherwise run in the UIButton's selector.
CGRectContainsPoint(buttonView.frame, point)
Option 2: Receive the first touch event when the user touches the iPhone screen, and redirect the touch point to the current UIViewController, where you can do the same hit check described in option 1 and perform your action.
Option 2 is already integrated in one of my projects, but I have forgotten the name of the library that receives the first tap event and redirects it to the current controller. When I remember its name I will update this answer.
Hope this helps.
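Putting option 1 together, a sketch might look like this (scrollView and actionButton are placeholder outlets, not names from the question):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Catch taps on the scroll view itself, since it sits above the button.
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(scrollViewTapped:)];
    [self.scrollView addGestureRecognizer:tap];
}

- (void)scrollViewTapped:(UITapGestureRecognizer *)recognizer
{
    // Convert the tap into the coordinate space of the button's superview,
    // then test it against the button's frame.
    CGPoint point = [recognizer locationInView:self.actionButton.superview];
    if (CGRectContainsPoint(self.actionButton.frame, point)) {
        [self.actionButton sendActionsForControlEvents:UIControlEventTouchUpInside];
    }
}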

How to implement a longPressGestureRecognizer for a subview with disabled user interaction?

I need a UI like this:
with 2 buttons (yellow and red) and a background view (grey), which should have the following behaviour:
- highlight a button when I press on it;
- execute when I release on a button;
- when I press and move onto a button from any other view, the button becomes highlighted (e.g. press on the grey rect and release on the red one, or press on yellow and release on red);
- support gestures for buttons (like long press and swipe)
So far I have found only one way to solve my problem:
I override the touch methods for my GrayView (touchesCancelled, touchesMoved, touchesBegan), and there I check whether the current touch position belongs to some rect; if so, I execute the appropriate action. But for this solution I have to give my buttons userInteractionEnabled = false, which means they no longer support gestures or other events. So if I want to keep that support, I have to implement it myself, which I don't want to do.
So how can I solve this?
If I understand correctly, you can add the gesture recognizers to the gray view as well, and when a recognizer fires, find which colored view was in the touch area:
- (void)tapAction:(UITapGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        CGPoint position = [recognizer locationInView:grayView];
        if (CGRectContainsPoint(redView.frame, position)) {
            ...
        }
    }
}
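Since the question also asks about long presses, the same idea extends to a UILongPressGestureRecognizer added to the gray view; the highlight can simply track the recognizer's state. This is only a sketch: grayView and redButton are placeholders for the actual properties.
- (void)longPressAction:(UILongPressGestureRecognizer *)recognizer
{
    CGPoint position = [recognizer locationInView:grayView];
    BOOL onRed = CGRectContainsPoint(redButton.frame, position);

    switch (recognizer.state) {
        case UIGestureRecognizerStateBegan:
        case UIGestureRecognizerStateChanged:
            // Highlight while the finger is over the button, even if the
            // press started on the grey background.
            redButton.highlighted = onRed;
            break;
        case UIGestureRecognizerStateEnded:
            redButton.highlighted = NO;
            if (onRed) {
                // Treat releasing over the button as the button's action.
                [redButton sendActionsForControlEvents:UIControlEventTouchUpInside];
            }
            break;
        default:
            redButton.highlighted = NO;
            break;
    }
}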

UIButton fails to properly register touch in bottom region of iPhone screen

I have an app with many different buttons arranged in a calculator like, square / rectangular format. It is actually extremely similar to the default iOS calculator. There are approximately 6 rows with 4 columns each of buttons.
Problem
The problem I am having involves the buttons in the bottom row (approximately the bottom 10th of the screen on an iPhone 4). They do not behave normally when pressed: they have to be pressed and held (for roughly just under a second) to register a "button press", as opposed to the standard short tap.
No other buttons besides this bottom row behave in this fashion.
Additionally, if these buttons are tapped on their upper edge, they behave normally, responding as soon as they are touched. This leads me to believe that the buttons themselves are not the problem but that there is some problem with the layout of my views.
It should be also noted that this problem is only present on physical devices. On the simulator, the buttons behave normally.
Context
The view containing these buttons is not the root view controller of the app. Instead it is transitioned to as so (nothing fancy here):
[self presentViewController:navController animated:YES completion:nil];
Where self is the root view controller
The view controller I am having problems with is contained within a navigation controller and is presented modally by the root view controller which you can see above.
What I have tried so far
Turning auto layout on and off: same problem
Rearranging the hierarchy of views: I moved the problematic buttons on top of and behind all other views: same problem
Multiple devices (iPhone 4, 4s, 5): same problem (although the buttons respond normally on both 3.5 inch and 4 inch simulators)
Testing other apps (when buttons in this region are pressed in other apps, they behave normally)
Additional Information
Everything is laid out in Interface Builder for the problematic view controller
All of the buttons are system buttons with standard settings and are all exactly the same besides their text.
All of the elements of the screen (buttons, labels, etc. ) are subviews of the "view"
The buttons are flush against each other and should not overlap more than one or two pixels.
The problematic buttons have dimensions: 80 width X 44 height.
The problematic buttons are flush against the bottom of the screen
In addition to the buttons, there is one UIImage and several labels; however, these are at the top of the screen and do not overlap with any of the buttons in any way.
The cause for this issue is that Apple seems to place a GestureRecognizer at the bottom of the screen that delays touches in any other view.
After fiddling around with gesture recognizers on the App's windows I came up with a solution that incorporates a subclass of UIButton:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL inside = [super pointInside:point withEvent:event];
    if (inside && !self.isHighlighted && event.type == UIEventTypeTouches)
    {
        self.highlighted = YES;
    }
    return inside;
}
The given method is called even though touchesBegan: arrives delayed. A check that the view is actually at the bottom of the screen may be suitable to prevent any side effects this fix could otherwise cause.
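One way to implement that check, restricting the workaround to buttons that actually reach into the delayed region (the 50 pt threshold is an assumption, not something documented):
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL inside = [super pointInside:point withEvent:event];
    if (inside && !self.isHighlighted && event.type == UIEventTypeTouches) {
        // Only force the highlight when the button extends into the bottom
        // strip where the system edge gesture delays touch delivery.
        CGRect frameInWindow = [self convertRect:self.bounds toView:nil];
        CGFloat windowHeight = CGRectGetHeight(self.window.bounds);
        if (CGRectGetMaxY(frameInWindow) > windowHeight - 50.0) {
            self.highlighted = YES;
        }
    }
    return inside;
}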
This sounds like an interaction between the buttons and the UIScreenEdgePanGestureRecognizer (or whatever it is) that is responsible for detecting that the user wants to bring up the system's Control Center.
There are actually two potential issues here:
There can be an interaction (i.e. conflict) between the possibility of a gesture directed at your app and a gesture directed at the system. If you have gesture recognizers, you might have to use delegate methods to mediate between them and the system's gesture recognizers.
There is a well-established bug where a tap near the screen edge (i.e. in the screen edge gesture recognizer's "zone") works but it causes the button to misbehave physically, i.e. it doesn't look as if it's been tapped even though logging shows that it has (see my answer here: https://stackoverflow.com/a/22000692/341994).
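For the first point, the mediation typically goes through UIGestureRecognizerDelegate. A sketch, assuming the view controller is the delegate of its own recognizers; note this can only mediate with recognizers the app can see (such as the navigation controller's interactive pop gesture), since the Control Center recognizer itself lives outside the app:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other
{
    // Let our own tap/long-press recognizers keep working even while a
    // screen-edge pan recognizer is still deciding what the touch means.
    return [other isKindOfClass:[UIScreenEdgePanGestureRecognizer class]];
}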
Lukas' answer in Swift, which also clears the highlight:
extension UIButton {
    public override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let inside = super.pointInside(point, withEvent: event)
        if inside != highlighted && event?.type == .Touches {
            highlighted = inside
        }
        return inside
    }
}
Swift 3 Solution
extension UIControl {
    open override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        let inside = super.point(inside: point, with: event)
        if inside != isHighlighted && event?.type == .touches {
            isHighlighted = inside
        }
        return inside
    }
}
I've written a full solution in Swift based on Lukas' answer. Just make any conflicting buttons conform to this class and the problem will be gone:
class BorderBugFixButton: UIButton {

    override func awakeFromNib() {
        super.awakeFromNib()
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "unHighlight", name: UIApplicationWillResignActiveNotification, object: nil)
    }

    deinit {
        NSNotificationCenter.defaultCenter().removeObserver(self)
    }

    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let inside = super.pointInside(point, withEvent: event)
        if inside != highlighted && event?.type == .Touches {
            highlighted = inside
        }
        return inside
    }

    internal func unHighlight() {
        highlighted = false
    }
}
P.S.: For those of you that don't like Storyboards/Xibs, just migrate the implementation of awakeFromNib() to init().
I've run into this numerous times when working with updating older Storyboards to iOS 7+, usually when the ViewController in question has a form of a UIScrollView. Double check these 2 settings in your Storyboard on the ViewController Object (Not the view, the one with the yellow circle). When I unchecked the 'Extend Edges' under top & bottom bars, the scrollView's frame was adjusted down by 64 points (height of Nav & Status bars).
After setting the space between NavBar.bottom and scrollView.top back to 0, the button started working. This might be due to the fact that scrollView.frame.bottom was 64 pixels above the bottom of the window, so touches in that area were disregarded because they were technically outside the scrollView's frame but still displayed visually for some reason.
iOS 9.3, Xcode 7.3
I would suggest that you make a category on the UIButton class that implements Lukas' answer. For instructions on how to create a category see this post: How do I create a category in Xcode 6 or higher?
Give it an appropriate name with the traditional "+" sign, i.e. if you name it "BottomOfScreen", then the resulting file name will be "UIButton+BottomOfScreen".
If you are using Objective-C, then you will get a *.h and *.m file with the new category.
*.h
#import <UIKit/UIKit.h>

@interface UIButton (BottomOfScreen)

@end
*.m
#import "UIButton+BottomOfScreen.h"
#implementation UIButton (BottomOfScreen)
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
BOOL inside = [super pointInside:point withEvent:event];
if (inside && !self.isHighlighted && (event.type == UIEventTypeTouches))
{
self.highlighted = true;
}
else
{
nil;
}
return inside;
}
#end
I was able to fix this issue by disabling delaysTouchesBegan in viewWillAppear
self.navigationController?.interactivePopGestureRecognizer?.delaysTouchesBegan = false
I've prepared this answer in the hopes that someone else might find it helpful.
The cause of my problem was a little more difficult to discover, but way easier to resolve. The button in question was in a custom sideBarView XIB with four other buttons that I had programmatically initialized and loaded into the viewController. The top four worked fine; only the bottom one didn't seem to work...
CAUSE: The CGRect defined programmatically for the custom sideBarView was actually smaller in height than the XIB needed. However, since the sideBarView didn't clip to bounds, it still showed the lowest button, but any taps on it were registered as taps on the view below and not as taps on the lowest button.
To discover this, I checked the 3D view debugger and the order of the objects in the XIB, and even took comparison snapshots of each of the buttons in the simulator with Color Blended Layers selected and with breakpoints on didTap. It wasn't until I shortened the spaces between the constraints between each button that I discovered only the top of the lowest button would accept the tap, which gave me the clue that it was a height limitation somewhere (like the initialization code).

Why is a UIButton on a UITableViewCell only drawn darker when the touch gesture continues for a short time?

Pressing the button quickly, without holding it for a short time, does not highlight the button.
This differs from a UIButton on a plain UIView.
The header photo in the official Twitter client has the same issue.
The Instagram client seems to have solved this; all its buttons work fine.
I found the same question here:
Why doesn't UIButton showsTouchWhenHighlighted work when the button is on a UITableViewCell?
But I still don't know how to fix it.
Well... a UITableView is a subclass of UIScrollView, and the UIScrollView class is known to eat touches for its own purposes.
When it realizes the touch was not meant for it, it passes it to its immediate subview.
This behaviour is governed by the delaysContentTouches property (which is YES by default).
This is why the UIButton shows its highlighted state only after an extended touch: the touch event stays with the UITableView for a short while until it determines whether the touch was meant for scrolling or swiping the cell, and on realizing it was for neither, it immediately passes the touch event to the subview directly below it.
In the case of a quick tap, the button's highlighted state is bypassed due to this delay and the target selector method is called directly.
To show the highlighted state of the button in a UITableView (just as it would appear on a UIView) do:
For iOS7+:
In -viewDidLoad or anywhere appropriate do:
[yourTableViewObject setDelaysContentTouches:NO];
Also... The cell.subviews has a class UITableViewCellScrollView which apparently is another scrollView and we need to disable the delaysContentTouches property of this class as well.
So... in the -cellForRowAtIndexPath: method (just before return cell;) do:
for (UIView *currentView in cell.subviews) {
    if ([NSStringFromClass([currentView class]) isEqualToString:@"UITableViewCellScrollView"]) {
        UIScrollView *svTemp = (UIScrollView *)currentView;
        [svTemp setDelaysContentTouches:NO];
        break;
    }
}
For iOS 6-:
In iOS6, the cell.subviews has a UITableViewCellContentView class which is not a scrollView subclass and so all it takes is setting one parameter for the tableView alone.
So, in -viewDidLoad or anywhere appropriate, this is all that you need:
[yourTableViewObject setDelaysContentTouches:NO];
PS: Doing this can interfere with the scrolling of the tableView, so use your best judgement.
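If disabling delaysContentTouches hurts scrolling too much, one common companion (an addition here, not part of the original answer) is to subclass the table view and let drags that start on a button still cancel into a scroll:
@interface ButtonFriendlyTableView : UITableView
@end

@implementation ButtonFriendlyTableView

- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    // With delaysContentTouches = NO the button highlights immediately;
    // returning YES here lets a drag that begins on the button still turn
    // into a scroll instead of being swallowed by the button.
    if ([view isKindOfClass:[UIControl class]]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end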

UINavigationController and Touch Events in Left / Left-Bottom Corner of Screen

I'm trying to call an action with the UIControlEventTouchDown event for a simple UIButton placed in the bottom-left corner of a UIViewController (which is pushed by a UINavigationController).
I created storyboard to push view controller with button.
And added actions for button to trace touch events.
- (IBAction)touchUpInside:(id)sender {
    NSLog(@"touchUpInside");
}

- (IBAction)touchDown:(id)sender {
    NSLog(@"touchDown");
}
And also added touchesBegan to trace if it is called.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touchesBegan");
}
Now, with this setup, I see strange behaviour. There are touch areas on the left (width = 13) and in the bottom-left (width = 50, height = 50) which respond differently to touches. If you make a touch over those areas, -touchesBegan is not called on touch down, as it would be normally; it is called only after touch up.
I believe the left area is used by the UINavigationController for the interactive pop of a pushed UIViewController. So, two questions here.
What functionality is the bottom-left area responsible for?
How and where can I change the behaviour to pass the touch event to the UIButton (for example, if I want the UIButton to respond to a long touch event when pressing in the "red" area)?
I had this same problem, and I fixed it by disabling the "swipe to go back" (technically called "interactive pop gesture" in UIKit) feature introduced in iOS 7.
Sample code to disable interactive pop gesture:
if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}
I believe this is due to the interactive pop gesture recognizer consuming/delaying touch events near the left edge of the screen (because a swipe to go back starts from the left edge) and thus causing the touch events to not be delivered to controls that are situated near the left edge of the view.
