I have an app with many different buttons arranged in a calculator-like, square/rectangular grid. It is actually extremely similar to the default iOS calculator: there are approximately 6 rows of 4 buttons each.
Problem
The problem I am having involves the buttons in the bottom row (approximately the bottom tenth of the screen on an iPhone 4). They do not behave normally when pressed: instead of responding to a standard short tap, they have to be pressed and held for just under a second before a button press registers.
No other buttons besides this bottom row behave in this fashion.
Additionally, if these buttons are tapped on their upper edge, they behave normally, responding as soon as they are touched. This leads me to believe that the buttons themselves are not the problem, but that there is some problem with the layout of my views.
It should be also noted that this problem is only present on physical devices. On the simulator, the buttons behave normally.
Context
The view controller containing these buttons is not the root view controller of the app. Instead, it is transitioned to like so (nothing fancy here):
[self presentViewController:navController animated:YES completion:nil];
where self is the root view controller.
The view controller I am having problems with is contained within a navigation controller and is presented modally by the root view controller, as shown above.
What I have tried so far
Turning auto layout on and off: same problem
Rearranging the view hierarchy: I moved the problematic buttons both in front of and behind all other views: same problem
Multiple devices (iPhone 4, 4s, 5): same problem (although buttons respond normally on both 3.5 inch and 4 inch simulators)
Testing other apps (when buttons in this region are pressed on other apps, they behave normally)
Additional Information
Everything is laid out in Interface Builder for the problematic view controller
All of the buttons are system buttons with standard settings and are all exactly the same besides their text.
All of the elements of the screen (buttons, labels, etc. ) are subviews of the "view"
The buttons are flush against each other and do not overlap by more than one or two pixels.
The problematic buttons have dimensions of 80 points wide × 44 points tall.
The problematic buttons are flush against the bottom of the screen.
In addition to the buttons, there is one UIImageView and several labels; however, these are at the top of the screen and do not overlap with any of the buttons in any way.
The cause of this issue is that Apple seems to place a gesture recognizer at the bottom of the screen that delays touches in any other view.
After fiddling around with the gesture recognizers on the app's window, I came up with a solution that uses a subclass of UIButton:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL inside = [super pointInside:point withEvent:event];
    if (inside && !self.isHighlighted && event.type == UIEventTypeTouches)
    {
        // Highlight immediately instead of waiting for the delayed touchesBegan:.
        self.highlighted = YES;
    }
    return inside;
}
This method is called right away, although touchesBegan: still arrives delayed. A check that the view actually sits at the bottom of the screen may be advisable to prevent any side effects of this fix.
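For example, a minimal sketch of that bottom-of-screen check in Swift (the class name and edgeThreshold value are illustrative, not part of the original fix):

import UIKit

// Sketch only: apply the early-highlight workaround just for buttons that actually sit
// near the bottom edge of the screen. `edgeThreshold` is an assumed, tunable value.
class BottomEdgeButton: UIButton {
    private let edgeThreshold: CGFloat = 60

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        let inside = super.point(inside: point, with: event)
        // Convert the button's bounds to window coordinates to see where it sits on screen.
        let frameInWindow = convert(bounds, to: nil)
        let nearBottomEdge = frameInWindow.maxY > UIScreen.main.bounds.height - edgeThreshold
        if inside && !isHighlighted && nearBottomEdge && event?.type == .touches {
            isHighlighted = true
        }
        return inside
    }
}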
This sounds like an interaction between the buttons and the UIScreenEdgePanGestureRecognizer (or whatever it is) that is responsible for detecting that the user wants to bring up the system's Control Center.
There are actually two potential issues here:
There can be an interaction (i.e. conflict) between the possibility of a gesture directed at your app and a gesture directed at the system. If you have gesture recognizers, you might have to use delegate methods to mediate between them and the system's gesture recognizers.
There is a well-established bug where a tap near the screen edge (i.e. in the screen edge gesture recognizer's "zone") works but it causes the button to misbehave physically, i.e. it doesn't look as if it's been tapped even though logging shows that it has (see my answer here: https://stackoverflow.com/a/22000692/341994).
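As a rough illustration of the delegate mediation mentioned in the first point (a sketch only; the controller and recognizer names are illustrative, and whether this helps depends on which recognizers are actually in conflict):

import UIKit

class MyViewController: UIViewController, UIGestureRecognizerDelegate {
    // Your own recognizer; target/action omitted for brevity.
    let panRecognizer = UIPanGestureRecognizer()

    override func viewDidLoad() {
        super.viewDidLoad()
        panRecognizer.delegate = self
        view.addGestureRecognizer(panRecognizer)
    }

    // Let a competing recognizer run at the same time instead of delaying or blocking it.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}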
Lukas' answer in Swift, with the highlight also cleared when the touch leaves the button:
extension UIButton {
    public override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let inside = super.pointInside(point, withEvent: event)
        if inside != highlighted && event?.type == .Touches {
            highlighted = inside
        }
        return inside
    }
}
Swift 3 Solution
extension UIControl {
    open override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        let inside = super.point(inside: point, with: event)
        if inside != isHighlighted && event?.type == .touches {
            isHighlighted = inside
        }
        return inside
    }
}
I've written a full solution in Swift based on Lukas' answer. Just make any affected buttons use this class and the problem will be gone:
class BorderBugFixButton: UIButton {

    override func awakeFromNib() {
        super.awakeFromNib()
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "unHighlight", name: UIApplicationWillResignActiveNotification, object: nil)
    }

    deinit {
        NSNotificationCenter.defaultCenter().removeObserver(self)
    }

    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let inside = super.pointInside(point, withEvent: event)
        if inside != highlighted && event?.type == .Touches {
            highlighted = inside
        }
        return inside
    }

    internal func unHighlight() {
        highlighted = false
    }
}
P.S.: For those of you who don't like Storyboards/XIBs, just move the implementation of awakeFromNib() into init().
I've run into this numerous times when updating older Storyboards to iOS 7+, usually when the view controller in question contains some form of UIScrollView. Double-check these two settings in your Storyboard on the view controller object (not the view; the one with the yellow circle). When I unchecked 'Extend Edges' under Top Bars and Bottom Bars, the scroll view's frame was adjusted down by 64 points (the height of the navigation and status bars).
After setting the space between the navigation bar's bottom and the scroll view's top back to 0, the button started working. This might be because scrollView.frame.bottom was 64 points above the bottom of the window, so touches in that area were disregarded: they were technically outside the scroll view's frame even though its content was still displayed there.
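For reference, the programmatic equivalent of those Interface Builder checkboxes, if you prefer to set it in code (a Swift sketch; adjust to your view controller):

override func viewDidLoad() {
    super.viewDidLoad()
    // Equivalent to unchecking "Extend Edges: Under Top Bars / Under Bottom Bars" in the Storyboard.
    edgesForExtendedLayout = []
}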
iOS 9.3, Xcode 7.3
I would suggest making a category on the UIButton class that implements Lukas' answer. For instructions on how to create a category, see this post: How do I create a category in Xcode 6 or higher?
Give it an appropriate name with the traditional "+" sign, i.e. if you name it "BottomOfScreen", then the resulting file name will be "UIButton+BottomOfScreen".
If you are using Objective-C, you will get *.h and *.m files with the new category.
*.h
#import <UIKit/UIKit.h>

@interface UIButton (BottomOfScreen)

@end
*.m
#import "UIButton+BottomOfScreen.h"
#implementation UIButton (BottomOfScreen)
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
BOOL inside = [super pointInside:point withEvent:event];
if (inside && !self.isHighlighted && (event.type == UIEventTypeTouches))
{
self.highlighted = true;
}
else
{
nil;
}
return inside;
}
#end
I was able to fix this issue by disabling delaysTouchesBegan in viewWillAppear
self.navigationController?.interactivePopGestureRecognizer?.delaysTouchesBegan = false
I've prepared this answer in the hopes that someone else might find it helpful.
My problem was a little more difficult to diagnose, but far easier to resolve. The button in question was in a custom sideBarView XIB, along with four other buttons, that I had programmatically initialized and loaded into the view controller. The top four worked fine; only the bottom one didn't seem to work.
CAUSE: The CGRect defined programmatically for the custom sideBarView was actually shorter than the XIB needed. However, since the sideBarView did not clip to bounds, it still showed the lowest button, but any taps on it were registered as taps on the view below rather than as taps on the lowest button.
To discover this, I checked the 3D view debugger, checked the order of the objects in the XIB, and even took comparison snapshots of each of the buttons in the simulator with Color Blended Layers enabled and breakpoints on didTap. It wasn't until I shortened the spacing constraints between the buttons and discovered that only the top of the lowest button would accept a tap that I got the clue that there was a height limitation somewhere (namely in the initialization code).
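A minimal sketch of the kind of fix involved (the names are illustrative, not from the original project): size the container to fit all of its buttons, because subviews that fall outside their parent's bounds are still drawn when clipping is off but never receive touches.

import UIKit

// Compute a container frame tall enough for every button it holds.
func sizedSideBarFrame(buttonHeight: CGFloat, buttonCount: Int, spacing: CGFloat, width: CGFloat) -> CGRect {
    let height = buttonHeight * CGFloat(buttonCount) + spacing * CGFloat(max(buttonCount - 1, 0))
    return CGRect(x: 0, y: 0, width: width, height: height)
}

// While debugging, turning clipping on makes this mistake obvious, because the overflowing
// button stops being drawn as well as being untappable:
// sideBarView.clipsToBounds = true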
Related
I have a custom subclass of UITabBarController which adopts a delegate that has a function to shift the tabBar's frame (specifically frame.origin.y). When the offset is equal to the height of the screen (that is, the bar is hidden off-screen), I have a UIScrollView extending to the bottom of the screen. Within that UIScrollView, I cannot receive touches in the initial frame of the tabBar view.
I have seen recommendations to add interactable subviews to the UITabBar or the controller's view. This is far from elegant, and creates a multitude of design issues when working with views that may take up the whole screen. I have checked out what little public implementation code there is for UITabBarController and UITabBar, but nothing I saw there shows how they block those touches.
I'm aware of the recursive nature of hit tests, but short of overriding the hit test and rerouting the touch in the UITabBarController subclass, which seems rather unclean, I can't think of a generic way to handle this. This question dives into Apple's UITabBarController / UITabBar implementation, but I have included some relevant code for clarity:
class tab_bar_controller: UITabBarController, UITabBarControllerDelegate, tab_bar_setter // has included function
{
    // .... irrelevant implementation

    func shift(visibility_percent: CGFloat) -> CGFloat // returns origin
    {
        self.tabBar.frame.origin.y = screen_size().height - (visibility_percent * self.tabBar.frame.size.height)
        self.tabBar.userInteractionEnabled = visibility_percent != 0 // no effect
        //self.view.userInteractionEnabled = visibility_percent != 0 // blocks all touches within screen.bounds
        return self.tabBar.frame.origin.y
    }
}
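For reference, the hit-test rerouting mentioned above might look roughly like this (a sketch only, not a tested solution; the subclass name is illustrative):

class PassthroughTabBar: UITabBar {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // When the bar has been shifted off the bottom of the screen, decline the touch so it
        // falls through to whatever view sits underneath its old frame.
        if let window = window, frame.origin.y >= window.bounds.height {
            return nil
        }
        return super.hitTest(point, with: event)
    }
}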
I have a UIScrollView (with a clear background), and behind it I have a UIImageView that takes up about 1/3 of the device's height. In order to initially display the image, which sits behind the scroll view, I set the scroll view's contentInset to the same height as the image. This does exactly what I want: it initially shows the image, but scrolling down will eventually cover the image with the scroll view's content.
The only issue is that I added a button on top of the image. However, it cannot be touched, because the UIScrollView actually sits over the top of it (even though the button can be seen due to the clear background). How can I get this to work?
Edit:
The following solved the problem:
// viewDidLoad
self.scrollView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "onScrollViewTapped:"))

...

func onScrollViewTapped(recognizer: UITapGestureRecognizer)
{
    let point = recognizer.locationInView(self.view)
    if CGRectContainsPoint(self.closeButton.frame, point) {
        self.closeButton.sendActionsForControlEvents(UIControlEvents.TouchUpInside)
    }
}
Thanks for the screenshots and the reference to Google Maps doing what you're looking for; I can see what you're talking about now.
I noticed that the image is clickable and is scrolled over, but there is no button showing on the image itself. What you can do is put a clear button in your UIScrollView that covers the image, in order to make it clickable when it is visible. You're not going to be able to tap anything underneath a UIScrollView, as far as I can tell.
Please let me know if that works for you.
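A minimal sketch of that suggestion (the helper, view, and action names are illustrative and assume outlets for the scroll view and the image view exist):

import UIKit

extension UIViewController {
    // Adds a transparent button into the scroll view's content, positioned over the image,
    // so taps in that area reach the button instead of being swallowed by the scroll view.
    func addClearOverlayButton(over imageView: UIImageView,
                               in scrollView: UIScrollView,
                               action: Selector) -> UIButton {
        let overlay = UIButton(type: .custom)
        overlay.backgroundColor = .clear
        overlay.frame = scrollView.convert(imageView.frame, from: imageView.superview)
        overlay.addTarget(self, action: action, for: .touchUpInside)
        scrollView.addSubview(overlay)
        return overlay
    }
}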
A simple solution is to reorder the views in the document outline. The higher a view appears in the outline, the further back it sits in the layer order.
Two things to test:
1) Make sure the image view that contains the button has userInteractionEnabled set to true (the default for UIImageView is false). Although, since the button is a subview added on top of the image view (I assume), this might not help.
2) If that doesn't help, can you instead add the button as a subview of the UIScrollView and set its position to be where the image is? This way it should stay on the image and will be hidden as the user scrolls down, but clickable since it is a child of the ScrollView.
Some code and/or images would help as well.
I think the way to do this is to subclass whatever objects are in your UIScrollView and override touchesBegan / touchesEnded. Then figure out which coordinates are being touched and whether they land within the bounds of your button.
e.g. in Swift this would be:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent?) {
    println("!!! touchesBegan")
    if let touch = touches.first {
        let touchObj: UITouch = touch as! UITouch
        println("touchesBegan \(touchObj.locationInView(self))") // this locationInView should probably target the main screen view and then test coordinates against your button bounds
    }
    super.touchesBegan(touches, withEvent: event!)
}
See :
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIResponder_Class/index.html#//apple_ref/occ/instm/UIResponder/touchesBegan:withEvent:
And:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UITouch_Class/index.html#//apple_ref/occ/instm/UITouch/locationInView:
You should subclass UIScrollView and override -hitTest:withEvent: like so, to make sure it only consumes the touches it should:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *const inherited = [super hitTest:point withEvent:event];
    if (inherited == self) return nil;
    return inherited;
}
Also make sure to set userInteractionEnabled to YES in your image view.
There are two ways you can check whether the touch event fired on the UIButton or not.
Option 1: Add a UITapGestureRecognizer to the UIScrollView. When the scroll view is tapped, the tap gesture returns the touch point in the UIScrollView's coordinate system; you need to convert that point into the main view's coordinate system (that is, self.view) using the following method:
CGPoint originInSuperview = [superview convertPoint:CGPointZero fromView:subview];
After a successful conversion, you can check whether the touch point intersects the UIButton's frame. If it does, you can perform the action you would otherwise perform in the UIButton's selector:
CGRectContainsPoint(buttonView.frame, point)
Option 2: Receive the first touch event when the user touches the iPhone screen and redirect the touch point to the current UIViewController, where you can check for the intersection as described in Option 1 and perform your action.
Option 2 is already integrated successfully in one of my projects, but I have forgotten the name of the library that receives the first tap event and redirects it to the current controller. When I remember its name, I will update this answer.
Hope this helps.
Here's an odd one...
I've got a UIView XIB file that looks like this:
I've connected every UIButton's Touch Down and Touch Up Inside events to two IBAction methods:
- (IBAction)touchUpInside:(id)sender
{
    NSLog(@"touch up inside");
    if (((UIButton *)sender == _enter) || ((UIButton *)sender == _back)) {
        [(UIButton *)sender setBackgroundColor:_color2];
    }
    else {
        [(UIButton *)sender setBackgroundColor:_color1];
    }
}

- (IBAction)touchDown:(id)sender
{
    NSLog(@"touch down");
    [(UIButton *)sender setBackgroundColor:_color2];
}
Everything works except for the bottom-most row of UIButtons, and that's the odd part:
The touch down event is fired, but the button must be held for 0.5 seconds for it to change background color, whereas it is instantaneous for the other buttons.
It ONLY happens for the bottom-most row of UIButtons, as I've tried to switch buttons 7, 8, 9 with buttons back, 0, enter like this:
I've checked in Interface Builder that all the UIButton attributes are the same, and I've tried moving the UIButton objects' order around, as you can see on the left side of the picture, and I'm about out of ideas. Basically, what's odd is that the UIControl behavior differs based on its position in the parent view...
UPDATE: I made the parent UIView tall enough that there are 50 free pixels below the last row, and the UIButtons work fine now. The only explanation I can think of is that there is a UITabBar two view controller levels underneath. Even so, it doesn't make sense.
The documentation says:
"Expect users to swipe up from the bottom of the screen to reveal Control Center. If iOS determines that a touch that begins at the bottom of the screen should reveal Control Center, it doesn't deliver the gesture to the currently running app. If iOS determines that the touch should not reveal Control Center, the touch may be slightly delayed before it reaches the app."
One solution is here:
UIButton fails to properly register touch in bottom region of iPhone screen
But, in your case, I think you should use inputView in UIResponder.
See: https://developer.apple.com/library/ios/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/InputViews/InputViews.html
The inputView is not affected by that problem.
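A minimal sketch of the inputView idea, assuming a text field drives the keypad (the size and names are illustrative):

import UIKit

func installKeypad(on textField: UITextField) {
    let keypad = UIView(frame: CGRect(x: 0, y: 0,
                                      width: UIScreen.main.bounds.width, height: 260))
    keypad.backgroundColor = .lightGray
    // ... lay out your rows of calculator buttons inside `keypad` here ...

    // Per the answer above, views installed as an inputView are managed by the system
    // (like the keyboard) and are not subject to the bottom-edge touch delay.
    textField.inputView = keypad
}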
I have a custom UICollectionViewCell that has a few custom UIView objects residing inside it. Each of these UIViews has a UIButton which responds to Touch Down and Touch Up Inside via IBActions. Basically, I want these buttons to shrink when pressed down and spring back to their original size when let go. I can easily accomplish this with the controls, and the press down and press up work. However, the problem I am facing happens when scrolling is introduced into the mix. The UICollectionView these cells are part of is a scrolling one. If I happen to touch a button as I start my scroll, the Touch Down event is triggered as well as the scrolling event of the UICollectionView. If I recall correctly, this was never the case pre-iOS 7: when a scrolling event was started, the UIButton event wasn't fired off. I think it had to do with delaysContentTouches. This looks to be broken or changed now. It actually still works decently on iPhone, just not on iPad. If I scroll my view on iPad, with my touch starting inside the embedded UIButton, the button will shrink and the button's action will be fired off.
So to restate the issue as plainly as I can: is there any way to ignore touches on embedded UIButtons while scrolling is occurring? Touches work fine when no scrolling is triggered; I just don't want the events to fire if the user is indeed scrolling. Is there any workaround?
If you need any more specific details, I would be happy to help you understand.
You need to subclass the scroll view (collection view or table view) and override:
- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:UIButton.class]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}
Swift:
override func touchesShouldCancelInContentView(view: UIView) -> Bool {
    if view is UIButton {
        return true
    }
    return super.touchesShouldCancelInContentView(view)
}
That's it. Now you can scroll over a button without losing the button's tap event.
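This override is commonly paired with the following scroll view settings, so the button highlights immediately but the tap is still cancelled once a drag starts (a Swift sketch; `collectionView` stands for your scroll view subclass instance):

collectionView.delaysContentTouches = false   // highlight buttons as soon as they are touched
collectionView.canCancelContentTouches = true // but allow a drag to cancel the touch and scroll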
In a UICollectionView of mine, buttons inside of UICollectionViewCells registered TouchUpInside-taps even though the UICollectionView was still decelerating, which sounds like a similar problem to what you're having. I created a UIButton subclass that overrides beginTrackingWithTouch:withEvent and will return NO in case the UIScrollView it's contained in is decelerating or dragging.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    UIView *superView = self;
    while ((superView = [superView superview])) {
        if ([superView isKindOfClass:UIScrollView.class]) {
            UIScrollView *scrollView = (UIScrollView *)superView;
            if (scrollView.isDecelerating || scrollView.isDragging) {
                return NO;
            }
        }
    }
    return [super beginTrackingWithTouch:touch withEvent:event];
}
The easiest thing to try that comes to mind is to check if the UIScrollView (your UICollectionView) is scrolling or dragging when the button action is triggered.
if (!self.collectionView.dragging && !self.collectionView.decelerating)
{
    // do the action because we are not moving
}
Have you tried that?
I have a custom scroll view subclassing UIScrollView. I added a scroll view in my view controller's nib file and changed its class to CustomScrollView. This custom scroll view (made from the XIB) is then added as a subview of self.view.
In this scroll view, I have 3 text fields and 1 UIImageView (named signImageView) added from the XIB. On tapping the UIImageView (which has a UITapGestureRecognizer), a UIView named signView is added to the custom scroll view. I want to allow the user to sign on this view, so I have created a class Signature (.m and .h) subclassing UIView, implemented the touches methods (touchesBegan, touchesMoved and touchesEnded), and initialised the signView as follows:
signView = [[Signature alloc]initWithFrame:signImageView.frame];
[customScrollView addSubview:signView];
But when I start signing on the signView, the view gets scrolled and hence the touches methods don't get called.
I have tried adding signView to self.view instead of the custom scroll view, but in that case the view remains glued to a fixed position when I start scrolling (its frame remains fixed in this case).
Try setting canCancelContentTouches of the scrollView to NO and delaysContentTouches to YES.
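In code, that is (a Swift sketch; `customScrollView` is the scroll view from the question):

customScrollView.canCancelContentTouches = false
customScrollView.delaysContentTouches = true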
EDIT:
I see that a similar question was answered here: Drag & sweep with Cocoa on iPhone (the answer is exactly the same).
If the user taps and holds the signView (for about 0.3-0.5 seconds), then the view's touchesBegan: method gets fired, and all events from that moment on go to the signView until touchesEnded: is called.
If the user quickly swipes through the signView, the UIScrollView takes over.
Since you already have a UIView subclass with touchesBegan: implemented, maybe you could somehow indicate to the user that your app is ready for them to sign (the equivalent of a 'green light').
You could also use touchesEnded: to turn off this green light.
It might be better to add signImageView as a subview of signView (instead of adding it to customScrollView) and hide it when touchesBegan: is fired. You would then add signView to customScrollView at the same place where you currently add signImageView.
With this you achieve that there is effectively only one subview in that place (for better touch-passing efficiency), and you could achieve that green-light effect by un-hiding signImageView in touchesBegan:/touchesEnded:.
If this app behaviour (the 0.3-0.5 s delay) is unacceptable, then you'd also need to subclass UIScrollView. There, Vignesh's method of overriding UIScrollView's touchesShouldBegin: could come to the rescue: you could detect whether the touch occurred in signView and pass it to that view immediately.
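A sketch of that touchesShouldBegin: override in Swift (Signature is the UIView subclass from the question; the scroll view subclass name is illustrative):

class SignatureAwareScrollView: UIScrollView {
    // Deliver touches that start inside the signature view to it immediately,
    // instead of waiting to decide whether the user meant to scroll.
    override func touchesShouldBegin(_ touches: Set<UITouch>, with event: UIEvent?,
                                     in view: UIView) -> Bool {
        if view is Signature {
            return true
        }
        return super.touchesShouldBegin(touches, with: event, in: view)
    }

    // And never cancel touches that the signature view is already handling.
    override func touchesShouldCancel(in view: UIView) -> Bool {
        if view is Signature {
            return false
        }
        return super.touchesShouldCancel(in: view)
    }
}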
Whenever you add a scroll view to your view hierarchy, it swallows all touches; hence you are not getting touchesBegan:. So to get the touches in your signView, you will have to pass the touches on to it, for example with the touchesShouldBegin: override described above.
We achieved this with a UIScrollView subclass that disables the pan gesture recogniser for a list of views that you provide.
class PanGestureSelectiveScrollView: UIScrollView {

    var disablePanOnViews: [UIView]?

    override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        guard let disablePanOnViews = disablePanOnViews else {
            return super.gestureRecognizerShouldBegin(gestureRecognizer)
        }

        let touchPoint = gestureRecognizer.location(in: self)
        let isTouchingAnyDisablingView = disablePanOnViews.first { $0.frame.contains(touchPoint) } != nil

        if gestureRecognizer === panGestureRecognizer && isTouchingAnyDisablingView {
            return false
        }

        return true
    }
}
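Usage is then just a matter of listing the views that should keep their own touch handling (a sketch; `signatureView` is illustrative):

let scrollView = PanGestureSelectiveScrollView()
scrollView.disablePanOnViews = [signatureView]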