I have a custom scroll view subclassing UIScrollView. I added a scroll view in my view controller's nib file and changed its class to CustomScrollView. This custom scroll view (made from the xib) is added as a subview of self.view.
In this scroll view I have 3 text fields and 1 UIImageView (named signImageView), also added from the xib. On tapping the UIImageView (it has a UITapGestureRecognizer), a UIView named signView is added to the custom scroll view. I want to allow the user to sign on this view, so I have created a Signature class (.h/.m) subclassing UIView, implemented the touches methods (touchesBegan, touchesMoved and touchesEnded), and initialised signView as follows:
signView = [[Signature alloc]initWithFrame:signImageView.frame];
[customScrollView addSubview:signView];
But when I start signing on the signView, the view gets scrolled and hence the touches methods don't get called.
I have tried adding signView to self.view instead of the custom scroll view, but in that case the view stays glued to a fixed position when I start scrolling (its frame remains fixed).
Try setting canCancelContentTouches of the scrollView to NO and delaysContentTouches to YES.
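For example, a minimal Swift sketch (customScrollView is the scroll view from the question):
// Wherever you configure the scroll view:
customScrollView.canCancelContentTouches = false // don't cancel touches the sign view is already tracking just to start scrolling
customScrollView.delaysContentTouches = true     // keep the short delay the scroll view uses to decide whether a touch is a scroll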
EDIT:
I see that a similar question was answered here: Drag & sweep with Cocoa on iPhone (the answer is essentially the same).
If the user taps and holds the signView (for about 0.3-0.5 seconds) then the view's touchesBegan: method fires, and from that moment on all events go to the signView until touchesEnded: is called.
If the user quickly swipes through the signView then the UIScrollView takes over.
Since you already have the UIView subclassed with touchesBegan: implemented, maybe you could somehow indicate to the user that your app is ready for them to sign (a 'green light' equivalent).
You could also use touchesEnded: to turn off this green light.
It might be better to add signImageView as a subview of signView (instead of to customScrollView) and hide it when touchesBegan: is fired. You would then add signView to customScrollView at the same place in your existing code where you currently add signImageView.
This way there is effectively only one subview in that place (better touch-passing efficiency), and you can achieve the green-light effect by hiding/un-hiding signImageView in touchesBegan:/touchesEnded:.
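A rough Swift sketch of that idea inside the signature view, assuming signImageView has been moved in as its subview as suggested above:

class Signature: UIView {

    // Assumed: the image view is now a subview of the signature view.
    weak var signImageView: UIImageView?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        signImageView?.isHidden = true   // the "green light": the user has started signing
        // ... existing code that starts a stroke
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        signImageView?.isHidden = false  // indicator off once the touch ends
        // ... existing code that finishes the stroke
    }
}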
If this behaviour (the 0.3-0.5 s delay) is unacceptable then you'd also need to subclass UIScrollView. There Vignesh's method of overriding UIScrollView's touchesShouldBegin: could come to the rescue: you could detect whether the touch occurred in signView and pass it to that view immediately.
Whenever you add a scroll view to your view hierarchy it swallows all touches, hence you are not getting touchesBegan. To get the touches in your sign view you will have to pass the touches on to it. This is how you do it.
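A minimal Swift sketch of that idea, assuming the custom scroll view keeps a weak reference to the sign view (the signView property is an assumption):

class CustomScrollView: UIScrollView {

    // Assumed: set this when signView is created and added.
    weak var signView: UIView?

    override func touchesShouldBegin(_ touches: Set<UITouch>, with event: UIEvent?, in view: UIView) -> Bool {
        // Deliver touches that land in the sign view straight to it.
        if let signView = signView, view.isDescendant(of: signView) {
            return true
        }
        return super.touchesShouldBegin(touches, with: event, in: view)
    }

    override func touchesShouldCancel(in view: UIView) -> Bool {
        // Never take a touch back from the sign view in order to start scrolling.
        if let signView = signView, view.isDescendant(of: signView) {
            return false
        }
        return super.touchesShouldCancel(in: view)
    }
}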
We achieved this with a UIScrollView subclass that disables the pan gesture recogniser for a list of views that you provide.
class PanGestureSelectiveScrollView: UIScrollView {

    var disablePanOnViews: [UIView]?

    override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        guard let disablePanOnViews = disablePanOnViews else {
            return super.gestureRecognizerShouldBegin(gestureRecognizer)
        }

        let touchPoint = gestureRecognizer.location(in: self)
        let isTouchingAnyDisablingView = disablePanOnViews.first { $0.frame.contains(touchPoint) } != nil

        if gestureRecognizer === panGestureRecognizer && isTouchingAnyDisablingView {
            return false
        }

        return true
    }
}
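Usage is then just a matter of telling the subclass which views should win over scrolling, for example (assuming customScrollView uses the subclass above and signView is the drawing view):

// e.g. after the views have been set up
customScrollView.disablePanOnViews = [signView]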
Please check below a simplified version of my view hierarchy. The scroll view has a clear background and covers the two buttons (1 & 2):
UIView
| UIView
| UIButton 1
| UIButton 2
| UIScrollView (pinned to the superview edges - clear color)
At the beginning I wasn't receiving tap gestures for the buttons, as the scroll view is covering them. So I tried to put the scroll view in a container view and override the point(inside:with:) method.
class PassThroughView: UIView {
    //...
    var buttons = [UIView]()

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Return true (the container, and thus the scroll view, takes the touch)
        // only when no visible, interactive button sits under the point;
        // otherwise return false so the touch falls through to the button behind.
        return buttons.allSatisfy { view in
            return view.isHidden ||
                !view.isUserInteractionEnabled ||
                !view.point(inside: convert(point, to: view), with: event)
        }
    }
}
The PassThroughView holds the array of buttons (1 & 2). The new hierarchy becomes:
UIView
| UIView
| UIButton 1
| UIButton 2
| PassThroughView (pinned to the superview edges - clear color)
| UIScrollView (pinned to the superview edges - clear color)
This solved the first part of the problem. I was able to tap on the buttons as well as drag the scroll view.
The bad part is that I can't scroll anymore unless I start the dragging outside the buttons. The buttons are a bit large and this makes the experience not very friendly.
Is there a way to allow the tap gestures to traverse the scroll view and allow the scroll view to receive pan gestures even if the touches started within one of the buttons?
Edit:
I hope the pictures explain a little what the purpose of this is. I need to put a scroll view over all the views so that I can increase the scrollable area (this is very specific to my case). The user might want to initiate a scroll within the take-picture button, and if the gesture is a tap we take a picture.
If I understand your problem right you need a gesture dependency: the pan gesture of the UIScrollView should have a higher priority than the tap gesture of the UIButton.
In the case of a UIButton I have no idea yet how to solve it, because a UIButton is a UIControl that does not use a UIGestureRecognizer. But if you use your own UIView with a UITapGestureRecognizer instead, it should be possible.
Get the UIPanGestureRecognizer from the UIScrollView by using the property
var panGestureRecognizer: UIPanGestureRecognizer { get } see here
Now you are able to combine both recognizers by using
func require(toFail otherGestureRecognizer: UIGestureRecognizer) see here
The final line of code should look like this:
tapGestureRecognizer.require(toFail: panGestureRecognizer)
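A small Swift sketch of that setup, assuming the button is replaced by a plain view with its own tap recognizer (takePictureView and the handler name are assumptions; scrollView is the scroll view pinned over it):

// e.g. in viewDidLoad
let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTakePictureTap(_:)))
takePictureView.addGestureRecognizer(tapGestureRecognizer)

// The tap is only recognised once the scroll view's pan has failed,
// so a drag that starts on the view still scrolls.
tapGestureRecognizer.require(toFail: scrollView.panGestureRecognizer)

@objc private func handleTakePictureTap(_ recognizer: UITapGestureRecognizer) {
    // take the picture here
}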
Sorry for such a long question, but I felt I should convey what I have tried.
I've got a view viewA within a navigation controller. I am then adding a subview viewB (that contains a UITableView) to viewA and offsetting its origin so that it covers only half the screen (with the other half overflowing out the bottom of the screen). I want to be able to then drag this viewB upwards but have it stop when it hits the bottom of the navigation bar, and similarly stop when dragged back down at the origin offset point. This I have achieved successfully.
However, I want the UITableView interaction to only be enabled when viewB is in its upper position and thus not respond to gestures in any other position. Essentially, dragging viewB up so that it completely covers viewA should enable interaction with the UITableView.
The tricky part here is that I want it to do the following:
If viewB is in its upper position so that it is covering the screen, the UITableView content offset is 0 (i.e. we are at the top of the table) and the user makes a pan gesture downwards, the gesture should not interact with the UITableView but should move viewB downwards.
Any other pan gesture in the above condition should be an interaction with the UITableView.
If viewB is in its upper position so that it is covering the screen, the UITableView content offset is NOT at 0 (i.e. we are NOT at the top of the table) and the user makes a pan gesture downwards, the gesture should interact with the UITableView.
I've been very close to achieving this but I can't get it quite right.
Attempts So Far
I'm using a UIPanGestureRecognizer to handle the dragging of the view. I have tried adding this to:
1. viewB with the UITableView user interaction initially disabled. This allows me to drag viewB up and down without interfering with the UITableView. Once viewB is in its upper position I enable UITableView user interaction which then correctly allows me to interact with the UITableView without moving viewB.
However, by enabling UITableView user interaction, this means touches never reach the UIPanGestureRecognizer, meaning I can never detect for the scenario described in point (1.) above and thus can't re-disable UITableView user interaction to make viewB movable again.
Maybe it is possible to do it this way by overriding the gesture recognition methods used by the UITableView? If this is possible can anyone point me in the right direction?
2. A new view added in front of the UITableView. I thought maybe I could forward the touch gestures to the UITableView behind it when necessary but I still haven't found a way to do this.
All I have been able to do is disable the gesture recognizer which allows me to interact with the UITableView, but then I have the same issue as above. I can't detect when to re-enable it.
3. The UITableView within viewB. This seemed to be the most promising way so far. By setting the return values of the following methods I can enable and disable recognition of either viewB or the UITableView.
func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    if pulloverVC.view.frame.origin.y == bottomNavbarY &&
        pulloverVC.tableView?.contentOffset.y == 0 { // need to add gesture direction check to this condition
        viewBisAtTop = true
        return false // disable pullover control
    }
    return true // enable pullover control
}
func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWithGestureRecognizer otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    if (gestureRecognizer as! UIPanGestureRecognizer).velocityInView(view).y < 0 && viewBisAtTop { // gesture direction check not wanted here
        return true // enable tableview control
    }
    viewBisAtTop = false
    return false // disable tableview control
}
The top method is called first when a gesture is made (I have checked with print statements) followed by the bottom method. By making different combinations of true/false for the 2 methods I can alternate interaction between viewB and the UITableView.
To detect whether the user is swiping downwards I am calling velocityInView() on the recognizer (as shown in the bottom method). I was intending to make this check in the top method's if statement, and I think this would work; however, although velocityInView() works fine in the bottom method, it does not in the top one (the velocity is always 0).
I have scoured SO for a solution and found many similar queries about gesture handling for views that cover each other, but these all seem to be regarding one gesture type, e.g. pinch, on one view, and another type, e.g. pan, on the other. In my case the gesture type is the same for both.
Maybe someone has a clever idea? Or maybe this is actually very simple to do and I have made this incredibly complicated? xD
Managed to get this working.
Of the methods described in my question above I removed the top one keeping just this (it has a few changes):
func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWithGestureRecognizer otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    if ((gestureRecognizer as! UIPanGestureRecognizer).velocityInView(view).y < 0
        || pulloverVC.tableView.contentOffset.y > 0)
        && pulloverVC.view.frame.origin.y == bottomNavbarY {
        return true // enable tableview control
    }
    return false
}
The if statement checks that the covering UITableView is in its upper position AND that either the user is not dragging downwards or the table content is offset (we are not at the top of the table). If this is true, then we return true to enable the tableview.
After this method is called, the standard method implemented to handle my pan gesture is called. In here I have an if statement that checks roughly the opposite of the above; only when it's true does the pan actually move the covering viewB:
func handlePanGesture(recognizer: UIPanGestureRecognizer) {
    let gestureIsDraggingFromTopToBottom = (recognizer.velocityInView(view).y > 0)

    if pulloverVC.view.frame.origin.y != bottomNavbarY ||
        (pulloverVC.view.frame.origin.y == bottomNavbarY && gestureIsDraggingFromTopToBottom && pulloverVC.tableView.contentOffset.y == 0) {
        ...
This now keeps the UITableView interaction off unless its parent view viewB is in the correct position, and when it is, disables the movement of viewB so that only interaction with the UITableView works.
Then, when we are at the top of the table and drag downwards, interaction with the UITableView is disabled again and interaction with its parent view viewB is re-enabled.
A wordy post and answer, but if someone can make sense of what I'm saying, hopefully it will help you.
I have a UIScrollView (with a clear background) and behind it an image that takes up about 1/3 of the device's height. In order to initially display the image, which is sitting behind the scroll view, I set the scroll view's contentInset to the same height as the image. This does exactly what I want, initially showing the image, while scrolling down eventually covers the image with the scroll view's content.
The only issue is that I added a button on top of the image. However, it cannot be touched because the UIScrollView is actually over the top of it (even though the button can be seen due to the clear background). How can I get this to work?
Edit:
The following solved the problem:
// in viewDidLoad
self.scrollView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "onScrollViewTapped:"))

...

func onScrollViewTapped(recognizer: UITapGestureRecognizer)
{
    let point = recognizer.locationInView(self.view)
    if CGRectContainsPoint(self.closeButton.frame, point) {
        self.closeButton.sendActionsForControlEvents(UIControlEvents.TouchUpInside)
    }
}
Thanks for the screenshots and reference to Google maps doing what you're looking for, I can see what you're talking about now.
I noticed that the image is clickable and is scrolled over but there is no button showing on the image itself. What you can do is put a clear button in your UIScrollView that covers the image in order to make it clickable when you're able to see it. You're not going to be able to click anything under a UIScrollView as far as I can tell.
Please let me know if that works for you.
A simple solution is to reorder the views in the document outline. The higher a view is in the outline, the lower it sits as a layer.
Two things to test:
1) Make sure the image view that contains the button has its userInteractionEnabled set to true (the default is false). Although, since the button is a subview added on top of the image view (I assume), this might not help.
2) If that doesn't help, can you instead add the button as a subview of the UIScrollView and set its position to be where the image is (see the sketch below)? This way it should stay on the image and will be hidden as the user scrolls down, but clickable since it is a child of the scroll view.
Some code and/or images would help as well.
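For point 2, a rough Swift sketch (scrollView is assumed; closeButton is the name from the question's edit; the conversion is done while the scroll view is still at its initial offset):

// Move the button into the scroll view's content, keeping it visually where it is.
// Because contentInset.top matches the image height, this lands the button in the
// inset band over the image, so it scrolls away together with the content.
let frameInScrollView = scrollView.convert(closeButton.frame, from: closeButton.superview)
closeButton.frame = frameInScrollView
scrollView.addSubview(closeButton)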
I think the way to do this is to subclass whatever objects are in your UIScrollView and override touchesBegan / touchesEnded. Then figure out which coordinates are being touched and whether they land within the bounds of your button.
e.g. in Swift this would be:
override func touchesBegan(touches: Set<NSObject>, withEvent event: UIEvent?) {
    println("!!! touchesBegan")
    if var touch = touches.first {
        var touchObj: UITouch = touch as! UITouch
        println("touchesBegan \(touchObj.locationInView(self))") // this locationInView should probably target the main screen view and then test coordinates against your button bounds
    }
    super.touchesBegan(touches, withEvent: event!)
}
See:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIResponder_Class/index.html#//apple_ref/occ/instm/UIResponder/touchesBegan:withEvent:
And:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UITouch_Class/index.html#//apple_ref/occ/instm/UITouch/locationInView:
You should subclass UIScrollView and override -hitTest:withEvent: like so, to make sure it only eats the touches it should:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *const inherited = [super hitTest:point withEvent:event];
    if (inherited == self) return nil;
    return inherited;
}
Also make sure to set userInteractionEnabled to YES in your image view.
There are two ways you can check whether the touch event fired on the UIButton or not.
Option 1: Add a UITapGestureRecognizer to the UIScrollView. When the scroll view is tapped, the tap gesture returns a touch point relative to the UIScrollView; you need to convert that touch point to the main UIView (that is, self.view) using the following method.
CGPoint originInSuperview = [superview convertPoint:CGPointZero fromView:subview];
After the conversion succeeds, you can check whether the touch point intersects the UIButton's frame. If it does, you can perform the action you would normally perform in the UIButton's selector (see the sketch below).
CGRectContainsPoint(buttonView.frame, point)
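Putting option 1 together, a rough Swift sketch (scrollView is assumed; closeButton and the idea of the tap handler come from the question's edit above):

// e.g. in viewDidLoad
scrollView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(scrollViewTapped(_:))))

@objc func scrollViewTapped(_ recognizer: UITapGestureRecognizer) {
    // Tap location in the main view's coordinate space.
    let point = recognizer.location(in: view)
    // The button's frame expressed in that same space, in case the button is nested.
    let buttonFrame = view.convert(closeButton.frame, from: closeButton.superview)
    if buttonFrame.contains(point) {
        closeButton.sendActions(for: .touchUpInside)
    }
}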
Option 2: Receive the first touch event when the user touches the screen and redirect the touch point to the current UIViewController, where you can check the intersection as described in option 1 and perform your action.
I have successfully integrated option 2 in one of my projects, but I have forgotten the name of the library that receives the first tap event and redirects it to the current controller. When I remember its name I will update this.
Hope this helps.
I have a UIPageViewController that animates programmatically. The problem is that the view controllers inside it have UIButtons inside them. When I hold down a button and wait until the UIPageViewController animates, the app crashes with the error:
'Failed to determine navigation direction for scroll'
What I think I need to do is to somehow fake that the user releases the button before the UIPageViewController animates.
However, [self.button sendActionsForControlEvents:UIControlEventTouchCancel]; doesn't seem to do the trick. Neither does UIControlEventTouchUpInside.
Is there a better way to do it, or am I using sendActionsForControlEvents wrong?
All sendActionsForControlEvents: does is call any methods you've assigned to the control events passed in for the button. It doesn't call any internal methods to programmatically lift up touches or anything like that.
Right before you programmatically animate your page view controller, try using this method to effectively cancel any touches on the pan gesture recognizer of the page view controller's internal scroll view:
- (void)cancelPanGestureTouchesOfPageViewController:(UIPageViewController *)pageVC
{
    // Since UIPageViewController doesn't provide any API to access its scroll view,
    // we have to find it ourselves by manually looping through its view's subviews.
    for (UIScrollView *scrollView in pageVC.view.subviews) {
        if ([scrollView isKindOfClass:[UIScrollView class]]) {
            // We've found the scroll view, so use this little trick to
            // effectively cancel any touches on its pan gesture recognizer
            BOOL enabled = scrollView.panGestureRecognizer.enabled;
            scrollView.panGestureRecognizer.enabled = !enabled;
            scrollView.panGestureRecognizer.enabled = enabled;
        }
    }
}
(Note that this is messing with the internal view hierarchy of UIPageViewController, so this method is kind of ugly and may break in the future. I generally don't recommend doing stuff like this, but I think in this instance it should be okay.)
I have a UIView called view1. view1 has a subview called subview. I added a UITapGestureRecognizer to subview as follows:
UITapGestureRecognizer *recognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[subview addGestureRecognizer:recognizer];
If I tapped an area where subview and view1 overlap, then the handleTap method got called. But if I tapped an area of the subview that lies outside view1, then handleTap never got called. Is this behavior right? If not, any suggestion as to what I should check?
btw: The UIPanGestureRecognizer works fine. It does not exhibit the behavior mentioned above.
That is the default behaviour of UIView: a subview only receives touches inside its parent view's bounds. If you want something different, it is better to create a custom subclass of the top view and override - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event.
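A Swift sketch of that override on the parent view (the class name is an assumption); the idea is to report a point as "inside" whenever it falls within any subview's frame, so hit-testing continues down into the subview even where it sticks out past the parent's bounds:

class TouchOutsideBoundsView: UIView {

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        if super.point(inside: point, with: event) {
            return true
        }
        // Also claim points that fall inside a visible, interactive subview,
        // even when that subview extends beyond our own bounds.
        return subviews.contains { subview in
            !subview.isHidden &&
            subview.isUserInteractionEnabled &&
            subview.frame.contains(point)
        }
    }
}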
You need to customize the parent view and change the way it handles touches. See this question for more details.
I've found the answers talking about overriding pointInside:withEvent: to be lacking detail on explanation or implementation. In the original question, when the user taps in the black, unlabeled area/view (we'll call it view2), the event framework will only trigger hitTest:withEvent: for the main window down through view2 (and its immediate subviews), and will never hit it for view1, because the point being tested in pointInside: is outside the bounds of view1's frame. In order to get the subview to register the gesture, you should override view2's implementation of hitTest:withEvent: to include a check for the subview's pointInside:.
// This presumes view2 has a reference to view1 (since they're nested in the example).
// In scenarios where you don't have access, you'd need to implement this
// at a higher level in the view hierarchy.

// In view2
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let ptRelativeToSubviewBounds = convert(point, to: view1.subview)
    if view1.subview.point(inside: ptRelativeToSubviewBounds, with: event) {
        return view1.subview
    } else {
        return super.hitTest(point, with: event)
    }
}