I have a side panel that has a UIImageView to which I attached a UIPanGestureRecognizer so that you could push/pull the sidebar.
It works well.
The thing is, some buttons sometimes end up underneath that sidebar. If I pull the sidebar while a button is underneath, the button fires at the same time as the pan.
I am not sure what the problem is, which makes it hard to solve.
Why does my UIImageView pass the UIPanGestureRecognizer event on down the chain?
Problem solved with delegation.
For all buttons, I disable the touch in touchesBegan if a BOOL called isPanning is set to YES.
The protocol defines only one function:
-(void)setPanning:(BOOL)isPanning;
In touchesBegan I check whether the value is YES; if so, I don't fire that event.
I wish it were simpler.
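A minimal sketch of that approach, assuming a button subclass adopts the protocol and the pan handler flips the flag (the class and protocol names here are illustrative, not from the original code):

@protocol PanningAware <NSObject>
- (void)setPanning:(BOOL)isPanning;
@end

@interface PanAwareButton : UIButton <PanningAware>
// The synthesized setter (setPanning:) satisfies the protocol.
@property (nonatomic, getter=isPanning) BOOL panning;
@end

@implementation PanAwareButton
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    if (self.isPanning) {
        return; // the sidebar is being dragged, so swallow the touch
    }
    [super touchesBegan:touches withEvent:event];
}
@end

The pan gesture handler would then call setPanning:YES on every button when the pan begins and setPanning:NO when it ends or is cancelled.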
Edit: I am editing my initial question (see below for the history) as I have new information.
I figured out that when the swipe motion starts inside the button's bounds, we never receive TouchesEnded or TouchesCancelled, only TouchesMoved. However, if I could react in WillEndDragging, that would be great. Is it possible to cancel a gesture in WillEndDragging and also pass that cancellation down the chain of children?
History:
I am using Xamarin Forms and I have the following issue.
I have custom controls inside native scrolling views, like ScrollView or CollectionView, that remain in a "clicked" state after the finger enters them and then initiates a scroll gesture.
I had a similar issue on UWP in the past and managed to solve it with the UIElement.PointerCaptureLost event.
Sorry if I am wasting your time on trivial stuff, but I am really stuck and I greatly appreciate your help.
I have tried the different approaches suggested, including setting DelaysContentTouches to NO, playing around with CanCancelContentTouches, and overriding TouchesShouldCancelInContentView to always return NO, all in a ScrollView custom renderer.
I have had a read of "Allow UIScrollView and its subviews to both respond to a touch" and "UIScrollView sending touches to subviews".
Maybe the accepted answer here helps, but I am not sure how to get the tag of my custom view.
What I am expecting is for my custom controls to receive a cancelled touch event (or something similar), as happens on both Android and Windows.
This was easier than it looked. I solved it by attaching a UIGestureRecognizerDelegate to my UIGestureRecognizer and, in the delegate, overriding ShouldRecognizeSimultaneously to return true.
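For reference, Xamarin's ShouldRecognizeSimultaneously corresponds to the gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: delegate method in UIKit; a minimal Objective-C sketch of the same fix (the delegate class name is illustrative):

@interface AlwaysSimultaneousDelegate : NSObject <UIGestureRecognizerDelegate>
@end

@implementation AlwaysSimultaneousDelegate
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Let the custom recognizer run alongside the scroll view's own pan gesture.
    return YES;
}
@end

// Wherever the custom recognizer is created:
customRecognizer.delegate = alwaysSimultaneousDelegate;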
I have a super view that has a UITapGestureRecognizer on it. It allows touches within the view because there are clickable items within the view.
When these items are clicked, I want to take a specific action, not the generic one that covers the entire superview. Unfortunately, in the TouchDown event of my child control I don't know how to stop the event there. I know I could create a kludge flag, but this seems like the wrong way to go.
Any advice?
James
OK, I got a solution. It was totally my problem. I was playing around trying to get all touches to work, and at one point I had set cancelsTouchesInView = true on the superview's UITapGestureRecognizer. While this didn't stop the other touches from happening, for whatever reason the touches carried through to the superview as well. I understand that this explanation probably makes no sense, but that's what did it. I'm still trying to wrap my head around how iOS does touch.
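For anyone hitting the same thing, the property in play is cancelsTouchesInView on UIGestureRecognizer; a minimal sketch of its effect, assuming a tap recognizer on the superview (the selector name is a placeholder):

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
// Default is YES: once the tap is recognised, in-flight touches in the view hierarchy
// are cancelled, so subviews get touchesCancelled: instead of touchesEnded:.
// Set it to NO if the child controls should still receive their touches normally.
tap.cancelsTouchesInView = NO;
[self.view addGestureRecognizer:tap];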
I'm working on an app with a musical keyboard component.
I need two types of "sent events" to trigger the keys of the keyboard (UIButtons).
1) "Touch Down" triggers the buttons the way I need it to.
2) The second way I need buttons to be triggered is by sliding onto a button from an adjacent button/key, so that it behaves as if it were "touched down" upon when the finger slides onto it from the left or right.
How do I achieve this?
You can't do this using the built-in control events of the buttons, for the simple reason that you don't get an event in a button at all unless the touch is initially in that button (as I explain here: https://stackoverflow.com/a/40414929/341994).
Still, this doesn't sound very hard to do. The simplest approach is probably to put the touch response (such as a gesture recognizer) into the common superview of all the buttons. The superview can then track the gesture. And it can very easily find out which button the touch is currently inside at any given moment. So it can manage the whole interaction. It can even send messages to the buttons telling them when to highlight and unhighlight. (And if you aren't going to use the button touch handling for anything, you might even want to give up the idea that these are buttons; they could just be views or custom controls that look like buttons.)
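A minimal sketch of that idea, assuming the keys are UIButton subviews of a common container view; the zero-duration long-press recognizer is just one way to get continuous touch tracking, and noteOn:/noteOff: are hypothetical methods standing in for whatever starts and stops a key's sound:

// e.g. in viewDidLoad of the controller that owns the keyboard container:
UILongPressGestureRecognizer *tracker =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(trackTouch:)];
tracker.minimumPressDuration = 0; // fire immediately on touch down
[self.keyboardContainer addGestureRecognizer:tracker];

- (void)trackTouch:(UILongPressGestureRecognizer *)gr {
    CGPoint p = [gr locationInView:self.keyboardContainer];
    BOOL touchActive = (gr.state != UIGestureRecognizerStateEnded &&
                        gr.state != UIGestureRecognizerStateCancelled);
    for (UIButton *key in self.keyboardContainer.subviews) {
        BOOL inside = touchActive && CGRectContainsPoint(key.frame, p);
        if (inside && !key.highlighted) {
            key.highlighted = YES;
            [self noteOn:key];   // hypothetical: start this key's sound
        } else if (!inside && key.highlighted) {
            key.highlighted = NO;
            [self noteOff:key];  // hypothetical: stop this key's sound
        }
    }
}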
This is a pretty hypothetical question just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, since the ViewController now has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when the drawer is open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to disable the drawer's subviews while panning. Essentially, loop through the drawer's subviews and disable interaction on them.
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    // Turn off touch handling on each direct subview while the pan is in progress.
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
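A minimal sketch of that second case, assuming the drawer handles its own pan and exposes a contentView property (both names are illustrative):

- (void)handleDrawerPan:(UIPanGestureRecognizer *)pan {
    switch (pan.state) {
        case UIGestureRecognizerStateBegan:
        case UIGestureRecognizerStateChanged:
            // While the drawer is being dragged, keep a second finger from
            // reaching the half-revealed content underneath.
            self.contentView.userInteractionEnabled = NO;
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            self.contentView.userInteractionEnabled = YES;
            break;
        default:
            break;
    }
    // ... existing code that moves the drawer goes here ...
}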
I'm trying to use a UIView I've created in Storyboard as a button. I assumed it would be possible to use a UIButton, setting the type to custom. However I was unable to add subviews to a custom UIButton in Storyboard.
As such, I've just spent the last hour reinventing the wheel by making my own custom gesture recognizers to reimplement button functionality.
Surely this isn't the best way of doing it though, so my question - to more experienced iOS developers than myself - is what is the best way to make a custom button?
To be clear it needs to:
Use the UIView I've created as its hittable area.
Be able to show a different state depending on whether it is currently highlighted or not (i.e. touch down).
Perform some action when actually tapped.
Thank you for your help.
You can use a UIButton, set the type to custom, and then programmatically add your subviews...
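For example, something along these lines (the image name and action selector are placeholders):

UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
button.frame = CGRectMake(0, 0, 200, 80);

// Add whatever subviews the design needs; they should not intercept touches themselves.
UIImageView *icon = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"icon"]];
icon.userInteractionEnabled = NO;
[button addSubview:icon];

[button addTarget:self action:@selector(buttonTapped:) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:button];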
Change your UIView into a UIControl in the storyboard. Then use the method [controlViewName addTarget:self action:@selector(*click handler method*) forControlEvents:UIControlEventTouchDown];, where *click handler method* is a placeholder for the name of your handler. Use the same call, swapping UIControlEventTouchDown for UIControlEventTouchUpInside or UIControlEventTouchDragExit, to call a method when the user finishes their click or drags their finger out of the view, respectively (a sketch follows the list below). I used this for something I'm working on now and it works great.
In Touch down you will want to: highlight all subviews
In Touch up inside you will want to: unhighlight all subviews and perform segue or do whatever the button is supposed to do
In Touch Drag Exit you will want to: unhighlight all subviews
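A minimal sketch of that wiring (the selector names and the highlighting helper are illustrative; the addTarget calls would go in viewDidLoad or wherever the control is set up):

[self.controlView addTarget:self action:@selector(pressDown:) forControlEvents:UIControlEventTouchDown];
[self.controlView addTarget:self action:@selector(pressUpInside:) forControlEvents:UIControlEventTouchUpInside];
[self.controlView addTarget:self action:@selector(pressDragExit:) forControlEvents:UIControlEventTouchDragExit];

- (void)pressDown:(UIControl *)sender {
    [self setSubviewsOfControl:sender highlighted:YES];
}

- (void)pressUpInside:(UIControl *)sender {
    [self setSubviewsOfControl:sender highlighted:NO];
    // perform the segue or whatever the button is supposed to do
}

- (void)pressDragExit:(UIControl *)sender {
    [self setSubviewsOfControl:sender highlighted:NO];
}

- (void)setSubviewsOfControl:(UIControl *)control highlighted:(BOOL)highlighted {
    for (UIView *subview in control.subviews) {
        // Only subviews that expose a highlighted property (e.g. UIImageView, UILabel) respond here.
        if ([subview respondsToSelector:@selector(setHighlighted:)]) {
            [(id)subview setHighlighted:highlighted];
        }
    }
}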
See the second answer by LiCheng in this similar SO post.
Subclass UIControl instead. You can add subviews to it, and it can respond to actions.
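A rough sketch of that subclass approach (the class name is illustrative):

@interface ButtonLikeView : UIControl
@end

@implementation ButtonLikeView
// UIControl updates `highlighted` for us while tracking touches;
// override the setter to change the appearance of the view and its subviews.
- (void)setHighlighted:(BOOL)highlighted {
    [super setHighlighted:highlighted];
    self.alpha = highlighted ? 0.5 : 1.0;
}
@end

// The view laid out in the storyboard gets its class set to ButtonLikeView,
// and the tap is wired up with target-action:
[buttonLikeView addTarget:self action:@selector(viewTapped:) forControlEvents:UIControlEventTouchUpInside];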
Why are you implementing your own gesture recognizer? I recommend using the UIView, so you can add subviews in Interface Builder, and adding a UITapGestureRecognizer to it. You can even do this graphically, since you don't care about iOS 4 support.
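In code, that amounts to roughly this (the view and selector names are placeholders):

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(customViewTapped:)];
[self.customView addGestureRecognizer:tap]; // customView is the view built in Interface Builder

- (void)customViewTapped:(UITapGestureRecognizer *)recognizer {
    // highlight and perform the button's action here
}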