How to programmatically connect one UIPanGestureRecognizer to multiple views? - ios

How can I use just one UIPanGestureRecognizer? I have multiple views, and each of them currently has its own UIPanGestureRecognizer. How can I make it so that, when the user holds a finger on one view and moves it across the screen, the tag of the view under the finger changes (along with the view itself) until the user lifts the finger from the screen? It's a little bit hard to explain... I hope you will understand. Thank you for your answers.

I believe you are trying to create a single pan gesture recogniser that will work with multiple views, and for you to know which view is currently under the user's finger during the pan. If this is the case, then this should help...
Create a view to act as a container for all the views you want to participate in the pan.
I've given each view its own colour to make it visually obvious.
I've also added a label to each view to show its tag.
Hook up a single UIPanGestureRecognizer to the container view and attach its selector to a method in your view controller class.
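Since the question asks about doing this programmatically, here is a minimal sketch of that hookup (containerView is a hypothetical outlet name):
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(panGestureRecognizerTriggered:)];
    // A single recogniser on the container is enough; its callback works
    // out which subview is under the finger.
    [self.containerView addGestureRecognizer:pan];
The handler in the view controller then looks like this: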
- (IBAction)panGestureRecognizerTriggered:(UIPanGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:recognizer.view];
    // Find the view that is currently under the user's finger
    for (UIView *view in recognizer.view.subviews) {
        if (CGRectContainsPoint(view.frame, location)) {
            NSLog(@"View %ld at %@", (long)view.tag, NSStringFromCGPoint(location));
            // Found the view, stop searching :)
            break;
        }
    }
}
This method iterates over the subviews of the view the gesture recogniser is attached to and determines which subview is currently under the user's finger, printing its tag and the current location.
Admittedly this probably isn't going to be particularly efficient if you're dealing with lots of views, but for a simple case such as this it gets the job done.
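If the linear search ever does become a bottleneck, a hedged alternative is to lean on UIKit's own hit-testing instead of looping; note that hitTest:withEvent: only returns views that have user interaction enabled:
    - (IBAction)panGestureRecognizerTriggered:(UIPanGestureRecognizer *)recognizer
    {
        CGPoint location = [recognizer locationInView:recognizer.view];
        // hitTest:withEvent: walks the hierarchy for us and returns the
        // deepest descendant containing the point (a nil event is fine here).
        UIView *hitView = [recognizer.view hitTest:location withEvent:nil];
        if (hitView && hitView != recognizer.view) {
            NSLog(@"View %ld at %@", (long)hitView.tag, NSStringFromCGPoint(location));
        }
    }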

Related

UIGestureRecognizer not called after setting UIViews frame inside UIScrollView

I am having a very strange issue. Basically, I am implementing a drag-and-drop view with a snap to a horizontal 1D grid. When I drag a view and its center X coordinate becomes bigger or smaller than that of a different view, the non-dragged view should be animated to the left or right of its original position.
This works fine most of the time, but in some specific situations it does not: the view stops receiving any gesture callbacks. I can reproduce this issue, and I have found that when I remove the code that applies the animation, everything works fine.
Basically, this is the code that is called when the dragged view is at a position where the view below it should be moved to the left or right:
/**
 * Animate element to its saved position
 */
- (void)switchElement:(unsigned int)draggedIndex with:(unsigned int)otherIndex
{
    // first animate
    UIView *view = views[draggedIndex];
    UIView *otherView = views[otherIndex];
    // IF I COMMENT THIS OUT, EVERYTHING WORKS FINE
    otherView.frame = [self getImageRectForIndex:draggedIndex];
    // now switch internally
    if (draggedIndex != otherIndex)
    {
        // switch views
        views[draggedIndex] = otherView;
        views[otherIndex] = view;
    }
}
Any idea if there is something to keep in mind when I animate UIViews that have gesture recognizers attached to them?
If somebody is willing, I can paste the whole class here to test it.
SOLUTION
I have some "highlight" views in my design, and I had accidentally moved the relevant views behind those transparent background views. So now I am not using addSubview: but insertSubview:atIndex: instead.
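For illustration, a minimal sketch of that kind of fix, with hypothetical view names (insertSubview:aboveSubview: would achieve the same thing):
    // addSubview: always appends to the end of the subviews array, i.e. on
    // top. Inserting at an explicit index controls the z-order, keeping the
    // interactive view above the transparent highlight view.
    NSUInteger highlightIndex = [self.view.subviews indexOfObject:highlightView];
    [self.view insertSubview:draggableView atIndex:highlightIndex + 1];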
But I am marking @Anthonin C.'s answer as the right one, because it pointed me in the correct direction (I found it out by overriding the hitTest: method).
Would you please track the otherView.bounds property to verify that your touch isn't out of bounds? I faced this issue recently, and the Apple documentation provides a solution here: Delivering touch events to a view outside the bounds of its parent view.
Sorry, I don't have enough reputation to comment on your answer.

control multiple buttons with one swipe gesture swift

I have an array of UIImageViews drawn on the main view (as subviews) in a matrix.
The UIImageViews react when I tap on them; each one works like a pixel: when I touch it, it turns on (goes from black to green).
But I want to do this with a swipe gesture, so that with one swipe I can trigger more than one "pixel" (UIImageView).
I found this for Android: triggering-multiple-buttonsonclick-event-with-one-swipe-gesture
I wonder if there is something like that in iOS with Swift that recognises a general touch (not a tap or swipe) so I can look into it.
The main purpose of all of this is to draw "shapes" on a matrix of pixels with one swipe gesture.
If there is another way that you think would help, I would be happy to hear about it.
Many thanks
You are looking for UIGestureRecognizer.
With this you can add many types of gestures, such as swipes, touches, etc.
You can also get the position, duration, and basically all the information about the gesture.
You can find a step-by-step tutorial at this link:
http://www.raywenderlich.com/76020/using-uigesturerecognizer-with-swift-tutorial
And also in the Apple documentation:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIGestureRecognizer_Class/
I managed to do the swipe action using touchesMoved and touchesEnded.
The main idea is to find the affected UIImageViews by taking the coordinates of the touch and comparing them to the UIImageViews' coordinates in the touchesMoved function,
using flags to skip already-edited UIImageViews while I am in the same touch session (the finger is still on the screen), and resetting the UIImageViews to be editable again in touchesEnded.
func swipeTouches(touches: NSSet!) {
    // Get the first touch and its location in this view controller's view coordinate system
    let touch = touches.allObjects[0] as! UITouch
    let touchLocation = touch.locationInView(self.view)
    for pixel in pixelArrays {
        // Convert the pixel view's frame to this view controller's view coordinate system
        let pixelViewFrame = self.view.convertRect(pixel.pixelImage.frame, fromView: pixel.pixelImage.superview)
        // Check if the touch is inside the pixel view
        if CGRectContainsPoint(pixelViewFrame, touchLocation) {
            // Only update pixels that haven't been handled yet in this touch session
            if !pixel.isEditable {
                let index = pixel.index
                pixelArrays.insert(updatePixel(index), atIndex: index)
            }
        }
    }
}
The only problem I have now is that if the swipe begins on one of the UIImageViews, the touchesMoved function treats that view as the one to track coordinates in, and the other UIImageViews are not affected.
My idea to solve it is to add a layer on top of all of the UIImageViews, disable the tap recognition they already have, and implement the tap with coordinates as well.
I will be happy to hear if there is another way to do it.
Update:
I managed to solve the problem above along the lines of what I wrote, but instead of adding another layer I disabled touch on all of the UIImageViews and invoke them using the coordinates of the touch.
Many thanks

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question, just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that, I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way that I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, as now it has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to make the subviews of the drawer disabled while panning. Essentially, loop through the drawer's subviews and disable interaction on them:
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
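As a minimal sketch of that second suggestion (handleDrawerPan: and the contentView property are hypothetical names):
    - (void)handleDrawerPan:(UIPanGestureRecognizer *)recognizer
    {
        switch (recognizer.state) {
            case UIGestureRecognizerStateBegan:
            case UIGestureRecognizerStateChanged:
                // Freeze the drawer's content mid-pan so a second finger
                // can't start scrolling the half-visible map.
                self.contentView.userInteractionEnabled = NO;
                break;
            case UIGestureRecognizerStateEnded:
            case UIGestureRecognizerStateCancelled:
                self.contentView.userInteractionEnabled = YES;
                break;
            default:
                break;
        }
    }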

Can two UICollectionViews respond to a single gesture?

I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen - both of them when it's a horizontal drag and only one of them when it's a vertical drag, a little like some media centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
I have a pan gesture recogniser on the View Controller with a Delayed Begin that can detect the vertical or horizontal movement before events are sent through to the views.
I know that simply forwarding events from my parent view's touchesBegan: etc. won't work because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually but I'd prefer to have the natural drag, swipe and snap paging behaviour for the Collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.
The short answer is that you can't - not easily, anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within another. The containing one is limited to horizontal scrolling and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and - since a collection view is a subclass of scroll view - the controller also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery was in my cellForItemAtIndexPath: method where, when I instantiate the container cell, I need to register .xibs for reuse (each overlay is different), use the remembered overlay page, and issue both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly.
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and in the case of the container collection the overlay collection IBOutlet).
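For what it's worth, a hedged sketch of the container-cell configuration described above (ContainerCell, its overlayCollection outlet, and the reuse identifiers are hypothetical names; only the container branch is shown):
    - (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                      cellForItemAtIndexPath:(NSIndexPath *)indexPath
    {
        ContainerCell *cell =
            [collectionView dequeueReusableCellWithReuseIdentifier:@"ContainerCell"
                                                      forIndexPath:indexPath];
        // Each container page hosts a different overlay, so register the
        // nib for this page's overlay cells on the embedded collection.
        [cell.overlayCollection registerNib:[UINib nibWithNibName:@"OverlayCell" bundle:nil]
                 forCellWithReuseIdentifier:@"OverlayCell"];
        // Restore the remembered overlay page so the embedded collection
        // reappears where the user left off.
        NSIndexPath *overlayPath = [NSIndexPath indexPathForItem:overlayNumber inSection:0];
        [cell.overlayCollection reloadItemsAtIndexPaths:@[overlayPath]];
        [cell.overlayCollection scrollToItemAtIndexPath:overlayPath
                                       atScrollPosition:UICollectionViewScrollPositionCenteredVertically
                                               animated:NO];
        return cell;
    }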
And not a gesture recogniser in sight.

How to make UIView stop receiving touch events?

I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to perform an action once the user's finger has reached it. I've got a lot of views, so the typical approach - iterating over each view to see if a touch is inside its bounds - is a no-go; there's just too much lag. Is there any other way to pass touch events from one view to another (that is, a view other than the first one touched)? I thought maybe there was some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch - if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, moving the finger without lifting it means the views around the first view do not register the touch. Only if the user lifts the finger and puts it back down will the surrounding views register it.
So my question is: let's say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2 - what do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the userInteractionEnabled property of the UIView to NO:
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
Ok, I figured out what was going on. Thing is, I have my views as subviews of a scroll view, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews - but apparently the scroll view was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems; for example, touching two views one after the other would select and highlight both, but if I took the first finger off the screen, both views would deselect. This was not the desired behavior.
So what I did is subclass the scroll view and override the touch handling methods to forward the events to the next responder, which is its superview - the view where I'm doing my touch handling. Now it works!
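A minimal sketch of that kind of subclass, assuming the superview implements the touch handling (the class name is hypothetical):
    #import <UIKit/UIKit.h>

    @interface ForwardingScrollView : UIScrollView
    @end

    @implementation ForwardingScrollView

    // Pass every touch phase up to the superview instead of letting the
    // scroll view swallow it.
    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        [self.superview touchesBegan:touches withEvent:event];
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
        [self.superview touchesMoved:touches withEvent:event];
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        [self.superview touchesEnded:touches withEvent:event];
    }

    - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
        [self.superview touchesCancelled:touches withEvent:event];
    }

    @end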
