Can two UICollectionViews respond to a single gesture? - ios

I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen - both of them when it's a horizontal drag and only one of them when it's a vertical drag, a little like some media centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
I have a pan gesture recogniser on the View Controller with a Delayed Begin that can detect the vertical or horizontal movement before events are sent through to the views.
I know that simply forwarding events from my parent view's touchesBegan: etc. won't work because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually but I'd prefer to have the natural drag, swipe and snap paging behaviour for the Collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.

The short answer is that you can't; not easily, anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within another. The containing one is limited to horizontal scrolling, and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and - since a collection view is a subclass of scroll view - this also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery is in my cellForItemAtIndexPath: method. When I instantiate the container cell I need to register the overlay .xibs for reuse (each overlay is different), look up the remembered overlay page, and send both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly.
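Roughly, the container side of that method ended up looking like the sketch below. The names (ContainerCell, overlayCollection, OverlayCell) are placeholders for my own classes and identifiers, so treat this as an outline rather than drop-in code:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if ([collectionView isEqual:containerCollection]) {
        ContainerCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"ContainerCell"
                                                                        forIndexPath:indexPath];

        // Each overlay is different, so register the appropriate overlay .xib on the embedded collection.
        [cell.overlayCollection registerNib:[UINib nibWithNibName:@"OverlayCell" bundle:nil]
                 forCellWithReuseIdentifier:@"OverlayCell"];
        cell.overlayCollection.dataSource = self;
        cell.overlayCollection.delegate   = self;

        // Restore the remembered overlay page for this container page.
        NSIndexPath *overlayPath = [NSIndexPath indexPathForItem:overlayNumber inSection:0];
        [cell.overlayCollection reloadItemsAtIndexPaths:@[overlayPath]];
        [cell.overlayCollection scrollToItemAtIndexPath:overlayPath
                                       atScrollPosition:UICollectionViewScrollPositionCenteredVertically
                                               animated:NO];
        return cell;
    }

    // Otherwise this is an embedded overlay collection asking for one of its own cells.
    return [collectionView dequeueReusableCellWithReuseIdentifier:@"OverlayCell" forIndexPath:indexPath];
}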
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and in the case of the container collection the overlay collection IBOutlet).
And not a gesture recogniser in sight.

Related

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question, just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, because now it has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to disable the subviews of the drawer while panning. Essentially, loop through the drawer's subviews and disable interaction on them:
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop){
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
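As a rough sketch of that second suggestion, assuming the drawer handles its own pan recogniser and exposes a contentView property for whatever it contains (both names are illustrative):

// Hypothetical drawer pan handler - disables the content while the drawer is being dragged.
- (void)handleDrawerPan:(UIPanGestureRecognizer *)pan
{
    switch (pan.state) {
        case UIGestureRecognizerStateBegan:
        case UIGestureRecognizerStateChanged:
            self.contentView.userInteractionEnabled = NO;   // the map etc. ignores stray touches
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
        case UIGestureRecognizerStateFailed:
            self.contentView.userInteractionEnabled = YES;
            break;
        default:
            break;
    }
    // ... update the drawer's position from [pan translationInView:self] as usual ...
}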

Detect user dragging items out of UICollectionView?

I've got a UICollectionView, and I'd like to be able to touch-and-drag items up and out of the View, and thus delete them. (Very much along the same lines as how the Dock works on OS X: drag something off and let go, and it is removed).
I've done some research, but almost everything I find is looking for CollectionViews that are drag-and-drop to reorder. I don't need to reorder (I'm happy to just remove the item at the given index from the source array and then reload), I just need to detect when an item is moved outside of the View and released.
So I suppose my questions are these:
1) Is that possible with the built-in CollectionView, some kind of itemWasDraggedOutsideViewFromIndex: method or something?
2) If not, is it something that can be done with a subclass (and specifically is it possible for a CollectionView beginner)?
3) Are there any code samples or tutorials you can recommend that do this?
Here is a helper class I've been working on that does just that: https://github.com/Ice3SteveFortune/i3-dragndrop. Hope it helps. There are examples of how to use it in the TestApp.
UPDATE
About a year on, this is now a full-on drag-and-drop framework. Hope this proves useful: https://github.com/ice3-software/between-kit
There is no built-in method like the one you're suggesting. What you want can be done, but you'll have to handle it with a gesture recognizer and appropriate code for the drag/drop operation.
I tried using a subclass to do this and finally went back to putting it in my view controller. In my case, though, I was dragging stuff in/out of the collection view as well as two other views on the screen.
I don't know if you have the book, but the most helpful thing I found was Erica Sadun's Core iOS 6 Developer's Cookbook, which has excellent code on drag/drop within collection views. I don't think it specifically addresses dragging outside of the collection view, but for me the solution was to put the gesture recognizer on the common superview and always use its coordinates rather than the subview's coordinates.
One problem I hit was that I wanted to be able to select cells with a tap as well as drag them, and there is no way (despite Apple's docs to the contrary) to require the single-tap gesture to fail on the collection view. As a result, I ended up having to use a long-press gesture for the entire operation, and there is no translationInView for a long press (only locationInView), so that required some additional work:
iOS - Gesture Recognizer translationInView
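To illustrate, a stripped-down long-press handler might look like the sketch below. The recogniser is attached to the common superview, and draggedView and lastLocation are hypothetical ivars, so take it as an outline of the approach only:

- (void)handleLongPress:(UILongPressGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:gesture.view];  // coordinates of the common superview

    switch (gesture.state) {
        case UIGestureRecognizerStateBegan:
            // Pick up the cell under the finger here (e.g. via indexPathForItemAtPoint:).
            lastLocation = location;
            break;
        case UIGestureRecognizerStateChanged: {
            // No translationInView on a long press, so derive the delta ourselves.
            CGPoint delta = CGPointMake(location.x - lastLocation.x, location.y - lastLocation.y);
            draggedView.center = CGPointMake(draggedView.center.x + delta.x,
                                             draggedView.center.y + delta.y);
            lastLocation = location;
            break;
        }
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            // Decide whether the drag ended outside the collection view (delete) or snap back.
            break;
        default:
            break;
    }
}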
Another thing that will make it harder or easier is the number of possible drop targets you have. I had many, in many different types of views (plain UIViews, collection views, and scroll views). I found it necessary to maintain a list of "drop targets" and to test for intersections with targets as the dragged object was moved. Somehow, you have to be able to determine whether the view you're intersecting is a place where a drop can occur.
If you are addressing the specific situation of dragging something out of a view to delete it (like dragging to a trash can view) and that's it, this should not be complicated. You have to remember that when you do a transform your frame becomes meaningless, but the center is still good; so you end up using the center for everything that you would normally use the frame for.
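The drop-target test then boils down to something like the following sketch, where dropTargets is a hypothetical array of candidate views and the dragged view's center is used instead of its frame:

- (UIView *)dropTargetUnderDraggedView:(UIView *)draggedView
{
    for (UIView *target in dropTargets) {
        // Frames are meaningless once a transform is applied, so test the centre instead.
        CGPoint centerInTarget = [draggedView.superview convertPoint:draggedView.center toView:target];
        if (CGRectContainsPoint(target.bounds, centerInTarget)) {
            return target;
        }
    }
    return nil;  // not over any drop target
}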
Here is the closest thing I found online that was helpful; I didn't end up using this class though as I thought it would be too complicated to implement in my app.
http://www.ancientprogramming.com/2012/04/05/drag-and-drop-between-multiple-uiviews-in-ios/
Hope this has been some help.
Yes there is.
1 - Conform your view to UIDropInteractionDelegate.
2 - Then add this line to your viewDidLoad or init.
For a view controller, add it to viewDidLoad:
self.view.addInteraction(UIDropInteraction(delegate: self))
Or, for a UIView, add it to init:
self.addInteraction(UIDropInteraction(delegate: self))
3 - Then get the location of the item being dragged here and have fun with it:
func dropInteraction(_ interaction: UIDropInteraction, sessionDidUpdate session: UIDropSession) -> UIDropProposal {
    // `self` must be a UIView here; in a view controller, use `self.view` instead.
    print(session.location(in: self))
    return UIDropProposal(operation: .move)
}

hacking ios ui responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view containing subviews A and B, each with a lot of sub-subviews 1, 2, 3, 4, 5, ...
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to switch between A and B with a two-finger pan.
I have tried attaching a UIPanGestureRecognizer to MainView, but the scroll views cancel the touches and it only works sometimes.
I need a consistent way to capture the touches first, detect whether it is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but couldn't find a solution to mine in their answers.
If anyone could shed some light on this, that would be great, as I'm getting desperate.
You can create a top-level view to capture the touches and their coordinates, and then check whether each touch location is inside one of the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
function, where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them into the coordinate system of the app window.
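For example, inside the capture view's touch handlers you could convert both the touch location and a subview's frame into window coordinates before comparing them; subViewA here is just a placeholder for one of your subviews:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    // Passing nil converts both values into the window's coordinate system,
    // so they can be compared directly.
    CGPoint pointInWindow = [touch locationInView:nil];
    CGRect frameInWindow  = [subViewA.superview convertRect:subViewA.frame toView:nil];

    if (CGRectContainsPoint(frameInWindow, pointInWindow)) {
        // The touch is over subViewA - decide whether to forward or swallow it here.
    }
}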
Or this may be more helpful:
Receiving touch events on more then one UIView simultaneously

Gestures that steal touches like iOS multitasking swipe

I know what I want to do, but I'm stumped as to how to do it: I want to implement something like the iOS multitasking gestures. That is, I want to "steal" touches from any view inside my view hierarchy if the number of touches is greater than, say, two. Of course, the gestures are not meant to control multitasking, it's just the transparent touch-stealing I'm after.
Since this is a fairly complex app (which makes extensive use of view controller containment), I want this to be transparent to the views that it happens to (i.e. I want to be able to display arbitrary views and hierarchies, including UIScrollViews, MKMapViews, UIWebViews etc., without having to change their implementation to play nice with my gestures).
Just adding a gestureRecognizer to the common superview doesn't work, as subviews that are interaction enabled eat all the touches that fall on them.
Adding a visually transparent UI-enabled view as a sibling (but in front) of the main view hierarchy also doesn't work, since now this view eats all the touches. I've experimented with reimplementing touchesBegan: etc. in the touchView, but forwarding the touches to nextResponder doesn't work, because that'll be the common superview, in effect funnelling the touches right around the views that are supposed to be receiving them when the touchView gives them up.
I am sure I'm not the only one looking for a solution for this, and I'm sure there are smarter people than me that have this already figured out. I even suspect it might not actually be very hard, and just maybe my brain won't see the forest for the trees today. I'm thankful for any helpful answers anyway :)
I would suggest trying method swizzling, reimplementing touchesBegan: on UIView. I think the best way is to store the number of touches in a static shared variable (so that each view can increment/decrement this value). It's just a very simple idea; take it with a grain of salt.
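To make the idea a bit more concrete, a rough sketch of such a swizzled category might look like this; touchesEnded:/touchesCancelled: would need the same treatment to decrement the counter, so treat it as an outline only:

#import <objc/runtime.h>

@implementation UIView (TouchCounting)

static NSInteger activeTouchCount = 0;  // shared across all views

+ (void)load
{
    // Swap touchesBegan:withEvent: with our counting version.
    Method original = class_getInstanceMethod(self, @selector(touchesBegan:withEvent:));
    Method swizzled = class_getInstanceMethod(self, @selector(tc_touchesBegan:withEvent:));
    method_exchangeImplementations(original, swizzled);
}

- (void)tc_touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    activeTouchCount += touches.count;
    [self tc_touchesBegan:touches withEvent:event];  // calls the original implementation
}

@end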
Hope this helps.
Ciao! :)
A possible, but potentially dangerous (if you aren't careful) approach is to subclass your application UIWindow and redefine the sendEvent: method.
As this method is called for each touch event received by the app, you can inspect the event and then decide to call [super sendEvent:] (if the touch is not filtered), not call it (if the touch is filtered), or defer the call if you are still recognizing the touch.
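A minimal sketch of that window subclass, assuming the filter is "more than two fingers" as in the question (the class name is made up):

@interface StealingWindow : UIWindow
@end

@implementation StealingWindow

- (void)sendEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeTouches && event.allTouches.count > 2) {
        // Handle or record the "stolen" touches here and return without calling
        // super, so the views underneath never see them.
        return;
    }
    [super sendEvent:event];  // everything else is dispatched as usual
}

@end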
Another possibility is to play with the hitTest:withEvent: method but this would require your stealing view to be placed properly in the subview, and I think it doesn't fit well when you have many view controllers. I believe the previous solution is more general purpose.
Actually, adding a gesture recognizer on the common superview is the right way to do this. But it sounds like you may need to set either delaysTouchesBegan or cancelsTouchesInView (or both) to ensure that the gesture recognizer handles everything before letting it through to the child views.
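In other words, something along these lines; containerView and handleStealPan: are placeholders, and the flags and touch counts can be tuned to taste:

UIPanGestureRecognizer *stealPan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleStealPan:)];
stealPan.minimumNumberOfTouches = 2;    // only "steal" multi-finger pans
stealPan.delaysTouchesBegan     = YES;  // hold touches back until recognition succeeds or fails
stealPan.cancelsTouchesInView   = YES;  // cancel them in the subviews once the pan is recognised
[containerView addGestureRecognizer:stealPan];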

UIPanGestureRecognizer.maximumNumberOfTouches not respected in nested scroll views?

I have a root UIScrollView that only scrolls vertically; this scroll view represents the rows in my jagged grid. I have configured this scroll view's pan gesture recognizer to require two touches for both the minimum and maximum number of touches.
Inside this scroll view I have one or more UIScrollView instances that only scroll horizontally; these scroll views each represent a single row in my jagged grid view. I have configured the pan gesture recognizers for all of these scroll views for one touch minimum and two touches maximum.
So far it works: I get a nice jagged grid view where I can scroll vertically between rows, and horizontally to scroll each row independently. I have intentionally set the minimum number of touches to 2 so as not to interfere with scrolling if I add, for example, a UITableView as a subview for any cell within this jagged grid view (a cell being a position defined by a row and a column in that row).
Using a UITableView as a cell works; that is, the table view itself behaves as expected. But scrolling with two fingers also scrolls inside the table view, not the root scroll view for vertical scrolling between rows.
I have tried configuring the table view's pan gesture recognizer to allow a maximum of one touch, in the hope that two-finger touches would be ignored. This does not work; the maximumNumberOfTouches property of the table view's pan gesture recognizer seems to be ignored.
What could I have done wrong?
(A screenshot of the layout accompanied the original question.)
Multiple scrolling tends to get tricky, and I don't know for sure, but I think Apple does not encourage this. Even so, I still think it's possible. It may be that vertical scrolling on the table view gets mixed up with the scroll view's vertical scrolling, or something else.
Try checking if the delegates for the gesture recognizers are correctly set.
Another way around this is:
- having a scroll view with buttons, from which you can open popovers with custom controllers (insert whatever you want there);
- creating a big UITableViewController and setting the cells' contents as scroll views, etc. I think you could get the same result.
My advice is not to get stuck on just one method when there could be others that are simpler and more intuitive.
TableViews on scroll views are generally not a great idea. When a TableView receives touches, even if it doesn't need to do anything with them, it won't send them to its superview.
You might wanna try either of these 2 things:
In your TableView you could send the touches to the superview manually and let it handle them appropriately. I've seen this method being used in one of my side projects, but I'm not able to post an example of it at this time.
The second option might be easier to implement. Since TableView is a subclass of ScrollView, you can rely on the delaysContentTouches property of those TableViews. This property delays the touch-down event on the TableView until it can determine whether scrolling is the intent, as described in the Apple docs: http://developer.apple.com/library/ios/#documentation/uikit/reference/UIScrollView_Class/Reference/UIScrollView.html#//apple_ref/occ/cl/UIScrollView
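For what it's worth, a minimal sketch of the first option could look like the following; ForwardingTableView is a made-up subclass name, and whether the superview does anything useful with the forwarded touches depends entirely on your setup:

@interface ForwardingTableView : UITableView
@end

@implementation ForwardingTableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    [self.superview touchesBegan:touches withEvent:event];  // let the parent see them too
}

// touchesMoved:/touchesEnded:/touchesCancelled: would be forwarded the same way.

@end

The second option is just a property setting, e.g. tableView.delaysContentTouches = YES;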
Let me know if either of the two ways works for you; I'm quite curious about this subject in general.
But why don't you try some tricks rather than implementing all of these changes:
1) By default, disable scrolling of the TableView when the view is created.
2) Once the view is generated, recognise the gestures and work out whether scrolling is done with single or multiple touches. If the user touches the child scroll view, look at its tag and, based on the gestures, enable scrolling of the TableView.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Get the tag of the scroll view.
    // Check whether it is the parent or the child scroll view.
    // If it is the child, just enable scrolling of the table view.
}

- (void)tapAction:(UIGestureRecognizer *)gestureRecognizer
{
    // CGPoint point = [gestureRecognizer locationInView:gestureRecognizer.view];
    // [[gestureRecognizer.view.subviews objectAtIndex:0] removeFromSuperview];
    // imageContent = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 200, 250)];
    // [imageContent setImage:[UIImage imageNamed:@"Default.png"]];
    NSLog(@"You tapped this: %ld", (long)gestureRecognizer.view.tag);
    // Recognise the gestures.
}
There may be some unnecessary code here, since there is no code snippet with your question. I gave it a try and showed a trick that might work for you and solve the problem. ;)
Try this link and pay attention to how they solve nested views.
Remember the Best Practices for Handling Multitouch Events:
When handling events, both touch events and motion events, there are a few recommended techniques and patterns you should follow.
- Always implement the event-cancellation methods. In your implementation, restore the state of the view to what it was before the current multitouch sequence, freeing any transient resources set up for handling the event. If you don't implement the cancellation method, your view could be left in an inconsistent state. In some cases, another view might receive the cancellation message.
- If you handle events in a subclass of UIView, UIViewController, or (in rare cases) UIResponder:
  - implement all of the event-handling methods (even if only as a null implementation);
  - do not call the superclass implementation of the methods.
- If you handle events in a subclass of any other UIKit responder class:
  - you do not have to implement all of the event-handling methods;
  - in the methods you do implement, be sure to call the superclass implementation, for example [super touchesBegan:theTouches withEvent:theEvent];
- Do not forward events to other responder objects of the UIKit framework. The responders that you forward events to should be instances of your own subclasses of UIView, and all of these objects must be aware that event forwarding is taking place and that, in the case of touch events, they may receive touches that are not bound to them.
- Custom views that redraw themselves in response to events should only set drawing state in the event-handling methods and perform all of the drawing in drawRect:.
- Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let UIKit handle responder-chain traversal.
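As a minimal sketch of those guidelines applied to a custom UIView subclass (the class name is illustrative; all four methods are implemented, super is not called, and state is restored on cancellation):

@interface CustomTouchView : UIView
@end

@implementation CustomTouchView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Set up any transient tracking/drawing state here.
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Update state only; the actual drawing happens in drawRect:.
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Commit the result of the gesture.
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Restore the view to its state before the multitouch sequence began.
}

@end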
