I have a root UIScrollView that only scrolls vertically; this scroll view represents the rows in my jagged grid. I have configured its pan gesture recognizer to require two touches for both the minimum and maximum number of touches.
Inside this scroll view I have one or more UIScrollView instances that only scroll horizontally; these scroll views each represent a single row in my jagged grid view. I have configured the pan gesture recognizers of all these scroll views for a minimum of one touch and a maximum of two touches.
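Roughly, the configuration looks like this (the variable names are just for illustration):

// Root scroll view: vertical scrolling between rows, two-finger pan only.
rootScrollView.panGestureRecognizer.minimumNumberOfTouches = 2
rootScrollView.panGestureRecognizer.maximumNumberOfTouches = 2

// Each row scroll view: horizontal scrolling, one or two fingers.
rowScrollView.panGestureRecognizer.minimumNumberOfTouches = 1
rowScrollView.panGestureRecognizer.maximumNumberOfTouches = 2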
So far it works: I get a nice jagged grid view where I can scroll vertically between rows, and horizontally to scroll each row independently. I have intentionally set the minimum number of touches to 2 so as not to interfere with scrolling if I add, for example, a UITableView as a subview of any cell within this jagged grid view (cell == a position defined by a row and a column in that row).
Using a UITableView as a cell works; the table view itself behaves as expected. But scrolling with two fingers also scrolls inside the table view, instead of scrolling the root scroll view vertically between rows.
I have tried configuring the table view's pan gesture recognizer to allow a maximum of one touch, in the hope that two-finger touches would be ignored. This does not work; the maximumNumberOfTouches property of the table view's pan gesture recognizer seems to be ignored.
What could I have done wrong?
[A screenshot displaying the layout, clarifying the setup, accompanied the original question.]
Multiple nested scrolling tends to get tricky, and I don't know for sure, but I think Apple does not encourage this. Even so, I still think it's possible. It may be that vertical scrolling in the table view gets mixed up with the scroll view's vertical scrolling, or something else.
Try checking whether the delegates for the gesture recognizers are correctly set.
Other ways around this are:
- having a scroll view with buttons, from which you can open popovers with custom controllers (insert there whatever you want);
- creating a big UITableViewController and setting the cells' contents as scroll views, etc. I think you could get the same result.
My advice is not to get stuck on just one method when there could be other, simpler and more intuitive ones.
Table views on scroll views are generally not a great idea. When a table view receives touches, even if it doesn't need to do anything with them, it won't send them to its superview.
You might want to try either of these two things:
First, in your table view you could send the touches to your superview manually and let it handle them appropriately. I've seen this method used in one of my side projects, but I'm not able to post an example of it at this time.
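Roughly, though, the idea is a table view subclass that hands its touches to its superview as well - something like this sketch (the class name is mine, and the superview is assumed to implement the actual handling):

import UIKit

final class TouchForwardingTableView: UITableView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        superview?.touchesBegan(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        superview?.touchesEnded(touches, with: event)
    }

    // touchesMoved(_:with:) and touchesCancelled(_:with:) follow the
    // same pattern.
}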
The second option might be easier to implement. Since UITableView is a subclass of UIScrollView, you can use the delaysContentTouches property of those table views. This property delays the touch-down event on the table view until it can determine whether scrolling is the intent, as described in the Apple docs: http://developer.apple.com/library/ios/#documentation/uikit/reference/UIScrollView_Class/Reference/UIScrollView.html#//apple_ref/occ/cl/UIScrollView
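For example, for each table view you embed in a grid cell (tableView here stands for any of them):

// Delay the touch-down event until the scroll view hierarchy can
// determine whether scrolling is the intent.
tableView.delaysContentTouches = true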
Let me know if either of the two ways works for you; I'm quite curious about this subject generally.
But why don't you try some tricks rather than implementing all such changes:
1) By default, disable scrolling of the table view when the view is created.
2) Once the view is generated, recognize the gestures - whether the user is scrolling with a single touch or multiple touches, and whether the touch is on the child scroll view. Look at the tag; based on the gestures, you can enable scrolling of the table view.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Get the tag of the scroll view and check whether it is the parent
    // or a child scroll view (the tag constant and tableView property
    // here are illustrative).
    if (scrollView.tag == kChildScrollViewTag) {
        // If it's a child, just enable scrolling of the table view.
        self.tableView.scrollEnabled = YES;
    }
}

- (void)tapAction:(UIGestureRecognizer *)gestureRecognizer
{
    // Recognize the gesture and identify the tapped view by its tag.
    NSLog(@"You tapped this: %ld", (long)gestureRecognizer.view.tag);
}
There may be some unnecessary code here, since there is no code snippet with your question. I gave it a try and showed a trick that might work for you and solve the problem. ;)
Try this link and pay attention to how they solve nested views.
Remember the best practices for handling multitouch events:
When handling events, both touch events and motion events, there are a few recommended techniques and patterns you should follow.
Always implement the event-cancellation methods.
In your implementation, you should restore the state of the view to what it was before the current multitouch sequence, freeing any transient resources set up for handling the event. If you don’t implement the cancellation method your view could be left in an inconsistent state. In some cases, another view might receive the cancellation message.
If you handle events in a subclass of UIView, UIViewController, or (in rare cases) UIResponder:
- You should implement all of the event-handling methods (even if it is a null implementation).
- Do not call the superclass implementation of the methods.
If you handle events in a subclass of any other UIKit responder class:
- You do not have to implement all of the event-handling methods.
- But in the methods you do implement, be sure to call the superclass implementation. For example:
[super touchesBegan:theTouches withEvent:theEvent];
Do not forward events to other responder objects of the UIKit framework.
The responders that you forward events to should be instances of your own subclasses of UIView, and all of these objects must be aware that event-forwarding is taking place and that, in the case of touch events, they may receive touches that are not bound to them.
Custom views that redraw themselves in response to events should only set drawing state in the event-handling methods and perform all of the drawing in the drawRect: method.
Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let UIKit handle responder-chain traversal.
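As an illustration only, a minimal UIView subclass following these rules might look like this (the isPressed state and its drawing are invented for the example):

import UIKit

final class TouchHandlingView: UIView {
    private var isPressed = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Set drawing state only; the drawing itself happens in draw(_:).
        isPressed = true
        setNeedsDisplay()
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Null implementation, but still present.
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        isPressed = false
        setNeedsDisplay()
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Restore the state the view had before this multitouch sequence.
        isPressed = false
        setNeedsDisplay()
    }

    override func draw(_ rect: CGRect) {
        (isPressed ? UIColor.lightGray : UIColor.white).setFill()
        UIBezierPath(rect: rect).fill()
    }
}

Note that, per the rules above, none of the touch overrides call super.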
Related
I have a custom text input view that adopts the UITextInput protocol. This view is embedded within a UIScrollView. The UITextInput-adopting view includes UITextInteraction. When I drag the selection handles to the edges (top or bottom) of the currently visible area, the view should scroll and select text automatically (like UITextView). Does anybody know how to achieve this?
I was hoping that UITextInteraction might inform me of the necessary events via its delegate, but it does not (nor does it take care of this functionality automatically).
I have tried to intercept closestPosition(to point: CGPoint) -> UITextPosition?, which is called whenever the user touches the UITextInput-adopting view. Therefore, I can use it to track the dragging of a selection handle. Once the user reaches the top of the view, I scroll up. However, I cannot detect when the user lets go of the handle (when the touch ends). When this happens, the scroll view should stop scrolling. In my case, the scroll view keeps scrolling to the top.
I also tried to intercept selectionRects(for range: UITextRange) -> [UITextSelectionRect], but it is called only sporadically during scrolling.
I also cannot detect touchesEnded(); UITextInteraction seems to block the call. Further, I cannot implement my own pan gesture - UITextInteraction blocks this as well during a selection operation.
Has anybody successfully used UITextInteraction? It seems very premature at this stage.
Here's the way I got this info out of UITextInteraction for the text insertion point placement. I have not yet attempted to track the selection handles, but will update the answer if I manage it.
UITextInteraction has a property called gesturesForFailureRequirements. If you dump the contents of these, you'll notice one named UIVariableDelayLoupeGesture, which is a subclass of UILongPressGestureRecognizer. This is the one we want, fired when the user "picks up" the cursor with their finger.
I add my own target/selector to this gesture recogniser when adding my text interaction like so:
for gesture in interaction.gesturesForFailureRequirements {
    if gesture.isKind(of: UILongPressGestureRecognizer.self) {
        gesture.addTarget(self, action: #selector(longPressEdgeScrollGesture(_:)))
    }
}
Then in longPressEdgeScrollGesture you can get the coordinates of the cursor in your scroll view to activate your edge scrolling timer as necessary
@objc private func longPressEdgeScrollGesture(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.numberOfTouches > 0 else {
        return
    }
    let location = gesture.location(in: gesture.view)
    print(location)
    // Start/stop your edge scrolling as required
}
I found a solution to my original problem. It is somewhat of a workaround, but hopefully Apple will improve UITextInteraction.
Trying to intercept any functions in UITextInput led nowhere. Thankfully, some very clever people on Twitter figured out what to do. You need to intercept the right UIGestureRecognizer that is added automatically, much like Simeon explained in his answer. The recognizer in question is called UITextRangeAdjustmentGestureRecognizer. But you cannot find it via gesturesForFailureRequirements. You need to become a delegate of the gestures, and then you can find the mentioned gesture via the delegate method gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:).
Once you add a target + action to that gesture you can observe the dragging and handle edge scrolling.
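A minimal sketch of that approach, assuming self has been made the delegate of the interaction's gestures and rangeAdjustmentGesture(_:) is your own handler (note the check is against a private class name, so it may break in a future iOS release):

func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                       shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    // The range-adjustment recognizer is private; match it by class name.
    if String(describing: type(of: otherGestureRecognizer)) == "UITextRangeAdjustmentGestureRecognizer" {
        // Guard against adding the same target twice if this fires repeatedly.
        otherGestureRecognizer.removeTarget(self, action: #selector(rangeAdjustmentGesture(_:)))
        otherGestureRecognizer.addTarget(self, action: #selector(rangeAdjustmentGesture(_:)))
    }
    return true
}

@objc private func rangeAdjustmentGesture(_ gesture: UIGestureRecognizer) {
    // Observe the drag of the selection handle and start/stop edge
    // scrolling as required.
    print(gesture.location(in: gesture.view))
}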
I have a UIView with a bunch of subviews. Each subview can either be clicked (UIControlEventTouchUpInside) or 'swiped' (UIControlEventTouchDragExit) to perform a different action. Both actions work as intended separately, but since the subviews are really close together, the touch-drag-exit event of one view accidentally activates the touch-up-inside of the subview above it.
In order to avoid this, I was thinking of making the view ignore all other touches to its other subviews until the drag-exit gesture is over. What would be the best way to accomplish this? In other words, how can I detect the start/end of the touch-drag-exit?
Thanks in advance!
This is a pretty hypothetical question, just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way I can think to do this is to wire all of the gesture handlers to the view controller and let it do everything, but to me that breaks encapsulation, as now it has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to disable the drawer's subviews while panning. Essentially, loop through the drawer's subviews and disable interaction on them:
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
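For example, in the drawer's pan handler (contentView is assumed to be the drawer's content container):

@objc private func handleDrawerPan(_ pan: UIPanGestureRecognizer) {
    switch pan.state {
    case .began, .changed:
        // Lock out the drawer's content while the pan is in flight.
        contentView.isUserInteractionEnabled = false
    case .ended, .cancelled, .failed:
        contentView.isUserInteractionEnabled = true
    default:
        break
    }
    // ...then move the drawer using pan.translation(in:) as usual.
}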
I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen - both of them when it's a horizontal drag and only one of them when it's a vertical drag, a little like some media centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
I have a pan gesture recogniser on the View Controller with a Delayed Begin that can detect the vertical or horizontal movement before events are sent through to the views.
I know that simply forwarding events from my parent view's touchesBegan: etc. won't work because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually but I'd prefer to have the natural drag, swipe and snap paging behaviour for the Collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.
The short answer is you can't, easily, anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within the other. The containing one is limited to horizontal scrolling and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and - since a collection view is a subclass of scroll view - that controller also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
-(void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        // Horizontal, paged container: derive the page index from the x offset.
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        // Vertical, paged overlay: derive the page index from the y offset.
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery was in my cellForItemAtIndexPath: method: when I instantiate the container cell, I need to register the .xibs for reuse (each overlay is different), then use the remembered overlay page and issue both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly.
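In Swift terms, the shape of that step is roughly this (the cell class, identifiers, and overlay outlet are stand-ins, not my actual code):

import UIKit

// Illustrative container cell, loaded from a .xib, with an outlet to the
// embedded, vertically scrolling overlay collection view.
final class ContainerCell: UICollectionViewCell {
    @IBOutlet var overlayCollection: UICollectionView!
}

// In the shared data source's cell-for-item method:
func collectionView(_ collectionView: UICollectionView,
                    cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(
        withReuseIdentifier: "ContainerCell", for: indexPath) as! ContainerCell

    // Each overlay is different, so register its nib for reuse here.
    cell.overlayCollection.register(
        UINib(nibName: "OverlayCell", bundle: nil),
        forCellWithReuseIdentifier: "OverlayCell")

    // Restore the remembered overlay page for this container page.
    cell.overlayCollection.reloadData()
    cell.overlayCollection.scrollToItem(
        at: IndexPath(item: overlayNumber, section: 0),
        at: .centeredVertically, animated: false)
    return cell
}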
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and, in the case of the container collection, the overlay collection IBOutlet).
And not a gesture recogniser in sight.
I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to perform an action once the user's finger has reached it. I've got a lot of views, so the typical approach, where I'd iterate over each view to see if a touch is inside its bounds, is a no-go - there's just too much lag. Is there any other way to pass touch events from one view to another (that is, beside the first one)? I thought maybe there was some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch - if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, if the user does not lift their finger up and moves it, the views around the first view do not register the touch. Only if the user lifts his finger and then puts it back down will the surrounding views register the touch.
So my question is, lets say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2 - what do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the userInteractionEnabled property of the UIView to NO.
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
OK, I figured out what was going on. Thing is, I have my views as subviews of a scroll view, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews - but apparently the scroll view was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems; for example, touching two views one after the other would select and highlight both, but if I took the first finger off the screen, both views would de-select. This was not the desired behavior.
So what I did is subclass the scroll view and override the touch-handling methods to send the events on to its superview (the next responder), which is the view where I'm doing my touch handling. Now it works!
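A minimal sketch of that subclass (the superview is assumed to implement the touch handling; the class name is mine):

import UIKit

final class TouchForwardingScrollView: UIScrollView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        superview?.touchesBegan(touches, with: event)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        superview?.touchesMoved(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        superview?.touchesEnded(touches, with: event)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        superview?.touchesCancelled(touches, with: event)
    }
}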