I understand the concept that the focus engine decides what can become focused. It also seems to move linearly, vertically or horizontally, and tries to find the next closest neighbor whose view intersects that vertical or horizontal line.
The problem I haven't solved just yet is how to set things up so that panning to switch between focusable subviews does not get prevented if one of them has a scrollable area (like a map).
I have two collection views that take up the width of the screen and sit one on top of the other. I can pan to switch between these just fine. Here is the code in their shared custom UICollectionView subclass that overrides the focus methods:
// Shared custom UICollectionView subclass: let the focus engine treat it as focusable.
override public func canBecomeFocused() -> Bool {
    return true
}

// Always allow focus updates into and out of this view.
public override func shouldUpdateFocusInContext(context: UIFocusUpdateContext) -> Bool {
    return true
}

public override func didUpdateFocusInContext(context: UIFocusUpdateContext, withAnimationCoordinator coordinator: UIFocusAnimationCoordinator) {
    super.didUpdateFocusInContext(context, withAnimationCoordinator: coordinator)
}
In a separate view controller, I have a map view and a collection view above it. Both take up the width of the screen. I can pan to switch from the collection view down to the map view, but no matter how slow or fast I try to pan/swipe up, I cannot get the map view to lose its focus.
I did try adding some gesture recognizers and setting the delegate methods to try and make my GR win over the map view's scrolling GRs, but to no avail.
Anyone else had a similar experience? How do I get focus back out of the map view without having to add another dialogue or something to switch context back to the collection view?
Thank you in advance.
I'm sure there are a number of ways to solve the problem (like the one suggested in the comment by Eugene, for example). In general, you'll probably need to determine when you want the focus to leave the map and then trigger a focus update with setNeedsFocusUpdate() followed by updateFocusIfNeeded(). You would then override preferredFocusedView (part of UIFocusEnvironment, which is adopted by UIViewController and UIView, among other things) to point the focus wherever you want.
The real trick is to determine when it's appropriate to do this. A map is particularly hard because it's possible that it may scroll for a very long time prior to hitting a boundary (if it ever does). As such, you may need to utilize a button press as suggested by Eugene or perhaps by implementing some of the MKMapViewDelegate methods like mapView(_:regionWillChangeAnimated:) to determine when the map has moved a large distance. The "correct" answer would be determined by your desired behavior, of course.
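As a rough sketch of that shape (the class name, the outlets and the wantsFocusOnCollectionView flag below are assumptions, not from the question), the view controller could look something like this:
import UIKit
import MapKit

class MapAndCollectionViewController: UIViewController {

    @IBOutlet weak var collectionView: UICollectionView!
    @IBOutlet weak var mapView: MKMapView!

    // Assumed flag: flip it when you decide focus should leave the map,
    // e.g. from a button press or an MKMapViewDelegate callback.
    private var wantsFocusOnCollectionView = false

    override var preferredFocusedView: UIView? {
        return wantsFocusOnCollectionView ? collectionView : super.preferredFocusedView
    }

    func moveFocusBackToCollectionView() {
        wantsFocusOnCollectionView = true
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
    }
}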
Related
I am developing an application, which has a map as its core feature. On this map, users can draw and edit polygons, lines, and add points of interest (POIs). To edit a polygon, for instance, one should tap on it, and the application would enter an editing mode after that.
To accomplish this kind of behaviour, I have a transparent overlay view (UIView) that lies just above the map. This view can either 'capture' the user's gesture (i.e. tap, long press, etc.) if it hits an area of the screen that contains a polygon, or pass it down to the map if there is no polygon at the point of the tap. This behaviour is achieved by overriding the UIView method point(inside:with:) (docs here). The pseudocode for the implementation goes like this:
override public func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    if containsPolygons(at: point) {
        return true
    } else {
        return false
    }
}
However, I have an edge case that has me stuck a bit. In addition to the previously described behaviour, I want to be able to move the map when I put my finger down on a polygon and start sliding my finger around, and the same for the pinch-to-zoom gesture. So, basically, depending on the type of gesture, I want my point(inside:with:) to return either true (for a tap gesture) or false (for a pan/pinch gesture). Or, speaking more generically, I want my view to be 'physically present' when I tap on a polygon, and 'physically absent' when the received gesture is anything but a tap.
P. S.: I am not sure that my idea with the point method is 100% correct or the only possible way of accomplishing this behaviour. Maybe there is a way to capture all the gestures and dispatch them to the view that lies below. Any idea is great as long as it works. Thanks!
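One hedged sketch of the alternative mentioned in the P.S.: instead of the overlay intercepting hits, attach a tap recognizer to the map (or a common superview) whose delegate only accepts touches that land on a polygon; pans and pinches then fall through to the map's own recognizers untouched. The containsPolygons(at:) helper and the class name are assumptions, not code from the question:
import UIKit
import MapKit

final class PolygonTapCoordinator: NSObject, UIGestureRecognizerDelegate {

    private let mapView: MKMapView
    // Assumed helper that hit-tests your polygon overlays at a point in the map view.
    private let containsPolygons: (CGPoint) -> Bool

    init(mapView: MKMapView, containsPolygons: @escaping (CGPoint) -> Bool) {
        self.mapView = mapView
        self.containsPolygons = containsPolygons
        super.init()

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tap.delegate = self
        mapView.addGestureRecognizer(tap)
    }

    // Claim the touch only when it starts on a polygon; otherwise the map's own
    // pan/pinch recognizers see it as usual.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
        return containsPolygons(touch.location(in: mapView))
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        // Enter polygon-editing mode here.
    }
}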
I have a horizontally scrolled UICollectionView with a title label above it and a UIPageControl below it.
UILabel
UICollectionView
UIPageControl
When I turn on the VoiceOver accessibility feature and start traversing the screen sequentially, the collection view automatically scrolls to the beginning or end, making the page jump suddenly. For example, if I scroll to the 2nd page using the page control and move back to the collection view, it shows and reads the last page unexpectedly. Since I'm using the page control for navigation in accessibility mode, I'd like to prevent the automatic scrolling.
How do I prevent or counter that?
I found an issue that seems to describe the same problem, but there's no workaround suggestion: iOS 8.4: Scroll view resets contentOffset with Voice Over enabled shortly after view appear
I encountered it on iOS 13.4.1 iPhone 11 Pro
UIScrollViewDelegate.scrollViewDidScroll(_:)
A change in the accessibility focus that triggers an automatic scroll also triggers a call to scrollViewDidScroll(_:) in your UIScrollViewDelegate. Use that to counter the automatic scrolling, for instance by setting contentOffset back to the value you prefer.
You may need to detect that the scrolling was actually triggered by accessibility features, and not the user dragging or pinching. UIAccessibility.isVoiceOverRunning and UIAccessibilityFocus.accessibilityElementDidBecomeFocused() are your friends here. Beware that changing contentOffset (or zoomScale or whatever is needed) may trigger another call to scrollViewDidScroll(_:), so you need to prevent an infinite recursion.
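A minimal sketch of that idea (the stored-offset bookkeeping is my assumption; the answer below shows a concrete variant built on setContentOffset instead):
import UIKit

final class AccessibilityScrollGuard: NSObject, UIScrollViewDelegate {

    private var storedOffset: CGPoint = .zero
    private var isRestoringOffset = false

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        // Only counter scrolls that happen while VoiceOver is running and the user
        // isn't actively dragging or decelerating; otherwise remember the offset.
        guard UIAccessibility.isVoiceOverRunning,
              !scrollView.isDragging, !scrollView.isDecelerating,
              !isRestoringOffset else {
            storedOffset = scrollView.contentOffset
            return
        }

        // Restore the last user-driven offset. Setting contentOffset calls this
        // delegate method again, so the flag guards against infinite recursion.
        isRestoringOffset = true
        scrollView.contentOffset = storedOffset
        isRestoringOffset = false
    }
}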
Using #pommy's suggestions, I was able to fix my similar issue. In the code I was working on, the most appropriate place to make the change ended up being CalendarCollectionView.setContentOffset(_:animated:), where CalendarCollectionView is a UICollectionView subclass. Specifically, it's a JTACMonthView subclass, but that should not be of any relevance to this answer.
From the name, you can see my use case: a calendar which shows a month at a time. It could have many months both into the future and the past, but the usual user focus is likely to start somewhere in the middle.
Like the OP, I found that swiping from an outer element to the collection view with VoiceOver enabled caused the focus to go to the first date in the calendar, in my case 1st January 1951 (Date.farPast, I believe.) An interesting aside: Switch Control navigation did not cause the same behaviour.
The underlying behaviour was that contentOffset was getting set to 0.0 in the dimension that the collection view scrolls. In my code, that direction is held in style, and changes based on configuration, but in most applications it's likely to be fixed.
My code simply blocks any offset changes to 0.0 when VoiceOver is enabled. This is pretty naïve, and won't be suitable for all apps, but gives a concrete example which I hope will help some others!
override func setContentOffset(_ contentOffset: CGPoint, animated: Bool) {
    if shouldPreventAccessibilityFocusScrollback(for: contentOffset) {
        return
    }
    super.setContentOffset(contentOffset, animated: animated)
}

func shouldPreventAccessibilityFocusScrollback(for newContentOffset: CGPoint) -> Bool {
    if UIAccessibility.isVoiceOverRunning {
        switch style {
        case .horizontal:
            return newContentOffset.x == 0
        case .vertical:
            return newContentOffset.y == 0
        }
    }
    return false
}
I spent quite a long time trying to determine when UIAccessibilityFocus moved from something outside the collection view, to something inside the collection view, which is ideally the only time we want to block these automatic scrolls. I was unsuccessful, but I think that was mostly due to subclassing a third party collection view (the calendar). There's definitely more merit to that approach, if you can get it to work... but it will require some careful management of state.
Is it possible to disable the dock that pops up in iOS?
This is my View Controller. Notice that it has a draggable view controller in the footer.
But when I try to pull it up quickly, the dock shows up.
Is there any way to disable it?
I think the closest you can get is iOS 11's preferredScreenEdgesDeferringSystemGestures, a UIViewController property you can override; it will show an indicator at the bottom but not pull up the dock on the first swipe. For example, in your view controller:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.bottom]
}
In my experience it still eats the first swipe gesture, but it gives the user a second chance to hit the right target.
On iOS <11 however, this behavior can only be obtained by hiding the status bar.
Edit:
Usually when faced with implementing a design choice like this, I try to offer a second, non-interfering gesture as a backup, such as a tap in that area, that has the same effect.
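For instance (footerView and presentFooterPanel() are assumed names, not from the question), something along these lines:
override func viewDidLoad() {
    super.viewDidLoad()
    // Backup gesture: a plain tap on the footer does the same thing as the swipe-up,
    // so the user isn't stuck if the system edge gesture wins the first swipe.
    let tap = UITapGestureRecognizer(target: self, action: #selector(footerTapped))
    footerView.addGestureRecognizer(tap)
}

@objc private func footerTapped() {
    presentFooterPanel()   // assumed: whatever completing the swipe-up would do
}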
As of iOS 11, you cannot disable the dock in an application, nor in Settings. I'd suggest providing a larger area for swiping up from the bottom.
Normally such conflicts should be avoided, as they degrade user experience: how do you know that the user does not actually want to use the dock?
But if you really want, you can override the preferredScreenEdgesDeferringSystemGestures property in the root controller to specify which edges should NOT (immediately) trigger system gestures.
e.g.
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return .bottom
}
I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen: both of them when the drag is horizontal, and only one of them when it's vertical, a little like some media-centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so, how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
I have a pan gesture recogniser on the View Controller with a Delayed Begin that can detect the vertical or horizontal movement before events are sent through to the views.
I know that simply forwarding events from my parent view's touchesBegan: etc. won't work because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually but I'd prefer to have the natural drag, swipe and snap paging behaviour for the Collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.
The short answer is that you can't, not easily anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within another. The containing one is limited to horizontal scrolling and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and, since a collection view is a subclass of scroll view, the controller also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
-(void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery was in my cellForItemAtIndexPath: method: when I instantiate the container cell, I need to register .xibs for reuse (each overlay is different), then use the remembered overlay page and issue both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly, roughly as sketched below.
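Not from the original answer, but roughly the same idea in Swift (ContainerCell, overlayCollectionView and overlayNumber are assumed names, and reloadData() is used here instead of reloadItemsAtIndexPaths: for simplicity):
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "ContainerCell", for: indexPath) as! ContainerCell

    // Each overlay page uses a different nib, so register it on the embedded collection view here.
    cell.overlayCollectionView.register(UINib(nibName: "OverlayCell", bundle: nil), forCellWithReuseIdentifier: "OverlayCell")
    cell.overlayCollectionView.dataSource = self
    cell.overlayCollectionView.delegate = self

    // Restore the remembered overlay page so the embedded collection appears in the right place.
    cell.overlayCollectionView.reloadData()
    cell.overlayCollectionView.layoutIfNeeded()
    cell.overlayCollectionView.scrollToItem(at: IndexPath(item: overlayNumber, section: 0), at: .centeredVertically, animated: false)

    return cell
}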
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and in the case of the container collection the overlay collection IBOutlet).
And not a gesture recogniser in sight.
I've got a UICollectionView, and I'd like to be able to touch-and-drag items up and out of the View, and thus delete them. (Very much along the same lines as how the Dock works on OS X: drag something off and let go, and it is removed).
I've done some research, but almost everything I find is about collection views that use drag-and-drop to reorder. I don't need to reorder (I'm happy to just remove the item at the given index from the source array and then reload), I just need to detect when an item is moved outside of the view and released.
So I suppose my questions are these:
1) Is that possible with the built-in CollectionView, some kind of itemWasDraggedOutsideViewFromIndex: method or something?
2) If not, is it something that can be done with a subclass (and specifically is it possible for a CollectionView beginner)?
3) Are there any code samples or tutorials you can recommend that do this?
Here is a helper class that I've been working on that does just that: https://github.com/Ice3SteveFortune/i3-dragndrop. Hope it helps. There are examples of how to use it in the TestApp.
UPDATE
About a year on, this is now a full-on drag-and-drop framework. Hope this proves useful: https://github.com/ice3-software/between-kit
There is no built-in method like you're suggesting. What you want can be done, but you'll have to handle it with a gesture recognizer and appropriate code to handle the drag/drop operation.
I tried using a subclass to do this and finally went back to putting it in my view controller. In my case, though, I was dragging stuff in/out of the collection view as well as two other views on the screen.
I don't know if you have the book, but the most helpful thing I found was Erica Sadun's The Core iOS 6 Developer's Cookbook, which has excellent code on drag/drop within collection views. I don't think it specifically addresses dragging outside of the CV, but for me the solution was to put the gesture recognizer on the common superview and always use its coordinates rather than the subview's coordinates.
One problem I hit was that I wanted to be able to select cells with a tap as well as drag, and there is no way (despite Apple's docs to the contrary) to require the single-tap gesture to fail on the collection view. As a result, I ended up using the long press gesture to perform the entire operation, and there is no translationInView for long press (there is locationInView), so that required some additional work:
iOS - Gesture Recognizer translationInView
Another thing that will make it harder or easier is the number of possible drop targets you have. I had many, in many different types of views (straight UIView, collectionview, and scrollViews). I found it necessary to maintain a list of "drop targets" and to test for intersections with targets as the dragged object was moved. Somehow, you have to be able to determine whether the view you're intersecting is a place where a drop can occur.
If you are addressing the specific situation of dragging something out of a view to delete it (like dragging to a trash can view) and that's it, this should not be complicated. You have to remember that when you do a transform your frame becomes meaningless, but the center is still good; so you end up using the center for everything that you would normally use the frame for.
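To make that concrete, here is a rough sketch of a long-press-driven drag handler on the common superview, not taken from the answer; draggedSnapshot, draggedIndexPath, collectionView and items are assumed members of your view controller, and releasing outside the collection view is treated as a delete:
@objc func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
    let location = gesture.location(in: view)   // long press has no translationInView, only location

    switch gesture.state {
    case .began:
        guard let indexPath = collectionView.indexPathForItem(at: gesture.location(in: collectionView)),
              let cell = collectionView.cellForItem(at: indexPath),
              let snapshot = cell.snapshotView(afterScreenUpdates: true) else { return }
        draggedIndexPath = indexPath
        snapshot.center = location               // center stays valid even under a transform
        view.addSubview(snapshot)
        draggedSnapshot = snapshot

    case .changed:
        draggedSnapshot?.center = location

    case .ended, .cancelled:
        defer { draggedSnapshot?.removeFromSuperview(); draggedSnapshot = nil }
        guard let indexPath = draggedIndexPath else { return }
        // Released outside the collection view (assumed to be a direct subview of `view`):
        // remove the item from the data source and delete its cell.
        if !collectionView.frame.contains(location) {
            items.remove(at: indexPath.item)
            collectionView.deleteItems(at: [indexPath])
        }
        draggedIndexPath = nil

    default:
        break
    }
}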
Here is the closest thing I found online that was helpful; I didn't end up using this class though as I thought it would be too complicated to implement in my app.
http://www.ancientprogramming.com/2012/04/05/drag-and-drop-between-multiple-uiviews-in-ios/
Hope this has been some help.
Yes there is.
1 - Conform your view to UIDropInteractionDelegate.
2 - Then add the drop interaction in viewDidLoad or init.
For a view controller, add it in viewDidLoad:
self.view.addInteraction(UIDropInteraction(delegate: self))
Or, for a UIView, add it in init:
self.addInteraction(UIDropInteraction(delegate: self))
3 - Then get the location of the item being dragged here and work with it:
func dropInteraction(_ interaction: UIDropInteraction, sessionDidUpdate session: UIDropSession) -> UIDropProposal {
    // `self` here is the view the interaction was added to; in a view controller, pass `self.view` instead.
    print(session.location(in: self))
    return UIDropProposal(operation: .move)
}
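To actually act on the drop you would also implement the other delegate callbacks; a minimal sketch, with NSString as a stand-in for whatever type your items provide:
func dropInteraction(_ interaction: UIDropInteraction, canHandle session: UIDropSession) -> Bool {
    // Accept only sessions that can hand us the object type we care about.
    return session.canLoadObjects(ofClass: NSString.self)
}

func dropInteraction(_ interaction: UIDropInteraction, performDrop session: UIDropSession) {
    let dropPoint = session.location(in: self)   // same view as in the snippet above
    session.loadObjects(ofClass: NSString.self) { items in
        // Use the dropped items and the drop point, e.g. to update your data source.
        print("Dropped \(items.count) item(s) at \(dropPoint)")
    }
}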