Scrolling a UIScrollView from the edge of the screen - iOS

I have an iPad app that uses a horizontal scroll view with a bunch of controls as subviews. The user has to be able to use all the controls inside the scroll view; the view should scroll only if they deliberately drag a finger in from outside the left or right edge of the screen.
I've implemented this by putting this code in the UIScrollView's pointInside:withEvent: method.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Only enable scrolling when the touch is within 9 pt of either visible
    // edge (i.e. the finger dragged in from off-screen), or while the view
    // is still decelerating.
    [self setScrollEnabled:(point.x - self.contentOffset.x < 9 ||
                            point.x > self.contentOffset.x + self.frame.size.width - 9) ||
                           self.isDecelerating];
    return [super pointInside:point withEvent:event];
}
When the iPad is in UIInterfaceOrientationLandscapeRight (home button on right side of the screen), I can drag my fingers from outside the right side or left side inwards and the UIScrollView scrolls as intended. However, when I switch orientations, dragging from the left side (home button) scrolls fine while the right side (near the camera) only works about 50% of the time.
I've tried extending the frame of the scroll view to be slightly outside the bounds of the screen, but I still only have problems with this type of gesture on that one side on UIInterfaceOrientationLandscapeLeft. Ideas?
EDIT: I added two UIScreenEdgePanGestureRecognizers for both sides of the screen — the gesture is more responsive now, but dragging from the right side of the screen while the home button is on the left is still sketchy at best. I have no idea why this would be happening for just that interface orientation.
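For reference, here is a minimal sketch of that edge-recognizer setup (the handler name and the view it is attached to are assumptions, not the app's actual code):

UIScreenEdgePanGestureRecognizer *leftEdge =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleEdgePan:)];
leftEdge.edges = UIRectEdgeLeft;

UIScreenEdgePanGestureRecognizer *rightEdge =
    [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleEdgePan:)];
rightEdge.edges = UIRectEdgeRight;

// Attach both to the containing view so touches that start at the very
// edge of the screen are recognized before the scroll view sees them.
[self.view addGestureRecognizer:leftEdge];
[self.view addGestureRecognizer:rightEdge];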


UIScrollView: How to kick-start the interactive keyboard dismissal (like WhatsApp) before the drag reaches the keyboard?

In UIScrollView there is a feature named "Keyboard Dismiss interactively".
Enabling this option lets me implement the following drag-down-to-hide-keyboard behavior.
However, the keyboard dismissal only kicks in when the drag in the UIScrollView reaches the keyboard's edge.
What I would like to achieve is for the dismissal to kick in as soon as the drag reaches the bottom toolbar's edge.
What I wish to achieve (same as WhatsApp)
As you can see from the video, the keyboard dismissal kicks in when the drag touches the bottom bar's edge, even before it touches the keyboard's edge.
What technique is WhatsApp using to achieve this behavior?
Side note
You may notice that our bottom toolbar moves along with the keyboard. This is because the toolbar's bottom is constrained to the Safe Area's bottom.
We adjust that bottom constraint's constant by installing a gesture recognizer on the global window. This is the code snippet that achieves it:
@objc private func didPan(_ sender: UIPanGestureRecognizer) {
    if keyboardHeight > 0 {
        let mainScrollView = editable.mainScrollView
        let isScrolling = (mainScrollView.isDragging || mainScrollView.isDecelerating)
        if isScrolling {
            // `globalOrigin` and `UIWindow.key` are helper extensions in our codebase.
            if let mainScrollViewGlobalOrigin = mainScrollView.globalOrigin {
                let point = sender.location(in: sender.view!)
                // Take the safe area into consideration (e.g. iPhone 12 Pro Max).
                let key = UIWindow.key
                let bottomSafeArea = key?.safeAreaInsets.bottom ?? 0
                let dy = point.y - (
                    mainScrollViewGlobalOrigin.y +
                    mainScrollView.frame.height +
                    toolbarHeightLayoutConstraint.constant +
                    bottomSafeArea -
                    bottomLayoutConstraint.constant -
                    self.keyboardHeight
                )
                if dy > 0 {
                    bottomLayoutConstraint.constant = -(keyboardHeight - dy)
                }
            }
        }
    }
}
The reason that WhatsApp behaves like this is that their view is considered to be part of the keyboard, so when the swipe gesture reaches their custom view it will begin interactive dismissal.
To achieve this yourself, all you need to do is provide the toolbar view as the inputAccessoryView for your view controller. You won't need the constraints for positioning, as the keyboard window will then control your toolbar's position.
There is also inputAccessoryViewController for the cases where your toolbar is not a UIView but an entire UIViewController.
The views in either of these properties will only be visible when the keyboard is visible, so to get around that you'll still want to put it into your view hierarchy, but remove/add it based on becoming/resigning first responder.
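A minimal sketch of the accessory-view approach in Objective-C (self.toolbar is a stand-in for your toolbar, built elsewhere):

// Let this view controller become first responder so the accessory
// view is shown even before any text field is focused.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

// The keyboard window now owns the toolbar's position; no constraints needed.
- (UIView *)inputAccessoryView {
    return self.toolbar;
}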
EDIT: Also, you should be using UIResponder.keyboardDidChangeFrameNotification (UIKeyboardDidChangeFrameNotification in Objective-C) to detect when the keyboard changes size/position/etc. and adjust insets/positions of views appropriately. In modern iOS there are plenty of ways the keyboard can change size while open, and observing that notification is the correct way to handle the keyboard size.
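A sketch of observing that notification in Objective-C (the handler name is an assumption):

- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(keyboardDidChangeFrame:)
                                                 name:UIKeyboardDidChangeFrameNotification
                                               object:nil];
}

- (void)keyboardDidChangeFrame:(NSNotification *)note {
    // The keyboard's end frame, in screen coordinates.
    CGRect endFrame = [note.userInfo[UIKeyboardFrameEndUserInfoKey] CGRectValue];
    NSLog(@"keyboard frame: %@", NSStringFromCGRect(endFrame));
    // Adjust scroll view insets / related constraints to match endFrame here.
}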

UIButton positioned where half is in the air - is it disabled?

I have a side UIView that is the size of the screen.
This side view is placed on the left, with its x at -screen.width, so we don't see it.
Then, in this side view, I add a small square UIView at its right edge, at x = screen.width, which means that on the main view I can see this little square from the side view poking in at the left.
This structure is meant to give a side-menu effect.
In this little square view there is a UIButton, but the button is not responding to touches.
I don't want to place the side menu a few pixels inside the screen, because that may cover some content.
Hence I created this structure where the side menu is screen-sized, placed at minus the screen width, and has the little square that comes into the screen; when you tap it, the whole side view comes in.
Why is this button disabled? Is it because it hangs in the air, with part of it outside its superview's bounds?
EDIT:
I found this great answer, but it is not working for me:
interaction beyond bounds of uiview
I added this to the parent view that contains the button; it does log the touch, but it's still not working.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    CGFloat radius = 200.0;
    CGRect frame = CGRectMake(self.frame.size.width, 0, radius, radius);
    if (CGRectContainsPoint(frame, point)) {
        NSLog(@"log");
        return self;
    }
    return nil;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if (CGRectContainsPoint(self.bounds, point) ||
        CGRectContainsPoint(menus.frame, point)) {
        return YES;
    }
    return NO;
}
Thanks Aaron, I found that this solution does indeed work:
Interaction beyond bounds of UIView
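For completeness, a minimal sketch of that technique, assuming it is overridden on the side-menu view whose square button sticks out of bounds:

// Forward hit tests to subviews that lie outside this view's bounds,
// so the out-of-bounds square (and its button) stays tappable.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    if (result != nil) {
        return result;
    }
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint convertedPoint = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:convertedPoint withEvent:event];
        if (hit != nil) {
            return hit;
        }
    }
    return nil;
}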
But I found a better way to build the side menu without covering the main screen's content: make the side menu's height very small and set its x so it appears slightly inside the main scene.
Then, when opening it, just before bringing it in to cover the main scene, change its height to the screen height. This way you don't cover the content while the menu is closed, but when you open it you expand it to full height.

iOS 7 custom keyboard UIView touch down event delayed in bottom row

here's an odd one..
I've got a UIView xib file that looks like this:
I've connected every UIButton's touchDown and touchUpInside events to two IBAction methods:
- (IBAction)touchUpInside:(id)sender
{
    NSLog(@"touch up inside");
    if (((UIButton *)sender == _enter) || ((UIButton *)sender == _back)) {
        [(UIButton *)sender setBackgroundColor:_color2];
    }
    else {
        [(UIButton *)sender setBackgroundColor:_color1];
    }
}

- (IBAction)touchDown:(id)sender
{
    NSLog(@"touch down");
    [(UIButton *)sender setBackgroundColor:_color2];
}
Everything works except for the bottom-most row of UIButtons; that's the odd part:
The touch down event is fired, but the button must be held for 0.5 seconds before it changes background color, whereas the change is instantaneous for the other buttons.
It ONLY happens for the bottom-most row of UIButtons; I've tried switching buttons 7, 8, 9 with the back, 0, and enter buttons like this:
I've checked in Interface Builder that all the UIButton attributes are the same, and I've tried moving the UIButton objects' order around as you can see on the left side of the picture, and I'm about out of ideas. Basically, what's odd is that a UIControl behaves differently based on its position in the parent view...
UPDATE: I made the parent UIView tall enough that there are 50 free pixels below the last row, and the UIButtons work fine now. The only explanation I can think of is that there is a UITabBar two view-controller levels underneath. Even so, it doesn't make sense.
The documentation says:
Expect users to swipe up from the bottom of the screen to reveal Control Center. If iOS determines that a touch that begins at the bottom of the screen should reveal Control Center, it doesn't deliver the gesture to the currently running app. If iOS determines that the touch should not reveal Control Center, the touch may be slightly delayed before it reaches the app.
One solution is here:
UIButton fails to properly register touch in bottom region of iPhone screen
But in your case, I think you should use the inputView property of UIResponder.
See: https://developer.apple.com/library/ios/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/InputViews/InputViews.html
The inputView is not affected by that problem.
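A minimal sketch of the inputView approach (KeypadView and self.textField are hypothetical names):

// Installing the keypad as the text field's inputView makes the system
// host it in the keyboard window, which is exempt from the Control
// Center edge-gesture delay.
KeypadView *keypad = [[KeypadView alloc] initWithFrame:CGRectMake(0, 0, 320, 216)];
self.textField.inputView = keypad;
[self.textField becomeFirstResponder]; // shows the custom keypad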

UINavigationController and Touch Events in Left / Left-Bottom Corner of Screen

I'm trying to call an action for the UIControlEventTouchDown event on a simple UIButton placed in the bottom-left corner of a UIViewController (pushed with a UINavigationController).
I created a storyboard to push the view controller with the button, and added actions for the button to trace touch events.
- (IBAction)touchUpInside:(id)sender {
    NSLog(@"touchUpInside");
}

- (IBAction)touchDown:(id)sender {
    NSLog(@"touchDown");
}
I also added touchesBegan to trace whether it is called.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Forward to super (calling self here would recurse forever).
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touchesBegan");
}
Now, with this setup, I see strange behaviour. There are touch areas on the left (width = 13) and in the bottom-left (width = 50, height = 50) which respond differently to touches. If you touch within those areas, touchesBegan is not called on touch down, as it would be normally; it is called only after touch up.
I believe the left area is used by UINavigationController for the interactive pop of a pushed UIViewController. So, two questions here.
Which functionality is the bottom-left area responsible for?
How and where can I change this behaviour to pass touch events through to the UIButton (for example, if I want the UIButton to respond to a long press when I press in the "red" area)?
I had this same problem, and I fixed it by disabling the "swipe to go back" (technically called "interactive pop gesture" in UIKit) feature introduced in iOS 7.
Sample code to disable interactive pop gesture:
if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}
I believe this is due to the interactive pop gesture recognizer consuming/delaying touch events near the left edge of the screen (because a swipe to go back starts from the left edge) and thus causing the touch events to not be delivered to controls that are situated near the left edge of the view.
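If you only want to suppress the gesture on this one screen, a common refinement (a sketch, not part of the original answer) is to toggle it in the appearance callbacks so other pushed controllers keep swipe-to-go-back:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Disable the pop gesture only while this screen is visible.
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Restore the gesture for the rest of the navigation stack.
    self.navigationController.interactivePopGestureRecognizer.enabled = YES;
}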

Lost Gestures iPad landscape

I have an app with an MGSplitView containing a table view and a UIWebView, fixed to landscape. A UITapGestureRecognizer (for a triple tap) is attached to the web view. Taps in the left portion of the web view work; taps on the right side are lost - the action is not triggered, and the gesture delegate messages are not received.
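For reference, a sketch of the recognizer setup being described (handler and property names are assumptions):

UITapGestureRecognizer *tripleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTripleTap:)];
tripleTap.numberOfTapsRequired = 3;
tripleTap.delegate = self; // for the gestureRecognizer:shouldReceiveTouch: logging below
[self.webView addGestureRecognizer:tripleTap];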
The problem seems not to lie in the MGSplitViewController, as switching to a UISplitViewController has the same issues; changing from a tap to a long press also has the same results.
Tap locations are reported with the x coordinate at or close to the max width of the gesture.view, and yet are clearly made close to the centre of the display, which I expect has something to do with the root of the problem - and yet the web view contents are clearly visible and correctly placed.
All the view controllers involved (MGSplitViewController, my UITableView subclass for the left-hand panel, and the UIViewController subclass for the right-hand panel) implement shouldAutorotate and supportedInterfaceOrientations, so being stuck in portrait seems unlikely.
My gesture recognizer delegate, and the output from one triple-tap (on the view in the right-hand panel's web view):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    NSLog(@"%s", __PRETTY_FUNCTION__);
    NSLog(@"gestureRecognizer view frame: %@", NSStringFromCGRect(gestureRecognizer.view.frame));
    NSLog(@"location %@", NSStringFromCGPoint([gestureRecognizer locationInView:gestureRecognizer.view]));
    return YES;
}
-[DocumentBrowser gestureRecognizer:shouldReceiveTouch:]
gestureRecognizer view frame: {{0, 0}, {703, 704}}
location {703, -20}
-[DocumentBrowser gestureRecognizer:shouldReceiveTouch:]
gestureRecognizer view frame: {{0, 0}, {703, 704}}
location {414.5, 204.5}
-[DocumentBrowser gestureRecognizer:shouldReceiveTouch:]
gestureRecognizer view frame: {{0, 0}, {703, 704}}
location {414.5, 204.5}
The first location reported seems strange.
Check your orientation and view-resizing mechanisms. I've seen this several times when something is wrong in those areas. If you log the touch locations, I think you will find that they stop at 768 points from the left-hand side, i.e. there is a view somewhere that thinks it is in portrait orientation.
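A quick way to verify this (a sketch): log raw touch locations in window coordinates and see whether they cap out at the portrait width:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    // In landscape this should range up to ~1024 pt on an iPad; values that
    // never exceed 768 pt suggest a view that still thinks it is in portrait.
    NSLog(@"window location: %@", NSStringFromCGPoint([touch locationInView:nil]));
}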
