I have a button in an iPad app with frame.origin.x == 0. The user has to be able to move the button with a finger. I implemented this with a UIPanGestureRecognizer and it works fine. However, I got a bug report that the button doesn't move if the user drags a finger in from outside the screen (from the black bezel of the iPad). I checked, and the selector of the UIPanGestureRecognizer doesn't get called when the touch enters the button's frame.
Is there any way to fix this? On the iOS home screen it works: you can scroll the apps even with a gesture that starts outside the screen.
I figured it out. I had a UIView outside the screen that slides in together with that button, and its right edge was at x == 0. If I set that view to hidden while it is off-screen, the pan recognizer works.
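In case it helps others, a minimal sketch of the fix, with illustrative names rather than my actual view hierarchy:

// Hide the companion view whenever it sits entirely off-screen so it no longer
// intercepts touches that begin at the left screen edge (x == 0).
func updateCompanionViewVisibility() {
    companionView.isHidden = companionView.frame.maxX <= 0
}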
Related
On an iOS app, I have the following view structure:
UIViewController > UIView > UIScrollView > UITextView
The UIScrollView has the "Dismiss Interactively" setting. When I tap on the UITextView, the keyboard pops up properly. However, when I try to gradually dismiss the keyboard by slowly swiping my finger down, nothing happens.
Did I forget anything in my configuration?
Example project
Since iOS 7, you can use
scrollView.keyboardDismissMode = .Interactive
The keyboard follows the dragging touch offscreen and can be pulled upward again to cancel the dismiss.
In my case the issue turned out to be that the scroll view's content was shorter than the screen, so there was no scrolling gesture for the keyboard to track.
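A hedged sketch of one way around that, assuming you configure the scroll view in code (spelled .Interactive in older Swift, as above): enable vertical bouncing so the downward drag is still recognized even when the content fits on screen.

// Interactive dismissal needs the scroll view to track a drag; with short content,
// alwaysBounceVertical keeps the pan gesture alive so the keyboard can follow it.
scrollView.keyboardDismissMode = .interactive
scrollView.alwaysBounceVertical = true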
For a UI test case I need to "zoom out" the map view. When I use the pinch zoom-out code below, it only moves the map to the left.
let app = XCUIApplication()
app.maps.element.pinch(withScale: 0.9, velocity: -0.5)
Does anyone know how to achieve the "zoom out" functionality in UI testing?
Note that "zoom in" works fine.
The problem is that one finger of the simulated pinch gesture is falling on the status bar, so only the other finger is registered.
You can solve the problem in your example project by adding the following to ViewController.swift:
override var prefersStatusBarHidden: Bool {
return true
}
When zooming out, the UI test system seems to simulate a pinch from the very top left and bottom right of the view, constrained to the edges of the screen. This doesn't seem to take the status bar into account, so the top-left touch falls on the status bar and is ignored, and only the bottom-right touch is registered by the view, turning the pinch into a drag up and to the left. This isn't just with maps; it seems to be an issue with all scroll views.
If you do want to display the status bar on top of the view, you may need to figure out another way to constrain your pinch to not fall on the status bar. In my case, I was fine simply removing the status bar.
I will try to describe this issue; I've never seen this behavior before.
I have a button that moves from the right of the app's screen to the middle when the app launches on the simulator or actual device. I want it to always be in the middle (where it moves to).
Basically, it moves from one position on the parent view to another position (I want it to ALWAYS be in the position defined by the constraints)
How do I accomplish this?
Here are images to show the behavior and the constraints:
Button is on the right, higher up; then fades out
Button fades in to the bottom, centered position (correct per constraints listed below)
Button constraints:
Button.centerX = centerX (center the button to the container's horizontal center)
Bottom Layout Guide.top = Button.bottom + 112 (anchor the button to the bottom of the app regardless of screen size or orientation; a programmatic sketch of these constraints follows below).
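For reference, here is a rough programmatic equivalent of those two constraints, a sketch only, assuming the button is a subview of the view controller's root view (the storyboard uses the Bottom Layout Guide; the safe area anchor plays the same role on current iOS):

button.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    // Button.centerX = container.centerX
    button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
    // Bottom Layout Guide.top = Button.bottom + 112
    view.safeAreaLayoutGuide.bottomAnchor.constraint(equalTo: button.bottomAnchor, constant: 112)
])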
Just add two constraints, Bottom Space and Centre X, as below (and, if required, an explicit height and width):
1. Also provide button.left, or
2. Use the constraints shown in the image.
OK, I fixed the issue. Somehow, the splash screen storyboard got a copy of the button and other views that I had on the main storyboard. Once I deleted the button from the splash screen storyboard, the button issue stopped.
Apparently the first screen shot was the splash screen, and the second screen shot was the actual app (main storyboard).
Thanks to everyone for trying to help!
I have a strange problem where I have a scroll view on one of my views that works perfectly on 4-inch devices, both in simulator and actual device. That same scroll view (and views inside it) doesn't respond to any touch event (both tap and scroll) in 3.5-inch devices (same iOS version, 7.0.3). It renders/displays perfectly though. I've even tried to add a tap handler to the view itself directly to see if it will hit (categoriesScrollView is my view):
// viewDidLoad:
UITapGestureRecognizer *test = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(test)];
[categoriesScrollView addGestureRecognizer:test];

- (void)test {
    NSLog(@"test");
}
It doesn't hit. Inside the view there are buttons that can be tapped perfectly on a 4-inch device. This problem occurs both on device and in the simulator. The first thing I checked was any device/screen-size-specific code, but there isn't any. The second thing I checked was the layout constraints: maybe the scroll view's actual bounds were getting smaller (maybe even zero) on the 3.5-inch device, but I've checked the frames and everything is normal (the scroll view has clipsToBounds set to YES anyway, so if this were the case, the content would be invisible too).
All the other views inside the same parent view, other than this scroll view and its subviews, receive touches and run perfectly normally. I can't think of anything else; what could be causing this?
I've solved the problem. First I put another button somewhere inside the main view, on top of the scroll view (the scroll view itself was inside another container, a top bar), and the button worked. I then moved the button inside the same container as the scroll view (on top of it), and it didn't work, which is where alarm bells rang for me. Apparently, I had a progress indicator centered in the main view below, with a top constraint to the top bar. The top bar had clipsToBounds set to NO, and the indicator was pushing the height of the top bar down to almost zero, so the scroll view was drawn but didn't respond to touches. It was a simple Auto Layout issue, and because the top bar containing the scroll view didn't clip the scroll view out of its bounds, the bug was harder to find.
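For anyone hitting something similar, a minimal sketch of the situation (the views and sizes are illustrative, not my actual layout):

// The superview's height collapses to zero; the scroll view is still drawn
// because clipsToBounds is false, but hit-testing only descends into
// subviews whose touch point lies inside the superview's bounds.
let topBar = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 0)) // collapsed by Auto Layout
topBar.clipsToBounds = false

let scrollView = UIScrollView(frame: CGRect(x: 0, y: 0, width: 320, height: 200))
topBar.addSubview(scrollView)

// The scroll view is visible, but a touch at (160, 100) never reaches it:
// UIView's default hitTest(_:with:) returns nil for points outside topBar's bounds.
let touch = CGPoint(x: 160, y: 100)
print(topBar.point(inside: touch, with: nil)) // false, so the whole subtree is skipped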
I am writing an iPad application where you can drag around the screen with one or two fingers. I use the touchesBegan, touchesMoved and touchesEnded methods to recognize the touches. I have multiple touch enabled for the view.
Now I have noticed one strange behaviour. If I place a finger outside the upper screen border and drag it down onto the screen, I won't receive a call to touchesBegan or touchesMoved. This cannot be reproduced for the other screen borders.
The only other case where it can be reproduced is when I use landscape orientation with the home button on the left side. In this case I get the same behaviour for the upper screen border.
Does anyone have information about this, or does anyone experience the same when testing?
Edit:
I did some additional testing. The area where the status bar would normally be receives touchesBegan and touchesMoved just fine if I put the finger down in that region or slide it upwards from any lower screen position. The input is not recognized only if I slide the finger down from a position that is completely above the screen edge.
Edit2:
Additional Info:
- My app uses an OpenGL view.
- The status bar is hidden.
- touchesBegan is called properly when touching the (hidden) status bar area in all orientations.
I believe the underlying issue is that the touch-sensitive layer on the screen extends a few pixels above the visible screen, meaning that touches originating there technically start outside your UIViewController view. Since your UIViewController only sees touches that originate within its view, it won't see these touches. So if your device screen is touched a pixel or two above the visible pixels, the touch will be sent to whatever virtual layer exists in that offscreen area (perhaps the status bar?). This prevents your UIViewController from receiving any touches, since the swipe did not originate in your UIViewController's view area.
To work around this, at applicationDidLaunch time you can resize your UIWindow to be (say) 10 pixels larger on each side, and create an intermediate UIViewController 'background' layer inside it, matching the expanded size of the window. Then set your main interface UIViewController view to be a child of the 'background' view, inset 10 pixels on all sides to re-match the physical screen size. If your interface rotates, the 10-pixel enlargement of the background view will have to be re-applied in the willAnimateRotationToInterfaceOrientation method in your background UIViewController each time.
Finally, have the touchesBegan/Moved/Ended methods in the background view controller forward their touches directly to the corresponding methods in your main UI view controller. Voila!
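A rough sketch of that setup, written in current Swift for brevity and using illustrative names (not tested against any particular project):

// In application(_:didFinishLaunchingWithOptions:), make the window 10 pt larger
// on every side so touches that start just above the visible screen still land in it.
let inset: CGFloat = 10
window = UIWindow(frame: UIScreen.main.bounds.insetBy(dx: -inset, dy: -inset))

let background = BackgroundViewController()
window?.rootViewController = background

// Embed the real interface as a child, inset back to the physical screen size.
let main = MainViewController()
background.addChild(main)
main.view.frame = background.view.bounds.insetBy(dx: inset, dy: inset)
background.view.addSubview(main.view)
main.didMove(toParent: background)
background.target = main
window?.makeKeyAndVisible()

// The background controller forwards edge-originating touches to the main UI.
final class BackgroundViewController: UIViewController {
    weak var target: UIViewController?
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        target?.touchesBegan(touches, with: event)
    }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        target?.touchesMoved(touches, with: event)
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        target?.touchesEnded(touches, with: event)
    }
}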
Hope this helps,
Ben
Sounds right to me. If you drag your finger across the status bar first, it will receive all touch events for this finger until you lift it. All touch events are delivered to the view in which the touch began.
Unfortunately, touchesBegan cannot be called from the status bar area.