Prevent Alert View from cancelling Long Press Gesture - iOS

I have a UILongPressGestureRecognizer that detects 3-second presses in order to put the app into edit mode. Sometimes an alert view pops up during that time, due to other things going on in the app. When that happens, the user could abandon the long press and handle the alert. But it must also be possible for the user to keep pressing, and the app needs to go into edit mode even with the alert still there (the app will then dismiss the alert itself).
However, when the UIAlertView pops up, the OS automatically cancels all ongoing gestures, thereby cutting the long press short. The gesture recognizer receives touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event. By setting a breakpoint in Xcode I can clearly see that it is caused by the UIAlertView popping up.
Does anybody know whether this can be prevented, or knows a work-around?
This is for iOS 7, if it makes a difference.
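For context, the setup is roughly this (a minimal sketch; the selector and the enterEditMode method are illustrative, not part of any framework):

- (void)viewDidLoad {
    [super viewDidLoad];

    UILongPressGestureRecognizer *longPress =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleLongPress:)];
    longPress.minimumPressDuration = 3.0;   // a 3-second press should trigger edit mode
    [self.view addGestureRecognizer:longPress];
}

- (void)handleLongPress:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // Fires after 3 seconds -- unless the alert appears first and the system
        // cancels the in-flight touches, as described above.
        [self enterEditMode];   // hypothetical method
    }
}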

Related

XCUITest: Auto-Accepting System Alerts. Need more fine-grained control

I have read that addUIInterruptionMonitorWithDescription can be used to accept / tap a particular button in a system alert. While recently trying out some test code, I did the following:
Adding a Photos Alert
Adding a Calendar Alert
Adding a Location Alert.
On the simulator, I was surprised to find that the Calendar and Location prompts automatically had their "Allow" buttons tapped. For the Photos prompt, "Don't Allow" was hit. My question is: is there no need for addUIInterruptionMonitorWithDescription anymore? I tried using it to tap on the dialogs, but it didn't do anything. Even when I tried to hit another button on the alerts, I didn't see it working. How do I tap the individual buttons on a system alert here?
If there is an alert on screen, and none of your interruption handlers handle it, XCTest will dismiss it for you if you are using Xcode <9.1.
To gain control of the alerts, you should create an interruption handler for each alert, returning true from the closure when (and only when) you have handled the alert that handler was intended for.
If the test tries all your alert handlers or receives a true return value from any of your handlers, and there is still an alert on screen, XCTest will handle the alert itself.
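As a rough illustration (in Objective-C; the monitor description and the button label are assumptions, and the exact label varies by iOS version and permission type), an interruption monitor looks something like this:

// In an XCTestCase subclass.
- (void)testPhotosPermissionAlert {
    XCUIApplication *app = [[XCUIApplication alloc] init];
    [app launch];

    [self addUIInterruptionMonitorWithDescription:@"Photos permission alert"
                                          handler:^BOOL(XCUIElement *alert) {
        XCUIElement *allow = alert.buttons[@"OK"];   // button label is an assumption
        if (allow.exists) {
            [allow tap];
            return YES;   // handled -- this monitor dealt with the alert
        }
        return NO;        // not handled -- let other monitors (or XCTest) try
    }];

    // ... trigger whatever causes the permission prompt ...

    // Interruption monitors only run when a later interaction is blocked by the alert,
    // so an extra interaction is usually needed after the prompt appears.
    [app tap];
}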

Delaying application while UIAlertView is active

I added an alert view, asking for user input, at the start of my app. The app works fine without the alert view, but with the code for the alert view added, part of the UI is blacked out after hitting the 'OK' button on the alert.
I'm not well versed in iOS, but is there a good way to delay the app from running until the alert (text input) is completed (OK button pressed)? That might avoid whatever is causing the screen to go black in one section. Apparently the app keeps executing while the alert is active, and the alert is affecting the UI. Basically, I am asking the user to input their phone number via an alert, and the number will be used later in the app.
When an alert view is shown on screen, the background being dimmed (reduced alpha) is normal behaviour in iOS.
However, if there is code you want to run only when the OK button on the alert is tapped, move that method call to the OK button's action handler.
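For example, something along these lines (a sketch of the UIAlertView-era approach; the continuation method name is hypothetical):

// Ask for the phone number via a plain-text-input alert and defer the dependent
// work to the delegate callback. Assumes the controller adopts UIAlertViewDelegate.
- (void)askForPhoneNumber {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Phone number"
                                                    message:@"Please enter your phone number"
                                                   delegate:self
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    alert.alertViewStyle = UIAlertViewStylePlainTextInput;
    [alert show];
}

// UIAlertViewDelegate
- (void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex {
    NSString *phoneNumber = [alertView textFieldAtIndex:0].text;
    [self continueStartupWithPhoneNumber:phoneNumber];   // hypothetical: the work that was running too early
}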

iOS 6.1.5 UITextField - text not populating

My customer has a POS (Enterprise) app that was working fine on an iPod running iOS 6.1.3. Some of the devices were upgraded to 6.1.5 last week and began exhibiting a problem with taps on the keyboard not populating the UITextField. Unfortunately it is an intermittent problem and I do not have reliable steps to reproduce it, but I have seen it happen on a 6.1.5 device while running under the debugger. Here's what I know:
User taps on any of 7 UITextFields on the view. Keyboard slides up from bottom. Cursor starts blinking in the field.
User taps any key (including return). The key popover shows the key being pressed. Nothing appears in the UITextField.
Once the problem starts, it will persist until I quit the app (in this case, the app really quits with an exit(0) - required due to credit card security). Running the app again will most likely work correctly.
In the debugger, when things are working correctly I get callbacks to textFieldShouldBeginEditing:, textFieldDidEndEditing:, textFieldShouldReturn:, and textField:shouldChangeCharactersInRange:replacementString: as expected. When things don't work, I DO get callbacks for textFieldShouldBeginEditing: and textFieldDidEndEditing:, but I DO NOT get callbacks for textFieldShouldReturn: and textField:shouldChangeCharactersInRange:replacementString:. I checked the delegate for the UITextField in the shouldBegin and didEnd methods - it is, of course, set to self.
Again, this behavior is new and only appears on iOS 6.1.5. There is nothing "fancy" going on in the view controller.
Any help would be appreciated.
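For reference, the delegate callbacks being compared above can be logged like this (a diagnostic sketch, not a fix; method bodies are simplified):

// UITextFieldDelegate logging, to see which callbacks actually arrive.
- (BOOL)textFieldShouldBeginEditing:(UITextField *)textField {
    NSLog(@"shouldBeginEditing: %@ (delegate: %@)", textField, textField.delegate);
    return YES;
}

- (BOOL)textFieldShouldReturn:(UITextField *)textField {
    NSLog(@"shouldReturn: %@", textField);
    [textField resignFirstResponder];
    return YES;
}

- (BOOL)textField:(UITextField *)textField
    shouldChangeCharactersInRange:(NSRange)range
                replacementString:(NSString *)string {
    NSLog(@"shouldChangeCharactersInRange: %@ -> %@", NSStringFromRange(range), string);
    return YES;
}

- (void)textFieldDidEndEditing:(UITextField *)textField {
    NSLog(@"didEndEditing: %@", textField);
}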

Getting touches at launch on iOS

On the Mac, one can always get the location of the mouse "outside the event stream" (i.e., even if you've not subscribed to any delegate methods for mouseUp: et al.) by calling [NSEvent mouseLocation].
Is there any way on iOS to get current touch events without listening to touchesBegan:?
I ask because there is at least one situation in which touchesBegan: is not called: at app launch, if a touch is already in progress, touchesBegan: is never called. In fact, as near as I can tell, none of the other touch-related UIEvent methods are called either (nor, apparently, UIApplication's / UIWindow's sendEvent:).
I would like to vary the behavior of my app slightly based on whether a touch is in progress at launch. Is there any way to detect an in-progress touch at app launch?
This cannot be done. The simple reason: The touch events don't belong to your app. Each touch belongs to some UI element (or responder). As you already know, this element gets the began, moved, ended, cancelled messages.
This is even true within a properly programmed app: All events regarding one touch are delivered to the very same object. After all, how would another object know what to do with that event, and how should the first object properly finish its expected behavior?
While you can (or could, but probably shouldn't) find a workaround within your app, there's just no way to pass touches across apps.
And on the Mac you may query the mouse position, but in normal application flow there'll always be a mouse down before you get a mouse up event.
To be honest, I don't see any reason why this would be needed anyway... oh wait... I could split my app icon into several areas... not sure if it would already break privacy laws, though, if you get to know where the user has his icon on screen.
I think you could simply "extend" the application launch. When I had time-consuming tasks during my application launch, I used to show the same splash screen with a UIActivityIndicator while the work was being carried out.
You could simply create an NSTimer, wait for about 2 seconds, and during this time check for touches while the splash screen is still showing.
To do this, in application:didFinishLaunchingWithOptions:, push a view controller that looks exactly like the splash screen and check for touches in this view controller. After those 2 seconds, proceed with normal initialisation. This behaviour also helps if you have time-consuming tasks during initialisation.
I know it's a workaround, but my guess is that it is not possible to check for touches because the application will be working on the main thread and touches are also processed on the main thread. It could also be because there are no view controllers or UIWindow initialised and ready to listen for touches.
Hope it helps.
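A rough sketch of that workaround (class and property names are hypothetical; note the caveat above that a touch already in progress before launch still won't deliver touchesBegan:):

#import <UIKit/UIKit.h>

// Splash-look-alike controller that watches for touches for ~2 seconds
// before handing off to normal initialisation.
@interface SplashViewController : UIViewController
@property (nonatomic, copy) void (^completion)(BOOL touchSeen);   // hypothetical hand-off
@property (nonatomic) BOOL touchSeen;
@end

@implementation SplashViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // After 2 seconds, continue with normal initialisation.
    [NSTimer scheduledTimerWithTimeInterval:2.0
                                     target:self
                                   selector:@selector(finish:)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.touchSeen = YES;
}

- (void)finish:(NSTimer *)timer {
    if (self.completion) {
        self.completion(self.touchSeen);   // proceed, knowing whether a touch arrived
    }
}

@end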
You might try handling the hitTest:withEvent: instead.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
According to the Apple documentation, it "Returns the farthest descendant of the receiver in the view hierarchy (including itself) that contains a specified point."
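For illustration, an observation sketch along those lines (with the same caveat that it only fires for events that are actually delivered after launch):

// UIWindow subclass that logs what hitTest:withEvent: resolves to.
@interface TouchLoggingWindow : UIWindow
@end

@implementation TouchLoggingWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    NSLog(@"hitTest at %@ resolved to %@", NSStringFromCGPoint(point), hit);
    return hit;
}

@end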

UIAlertView is not shown when returning from sleep mode in iOS app

My app must sometimes show a UIAlertView when the Home button or the lock button is pushed, or when Notification Center is shown.
I show the alert from the applicationWillResignActive delegate method, and everything is OK when the Home button is pushed or when Notification Center is shown. But there is a problem when the button pushed is the lock button (the on/off button).
In that case, the alert is not shown when I return to the app (if I used the Home button, it is there). I don't do anything else in the other AppDelegate methods that are executed. Also, when I later show a new alert (any alert in the app), the alert that wasn't shown when I returned appears after I dismiss the new one.
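For reference, the setup is roughly this (the alert text is illustrative):

// App delegate method from which the alert is shown, as described above.
- (void)applicationWillResignActive:(UIApplication *)application {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Reminder"
                                                    message:@"You left the app with work pending"
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
}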
Please, could anybody help me?
Thanks in advance.
THE EASY, GIVE ME REP ANSWER:
When the app is put into the background, the app is suspended. Part of this process is closing open alert views.
THE I ACTUALLY KNOW WHAT I'M TALKING ABOUT ANSWER:
The logic behind this is that when the user hits the Home button while an alert is displayed, they might be going to look for information on how to answer the alert. However, when the sleep button is pressed, the user has stopped using the device altogether. Apple knows that if they unlock their device again 3 hours later and see something like "Confirm Deletion", they will have absolutely no idea what they were just doing or what to do now.
This is known to cause a serious condition known as what-in-the-world-am-I-supposed-to-do-now-itis. Symptoms of this condition include hitting the round button at the bottom of the screen and subsequently holding on your app icon until it jiggles. They then hit the little 'x' button. This is not good for developers' pockets.
