What other UIWindow receiving touch events are the Apple docs talking about?

The Apple docs on windows say:
A window is considered the key window when it is currently receiving keyboard and other non-touch-related events. Whereas touch events are delivered to the window in which the touch occurred, events that don't have an associated coordinate value are delivered to the key window. Only one window at a time can be key.
This means that the default window Xcode provides is the key window by default, but then it sounds as if our button taps are handled by some other window that is not the key window. Which window is this that listens for touch events? We don't generally add any other window, so where did this window come from?

UIWindow is just a UIView subclass that doesn't need to be added to any existing view hierarchy. While there's only one by default in an iOS application, it's not uncommon to use UIWindow instances for things like status bar overlays and full-screen overlays.
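To make that concrete, here is a minimal sketch of a second, overlay-style window; the OverlayPresenter name and the status-bar window level are illustrative choices, not anything from the question:

```objc
#import <UIKit/UIKit.h>

@interface OverlayPresenter : NSObject
// Keep a strong reference; an unreferenced UIWindow is deallocated immediately.
@property (nonatomic, strong) UIWindow *overlayWindow;
@end

@implementation OverlayPresenter

- (void)showOverlay {
    // A UIWindow is never added to another view; it shows itself.
    UIWindow *overlay = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    overlay.windowLevel = UIWindowLevelStatusBar + 1; // float above the status bar
    overlay.backgroundColor = [UIColor clearColor];
    overlay.rootViewController = [[UIViewController alloc] init];
    overlay.hidden = NO; // unhiding is all it takes to put it on screen
    self.overlayWindow = overlay;
}

@end
```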

Related

Want to catch a window change event in iOS

I want to detect when any notification of any app comes in.
I was looking for a sort of "window will open" event, to detect when the notification window pops up, but as far as I can see there is no such thing.
Would it be possible to use "viewWillStartLiveResize" in some way or another for this purpose?
Or is there any other way to detect a screen change event?
viewWillTransitionToSize:withTransitionCoordinator: or traitCollectionDidChange: should be enough to detect an app screen-size change (especially in iPad split mode). But in your case that's not possible: you only have access to your own app's view hierarchy. Push notification windows pop up over your app in the SpringBoard view hierarchy, as do Notification Center and the like. They don't resize your app's window; they overlap it.
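For reference, a minimal sketch of the two callbacks mentioned above (the SizeAwareViewController name is mine; both methods require iOS 8+):

```objc
#import <UIKit/UIKit.h>

@interface SizeAwareViewController : UIViewController
@end

@implementation SizeAwareViewController

// Fires when your own window changes size (rotation, iPad split view).
// It will NOT fire for system windows such as notification banners,
// because those live in SpringBoard's hierarchy, not your app's.
- (void)viewWillTransitionToSize:(CGSize)size
       withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
    [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
    NSLog(@"App window will resize to %@", NSStringFromCGSize(size));
}

- (void)traitCollectionDidChange:(UITraitCollection *)previousTraitCollection {
    [super traitCollectionDidChange:previousTraitCollection];
    NSLog(@"Trait collection changed (e.g. size class in split view)");
}

@end
```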

Get UIWindow of AirPlay mirroring screen

I'm trying to get the UIWindow object of the external screen (such as a television) when using AirPlay mirroring. The tricky part: it really is AirPlay mirroring, not a separate display, as we're using the built-in functionality of AirPlay rather than setting up a new UIWindow object and assigning it to the new screen. My boss wants the app to mirror the device in every way, but to be able to add subviews (tutorial overlays) to the external window exclusively, by adding them to the secondary screen's UIWindow.
What I've Tried:
I can get the secondary UIScreen object easily, either from UIScreenDidConnectNotification or [UIScreen screens], but as far as I know UIScreens don't have references to the UIWindow being shown.
Since we're just using the display that AirPlay auto-generates, I can't save a reference to the UIWindow during its creation.
I've checked [[UIApplication sharedApplication] windows], and it doesn't seem to include the UIWindow associated with the external display. (At least, none of the window.screen objects match the UIScreen object I get from UIScreenDidConnectNotification when the TV first connects, and while [[UIScreen screens] count] goes up by 1 when a TV is connected, the count of windows remains static.)
Is there a way to access the window for a secondary screen using AirPlay mirroring? Or alternatively, is there a way to efficiently implement app-wide mirroring of the device that allows greater control of the UIWindow object associated with the TV?
This document from Apple:
https://developer.apple.com/library/ios/documentation/WindowsViews/Conceptual/WindowAndScreenGuide/UsingExternalDisplay/UsingExternalDisplay.html
states that:
To re-enable mirroring after displaying unique content, simply remove the window you created from the appropriate screen object.
This means that when mirroring is active there is no separate window linked to the external screen; the mirroring just uses the window of the main screen.
There is only a single window, and I don't think it's possible to show different content in this mode.
If you want to show different content, you'll have to create a new window and assign it to that screen.
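As a sketch of that last step, with a hypothetical ExternalScreenManager helper listening for the same UIScreenDidConnectNotification the question mentions (assigning a window to the screen is what disables mirroring, per the Apple doc quoted above):

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper: listen for screen connection, then replace mirroring
// with unique content by attaching a window of your own to the external screen.
@interface ExternalScreenManager : NSObject
@property (nonatomic, strong) UIWindow *externalWindow;
@end

@implementation ExternalScreenManager

- (instancetype)init {
    if ((self = [super init])) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(screenDidConnect:)
                                                     name:UIScreenDidConnectNotification
                                                   object:nil];
    }
    return self;
}

- (void)screenDidConnect:(NSNotification *)note {
    UIScreen *externalScreen = note.object; // the newly connected UIScreen
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen; // this is what stops mirroring
    self.externalWindow.rootViewController = [[UIViewController alloc] init];
    self.externalWindow.hidden = NO;
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end
```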

How can I detect from iOS Keyboard extension if user scrolled up the Control Center?

I'm developing an iOS keyboard extension, and I use scroll gestures on the keyboard. Sometimes while using the keyboard I swipe up Control Center by accident, and my keyboard stops working properly. Is there any way to detect when Control Center becomes visible or invisible?
You can't do it directly. The most you can know is that your app was deactivated and then activated again. That could be because of Control Center, or Notification Center, or an incoming phone call, or the user going into the app switcher and coming back again...
Here is a possible workaround you can try:
AAWindow is a UIWindow subclass that enables behavior like adaptive round corners and detecting when Control Center is opened. It probably does what you want: you simply subscribe to an NSNotification and can react to the user opening Control Center. Detailed instructions and setup are on GitHub:
https://github.com/aaronabentheuer/AAWindow
[AAWindow: The way this is accomplished is by using a combination of NSTimer and overriding sendEvent in UIWindow to receive all touches without blocking them. So you basically receive all touches and check whether they are near the lower edge of the screen; if so, you set a timer for half a second, and if applicationWillResignActive is called while that timer is running, you can be almost certain that Control Center was opened. The timeout has to vary if there's no status bar, because then the app is in fullscreen and it can take the user up to 3 seconds to open Control Center.]
Hope this helps you figure out the exact solution to your problem.
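If AAWindow fits your case, usage presumably comes down to observing the notification it posts. The notification name below is a placeholder I made up, so check the repository's README for the actual constant:

```objc
#import <UIKit/UIKit.h>

@interface KeyboardViewController : UIViewController // hypothetical consumer
@end

@implementation KeyboardViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Placeholder name: not AAWindow's actual constant; see the repo's README.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(controlCenterDidOpen:)
                                                 name:@"AAWindowControlCenterWillOpen"
                                               object:nil];
}

- (void)controlCenterDidOpen:(NSNotification *)note {
    // React here, e.g. reset the keyboard's gesture state.
    NSLog(@"Control Center appears to have been opened");
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end
```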

iOS notified when control center is opened like QuizUp do

How can I be notified when the iOS Control Center is being opened?
UIApplicationWillResignActiveNotification isn't good enough, since this notification is also sent when Notification Center is opened, when an alert view appears, and in other scenarios.
I was sure this was not possible, but the QuizUp app is notified when the user opens Control Center in the middle of gameplay, to prevent cheating.
Thanks
Hey there, I did a lot of trial-and-error investigation and came up with a solution that turns out to be very reliable. It works in all orientations, both in fullscreen (no status bar) and in regular mode. AAWindow is a subclass of UIWindow and you can find it on GitHub.
The way I accomplished this is by overriding sendEvent in UIWindow, separating touch events from the other events, and checking whether the touches occur in the bottom 10 percent of the screen (the part that can open Control Center). If there are such touches and applicationWillResignActive is called within a timespan of 0.5 seconds (with status bar) or 3 seconds (without status bar), you can be very sure that this is because Control Center was opened. Then an NSNotification is fired, and you can react to it anywhere in the application.
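Here is a condensed sketch of that approach using the thresholds quoted above. It is a paraphrase, not AAWindow's actual source; the class name and notification name are invented:

```objc
#import <UIKit/UIKit.h>

// Sketch of the described technique; not AAWindow's literal implementation.
@interface CCDetectingWindow : UIWindow
@property (nonatomic) NSTimeInterval lastEdgeTouchTime;
@end

@implementation CCDetectingWindow

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(willResignActive)
                                                     name:UIApplicationWillResignActiveNotification
                                                   object:nil];
    }
    return self;
}

// Observe every touch without swallowing it, then pass it on as usual.
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        for (UITouch *touch in [event allTouches]) {
            CGFloat y = [touch locationInView:self].y;
            if (y > CGRectGetHeight(self.bounds) * 0.9) { // bottom 10% of screen
                self.lastEdgeTouchTime = [NSDate timeIntervalSinceReferenceDate];
            }
        }
    }
    [super sendEvent:event];
}

- (void)willResignActive {
    // 0.5 s with a status bar; up to 3 s in fullscreen, per the answer above.
    NSTimeInterval threshold = [UIApplication sharedApplication].statusBarHidden ? 3.0 : 0.5;
    if ([NSDate timeIntervalSinceReferenceDate] - self.lastEdgeTouchTime < threshold) {
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"ControlCenterProbablyOpened" object:self];
    }
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end
```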
I tested a UIPanGestureRecognizer approach (with and without the status bar visible, which changes whether the little pull tab comes up instead of Control Center) along with watching for the applicationWillResignActive notification, and I couldn't reliably tell whether Control Center was opened. If the pan was slow enough, the gesture recognizer would trigger first, but it's definitely easy to swipe up fast enough to trigger Control Center without the gesture recognizer firing at all.
Checking whether the app goes from applicationWillResignActive and then to applicationDidBecomeActive is a pretty reliable way to know that the app entered and exited one of a couple of states (Control Center, Notification Center, answering a phone call, etc.), but telling the difference between, say, Notification Center and Control Center is impossible this way.
TL;DR: I don't think there is a reliable or accurate way to tell if Control Center was opened, but QuizUp may be doing something interesting to fake it, and I am open to being wrong!
When Control Center is opened, the background cycle isn't completed: only applicationWillResignActive: is called; applicationDidEnterBackground: won't be. When the app is minimized, both methods are called. That is where you can differentiate.
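A minimal sketch of that differentiation, using a plain flag in the app delegate; as the answers above note, this still can't tell Control Center apart from Notification Center, alerts, or incoming calls:

```objc
#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (nonatomic) BOOL didEnterBackground;
@end

@implementation AppDelegate

- (void)applicationWillResignActive:(UIApplication *)application {
    self.didEnterBackground = NO; // reset; wait to see if backgrounding follows
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
    self.didEnterBackground = YES; // app was actually minimized
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
    if (!self.didEnterBackground) {
        // Resigned active without entering the background: Control Center is
        // one possible cause, but so are Notification Center, alerts, and calls.
        NSLog(@"Overlay of some kind (possibly Control Center) was shown");
    }
}

@end
```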

UIAlertViews, UIActionSheets and keyWindow problems

I created an iOS 7 passcode replica, and I have a problem I can't seem to solve. I need the lock screen view to be on top of everything else, so that the app is covered in iOS's multitasking view, which is why I add it directly to the keyWindow. Everything is fine so far.
The problem arises if there's an alertView or actionSheet (will only mention alertViews in this post, to keep it simple) open when I have to display the lock screen. It has been answered several times that there are no references to alertViews in iOS 7, which is true, and the window in which they are displayed is _UIModalItemHostingWindow, which has 2 UIViews, indeed with no reference to the alertView.
This _UIModalItemHostingWindow also becomes the new keyWindow, so it's on top of everything else, but it cannot be found in [UIApplication sharedApplication].windows. That means that if I add the lock screen to my former keyWindow (the default keyWindow, if you will), it sits beneath the alertView and its dimmed background, so the user can't interact with the lock screen before dismissing the alertView. The other option is detailed a bit further below.
The lock screen works like this: on applicationDidEnterBackground it checks whether the passcode is enabled; if it is enabled and the passcode duration is 0 (the user chose to lock the app immediately), it adds the lock screen right away, so it covers the app in the multitasking view. Now, the option I mentioned above is to add the lock screen to this _UIModalItemHostingWindow window, but when returning to the app, the lock screen view is displayed with a 1+ second delay (even though I added it before going to the background!) and the app isn't covered by anything in the multitasking view. (Currently it's displayed in the wrong position too, if you go ahead and download it; that is fixed, but I haven't pushed the commit yet.)
I tried hiding this _UIModalItemHostingWindow and removing it from the hierarchy with removeFromSuperview, but when coming back to the app the alertView animation still runs as if it had just been fired. I suspect the delay mentioned above also happens because of how Apple handles alertViews when coming back to the foreground.
I also tried creating a new window and to make that the new keyWindow, but same thing happens.
Here's a small discussion about this, covering all the stuff I tried, maybe I missed something in this post.
https://github.com/rolandleth/LTHPasscodeViewController/issues/16
Any ideas? Other than creating manual references to every alertView and actionSheet inside my app, because I'm trying to find a fix for the passcode library, not my own apps; I can find dirty workarounds for those, no problem :)
Update: The window is _UIAlertOverlayWindow if an actionSheet is used instead of an alertView, but it behaves the same as far as I can tell.
The simplest solution is to have a lockscreen window instead of a lockscreen view.
Create a new UIWindow, set its frame to the UIScreen bounds, give it a simple rootViewController that handles rotation and displays your "lock screen" views, and set the windowLevel to UIWindowLevelAlert + 1.
Then set the window's hidden to YES. Whenever you want to show the lock screen, just set hidden to NO.
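As a sketch, with a hypothetical LockScreenPresenter owner (in practice the rootViewController would be your own rotation-handling lock screen controller rather than a plain UIViewController):

```objc
#import <UIKit/UIKit.h>

@interface LockScreenPresenter : NSObject
@property (nonatomic, strong) UIWindow *lockWindow; // strong ref keeps it alive
@end

@implementation LockScreenPresenter

- (void)prepare {
    self.lockWindow = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    self.lockWindow.rootViewController = [[UIViewController alloc] init]; // your lock screen VC here
    self.lockWindow.windowLevel = UIWindowLevelAlert + 1; // above alert views and action sheets
    self.lockWindow.hidden = YES;
}

- (void)setLocked:(BOOL)locked {
    self.lockWindow.hidden = !locked; // toggling hidden shows/hides the whole window
}

@end
```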
I suspect that adding a view to the keyWindow also doesn't work when a popover/action sheet is displayed, and likewise when a keyboard is up (the keyboard has its own window on top of the key window).
