How do I make iOS 7 Control Center behave like it does in Maps (small tab)?

In Maps in iOS 7, if you swipe up from the bottom of the screen as if opening control center while the map view is full screen, you get a small tab instead of the full Control Center, and the map scrolls normally. You then have to grab that tab and pull it up to open Control Center fully. See this screenshot as an example:
I have an app with significant functionality triggered by dragging up on a small UI element at the bottom of the screen, and in iOS 7 it's tricky to grab it without accidentally triggering the swipe-up-from-the-bottom system gesture. How do I trigger the same Control Center behavior in my app? Is there a key I can set in Info.plist, or is that some kind of private API that Apple uses? It doesn't happen when the bottom toolbar is visible, so maybe it's some kind of state that can be set programmatically?
UPDATE: The same behavior occurs when you swipe down from the top of the screen as if to open Notification Center.
UPDATE 2: The camera app does the same thing, according to this question about UIScreenEdgePanGestureRecognizer.

I believe you have to hide the status bar for the grabber to come up first. That being said, I think users would expect Control Center to come up when they drag up from the bottom of the screen, so I'd say preventing that from happening isn't a very good user experience.
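If you go that route, a minimal sketch in modern Swift of hiding the status bar per view controller (assuming view-controller-based status bar appearance, the Info.plist default) might look like this:

```swift
import UIKit

class FullScreenViewController: UIViewController {

    // Hiding the status bar is what appears to put the system edge
    // gestures into the "grabber tab first" mode described above.
    override var prefersStatusBarHidden: Bool {
        return true
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Ask UIKit to re-evaluate prefersStatusBarHidden.
        setNeedsStatusBarAppearanceUpdate()
    }
}
```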

Related

SwiftUI: Accessibility sound feedback for a draggable element

I am making an application that works essentially like a simple Drag-and-Drop Playground with the command blocks on the left and a droppable area on the right. I want to make it fully compatible with VoiceOver and I'm running into trouble with some of the accessibility aspects since this is my first Swift application.
This is what the playground currently looks like: (App Screenshot)
My goal is to provide the users with audio cues/feedback while they are dragging the elements to help them figure out what part of the screen they are currently at. The ideal functionality would be exactly like what one uses when editing an iOS device's Home screen (the arrangement layout of the apps).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a row/column alert when you are dragging an app over an open area. I want a similar type of feedback that says "Droppable Area" when you are over the correct area (see scenario 1).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a sound when you tap on an area that has no app icon. (This also happens when you are not editing the layout and simply tap on an open area with no app.) I want that noise to be what you hear when you drag a command over an area that is not droppable (see scenario 2).
Any ideas on how this might be possible or good references to look at?
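One possible approach (a sketch, not something confirmed in this thread) is to post a VoiceOver announcement whenever the drag crosses into or out of the droppable area. The view, its layout, and `dropZoneFrame` below are hypothetical placeholders; the only real API relied on is `UIAccessibility.post(notification:argument:)`, and this ignores how the drag gesture itself is exposed to VoiceOver:

```swift
import SwiftUI
import UIKit

// Sketch: announce "Droppable area" when a dragged command block enters
// the drop zone, and a different phrase when it leaves.
struct CommandBlock: View {
    let dropZoneFrame: CGRect          // hypothetical frame of the droppable area, in global coordinates
    @State private var isOverDropZone = false

    var body: some View {
        Text("Move Forward")
            .padding()
            .background(Color.blue.opacity(0.3))
            .gesture(
                DragGesture(coordinateSpace: .global)
                    .onChanged { value in
                        let nowOver = dropZoneFrame.contains(value.location)
                        // Only announce on transitions, so VoiceOver isn't flooded.
                        if nowOver != isOverDropZone {
                            isOverDropZone = nowOver
                            UIAccessibility.post(
                                notification: .announcement,
                                argument: nowOver ? "Droppable area" : "Not a droppable area"
                            )
                        }
                    }
            )
    }
}
```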

Is it recommended to place a button next to the iPhone X virtual home button?

Is it recommended to place a button next to the home button, like Apple did in the system Weather app? My intuition is that no control elements should be placed in this area, but I couldn't find any information about the topic in Apple's iOS Human Interface Guidelines.
(And why are the paging-dots in different positions from time to time?)
I think the best way is to place all UI elements inside the Safe Area. See the section "Inset essential content to prevent clipping" in the Human Interface Guidelines for iPhone X:
Apps should adhere to the safe area and layout margins defined by UIKit, which ensure appropriate insetting based on the device and context. The safe area also prevents content from underlapping the status bar, navigation bar, toolbar, and tab bar.
Also see the video Designing for iPhone X, which includes some notes about the Safe Area.
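For example, a minimal UIKit sketch of keeping a bottom control inside the safe area (the button and its offset are placeholders; the point is constraining to safeAreaLayoutGuide):

```swift
import UIKit

// Sketch: pinning a bottom button inside the safe area so it does not
// collide with the home indicator on iPhone X.
class ExampleViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let button = UIButton(type: .system)
        button.setTitle("Refresh", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)

        let safeArea = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: safeArea.centerXAnchor),
            // Keep a little extra breathing room above the home indicator.
            button.bottomAnchor.constraint(equalTo: safeArea.bottomAnchor, constant: -8)
        ])
    }
}
```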

Support Apple Pencil in my app

I have a colouring app which allows the user to colour images, select colours, and use different colouring techniques.
I need to know whether there are any requirements I should add to my app to support Apple Pencil input on iPad Pro. Also, are there specific limitations on Apple Pencil input, i.e. touch functions it can't be used for?
There are limitations.
Swiping with your Apple Pencil from the top or bottom of the screen does nothing. You'll need to use your finger if you want to access these edge gestures.
The Apple Pencil cannot be used to invoke Slide Over or Split View. Swiping from the right side of the screen gives no response either.
And if you have a second app open in Slide Over, you can't push it off the screen with the Pencil. You also cannot resize apps in Split View with the Pencil.
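On the input side (not covered by the answer above), a common pattern is to check UITouch.type so Pencil and finger strokes can be handled differently. A rough sketch, where drawStroke(at:pressure:tilt:) is a hypothetical stand-in for the app's own rendering:

```swift
import UIKit

// Sketch: distinguishing Apple Pencil touches from finger touches in a
// drawing view. touch.type, force, and altitudeAngle are standard
// UITouch properties; the stroke drawing itself is a placeholder.
class CanvasView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)

        if touch.type == .pencil {
            // Pencil touches report pressure and tilt, useful for brush dynamics.
            let pressure = touch.force / max(touch.maximumPossibleForce, 1)
            let tilt = touch.altitudeAngle
            drawStroke(at: point, pressure: pressure, tilt: tilt)
        } else {
            // Finger input: fall back to a fixed-width stroke.
            drawStroke(at: point, pressure: 0.5, tilt: .pi / 2)
        }
    }

    private func drawStroke(at point: CGPoint, pressure: CGFloat, tilt: CGFloat) {
        // Placeholder for the app's actual colouring/stroke rendering.
    }
}
```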

Get rid of pull-down arrow from top and bottom of iPhone screen

I am developing an app with lots of gestural interaction; there are interactive touch areas in every region of the screen. The app is an interactive synthesizer, not some picture-sharing network that tightly follows the Human Interface Guidelines.
Whenever I interact with any of the gesture inputs near the top or bottom of the screen, the arrows that signal the system overlay screens (Notification Center and Control Center) appear. Is there any way to turn this off?
The short answer is that this is not possible.
The best you can do is warn users and ask them to go to Settings to turn off Control Center access within apps.
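A minimal sketch of such a warning using UIAlertController; the wording and where you present it from are assumptions, not part of the original answer:

```swift
import UIKit

// Sketch: warning the user that edge swipes may open Control Center /
// Notification Center, and pointing them at Settings.
extension UIViewController {
    func showEdgeGestureWarning() {
        let alert = UIAlertController(
            title: "Edge gestures",
            message: "Swiping from the very top or bottom of the screen may open "
                   + "Notification Center or Control Center. You can restrict their "
                   + "access within apps in the Settings app.",
            preferredStyle: .alert
        )
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        present(alert, animated: true)
    }
}
```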

How to increase the tappable area of top navbar buttons in iOS PhoneGap/Cordova apps

I have been developing hybrid apps on iOS, and the most glaring problem I'm having is that the back button on the top navbar, which emulates the native back button, has a much smaller tappable area.
This may be because the button sits at the very top edge of the screen and the webview doesn't interpret taps on that edge as intended for the webview; perhaps the status bar is capturing them.
I have even enlarged the padding on the button element to the point where it takes up the whole top-left corner of the screen, and it still won't register a tap unless you aim about 3.5 mm below the top of the webview. In a native app you can aim 0 mm from the edge and it registers.
This may not seem that bad, but to a long-term iOS user that 3.5 mm is very apparent, and their mental model of where a touch should register makes them immediately think the app is broken rather than that they tapped the wrong area.
I am interested in any other information regarding ways to minimize this discrepancy between native and hybrid, or proposed solutions and explanations leading to a better understanding of why this occurs.
Using Cordova / PhoneGap and Kendo Mobile to implement the app
