Support Apple Pencil in my app (iOS)

I have a colouring app which allows the user to colour images, select colours, and use different colouring techniques.
Are there any requirements I should add to my app to support Apple Pencil input on iPad Pro? Also, are there specific limitations on Apple Pencil input, i.e. some touch functions it can't be used for?

There are limitations.
Swiping with your Apple Pencil from the top or bottom of the screen does nothing. You'll need to use your finger if you want to access these edge gestures.
The Apple Pencil cannot be used to invoke Slide Over or Split View. Swiping from the right side of the screen gives no response either.
And if you have a second app open in Slide Over, you can't push it off the screen with the Pencil. You also cannot resize apps in Split View with the Pencil.
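On the requirements side, there is nothing you need to declare: the Pencil simply arrives as ordinary touch events, so existing touch handling keeps working. To take advantage of it, you can check `UITouch.type` (iOS 9.1+) and read pressure and tilt. A minimal sketch for a drawing view; the view class and the `drawStroke` brush method are hypothetical:

```swift
import UIKit

class CanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            switch touch.type {
            case .pencil:
                // Apple Pencil reports pressure (force) and tilt (altitudeAngle).
                let pressure = touch.force / touch.maximumPossibleForce
                drawStroke(at: touch.location(in: self),
                           pressure: pressure,
                           tilt: touch.altitudeAngle)
            case .direct:
                // A finger: fall back to a fixed pressure and vertical tilt.
                drawStroke(at: touch.location(in: self),
                           pressure: 0.5,
                           tilt: .pi / 2)
            default:
                break
            }
        }
    }

    private func drawStroke(at point: CGPoint, pressure: CGFloat, tilt: CGFloat) {
        // Hypothetical brush rendering goes here.
    }
}
```

For smoother strokes you may also want `event?.coalescedTouches(for:)` to pick up the Pencil's higher sampling rate.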

Related

SwiftUI: Accessibility sound feedback for a draggable element

I am making an application that works essentially like a simple Drag-and-Drop Playground with the command blocks on the left and a droppable area on the right. I want to make it fully compatible with VoiceOver and I'm running into trouble with some of the accessibility aspects since this is my first Swift application.
This is what the playground currently looks like: (App Screenshot)
My goal is to provide the users with audio cues/feedback while they are dragging the elements to help them figure out what part of the screen they are currently at. The ideal functionality would be exactly like what one uses when editing an iOS device's Home screen (the arrangement layout of the apps).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a row/column alert when you are dragging an app over an open area. I want a similar type of feedback that says "Droppable Area" when you are over the correct area (see scenario 1).
When trying to rearrange apps on the home screen with VoiceOver enabled, you hear a sound when you tap on an area that has no app icon. (This also happens when you are not editing the layout and simply tap on an open area with no app.) I want that noise to be what you hear when you drag a command over an area that is not droppable (see scenario 2).
Any ideas on how this might be possible or good references to look at?
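A sketch of one approach (not a tested solution): track whether the drag location is inside the drop area and post a VoiceOver announcement only when that state changes, so VoiceOver isn't flooded with speech. The `dropZone` view and the gesture wiring are assumptions about your layout:

```swift
import UIKit

class PlaygroundViewController: UIViewController {
    let dropZone = UIView()   // the droppable area on the right
    private var wasOverDropZone = false

    @objc func handleDrag(_ gesture: UIPanGestureRecognizer) {
        let location = gesture.location(in: view)
        let isOverDropZone = dropZone.frame.contains(location)

        // Announce only on transitions in/out of the drop area.
        if isOverDropZone != wasOverDropZone {
            wasOverDropZone = isOverDropZone
            if isOverDropZone {
                UIAccessibility.post(notification: .announcement,
                                     argument: "Droppable area")
            } else {
                // For the "not droppable" cue you could instead play a short
                // system sound (e.g. via AudioServicesPlaySystemSound).
                UIAccessibility.post(notification: .announcement,
                                     argument: "Not a droppable area")
            }
        }
    }
}
```

For the SwiftUI side, the same `UIAccessibility.post` call can be made from a `DragGesture`'s `onChanged` handler.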

Get rid of pull-down arrows at the top and bottom of the iPhone screen

I am developing an app with lots of gestural interaction. There are interactive touch areas situated in all areas of the screen. The app is an interactive synthesizer and not some picture sharing network that tightly follows the human interface guidelines.
Whenever I interact with any of the gesture inputs near the top or bottom of the screen, the arrows that signal the system pull-down screens (Notification Center and Control Center) appear. Is there any way to turn this off?
The short answer is that this is not possible.
The best you can do is warn users and ask them to go to Settings to turn Control Center off.
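If you can target iOS 11 or later, the system does provide a supported middle ground: you can ask it to defer its edge gestures, so the first swipe near an edge goes to your app and the user has to swipe again (on the grabber) to open the system screen. A sketch, assuming a single full-screen view controller:

```swift
import UIKit

class SynthViewController: UIViewController {
    // Defer system gestures on the top and bottom edges (iOS 11+).
    // This doesn't disable Notification Center / Control Center,
    // it only gives your app the first swipe.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return [.top, .bottom]
    }

    // Call this if the set of deferred edges changes at runtime.
    func edgesChanged() {
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
    }
}
```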

Mirror Apple Watch UI to improve usability for left/right handed users

Question
Since Apple seems to be forcing storyboards on us to develop watch apps, how can I mirror the UI for a WKInterfaceController to support a left-handed and right-handed interface without having to maintain 2 separate "Scenes" in my storyboard? Right now, I've copied and pasted the "left handed" scene and modified the UI so the controls are "mirrored" appropriately for use by righties. However, anytime I want to update any part of that UI, I have to do it twice, which is very error prone.
Background
I'm in the final stages of developing an Apple Watch app to complement an existing iOS app. One very important aspect of the watch app is that, while entering data on the watch, the hand interacting with the watch must not cover the screen. My idea to meet this requirement is to align all of the controls you can tap along the same vertical edge of the watch.
There are 2 main controls. Right now, they are stacked vertically and aligned on the left edge of the screen. Therefore, left handed people (like me) who wear their watch on their right hand do not cover the screen at all when tapping the controls on the left edge of the watch worn on their right wrist.
However, with this same UI, right handed people who wear the watch on their left hand will be completely covering the screen when using these controls, and won't see the results of their input until they move their hand.
Therefore, I would like to provide "mirrored" UIs that are configurable depending which hand you wear your watch on.
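A sketch of one way to avoid duplicating the scene: keep a single storyboard scene and flip the controls' alignment at runtime with `setHorizontalAlignment(_:)`, which `WKInterfaceObject` has supported since watchOS 2. The outlet names and the wrist-preference key are hypothetical:

```swift
import WatchKit

class EntryInterfaceController: WKInterfaceController {
    @IBOutlet var primaryButton: WKInterfaceButton!
    @IBOutlet var secondaryButton: WKInterfaceButton!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)

        // Hypothetical user setting: true if the watch is worn on the right wrist.
        let wornOnRightWrist = UserDefaults.standard.bool(forKey: "wornOnRightWrist")

        // Left-edge controls for right-wrist wearers (the tapping hand
        // doesn't cover the screen); mirror to the right edge otherwise.
        let alignment: WKInterfaceObjectHorizontalAlignment =
            wornOnRightWrist ? .left : .right
        primaryButton.setHorizontalAlignment(alignment)
        secondaryButton.setHorizontalAlignment(alignment)
    }
}
```

Note this only moves the controls; if the mirroring also needs to reorder items inside a group, you may have to hide/show duplicated controls rather than realign them.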

iOS iPad are there hover like workaround within apps? NOT websites

There seem to be dozens of questions on how to deal with the :hover event on websites when viewed on an iPad.
My question is different - I'm building a native iOS game, and it would be really useful if a user could compare two items side by side. On a PC this can easily be done by displaying a mouse-over panel when the mouse hovers over an inventory item. The main benefit of such a panel is that it is easy to show and easy to dismiss.
What are my alternatives for displaying a transient, hover-like interaction panel in a native iOS app?
For iPad (not iPhone) a popover is pretty close to what you want. If you want to support iPhone/iPod touch as well, there are third-party popover libraries for those devices.
However, I'm not sure how well this would work for comparing 2 items, since the system only displays 1 popover at a time.
This is really more like a map callout bubble. You could build your own callout-bubble style of interface without a lot of work: when you tap an item, it displays its callout, and when you tap it again, or tap outside all items/callouts, it hides. I've done something like that for a custom map system I built for a client and it wasn't that hard.
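A sketch of the popover route using the modern API (`UIPopoverPresentationController`, iOS 8+); `ItemDetailViewController` and the tapped `itemView` are hypothetical placeholders for your own inventory UI:

```swift
import UIKit

class InventoryViewController: UIViewController {
    func showDetails(for itemView: UIView) {
        let detail = ItemDetailViewController()   // hypothetical content controller
        detail.modalPresentationStyle = .popover
        detail.preferredContentSize = CGSize(width: 240, height: 160)

        // Anchor the popover arrow to the tapped inventory item.
        if let popover = detail.popoverPresentationController {
            popover.sourceView = itemView
            popover.sourceRect = itemView.bounds
            popover.permittedArrowDirections = [.up, .down]
        }
        present(detail, animated: true)
    }
}

class ItemDetailViewController: UIViewController {}
```

If you later want true hover (trackpad on iPadOS 13.4+, or Apple Pencil hover on supported iPads), `UIHoverGestureRecognizer` can drive the same panel.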

How do I make iOS 7 Control Center behave like it does in Maps (small tab)?

In Maps in iOS 7, if you swipe up from the bottom of the screen as if opening control center while the map view is full screen, you get a small tab instead of the full Control Center, and the map scrolls normally. You then have to grab that tab and pull it up to open Control Center fully. See this screenshot as an example:
I have an app with significant functionality triggered by dragging up on a small UI element at the bottom of the screen, and it's tricky to grab it in iOS 7 without swiping up from the bottom. How do I trigger the same Control Center behavior in my app? Is there a key in Info.plist that I can set? Or is that some kind of private API that Apple uses? It doesn't happen when the bottom toolbar is visible, so maybe it's some kind of state that can be set programmatically?
UPDATE: The same behavior occurs when you swipe down from the top of the screen as if to open Notification Center.
UPDATE 2: The camera app does the same thing, according to this question about UIScreenEdgePanGestureRecognizer.
I believe you have to hide the status bar for the grabber tab to appear first. That said, users expect Control Center to open when they drag up from the bottom of the screen, so I'd say preventing that isn't a very good user experience.
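A minimal sketch of the status-bar route described above, assuming a single full-screen view controller; with the status bar hidden, the system shows the grabber tab on the first edge swipe instead of opening Control Center directly:

```swift
import UIKit

class FullScreenViewController: UIViewController {
    override var prefersStatusBarHidden: Bool {
        return true
    }
}
```

On iOS 11 and later there is also an explicit API for this behaviour: override `preferredScreenEdgesDeferringSystemGestures` on your view controller.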
