iOS: Require 3D Touch?

Has Apple documented any means (such as a UIRequiredDeviceCapabilities key) for requiring 3D Touch on a device?
I'm looking into features that will need 3D Touch. It wouldn't be the best experience to show an alert telling the user their new app won't work...
Thanks in advance.

In my opinion, you might not see 3D Touch get a key in UIRequiredDeviceCapabilities at all; if it ever happens, it will likely be a long time from now, once the capability is considered absolutely essential.
You should look at 3D Touch as a nice to have, not a mandatory capability.
If it enables certain features on your app, you need to make sure you have a fallback mechanism to access them when the capability is nonexistent.
Moreover, users with a 3D Touch-capable device can disable the feature in:
Settings > General > Accessibility > 3D Touch
So you should always code for both scenarios.
Think of it more like Touch ID than like Metal. Even though nowadays almost everyone has an iOS device capable of fingerprint scanning, people can still disable it and use a passcode instead. So if your app uses Touch ID (as most banking apps, Evernote, 1Password, etc. do), you still need a fallback.
Source: Accessibility and Human Interface Guidelines for 3D Touch
To ensure that all your users can access your app’s features, branch your code depending on whether 3D Touch is available. See Checking for 3D Touch Availability.
NOTE
3D Touch is available only on 3D Touch devices and when enabled. In iOS 9, 3D Touch is enabled by default. A user can turn off 3D Touch in Settings > General > Accessibility > 3D Touch.
When 3D Touch is available, take advantage of its capabilities. When it is not available, provide alternatives such as by employing touch and hold.
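That branching can be sketched as follows; this is an illustrative sketch, not Apple's sample code, and showPreview(for:) is a hypothetical action that should stay reachable either way:

```swift
import UIKit

class ItemViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        if traitCollection.forceTouchCapability == .available {
            // 3D Touch path: e.g. register the view for peek and pop here.
        } else {
            // Fallback path: touch and hold triggers the same action.
            let longPress = UILongPressGestureRecognizer(
                target: self, action: #selector(handleLongPress(_:)))
            view.addGestureRecognizer(longPress)
        }
    }

    @objc func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
        guard recognizer.state == .began, let pressedView = recognizer.view else { return }
        showPreview(for: pressedView)
    }

    // Hypothetical action shared by both input styles.
    func showPreview(for view: UIView) { /* ... */ }
}
```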

UIRequiredDeviceCapabilities is currently the only way to prevent customers from installing your app from the App Store onto a device that does not support certain capabilities.
As of the time of this posting there is no key for 3D Touch, so your best option would probably be either stating the requirement somewhere obvious in the App Store description, or showing a warning in-app as you mentioned.
Hopefully Apple will add this key in the future, but I would assume that their preferred usage of 3D Touch is as a complementary, not mandatory, feature in iOS apps.
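For reference, this is roughly what a UIRequiredDeviceCapabilities entry looks like in Info.plist (armv7 and metal are real, documented values; again, nothing analogous exists for 3D Touch):

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
    <string>metal</string>
</array>
```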

Even though there is no 3D Touch key in UIRequiredDeviceCapabilities, the iPhone 6s and 6s Plus are the only iPhones with 2 GB of RAM; the rest have at most 1 GB. So just make your app iPhone-only and require 2 GB of RAM.

I also have a case where I want to show a menu tab only on 3D Touch devices.
For now, if a device without 3D Touch (e.g., an iPhone 5) upgrades to iOS 9.0, reading traitCollection.forceTouchCapability on it returns UIForceTouchCapabilityUnavailable, even though iOS 9 enables 3D Touch by default. On a device with 3D Touch (also running iOS 9), the value is UIForceTouchCapabilityAvailable.
One other option is to use a third-party class to detect which device you are running on: if it is an iPhone 6s or 6s Plus, return true for 3D Touch support. But you have to update that list manually for future devices.
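A hardware-only check along those lines can be sketched with the machine identifier from uname; the model list here is illustrative and has to be maintained by hand ("iPhone8,1" / "iPhone8,2" are the 6s and 6s Plus):

```swift
import Foundation

// Hardware-level check via the machine identifier. The model set is a
// hypothetical example and must be extended manually as new devices ship.
func deviceSupports3DTouchHardware() -> Bool {
    var systemInfo = utsname()
    uname(&systemInfo)
    // Decode the fixed-size C char array into a Swift string.
    let machine = withUnsafeBytes(of: &systemInfo.machine) { buffer in
        String(decoding: buffer.prefix(while: { $0 != 0 }), as: UTF8.self)
    }
    let modelsWith3DTouch: Set<String> = ["iPhone8,1", "iPhone8,2"]
    return modelsWith3DTouch.contains(machine)
}
```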

Related

How can I use ARKit while using Slide Over/Split Screen on iPadOS?

I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) fullscreen.
The option 'Enable multiple windows' is activated in info.plist, but as soon as I launch my other app, the ARKit app stops sending information (the app actually probably stops).
Is there a simple way to do this, and at least is this feasible? Thanks!
This is not possible at this point. Camera and AR stuff is disabled at a system level in apps when they are displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split Screen is in use, telling the user to run the app in full-screen mode. See this answer under a different question for details.

In iOS, how can I distinguish a Haptic Touch from a long press?

Apple's recent iPhone XR announcement replaces 3D Touch with Haptic Touch, letting you access 3D Touch features by just long-pressing a view.
I'm curious how that will interact with existing UILongPressGestureRecognizer interfaces. There are items in my app that currently have different 3D Touch and long-press functionality.
Since the iPhone XR hardware isn't yet available, I was wondering if anything had been published about how the two features will work together.
Based on testing in the simulator using the Hardware | Touch Pressure | Use Trackpad Force option, Haptic Touch is just a long press.
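One consequence you can check in code: UITouch.force only reports meaningful pressure when the trait collection says force touch is available, so on a Haptic Touch device only the long-press path ever fires. A sketch (PressureView and the normalization helper are illustrative, not an Apple API):

```swift
import UIKit

// Pure helper: normalize a raw force value against the maximum.
func normalizedForce(_ force: CGFloat, max: CGFloat) -> CGFloat {
    return max > 0 ? force / max : 0
}

class PressureView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        if traitCollection.forceTouchCapability == .available {
            // 3D Touch hardware: real pressure readings are available.
            let pressure = normalizedForce(touch.force, max: touch.maximumPossibleForce)
            _ = pressure  // drive a pressure-based UI here
        }
        // A UILongPressGestureRecognizer added elsewhere covers Haptic Touch
        // (and doubles as the fallback when 3D Touch is disabled).
    }
}
```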
"In 2015, when the iPhone 6S and 6S Plus were announced, Apple also introduced 3D Touch. 3D Touch uses a Taptic Engine with haptic feedback, which allows the device to sense the pressure of a touch, thereby triggering specific actions. For example, pressing hard on an icon will enable us to quickly open an action menu." - Clayton, C. (January 2018). Learn iOS 11 Programming with Swift 4 - Second Edition.
"3D Touch is basically a technology that adds another dimension (hence the name) to the tapping mechanism that you find in every iOS device. Traditionally, a user could perform the following actions on an iOS device’s screen using her fingers:
Tap;
Long tap;
Swipe;
Double tap.
With the availability of 3D Touch on a device, a new vector that registers the strength of a touch on the screen is added to the formula, allowing the user to keep her finger on the screen and apply more pressure until an action (application specified) is performed. 3D Touch simply allows the iOS device to have access to how hard the user is pressing her finger on the screen, and this opens the door for a whole new set of applications." - Nahavandipoor, V. (2017). iOS 11 Swift Programming Cookbook.
"3D Touch works because the iPhone’s screen is pressure sensitive. You can press on the display, and the harder you press the more “happens.” Haptic Touch screens, like on the iPhone XR, don’t have this pressure sensitivity. Instead, Haptic Touch appears to be just a new name for our old friend, the long press.
(...)
seems like Haptic Touch can do most of what 3D Touch does already. In iOS 12, a new trackpad mode was added, one that works with every iPhone and iPad. Trackpad mode lets you press hard to turn the iPhone’s keyboard into a trackpad, which in turn lets you control the cursor in the text above. The new mode lets you do the same by long-pressing the keyboard’s space bar. I’m almost certain that this feature will get a Haptic Touch vibration in the iPhone XR." - Sorrel,C. (September 2018). What’s the difference between 3-D Touch and Haptic Touch?
"It’s important to remember that not all devices have a Taptic Engine; in the devices that don’t, the feedback simply won’t be played. This means that you mustn’t rely on tactile feedback (...) it should be used to underline effects that are already visually or aurally present." - Buttfield-Addison, P., Manning, J. (October 2018). iOS Swift Game Development Cookbook, 3rd Edition.
"Considering they removed 3D Touch from the device to cut manufacturing costs and to give us the new "Liquid Retina" display, we don't expect any real pressure sensitivity to be associated with Haptic Touch." - Peterson, J. (September 2018). All the 3D Touch Actions You'll Lose by Switching to the iPhone XR with 'Haptic Touch'.
At this stage it’s somewhat unclear how many of the functions supported by Apple and 3rd party apps on 3D Touch can and/or will be supported on Haptic Touch.
With this, what's happening with 3D Touch now seems similar to what happened with Force Touch back in 2015. "Haptic Touch" is the narrower term, describing a specific set of possible actions, whereas "3D Touch" has a broader meaning. As Buttfield-Addison and Manning note above, I wouldn't depend on this type of feedback. Since you already have it in your app, and going by the meaning of haptic (relating to the sense of touch, in particular the perception and manipulation of objects using touch and proprioception), I reckon the long press will remain, since it's one of the touch interactions a mainstream user already expects.
There's a good answer on apple.stackexchange that I suggest you read.

Do iOS apps need to be updated for the iPhone X's 120Hz touch array?

The iPhone X has a 120Hz touch array. Do I need to update my app to support this faster touch array, especially if my app supports drawing?
TLDR: No, you don’t need to update your app to support 120Hz touch delivery on iPhone X.
However, if you have an app that benefits from precise touch handling, like a drawing app, you can take advantage of 120Hz touch delivery to improve your user experience. And you may already have for iPad Pro — read on for details.
Apple’s iOS Device Compatibility Reference talks about this a bit, if obliquely. The Touch Input table in that doc shows that iPhone X has a touch sample rate higher than its touch delivery rate, just like the first couple models of iPad Pro. (It’s also like how any iPad Pro gets Apple Pencil touches at 240Hz but delivers events only at 60Hz or 120Hz.)
Further down, it says:
When the capture rate is higher than the delivery rate, multiple events are coalesced into one touch event whose location reflects the most recent touch. However, the additional touch information is available for apps that need more precision.
To get the extra touch information, ask the UIEvent object in your touch handler (touchesBegan, touchesMoved, or touchesEnded) for its coalescedTouches(for:), passing the UITouch you got in your touch handler.
Apple has a couple of articles that go into more detail on coalesced touches:
Getting High-Fidelity Input with Coalesced Touches
Implementing Coalesced Touch Support in an App
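A minimal sketch of that pattern in a drawing view; strokePoints and appendToStroke(_:) are hypothetical stand-ins for real rendering code:

```swift
import UIKit

class CanvasView: UIView {
    var strokePoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }
        // Ask for every sample captured since the last delivered event,
        // not just the single coalesced location.
        for sample in event.coalescedTouches(for: touch) ?? [touch] {
            appendToStroke(sample.location(in: self))
        }
    }

    func appendToStroke(_ point: CGPoint) {
        strokePoints.append(point)
    }
}
```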
Also, if you’re doing anything with coalesced touches, you can probably also benefit from handling predicted touches. They also have a few articles about that, and some sample code that uses both:
Minimizing Latency with Predicted Touches
Incorporating Predicted Touches into an App
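A companion sketch for predicted touches; provisionalPoints is again a hypothetical stand-in, and the key point is that predictions are provisional and must be discarded and redrawn on every event:

```swift
import UIKit

class PredictiveCanvasView: UIView {
    var provisionalPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }
        // Discard last frame's predictions before drawing the new ones.
        provisionalPoints.removeAll()
        for predicted in event.predictedTouches(for: touch) ?? [] {
            provisionalPoints.append(predicted.location(in: self))
        }
    }
}
```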
In short, if you’ve been optimizing your apps for faster (finger) touch handling and Apple Pencil on iPad Pro, you also benefit from faster touch handling on iPhone X.
If you don’t do anything, you’re just fine — only certain kinds of interaction are really improved by custom touch handling code, like drawing apps. And most likely Apple has optimized a bunch of the system touch handling code, like scroll views, gesture recognizers, the new swipe-to-Home and app switching gestures, etc, so your app would benefit from those for free.

Determine 3D touch availability for the device

I want to know if the device supports 3D Touch. I don't like to use the forceTouchCapability property of the UITraitCollection, because it may return unavailable for iPhone 6s if the user has turned off 3D Touch in the settings.
I want to know if 3D Touch is available in hardware, regardless of the settings. The only solution I see is to check the device's model, but maybe someone can suggest a simpler, more robust solution.
Background: I have added Home screen quick actions to the app, and I want to notify the user about them only if the device supports them.
I am afraid you won't be able to check for exactly the feature you are asking about; 3D Touch availability is not determined by the device model alone:
Keys that indicate the availability of 3D Touch on a device. Only certain devices support 3D Touch. On those that do, the user can disable 3D Touch in the Accessibility area in Settings.
UIForceTouchCapability
Meaning that the system tells your application that 3D Touch is unavailable regardless of whether the device supports it, which explains what you mentioned:
I don't like to use the forceTouchCapability property of the UITraitCollection, because it may return unavailable for iPhone 6s if the user has turned off 3D Touch in settings.
However, you are able to detect changes to 3D Touch availability while your app is running, by implementing traitCollectionDidChange(_:) as mentioned in Checking the Availability of 3D Touch article:
override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
    // Update the app's 3D Touch support.
    if self.traitCollection.forceTouchCapability == .available {
        // Enable 3D Touch features.
    } else {
        // Fall back to other, non-3D Touch features.
    }
}
Regarding:
The only possible solution which I see is to check device model
Personally, I think it doesn't make sense to do such a thing, because, as I mentioned above, this is controlled by the system. At some point it makes no difference whether the device doesn't support 3D Touch or the user has disabled it; both should lead to the same result as far as your app is concerned.

GPS-based vs. beacon-based ranging: which governs the lock screen's left-corner app icon?

There are two approaches for showing an app (or an app suggestion, in case it is not installed) on the iPhone lock screen / app switcher. One is GPS-based, in which iOS decides which app to suggest. The other is beacon-based, in which a particular beacon is identified.
If location services are enabled for multiple apps, and all of these apps also use the beacon-based approach to show their icons in the lock screen's left corner, which app icon will iOS show?
Since location services are enabled for these apps, and say there is another relevant app that does NOT use beacons (just the GPS-based approach), can iOS give preference to the beacon-based apps over this new GPS-based app?
For instance, Estimote’s NYC office is on the same block as an Equinox gym and our phones intelligently and automatically alert us to use that app. It’s super easy and intuitive to open the app while walking into the gym - and in the process, streamline the check-in flow with the gym’s front desk. However, because it solely uses GPS geofences, the accuracy is poor. We actually get the Equinox icon over 1 block away, and there is no control for the brands or stores (in this case Equinox) on how this appears.
Apple's suggestion of apps not installed on the phone based on proximity uses an undocumented technique. While I have verified it uses GPS as an input, I have never been able to confirm that beacons are used at all.
Regardless of whether beacons are used, because this is an undocumented feature, it is unlikely you will find a way to customize the behavior.
AFAIK, Apple has never shared the implementation details of how the lock screen icon AKA "suggested apps" feature works.
However, we did some experiments at Estimote and noticed that being inside a CLRegion (both the "GPS" CLCircularRegion, and CLBeaconRegion work) that an app monitors for via Core Location, consistently makes the app's icon show up on the lock screen. So it seems that both beacons and GPS location fall into the same mechanism that governs the location-based suggestions. (Note that in iOS 9, that's not just the lock screen icon, but also a bar at the bottom of the app switcher.)
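For context, both region types are registered through the same Core Location monitoring entry point, which is consistent with that observation. A sketch (the coordinates are example values, and the UUID is Estimote's default proximity UUID):

```swift
import CoreLocation

let manager = CLLocationManager()

// A "GPS" geofence: 100 m around an example coordinate.
let geofence = CLCircularRegion(
    center: CLLocationCoordinate2D(latitude: 40.7420, longitude: -73.9893),
    radius: 100,
    identifier: "office-geofence")

// A beacon region using Estimote's default proximity UUID.
let beaconRegion = CLBeaconRegion(
    proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!,
    identifier: "office-beacons")

// Both region types go through the same monitoring API.
manager.startMonitoring(for: geofence)
manager.startMonitoring(for: beaconRegion)
```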
Unfortunately, we weren't able to establish what happens if you're inside multiple qualifying CLRegions, belonging to different apps. We suspect it might have something to do with the order in which the apps register regions for monitoring, but were never able to get consistent results.
Furthermore, since this whole behavior is undocumented, Apple can change it at any time. Just something to be aware of.
Side note: handoff always trumps suggested apps.
