Suitable sensors in iPhone to pick up user interaction - ios

I'm working on an app - one of its main features is to detect whether the user is using the phone, interacting with it in any way, or even touching it!
Which sensors in the iPhone can help me detect these things?
And how can I use those sensors to make this feature work?
Thanks

The iPhone has accelerometers, gyros, and GPS. With these, you can monitor motion of the phone: sudden shocks (like when the phone is picked up and put down), orientation, and overall movement. If outside, you can also use the GPS to pick up motion and position (latitude, longitude, course, speed, altitude).
When interacting with the app, you've also got touch events and multi-touch events (like using two fingers to zoom in, zoom out, or rotate). Most of the gestures are defined by Apple, so you don't need to figure out the user's intent - just respond to the event.
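To make that concrete, here's a rough sketch of both ideas - Core Motion to notice that the phone is being moved or handled, and a gesture recognizer to respond to touch interaction. The update interval and the movement threshold are arbitrary illustrative values you'd want to tune:

```swift
import UIKit
import CoreMotion

class InteractionViewController: UIViewController {
    private let motionManager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Touch interaction: let a system gesture recognizer do the work.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        view.addGestureRecognizer(tap)

        // Physical interaction: watch the accelerometer for any movement of the device.
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.1   // 10 Hz, arbitrary
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Total acceleration magnitude is ~1 g when the phone is at rest;
            // any deviation suggests the phone is being picked up, moved, or shaken.
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            if abs(magnitude - 1.0) > 0.05 {               // threshold is a guess
                print("Device is being moved or handled")
            }
        }
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        print("User touched the screen at \(recognizer.location(in: view))")
    }
}
```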
Numerous sensor-monitoring apps exist, e.g.:
http://wavefrontlabs.com/Wavefront_Labs/Sensor_Data.html
Tutorials on how to do some of this stuff:
https://www.youtube.com/watch?v=Hml2jB_Qpds
https://www.youtube.com/watch?v=Xk5cJlhePCI
https://www.youtube.com/watch?v=qY4xCMTejH8
https://www.youtube.com/watch?v=7GHc8ySyWcY
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizer_basics/GestureRecognizer_basics.html

Related

Do iOS apps need to be updated for the iPhone X's 120Hz touch array?

The iPhone X has a 120Hz touch array. Do I need to update my app to support this faster touch array, especially if my app supports drawing?
TLDR: No, you don’t need to update your app to support 120Hz touch delivery on iPhone X.
However, if you have an app that benefits from precise touch handling, like a drawing app, you can take advantage of 120Hz touch delivery to improve your user experience. And you may already have for iPad Pro — read on for details.
Apple’s iOS Device Compatibility Reference talks about this a bit, if obliquely. The Touch Input table in that doc shows that iPhone X has a touch sample rate higher than its touch delivery rate, just like the first couple models of iPad Pro. (It’s also like how any iPad Pro gets Apple Pencil touches at 240Hz but delivers events only at 60Hz or 120Hz.)
Further down, it says:
When the capture rate is higher than the delivery rate, multiple events are coalesced into one touch event whose location reflects the most recent touch. However, the additional touch information is available for apps that need more precision.
To get the extra touch information, ask the UIEvent object in your touch handler (touchesBegan, touchesMoved, or touchesEnded) for its coalescedTouches(for:), passing in the UITouch you received.
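As a minimal sketch (the stroke-handling method below is a placeholder, not part of the API):

```swift
import UIKit

class CanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // On devices that scan touches faster than they deliver events, the coalesced
        // touches carry the intermediate samples folded into this one event.
        let samples = event?.coalescedTouches(for: touch) ?? [touch]
        for sample in samples {
            addPointToStroke(sample.preciseLocation(in: self))
        }
    }

    private func addPointToStroke(_ point: CGPoint) {
        // Placeholder: append the point to the current stroke and trigger a redraw.
    }
}
```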
Apple has a couple of articles that go into more detail on coalesced touches:
Getting High-Fidelity Input with Coalesced Touches
Implementing Coalesced Touch Support in an App
Also, if you’re doing anything with coalesced touches, you can probably also benefit from handling predicted touches. They also have a few articles about that, and some sample code that uses both:
Minimizing Latency with Predicted Touches
Incorporating Predicted Touches into an App
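Predicted touches are queried the same way. The sketch below assumes a hypothetical canvas view with placeholder drawing methods; the key point is that predicted points are provisional - draw them temporarily and replace them with real samples on the next event:

```swift
import UIKit

class PredictiveCanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Real samples: commit them to the stroke permanently.
        for sample in event?.coalescedTouches(for: touch) ?? [touch] {
            commitPoint(sample.preciseLocation(in: self))
        }
        // Predicted samples: UIKit's guess at where the touch is heading.
        // Draw them provisionally and discard them when the next event arrives.
        for prediction in event?.predictedTouches(for: touch) ?? [] {
            drawProvisionalPoint(prediction.preciseLocation(in: self))
        }
    }

    private func commitPoint(_ point: CGPoint) { /* placeholder */ }
    private func drawProvisionalPoint(_ point: CGPoint) { /* placeholder */ }
}
```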
In short, if you’ve been optimizing your apps for faster (finger) touch handling and Apple Pencil on iPad Pro, you also benefit from faster touch handling on iPhone X.
If you don’t do anything, you’re just fine — only certain kinds of interaction are really improved by custom touch handling code, like drawing apps. And most likely Apple has optimized a bunch of the system touch handling code, like scroll views, gesture recognizers, the new swipe-to-Home and app switching gestures, etc., so your app would benefit from those for free.

Is there any way to keep touches alive in iOS?

I'm testing the touch detection capabilities of the iPad (specifically the iPad 3 at the moment). I'm printing out a log of detected touches to see what is going on, and using a bottle with three foam (touch-friendly) pads at the bottom.
The touches are detected fine and it logs any slight movements, which is great. The issue is that after a while, if the bottle isn't moved at all, all three touches are forcibly cancelled and remain undetected until the bottle is removed and placed back down.
So it seems there's a timeout on these touches... I have not found any specific information on this. Is there any way to revive or keep alive touches without having to physically remove and restore your touch points?
A capacitive touch sensor is a very sensitive instrument, and a lot of filtering is done before the OS reports touches to you. This is also the reason the iPad cannot detect touches with smaller contact areas than those of cheap styli - rubber domes about 7 mm in diameter. The touchscreen controller detects smaller touches alright, but filters them out to avoid spurious signals. Note that the touchscreen of a Samsung Galaxy will detect touches from a dull pencil - the screen is (for practical purposes) the same as the iPad's, but the controller's cutoff is lower.
I would expect that the self-calibrating logic inside the touch controller will calibrate your touches away. Basically, it will set the signal produced by your foam pads as the new normal. On the application side there is nothing that you can do to reverse this.
Disclaimer: I have no inside knowledge of what goes on inside the touchscreen controller. My answer is based on much reading and solid reasoning, nothing more.

GPS-based vs. beacon-based ranging: which governs the lock screen left-corner app icon?

There are two approaches for showing an app/app suggestion (in case it's not installed) on the iPhone lock screen / app switcher. One is GPS-based, in which iOS decides which app to show as a suggestion. The other is beacon-based, in which a particular beacon is identified.
If location services are enabled for multiple apps, and say all these apps are also using the beacon-based approach to show their icons in the lock screen's left corner, which app icon will iOS show?
Since location services are enabled for these apps, and say there is another relevant app that is NOT using the beacon-based approach (just the GPS-based approach), will iOS give preference to the beacon-based apps over this new GPS-based app?
For instance, Estimote’s NYC office is on the same block as an Equinox gym, and our phones intelligently and automatically alert us to use that app. It’s super easy and intuitive to open the app while walking into the gym - and, in the process, streamline the check-in flow with the gym’s front desk. However, because it solely uses GPS geofences, the accuracy is poor. We actually get the Equinox icon over a block away, and there is no control for the brands or stores (in this case Equinox) over how this appears.
Apple's suggestion of apps not installed on the phone based on proximity uses an undocumented technique. While I have verified it uses GPS as an input, I have never been able to confirm that beacons are used at all.
Regardless of whether beacons are used, because this is an undocumented feature, it is unlikely you will find a way to customize the behavior.
AFAIK, Apple has never shared the implementation details of how the lock screen icon AKA "suggested apps" feature works.
However, we did some experiments at Estimote and noticed that being inside a CLRegion (both the "GPS" CLCircularRegion, and CLBeaconRegion work) that an app monitors for via Core Location, consistently makes the app's icon show up on the lock screen. So it seems that both beacons and GPS location fall into the same mechanism that governs the location-based suggestions. (Note that in iOS 9, that's not just the lock screen icon, but also a bar at the bottom of the app switcher.)
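For reference, a minimal sketch of registering both kinds of region for monitoring via Core Location (the coordinate, radius, UUID, and identifiers below are placeholders):

```swift
import CoreLocation

class RegionMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()

    func startMonitoring() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()

        // "GPS" geofence: a circular region around a fixed coordinate.
        let center = CLLocationCoordinate2D(latitude: 40.7416, longitude: -73.9897) // placeholder
        let geofence = CLCircularRegion(center: center, radius: 100, identifier: "gym-geofence")
        locationManager.startMonitoring(for: geofence)

        // Beacon region: any beacon with the given proximity UUID.
        if let uuid = UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D") { // placeholder UUID
            let beaconRegion = CLBeaconRegion(proximityUUID: uuid, identifier: "gym-beacons")
            locationManager.startMonitoring(for: beaconRegion)
        }
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Being inside either kind of region is what appeared to drive the suggested-app icon.
        print("Entered region: \(region.identifier)")
    }
}
```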
Unfortunately, we weren't able to establish what happens if you're inside multiple qualifying CLRegions, belonging to different apps. We suspect it might have something to do with the order in which the apps register regions for monitoring, but were never able to get consistent results.
Furthermore, since this whole behavior is undocumented, Apple can change it at any time. Just something to be aware of.
Side note: handoff always trumps suggested apps.

Track device orientation when orientation is locked

I need to track device orientation even though device orientation is locked to Portrait mode. What I really need is behaviour similar to the Instagram camera view: when you rotate the device, it rotates the buttons over the camera view, even when orientation is locked.
I used to track orientation with UIDeviceOrientationDidChangeNotification, but that is not fired when device orientation is locked :(
Is there perhaps an implementation somewhere using the accelerometer and/or gyroscope? I'm surprised I couldn't find something like that.
Use the accelerometer to detect device orientation yourself. You can use the Core Motion framework to get the data you need. There's a sample snippet in the linked docs that shows how to get the data. Use a low-pass filter to isolate the force of gravity from relatively short-term changes due to user movement. Apple has a sample project called AccelerometerGraph that demonstrates this.
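Sketched out, that approach might look something like the following - CMMotionManager feeding a simple low-pass filter, with arbitrary values for the smoothing factor and update interval, and a callback you could use to rotate your buttons. The orientation mapping is an assumption to verify on a device:

```swift
import UIKit
import CoreMotion

class OrientationTracker {
    private let motionManager = CMMotionManager()
    private var gravity = CMAcceleration(x: 0, y: 0, z: 0)
    private var lastOrientation: UIDeviceOrientation = .portrait

    /// Called whenever the filtered orientation changes; rotate your buttons here.
    var onOrientationChange: ((UIDeviceOrientation) -> Void)?

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.1          // 10 Hz, arbitrary
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }

            // Low-pass filter: keep the slowly changing gravity component and
            // discard short-term movement. Alpha is an arbitrary smoothing factor.
            let alpha = 0.1
            self.gravity.x = alpha * a.x + (1 - alpha) * self.gravity.x
            self.gravity.y = alpha * a.y + (1 - alpha) * self.gravity.y

            // Decide which edge of the device currently points toward the ground.
            let orientation: UIDeviceOrientation
            if abs(self.gravity.y) >= abs(self.gravity.x) {
                orientation = self.gravity.y < 0 ? .portrait : .portraitUpsideDown
            } else {
                // Assumed mapping: home button on the right reads roughly x ≈ -1,
                // home button on the left reads roughly x ≈ +1.
                orientation = self.gravity.x < 0 ? .landscapeLeft : .landscapeRight
            }
            if orientation != self.lastOrientation {
                self.lastOrientation = orientation
                self.onOrientationChange?(orientation)
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
```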

Detecting continuous/repeated shaking

An app I'm writing requires the user to shake their device for X seconds.
I tried doing this via motionBegan and it works sometimes. But sometimes either motionEnded or motionCancelled gets called in the middle of the shake process, and motionBegan doesn't get called again unless you stop shaking completely. And there doesn't seem to be a way to detect whether the device is currently shaking.
There are a number of apps in the App Store that do this successfully, so there's obviously something I'm missing.
Motion events are discrete: Once iOS detects the device has been shaken, it sends the corresponding event and that's that—you have no way to tell it you're interested in long or short shakes. In fact, the documentation here says:
An event is canceled if the shake motion is interrupted or if iOS determines that the motion is not valid after all—for example, if the shaking lasts too long.
If the basic shake motion events aren't adequate for your application, you'll need to implement your own custom shake detection using accelerometer data. This answer is a good place to start.
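One possible custom approach, sketched below with arbitrary thresholds (not the only way to do it): sample the accelerometer directly and treat the device as shaking for as long as acceleration spikes keep arriving, resetting the timer once they stop.

```swift
import Foundation
import CoreMotion

class ShakeDetector {
    private let motionManager = CMMotionManager()
    private var shakeStart: Date?   // when the current bout of shaking began
    private var lastSpike: Date?    // last time acceleration exceeded the threshold

    /// Called once the user has shaken the device continuously for `duration` seconds.
    var onSustainedShake: (() -> Void)?

    func start(duration: TimeInterval = 3.0,
               spikeThreshold: Double = 1.8,   // in g; tune experimentally
               maxGap: TimeInterval = 0.4) {   // longest allowed pause between spikes
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            let now = Date()
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)   // ~1 g at rest

            if magnitude > spikeThreshold {
                // Vigorous movement: note the spike and start timing if we haven't yet.
                self.lastSpike = now
                if self.shakeStart == nil { self.shakeStart = now }
            } else if let last = self.lastSpike, now.timeIntervalSince(last) > maxGap {
                // No spikes for a while: the user stopped shaking, so reset.
                self.shakeStart = nil
                self.lastSpike = nil
            }

            if let start = self.shakeStart, now.timeIntervalSince(start) >= duration {
                self.shakeStart = nil
                self.lastSpike = nil
                self.onSustainedShake?()
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
```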
