iOS Motion Detection: Motion Detection Sensitivity Levels

I'm trying to detect when a user shakes the iPhone. I have the standard code in place to detect the motion, and it works fine. However, testing on my actual phone, I've realized that you have to shake the device quite hard for the motion detection to trigger. I would like to know if there is a way to implement a level of sensitivity checking, for example, a way to detect whether a user shakes the device lightly, or somewhere between a light and a hard shake. This is targeted at iOS 7, so any tips or advice that isn't deprecated from older iOS versions would be greatly appreciated. I've done my googling but have yet to find any good solutions to this problem (if there are any).
Thanks!
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
{
    if (motion == UIEventSubtypeMotionShake)
    {
        // Detected motion, do something about it
        // at this point.
    }
}

- (BOOL)canBecomeFirstResponder
{
    return YES;
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

Here is the solution I found. It works well, but you do have to play with the deviceMotionUpdateInterval value as well as the accelerationThreshold, and it can be tricky to strike a fine balance between an actual "light shake" and "picking up the phone and moving it closer to your face", etc. There might be better ways, but here is one to start. Inside my viewDidLoad I did something like this:
#import <CoreMotion/CoreMotion.h> // do not forget to link the CoreMotion framework to your project

#define accelerationThreshold 0.30 // or whatever is appropriate - play around with different values

// Note: declare motionManager as a strong property (or ivar) of the view
// controller; a local variable would be deallocated when viewDidLoad
// returns and the updates would stop (see the last answer on this page).
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1;
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error)
    {
        [self motionMethod:motion];
    }];
}

- (void)motionMethod:(CMDeviceMotion *)deviceMotion
{
    CMAcceleration userAcceleration = deviceMotion.userAcceleration;
    if (fabs(userAcceleration.x) > accelerationThreshold
        || fabs(userAcceleration.y) > accelerationThreshold
        || fabs(userAcceleration.z) > accelerationThreshold)
    {
        // Motion detected, handle it with method calls or additional
        // logic here.
        [self foo];
    }
}

This is a Swift version based on zic10's answer, with the addition of a flag that prevents a few extra calls to your motion handler, which you can get even when the first line in that handler is motionManager.stopDeviceMotionUpdates().
Also, a value of around 3.0 can be useful if you want to ignore a shake but detect a bump. I found 0.3 way too low; it ended up being more like "detect move". In my tests, the ranges were more like:
0.75 - 2.49 is a better range for shake sensitivity
2.5 - 5.0 is a good range for "ignore shake, detect bump"
Here is the complete view controller for an Xcode single VC template:
import UIKit
import CoreMotion

class ViewController: UIViewController {

    lazy var motionManager: CMMotionManager = {
        return CMMotionManager()
    }()

    let accelerationThreshold = 3.0
    var handlingShake = false

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        handlingShake = false
        motionManager.startDeviceMotionUpdatesToQueue(NSOperationQueue.currentQueue()!) { [weak self] (motion, error) in
            if
                let userAcceleration = motion?.userAcceleration,
                let _self = self {
                print("\(userAcceleration.x) / \(userAcceleration.y)")
                if fabs(userAcceleration.x) > _self.accelerationThreshold
                    || fabs(userAcceleration.y) > _self.accelerationThreshold
                    || fabs(userAcceleration.z) > _self.accelerationThreshold {
                    // Only handle the first over-threshold sample; a few
                    // extra updates can still arrive after stopping.
                    if !_self.handlingShake {
                        _self.handlingShake = true
                        _self.handleShake()
                    }
                }
            } else {
                print("Motion error: \(error)")
            }
        }
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)
        // or wherever appropriate
        motionManager.stopDeviceMotionUpdates()
    }

    func handleShake() {
        performSegueWithIdentifier("showShakeScreen", sender: nil)
    }
}
It's also worth noting that CoreMotion cannot be tested in the simulator. Because of this constraint, you may still find it worthwhile to additionally implement the UIDevice method of detecting a motion shake. This allows you to manually test shake in the simulator, or to give UITests access to shake for testing or for tools like fastlane's snapshot. Something like:
class ViewController: UIViewController {

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func canBecomeFirstResponder() -> Bool {
        return true
    }

    override func motionEnded(motion: UIEventSubtype, withEvent event: UIEvent?) {
        if TARGET_OS_SIMULATOR != 0 {
            if event?.subtype == .MotionShake {
                // do stuff
            }
        }
    }
}
And then use Ctrl-Cmd-Z to test shake in the simulator.

Use Core Motion.
Link your binary with the CoreMotion framework and include
#import <CoreMotion/CoreMotion.h>
in your class. Create an instance of CMMotionManager, set its deviceMotionUpdateInterval property to a suitable value, and then call startDeviceMotionUpdatesToQueue. You will get continuous updates inside the block, including acceleration, magnetic field, rotation rate, and so on, so you can extract the data you require. One thing to take care of: if the interval is too small, the updates arrive very rapidly, and you will have to employ suitable logic to handle them.
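For reference, here is a minimal Swift sketch of those steps (the 0.1-second interval and the 1.0 threshold are illustrative values to experiment with, not recommendations):

import UIKit
import CoreMotion

class MotionViewController: UIViewController {
    // Keep a strong reference; a local manager would be deallocated
    // when viewDidLoad returns and the updates would stop.
    let motionManager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 0.1 // illustrative value
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // userAcceleration excludes gravity, so it reflects user movement only.
            let a = motion.userAcceleration
            if max(abs(a.x), abs(a.y), abs(a.z)) > 1.0 { // illustrative threshold
                print("strong motion detected")
            }
        }
    }
}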

Here's how I did this using Swift 3.
Import CoreMotion and create an instance:
import CoreMotion
let motionManager = CMMotionManager()
In viewDidLoad, or wherever you want to start checking for updates:
motionManager.startDeviceMotionUpdates(to: OperationQueue.current!, withHandler: {
    deviceManager, error in
    if error == nil {
        if let mgr = deviceManager {
            self.handleMotion(rate: mgr.rotationRate)
        }
    }
})
This function takes the rotation rate and sums the absolute values of the x, y, and z movements:
func handleMotion(rate: CMRotationRate) {
    let totalRotation = abs(rate.x) + abs(rate.y) + abs(rate.z)
    if totalRotation > 20 { // play around with the number 20 to find the optimal level for your case
        start()
    } else {
        print(totalRotation)
    }
}

func start() {
    // The function you want to trigger when the device is rotated
}

Related

proximityMonitoring may not be working as intended

For my iOS app I want to implement a feature where the screen turns off (like when you answer a phone call) when the device is face down.
So I've started by detecting the device orientation:
// in my viewDidLoad
NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(self.rotated(_:)), name: UIDeviceOrientationDidChangeNotification, object: nil)

// called when the device changes orientation
func rotated(notification: NSNotification) {
    if UIDevice.currentDevice().orientation == UIDeviceOrientation.FaceDown {
        print("device = faced down")
    } else {
        print("device != faced down")
    }
}
When the device is face down I call
UIDevice.currentDevice().proximityMonitoringEnabled = true
and otherwise
UIDevice.currentDevice().proximityMonitoringEnabled = false
The problem is that UIDeviceOrientationDidChangeNotification seems to act a little late: by the time the rotated() function is called, the device is already face down, and it turns out that for proximityMonitoringEnabled = true to turn off the screen, the proximity sensor must not already be covered!
I'm pretty sure this is an Apple limitation, but maybe someone out there has found a solution or come across a workaround!
Thanks in advance.
Approach:
Since iOS doesn't provide a notification before the orientation changes, we can't rely on UIDeviceOrientationDidChangeNotification. Instead we can use the CoreMotion framework and read the hardware's gyroscope to detect likely face-down orientations, then set proximityMonitoringEnabled appropriately.
Gyroscope Data:
From gyroscope observations, values in the ranges below can plausibly indicate a face-down orientation.
let gyroData = (minX:-3.78, minY:-3.38, minZ:-5.33, maxX:3.29, maxY:4.94, maxZ:3.36)
Solution in Swift:
class ProximityViewController: UIViewController {

    let cmManager = CMMotionManager(), gyroData = (minX:-3.78, minY:-3.38, minZ:-5.33, maxX:3.29, maxY:4.94, maxZ:3.36)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Using GyroMotion
        experimentCoreMotion()
    }

    // MARK: - Using Core Motion
    func experimentCoreMotion() {
        if cmManager.gyroAvailable {
            // Enable device orientation notification
            UIDevice.currentDevice().beginGeneratingDeviceOrientationNotifications()
            cmManager.gyroUpdateInterval = 0.1
            handleFaceDownOrientation()
        }
    }

    func handleFaceDownOrientation() {
        cmManager.startGyroUpdatesToQueue(NSOperationQueue.currentQueue()!, withHandler: { (data: CMGyroData?, error: NSError?) in
            if self.isGyroDataInRange(data!) {
                UIDevice.currentDevice().proximityMonitoringEnabled = (UIDevice.currentDevice().orientation == .FaceDown)
                if UIDevice.currentDevice().orientation == .FaceDown { print("FaceDown detected") }
                else { print("Orientation is not facedown") }
            }
        })
    }

    func isGyroDataInRange(val: CMGyroData) -> Bool {
        return ((val.rotationRate.x > gyroData.minX && val.rotationRate.x < gyroData.maxX) &&
                (val.rotationRate.y > gyroData.minY && val.rotationRate.y < gyroData.maxY) &&
                (val.rotationRate.z > gyroData.minZ && val.rotationRate.z < gyroData.maxZ))
    }
}
Hope my solution solves your query; let me know if it works for your requirement.
Gyroscope & Accelerometer Observations:
I've experimented with the possible values for the face-down orientation using both the gyroscope and the accelerometer. IMO the gyroscope data works fine, but it's worth exploring the other hardware sensors for detecting a face-down orientation.

Switching Cameras slow in AVCaptureSession

I've looked at many other questions like this, and tried a lot of the solutions, but this case is a bit different. I'm using AVCaptureVideoDataOutputSampleBufferDelegate so that I can apply CIFilters to the live video feed. I'm using the following method to change cameras:
func changeCameras() {
    captureSession.stopRunning()
    var desiredPosition: AVCaptureDevicePosition?
    if front {
        desiredPosition = AVCaptureDevicePosition.Back
    } else {
        desiredPosition = AVCaptureDevicePosition.Front
    }
    let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as? [AVCaptureDevice]
    for device in devices! {
        if device.position == desiredPosition {
            self.captureSession.beginConfiguration()
            do {
                let input = try AVCaptureDeviceInput(device: device)
                for oldInput in self.captureSession.inputs {
                    print(oldInput)
                    self.captureSession.removeInput(oldInput as! AVCaptureInput)
                }
                print(input)
                self.captureSession.addInput(input)
                self.captureSession.commitConfiguration()
                dispatch_async(dispatch_get_main_queue(), { () -> Void in
                    self.captureSession.startRunning()
                })
            } catch { print("evic failed") }
        }
    }
    front = !front
}
The methods that I am using to set up the camera (called in viewDidLoad) and receive the sampleBuffer from the delegate are here: https://gist.github.com/JoeyBodnar/17e22e3c04093caa54cf240ed8b1b601.
One problem is that when pressing the button to change cameras, the screen freezes for a solid 4-5 seconds before switching. I've tried the above method, as well as running the entire function on a separate queue, and it still takes a long time. I've never had this problem when switching cameras using just a regular AVCaptureVideoPreviewLayer, so I think this may be caused in part by the fact that I'm using the sample buffer delegate, but I can't quite piece together how or why. Any help is appreciated. Thanks!
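A common suggestion for this kind of freeze (a sketch under assumptions, not a verified fix for this exact setup) is to avoid stopRunning()/startRunning() entirely and swap the inputs inside a single beginConfiguration()/commitConfiguration() pair on a dedicated session queue; the session keeps running, so there is no multi-second restart. The switchCamera, sessionQueue, and currentInput names below are hypothetical, written in current Swift syntax:

import AVFoundation

// Hypothetical helper; sessionQueue and currentInput are assumed to be
// owned and tracked by the caller.
func switchCamera(session: AVCaptureSession,
                  sessionQueue: DispatchQueue,
                  currentInput: AVCaptureDeviceInput,
                  completion: @escaping (AVCaptureDeviceInput?) -> Void) {
    sessionQueue.async {
        let desired: AVCaptureDevice.Position =
            currentInput.device.position == .back ? .front : .back
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: desired),
              let newInput = try? AVCaptureDeviceInput(device: device) else {
            completion(nil)
            return
        }
        // Swap inputs without stopping the session.
        session.beginConfiguration()
        session.removeInput(currentInput)
        if session.canAddInput(newInput) {
            session.addInput(newInput)
        }
        session.commitConfiguration()
        completion(newInput)
    }
}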

Check if 3D touch is supported and enabled on the iOS9 device

I tried to access the trait collection and check forceTouchCapability, but it simply checks whether the device is running iOS 9.0 or greater.
So, on any device with iOS 9, force touch appears 'available'. I need a way to check whether 3D Touch is actually supported on the user's device (iPhone 6s), and I need to make sure that the 3D Touch option is actually enabled in the accessibility settings.
I was accidentally casting forceTouchCapability to a BOOL (using it as a return value to my method that was set to return a boolean). I needed to check if forceTouchCapability was equal to UIForceTouchCapabilityAvailable.
Instead of:
return [[MyView traitCollection] forceTouchCapability];
I need:
return [[MyView traitCollection] forceTouchCapability] == UIForceTouchCapabilityAvailable;
If you implement this in a UIViewController, the timing matters. Checking in viewDidLoad will return Unknown, even though it will return Available later in the lifecycle.
- (void)traitCollectionDidChange:(nullable UITraitCollection *)previousTraitCollection {
    [self checkForForceTouch];
}

- (void)checkForForceTouch {
    if ([self.traitCollection respondsToSelector:@selector(forceTouchCapability)] &&
        self.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
        NSLog(@"Force touch found");
    }
}
Sometimes we just want the forceTouchCapability value right now, synchronously, without waiting for the traitCollectionDidChange: event. In that case, we can use a pretty stupid function like this:
public func forceTouchCapability() -> UIForceTouchCapability {
    return UIApplication.sharedApplication().keyWindow?.rootViewController?.traitCollection.forceTouchCapability ?? .Unknown
}
if ([MyView respondsToSelector:@selector(traitCollection)] &&
    [MyView.traitCollection respondsToSelector:@selector(forceTouchCapability)] &&
    MyView.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
    return YES;
}
My point is, if you are supporting iOS 7 and iOS 8 as well, remember to check both conditions: [MyView respondsToSelector:@selector(traitCollection)] and [MyView.traitCollection respondsToSelector:@selector(forceTouchCapability)].
If you keep only the first check, the app works fine on iOS 7 but crashes on iOS 8.
Basically, Apple introduced the traitCollection property in iOS 8 but added the forceTouchCapability property only in iOS 9.
from UITraitCollection.h:
@property (nonatomic, readonly) UITraitCollection *traitCollection NS_AVAILABLE_IOS(8_0);
@property (nonatomic, readonly) UIForceTouchCapability forceTouchCapability NS_AVAILABLE_IOS(9_0);
PS: I learnt this the hard way, after the app started to crash on the App Store.
Swift 4.0 and 4.1.
import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Check whether the device supports the feature
        if self.traitCollection.forceTouchCapability == .available {
            // Enable 3D Touch feature here
        } else {
            // Fall back to other non-3D feature
        }
    }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Update the app's 3D Touch support
        if self.traitCollection.forceTouchCapability == .available {
            // Enable 3D Touch feature here
        } else {
            // Fall back to other non-3D feature
        }
    }
}
Following Valentin Shergin's answer, updated for Swift 4.2, if you need a sync call:
func forceTouchCapability() -> UIForceTouchCapability {
    return UIApplication.shared.keyWindow?.rootViewController?.traitCollection.forceTouchCapability ?? .unknown
}

func hasForceTouchCapability() -> Bool {
    return forceTouchCapability() == .available
}

How to Implement shake gestures in an Apple Watch application?

The WatchKit reference seems to make no mention of it. Have I missed something, or is it really not possible to implement a shake gesture in an Apple Watch application?
The following is a typical example of a shake gesture implementation on iOS:
// MARK: Gestures
override func motionEnded(motion: UIEventSubtype, withEvent event: UIEvent) {
    if event.subtype == UIEventSubtype.MotionShake {
        // do something
    }
}
No, it is not possible to do anything involving UIEvents in WatchKit right now. With the current "remoted UI" approach, you mostly just tell the watch how to use the pre-arranged UI from the storyboard and react to actions like tapping a button or a table row. According to Apple, support for running a lot more code on the watch is coming later this year.
Update: Native apps are now possible with watchOS 2, so this functionality may be present.
This is now possible in watchOS 2 with CMMotionManager, part of CoreMotion.
You can use this workaround:
let motionManager = CMMotionManager()

if motionManager.accelerometerAvailable {
    let handler: CMAccelerometerHandler = { (data: CMAccelerometerData?, error: NSError?) -> Void in
        if data!.acceleration.z > 0 {
            // User did shake
        }
    }
    motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.currentQueue()!, withHandler: handler)
}
As Roger says, a workaround is to use CoreMotion: you can use Core Motion to get movement data. I built a simple wrapper for this, which is on my GitHub.

Why am I not getting any accelerometer update?

I am trying to get accelerometer updates with CoreMotion and Swift. Here is what I placed in my viewDidLoad:
override func viewDidLoad() {
    super.viewDidLoad()
    let motionManager = CMMotionManager()
    motionManager.accelerometerUpdateInterval = 0.2
    motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.currentQueue()) {
        (info: CMAccelerometerData!, error: NSError!) in
        if error != nil {
            println(error)
        } else {
            println("OK")
        }
    }
}
The problem is that it looks like my closure never gets called (I don't see anything in the console). Do you know why?
The problem is that the variable motionManager, to which your CMMotionManager instance is assigned, is declared as a local variable (in the body of the function viewDidLoad), which means that it goes out of existence when the function finishes executing. Therefore its lifetime is about a 10000th of a second.
Well, that is not long enough for your CMMotionManager to obtain very many updates!
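The usual fix is to promote the motion manager to a property so that it outlives viewDidLoad. A minimal sketch, shown in later Swift syntax (the original question uses the older API names):

import UIKit
import CoreMotion

class ViewController: UIViewController {
    // Stored as a property, the manager survives after viewDidLoad returns.
    let motionManager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: .main) { data, error in
            if let error = error {
                print(error)
            } else if data != nil {
                print("OK") // now fires every 0.2 seconds
            }
        }
    }
}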
