Detect when a chopping motion has been made - Swift - iOS

I'm starting with my first app for iOS and I am trying to get gyro data to play a whip sound when you flick your phone.
From what I can tell, I should be using CoreMotion to get the state of the gyro, then doing some math to work out when a whip-like gesture is made, and then running my function. Is that right?
This is what I have so far - this is my ContentView.swift file.
import SwiftUI
import AVFoundation
import CoreMotion

let popSound = Bundle.main.url(forResource: "whip", withExtension: "mp3")
var audioPlayer = AVAudioPlayer()
var motionManager: CMMotionManager!

func audioPlayback() {
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: popSound!)
        audioPlayer.play()
    } catch {
        print("couldn't load sound file")
    }
}

struct ContentView: View {
    var body: some View {
        Text("Press button!")
        Button(action: {
            audioPlayback()
        }, label: {
            Text("Press me!")
        })
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
Currently it's set to a button. Can someone link me to a resource, or walk me through this?

Usually when dealing with such devices you can either get a current snapshot or request a feed of snapshot changes. In your case the promising methods are startAccelerometerUpdates and startDeviceMotionUpdates on CMMotionManager. I am pretty sure there is enough information in there to detect the gesture you are describing.
If you dig into these methods you will see that you get a feed of "frames", where each frame describes the situation at a certain time.
Since you are detecting a gesture, you are not interested in a single frame but rather in a series of frames. So probably the first thing you need is some object to which you can append frames, and this object will evaluate whether the current set of frames corresponds to your gesture or not. It should also be able to discard data which is not interesting; for instance, frames older than 3 seconds can be discarded, as this gesture should never need more than 3 seconds.
So now your problem is split into 2 parts. The first part is creating an object that is able to collect frames. I would give it a public method like appendFrame or appendSnapshot, and then keep feeding frames to it. The object also needs to be able to report back that it has detected the required gesture, so that you can play a sound at that point. Without the detection you can mock it: for instance, after every 100 frames clear the buffer and report a detection, which then triggers the sound. So no detection at this point, but everything else, as in the sketch below.
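A minimal sketch of such a collector, with the detection mocked out exactly like that (the type and callback names here are mine, not from any framework):
import Foundation

// Hypothetical collector: appends frames, discards stale ones and, for now,
// mocks detection by reporting a "hit" after every 100 frames.
final class GestureFrameCollector<Frame> {
    private var frames: [(timestamp: TimeInterval, frame: Frame)] = []
    var onGestureDetected: (() -> Void)?

    func appendFrame(_ frame: Frame, timestamp: TimeInterval) {
        frames.append((timestamp, frame))
        frames.removeAll { timestamp - $0.timestamp > 3 } // The gesture never needs more than 3 seconds
        if frames.count >= 100 { // Mocked detection; replace with real evaluation later
            frames.removeAll()
            onGestureDetected?()
        }
    }
}
Hooking it up is then just collector.onGestureDetected = { audioPlayback() }.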
The second part is the detection itself. You now have a pool of samples, frames, or snapshots, and you can aggregate the data any way you want at any time. You would probably use a secondary thread to process the data so the UI does not lag, and to be able to throttle how much CPU power you put into it. As for the detection itself, I would say you should create some samples and try to figure out the "math" part. When you have some idea, or can at least present the community with some recordings, you could ask another, more specific question about that. It does look like a textbook example for machine learning, for instance.
From a mathematical point of view there may be some shortcuts. A very simple example would be just looking at the direction of your device as a normalized direction (x, y, z). I think you can already get that very easily from native components. In a "chopping" motion we expect that the rotation suddenly (nearly) stopped and was recently (nearly) 90 degrees offset from the current direction.
Speed:
Assuming you have an array of directions such as let directions: [(x: CGFloat, y: CGFloat, z: CGFloat)], you could identify rotation speed changes with the length of the cross product:
let rotationSpeed = length(cross(directions[index], directions[index+1]))
The speed will always be a value between 0 and 1, where a maximum of 1 would mean a 90-degree change between samples. Hopefully it never comes to that and you always see values between 0 and 0.3. If you DO get values larger than 0.5, then the frame rate of your device is too low and the samples are best discarded.
Using this approach you can map your rotations from an array of vectors to an array of speeds, rotationSpeeds: [Float], which is more convenient to work with. You are now looking for a place in this array where the rotation speed suddenly drops from a high value to a low value. What those values are you will need to test and tweak yourself. But a "sudden drop" should not be judged from only 2 sequential samples; you want to find, for instance, 5 high-speed frames followed by 2 low-speed frames, or rather even more than that.
Now that you have found such a point, you have a candidate for the end of your chop. From there you can go backwards and check all frames going back in time, up to somewhere between 0.5 and 1.0 seconds before the candidate (again a value you will need to try out yourself). If any of these frames is nearly 90 degrees away from the candidate, you have your gesture. Something like the following should do:
length(cross(directions[index], directions[candidateIndex])) > 0.5
where the 0.5 is again something you will need to test. The closer to 1.0 the more precise the gesture needs to be. I think 0.5 should be pretty good to begin with.
Perhaps you can play with the following and see if you can get satisfying results:
struct Direction {
    let x: Float
    let y: Float
    let z: Float

    static func cross(_ a: Direction, _ b: Direction) -> Direction {
        // Standard 3D cross product
        Direction(x: a.y*b.z - a.z*b.y, y: a.z*b.x - a.x*b.z, z: a.x*b.y - a.y*b.x)
    }

    var length: Float { (x*x + y*y + z*z).squareRoot() }
}
class Recording<Type> {
    private(set) var samples: [Type] = [Type]()
    func appendSample(_ sample: Type) { samples.append(sample) }
}
class DirectionRecording: Recording<Direction> {
    func convertToSpeedRecording() -> SpeedRecording {
        let recording = SpeedRecording()
        if samples.count > 1 { // Need at least 2 samples
            for index in 0..<samples.count-1 {
                recording.appendSample(Direction.cross(samples[index], samples[index+1]).length)
            }
        }
        return recording
    }
}
class SpeedRecording: Recording<Float> {
    // Returns an array of indices where a sudden drop occurred
    func detectSuddenDrops(minimumFastSampleCount: Int = 4,
                           minimumSlowSampleCount: Int = 2,
                           maximumThresholdSampleCount: Int = 2,
                           minimumSpeedTreatedAsHigh: Float = 0.1,
                           maximumSpeedThresholdTreatedAsLow: Float = 0.05) -> [Int] {
        var result: [Int] = [Int]()
        // Using states to identify where in the sequence we currently are.
        // The state should go none -> highSpeed -> lowSpeed
        // or none -> highSpeed -> thresholdSpeed -> lowSpeed
        enum State {
            case none
            case highSpeed(sequenceLength: Int)
            case thresholdSpeed(sequenceLength: Int)
            case lowSpeed(sequenceLength: Int)
        }
        var currentState: State = .none
        samples.enumerated().forEach { index, sample in
            if sample > minimumSpeedTreatedAsHigh {
                // Found a high speed sample
                switch currentState {
                case .none: currentState = .highSpeed(sequenceLength: 1) // Found a first high speed sample
                case .lowSpeed: currentState = .highSpeed(sequenceLength: 1) // From low speed to high speed resets it back to the high speed step
                case .thresholdSpeed: currentState = .highSpeed(sequenceLength: 1) // From threshold speed to high speed resets it back to the high speed step
                case .highSpeed(let sequenceLength): currentState = .highSpeed(sequenceLength: sequenceLength+1) // Append another high speed sample
                }
            } else if sample > maximumSpeedThresholdTreatedAsLow {
                // Found a sample somewhere between fast and slow
                switch currentState {
                case .none: break // Needs to go to high speed first
                case .lowSpeed: currentState = .none // Low speed back to threshold resets to the beginning
                case .thresholdSpeed(let sequenceLength):
                    if sequenceLength < maximumThresholdSampleCount { currentState = .thresholdSpeed(sequenceLength: sequenceLength+1) } // Can still stay inside the threshold
                    else { currentState = .none } // In threshold for too long. Resetting back to start
                case .highSpeed(let sequenceLength):
                    if sequenceLength >= minimumFastSampleCount { currentState = .thresholdSpeed(sequenceLength: 1) } // A first transition from high speed to threshold
                    else { currentState = .none } // Not enough high speed samples. Resetting back to start
                }
            } else {
                // A low speed sample found
                switch currentState {
                case .none: break // Waiting for a high speed sample sequence
                case .lowSpeed(let sequenceLength):
                    if sequenceLength < minimumSlowSampleCount { currentState = .lowSpeed(sequenceLength: sequenceLength+1) } // Not enough low speed samples yet
                    else { result.append(index); currentState = .none } // Got everything we need. This is a HIT
                case .thresholdSpeed: currentState = .lowSpeed(sequenceLength: 1) // Threshold can always go to low speed
                case .highSpeed(let sequenceLength):
                    if sequenceLength >= minimumFastSampleCount { currentState = .lowSpeed(sequenceLength: 1) } // High speed can go to low speed once it lasted long enough
                    else { currentState = .none } // Not enough high speed samples. Resetting back to start
                }
            }
        }
        return result
    }
}
func recordingContainsAChoppingGesture(recording: DirectionRecording, minimumAngleOffset: Float = 0.5, maximumSampleCount: Int = 50) -> Bool {
    let speedRecording = recording.convertToSpeedRecording()
    return speedRecording.detectSuddenDrops().contains { index in
        for offset in 1..<maximumSampleCount {
            let sampleIndex = index-offset
            guard sampleIndex >= 0 else { return false } // Cannot go back any further than that
            if Direction.cross(recording.samples[index], recording.samples[sampleIndex]).length > minimumAngleOffset {
                return true // Got it
            }
        }
        return false // Sample count drained
    }
}
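To feed this pipeline you could wire it up to CMDeviceMotion along these lines (a sketch, untested; it uses the gravity vector as the normalized device direction, and real code would also trim recording.samples so it does not grow forever):
import CoreMotion

let motionManager = CMMotionManager()
let recording = DirectionRecording()

func startDetecting() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let gravity = motion?.gravity else { return }
        // Gravity is already a normalized direction in device coordinates
        recording.appendSample(Direction(x: Float(gravity.x),
                                         y: Float(gravity.y),
                                         z: Float(gravity.z)))
        if recordingContainsAChoppingGesture(recording: recording) {
            audioPlayback() // The whip sound function from the question
        }
    }
}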

Related

SpriteKit: stop a spinning wheel at a defined angle

I have a spinning wheel rotating at an angular speed ω, no acceleration involved, implemented with SpriteKit.
When the user pushes a button I need to slowly decelerate the wheel from the current angle ∂0 and end up at a specified angle (let's call it ∂f).
I gave the wheel an associated mass of 2.
I already tried angularDamping and SKAction.rotate(toAngle:duration:), but they do not fit my needs because:
With angularDamping I cannot easily specify the angle ∂f where I want to end up.
With SKAction.rotate(toAngle:duration:) I cannot start slowing down from the current rotation speed, and it doesn't behave naturally.
The only remaining approach I tried is using SKAction.applyTorque(duration:).
This sounds interesting, but I have problems calculating the correct torque to apply, especially the inertia and radius of the wheel.
Here is my approach:
I'm taking the starting angular velocity ω as:
wheelNode.physicsBody?.angularVelocity.
I'm taking the mass from wheelNode.physicsBody?.mass
The time t is a constant of 10 (meaning that in 10 seconds I want the wheel to decelerate to the final angle ∂f).
The deceleration I calculated as:
let a = -1 * ω / t
The inertia should be: let I = 1/2 * mass * pow(r, 2) (see the note regarding the radius below).
Then, finally, I calculated the torque to apply as: let torque = I * a (taking care that it is opposite to the current angular speed of the wheel).
NOTE:
Since it is not clear to me how to obtain the radius of the wheel, I tried to grab it in two ways:
from the wheelNode.physicsBody?.area, as let r = sqrt(wheelNode.physicsBody?.area ?? 0 / .pi)
by converting from pixels to meters as the area documentation says. Then I have let r = self.wheelNode.radius / 150.
Funny: I obtain 2 different values :(
UNFORTUNATELY, something in this approach is not working: so far I have no idea how to end up at the specified angle, and the wheel doesn't stop as it should anyway (either the torque is too strong and it spins in the other direction, or it is not enough). So the applied torque also seems to be wrong.
Do you know a better way to achieve the result I need? Is that the correct approach? If yes, what's wrong with my calculations?
Kinematics makes my head hurt, but here you go. I made it so that you can input the number of rotations, and the wheel will rotate that many times as it slows down to the angle you specify. The other function and extension are there to keep the code relatively clean/readable, so if you just want one giant mess of a function, go ahead and modify it.
• Make sure the node's angularDampening = 0.0
• Make sure the node has a circular physicsbody
// Stops a spinning SpriteNode at a specified angle within a certain amount of rotations
// NOTE: Node must have a circular physics body
// Damping should be from 0.0 to 1.0
func decelerate(node: SKSpriteNode, toAngle: CGFloat, rotations: Int) {
    if node.physicsBody == nil { print("Node doesn't have a physicsbody"); return } // Avoid crash in case node's physicsbody is nil
    var cw: CGFloat { if node.physicsBody!.angularVelocity < CGFloat(0.0) { return -1.0 } else { return 1.0 } } // Direction: -1 for clockwise, +1 for counterclockwise
    let m = node.physicsBody!.mass // Mass
    let r = (node.physicsBody!.area / CGFloat.pi).squareRoot() // Radius
    let i = 0.5 * m * r.squared // Inertia
    let wi = node.physicsBody!.angularVelocity // Initial angular velocity
    let wf: CGFloat = 0 // Final angular velocity
    let ti = CGFloat.unitCircle(node.zRotation) // Initial theta
    var tf = CGFloat.unitCircle(toAngle) // Final theta
    // Correction constant based on rate of rotation, since there seems to be a delay
    // between when the action is calculated and when it is run.
    // Without the correction the node stops a little off from its desired stop angle.
    tf -= 0.00773889 * wi // Might need to change this constant
    let dt = deltaTheta(ti, tf, Int(cw), rotations)
    let a = -cw * 0.5 * wi.squared / abs(dt) // Angular acceleration - cw used to determine direction
    print("A:\(a)")
    let time: Double = Double(abs((wf - wi) / a)) // Time needed to stop
    let torque: CGFloat = i * a // Torque needed to stop
    node.run(SKAction.applyTorque(torque, duration: time))
}

func deltaTheta(_ ti: CGFloat, _ tf: CGFloat, _ clockwise: Int, _ rotations: Int) -> CGFloat {
    let extra = CGFloat(rotations) * 2 * CGFloat.pi
    if clockwise == -1 {
        if tf > ti { return tf - ti - 2 * CGFloat.pi - extra } else { return tf - ti - extra }
    } else {
        if tf > ti { return tf - ti + extra } else { return tf + 2 * CGFloat.pi + extra - ti }
    }
}
extension CGFloat {
    public var squared: CGFloat { return self * self }
    public static func unitCircle(_ value: CGFloat) -> CGFloat {
        if value < 0 { return 2 * CGFloat.pi + value }
        else { return value }
    }
}
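Usage would then look something like this (assuming wheelNode already has a circular physics body and is spinning):
// Decelerate the wheel so it stops at π/2 after two extra full rotations
decelerate(node: wheelNode, toAngle: CGFloat.pi / 2, rotations: 2)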

SKEmitterNode with AVAudioPlayer for music visuals

PLEASE SOMEONE HELP!
I want to have my SKEmitterNode's scale (meaning size) get larger and smaller with the music I have built into the application using AVAudioPlayer. Right now this is pretty much all I have for the SKEmitterNode, and it looks great:
beatParticle?.position = CGPoint(x: self.size.width * 0.5, y: self.size.height * 0.5)
var beatParticleEffectNode = SKEffectNode()
beatParticleEffectNode.addChild(beatParticle!)
self.addChild(beatParticleEffectNode)
All the looks are done in the .sks file.
Here is where I set up the CADisplayLink that calls the "updateBeatParticle" function in a continual loop; that function is where I will put my code for making the particle's scale (meaning size) get larger and smaller with the music.
var dpLink: CADisplayLink?
dpLink = CADisplayLink(target: self, selector: "updateBeatParticle")
dpLink?.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSRunLoopCommonModes)

func updateBeatParticle() {
    //Put code here
}
Any idea how I can do this? I looked at some tutorials, such as this one: https://www.raywenderlich.com/36475/how-to-make-a-music-visualizer-in-ios
However, I can't quite get my head around it because they're using an emitterLayer and it's in Obj-C. I am also interested in any other ideas you wonderful people may have!
WARNING: The following code has not been tested. Please let me know if it works.
Firstly, it looks like you are using SpriteKit, so you could put the code needed to alter the emitter scale in the SKScene method update:, which automatically gets called virtually as often as a CADisplayLink.
Essentially all you need to do is update the emitter scale in the update: method based on the volume of the channels of your AVAudioPlayer. Note that the audio player may have multiple channels running, so you need to average out the average power for each.
Firstly...
player.meteringEnabled = true
Set this after you initialise your audio player, so that it will monitor the levels of the channels.
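For instance, the setup might look like this (a sketch in the same Swift 2-era style as the rest of the answer; the resource name is a placeholder):
let url = NSBundle.mainBundle().URLForResource("song", withExtension: "mp3")! // hypothetical file name
let audioPlayer = try! AVAudioPlayer(contentsOfURL: url)
audioPlayer.meteringEnabled = true // allow level metering
audioPlayer.play()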
Next, add something like this in your update method.
override func update(currentTime: CFTimeInterval) {
    var scale: CGFloat = 0.5
    if audioPlayer.playing { // Only do this if the audio is actually playing
        audioPlayer.updateMeters() // Tell the audio player to update and fetch the latest readings
        let channels = audioPlayer.numberOfChannels
        var power: Float = 0
        // Loop over each channel and add its average power
        for i in 0..<channels {
            power += audioPlayer.averagePowerForChannel(i)
        }
        power /= Float(channels) // This will give the average power across all the channels in decibels
        // Convert power in decibels to a more appropriate percentage representation
        scale = CGFloat(getIntensityFromPower(power))
    }
    // Set the particle scale to match
    emitterNode.particleScale = scale
}
The getIntensityFromPower method converts the power in decibels to a more appropriate percentage representation. It can be declared like so...
// Will return a value between 0.0 ... 1.0, based on the decibels
func getIntensityFromPower(decibels: Float) -> Float {
    // The minimum possible decibel returned from an AVAudioPlayer channel
    let minDecibels: Float = -160
    // The maximum possible decibel returned from an AVAudioPlayer channel
    let maxDecibels: Float = 0
    // Clamp the decibels value
    if decibels < minDecibels {
        return 0
    }
    if decibels >= maxDecibels {
        return 1
    }
    // This value can be adjusted to affect the curve of the intensity
    let root: Float = 2
    let minAmp = powf(10, 0.05 * minDecibels)
    let inverseAmpRange: Float = 1.0 / (1.0 - minAmp)
    let amp: Float = powf(10, 0.05 * decibels)
    let adjAmp = (amp - minAmp) * inverseAmpRange
    return powf(adjAmp, 1.0 / root)
}
The algorithm for this conversion was taken from this StackOverflow response https://stackoverflow.com/a/16192481/3222419.

SpriteKit how to repeat an SKAction after a VARYING random number of animation loops each time?

I have an SKSpriteNode with texture animations. I basically have a character idle cycle of 4 frames and a blink animation sequence. I want to loop the character idle cycle forever but make it play the blink animation sequence at random intervals.
I have the following code;
func playIdle() {
    let idle_loop = SKAction.repeatAction(action_textureSequence_idle!, count: randomLoopCount())
    let sequence = SKAction.sequence([idle_loop, action_textureSequence_blink!])
    let repeatSequence = SKAction.repeatActionForever(sequence)
    runAction(repeatSequence)
}

func randomLoopCount() -> Int {
    return Int(arc4random_uniform(10) + 2)
}
The problem with the above is that the random number is only generated once, so the blink does not happen randomly at all, just after the same number of loops each time. How do I achieve the effect I'm looking for?
You can use recursion to achieve what you want:
func playIdle() {
    let idle_loop = SKAction.repeatAction(action_textureSequence_idle, count: Int(arc4random_uniform(10) + 2))
    let sequence = SKAction.sequence([idle_loop,
                                      action_textureSequence_blink,
                                      SKAction.runBlock({ [unowned self] in self.playIdle() })])
    runAction(sequence)
}
The unowned self part protects you from creating a strong reference cycle.
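For reference, the same recursive pattern in current Swift/SpriteKit syntax would be (a sketch, assuming idleAction and blinkAction are your existing texture animations):
func playIdle() {
    let idleLoop = SKAction.repeat(idleAction, count: Int.random(in: 2...11)) // same range as arc4random_uniform(10) + 2
    let sequence = SKAction.sequence([idleLoop,
                                      blinkAction,
                                      SKAction.run { [unowned self] in self.playIdle() }])
    run(sequence)
}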

Accelerometer data not correct, delayed for a few seconds

I am creating a very simple game using Swift and SpriteKit and I am moving a ball on the screen using the accelerometer data (acceleration x,y).
I would say the code works fine, but I have noticed that sometimes (often right when I open the app) the accelerometer data is not correct and is delayed for a few seconds.
Why is that happening?
I am using the following code to read the accelerometer data:
if motionManager.accelerometerAvailable == true {
    motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.currentQueue(), withHandler: {
        data, error in
        self.accX = CGFloat(data.acceleration.x)
        self.accY = CGFloat(data.acceleration.y)
    })
}
And the update function, where I apply some impulse to the ball:
override func update(currentTime: CFTimeInterval) {
    var impulse = CGVectorMake(accX, accY)
    var obj = childNodeWithName("ball") as SKSpriteNode
    obj.physicsBody?.applyImpulse(impulse)
}
Am I missing something?
Thank you
With any accelerometer data, it is a good idea to run it through a filter to smooth out any irregular spikes. Here is my favorite:
double filteredAcceleration[3];
memset(filteredAcceleration, 0, sizeof(filteredAcceleration));
CMAccelerometerData *newestAccel = motionManager.accelerometerData;
filteredAcceleration[0] = (filteredAcceleration[0]*(1.0-alpha)) + (newestAccel.acceleration.x*alpha);
filteredAcceleration[1] = (filteredAcceleration[1]*(1.0-alpha)) + (newestAccel.acceleration.y*alpha);
filteredAcceleration[2] = (filteredAcceleration[2]*(1.0-alpha)) + (newestAccel.acceleration.z*alpha);
alpha can be any value from 0 to 1. The closer to 1, the more responsive it will be; the closer to zero, the smoother it will be. My favorite value on the iPhone is 0.2: it is a good compromise of smooth yet responsive for a game like Doodle Jump, or possibly moving a ball around.
I don't know why the accelerometer data is incorrect/delayed on startup; my guess would be that the hardware has to wake up and calibrate itself. But regardless of the why, if you implement a filter it will smooth out these irregularities, and they won't be nearly as noticeable.
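If you prefer Swift, a minimal sketch of the same low-pass filter might look like this (my own wiring, untested; call it from your accelerometer update handler):
import CoreMotion

var filteredAcceleration = (x: 0.0, y: 0.0, z: 0.0)
let alpha = 0.2 // 0...1: closer to 1 is more responsive, closer to 0 is smoother

func filter(acceleration: CMAcceleration) -> (x: Double, y: Double, z: Double) {
    // Blend the previous filtered value with the newest reading
    filteredAcceleration.x = filteredAcceleration.x * (1.0 - alpha) + acceleration.x * alpha
    filteredAcceleration.y = filteredAcceleration.y * (1.0 - alpha) + acceleration.y * alpha
    filteredAcceleration.z = filteredAcceleration.z * (1.0 - alpha) + acceleration.z * alpha
    return filteredAcceleration
}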
I gave priority to both functions and the issue seems fixed:
let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
dispatch_async(dispatch_get_global_queue(priority, 0)) {
    // do some task
    dispatch_async(dispatch_get_main_queue()) {
        // code with priority
    }
}
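For reference, the same pattern in current Swift syntax would be (a sketch):
DispatchQueue.global(qos: .default).async {
    // do some task
    DispatchQueue.main.async {
        // code with priority
    }
}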

Detecting when someone begins walking using Core Motion and CMAccelerometer Data

I'm trying to detect three actions: when a user begins walking, jogging, or running. I then want to know when they stop. I've been successful in detecting when someone is walking, jogging, or running with the following code:
- (void)update:(CMAccelerometerData *)accelData {
    [(id) self setAcceleration:accelData.acceleration];
    NSTimeInterval secondsSinceLastUpdate = -([self.lastUpdateTime timeIntervalSinceNow]);
    if (fabs(_acceleration.x) >= 0.10000) {
        NSLog(@"walking: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 2.0) {
        NSLog(@"jogging: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 4.0) {
        NSLog(@"sprinting: %f", _acceleration.x);
    }
}
The problem I run into is two-fold:
1) update is called multiple times every time there's a motion, probably because it checks so frequently that when the user begins walking (i.e. _acceleration.x >= .1000) it is still >= .1000 when it calls update again.
Example Log:
2014-02-22 12:14:20.728 myApp[5039:60b] walking: 1.029846
2014-02-22 12:14:20.748 myApp[5039:60b] walking: 1.071777
2014-02-22 12:14:20.768 myApp[5039:60b] walking: 1.067749
2) I'm having difficulty figuring out how to detect when the user has stopped. Does anybody have advice on how to implement "stop detection"?
According to your logs, accelerometerUpdateInterval is about 0.02. Updates could be less frequent if you change that property of CMMotionManager.
Checking only the x-acceleration isn't very accurate. I can put a device on a table in such a way (say, on its left edge) that the x-acceleration will be equal to 1, or tilt it a bit. This will cause the program to be in walking mode (x > 0.1) instead of idle.
Here's a link to the ADVANCED PEDOMETER FOR SMARTPHONE-BASED ACTIVITY TRACKING publication. They track changes in the direction of the acceleration vector: d is the cosine of the angle between two consecutive acceleration vector readings.
Obviously, without any motion, the angle between two vectors is close to zero and d = cos(0) = 1. During other activities d < 1. To filter out noise, they use a weighted moving average of the last 10 values of d.
After implementing this, your values will look like this (red - walking, blue - running):
Now you can set a threshold for each activity to separate them. Note that the average step frequency is 2-4 Hz. You should expect the current value to be over the threshold at least a few times per second in order to identify the action.
Other helpful publications:
ERSP: An Energy-efficient Real-time Smartphone Pedometer (analyzes peaks and troughs)
A Gyroscopic Data based Pedometer Algorithm (threshold detection of gyro readings)
UPDATE
_acceleration.x, _acceleration.y, and _acceleration.z are the coordinates of the same acceleration vector. You use each of these coordinates in the d formula. In order to calculate d you also need to store the acceleration vector from the previous update (the one with index i-1 in the formula).
The WMA just takes the last 10 d values into account with different weights. The most recent d values have more weight and therefore more impact on the resulting value. You need to store the 9 previous d values in order to calculate the current one, and you compare the WMA value to the corresponding threshold, as in the sketch below.
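A minimal Swift sketch of that computation (my own wiring, untested; the paper's exact weighting scheme may differ from the linear weights assumed here):
import CoreMotion

var previousAcceleration: CMAcceleration?
var recentDs: [Double] = [] // the last 10 d values

// d is the cosine of the angle between two consecutive acceleration vectors.
// Returns the weighted moving average of the last 10 d values, or nil on the first call.
func updateD(with a: CMAcceleration) -> Double? {
    defer { previousAcceleration = a }
    guard let p = previousAcceleration else { return nil }
    let dot = a.x * p.x + a.y * p.y + a.z * p.z
    let norms = sqrt(a.x * a.x + a.y * a.y + a.z * a.z) * sqrt(p.x * p.x + p.y * p.y + p.z * p.z)
    guard norms > 0 else { return nil }
    recentDs.append(dot / norms)
    if recentDs.count > 10 { recentDs.removeFirst() }
    // Weighted moving average: the most recent d values weigh the most (linear weights assumed)
    let weightedSum = recentDs.enumerated().reduce(0.0) { $0 + $1.element * Double($1.offset + 1) }
    let weightTotal = Double(recentDs.count * (recentDs.count + 1)) / 2.0
    return weightedSum / weightTotal
}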
If you are using iOS 7 and an iPhone 5S, I suggest you look into CMMotionActivityManager, which is available on the iPhone 5S because of the M7 chip. It is also available in a couple of other devices:
M7 chip
Here is a code snippet I put together to test it when I was learning about it.
#import <CoreMotion/CoreMotion.h>

@property (nonatomic, strong) CMMotionActivityManager *motionActivityManager;

- (void)inSomeMethod
{
    self.motionActivityManager = [[CMMotionActivityManager alloc] init];

    // Register for CoreMotion notifications
    [self.motionActivityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMMotionActivity *activity)
    {
        NSLog(@"Got a core motion update");
        NSLog(@"Current activity date is %f", activity.timestamp);
        NSLog(@"Current activity confidence from a scale of 0 to 2 - 2 being best - is: %ld", (long)activity.confidence);
        NSLog(@"Current activity type is unknown: %i", activity.unknown);
        NSLog(@"Current activity type is stationary: %i", activity.stationary);
        NSLog(@"Current activity type is walking: %i", activity.walking);
        NSLog(@"Current activity type is running: %i", activity.running);
        NSLog(@"Current activity type is automotive: %i", activity.automotive);
    }];
}
I tested it and it seems to be pretty accurate. The only drawback is that it will not give you a confirmation as soon as you start an action (walking, for example): some black-box algorithm waits to ensure that you are really walking or running. But then you know you have a confirmed action.
This beats messing around with the accelerometer. Apple took care of that detail!
You can use this simple library to detect whether the user is walking, running, in a vehicle, or not moving. It works on all iOS devices and does not require the M7 chip.
https://github.com/SocialObjects-Software/SOMotionDetector
In the repo you can find a demo project.
I'm following this paper (PDF via RG) in my indoor navigation project to determine user dynamics (static, slow walking, fast walking) via accelerometer data alone, in order to assist location determination.
Here is the algorithm proposed in the paper, and here is my implementation of it in Swift 2.0:
import CoreMotion

let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 0.1

motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (accelerometerData: CMAccelerometerData?, error: NSError?) -> Void in
    if ((error) != nil) {
        print(error)
    } else {
        self.estimatePedestrianStatus((accelerometerData?.acceleration)!)
    }
}
After all of the classic Swifty iOS code to initiate CoreMotion, here is the method crunching the numbers and determining the state:
func estimatePedestrianStatus(acceleration: CMAcceleration) {
    // Obtain the Euclidean norm of the accelerometer data
    accelerometerDataInEuclidianNorm = sqrt((acceleration.x.roundTo(roundingPrecision) * acceleration.x.roundTo(roundingPrecision)) + (acceleration.y.roundTo(roundingPrecision) * acceleration.y.roundTo(roundingPrecision)) + (acceleration.z.roundTo(roundingPrecision) * acceleration.z.roundTo(roundingPrecision)))

    // Significant figure setting
    accelerometerDataInEuclidianNorm = accelerometerDataInEuclidianNorm.roundTo(roundingPrecision)

    // Record 10 values, meaning the values within a second
    // (accUpdateInterval (0.1 s) * 10 = 1 s)
    while accelerometerDataCount < 1 {
        accelerometerDataCount += 0.1
        accelerometerDataInASecond.append(accelerometerDataInEuclidianNorm)
        totalAcceleration += accelerometerDataInEuclidianNorm
        break // required since we want to obtain data every acc cycle
    }

    // When the acc values are recorded, interpret them
    if accelerometerDataCount >= 1 {
        accelerometerDataCount = 0 // reset for the next round

        // Calculate the variance of the Euclidean norm of the accelerometer data
        let accelerationMean = (totalAcceleration / 10).roundTo(roundingPrecision)
        var total: Double = 0.0
        for data in accelerometerDataInASecond {
            total += ((data - accelerationMean) * (data - accelerationMean)).roundTo(roundingPrecision)
        }
        total = total.roundTo(roundingPrecision)

        let result = (total / 10).roundTo(roundingPrecision)
        print("Result: \(result)")

        if (result < staticThreshold) {
            pedestrianStatus = "Static"
        } else if ((staticThreshold < result) && (result <= slowWalkingThreshold)) {
            pedestrianStatus = "Slow Walking"
        } else if (slowWalkingThreshold < result) {
            pedestrianStatus = "Fast Walking"
        }
        print("Pedestrian Status: \(pedestrianStatus)\n---\n\n")

        // reset for the next round
        accelerometerDataInASecond = []
        totalAcceleration = 0.0
    }
}
Also I've used the following extension to simplify significant figure setting:
extension Double {
    func roundTo(precision: Int) -> Double {
        let divisor = pow(10.0, Double(precision))
        return round(self * divisor) / divisor
    }
}
With raw values from CoreMotion, the algorithm was haywire.
Hope this helps someone.
EDIT (4/3/16)
I forgot to provide my roundingPrecision value. I defined it as 3; it's just plain mathematics that this many significant figures is decent enough. If you like, you can use more.
Also, one more thing to mention: at the moment this algorithm requires the iPhone to be in your hand while walking. See the picture below. Sorry, this was the only one I could find.
My GitHub Repo hosting Pedestrian Status
You can use Apple's machine learning framework, Core ML, to find out the user's activity. First you need to collect labeled data and train a classifier; then you can use this model in your app to classify user activity. You may follow this series if you are interested in Core ML activity classification:
https://medium.com/@tyler.hutcherson/activity-classification-with-create-ml-coreml3-and-skafos-part-1-8f130b5701f6
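Very roughly, the inference side might look like this (a sketch; ActivityClassifier stands in for the class Xcode generates from your trained .mlmodel, and its feature names depend entirely on your training data):
import CoreML

// All names below are placeholders generated from a hypothetical Create ML activity classifier.
func classifyActivity(x: MLMultiArray, y: MLMultiArray, z: MLMultiArray, state: MLMultiArray) throws -> String {
    let model = try ActivityClassifier(configuration: MLModelConfiguration())
    let output = try model.prediction(acceleration_x: x,
                                      acceleration_y: y,
                                      acceleration_z: z,
                                      stateIn: state)
    return output.activity // e.g. "walking" or "running"
}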
