iOS: Calculate speed of device using Core Motion

I'm trying to calculate the speed of the physical device.
From searching on Google I found two options:
Using CLLocationManager // I DON'T want to use this
Using the UIAccelerometer class: DEPRECATED
Up to now I have tried this:
func coreMotion() { // called from viewDidLoad
    if self.motionManager.isDeviceMotionAvailable {
        self.motionManager.deviceMotionUpdateInterval = 0.5
        self.motionManager.startDeviceMotionUpdates(to: OperationQueue.current!,
                                                    withHandler: { [weak self] (deviceMotion, error) -> Void in
            if let error = error {
                print("ERROR : \(error.localizedDescription)")
            }
            if let deviceMotion = deviceMotion {
                self?.handleDeviceMotionUpdate(deviceMotion)
            }
        })
    } else {
        print("WHAT THE HELL") // device motion not available
    }
}
func handleDeviceMotionUpdate(_ deviceMotion: CMDeviceMotion) {
    let attitude = deviceMotion.attitude
    let roll = self.degrees(attitude.roll)
    let pitch = self.degrees(attitude.pitch)
    let yaw = self.degrees(attitude.yaw)
    let accl = deviceMotion.userAcceleration
    self.calculateSpeed(accl)
    self.previousAccl = accl
    print("Roll: \(roll), Pitch: \(pitch), Yaw: \(yaw)")
    print("ACCELERATION: \(accl.x) \(accl.y) \(accl.z)")
}
func degrees(_ radians: Double) -> Double {
    return 180 / Double.pi * radians
}
I'm getting the acceleration values just fine (userAcceleration).
How can I calculate speed from that?

To compute the speed of the device, there are two possibilities:
1 - Approximate the derivative of the position: with two positions and the time between them, you can estimate the speed.
2 - Compute a primitive (integral) of the acceleration. But take into account that this method will only give you the correct speed value if you know the speed at t_0 (the beginning of your measurements).
Since you insist on doing it using acceleration, you can compute the speed at t_i (where i is the index of the update you received from the accelerometer):
speed(t_i) = speed(t_i-1) + acceleration(t_i) * (t_i - t_i-1)
and speed(t_0) is supposed to be known.
That way, at each update from the accelerometer you must do:
speed = speed + acceleration * (currentTime - lastUpdateTime)
[Edit]
As you mentioned in the comments, this is indeed only one dimension. If you wish to compute the speed for all three dimensions, you will have to do this three times, once for each axis:
speedX = speedX + accelerationX * (currentTime - lastUpdateTime)
speedY = speedY + accelerationY * (currentTime - lastUpdateTime)
speedZ = speedZ + accelerationZ * (currentTime - lastUpdateTime)
And you will need to know speedX/Y/Z at t_0 to initialise your variables to the correct values.
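A self-contained sketch of that per-axis integration, using plain doubles and no Core Motion types (`VelocityIntegrator` is an illustrative name, not an API):

```swift
// Integrates per-axis acceleration samples into a velocity estimate.
// Assumes the velocity at t_0 is known (here: zero) and acceleration is in m/s².
struct VelocityIntegrator {
    private(set) var vx = 0.0, vy = 0.0, vz = 0.0
    private var lastTime: Double?

    mutating func add(ax: Double, ay: Double, az: Double, time: Double) {
        if let last = lastTime {
            let dt = time - last          // seconds since the previous sample
            vx += ax * dt
            vy += ay * dt
            vz += az * dt
        }
        lastTime = time                   // first sample just establishes t_0
    }

    // Magnitude of the velocity vector, in m/s
    var speed: Double { (vx * vx + vy * vy + vz * vz).squareRoot() }
}

var integrator = VelocityIntegrator()
integrator.add(ax: 1.0, ay: 0, az: 0, time: 0.0)   // first sample sets t_0
integrator.add(ax: 1.0, ay: 0, az: 0, time: 0.5)   // 1 m/s² for 0.5 s
integrator.add(ax: 1.0, ay: 0, az: 0, time: 1.0)
print(integrator.speed)   // 1.0 m/s after 1 s at a constant 1 m/s²
```

Note that CMDeviceMotion's userAcceleration is reported in units of g, so multiply each component by 9.81 to get m/s² before integrating. Also be aware that integrating noisy accelerometer samples makes the speed estimate drift within seconds, which is why GPS-based speed is usually preferred in practice.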

Related

SpriteKit stop spinning wheel in a defined angle

I have a spinning wheel rotating at an angular speed ω, no acceleration involved, implemented with SpriteKit.
When the user pushes a button I need to slowly decelerate the wheel from the current angle ∂0 and end up at a specified angle (let's call it ∂f).
I associated a mass of 2 with it.
I already tried angularDamping and SKAction.rotate(toAngle:duration:), but they do not fit my needs because:
With angularDamping I cannot easily specify the angle ∂f where I want to end up.
With SKAction.rotate(toAngle:duration:) I cannot start slowing down from the current rotation speed, and it doesn't behave naturally.
The only remaining approach I tried is using SKAction.applyTorque(duration:).
This sounds interesting, but I have problems calculating the formula to obtain the correct torque to apply, especially for the inertia and radius of the wheel.
Here is my approach:
I'm taking the starting angular velocity ω as:
wheelNode.physicsBody?.angularVelocity.
I'm taking the mass from wheelNode.physicsBody?.mass.
The time t is a constant of 10 (meaning that in 10 seconds I want the wheel to decelerate to the final angle ∂f).
The deceleration I calculated as:
let a = -1 * ω / t
The inertia should be: let I = 1/2 * mass * pow(r, 2) (see the notes regarding the radius, please).
Then, finally, I calculated the final torque to apply as: let torque = I * a (taking care that it is opposite to the current angular speed of the wheel).
NOTE:
Since I am not clear on how to get the radius of the wheel, I tried to obtain it in two ways:
from the physics body's area, as let r = sqrt((wheelNode.physicsBody?.area ?? 0) / .pi)
by converting from points to meters as the area documentation says: let r = self.wheelNode.radius / 150.
Funny: I obtain 2 different values :(
Unfortunately, something in this approach is not working: the wheel doesn't end up at the specified angle and doesn't stop as it should (either the torque is too much and it spins in the other direction, or it is not enough). So the torque applied also seems to be wrong.
Do you know a better way to achieve the result I need? Is that the correct approach? If yes, what's wrong with my calculations?
Kinematics makes my head hurt, but here you go. I made it so that you can input the number of rotations, and the wheel will rotate that many times as it's slowing down to the angle you specify. The other function and extension are there to keep the code relatively clean/readable, so if you just want one giant messy function, go ahead and modify it.
• Make sure the node's angularDampening = 0.0
• Make sure the node has a circular physicsbody
// Stops a spinning SpriteNode at a specified angle within a certain number of rotations
// NOTE: Node must have a circular physicsbody
func decelerate(node: SKSpriteNode, toAngle: CGFloat, rotations: Int) {
    if node.physicsBody == nil { print("Node doesn't have a physicsbody"); return } // Avoid crash in case node's physicsbody is nil
    var cw: CGFloat { if node.physicsBody!.angularVelocity < CGFloat(0.0) { return -1.0 } else { return 1.0 } } // Clockwise (-1) or counterclockwise (+1)
    let m = node.physicsBody!.mass // Mass
    let r = (node.physicsBody!.area / CGFloat.pi).squareRoot() // Radius
    let i = 0.5 * m * r.squared // Inertia
    let wi = node.physicsBody!.angularVelocity // Initial angular velocity
    let wf: CGFloat = 0 // Final angular velocity
    let ti = CGFloat.unitCircle(node.zRotation) // Initial theta
    var tf = CGFloat.unitCircle(toAngle) // Final theta
    // Correction constant based on the rate of rotation, since there seems to be a delay
    // between when the action is calculated and when it is run.
    // Without the correction the node stops a little off from its desired stop angle.
    tf -= 0.00773889 * wi // Might need to change this constant
    let dt = deltaTheta(ti, tf, Int(cw), rotations)
    let a = -cw * 0.5 * wi.squared / abs(dt) // Angular acceleration - cw used to determine direction
    print("A:\(a)")
    let time: Double = Double(abs((wf - wi) / a)) // Time needed to stop
    let torque: CGFloat = i * a // Torque needed to stop
    node.run(SKAction.applyTorque(torque, duration: time))
}
func deltaTheta(_ ti: CGFloat, _ tf: CGFloat, _ clockwise: Int, _ rotations: Int) -> CGFloat {
    let extra = CGFloat(rotations) * 2 * CGFloat.pi
    if clockwise == -1 {
        if tf > ti { return tf - ti - 2 * CGFloat.pi - extra } else { return tf - ti - extra }
    } else {
        if tf > ti { return tf - ti + extra } else { return tf + 2 * CGFloat.pi + extra - ti }
    }
}
extension CGFloat {
    public var squared: CGFloat { return self * self }
    public static func unitCircle(_ value: CGFloat) -> CGFloat {
        if value < 0 { return 2 * CGFloat.pi + value }
        else { return value }
    }
}
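The kinematics behind the snippet can be checked in isolation. This is a minimal sketch with plain doubles and made-up numbers (no SpriteKit): it uses the same relation wf² = wi² + 2·a·Δθ, with wf = 0, that the angular-acceleration line above encodes.

```swift
// Constant angular deceleration that brings angular velocity `wi` to rest
// after sweeping exactly `deltaTheta` radians (from wf² = wi² + 2·a·Δθ with wf = 0).
func stoppingDeceleration(wi: Double, deltaTheta: Double) -> Double {
    return -0.5 * wi * wi / deltaTheta
}

let wi = 4.0                             // rad/s, spinning counterclockwise
let sweep = 4.0 * Double.pi              // stop after exactly two full turns
let a = stoppingDeceleration(wi: wi, deltaTheta: sweep)  // rad/s²
let time = -wi / a                       // seconds until the wheel stops
let inertia = 0.5 * 2.0 * 0.5 * 0.5      // I = ½·m·r², with m = 2 kg, r = 0.5 m
let torque = inertia * a                 // the value you would feed to SKAction.applyTorque

print(a, time, torque)
```

With those numbers the wheel takes 2π ≈ 6.28 s to stop, which also shows why the answer derives the action's duration as abs((wf - wi) / a) rather than using a fixed 10 seconds.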

Removing the effects of gravity on the iOS iPhone accelerometer

I am having issues with the accelerometer.
If the device is lying flat, I get (0, 0, -1), which obviously is not right. As I rotate the phone, this -1 moves to other axes depending on the phone's position.
I have very simple code so far :
override func viewDidAppear(_ animated: Bool) {
    motionManager.accelerometerUpdateInterval = 0.1
    motionManager.startAccelerometerUpdates(to: OperationQueue.current!) { (data, error) in
        if let myData = data {
            print(Double(myData.acceleration.y))
        }
    }
}
You are accessing the raw acceleration, which includes gravity. This means that (0, 0, -1) is actually correct: a device lying flat measures 1 g straight down.
If you'd like something a little more stable (and without the gravity vector), use device motion. It's worth noting that the data coming from the device-motion interface is filtered and stabilized using sensor-fusion methods, so the acceleration data will also be more accurate.
import UIKit
import CoreMotion

class ViewController: UIViewController {
    let motionManager = CMMotionManager()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        motionManager.deviceMotionUpdateInterval = 0.1
        motionManager.startDeviceMotionUpdates(to: .main) { (motion, error) in
            if let motion = motion {
                var x = motion.userAcceleration.x
                var y = motion.userAcceleration.y
                var z = motion.userAcceleration.z
                // Round to 2 decimal places
                x = round(100 * x) / 100
                y = round(100 * y) / 100
                z = round(100 * z) / 100
                // Ditch the -0s because I don't like how they look being printed
                if x.isZero && x.sign == .minus { x = 0.0 }
                if y.isZero && y.sign == .minus { y = 0.0 }
                if z.isZero && z.sign == .minus { z = 0.0 }
                print(String(format: "%.2f, %.2f, %.2f", x, y, z))
            }
        }
    }
}
By the way, this is all in the docs, first paragraph.
After creating an instance of CMMotionManager, an app can use it to receive four types of motion: raw accelerometer data, raw gyroscope data, raw magnetometer data, and processed device-motion data (which includes accelerometer, rotation-rate, and attitude measurements). The processed device-motion data provided by Core Motion’s sensor fusion algorithms gives the device’s attitude, rotation rate, calibrated magnetic fields, the direction of gravity, and the acceleration the user is imparting to the device.
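The relationship the docs describe can be sketched in a few lines of plain Swift (`Vec3` is an illustrative type, not part of Core Motion): the raw accelerometer reading is approximately gravity plus user-generated acceleration, both in units of g.

```swift
// Raw accelerometer ≈ gravity + userAcceleration (all in units of g).
struct Vec3 { var x, y, z: Double }

func rawAcceleration(gravity: Vec3, userAcceleration: Vec3) -> Vec3 {
    Vec3(x: gravity.x + userAcceleration.x,
         y: gravity.y + userAcceleration.y,
         z: gravity.z + userAcceleration.z)
}

// Device lying flat and motionless: gravity is (0, 0, -1) and user acceleration is zero,
// so the raw reading is (0, 0, -1) — exactly what the question observed.
let raw = rawAcceleration(gravity: Vec3(x: 0, y: 0, z: -1),
                          userAcceleration: Vec3(x: 0, y: 0, z: 0))
print(raw.z)   // -1.0
```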

Using CMQuaternion to calculate yaw

I'm currently working on an app that has to track yaw (rotation around the z-axis).
The user will hold the iPhone in landscape orientation, pointing at an object (like a car) in front of them. As the user starts walking around the object, I have to track the yaw through a complete 360° rotation.
I'm currently using CoreMotion with CMQuaternion to track the rotation:
private func startDeviceMotionUpdates() {
    if motionManager.isDeviceMotionAvailable {
        let queue = OperationQueue()
        motionManager.deviceMotionUpdateInterval = 1 / 60
        motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: queue, withHandler: { (motion, error) in
            guard let motion = motion else { return }
            let quat = motion.attitude.quaternion
            let yaw = 2 * (quat.x * quat.y + quat.w * quat.z)
            let yawDegrees = self.degreesFromRadians(yaw)
            print(yawDegrees)
        })
    }
}
private func stopDeviceMotionUpdates() {
    motionManager.stopDeviceMotionUpdates()
}
private func degreesFromRadians(_ radians: Double) -> Double {
    return radians * 180 / .pi
}
This currently works fine as long as the roll or pitch of the device doesn't change.
What I would like to know is:
Why does the yaw change when the roll or pitch of device changes?
How can I accurately track the yaw around the z-axis without it being affected by change on the x and y-axis?
I've been banging my head for two weeks against this problem. I would really appreciate if someone could help me understand why this is happening and how to get around it.
Try the code below, which derives the rotation from the gravity vector instead of the attitude quaternion:
let yaw = atan2(motion.gravity.x, motion.gravity.y) - Double.pi
let yawDegrees = self.degreesFromRadians(yaw)
print(yawDegrees)
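For comparison: the expression 2 * (quat.x * quat.y + quat.w * quat.z) in the question is only the numerator of the standard Euler yaw extraction. The full formula divides by a pitch/roll-dependent term inside atan2, which is one reason the question's value degrades as roll and pitch change. A plain-Swift sketch (the `Quat` struct is illustrative; CMQuaternion exposes the same x, y, z, w fields):

```swift
import Foundation

// Illustrative quaternion type (CMQuaternion has the same x, y, z, w fields).
struct Quat { var x, y, z, w: Double }

// Standard yaw extraction (rotation about z) for a unit quaternion,
// using the full atan2 form rather than just its numerator.
func yaw(of q: Quat) -> Double {
    atan2(2 * (q.w * q.z + q.x * q.y),
          1 - 2 * (q.y * q.y + q.z * q.z))
}

// A pure 90° rotation about z: q = (0, 0, sin 45°, cos 45°)
let q = Quat(x: 0, y: 0, z: sin(.pi / 4), w: cos(.pi / 4))
print(yaw(of: q) * 180 / .pi)   // ≈ 90 degrees
```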

Reduce array of coordinates

I track the user's location in my app and save all the coordinates to a database. I then do some stuff to select a range of coordinates in a time frame, but saving to the server takes a long time due to the large amount of data (15 minutes is 900 CLLocationCoordinate2Ds, and that is quite a bit).
What I want to do is remove any coordinate that lies on the line between the preceding and following coordinates. I am using overly simple coordinates for illustration purposes, but imagine this being done on real coordinates in an array of a couple of thousand objects.
Example:
0,0 //Keep
1,1 //Drop
2,2 //Drop
3,3 //Keep
3,4 //Keep
4,4 //Keep
5,3 //Keep
I know I should probably use some vector stuff, but I am not good at maths.
How can I reduce this array to remove the obsolete points?
You could try something like this...
var coordTimes: [(coord: CLLocationCoordinate2D, time: Double)] = []
// ...
func appendCoord(newCoord: CLLocationCoordinate2D, newTime: Double) {
    guard coordTimes.count > 1 else {
        coordTimes.append((newCoord, newTime))
        return
    }
    // So there are at least two already in the array
    let n = coordTimes.count
    let c0 = coordTimes[n - 2].coord
    let t0 = coordTimes[n - 2].time
    let c1 = coordTimes[n - 1].coord
    let t1 = coordTimes[n - 1].time
    let dt = t1 - t0
    let dtNew = newTime - t0
    guard (dtNew > 0) && (dt > 0) else {
        // Decide what to do about zero time intervals. Shouldn't happen.
        return
    }
    // Scale the deltas by the time interval...
    let dLat = (c1.latitude - c0.latitude) / dt
    let dLon = (c1.longitude - c0.longitude) / dt
    let dLatNew = (newCoord.latitude - c0.latitude) / dtNew
    let dLonNew = (newCoord.longitude - c0.longitude) / dtNew
    let tolerance = 0.00001 // arbitrary - choose your own
    if (abs(dLat - dLatNew) <= tolerance) && (abs(dLon - dLonNew) <= tolerance) {
        // Can be interpolated - replace the last one
        coordTimes[n - 1] = (newCoord, newTime)
    } else {
        // Can't be interpolated - append the new point
        coordTimes.append((newCoord, newTime))
    }
}
The tolerance is important, as you are very unlikely to get exactly matching intervals. Also, for the geodesists amongst you: there is no need to convert into map coordinates or calculate true distances, as the OP simply wants to know whether the coordinates can be interpolated.
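The interpolation test at the heart of appendCoord can be exercised standalone with plain doubles (the `Point` struct and `isInterpolatable` helper below are illustrative, not CoreLocation API):

```swift
// Points sampled at constant speed along a straight line are redundant:
// the middle one can be recovered by interpolating its neighbours.
struct Point { var latitude, longitude: Double }

// True if `candidate` at time tNew lies on the constant-velocity path
// implied by (p0, t0) and (p1, t1), within `tolerance` degrees per second.
func isInterpolatable(p0: Point, t0: Double,
                      p1: Point, t1: Double,
                      candidate: Point, tNew: Double,
                      tolerance: Double = 0.00001) -> Bool {
    let dLat = (p1.latitude - p0.latitude) / (t1 - t0)
    let dLon = (p1.longitude - p0.longitude) / (t1 - t0)
    let dLatNew = (candidate.latitude - p0.latitude) / (tNew - t0)
    let dLonNew = (candidate.longitude - p0.longitude) / (tNew - t0)
    return abs(dLat - dLatNew) <= tolerance && abs(dLon - dLonNew) <= tolerance
}

// (1,1) → (2,2) → (3,3), one second apart: the new point extends the same line,
// so the previous point could be replaced instead of appended.
let onLine = isInterpolatable(p0: Point(latitude: 1, longitude: 1), t0: 0,
                              p1: Point(latitude: 2, longitude: 2), t1: 1,
                              candidate: Point(latitude: 3, longitude: 3), tNew: 2)
print(onLine)   // true
```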

SKEmitterNode with AVAudioPlayer for music visuals

PLEASE SOMEONE HELP!
I want my SKEmitterNode's scale (meaning size) to get larger and smaller with the music I have built into the application using AVAudioPlayer. Right now this is pretty much all I have for the SKEmitterNode, and it looks great:
beatParticle?.position = CGPoint(x: self.size.width * 0.5, y: self.size.height * 0.5)
var beatParticleEffectNode = SKEffectNode()
beatParticleEffectNode.addChild(beatParticle!)
self.addChild(beatParticleEffectNode)
All the looks are done in the .sks file.
Here is where I call the updateBeatParticle function in a continual loop; this is where I will put my code for making the particle's scale (meaning size) larger and smaller with the music:
var dpLink : CADisplayLink?
dpLink = CADisplayLink(target: self, selector: "updateBeatParticle")
dpLink?.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSRunLoopCommonModes)
func updateBeatParticle(){
//Put code here
}
Any idea how I can do this? I looked at some tutorials, such as this: https://www.raywenderlich.com/36475/how-to-make-a-music-visualizer-in-ios
However, I can't quite get my head around it because they're using an emitterLayer and it's in Obj-C. I am also interested in any other ideas you wonderful people may have!
WARNING: The following code has not been tested. Please let me know if it works.
Firstly, it looks like you are using SpriteKit, so you could put the code needed to alter the emitter scale in the SKScene method update(_:), which automatically gets called virtually as often as a CADisplayLink.
Essentially, all you need to do is update the emitter scale in the update method based on the volume of the channels of your AVAudioPlayer. Note that the audio player may have multiple channels running, so you need to average out the average power of each.
Firstly...
player.meteringEnabled = true
Set this after you initialise your audio player, so that it will monitor the levels of the channels.
Next, add something like this in your update method.
override func update(currentTime: CFTimeInterval) {
    var scale: CGFloat = 0.5
    if audioPlayer.playing { // Only do this if the audio is actually playing
        audioPlayer.updateMeters() // Tell the audio player to refresh its readings
        let channels = audioPlayer.numberOfChannels
        var power: Float = 0
        // Loop over each channel and add its average power
        for i in 0..<channels {
            power += audioPlayer.averagePowerForChannel(i)
        }
        power /= Float(channels) // Average power across all the channels, in decibels
        // Convert power in decibels to a more appropriate percentage representation
        scale = CGFloat(getIntensityFromPower(power))
    }
    // Set the particle scale to match
    emitterNode.particleScale = scale
}
The method getIntensityFromPower is used to convert the power in decibels, to a more appropriate percentage representation. This method can be declared like so...
// Will return a value between 0.0 and 1.0, based on the decibels
func getIntensityFromPower(decibels: Float) -> Float {
    // The minimum possible decibel value returned from an AVAudioPlayer channel
    let minDecibels: Float = -160
    // The maximum possible decibel value returned from an AVAudioPlayer channel
    let maxDecibels: Float = 0
    // Clamp the decibels value
    if decibels < minDecibels {
        return 0
    }
    if decibels >= maxDecibels {
        return 1
    }
    // This value can be adjusted to affect the curve of the intensity
    let root: Float = 2
    let minAmp = powf(10, 0.05 * minDecibels)
    let inverseAmpRange: Float = 1.0 / (1.0 - minAmp)
    let amp: Float = powf(10, 0.05 * decibels)
    let adjAmp = (amp - minAmp) * inverseAmpRange
    return powf(adjAmp, 1.0 / root)
}
The algorithm for this conversion was taken from this StackOverflow answer: https://stackoverflow.com/a/16192481/3222419.
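As a sanity check on that conversion, the same math can be reproduced standalone (with `root = 2`, as in the answer; the sample decibel values are illustrative):

```swift
import Foundation

// Same decibels-to-intensity mapping as getIntensityFromPower above, reproduced
// standalone so the endpoints and curve can be checked without an AVAudioPlayer.
func intensity(fromDecibels decibels: Float) -> Float {
    let minDecibels: Float = -160
    if decibels < minDecibels { return 0 }
    if decibels >= 0 { return 1 }
    let root: Float = 2
    let minAmp = powf(10, 0.05 * minDecibels)       // ≈ 1e-8
    let inverseAmpRange = 1.0 / (1.0 - minAmp)
    let amp = powf(10, 0.05 * decibels)             // decibels back to linear amplitude
    let adjAmp = (amp - minAmp) * inverseAmpRange   // normalised to 0...1
    return powf(adjAmp, 1.0 / root)
}

print(intensity(fromDecibels: -160))   // 0.0 (silence)
print(intensity(fromDecibels: 0))      // 1.0 (full scale)
print(intensity(fromDecibels: -20))    // ≈ 0.316
```

The square-root curve (root = 2) deliberately boosts quieter passages; increasing root boosts them even more, which is worth tuning so the particle still pulses visibly during soft parts of the track.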
