Is heartRate data from an HKWorkoutSession a moving average?

I'm using an HKWorkoutSession to get heart rate data every 5 seconds in workoutBuilder didCollectDataOf. The heart rates are reported as "beats/minute". The question is, are they calculated as moving averages, or just over the previous time interval? (I couldn't find this specified in the documentation anywhere.)
For example, if you get the following heart rates:
t=0: 69 beats/min
t=5: 71 beats/min
t=10: 72 beats/min
...
Is each value an average of beat intervals over the past 60 seconds, or is it just extrapolated from the last 5 seconds of data?
Here's what didCollectDataOf looks like:
func workoutBuilder(_ workoutBuilder: HKLiveWorkoutBuilder, didCollectDataOf collectedTypes: Set<HKSampleType>) {
    for type in collectedTypes {
        guard let hrType = HKQuantityType.quantityType(forIdentifier: .heartRate) else {
            return
        }
        if collectedTypes.contains(hrType) {
            if let hrQuantity = workoutBuilder.statistics(for: hrType)?.mostRecentQuantity() {
                let hrUnit = HKUnit(from: "count/min")
                let hr = Int(hrQuantity.doubleValue(for: hrUnit))
                debugPrint("\(Date()) HR: \(hr)")
            }
        }
    }
}

From my experience, it's not a moving average.
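One way to check this empirically is to look at the date interval that each reading covers. This isn't from the original post; logHeartRateInterval is just a hypothetical helper and it assumes the same workoutBuilder as above, but HKStatistics exposes mostRecentQuantityDateInterval(), so you can log how much time each "most recent" value spans and compare that against the 5-second collection cadence:
import HealthKit

// Sketch: log the date interval covered by the most recent heart rate value.
func logHeartRateInterval(from workoutBuilder: HKLiveWorkoutBuilder) {
    guard let hrType = HKQuantityType.quantityType(forIdentifier: .heartRate),
          let stats = workoutBuilder.statistics(for: hrType),
          let quantity = stats.mostRecentQuantity(),
          let interval = stats.mostRecentQuantityDateInterval() else {
        return
    }
    let bpm = quantity.doubleValue(for: HKUnit(from: "count/min"))
    debugPrint("HR \(bpm) bpm covers \(interval.start) - \(interval.end) (\(interval.duration) s)")
}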

Related

FPS not consistent on Camera using AVAssetWriter and CoreML

I’m trying to create an app that records video at 100 FPS using AVAssetWriter and detects whether a person is performing an action using an ActionClassifier from Create ML. But when I try to put the two together, the FPS drops to 30 while recording and detecting actions.
If I do the recording by itself, it records at 100 FPS.
I am able to set the camera to 100 FPS through the device configuration.
The capture output function is set up like this:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    bufferImage = sampleBuffer
    guard let calibrationData = CMGetAttachment(sampleBuffer, key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, attachmentModeOut: nil) as? Data else {
        return
    }
    cameraCalibrationMatrix = calibrationData.withUnsafeBytes { $0.pointee }

    if self.isPredictorActivated == true {
        do {
            let poses = try predictor.processFrame(sampleBuffer)
            if predictor.isReadyToMakePrediction {
                let prediction = try predictor.makePrediction()
                let confidence = prediction.confidence * 100
                DispatchQueue.main.async {
                    self.predictionLabel.text = prediction.label + " " + String(confidence.rounded(toPlaces: 0))
                    if prediction.label == "HandsUp" && prediction.confidence > 0.85 {
                        print("Challenging")
                        self.didChallengeVideo()
                    }
                }
            }
        } catch {
            print(error.localizedDescription)
        }
    }

    let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if assetWriter == nil {
        createWriterInput(for: presentationTimeStamp)
    } else {
        let chunkDuration = CMTimeGetSeconds(CMTimeSubtract(presentationTimeStamp, chunkStartTime))
        // print("Challenge\(isChallenging)")
        if chunkDuration > 1500 || isChallenging {
            assetWriter.endSession(atSourceTime: presentationTimeStamp)
            // make a copy, as finishWriting is asynchronous
            let newChunkURL = chunkOutputURL!
            let chunkAssetWriter = assetWriter!
            chunkAssetWriter.finishWriting {
                print("finishWriting says: \(chunkAssetWriter.status.rawValue) \(String(describing: chunkAssetWriter.error))")
                print("queuing \(newChunkURL)")
                print("Chunk Duration: \(chunkDuration)")
                let asset = AVAsset(url: newChunkURL)
                print("FPS of CHUNK \(asset.tracks.first?.nominalFrameRate)")
                if self.isChallenging {
                    self.challengeVideoProcess(video: asset)
                }
                self.isChallenging = false
            }
            createWriterInput(for: presentationTimeStamp)
        }
    }

    if !assetWriterInput.append(sampleBuffer) {
        print("append says NO: \(assetWriter.status.rawValue) \(String(describing: assetWriter.error))")
    }
}
Performing action classification on every frame is quite expensive and may affect the overall performance of the app (including the video footage FPS). I don't know how often you need a prediction, but I would suggest running the Action Classifier at most 2-3 times per second and seeing if that helps.
Running the action classifier every frame won't change your classification much, because each run only adds one frame to the classifier's action window, so there is no need to run it that often.
For example, if your action classifier was set up with a 3 s window and trained on 30 fps videos, your classification is based on 3 * 30 = 90 frames. One frame won't make a difference.
Also make sure that your 100 fps footage matches the frame rate of the footage you used to train the action classifier. Otherwise you can get wrong predictions, because an Action Classifier trained on 30 fps video will treat 1 s of 100 fps footage as more than 3.3 s.
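If you want to try that, one possible way (a sketch, not from the original answer; predictor, isPredictorActivated and predictionLabel are assumed from the question's code, and runThrottledPrediction is a hypothetical helper you would call from captureOutput) is to keep feeding frames to the classifier but only request a prediction after a minimum interval has passed:
// Hypothetical throttle: feed every frame to the pose extractor,
// but only ask the classifier for a prediction a few times per second.
private var lastPredictionDate = Date.distantPast
private let predictionInterval: TimeInterval = 0.4 // roughly 2-3 predictions per second

func runThrottledPrediction(on sampleBuffer: CMSampleBuffer) {
    guard isPredictorActivated else { return }
    do {
        // Keep the classifier's action window filled on every frame...
        _ = try predictor.processFrame(sampleBuffer)

        // ...but only make a prediction when enough time has passed.
        guard Date().timeIntervalSince(lastPredictionDate) >= predictionInterval,
              predictor.isReadyToMakePrediction else { return }
        lastPredictionDate = Date()

        let prediction = try predictor.makePrediction()
        DispatchQueue.main.async {
            self.predictionLabel.text = "\(prediction.label) \(Int(prediction.confidence * 100))"
        }
    } catch {
        print(error.localizedDescription)
    }
}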

How to get steps count on hourly basis from HealthKit Swift 4

I need to plot a graph of the steps taken by the user on an hourly basis for any specific date. But if the user's steps start today at 3:58 pm and end today at 4:10 pm, then I get just one HKStatistics object for this period of time. I am not able to break this data into two samples, as I need the steps taken in the 3-4 pm slot and the 4-5 pm slot.
static func getSteps(date: Date, duration: DateComponents, completion: @escaping ([HKSample]) -> Void) {
    let quantityType: Set = [HKObjectType.quantityType(forIdentifier: HKQuantityTypeIdentifier.stepCount)!]
    let stepsQuantityType = HKQuantityType.quantityType(forIdentifier: .stepCount)!
    let startOfDay = Calendar.current.startOfDay(for: date)
    if let endOfDay = Calendar.current.date(byAdding: duration, to: startOfDay) {
        var interval = DateComponents()
        interval.hour = 1
        let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: endOfDay, options: .strictStartDate)
        let query = HKSampleQuery(sampleType: stepsQuantityType,
                                  predicate: predicate,
                                  limit: HKObjectQueryNoLimit,
                                  sortDescriptors: nil,
                                  resultsHandler: { (query, results, error) in
            guard let result = results else {
                return
            }
            // print("result healthkit", result.description)
            completion(result)
        })
        healthStore.execute(query)
    }
}
Don't use HKSampleQuery for charting quantity types. HKStatisticsCollectionQuery is designed for this purpose and will split samples that fall into separate regions of your chart for you. See the documentation for examples of how to build the query and use its results.
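A sketch of what that could look like for hourly step counts (getHourlySteps is a hypothetical name, it assumes the same healthStore as in the question, and error handling is omitted):
import HealthKit

// Hourly step totals with HKStatisticsCollectionQuery; HealthKit splits samples
// that straddle an hour boundary across the buckets for you.
static func getHourlySteps(for date: Date, completion: @escaping ([Date: Double]) -> Void) {
    let stepsType = HKQuantityType.quantityType(forIdentifier: .stepCount)!
    let startOfDay = Calendar.current.startOfDay(for: date)
    let endOfDay = Calendar.current.date(byAdding: .day, value: 1, to: startOfDay)!

    var interval = DateComponents()
    interval.hour = 1

    let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: endOfDay, options: .strictStartDate)
    let query = HKStatisticsCollectionQuery(quantityType: stepsType,
                                            quantitySamplePredicate: predicate,
                                            options: .cumulativeSum,
                                            anchorDate: startOfDay,
                                            intervalComponents: interval)
    query.initialResultsHandler = { _, collection, _ in
        var stepsPerHour: [Date: Double] = [:]
        collection?.enumerateStatistics(from: startOfDay, to: endOfDay) { statistics, _ in
            // sumQuantity() is nil for hours with no step data
            stepsPerHour[statistics.startDate] = statistics.sumQuantity()?.doubleValue(for: .count()) ?? 0
        }
        completion(stepsPerHour)
    }
    healthStore.execute(query)
}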
You're correct, you can't split the sample. That's all the information that's available. Steps are not stored step by step; they're aggregated into blocks to reduce power and storage requirements (mostly power: it's easier to accumulate a value in hardware and periodically read it than to query the real-time clock every single time a step is detected).
In order to do what you're discussing, you'll need to average the steps over the period. So if there were 100 steps over the period 3:58p to 4:08p, that averages 10 steps/minute, and you would allocate 20 steps to the 3p-4p block and 80 steps to the 4p-5p block. That's the best information you have.
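A small sketch of that proportional split (allocateSteps is a hypothetical helper; the numbers in the final comment are just the example above):
import Foundation

// Split one aggregated step sample across the hourly buckets it overlaps,
// proportionally to how much of the sample falls in each hour.
func allocateSteps(count: Double, start: Date, end: Date, calendar: Calendar = .current) -> [Date: Double] {
    let total = end.timeIntervalSince(start)
    guard total > 0 else { return [:] }

    var allocation: [Date: Double] = [:]
    var hourStart = calendar.dateInterval(of: .hour, for: start)!.start
    while hourStart < end {
        let hourEnd = calendar.date(byAdding: .hour, value: 1, to: hourStart)!
        let overlap = min(end, hourEnd).timeIntervalSince(max(start, hourStart))
        if overlap > 0 {
            allocation[hourStart] = count * overlap / total
        }
        hourStart = hourEnd
    }
    return allocation
}

// e.g. 100 steps from 3:58 pm to 4:08 pm -> [3 pm: 20, 4 pm: 80]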

How to work with accelerometer data from the Apple Watch?

Via the code below I am getting accelerometer data; now I want to work with it to track the user's movement, specifically speed. Looking at code that uses Core Motion on the iPhone, it uses a motionManager object on which you can set accelerometerUpdateInterval and read values such as .acceleration.x. How can I work with the raw data I get back so that I can determine, for example, how fast a person is moving or how fast their arm is swinging?
//Record the data
if CMSensorRecorder.isAccelerometerRecordingAvailable() {
    print("Accelerometer available")
    recorder.recordAccelerometer(forDuration: 20 * 60) // Record for 20 minutes
}

//Read the data
if CMSensorRecorder.isAccelerometerRecordingAvailable() {
    let accelerometerData = recorder.accelerometerData(from: startDate, to: endDate)
    for (index, data) in (accelerometerData?.enumerated())! {
        print(index, data)
    }
}
Prints:
0 388, 208409.082611, 529770182.607276, (0.038574, -0.762207, -0.652832)
1 388, 208409.102722, 529770182.627387, (0.027588, -0.763184, -0.660889)
2 388, 208409.122863, 529770182.647528, (0.027100, -0.763184, -0.661865)
3 388, 208409.142974, 529770182.667639, (0.030029, -0.756836, -0.661865)
4 388, 208409.163116, 529770182.687781, (0.026611, -0.764648, -0.665039)
Edit: I found this library, which looks like it would be perfect, but it hasn't been updated in 3 years... Is there anything similar that is still maintained? https://github.com/MHaroonBaig/MotionKit
I found that using a CMMotionManager on the watch works just as well as it does on the iPhone. This way you can implement startAccelerometerUpdates in awake(withContext:) and receive real-time feedback on the watch's X, Y and Z values, so you can get a better grasp of the data:
var motionManager = CMMotionManager()

override func awake(withContext context: Any?) {
    super.awake(withContext: context)
    motionManager.accelerometerUpdateInterval = 0.2
    motionManager.startAccelerometerUpdates(to: OperationQueue.current!) { (data, error) in
        if let myData = data {
            print("x: \(myData.acceleration.x) y: \(myData.acceleration.y) z: \(myData.acceleration.z)")
            if myData.acceleration.x > 1.5 && myData.acceleration.y > 1.5 {
                // react to the movement here, e.g. treat it as a strong arm swing
            }
        }
    }
}
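The snippet above only prints the raw readings. To get at the original question about speed, one rough (and drift-prone) option is to integrate userAcceleration over time. This is a sketch of that idea, not something from the answer, and it is only meaningful over short bursts such as a single arm swing:
import CoreMotion

// Rough speed estimate by integrating userAcceleration (gravity already removed).
// Integration error accumulates quickly, so reset the estimator often.
final class SpeedEstimator {
    private let motionManager = CMMotionManager()
    private var lastTimestamp: TimeInterval?
    private(set) var velocity = (x: 0.0, y: 0.0, z: 0.0)

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        let g = 9.81 // userAcceleration is in units of g; convert to m/s^2

        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            defer { self.lastTimestamp = motion.timestamp }
            guard let last = self.lastTimestamp else { return }

            let dt = motion.timestamp - last
            self.velocity.x += motion.userAcceleration.x * g * dt
            self.velocity.y += motion.userAcceleration.y * g * dt
            self.velocity.z += motion.userAcceleration.z * g * dt

            let speed = (self.velocity.x * self.velocity.x
                       + self.velocity.y * self.velocity.y
                       + self.velocity.z * self.velocity.z).squareRoot()
            print("estimated speed: \(speed) m/s")
        }
    }
}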

iOS: Swift: Combine Accelerometer and Attitude data for comparable results

Idea
My idea is to write a little application for my iOS device that starts recording a motion when a UIButton is touched and saves the accelerometer data for that motion. After the recording, I can store this data and use it in the finished application to detect this motion again.
So what I'm looking for is a way to compare two NSMutableArrays, each holding a set of motion data, and check whether the current NSMutableArray represents the same motion as the recorded one.
Problems
To realize this idea, I found three problems, which I can't solve myself:
How to compare two NSMutableArrays and get, for example, a similarity index or a percentage, e.g. 97.32% equivalent. (A sketch of one possible approach appears at the end of this post.)
What if the user makes the motion at different speeds, e.g. slowly or fast?
The last problem: how to handle different device orientations? Is there a way to transform the accelerometer data to a neutral reference in any orientation? I think I have an approach for this last problem, but don't know how to turn it into code: we have the attitude data for pitch, roll and yaw. Maybe we can use these values to compute neutral accelerometer data?
Code
At this moment, I only have the code to get the data from the accelerometer and the attitude data, both from the MotionManager. I've played with my code for many hours and searched a lot on Google, but can't find the right way...
import UIKit
import CoreMotion

class FirstViewController: UIViewController {

    // Values recorded from the device-motion handler below.
    let motionManager = CMMotionManager()
    var accelerometerData = NSMutableArray()

    var x: Float = 0.0
    var y: Float = 0.0
    var z: Float = 0.0
    var roll: Float = 0.0
    var pitch: Float = 0.0
    var yaw: Float = 0.0

    override func viewDidLoad() {
        super.viewDidLoad()
        motionManager.accelerometerUpdateInterval = 1
        motionManager.deviceMotionUpdateInterval = 1
    }

    @IBAction func startRecording() {
        accelerometerData = NSMutableArray()
        motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                               to: OperationQueue.current!) { (deviceMotion, error) in
            guard let motion = deviceMotion else { return }
            let attitude = motion.attitude
            self.roll = Float(self.degrees(radians: attitude.roll))
            self.pitch = Float(self.degrees(radians: attitude.pitch))
            self.yaw = Float(self.degrees(radians: attitude.yaw))
            self.x = Float(motion.userAcceleration.x)
            self.y = Float(motion.userAcceleration.y)
            self.z = Float(motion.userAcceleration.z)
        }
    }

    // Converts CMAttitude values from radians into degrees.
    func degrees(radians: Double) -> Double {
        return radians * 180.0 / .pi
    }
}
Update
I will update this question when I know more.
Problem 3: Solution:
By passing CMAttitudeReferenceFrame.xMagneticNorthZVertical to the motionManager.startDeviceMotionUpdates method, you can neutralize all sensor data to a defined orientation. You can access the accelerometer data there too. I've updated the code above accordingly.
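For Problem 1, referenced earlier, here is one possible sketch (not a full solution): resample both recordings to the same length and convert their mean distance into a rough similarity percentage. The 2 g scale factor is arbitrary, and this does not handle Problem 2 (different speeds), which would need something like dynamic time warping:
import CoreMotion

// Hypothetical similarity measure between two recorded motions (arrays of userAcceleration).
func similarity(recorded: [CMAcceleration], current: [CMAcceleration], samples: Int = 50) -> Double {
    guard !recorded.isEmpty, !current.isEmpty, samples > 1 else { return 0 }

    // Pick `samples` evenly spaced readings from a recording.
    func resample(_ data: [CMAcceleration]) -> [CMAcceleration] {
        return (0..<samples).map { data[$0 * (data.count - 1) / (samples - 1)] }
    }

    let a = resample(recorded)
    let b = resample(current)

    // Mean Euclidean distance between corresponding samples, in g.
    var totalDistance = 0.0
    for i in 0..<samples {
        let dx = a[i].x - b[i].x
        let dy = a[i].y - b[i].y
        let dz = a[i].z - b[i].z
        totalDistance += (dx * dx + dy * dy + dz * dz).squareRoot()
    }
    let meanDistance = totalDistance / Double(samples)

    // Map the distance onto a 0...100 score.
    return max(0, 100 * (1 - meanDistance / 2))
}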

Swift: can I add and subtract `dispatch_time_t` variables?

I need to do some time math in iOS, with Swift.
I have to use dispatch_walltime. I hope that can be taken as axiomatic. Where time math is concerned, I think I'm likely to get the response "just use NSDate," but please take it on faith: I am bound to dispatch_walltime.
Now, it's plain why someone might suggest NSDate, because when you're using NSTimeInterval and NSDate and that good stuff, it's pretty easy to make custom timestamps and compare them and do all kinds of time math.
But I have to use dispatch_time_t, and specifically dispatch_walltime created like this:
//Get the timeInterval of now.
let nowInterval = NSDate().timeIntervalSince1970
//Make a timespec from it.
var nowStruct = timespec(tv_sec: Int(nowInterval), tv_nsec: 0)
//Make a walltime definition from that.
let referenceWalltime = dispatch_walltime(&nowStruct, 0)
Later on I need to use that reference time in various ways. For instance, I need to get the time interval between the reference time and whatever time it happens to be.
I am attempting to do this the same way I would with NSTimeInterval, in other words, make a new one and subtract the old one from it:
//Repeat everything from before to make a new wall time.
let newNowInterval = NSDate().timeIntervalSince1970
var newNowStruct = timespec(tv_sec: Int(newNowInterval), tv_nsec: 0)
let newWalltime = dispatch_walltime(& newNowStruct, 0)
//Time math a la NSTimeInterval to find the interval:
let walltimeInterval = newWalltime - referenceWalltime
Will that work?
The short answer is: no. That code will crash.
The better answer is: no, but it can be done, and it's not all that different in the end.
I did some investigating on my own in a Playground and learned some interesting things, and I believe I figured out the right way to do this.
I'm pasting the entirety of my Playground here, so that others can copy-paste it and figure out how to do their own dispatch_time math.
Comments marked by //********* in the code denote the key things I learned.
import UIKit
import XCPlayground

XCPSetExecutionShouldContinueIndefinitely(continueIndefinitely: true)

public class IntervalMaker {

    var referenceWalltime: dispatch_time_t = 0
    var newWalltime: dispatch_time_t = 0
    var walltimeInterval: dispatch_time_t = 0

    func scheduleWalltimeSequence() {
        let threeSeconds = Int64(NSEC_PER_SEC) * 3
        let now = walltimeNow()
        let dispatchTimeInThree = dispatch_time(now, threeSeconds)
        let dispatchTimeInSix = dispatch_time(now, 2 * threeSeconds)
        setReferenceWalltimeToNow()
        dispatch_after(dispatchTimeInThree, dispatch_get_main_queue(), setNewWalltimeToNow)
        dispatch_after(dispatchTimeInSix, dispatch_get_main_queue(), dispatchBasedOnDispatchMath)
    }

    func walltimeNow() -> dispatch_time_t {
        let nowInterval = NSDate().timeIntervalSince1970
        var nowStruct = timespec(tv_sec: Int(nowInterval), tv_nsec: 0)
        return dispatch_walltime(&nowStruct, 0)
    }

    func setReferenceWalltimeToNow() {
        referenceWalltime = walltimeNow()
    }

    func setNewWalltimeToNow() {
        newWalltime = walltimeNow()
    }

    func dispatchBasedOnDispatchMath() {
        computeInterval() // Should be three seconds
        let nineTheWrongWay = referenceWalltime + (walltimeInterval * 3)
        let nineTheRightWay = dispatch_time(referenceWalltime, Int64(walltimeInterval) * 3)
        dispatch_after(nineTheWrongWay, dispatch_get_main_queue(), finalPrintln)
        //********** THE ABOVE DOES NOT WORK CORRECTLY - prints 6 seconds later
        dispatch_after(nineTheRightWay, dispatch_get_main_queue(), finalPrintln)
        //********** THE ABOVE WORKS CORRECTLY - prints 9 seconds later
    }

    func finalPrintln() {
        let now = walltimeNow()
        println("I should be printing nine seconds from reference time.")
        println("It's actually \(referenceWalltime - now) nanoseconds after")
    }

    func computeInterval() {
        walltimeInterval = referenceWalltime - newWalltime
        //********** dispatch_walltimes actually count backwards, and *CANNOT* be
        //********** negative: writing `newWalltime - referenceWalltime` will crash
    }
}

let intervaller = IntervalMaker()
intervaller.scheduleWalltimeSequence()
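As an aside that isn't part of the answer above: in Swift 3 and later the same walltime values are wrapped by DispatchWallTime, and the "counts backwards" caveat still applies. A minimal sketch:
import Foundation
import Dispatch

// DispatchWallTime wraps dispatch_time_t, so the raw values still count backwards.
let reference = DispatchWallTime.now()

// Scheduling relative to a walltime: add seconds rather than adding raw values.
DispatchQueue.main.asyncAfter(wallDeadline: reference + 3.0) {
    let now = DispatchWallTime.now()
    // Raw values decrease as wall-clock time advances, so subtract the newer
    // value from the older one (the other order would trap, as described above).
    let elapsedNanoseconds = reference.rawValue - now.rawValue
    print("about \(Double(elapsedNanoseconds) / Double(NSEC_PER_SEC)) s after the reference time")
}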
