High-precision timer in iOS

What is the most precise way to measure time intervals in iOS? So far I have been using NSDate to "mark" events and then the timeIntervalSinceDate: method to calculate the interval, but is there a more precise approach? E.g., Windows has the so-called QPC (QueryPerformanceCounter) for this kind of thing. Is there something similar in the iOS (or Mac OS X) world?

You can get down to nanoseconds using mach_absolute_time(). Here is an article on using it:
https://developer.apple.com/library/mac/qa/qa1398/_index.html
It also exists on iOS.
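For illustration, a minimal Swift sketch of the QA1398 approach (variable names are mine): take two mach_absolute_time() readings and convert the difference to nanoseconds via the Mach timebase.

    import Darwin

    // mach_absolute_time() returns ticks; the timebase converts ticks to nanoseconds.
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)

    let start = mach_absolute_time()
    // ... work being measured ...
    let end = mach_absolute_time()

    // elapsed ticks * numer / denom = elapsed nanoseconds
    let elapsedNanos = (end - start) * UInt64(timebase.numer) / UInt64(timebase.denom)
    print("elapsed: \(elapsedNanos) ns")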

Related

How to implement the mach_absolute_time Swift API?

This may be a bit of a silly question, but I'm looking for a reliable way to create a repeating timer in Swift that's as accurate as possible. I've researched this quite a bit and have found that the mach_absolute_time API seems to be what I'm looking for.
However, while I think I understand mach_absolute_time, there's not much out there on mach_wait_until (if I understand correctly, that's the Mach API counterpart to using Timer.scheduledTimer() to call a selector).
So my goal is to turn something like Martin R's answer to this post, Measuring Time Accurately in Swift for Comparison Across Devices, into a timer that can measure out an exact amount of time, call a selector, and repeat with little to no loss of time between ticks.
Thanks!
Update
The other thing I ran into is that the timer can't lock up the app while it's running either. Since Mach is pretty low level, I believe calling mach_wait_until will block the calling thread and wait for the time to elapse.
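For what it's worth, here is a sketch of one way this could work (the PrecisionTimer name and structure are my own invention, not an established API): run mach_wait_until on a background thread against absolute deadlines, so drift does not accumulate between ticks and the main thread is never blocked.

    import Foundation
    import Darwin

    // Hypothetical helper: a repeating timer driven by mach_wait_until() on a
    // background thread. A real implementation would synchronize `running`
    // (e.g. with a lock or an atomic) instead of a plain Bool.
    final class PrecisionTimer {
        private let intervalTicks: UInt64
        private var running = false

        init(interval: TimeInterval) {
            var timebase = mach_timebase_info_data_t()
            mach_timebase_info(&timebase)
            // Convert seconds to Mach ticks using the timebase.
            let nanos = interval * 1_000_000_000
            intervalTicks = UInt64(nanos * Double(timebase.denom) / Double(timebase.numer))
        }

        func start(_ tick: @escaping () -> Void) {
            running = true
            Thread.detachNewThread {
                // Schedule against absolute deadlines so error does not accumulate.
                var deadline = mach_absolute_time() + self.intervalTicks
                while self.running {
                    mach_wait_until(deadline)   // blocks only this background thread
                    deadline += self.intervalTicks
                    DispatchQueue.main.async { tick() }
                }
            }
        }

        func stop() { running = false }
    }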

Speech Kit limits: how do I know that I've reached them?

I'm using speech recognition in my app. It's quite important for the user experience, so I want it to be good (and free or cheap).
Right now, I'm using Speech Kit from Apple, and it works like a charm but it's not very reliable because there are some limits per app and per device, and I don't know these limits.
Another option is to use OpenEars. It's not nearly as good as Speech Kit for me, so I'm thinking about switching from Speech Kit to OpenEars silently when Speech Kit is not working (and back when Speech Kit is alive and well).
But is there a way to know that Speech Kit is not working right now before ever using it?
The only way I know of is to try to recognise some audio file before every user session, but that takes time (at least several seconds will be spent, and several seconds is a lot), and it's not a very good solution in terms of using the service: it seems too costly to recognise audio just to check whether Speech Kit is working or not. Also, I don't know how to debug this, because obviously I don't have any problems with the limits in my app right now.
What is the best way to solve this?
I also thought about this question not long ago. Here's an answer from the Apple Q&A: "The current rate limit for the number of SFSpeechRecognitionRequest calls a device can make is 1000 requests per hour." There is also an example of the error received upon reaching the limit, so you can prepare yourself for that :)
Here's the link: Apple Q & A
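Building on that, a possible approach (a sketch; the fallback hook is an assumption, not an Apple-documented pattern) is to simply attempt recognition and fall back to OpenEars when Speech Kit returns an error, instead of probing it with a test file up front:

    import Speech

    // Sketch: try Speech Kit first; invoke `fallback` (e.g. OpenEars) on any
    // error, including the rate-limit error mentioned in the Apple Q&A.
    // In a real app, call SFSpeechRecognizer.requestAuthorization first.
    func recognize(url: URL, fallback: @escaping () -> Void) {
        guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else {
            fallback()                      // recognizer missing or unavailable
            return
        }
        let request = SFSpeechURLRecognitionRequest(url: url)
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let error = error {
                print("Speech Kit failed: \(error)")
                fallback()                  // switch to OpenEars here
                return
            }
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }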

Are iPhone time zones reliable? Alternatives?

I am currently writing an app that needs to present some data arranged by local time. I don't think relying on [NSDate date] is a good idea at all, because the user can set the device time incorrectly and thereby mess up my data. I then thought it would be better to record a timestamp in server time and then present the data by comparing the difference between my timestamp's time zone and the user's time zone. Am I right to do it this way?
Obviously this relies pretty much entirely on the time zones given by iOS being correct. If they are not, are there any better alternatives (like the Google Time Zone API)? (I would think they are both based on the tz database, though.)
Thanks for any input
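For what it's worth, a small Swift sketch of the scheme described above (the epoch value is a made-up example): store the authoritative server timestamp in UTC and apply the device's time zone only when presenting, so a wrong wall clock cannot corrupt the stored data.

    import Foundation

    // Store UTC epoch seconds from the server; the value below is a made-up example.
    let serverTimestamp: TimeInterval = 1_400_000_000
    let eventDate = Date(timeIntervalSince1970: serverTimestamp)

    // Apply the device's (tz-database-backed) zone only at display time.
    let formatter = DateFormatter()
    formatter.dateStyle = .medium
    formatter.timeStyle = .short
    formatter.timeZone = TimeZone.current
    print(formatter.string(from: eventDate))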

CMMotionActivityManager ignores cycling

I've been researching the new M7 chip's CMMotionActivityManager for determining whether the user of the device is walking, running, in a car, etc. (see the Apple documentation). This seemed like a great step forward over trying to determine this previously from LocationManager and accelerometer data only.
I notice, however, that CMMotionActivityManager does not have a cycling activity, which is disappointing and almost a deal-breaker for complete usage as a new activity manager. Has anyone else found a convenient way to use CMMotionActivityManager with cycling as well, without having to reincorporate CLLocationManager + accelerometer just to test for cycling too?
Note, this also does not include general transport options for things like a train. For instance, I commute an hour a day on the train. Automotive could at least be made more generic, similar to how Moves uses Transport.
CMMotionActivity has these defined motion types only:
stationary
walking
running
automotive
unknown
Useful notes from Apple's code that do not necessarily solve the issue but are helpful:
CMMotionActivity
An estimate of the user's activity based on the motion of the device.
The activity is exposed as a set of properties, the properties are not
mutually exclusive.
For example, if you're in a car stopped at a stop sign the state might
look like:
stationary = YES, walking = NO, running = NO, automotive = YES
Or a moving vehicle, stationary = NO, walking = NO, running = NO,
automotive = YES
Or the device could be in motion but not walking or in a vehicle.
stationary = NO, walking = NO, running = NO, automotive = NO. Note in this case all of the properties are NO.
[Direct Source: Apple iOS Framework, CoreMotion/CMMotionActivity.h #interface CMMotionActivity, inline code comments]
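To make the quoted semantics concrete, here is a small sketch (the describe function is mine) of interpreting the non-mutually-exclusive flags:

    import CoreMotion

    // The flags are not mutually exclusive: stationary and automotive can both
    // be true for a car stopped at a light, exactly as the header comment says.
    func describe(_ activity: CMMotionActivity) -> String {
        if activity.automotive {
            return activity.stationary ? "in a vehicle, stopped" : "in a vehicle, moving"
        }
        if activity.walking { return "walking" }
        if activity.running { return "running" }
        if activity.stationary { return "stationary" }
        return "in motion, but not walking or in a vehicle"
    }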
First of all, is this a question, or more of an informative post on the M7?
Has anyone else found a convenient way to use CMMotionActivityManager with cycling also without having to reincorporate LocationManager + accelerometer just to try to test for cycling too?
See, there is a lot of confusion if you want to check whether the activity is cycling, because the classification depends mainly on the accelerometer. An accelerometer contains microscopic crystal structures that get stressed by accelerative forces, which causes a voltage to be generated, and the result is parsed from that voltage. So as far as I know, it just classifies your motion and speed and reports running, walking, or automotive. Cycling is sometimes very fast, very slow, or in between, so it may at times be classified as walking, running, or even automotive; the M7 cannot tell automotive, cycling, and running apart because there is not much speed variance while you cycle.
Even for running and walking it sometimes gives wrong results in some cases, so there is a chance that your app will give wrong information too.
One more thing you asked is
Note, this also does not include general transport options for things
like a Train. For instance, I commute an hour a day on the train.
Automotive could be made more generic at least, similar to how Moves
uses Transport.
So Apple is also working on other mapping features. Apple is said to be planning notable updates to its Maps app in iOS 8, and the company is currently working on implementing both public transit directions and indoor mapping features (which Google already has on iOS).
http://www.macrumors.com/2013/09/12/apple-working-to-leverage-new-m7-motion-sensing-chip-for-mapping-improvements/ (Useful Link)
So, not sure if you still need an answer to that, but here is the latest from the iOS 8 SDK:
@property(readonly, nonatomic) BOOL cycling NS_AVAILABLE(NA, 8_0);
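In Swift, reading that flag might look like this (a sketch, assuming iOS 8+ and a device with the motion coprocessor):

    import CoreMotion

    let manager = CMMotionActivityManager()
    if CMMotionActivityManager.isActivityAvailable() {
        manager.startActivityUpdates(to: .main) { activity in
            guard let activity = activity else { return }
            if activity.cycling {
                print("cycling (confidence \(activity.confidence.rawValue))")
            }
        }
    }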
In session 612 at WWDC 2014, the two presenting Apple engineers provided some information. In the slides they stated:
Performance is very sensitive to location
Works best if device is worn on upper arm
Longest latency
Best for retrospective use cases
In the video they explain on the audio track (starting at about 11:00) that
Cycling is new, something we introduced in iOS 8.
Cycling is very challenging, and again you need the dynamics and so
it's going to be very sensitive to location.
If it was mounted on the upper arm the latency is going to be fairly
reasonable.
And if it's anywhere else, it's going to take a lot longer. So definitely I would not suggest using cycling activity classification as a hint for the context here and now. It's really something that you'll want to use in a retrospective manner for a journaling app, for example.
I made a simple test setup with iOS 8 and 9 on an iPhone 5s and 6, and cycling was not detected, not a single time in over 1.5 h of cycling. Whether the new iPhone 6s makes good on this major deficit in motion activity detection is unclear; Phil Schiller announced it in September 2015.
tl;dr
Currently, cycling detection in Core Motion does not work the way it does for stationary, walking, running, and automotive! Cycling will often not be detected live and can be used retrospectively only.

Use similar technology as CMMotionActivity on older iPhones

I'm looking to use CMMotionActivity on the iPhone 5s, but I also want to be able to use similar functionality on older iPhones. Is this possible?
Could I create a less accurate alternative, maybe by tracking GPS and not using the M7 chip? Any advice/tutorials/sample code?
You can create your own algorithm which utilizes accelerometer data to estimate the number of steps taken. It's not as accurate, and it's not a good idea to have two separate pieces of logic in the same app.
In case you want to give it a try, check this answer: How to count steps using an Accelerometer?
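As a starting point, here is a rough Swift sketch of the idea in that answer (the 50 Hz rate and 0.3 g threshold are assumptions to tune, not established values): detect steps from raw accelerometer peaks, which also works on pre-M7 devices.

    import CoreMotion

    let motion = CMMotionManager()
    var lastMagnitude = 0.0
    var steps = 0

    if motion.isAccelerometerAvailable {
        motion.accelerometerUpdateInterval = 1.0 / 50.0     // sample at ~50 Hz
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Magnitude of the acceleration vector, in g.
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            // Crude peak detection; a real pedometer would filter and debounce.
            if abs(magnitude - lastMagnitude) > 0.3 {
                steps += 1
                print("steps: \(steps)")
            }
            lastMagnitude = magnitude
        }
    }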
