How precise is the timestamp property of the UITouch class in iOS? Milliseconds? Tens of milliseconds? I'm comparing an iPad's internal measurements with a custom touch detection circuit taped on the screen, and there is quite a bit of variability between the two (standard deviation ~ 15ms).
I've seen it suggested that the timestamp is discretized according to the frame refresh interval, but the distribution I'm getting looks continuous.
Prior to the iPad Air 2, iOS devices poll for touches at 60 Hz. The iPad Air 2 is the first device able to poll touches at 120 Hz.
So although the timestamp looks very precise (many digits after the decimal point), its effective resolution is limited by that polling rate.
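If you want to check the quantisation on your own device, here is a minimal sketch (the TouchTimingView subclass and the logging are mine, not from the post) that logs the gap between successive touch timestamps; if the deltas cluster near multiples of ~16.7 ms, the timestamps are tied to the 60 Hz poll.

```swift
import UIKit

// Hypothetical view for probing touch timing: logs the gap between
// successive touch timestamps. At 60 Hz polling the deltas should cluster
// near multiples of ~16.7 ms; at 120 Hz, near multiples of ~8.3 ms.
class TouchTimingView: UIView {

    private var lastTimestamp: TimeInterval?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        if let last = lastTimestamp {
            // UITouch.timestamp is in seconds since system boot.
            let deltaMs = (touch.timestamp - last) * 1000.0
            print(String(format: "delta: %.3f ms", deltaMs))
        }
        lastTimestamp = touch.timestamp
    }
}
```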
This WWDC video is the best one I've ever seen, and it explains all of this in detail:
https://developer.apple.com/videos/wwdc/2015/?id=233
Just because I'm curious: What kind of custom touch detection circuit do you have?
This question was asked 5 years ago here: "What is the precision of the UITouch timestamp in iOS?" I'm wondering whether anyone has further information, or whether things have moved on since then.
To summarize my very rough understanding, an app gets notified once every screen refresh cycle about any touch events, and the refresh cycle used to be 60Hz, and on some devices may be 120Hz.
This would suggest that there are two possibilities.
Option 1: the timestamp coincides with the screen refresh cycle, meaning it effectively has 60 Hz or 120 Hz resolution. That is, if you get a touch at 0 ms, the next timestamp you could possibly get from another touch would be roughly 16 ms later on a 60 Hz device, or 8 ms later on a 120 Hz device.
Option 2: alternatively, the touch hardware stores the (more or less) exact time of each tap in a buffer somewhere, and the refresh cycle then picks up all the timestamps that have occurred since the last cycle; those timestamps could fall anywhere within that period. So you could have a tap at 0 ms, or 5 ms, or 9, or whatever.
Obviously I'd prefer option 2 to be the case, because in my app I want to know the precise time that the user touched the screen, rather than a value rounded to the nearest 16 millisecond multiple.
Very grateful for any input - thanks!
I'm trying to detect if my iPhone has been thrown into the air. I've tried using Core Motion's acceleration API and its altitude API. However, because the axes are fixed to the phone, detecting the changes is incredibly difficult. Is there a better way to do what I want? Is it possible to speed up the refresh rate of the CMAltitude API?
In freefall, you should see your 3 accelerometer values go to 0. Even in a projectile type of fall (throwing), the phone is in freefall as soon as it leaves the thrower's hand.
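As a rough sketch of that idea (the 0.2 g threshold and the 100 Hz update rate are assumptions that would need tuning on a real device), you can watch the raw accelerometer magnitude drop towards zero with Core Motion:

```swift
import Foundation
import CoreMotion

// Minimal sketch: in freefall the measured acceleration magnitude drops
// towards 0 g (normally it sits at ~1 g from gravity), regardless of how
// the phone is oriented, so no axis juggling is needed.
let motionManager = CMMotionManager()

func startFreefallDetection() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 100.0  // 100 Hz (assumption)

    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Magnitude of the acceleration vector, in units of g.
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        if magnitude < 0.2 {  // well below 1 g => likely thrown or dropped
            print("Freefall detected")
        }
    }
}
```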
This white paper talks about using an MCU, but the concept carries over:
http://www.nxp.com/files/sensors/doc/app_note/AN3151.pdF
I have a problem with the iOS SDK: I can't find an API to slow down a video with continuous values.
I have made an app with a slider and an AVPlayer, and I would like to change the speed of the video, from 50% to 150%, according to the slider value.
So far I have only managed to change the speed of the video with discrete values, and by recompiling the video (to do that, I used the AVMutableComposition APIs).
Do you know if it is possible to change the speed continuously, and without recompiling?
Thank you very much!
Jery
The AVPlayer's rate property allows playback speed changes if the associated AVPlayerItem is capable of it (responds YES to canPlaySlowForward or canPlayFastForward). The rate is 1.0 for normal playback, 0 for stopped, and can be set to other values but will probably round to the nearest discrete value it is capable of, such as 2:1, 3:2, 5:4 for faster speeds, and 1:2, 2:3 and 4:5 for slower speeds.
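A minimal sketch of driving the rate from a slider (the controller class and the 0.5–1.5 slider range are mine, not from the question), with the capability checks mentioned above:

```swift
import AVFoundation
import UIKit

// Sketch: set AVPlayer.rate from a UISlider configured with
// minimumValue = 0.5 and maximumValue = 1.5. Whether a given rate actually
// takes effect depends on the item (canPlaySlowForward / canPlayFastForward)
// and, as noted above, it may be rounded to a nearby supported ratio.
class PlaybackSpeedController: NSObject {
    let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
    }

    @objc func sliderChanged(_ slider: UISlider) {
        let requestedRate = slider.value
        guard let item = player.currentItem else { return }

        if requestedRate == 1.0
            || (requestedRate < 1.0 && item.canPlaySlowForward)
            || (requestedRate > 1.0 && item.canPlayFastForward) {
            player.rate = requestedRate  // setting a non-zero rate also resumes playback
        }
    }
}
```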
With the older MPMoviePlayerController, and its similar currentPlaybackRate property, I found that it would take any setting and report it back, but would still round it to one of the discrete values above. For example, set it to 1.05 and you would get normal speed (1:1) even though currentPlaybackRate would say 1.05 if you read it. Set it to 1.2 and it would play at 1.25X (5:4). And it was limited to 2:1 (double speed), beyond which it would hang or jump.
For some reason, the iOS API Reference doesn't mention these discrete speeds; they were found by experimentation. They make some sense: since the hardware displays video frames at a fixed rate (e.g. 30 or 60 frames per second), some multiples are easier than others. Half speed can be achieved by showing each frame twice, and double speed by dropping every other frame. Dropping 1 out of every 3 frames gives you 150% (3:2) speed. But 105% is harder, requiring 1 out of every 21 frames to be dropped. Especially if this is done in hardware, you can see why they might have limited it to only certain multiples.
My question is about the iPhone accelerometer. Does the accelerometer measure acceleration or movement of the iPhone? What I mean is: if I hold the iPhone and go from 0 mph to 60 mph, I would expect the measure of acceleration to increase in value from 0 to 60, but once I reach 60, I expect the value to return to 0 since I am "no longer accelerating" but am moving at a constant speed. If instead the accelerometer measures motion, I would expect it to register 0 to 60 and continue to report changing values as I move forward at 60 mph. Sorry, I looked at a few books, wrote some code (the values seemed too small to give a recognizable result over short distances or speeds), and did a lot of web searching, and I am still trying to get an answer to this question.
Thanx!
A couple of points:
The accelerometer never reads zero because gravity is always with us and is an acceleration (and a good thing too). If it ever reads 0 you are in deep doo-doo - you have been cast adrift in space, or else you are in free-fall plunging towards the ground.
When you go from 0 to 60, acceleration does not "register 0 to 60". That isn't what acceleration is. It isn't speed; it's the rate of change in speed. Is this a Lamborghini or a VW Bug? They both go 0 to 60 but the acceleration might be very, very different.
You might need to read a little physics textbook here, but to make it simple, think of it as how hard you are being pressed back against the car seat. If you are not being pressed back, thrown from side to side, etc., then your horizontal acceleration is zero even if you are going 100 miles per hour.
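A small Core Motion sketch illustrating the distinction (the 50 Hz update rate is an arbitrary assumption): deviceMotion splits the signal into a constant ~1 g gravity component and userAcceleration, which is roughly zero at constant speed and nonzero only while speeding up, braking or turning.

```swift
import CoreMotion

// Sketch: log gravity and userAcceleration separately. gravity is always
// ~1 g pointing down; userAcceleration (the "pressed into the seat" part)
// is ~0 at constant speed, regardless of how fast you are moving.
let motionManager = CMMotionManager()

func startMotionLogging() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz (assumption)

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let g = motion.gravity            // constant ~1 g
        let u = motion.userAcceleration   // nonzero only while speeding up/braking/turning
        print("gravity: \(g.x), \(g.y), \(g.z)  user: \(u.x), \(u.y), \(u.z)")
    }
}
```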
I think the following links clarify what the iPhone's accelerometer does, and does not, measure:
http://electronics.howstuffworks.com/iphone-accelerometer.htm
http://www.engadget.com/2012/05/22/the-engineer-guy-shows-how-a-smartphone-accelerometer-works/
In image processing applications, what is considered real time? Is 33 fps real time? Is 20 fps? If 33 and 20 fps are considered real time, then are 1 or 2 fps also real time?
Can anyone shed some light on this?
In my experience, it's a pretty vague term. Often, what is meant is that the algorithm will run at the rate of the source (e.g. a camera) supplying the images; however, I would prefer to state this explicitly ("the algorithm can process images at the frame rate of the camera").
Real time image processing = produce output simultaneously with the input.
The input may be 25 fps, but you may choose to process only 1 of every 5 frames (so 5 fps of processing), and your application is still real time (there's a sketch of this after the examples below).
TV streaming software: all the frames are processed.
Security application with CCTV security cams as input: you may choose to skip some frames to stay within your performance budget.
3d game or simulation: fps changes depending on the current scene.
And they are all real time.
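Here is a minimal sketch of the frame-skipping case (the delegate class and the every-5th-frame choice are assumptions, not from the answer): the camera keeps delivering frames at its own rate, and the app simply processes a subset of them while still keeping pace with the input.

```swift
import AVFoundation

// Sketch: an AVCaptureVideoDataOutput delegate that processes only every
// 5th frame. The camera delivers frames at its own rate (say 25-30 fps);
// processing a subset keeps the app "real time" in the sense above, because
// output still keeps pace with the input.
class FrameSkippingProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private var frameCount = 0
    private let processEveryNth = 5  // tune to match your processing budget

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        frameCount += 1
        guard frameCount % processEveryNth == 0 else { return }  // skip 4 of every 5 frames

        process(sampleBuffer)
    }

    private func process(_ sampleBuffer: CMSampleBuffer) {
        // Placeholder for the actual (possibly expensive) image processing.
    }
}
```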
Strictly speaking, I would say real-time means that the application is generating images based on user input as it occurs, e.g. a mouse movement which changes the facing of an avatar.
How successful it is at this task - 1 fps, 10 fps, 100 fps, etc - is actually another question.
Real-time describes an approach, not a performance metric.
If however you ask what is the slowest fps which passes as usable by a human, the answer is about 15, I think.
I think it depends on what the real-time application is. If the app is showing a slideshow with 1 picture every 3 seconds, and the app can process a picture within those 3 seconds and show it, then it is real-time processing.
If the movie is 29.97 frames per second, and the app can process all 29.97 frames within the second, then it is also real time.
An example: if an app can take the movie from a VCR's or cable box's analog output, compress it into 29.97 frames-per-second video, and also send all that info to a remote location for another person to watch, then it is real-time processing.
(Hard) Real time is when an outcome has no value when delivered too early or too late.
Any FPS is real time provided that displayed frames represent what should be displayed at the very instant they are displayed.
The notion of real-time display is not really tied to a specific frame rate - it could be defined as the minimum frame rate at which movement is perceived as being continuous. So for slow moving objects in a visual frame (e.g. ships in a harbour, or stars in the night sky) a relatively slow frame rate might suffice, whereas for rapid movement (e.g. a racing car simulator) a much higher frame rate would be needed.
There is also a secondary consideration of latency. A real-time display must have sufficiently low latency in relation to other events (e.g. behaviour of a real-time simulation) that there is no perceptible lag in display updates.
That's not actually an easy question (even without taking into account differences between individuals).
Wikipedia has a good article explaining why. For what it's worth, I think cinema films run at 24fps so, if you're happy with that, that's what I'd consider realtime.
It depends on what exactly you are trying to do. For some purposes 1 fps, or even 2 spf (seconds per frame), could be considered real-time. For others that's way too slow...
That said, real-time means that it takes as long (or less) to process x frames as it would take to just present those x frames.
It depends.
automatic aircraft cannon - 1000 fps
monitoring - 10 - 15 fps
authentication - 1 fps
medical devices - 1 fph (one frame per hour)
I guess the term is used with different meanings in different contexts. In industrial image processing, real-time processing is usually the opposite of offline processing. In offline processing applications, you record images (many of them) and process them at a later time. In real-time processing, the system that acquires the images also processes them, at the same time, so the processing has to keep up with the acquisition frame rate.
Real-time means your implementation is fast enough to meet some deadline. The deadline is part of your system's specification. If it's an interactive UI and the users are not too picky, 15Hz update can be OK, although it can feel laggy. If you're using it to drive a car along the motorway 30Hz is about right. If it's a missile, well, maybe 100Hz?
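As a small illustration of the deadline view (the 30 Hz budget is just an assumption, matching the driving example above), you can time each frame's processing and compare it against the budget:

```swift
import QuartzCore

// Sketch: measure one frame's processing time and compare it to a deadline.
let deadline: CFTimeInterval = 1.0 / 30.0  // seconds per frame at 30 Hz (assumption)

func processFrameMeetingDeadline(_ work: () -> Void) {
    let start = CACurrentMediaTime()
    work()  // the actual image processing
    let elapsed = CACurrentMediaTime() - start

    if elapsed > deadline {
        // Missed the deadline: for a soft real-time UI this just means a
        // dropped/late frame; for hard real-time it means the result is useless.
        print(String(format: "Deadline missed: %.1f ms > %.1f ms",
                     elapsed * 1000, deadline * 1000))
    }
}
```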