iOS: dynamically slow down the playback of a video with continuous values - ios

I have a problem with the iOS SDK. I can't find the API to slow down a video with continuous values.
I have made an app with a slider and an AVPlayer, and I would like to change the speed of the video, from 50% to 150%, according to the slider value.
As for now, I have only managed to change the speed of the video with discrete values, and only by recompiling the video. (In order to do that, I used the AVMutableComposition APIs.)
Do you know if it is possible to change the speed continuously, and without recompiling?
Thank you very much!
Jery

The AVPlayer's rate property allows playback speed changes if the associated AVPlayerItem is capable of it (responds YES to canPlaySlowForward or canPlayFastForward). The rate is 1.0 for normal playback, 0 for stopped, and can be set to other values but will probably round to the nearest discrete value it is capable of, such as 2:1, 3:2, 5:4 for faster speeds, and 1:2, 2:3 and 4:5 for slower speeds.
With the older MPMoviePlayerController, and its similar currentPlaybackRate property, I found that it would take any setting and report it back, but would still round it to one of the discrete values above. For example, set it to 1.05 and you would get normal speed (1:1) even though currentPlaybackRate would say 1.05 if you read it. Set it to 1.2 and it would play at 1.25X (5:4). And it was limited to 2:1 (double speed), beyond which it would hang or jump.
For some reason, the iOS API Reference doesn't mention these discrete speeds. They were found by experimentation. They make some sense. Since the hardware displays video frames at a fixed rate (e.g.- 30 or 60 frames per second), some multiples are easier than others. Half speed can be achieved by showing each frame twice, and double speed by dropping every other frame. Dropping 1 out of every 3 frames gives you 150% (3:2) speed. But to do 105% is harder, dropping 1 out of every 21 frames. Especially if this is done in hardware, you can see why they might have limited it to only certain multiples.
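As a rough sketch of how this could be wired up (assuming a UISlider configured for the 0.5 to 1.5 range and an AVPlayer that already has an item loaded; the class and action names here are placeholders, not from any Apple sample):

import UIKit
import AVFoundation

class PlayerViewController: UIViewController {
    var player: AVPlayer!   // assumed to be created elsewhere with an AVPlayerItem

    // Slider assumed to have minimumValue 0.5 and maximumValue 1.5.
    @IBAction func speedSliderChanged(_ sender: UISlider) {
        guard let item = player.currentItem else { return }
        // Rates below 1.0 need canPlaySlowForward; rates above 1.0 need canPlayFastForward.
        let slowOK = sender.value >= 1.0 || item.canPlaySlowForward
        let fastOK = sender.value <= 1.0 || item.canPlayFastForward
        if slowOK && fastOK {
            player.rate = sender.value   // may still be rounded to a supported ratio
        }
    }
}

Note that assigning a non-zero rate also starts playback, so this only makes sense while the video is meant to be playing.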

Related

Does AVPlayer auto adjust when playing a m3u8 playlist?

After spending some time setting up a transcoding process on AWS, I am finding that the loading times for videos have not been lowered as expected with HLS (m3u8).
It seems that if I am using AVPlayer directly, without AVPlayerViewController, I may need to manage the video stream quality myself? My understanding was that if I had an m3u8, things would be done automatically and the best quality would be used depending on network conditions / device / etc?
So far it seems that the loading times are the same if not slightly worse than without the m3u8 if AVPlayer is used as is.
To better understand what's going on I've been trying out a few things.
1) While doing this has worked to reduce loading times, I would prefer to do a bit more than just lower the bitrate all the way when not on wifi:
self.player?.currentItem?.preferredPeakBitRate = 1
This seems to give me a pretty low quality video but it loads pretty quickly. I have yet to figure out how to detect the actual bitrate being used though (since setting this value has improved loading times dramatically, I am going to assume AVPlayer does not handle the adjustments on its own?).
2) Also, haven't had any luck with (causes infinite spinner, even with the preferredPeakBitRate set to 1):
self.player.automaticallyWaitsToMinimizeStalling = false
3) I am open to using a third party library that might handle this, found something called VKVideoPlayer that might do some of this?
Thanks
It's possible in iOS 8 and onwards.
Following copied from Apple's documentation:
The desired limit, in bits per second, of network bandwidth consumption for this item.
Swift: var preferredPeakBitRate: Double
Objective-C: @property(nonatomic) double preferredPeakBitRate
Set preferredPeakBitRate to non-zero to indicate that the player should attempt to limit item playback to that bit rate, expressed in bits per second.
If network bandwidth consumption cannot be lowered to meet the preferredPeakBitRate, it will be reduced as much as possible while continuing to play the item.
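A minimal Swift sketch of how this might be used, plus one way to check what the player actually delivered afterwards via the item's access log (the bitrate figure and the helper names are placeholders made up for illustration):

import AVFoundation

// Hypothetical helper: cap the stream at roughly 1.5 Mbit/s, or lift the cap with 0.
func applyBitRateCap(to item: AVPlayerItem, capped: Bool) {
    item.preferredPeakBitRate = capped ? 1_500_000 : 0   // bits per second; 0 = no limit
}

// Hypothetical helper: inspect what was actually played, from the access log.
func printLastBitrates(of item: AVPlayerItem) {
    if let event = item.accessLog()?.events.last {
        print("indicated:", event.indicatedBitrate, "observed:", event.observedBitrate)
    }
}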

How to dynamically change the playback rate of video in iOS?

The perfect example of what I am trying to achieve can be seen in the Flow ● Slow and Fast Motion app.
One can change the playback rate of the video by dragging points on the curve up or down. The video can also be saved in this state.
I am looking for a way to dynamically speed up/slow down a video, so that the playback rate can be changed while the video is being played.
Video explanation
WHAT I'VE TRIED
The rate property of AVPlayer. But it only works with a few playback rate values (0.50, 0.67, 0.80, 1.0, 1.25, 1.50, and 2.0), and one cannot save the video.
The scaleTimeRange(..) method of AVMutableComposition. But it doesn't work when you want to ramp the video into gradually increasing or decreasing slow/fast motion.
Displaying video frames on screen using CAEAGLLayer and CADisplayLink. But my many attempts at achieving slow/fast motion this way have been unsuccessful.
All this has taken me months and I'm starting to doubt if I'll be able to accomplish this at all.
Thus any suggestion would be immensely valuable.
In iOS, the MPNowPlayingInfoCenter object contains a 'nowPlayingInfo' dictionary whose contents describe the item being played. It is advised that you start the playback at the 'currentPlaybackRate' and then set the speed. See this thread on the developer's forum.
You might possibly end up with something like this (but this is JavaScript), where the playback rate of the video has been sped up to 4x.
document.querySelector('video').playbackRate = 4.0;
document.querySelector('video').play();
video { width: 400px; height: auto; }
<video controls preload="true" autoplay>
<source src="http://www.rachelgallen.com/nature.mp4" type="video/mp4" >
</video>
So I'm not sure I fully understand the use case you're going for, but I think
func setRate(_ rate: Float,
time itemTime: CMTime,
atHostTime hostClockTime: CMTime)
[Apple Documentation Source]
is something that you're looking for. While this may not be exactly what you need, and I'm not sure where in the docs there is exactly what you're looking for, with the above method alone you could do the following to save videos at a variable rate:
Use the above method to play the video throughout (assuming it's not too long, otherwise this will be computationally impossible/timeout-worthy on some devices) at the desired rates each second. Design a UI to adjust this per-second rate.
Under the hood you can actually play the video at that speed "frame by frame" and capture the frames you want (in the right number, which will give you the rate you desire), and voila: by saving the right number of frames together (skipping/duplicating as needed to raise/lower the desired rate based on the "picker" UI) you've now accomplished what you desire.
To be clear, what I'm talking about here is that a video output at 60 FPS has 60 frames per second. You would literally "cut and paste" frames from the source video into the "destination" video based on whatever UI stepper values you receive from your user (using the screenshotted example the question contains as my basis), and pick up that many frames. That is, if the user says seconds 2-10 of their 20-second video should be at 2X, only put in 30 frames for each of those seconds (if filmed at 60 FPS), taking every other frame. The output will, at 60 FPS, seem like 2X speed (since there are now 30 frames per 1 second of original video, which is 0.5 seconds at 60 FPS). Similarly, any value can appropriately be factored into:
(desired consistent FPS) = (source video FPS) = (destination video FPS) (e.g. 60 or 90)
(rate) = (rate picked from the UI steppers/graph for each time interval) (e.g. 1X / 2X / 0.25X)
(frames kept per second of source) = (desired consistent FPS) / (rate)
(destination video frames) = (source video frames), resampled per custom time interval according to that rate
The exact mechanisms for ^^ might actually be built into AVPlayer and I didn't find the details, but this alone should be a good start to get you going in that direction.
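If you would rather stay at the composition level than copy frames by hand, one way to express the same per-interval idea is to apply scaleTimeRange segment by segment on an AVMutableComposition. This is only a sketch of an approximation (many small segments with stepped rates stand in for a smooth ramp), not something the answer above or the docs prescribe, and the helper name is made up:

import AVFoundation

// Hypothetical helper: each entry scales a range of the ORIGINAL timeline by a rate
// (2.0 = twice as fast, 0.5 = half speed). Ranges must not overlap.
func rampedComposition(from asset: AVAsset,
                       segments: [(range: CMTimeRange, rate: Double)]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    try composition.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                    of: asset, at: .zero)
    // Scale from the last segment backwards so earlier ranges keep their original offsets.
    for segment in segments.sorted(by: { $0.range.start > $1.range.start }) {
        let newDuration = CMTimeMultiplyByFloat64(segment.range.duration,
                                                  multiplier: 1.0 / segment.rate)
        composition.scaleTimeRange(segment.range, toDuration: newDuration)
    }
    return composition
}

The result can then be played with AVPlayer or written out with AVAssetExportSession, which covers the "save the video" part; the finer the segments, the closer this gets to a continuous ramp.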

Is there a way to program games without depending on frame rate?

I'm programming an iOS game and I use the update method for a lot of things. It is called at the game's refresh rate (for the moment 60 times per second), but the problem is that if the frame rate drops (because of a notification, for example, or any behaviour in the game that makes the fps drop a little when it is triggered), then the bugs come....
A quick example: if I have an animation of 80 pictures, 40 for the jump up and 40 for the fall, I would need 1.2 seconds to run the animation, so if the jump takes 1.2 seconds it would be OK, the animation would run. But if my fps drops to 30, then the animation would get cut off, because it would need 2.4 seconds to run while the jump still takes 1.2 seconds. This is only a quick example; there are a lot of unexpected behaviours in the game if the frame rate drops. So my question is: do game developers really depend that much on the frame rate, or is there a way to avoid these fps bugs (another way to program, or any trick)?
Base your timing on the time, rather than the frame count. So, save a time stamp on each frame, and on the next frame, calculate how much time has elapsed, and based on your ideal frame rate, figure out how many frames of animation to advance. At full speed, you shouldn’t notice a difference, and when the frame rate drops, your animations may get jerky but the motion will never get more than 1 frame behind where it should be.
Edit: as uliwitness points out, be careful what time function you use, so you don’t encounter issues when, for example, the computer goes to sleep or the game pauses.
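A small SpriteKit-flavoured sketch of that idea (the node, the speed constant, and the clamping value are made up for the example):

import SpriteKit

class GameScene: SKScene {
    private var lastUpdateTime: TimeInterval = 0
    private let pointsPerSecond: CGFloat = 200        // assumed movement speed
    private let playerNode = SKSpriteNode(color: .red, size: CGSize(width: 20, height: 20))

    override func didMove(to view: SKView) {
        addChild(playerNode)
    }

    override func update(_ currentTime: TimeInterval) {
        // No previous timestamp yet on the very first frame.
        let rawDelta = lastUpdateTime == 0 ? 0 : currentTime - lastUpdateTime
        lastUpdateTime = currentTime
        // Clamp so a pause or a long hiccup doesn't teleport everything (see the edit above).
        let delta = min(rawDelta, 1.0 / 15.0)
        // Movement depends on elapsed time, not on how many frames were drawn.
        playerNode.position.x += pointsPerSecond * CGFloat(delta)
    }
}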
Always use the delta value in your update method. This is platform and engine independent. Multiply any speed or change value by the delta value (the time interval between the current and the last frames).
In the case of the animation, one way to fix the issue could be to advance the animation counter by delta multiplied by the inverse of the expected frame interval (i.e. the expected frame rate). Then round this value to get the correct image for the animation.
// currentFrame is a float ivar set to 0 at the beginning of the animation.
currentFrame = currentFrame + delta * 60.0;
int imageIndex = roundf(currentFrame);
However, with Sprite Kit there is a better way to do this kind of animation, as there is a premade SKAction dealing with sprite animation.
[SKAction animateWithTextures:theTextures timePerFrame:someInterval];
With this solution you don't have to deal with timing the images at all. The engine will do that for you.
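In Swift the equivalent would presumably look something like this (the texture names and the 1.2 s / 80-frame figures come from the question's example, and jumpNode is a made-up SKSpriteNode):

let frames = (1...80).map { SKTexture(imageNamed: "jump_\($0)") }   // hypothetical asset names
jumpNode.run(SKAction.animate(with: frames, timePerFrame: 1.2 / 80.0))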
There's a great discussion about FPS-based and Time-based techniques here:
Why You Should be Using Time-based Animation and How to Implement it
It's the best in my opinion: very complete, easy to follow, and it provides JSFiddle examples. I translated those examples to C++/Qt.

AudioQueue's Peak Power Progression is too Gradual

I'm using the standard Apple AudioQueue Services from the AudioToolBox library to use my device's microphone. I am keeping track of the peak audio level in real time, and this is working well except it does not respond as expected to uneven changes in volume. The mPeakPower jumps appropriately high for sudden increases in volume (no problem there), but it decreases only gradually: if I feed my device a loud clap followed by silence it jumps up high, then immediately starts to steadily decrease until it reaches the actual current volume (this decrease can take up to 2 seconds). Ideally, a loud sound followed by silence should cause a spike which would immediately fall back to the ambient volume. I need to quickly respond to both increases and decreases in volume. Any insights? I assume this is happening because of a smoothing algorithm, but how can I get around it? Should I be using a different library?
Here's a snippet of my code in the AudioQueueInputCallback method.
AudioQueueLevelMeterState meters[2];
UInt32 dlen = sizeof(meters);
AudioQueueGetProperty(myQueue, kAudioQueueProperty_CurrentLevelMeterDB, meters, &dlen);
NSNumber *ambientVolume = [NSNumber numberWithFloat:meters[0].mPeakPower];
Thanks!
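For reference, the snippet above only returns useful values if level metering was enabled when the queue was set up; a minimal Swift-flavoured sketch of that one step, where myQueue stands for the AudioQueueRef created elsewhere:

import AudioToolbox

// Metering must be switched on, otherwise CurrentLevelMeterDB reports nothing useful.
var meteringEnabled: UInt32 = 1
AudioQueueSetProperty(myQueue,
                      kAudioQueueProperty_EnableLevelMetering,
                      &meteringEnabled,
                      UInt32(MemoryLayout<UInt32>.size))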

In image processing, what is real time?

In image processing applications, what is considered real time? Is 33 fps real time? Is 20 fps real time? If 33 and 20 fps are considered real time, then is 1 or 2 fps also real time?
Can anyone throw some light on this?
In my experience, it's a pretty vague term. Often, what is meant is that the algorithm will run at the rate of the source (e.g. a camera) supplying the images; however, I would prefer to state this explicitly ("the algorithm can process images at the frame rate of the camera").
Real time image processing = produce output simultaneously with the input.
The input may be 25 fps but you may choose to process 1 of every 5 frames (that makes 5 fps processing) and your application is still real time.
TV streaming software: all the frames are processed.
Security application and the input is CCTV security cams: you may choose to skip some frames to fit the performance.
3d game or simulation: fps changes depending on the current scene.
And they are all real time.
Strictly speaking, I would say real-time means that the application is generating images based on user input as it occurs, e.g. a mouse movement which changes the facing of an avatar.
How successful it is at this task - 1 fps, 10 fps, 100 fps, etc - is actually another question.
Real-time describes an approach, not a performance metric.
If however you ask what is the slowest fps which passes as usable by a human, the answer is about 15, I think.
I think it depends on what the real-time application is. If the app is showing slideshows with 1 picture every 3 seconds, and the app can process 1 picture within those 3 seconds and show it, then it is real time processing.
If the movie is 29.97 frames per second, and the app can process all 29.97 frames within the second, then it is also real time.
An example is, if an app can take the movie from a VCR or Cable's analog output, and compress it into 29.97 frames per second video and also send all that info to a remote area for another person to watch, then it is real time processing.
(Hard) Real time is when an outcome has no value when delivered too early or too late.
Any FPS is real time provided that displayed frames represent what should be displayed at the very instant they are displayed.
The notion of real-time display is not really tied to a specific frame rate - it could be defined as the minimum frame rate at which movement is perceived as being continuous. So for slow moving objects in a visual frame (e.g. ships in a harbour, or stars in the night sky) a relatively slow frame rate might suffice, whereas for rapid movement (e.g. a racing car simulator) a much higher frame rate would be needed.
There is also a secondary consideration of latency. A real-time display must have sufficiently low latency in relation to other events (e.g. behaviour of a real-time simulation) that there is no perceptible lag in display updates.
That's not actually an easy question (even without taking into account differences between individuals).
Wikipedia has a good article explaining why. For what it's worth, I think cinema films run at 24 fps, so if you're happy with that, that's what I'd consider realtime.
It depends on what exactly you are trying to do. For some purposes 1 fps or even 2 spf (seconds per frame) could be considered real-time. For others that's way too slow...
That said, real-time means that it takes as long (or less) to process x frames as it would take to just present those x frames.
It depends.
automatic aircraft cannon - 1000 fps
monitoring - 10 - 15 fps
authentication - 1 fps
medical devices - 1 fph
I guess the term is used with different meanings in different contexts. In industrial image processing, real time processing is usually the opposite of offline processing. In offline processing applications, you record images (many of them) and process them at a later time. In real time processing, the system that acquires the images also processes them, at the same time, so the processing has to keep up with the acquisition frame rate.
Real-time means your implementation is fast enough to meet some deadline. The deadline is part of your system's specification. If it's an interactive UI and the users are not too picky, 15Hz update can be OK, although it can feel laggy. If you're using it to drive a car along the motorway 30Hz is about right. If it's a missile, well, maybe 100Hz?
