I'm trying to scrub through a video frame by frame. For this I found multiple options:
Encode the video for every frame to be a keyframe
[avplayer seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero]
stepForward/stepBackward
Re-encoding the video is not an option, so I have to stick with the other two. Each has its own problem:
Default-tolerance seeking jumps between keyframes
Zero-tolerance seeking is precise, but the loading times are long
Both problems are a no-go.
In AVPlayerItem I've found canStepBackward and canPlayReverse, but I can't find anywhere the conditions under which these properties return YES.
Does anyone know these conditions?
Thanks
Try these:
in .h file
AVPlayer *avPlayer;
in .m file
[avPlayer.currentItem canPlayReverse];
[avPlayer.currentItem canStepBackward];
These are read-only BOOL properties; when the loaded media supports the operation they are already YES, i.e. already true. You just have to query them on your AVPlayerItem object.
You can also try setting the player rate:
avPlayer.rate = -1.0; // reverse playback
avPlayer.rate = 1.0;  // normal forward playback
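Putting the pieces together, a minimal sketch (assuming avPlayer already holds a fully loaded AVPlayerItem) would check the capability flags before touching the rate:

```objc
// Sketch: only drive the rate negative if the item reports support.
// Assumes avPlayer is an AVPlayer whose current item has finished loading.
if (avPlayer.currentItem.canPlayReverse) {
    avPlayer.rate = -1.0; // continuous reverse playback
} else if (avPlayer.currentItem.canStepBackward) {
    [avPlayer.currentItem stepByCount:-1]; // fall back to single-frame steps
}
```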
I'm playing video with AVPlayer and listening to the loadedTimeRanges property. For a video about 10 minutes long, AVPlayer always preloads a large portion, which feels very costly. Is there a way to limit the size of the preloaded area, such as preloading only half of the video's duration?
I think you're looking for AVPlayerItem's preferredForwardBufferDuration property.
Per Apple:
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
See https://developer.apple.com/reference/avfoundation/avplayeritem/1643630-preferredforwardbufferduration?language=objc
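As a sketch of how you might apply it (the 300-second target is an illustrative value for "half of a 10-minute video", and videoURL is a placeholder; the property is available on iOS 10 and later):

```objc
// Sketch: cap forward buffering at roughly half the video's duration.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:videoURL]; // videoURL is hypothetical
item.preferredForwardBufferDuration = 300.0; // seconds; 0 lets the player decide
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
```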
If you use a resourceLoader delegate (AVAssetResourceLoaderDelegate) you can control the exact amount of content that gets preloaded/downloaded prior to playback.
The code example here is a good start - AVPlayer stalling on large video files using resource loader delegate
Basically, you would have to maintain an array of pending requests and process them one by one by firing URLSession data tasks until you have downloaded as much content as you would like to preload.
Cheers.
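A rough sketch of the delegate side (the pendingRequests array and the startNextDataTaskIfNeeded helper are illustrative names, not part of AVFoundation):

```objc
// Sketch: queue incoming loading requests and answer them asynchronously
// from URLSession callbacks, so you decide how much gets downloaded.
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    [self.pendingRequests addObject:loadingRequest]; // pendingRequests: NSMutableArray property
    [self startNextDataTaskIfNeeded]; // hypothetical helper that fires a URLSession dataTask
    return YES; // we will fulfil (or cancel) the request later
}
```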
I am attempting to stitch together video assets using AVComposition based on the code here:
https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html
On OS X it works perfectly, however on iOS, when playing back via AVPlayer, it only works with 1 or 2 input clips. If I attempt to add a third, nothing is played back on the AVPlayerLayer. Weirdly, if I observe the AVPlayer playback time using addPeriodicTimeObserverForInterval:, the video appears to be playing for the correct duration, but nothing shows on the layer.
Does anyone have any insight into why this would be?
Turns out I was creating CMTime objects with differing timeScale values, which was causing rounding errors and creating gaps in my tracks. If a track had a gap then it would just fail to play. Ensuring that all my CMTime objects had the same timeScale made everything work perfectly.
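For illustration, here is a sketch of what "same timescale everywhere" means when inserting clips (600 is a timescale commonly recommended for video; assets and track are placeholders for your clips and composition track):

```objc
// Sketch: convert every duration to one shared timescale before inserting,
// so consecutive clips butt up against each other with no rounding gaps.
static const int32_t kTimescale = 600;

CMTime cursor = kCMTimeZero;
for (AVAsset *asset in assets) {
    CMTime duration = CMTimeConvertScale(asset.duration, kTimescale,
                                         kCMTimeRoundingMethod_Default);
    [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration)
                   ofTrack:[asset tracksWithMediaType:AVMediaTypeVideo].firstObject
                    atTime:cursor
                     error:NULL];
    cursor = CMTimeAdd(cursor, duration);
}
```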
I have implemented AVPlayerItem's stepByCount: method to manually go through a video frame by frame. Here's what it looks like for stepping forward by 1:
AVPlayer *player = [AVPlayer playerWithURL:url];
[player.currentItem stepByCount:1];
And backward 1 step
AVPlayer *player = [AVPlayer playerWithURL:url];
[player.currentItem stepByCount:-1];
The forward step (going forward in time frame by frame) works well. However, when I try to go backward frame by frame, it's not as smooth as the forward step. Am I missing something? Or is this inherent in the way videos are encoded--meant to be viewed forward but not backward?
You can check to see if the AVPlayerItem supports stepping:
if (playerItem.canStepBackward)
{
    [playerItem stepByCount:numberOfFrames];
}
else
{
    // Do our best here...
    [player seekToTime:CMTimeSubtract(player.currentTime, CMTimeMake(ABS(numberOfFrames)*1000, compositionFPS*1000))
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero];
}
Not sure what encoding format your video is in, but compressed videos are encoded with keyframes. These occur either at regular intervals (like once a second) or when the scene changes (like a cut from close-up to wide shot). The data between keyframes just describes the cumulative changes since the last keyframe: only pixels that change are noted.

As you have deduced, this means that a video is not designed to be played in reverse. When you skip backwards, the decoder needs to jump back to the previous keyframe (which could be hundreds or even thousands of frames before your current position), then recreate all the frames up to your required frame before actually presenting you with the composited frame.

The only way around this in your situation is to either use an already flattened video format or to flatten it before scanning. You might want to take a look at http://opencv.org/ if you want to get into decompression.
How can I create a crossfade between two sound tracks with AVAudioPlayer?
Do I need to use AVAudioMix? If so, how should I use it?
AVAudioPlayer does not support using an AVAudioMix. You could try to do the fades yourself by directly setting the playback volume, but getting the timing right between two AVAudioPlayers will be difficult, as AVAudioPlayer is known to have very high and unpredictable latency when starting to play.
One way to accomplish this is to use AVPlayer with an AVPlayerItem and an AVComposition. You can set up the composition to overlap the two audio files by your desired amount, and set up an AVAudioMix to fade out the first and fade in the second. This method gives you precise control over when the audio files play relative to one another.
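As a sketch (trackA, trackB, fadeRange, and composition are placeholders: two audio tracks already inserted into an AVMutableComposition so that they overlap, and the time range of that overlap):

```objc
// Sketch: ramp the first track down and the second track up across the overlap.
AVMutableAudioMixInputParameters *fadeOut =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:trackA];
[fadeOut setVolumeRampFromStartVolume:1.0 toEndVolume:0.0 timeRange:fadeRange];

AVMutableAudioMixInputParameters *fadeIn =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:trackB];
[fadeIn setVolumeRampFromStartVolume:0.0 toEndVolume:1.0 timeRange:fadeRange];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[fadeOut, fadeIn];

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
item.audioMix = audioMix; // the mix only takes effect through the player item
[[AVPlayer playerWithPlayerItem:item] play];
```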
I want to play video backward in AVPlayer. I have tried setting the rate property to -1.0, and although it did work, it was not smooth. Is there any way to play videos backward smoothly?
As stated in the comments, the problem is with keyframes and the fact that most codecs are not designed to play backwards. There are two options for re-encoding the video that don't require you to actually reverse the video in an editor:
Make every frame a keyframe. I've seen this work well for codecs like H.264 that rely on keyframes. Basically if every frame is a key frame, then each frame can be decoded without relying on any previous frames so it's effectively the same as playing forward.
Use a codec that doesn't distinguish between keyframes and non-keyframes (effectively, every frame is a keyframe). Photo JPEG is one such option, although I'm not completely sure whether it plays back on iOS. I would think so. It works great on a Mac.
Note that either of these options will result in larger file sizes compared to typical "keyframe every x frames" encoded video.
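If you go the all-keyframes route, the re-encode would typically happen off-device; assuming ffmpeg with libx264 is available, something like the following would do it (the exact flags are an illustration, not from the original answer):

```sh
# -g 1 forces a GOP size of 1, i.e. every frame becomes a keyframe.
ffmpeg -i input.mp4 -c:v libx264 -g 1 -c:a copy output.mp4
```

Expect the output file to be considerably larger than the input.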
You have to seek to the end of the current item and then set the rate to a negative value. Something like this:
-(void)reversePlay
{
    CMTime durTime = myPlayer.currentItem.asset.duration;
    if (CMTIME_IS_VALID(durTime))
    {
        [myPlayer seekToTime:durTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
        [myPlayer setRate:-1.0];
    }
    else
    {
        NSLog(@"Invalid time");
    }
}
source: https://stackoverflow.com/a/16104363/701043