AVAssetWriter AVVideoExpectedSourceFrameRateKey (frame rate) ignored - iOS

My team and I are trying to re-encode a video file to give it a more "GIF-like" feel by changing the video frame rate. We are using the following properties for the AVAssetWriterInput:
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoHeightKey: videoTrack.naturalSize.height,
    AVVideoWidthKey: videoTrack.naturalSize.width,
    AVVideoCompressionPropertiesKey: [AVVideoExpectedSourceFrameRateKey: NSNumber(value: 12)]
]
But the output video keeps playing at the normal frame rate (played using AVPlayer).
What is the right way to reduce the video frame rate (to 12, for example)?
Any pointer in the right direction would be HIGHLY appreciated. We're stuck.
Best regards,
Roi

You can control the timing of each sample you append to your AVAssetWriterInput directly with CMSampleBufferCreateCopyWithNewTiming.
You need to adjust the timing in the CMSampleTimingInfo you provide.
Retrieve the current timing info with CMSampleBufferGetOutputSampleTimingInfoArray, then go over each sample: compute the duration that gives you 12 frames per second, and adjust the presentation and decode timestamps to match that new duration.
Then make your retimed copy and feed it to your writer's input.
Let's say you have existingSampleBuffer:
CMSampleBufferRef sampleBufferToWrite = NULL;
CMSampleTimingInfo sampleTimingInfo = {0};
CMSampleBufferGetSampleTimingInfo(existingSampleBuffer, 0, &sampleTimingInfo);
// modify duration & presentationTimeStamp
sampleTimingInfo.duration = CMTimeMake(1, 12); // or whatever frame rate you desire
sampleTimingInfo.presentationTimeStamp = CMTimeAdd(previousPresentationTimeStamp, sampleTimingInfo.duration);
previousPresentationTimeStamp = sampleTimingInfo.presentationTimeStamp; // should be initialised before passing here the first time
OSStatus status = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, existingSampleBuffer, 1, &sampleTimingInfo, &sampleBufferToWrite);
if (status == noErr) {
    // you can write sampleBufferToWrite
}
I'm making some assumptions in this code:
- the sample buffer contains only one sample
- the sample buffer contains uncompressed video (otherwise, you need to handle decodeTimeStamp as well)
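Since the question is in Swift, here is a minimal sketch of the same idea in Swift. It assumes you already have a running AVAssetReaderTrackOutput (readerOutput) delivering decoded frames in presentation order and an AVAssetWriterInput (writerInput) attached to a started AVAssetWriter session; those names, and targetFPS, are placeholders, not part of the original answer.

    import AVFoundation
    import CoreMedia
    import Foundation

    // Assumed to exist elsewhere: readerOutput (AVAssetReaderTrackOutput) and
    // writerInput (AVAssetWriterInput), both attached to started reader/writer sessions.
    let targetFPS: Int32 = 12
    let frameDuration = CMTimeMake(value: 1, timescale: targetFPS)
    var nextPTS = CMTime.zero

    while let buffer = readerOutput.copyNextSampleBuffer() {
        // New timing: constant duration, monotonically increasing presentation time.
        var timing = CMSampleTimingInfo(duration: frameDuration,
                                        presentationTimeStamp: nextPTS,
                                        decodeTimeStamp: .invalid)
        var retimed: CMSampleBuffer?
        let status = CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                                           sampleBuffer: buffer,
                                                           sampleTimingEntryCount: 1,
                                                           sampleTimingArray: &timing,
                                                           sampleBufferOut: &retimed)
        if status == noErr, let retimed = retimed {
            // Crude back-pressure; in production use requestMediaDataWhenReady(on:using:).
            while !writerInput.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
            if !writerInput.append(retimed) { break } // stop on writer error
        }
        nextPTS = CMTimeAdd(nextPTS, frameDuration)
    }
    writerInput.markAsFinished()

Note that this keeps every source frame and simply stretches the timeline, so a 5-second 30 fps clip becomes a 12.5-second 12 fps clip; if you want to keep the original duration you also have to drop frames (see the frame-selection idea in the related question below).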

Related

Problem setting video frame rate using AVAssetWriter/AVAssetReader

Situation:
I am trying to export video with parameters such as video bit rate, audio bit rate, frame rate, a changed video resolution, etc. Note that I am letting the user set the video frame rate as a fraction; for example, the user can set the frame rate to, say, 23.98.
I use AVAssetWriter and AVAssetReader for this operation. I use AVAssetWriterInputPixelBufferAdaptor for writing the sample buffers.
Everything else works just fine, except the video frame rate.
What I have tried:
1. Setting AVAssetWriter.movieTimeScale, as suggested here. This does change the video frame rate but also makes the video sluggish. (gist here)
2. Setting AVVideoExpectedSourceFrameRateKey. This does not help. (gist here)
3. Setting AVAssetWriterInput.mediaTimeScale. Again, it changes the video frame rate but makes the video sluggish, just as AVAssetWriter.movieTimeScale does. The video shows different frames at some point and sometimes it sticks and then resumes. (gist here)
4. Using AVAssetReaderVideoCompositionOutput and setting AVMutableVideoComposition.frameDuration, just like SDAVAssetExportSession does. Ironically, with the SDAVAssetExportSession code the video is exported at exactly the frame rate I want, but it just does not work in my code. (gist here)
I am not sure why it won't work with my code. The issue with this approach is that AVAssetReaderVideoCompositionOutput.copyNextSampleBuffer() always returns nil.
5. Manually changing the timestamp of each frame with CMSampleTimingInfo, as suggested here. Something like:
var sampleTimingInfo = CMSampleTimingInfo()
var sampleBufferToWrite: CMSampleBuffer?
CMSampleBufferGetSampleTimingInfo(vBuffer, at: 0, timingInfoOut: &sampleTimingInfo)
sampleTimingInfo.duration = CMTimeMake(value: 100, timescale: Int32(videoConfig.videoFrameRate * 100))
sampleTimingInfo.presentationTimeStamp = CMTimeAdd(previousPresentationTimeStamp, sampleTimingInfo.duration)
previousPresentationTimeStamp = sampleTimingInfo.presentationTimeStamp
let status = CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault, sampleBuffer: vBuffer, sampleTimingEntryCount: 1, sampleTimingArray: &sampleTimingInfo, sampleBufferOut: &sampleBufferToWrite)
With this approach I do get the frame rate set just right, but it increases the video duration (as mentioned in the comment on that question's answer). I think at some point I will have to discard some frames (when the target frame rate is lower; I need to lower the frame rate in most cases).
If I want 30 fps and my current frame rate is 60 fps, it's simple: discard every second frame and set the sample buffer times accordingly.
But if I go with this approach (i.e. setting 23.98 fps), how do I decide which frames to discard and, if the target frame rate is higher, which frames to duplicate? Reminder: the frame rate could be fractional.
Here is an idea for selecting frames. Suppose the fps of the source video is F and the target fps is TF, and let rate = TF/F.
Initialize a variable n to -rate and add rate for each frame;
each time the integer part of n changes, select the current frame.
e.g. rate = 0.3
frame index:   0     1     2     3     4     5     6     7
n:            -0.3   0.0   0.3   0.6   0.9   1.2   1.5   1.8
selected frames: 0, 4 and 7 (the frames where adding rate pushes n past the next integer).
#include <math.h>
#include <stdio.h>

int main(void) {
    float rate = 0.39999f; // rate = TF / F
    float n = -rate;       // so that the first frame is always selected
    for (int i = 0; i < 100; ++i, n += rate) { // i is the frame index; assume a 100-frame video
        int m = (int)floorf(n);
        int next = (int)floorf(n + rate);
        // if rate > 1.0, frame i may be repeated
        // if rate < 1.0, some of the frames are dropped
        for (int j = 0; m + j < next; ++j) {
            // use (or duplicate) frame i
            printf("%d ", i);
        }
    }
    return 0;
}
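For completeness, a hedged Swift sketch of the same selection rule combined with retiming the kept frames, which is what the fractional-rate case (e.g. 23.98) needs, might look like the following. readerOutput, writerInput, sourceFPS and targetFPS are assumed names, and the selection logic simply mirrors the C snippet above.

    import AVFoundation
    import CoreMedia
    import Foundation

    // Assumed: readerOutput / writerInput are already attached to started sessions.
    let sourceFPS = 60.0              // F
    let targetFPS = 23.98             // TF, may be fractional
    let rate = targetFPS / sourceFPS  // same "rate" as in the C snippet

    var n = -rate                     // so the first frame is always selected
    var outputIndex: Int64 = 0
    let outputTimescale = Int32(targetFPS * 100)   // e.g. 2398
    let outputDuration = CMTimeMake(value: 100, timescale: outputTimescale)

    while let buffer = readerOutput.copyNextSampleBuffer() {
        // 0 = drop this frame, 1 = keep it, >1 = duplicate it (when rate > 1).
        let repeats = Int(floor(n + rate)) - Int(floor(n))
        n += rate
        for _ in 0..<max(repeats, 0) {
            var timing = CMSampleTimingInfo(duration: outputDuration,
                                            presentationTimeStamp: CMTimeMake(value: outputIndex * 100,
                                                                              timescale: outputTimescale),
                                            decodeTimeStamp: .invalid)
            var retimed: CMSampleBuffer?
            if CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                                     sampleBuffer: buffer,
                                                     sampleTimingEntryCount: 1,
                                                     sampleTimingArray: &timing,
                                                     sampleBufferOut: &retimed) == noErr,
               let retimed = retimed {
                while !writerInput.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
                if !writerInput.append(retimed) { break } // stop on writer error
            }
            outputIndex += 1
        }
    }

Because the kept frames are re-stamped on a fixed 1/TF grid, the output keeps roughly the original duration instead of stretching it.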
NSMutableDictionary *writerInputParams = [[NSMutableDictionary alloc] init];
[writerInputParams setObject:AVVideoCodecTypeH264 forKey:AVVideoCodecKey];
[writerInputParams setObject:[NSNumber numberWithInt:width] forKey:AVVideoWidthKey];
[writerInputParams setObject:[NSNumber numberWithInt:height] forKey:AVVideoHeightKey];
[writerInputParams setObject:AVVideoScalingModeResizeAspectFill forKey:AVVideoScalingModeKey];

NSMutableDictionary *compressionProperties = [[NSMutableDictionary alloc] init];
[compressionProperties setObject:[NSNumber numberWithInt:20] forKey:AVVideoExpectedSourceFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithInt:20] forKey:AVVideoAverageNonDroppableFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithDouble:0.0] forKey:AVVideoMaxKeyFrameIntervalDurationKey];
[compressionProperties setObject:[NSNumber numberWithInt:1] forKey:AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject:[NSNumber numberWithBool:YES] forKey:AVVideoAllowFrameReorderingKey];
[compressionProperties setObject:AVVideoProfileLevelH264BaselineAutoLevel forKey:AVVideoProfileLevelKey];
[writerInputParams setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
It has been verified that SCNView refreshes at 60 frames per second, but I want AVAssetWriter to save only 20 frames per second. What should I do?
Neither AVVideoExpectedSourceFrameRateKey nor AVVideoAverageNonDroppableFrameRateKey above affects the fps; configuring the fps this way does not work!
// Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
self.videoWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);
self.videoWriter.shouldOptimizeForNetworkUse = YES;
self.videoWriter.movieTimeScale = 20;
The above configuration will not affect fps either.
self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
/// This config changes each video frame's presentation time to fit the fps, but it also changes the video duration.
// self.assetWriterInput.mediaTimeScale = 20;
Setting self.assetWriterInput.mediaTimeScale does affect the fps, but it causes the video duration to be stretched by 3 times, because the presentation time of each frame appended with
BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];
gets re-mapped. So configuring self.assetWriterInput.mediaTimeScale is seriously inconsistent with expectations; the video duration should not be stretched.
So if you want to control the fps of the video that AVAssetWriter finally saves, you must take control of the presentation times yourself and make sure you append exactly 20 frames per second:
CMTime presentationTime = CMTimeMake(_writeCount * (1.0/20.0) * 1000, 1000);
BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];
_writeCount += 1;
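For reference, the same fixed-clock idea could be sketched in Swift as follows; writer, writerInput, adaptor and the incoming pixel buffers are assumed to come from your own SCNView snapshot pipeline, and you still have to make sure this is called 20 times per second.

    import AVFoundation
    import CoreMedia
    import CoreVideo

    // Assumed to exist: writer (AVAssetWriter), writerInput (AVAssetWriterInput),
    // adaptor (AVAssetWriterInputPixelBufferAdaptor) built on writerInput.
    let targetFPS: Int32 = 20
    var frameCount: Int64 = 0

    func appendFrame(_ pixelBuffer: CVPixelBuffer) {
        // Derive the presentation time from a frame counter instead of the wall clock,
        // so the saved movie comes out at exactly targetFPS.
        let presentationTime = CMTimeMake(value: frameCount, timescale: targetFPS)
        if frameCount == 0, writer.startWriting() {
            writer.startSession(atSourceTime: presentationTime)
        }
        if writerInput.isReadyForMoreMediaData,
           adaptor.append(pixelBuffer, withPresentationTime: presentationTime) {
            frameCount += 1
        }
    }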

How to get the timestamp of each video frame in iOS while decoding a video.mp4

Scenario:
I am writing an iOS app to try to decode a videoFile.mp4. I am using AVAssetReaderTrackOutput with AVAssetReader to decode frames from the video file. This works very well. I get each and every frame from videoFile.mp4, basically using the following logic at the core.
Code:
AVAssetReader * videoFileReader;
AVAssetReaderTrackOutput * assetReaderOutput = [videoFileReader.outputs objectAtIndex:0];
CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
sampleBuffer is the buffer of each video frame here.
Question:
How can I get the timestamp of each video frame here ?
In other words, and in more detail, how can I get the timestamp of each sampleBuffer that I get back from copyNextSampleBuffer?
PS:
Please note that I need the timestamp in milliseconds.
I finally got the answer to my question. The following two lines get the frame timestamp of the sampleBuffer returned from copyNextSampleBuffer:
CMTime frameTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer);
double frameTimeMillisecs = CMTimeGetSeconds(frameTime) * 1000;
The timestamp is returned in seconds, hence multiplying it by 1000 to convert to milliseconds.
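In Swift, the equivalent could be sketched roughly as follows (videoURL is assumed to point at your .mp4 file):

    import AVFoundation
    import CoreMedia

    // Prints the presentation timestamp of every video frame, in milliseconds.
    func printFrameTimestamps(of videoURL: URL) throws {
        let asset = AVAsset(url: videoURL)
        guard let track = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        reader.add(output)
        guard reader.startReading() else { return }

        while let sampleBuffer = output.copyNextSampleBuffer() {
            let frameTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer)
            let frameTimeMillisecs = CMTimeGetSeconds(frameTime) * 1000.0
            print("frame at \(frameTimeMillisecs) ms")
        }
    }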

How do I control AVAssetWriter to write at the correct FPS

Let me see if I understood this correctly.
On the most advanced current hardware, iOS allows me to record at the following fps: 30, 60, 120 and 240.
But these fps behave differently. If I shoot at 30 or 60 fps, I expect the video files created from shooting at these fps to play at 30 and 60 fps respectively.
But if I shoot at 120 or 240 fps, I expect the video files created from shooting at these fps to play at 30 fps, or I will not see the slow motion.
A few questions:
1. Am I right?
2. Is there a way to shoot at 120 or 240 fps and play at 120 and 240 fps respectively? I mean play at the fps the videos were shot at, without slo-mo?
3. How do I control that frame rate when I write the file?
I am creating the AVAssetWriter input like this...
NSDictionary *videoCompressionSettings = @{AVVideoCodecKey : AVVideoCodecH264,
                                           AVVideoWidthKey : @(videoWidth),
                                           AVVideoHeightKey : @(videoHeight),
                                           AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey : @(bitsPerSecond),
                                                                                AVVideoMaxKeyFrameIntervalKey : @(1)}
                                           };
_assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
and there is no apparent way to control that.
NOTE: I have tried different numbers where that 1 is. I have tried 1.0/fps, I have tried fps and I have removed the key. No difference.
This is how I set up the AVAssetWriter:
AVAssetWriter *newAssetWriter = [[AVAssetWriter alloc] initWithURL:_movieURL
                                                          fileType:AVFileTypeQuickTimeMovie
                                                             error:&error];
_assetWriter = newAssetWriter;
_assetWriter.shouldOptimizeForNetworkUse = NO;

CGFloat videoWidth = size.width;
CGFloat videoHeight = size.height;
NSUInteger numPixels = videoWidth * videoHeight;
NSUInteger bitsPerSecond;

// Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
// if ( numPixels < (640 * 480) )
//     bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
// else
CGFloat bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
bitsPerSecond = numPixels * bitsPerPixel;

NSDictionary *videoCompressionSettings = @{AVVideoCodecKey : AVVideoCodecH264,
                                           AVVideoWidthKey : @(videoWidth),
                                           AVVideoHeightKey : @(videoHeight),
                                           AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey : @(bitsPerSecond)}
                                           };

if (![_assetWriter canApplyOutputSettings:videoCompressionSettings forMediaType:AVMediaTypeVideo]) {
    NSLog(@"Couldn't add asset writer video input.");
    return;
}

_assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoCompressionSettings
                                                          sourceFormatHint:formatDescription];
_assetWriterVideoInput.expectsMediaDataInRealTime = YES;

NSDictionary *adaptorDict = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
    (id)kCVPixelBufferWidthKey : @(videoWidth),
    (id)kCVPixelBufferHeightKey : @(videoHeight)
};

_pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                       initWithAssetWriterInput:_assetWriterVideoInput
                       sourcePixelBufferAttributes:adaptorDict];

// Add asset writer input to asset writer
if (![_assetWriter canAddInput:_assetWriterVideoInput]) {
    return;
}
[_assetWriter addInput:_assetWriterVideoInput];
The captureOutput method is very simple. I get the image from the filter and write it to the file using:
if (videoJustStartWriting)
    [_assetWriter startSessionAtSourceTime:presentationTime];

CVPixelBufferRef renderedOutputPixelBuffer = NULL;
OSStatus err = CVPixelBufferPoolCreatePixelBuffer(nil,
                                                  _pixelBufferAdaptor.pixelBufferPool,
                                                  &renderedOutputPixelBuffer);
if (err) return; // NSLog(@"Cannot obtain a pixel buffer from the buffer pool");

// _ciContext is a Metal-backed CIContext
[_ciContext render:finalImage
   toCVPixelBuffer:renderedOutputPixelBuffer
            bounds:[finalImage extent]
        colorSpace:_sDeviceRgbColorSpace];

[self writeVideoPixelBuffer:renderedOutputPixelBuffer
            withInitialTime:presentationTime];

- (void)writeVideoPixelBuffer:(CVPixelBufferRef)pixelBuffer withInitialTime:(CMTime)presentationTime
{
    if ( _assetWriter.status == AVAssetWriterStatusUnknown ) {
        // If the asset writer status is unknown, writing hasn't started yet,
        // so start writing with the buffer's presentation timestamp as the start time.
        if ([_assetWriter startWriting]) {
            [_assetWriter startSessionAtSourceTime:presentationTime];
        }
    }
    if ( _assetWriter.status == AVAssetWriterStatusWriting ) {
        // If the asset writer status is writing, append the pixel buffer via the adaptor.
        if (_assetWriterVideoInput.readyForMoreMediaData) {
            if (![_pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime]) {
                NSLog(@"error: %@", [_assetWriter.error localizedFailureReason]);
            }
        }
    }
    if ( _assetWriter.status == AVAssetWriterStatusFailed ) {
        NSLog(@"failed");
    }
}
I set the whole thing up to shoot at 240 fps. These are the presentation times of frames being appended:
time ======= 113594.311510508
time ======= 113594.324011508
time ======= 113594.328178716
time ======= 113594.340679424
time ======= 113594.344846383
If you do some math on the differences between them, you will see that the frame rate is about 240 fps. So the frames are being stored with the correct times.
But when I watch the video, the movement is not in slow motion and QuickTime says the video is 30 fps.
Note: this app grabs frames from the camera, the frames go through CIFilters, and the result of those filters is converted back to a sample buffer that is stored to file and displayed on screen.
I'm reaching here, but I think this is where you're going wrong. Think of your video capture as a pipeline.
(1) Capture buffer -> (2) Do Something With buffer -> (3) Write buffer as frames in video.
It sounds like you've successfully completed (1) and (2): you're getting the buffers fast enough and you're processing them so you can vend them as frames.
The problem is almost certainly in (3) writing the video frames.
https://developer.apple.com/reference/avfoundation/avmutablevideocomposition
Check out the frameDuration setting on your AVMutableVideoComposition; you'll need something like CMTimeMake(1, 60) // 60 FPS or CMTimeMake(1, 240) // 240 FPS to get what you're after (telling the composition to WRITE this many frames and encode at this rate).
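In Swift, that setting might be sketched as follows; asset and exportSession are assumed names, not from the original answer.

    import AVFoundation
    import CoreMedia

    // asset is the AVAsset you are exporting; exportSession is an AVAssetExportSession
    // (you could also hand the composition to an AVAssetReaderVideoCompositionOutput instead).
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 60)   // 60 FPS
    // videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 240) // 240 FPS
    exportSession.videoComposition = videoComposition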
Using AVAssetWriter, it's exactly the same principle, but you set the frame rate in the AVAssetWriterInput outputSettings, adding AVVideoExpectedSourceFrameRateKey (it belongs inside the AVVideoCompressionPropertiesKey dictionary).
NSDictionary *videoCompressionSettings = @{AVVideoCodecKey : AVVideoCodecH264,
                                           AVVideoWidthKey : @(videoWidth),
                                           AVVideoHeightKey : @(videoHeight),
                                           AVVideoCompressionPropertiesKey : @{ AVVideoAverageBitRateKey : @(bitsPerSecond),
                                                                                AVVideoExpectedSourceFrameRateKey : @(60),
                                                                                AVVideoMaxKeyFrameIntervalKey : @(1)}
                                           };
To expand a little more: you can't strictly control or sync your camera capture exactly to the output/playback rate; the timing just doesn't work that way and isn't that exact, and of course the processing pipeline adds overhead. When you capture frames they are time-stamped, which you've seen, but in the writing/compression phase it uses only the frames it needs to produce the output specified for the composition.
It goes both ways: you could capture only 30 FPS and write out at 240 FPS, and the video would display fine; you'd just have a lot of frames "missing" and being filled in by the algorithm. You can even vend only 1 frame per second and play back at 30 FPS; the two are separate from each other (how fast I capture vs. how many frames I present per second).
As to how to play it back at a different speed, you just need to tweak the playback rate, slowing it down as needed.
If you've correctly set the time base (frameDuration), it will always play back "normal". You're telling it "playback is X frames per second"; of course your eye may notice a difference (almost certainly between low FPS and high FPS), and the screen may not refresh that fast (above 60 FPS), but regardless the video will be at a "normal" 1x speed for its timebase. To slow the video down: if my timebase is 120 and I slow it to 0.5x, I now effectively see 60 FPS and one second of playback takes two seconds.
You control the playback speed by setting the rate property on AVPlayer https://developer.apple.com/reference/avfoundation/avplayer
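For example (a trivial sketch; url is assumed to point at your high-frame-rate movie):

    import AVFoundation

    let player = AVPlayer(url: url)
    player.rate = 0.5   // starts playback at half speed; 120 fps content is now effectively shown at 60 fps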
The iOS screen refresh is locked at 60fps, so the only way to "see" the extra frames is, as you say, to slow down the playback rate, a.k.a slow motion.
So:
1. Yes, you are right.
2. The screen refresh rate (and perhaps limitations of the human visual system, assuming you're human?) means that you cannot perceive 120 and 240 fps frame rates. You can play them at normal speed by downsampling to the screen refresh rate. Surely this is what AVPlayer already does, although I'm not sure if that's the answer you're looking for.
3. You control the frame rate of the file when you write it with the CMSampleBuffer presentation timestamps. If your frames are coming from the camera, you're probably passing the timestamps straight through, in which case check that you really are getting the frame rate you asked for (a log statement in your capture callback should be enough to verify this). If you're procedurally creating frames, then you choose the presentation timestamps so that they're spaced 1.0/desiredFrameRate seconds apart!
Is 3. not working for you?
p.s. you can discard & ignore AVVideoMaxKeyFrameIntervalKey - it's a quality setting and has nothing to do with playback framerate.

How to set timestamp of CMSampleBuffer for AVWriter writing

I'm working with AVFoundation for capturing and recording audio. There are some issues I don't quite understand.
Basically I want to capture audio from AVCaptureSession and write it using AVAssetWriter, but I need some shifting in the timestamps of the CMSampleBuffers I get from AVCaptureSession. Reading the CMSampleBuffer documentation I see two different timestamp terms: 'presentation timestamp' and 'output presentation timestamp'. What is the difference between the two?
Say I get a CMSampleBuffer (for audio) instance from AVCaptureSession and I want to write it to a file using AVAssetWriter; what function should I use to 'inject' a CMTime into the buffer in order to set its presentation timestamp in the resulting file?
Thanks.
Use CMSampleBufferGetPresentationTimeStamp; that is the time when the buffer was captured and the time at which it should be "presented" when played back in order to be in sync. To quote session 520 at WWDC 2012: "Presentation time is the time at which the first sample in the buffer was picked up by the microphone".
If you start the AVWriter with
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
and then append samples with
if(videoWriterInput.readyForMoreMediaData) [videoWriterInput appendSampleBuffer:sampleBuffer];
the frames in the finished video will be consistent with CMSampleBufferGetPresentationTimeStamp (I have checked). If you want to modify the time when adding samples you have to use AVAssetWriterInputPixelBufferAdaptor
Chunk of sample code from here: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html
In the snippet, CMSampleBufferRef sample is your input sample buffer, CMSampleBufferRef sout is your output, and newTimeStamp is the timestamp you want to apply.
CMItemCount count;
CMTime newTimeStamp = CMTimeMake(YOURTIME_GOES_HERE, YOURTIMESCALE_GOES_HERE);
CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
CMSampleTimingInfo *pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);
for (CMItemCount i = 0; i < count; i++)
{
    pInfo[i].decodeTimeStamp = newTimeStamp; // kCMTimeInvalid if in sequence
    pInfo[i].presentationTimeStamp = newTimeStamp;
}
CMSampleBufferRef sout;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, pInfo, &sout);
free(pInfo);

iOS AVPlayer get fps

I'm trying to figure out how to retrieve a video's frame rate via AVPlayer. AVPlayerItem has a rate variable, but it only returns a value between 0 and 2 (usually 1 when playing). Does anybody have an idea how to get the video frame rate?
Cheers
Use AVAssetTrack's nominalFrameRate property.
The method below gets the frame rate; here queuePlayer is an AVPlayer:
- (float)getFrameRateFromAVPlayer
{
    float fps = 0.00;
    if (self.queuePlayer.currentItem.asset) {
        AVAssetTrack *videoATrack = [[self.queuePlayer.currentItem.asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        if (videoATrack)
        {
            fps = videoATrack.nominalFrameRate;
        }
    }
    return fps;
}
Swift 4 version of the answer:
let asset = avplayer.currentItem?.asset
let tracks = asset?.tracks(withMediaType: .video)
let fps = tracks?.first?.nominalFrameRate
Remember to handle the nil checks.
There seems to be a discrepancy in the nominalFrameRate returned for the same media played on different versions of iOS. I have a video I encoded with ffmpeg at 1 frame per second (125 frames) with keyframes every 25 frames; when loading it in an app on iOS 7.x the (nominal) frame rate is 1.0, while on iOS 8.x the (nominal) frame rate is 0.99. This seems like a very small difference; however, in my case I need to navigate precisely to a given frame in the movie, and this difference screws up such navigation (the movie is an encoding of a sequence of presentation slides). Given that I already know the frame rate of the videos my app needs to play (e.g. 1 fps), I can simply rely on this value instead of determining the frame rate dynamically (via nominalFrameRate), but I wonder WHY there is such a discrepancy between iOS versions as far as nominalFrameRate goes. Any ideas?
The rate value on AVPlayer is the playback speed relative to real time: e.g. 0.5 is slow motion, 2 is double speed.
As Paresh Navadiya points out, a track also has a nominalFrameRate variable; however, this sometimes seems to give strange results. The best solution I've found so far is to use the following:
CMTime frameDuration = [myAsset tracksWithMediaType:AVMediaTypeVideo][0].minFrameDuration;
float fps = frameDuration.timescale/(float)frameDuration.value;
The above gives slightly unexpected results for variable frame rate but variable frame rate has slightly odd behavior anyway. Other than that it matches ffmpeg -i in my tests.
EDIT ----
I've found that sometimes the above gives kCMTimeZero. The workaround I've used for this is to create an AVAssetReader with a track output, get the pts of the first and second frames, and then take the difference of the two.
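A Swift sketch of that workaround (asset is assumed to be your AVAsset) might look like:

    import AVFoundation
    import CoreMedia

    // Estimate fps by reading the first two frames and differencing their
    // presentation timestamps.
    func estimateFPS(of asset: AVAsset) throws -> Float? {
        guard let track = asset.tracks(withMediaType: .video).first else { return nil }
        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        reader.add(output)
        guard reader.startReading() else { return nil }

        guard let first = output.copyNextSampleBuffer(),
              let second = output.copyNextSampleBuffer() else { return nil }

        let firstPTS = CMSampleBufferGetOutputPresentationTimeStamp(first)
        let secondPTS = CMSampleBufferGetOutputPresentationTimeStamp(second)
        let delta = CMTimeGetSeconds(CMTimeSubtract(secondPTS, firstPTS))
        return delta > 0 ? Float(1.0 / delta) : nil
    }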
I don't know of anything in AVPlayer that can help you calculate the frame rate.
The AVPlayerItem rate property is the playback rate; it has nothing to do with the frame rate.
The easiest option is to obtain an AVAssetTrack and read its nominalFrameRate property. Just create an AVAsset and you'll get an array of tracks.
Or use AVAssetReader to read the video frame by frame, get its presentation time and count how many frames are in the same second, then average for a few seconds or the whole video.
This is not going to work anymore; the API has changed, and this post is old. :(
The Swift 4 answer is also cool; this answer is similar.
You get the video track from the AVPlayerItem, and you check the FPS there. :)
private var numberOfRenderingFailures = 0

func isVideoRendering() -> Bool {
    guard let currentItem = player.currentItem else { return false }
    // Check if we are playing video tracks
    let isRendering = currentItem.tracks.contains { ($0.assetTrack?.mediaType == .video) && ($0.currentVideoFrameRate > 5) }
    if isRendering {
        numberOfRenderingFailures = 0
        return true
    }
    numberOfRenderingFailures += 1
    if numberOfRenderingFailures < 5 {
        return true
    }
    return false
}
