Is it possible to create a QTTimeRange from two frame numbers? - quicktime

All QTKit examples use seconds for making ranges. I, unfortunately, have frame numbers and need to be frame accurate. I suppose I could multiply up by the frame rate if I could figure out how to get that out of my movie.

You should be able to calculate the frame rate of a given video media by querying the following mediaAttributes of a QTMedia:
QTMediaDurationAttribute
QTMediaSampleCountAttribute
(they are described in the QTKit docs here)
and use the following formula for calculation:
QTMedia *media = ...; // the movie's video media
QTTime duration = [[media attributeForKey:QTMediaDurationAttribute] QTTimeValue];
NSNumber *sampleCount = [media attributeForKey:QTMediaSampleCountAttribute];
double fps = (double)[sampleCount longValue] * duration.timeScale / (double)duration.timeValue;
Disclaimer:
Note that I have not tried whether this works, but it is how I expect it to work based on my experience with the QuickTime C APIs and the QuickTime File Format.
Good Luck!

Multiplying by the frame rate is not frame-accurate, because many of the file containers and codecs that QuickTime uses rely on variable frame rates to get better compression. You'll notice this in any movie that holds a frame for any length of time. See MacBreak's The Road to 1080p, part 1 as an example.
You can do frame-accurate ranges with the QTMovie methods frameStartTime: and frameEndTime: introduced in OS X 10.6. These give you the start and end of a frame, respectively, without doing any frame decoding.
For example to count all the frames in a movie:
// Initialize a QTMovie object called 'movie', disable looping, etc.
[movie gotoEnd];
QTTime endTime = [movie currentTime];
[movie gotoBeginning];
QTTime curTime = [movie currentTime];
unsigned long numFrames = 0;
while (true)
{
    // Get the end time of the current frame
    [movie frameEndTime:&curTime];
    numFrames++;
    // If we reach the last frame, stop counting
    if (QTTimeCompare(curTime, endTime) == NSOrderedSame)
    {
        break;
    }
}

Related

Problem setting video frame rate using AVAssetWriter/AVAssetReader

Situation:
I am trying to export video with parameters such as video bit rate, audio bit rate, frame rate, video resolution, etc. Note that I am letting the user set the video frame rate in fractions; for example, the user can set the video frame rate to, say, 23.98.
I use AVAssetWriter and AVAssetReader for this operation. I use AVAssetWriterInputPixelBufferAdaptor for writing the sample buffers.
Everything else works just fine, except the video frame rate.
What I have tried:
Setting AVAssetWriter.movieTimeScale, as suggested here. This does change the video frame rate but also makes the video sluggish. (gist here)
Setting AVVideoExpectedSourceFrameRateKey. This does not help. (gist here)
Setting AVAssetWriterInput.mediaTimeScale. Again, it changes the video frame rate but makes the video sluggish, just as AVAssetWriter.movieTimeScale does. The video shows different frames at some points, and sometimes it sticks and then resumes. (gist here)
Using AVAssetReaderVideoCompositionOutput and setting AVMutableVideoComposition.frameDuration, just like SDAVAssetExportSession does. Ironically, with the SDAVAssetExportSession code the video is exported at exactly the frame rate I want, but it just does not work in my code. (gist here; a sketch of how this approach is typically wired up appears after these attempts.)
I am not sure why it won't work with my code. The issue with this approach is that AVAssetReaderVideoCompositionOutput.copyNextSampleBuffer() always returns nil.
Manually changing the timestamp of the frame with CMSampleTimingInfo, as suggested here. Something like:
var sampleTimingInfo = CMSampleTimingInfo()
var sampleBufferToWrite: CMSampleBuffer?
CMSampleBufferGetSampleTimingInfo(vBuffer, at: 0, timingInfoOut: &sampleTimingInfo)
sampleTimingInfo.duration = CMTimeMake(value: 100, timescale: Int32(videoConfig.videoFrameRate * 100))
sampleTimingInfo.presentationTimeStamp = CMTimeAdd(previousPresentationTimeStamp, sampleTimingInfo.duration)
previousPresentationTimeStamp = sampleTimingInfo.presentationTimeStamp
let status = CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault, sampleBuffer: vBuffer, sampleTimingEntryCount: 1, sampleTimingArray: &sampleTimingInfo, sampleBufferOut: &sampleBufferToWrite)
With this approach I do get the frame rate set just right, but it increases the video duration (as mentioned in the comment on that question's answer). I think at some point I will have to discard some frames (when the target frame rate is lower; I need to lower the frame rate in most cases).
I know that if I want 30 fps and my current frame rate is 60 fps, it's simple to discard every second frame and set the sample buffer time accordingly.
But if I go with this approach (i.e. setting 23.98 fps), how do I decide which frame to discard, and, if the target frame rate is higher, which frame to duplicate? Reminder: the frame rate could be fractional.
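For reference, here is a minimal Swift sketch of how the AVAssetReaderVideoCompositionOutput + AVMutableVideoComposition.frameDuration approach mentioned above is typically wired up. This is not the original gist; the function name, pixel format, and error handling are assumptions.
import AVFoundation

func readRetimedFrames(from sourceURL: URL, targetFPS: Double) throws {
    let asset = AVAsset(url: sourceURL)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    // Ask the compositor to emit frames at the target rate, e.g. 23.98 fps.
    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.frameDuration = CMTime(value: 100, timescale: Int32((targetFPS * 100).rounded()))

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderVideoCompositionOutput(
        videoTracks: [videoTrack],
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    output.videoComposition = composition
    reader.add(output)
    reader.startReading()

    while let sampleBuffer = output.copyNextSampleBuffer() {
        // Hand each retimed buffer to the AVAssetWriterInputPixelBufferAdaptor here.
        _ = sampleBuffer
    }
}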
Here is an idea for selecting frames. Suppose the fps of the source video is F and the target fps is TF, and let rate = TF/F.
Initialize a variable n to -rate and add rate for each frame;
whenever the integer part of n changes, select that frame.
e.g. rate = 0.3
n:           -0.3  0.0  0.3  0.6  0.9  1.2  1.5  1.8  2.1
frame index:    0    1    2    3    4    5    6    7
select:         0                   4              7
float rate = 0.39999f;  // rate = TF / F
float n = -rate;        // so that the first frame will be selected
for (int i = 0; i < 100; ++i, n += rate) {  // i is the frame index; take a video with 100 frames as an example
    int m = (int)floor(n);
    int tmp = (int)(n + rate);
    // if rate > 1.0, frame i is repeated
    // if rate < 1.0, some frames are dropped
    for (int j = 0; m + j < tmp; ++j) {
        // use this frame
        printf("%d ", i);
    }
}
NSMutableDictionary *writerInputParams = [[NSMutableDictionary alloc] init];
[writerInputParams setObject:AVVideoCodecTypeH264 forKey:AVVideoCodecKey];
[writerInputParams setObject:[NSNumber numberWithInt:width] forKey:AVVideoWidthKey];
[writerInputParams setObject:[NSNumber numberWithInt:height] forKey:AVVideoHeightKey];
[writerInputParams setObject:AVVideoScalingModeResizeAspectFill forKey:AVVideoScalingModeKey];
NSMutableDictionary * compressionProperties = [[NSMutableDictionary alloc] init];
[compressionProperties setObject:[NSNumber numberWithInt: 20] forKey:AVVideoExpectedSourceFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithInt: 20] forKey:AVVideoAverageNonDroppableFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithInt: 0.0] forKey:AVVideoMaxKeyFrameIntervalDurationKey];
[compressionProperties setObject:[NSNumber numberWithInt: 1] forKey:AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject:[NSNumber numberWithBool:YES] forKey:AVVideoAllowFrameReorderingKey];
[compressionProperties setObject:AVVideoProfileLevelH264BaselineAutoLevel forKey:AVVideoProfileLevelKey];
[writerInputParams setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];
self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
It has been verified that SCNView refreshes at 60 frames per second, but I want AVAssetWriter to save only 20 frames per second. What should I do?
Neither AVVideoExpectedSourceFrameRateKey nor AVVideoAverageNonDroppableFrameRateKey above affects the fps; configuring the fps this way does not work!
// Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
self.videoWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);
self.videoWriter.shouldOptimizeForNetworkUse = YES;
self.videoWriter.movieTimeScale = 20;
The above configuration will not affect fps either.
self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
/// This configuration changes the video frames' presentation times to fit the fps, but it also changes the video duration.
// self.assetWriterInput.mediaTimeScale = 20;
Setting self.assetWriterInput.mediaTimeScale does affect the fps, but it causes the video duration to be stretched by 3 times, because the presentation time passed to
BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];
is re-timed. So configuring self.assetWriterInput.mediaTimeScale is seriously inconsistent with expectations; the video duration should not be stretched.
So if you want to control the fps of the video that AVAssetWriter finally saves, you must control the timestamps yourself and make sure you append exactly 20 frames per second:
CMTime presentationTime = CMTimeMake(_writeCount * (1.0/20.0) * 1000, 1000);
BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];
_writeCount += 1;
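Here is a minimal Swift sketch of that idea, assuming a 60 fps render callback; the class and property names are hypothetical, not part of the original answer. It keeps one of every three rendered frames and stamps it at exact 1/20 s intervals.
import AVFoundation

final class TwentyFPSWriter {
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private let input: AVAssetWriterInput
    private var renderCount = 0
    private var writeCount: Int64 = 0

    init(adaptor: AVAssetWriterInputPixelBufferAdaptor, input: AVAssetWriterInput) {
        self.adaptor = adaptor
        self.input = input
    }

    // Call this from the 60 fps render callback with each rendered pixel buffer.
    func didRender(_ pixelBuffer: CVPixelBuffer) {
        defer { renderCount += 1 }
        // Keep 1 of every 3 frames (60 fps source -> 20 fps output).
        guard renderCount % 3 == 0, input.isReadyForMoreMediaData else { return }
        // Presentation times advance in exact 1/20 s steps: 0/20, 1/20, 2/20, ...
        let pts = CMTime(value: writeCount, timescale: 20)
        if adaptor.append(pixelBuffer, withPresentationTime: pts) {
            writeCount += 1
        }
    }
}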

How to get the accurate time position of a live streaming in avplayer

I'm using AVPlayer to play a live stream. This stream supports one hour of catch-up, which means the user can seek back up to one hour and play from there. But I have one question: how do I know the accurate position that the player is currently playing? I need to display the current position on the player view. For example, if the user is playing half an hour behind live, display -30:00; if the user is playing the latest content, the player will show 00:00 or live. Thanks
Swift solution:
func getLiveDuration() -> Float {
    var result: Float = 0.0
    if let items = player.currentItem?.seekableTimeRanges {
        if !items.isEmpty {
            let range = items[items.count - 1]
            let timeRange = range.timeRangeValue
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            result = Float(startSeconds + durationSeconds)
        }
    }
    return result
}
To get the live position and seek to it, you can use the seekableTimeRanges of AVPlayerItem:
CMTimeRange seekableRange = [player.currentItem.seekableTimeRanges.lastObject CMTimeRangeValue];
CGFloat seekableStart = CMTimeGetSeconds(seekableRange.start);
CGFloat seekableDuration = CMTimeGetSeconds(seekableRange.duration);
CGFloat livePosition = seekableStart + seekableDuration;
[player seekToTime:CMTimeMake(livePosition, 1)];
Also, when you seek back some time, you can get the current playing position by calling the currentTime method:
CGFloat current = CMTimeGetSeconds([self.player.currentItem currentTime]);
CGFloat diff = livePosition - current;
I know this question is old, but I had the same requirement and I believe the existing solutions don't properly address the intent of the question.
What I did for this requirement was to gather the current point in time, the starting time, and the total duration of the stream.
I'll explain something before going further: the current point in time can surpass (starting time + total duration). This is due to the way HLS is structured, as ts segments. Ts segments are small chunks of playable video; you could have in your seekable range 5 ts segments of 10 seconds each. This doesn't mean that 50 seconds is the full length of the live stream; there is around a full segment more (so 60 seconds of playtime total), but it isn't categorized as seekable, since you shouldn't seek to that segment. If you were to do so, you would notice rebuffering in most instances (because the source may still be creating the next ts segment when you have already reached the end of playback).
What I did was check whether the current stream time is beyond the seekable range; if so, we are live on the stream. If it isn't, you can easily calculate how far behind live you are by subtracting the starting time and the total duration from the current time.
if let timeRange = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
    let start = timeRange.start.seconds
    let totalDuration = timeRange.duration.seconds
    let currentTime = player.currentTime().seconds
    let secondsBehindLive = currentTime - totalDuration - start
}
The code above will give you a negative number of seconds behind "live", or more specifically behind the start of the latest ts segment, and a positive number or zero when you are playing the latest ts segment.
To be honest, I don't really know when seekableTimeRanges will have more than one value; it has always been just one for the streams I have tested with. But if you find more than one value in your streams, you may have to figure out whether to add up all the ranges' durations, which time range to use for the start value, etc. At least for my use case, this was enough.
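Building on that, here is a small sketch that turns the secondsBehindLive value computed above into the -30:00 / live label the question asks for. The function name and the 15-second live threshold are assumptions.
import Foundation

// Format the offset computed above for display in the player UI.
func liveOffsetLabel(secondsBehindLive: Double, liveThreshold: Double = 15) -> String {
    // Within the threshold of the live edge (or past it): just show "LIVE".
    guard secondsBehindLive < -liveThreshold else { return "LIVE" }
    let behind = Int(-secondsBehindLive)
    return String(format: "-%02d:%02d", behind / 60, behind % 60)  // e.g. -30:00
}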

How do I fetch the current frame of a video playing in iOS using AVFoundation?

I can see the nominalFrameRate for some video tracks in the AVFoundation docs, but nothing about the current frame. How can I get the current frame number of the track as it is played in an AVPlayer? I know frame rates will vary, and nominalFrameRate will always be 0.0 in .m3u8 streams, but surely there must be a way to get the frame number of the currently playing track without having to multiply nominalFrameRate by currentTime?
Thanks.
For iOS 7+ you can use the currentVideoFrameRate property of AVPlayerItemTrack. It's the only consistent property I've seen that measures FPS. The nominalFrameRate property seems to be broken for HLS streams and always returns 0.0, as you mentioned.
AVPlayerItem *item = player.currentItem; // the current item of your AVPlayer
float fps = 0.00;
for (AVPlayerItemTrack *track in item.tracks) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeVideo]) {
        fps = track.currentVideoFrameRate;
    }
}
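There is no public "current frame number" property, so if an approximate index is acceptable you can combine the measured rate from the loop above with the current time. Note that this is essentially the multiplication the question hoped to avoid, just with a more reliable rate; the function name below is an assumption.
import AVFoundation

// Rough approximation only: frame index ~= elapsed seconds * measured fps.
func approximateFrameIndex(player: AVPlayer, measuredFPS: Float) -> Int? {
    guard measuredFPS > 0 else { return nil }
    let seconds = player.currentTime().seconds
    return Int(seconds * Double(measuredFPS))
}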

ios audio queue - how to meter audio level in buffer?

I'm working on an app that should do some audio signal processing. I need to measure the audio level in each one of the buffers I get (through the callback function). I've been searching the web for some time, and I found that there is a built-in property called current level metering:
AudioQueueGetProperty(recordState->queue,kAudioQueueProperty_CurrentLevelMeter,meters,&dlen);
This property gets me the average or peak audio level, but it's not synchronised to the current buffer.
I figured out I need to calculate the audio level from the buffer data by myself, so I had this:
double calcAudioRMS(SInt16 *audioData, int numOfSamples)
{
    double RMS, adPercent;
    RMS = 0;
    for (int i = 0; i < numOfSamples; i++)
    {
        adPercent = audioData[i] / 32768.0f;
        RMS += adPercent * adPercent;
    }
    RMS = sqrt(RMS / numOfSamples);
    return RMS;
}
This function gets the audio data (cast to SInt16) and the number of samples in the current buffer. The numbers I get are indeed between 0 and 1, but they seem rather random and low compared to the numbers I got from the built-in audio level metering.
The recording audio format is:
format->mSampleRate = 8000.0;
format->mFormatID = kAudioFormatLinearPCM;
format->mFramesPerPacket = 1;
format->mChannelsPerFrame = 1;
format->mBytesPerFrame = 2;
format->mBytesPerPacket = 2;
format->mBitsPerChannel = 16;
format->mReserved = 0;
format->mFormatFlags = kLinearPCMFormatFlagIsSignedInteger |kLinearPCMFormatFlagIsPacked;
My question is how to get the right values from the buffer? Is there a built-in function \ property for this? Or should I calculate the audio level myself, and how to do it?
Thanks in advance.
Your calculation of RMS power is correct. I'd be inclined to say that you are using fewer samples than Apple does, or something similar, and that would explain the difference. You can check by inputting a loud sine wave and verifying that both you and Apple calculate the RMS power as 1/sqrt(2).
Unless there's a good reason, I would use Apple's power calculations. I've used them, and they seem good to me. Additionally, you generally don't want raw RMS power; you want RMS power as decibels, or you can use the kAudioQueueProperty_CurrentLevelMeterDB constant. (This depends on whether you're trying to build an audio meter or truly display the audio power.)
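For completeness, here is a minimal sketch of that conversion from linear RMS to decibels; the -160 dB silence floor is an assumption, chosen to mirror what AVFoundation's metering APIs report for silence.
import Foundation

// Convert a linear RMS value (0...1, e.g. from calcAudioRMS above) to decibels,
// which is the scale kAudioQueueProperty_CurrentLevelMeterDB uses.
func decibels(fromRMS rms: Double) -> Double {
    guard rms > 0 else { return -160.0 }  // assumed silence floor
    return 20.0 * log10(rms)              // 1.0 -> 0 dBFS, 0.5 -> about -6 dBFS
}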

IOS AVPlayer get fps

I'm trying to figure out how to retrieve a video's frame rate via AVPlayer. AVPlayerItem has a rate variable, but it only returns a value between 0 and 2 (usually 1 when playing). Anybody have an idea how to get the video frame rate?
Cheers
Use AVAssetTrack's nominalFrameRate property.
Below is a method to get the frame rate; here queuePlayer is an AVPlayer:
- (float)getFrameRateFromAVPlayer
{
    float fps = 0.00;
    AVAsset *videoAsset = self.queuePlayer.currentItem.asset;
    if (videoAsset) {
        AVAssetTrack *videoATrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        if (videoATrack)
        {
            fps = videoATrack.nominalFrameRate;
        }
    }
    return fps;
}
Swift 4 version of the answer:
let asset = avplayer.currentItem?.asset
let tracks = asset?.tracks(withMediaType: .video)
let fps = tracks?.first?.nominalFrameRate
Remember to handle nil checking.
There seems to be a discrepancy in this nominalFrameRate returned for the same media played on different versions of iOS. I have a video I encoded with ffmpeg at 1 frame per second (125 frames) with keyframes every 25 frames and when loading in an app on iOS 7.x the (nominal) frame rate is 1.0, while on iOS 8.x the (nominal) frame rate is 0.99. This seems like a very small difference, however in my case I need to navigate precisely to a given frame in the movie and this difference screws up such navigation (the movie is an encoding of a sequence of presentation slides). Given that I already know the frame rate of the videos my app needs to play (e.g. 1 fps) I can simply rely on this value instead of determining the frame rate dynamically (via nominalFrameRate value), however I wonder WHY there is such discrepancy between iOS versions as far as this nominalFrameRate goes. Any ideas?
The rate value on AVPlayer is the speed, relative to real time, at which it's playing; e.g. 0.5 is slow motion and 2 is double speed.
As Paresh Navadiya points out, a track also has a nominalFrameRate variable, but this sometimes seems to give strange results. The best solution I've found so far is to use the following:
CMTime frameDuration = [myAsset tracksWithMediaType:AVMediaTypeVideo][0].minFrameDuration;
float fps = frameDuration.timescale/(float)frameDuration.value;
The above gives slightly unexpected results for variable frame rate but variable frame rate has slightly odd behavior anyway. Other than that it matches ffmpeg -i in my tests.
EDIT ----
I've found that sometimes the above gives kCMTimeZero. The workaround I've used is to create an AVAssetReader with a track output, get the PTS of the first and second frames, and then subtract the two.
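A minimal Swift sketch of that workaround, under the assumption of a local asset with at least two video frames (the function name is mine, and the output is decoded so that samples arrive in presentation order):
import AVFoundation

func estimatedFrameDuration(for asset: AVAsset) throws -> CMTime? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    // Decompress so buffers come out in presentation order (matters for B-frames).
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    guard let first = output.copyNextSampleBuffer(),
          let second = output.copyNextSampleBuffer() else { return nil }
    let firstPTS = CMSampleBufferGetPresentationTimeStamp(first)
    let secondPTS = CMSampleBufferGetPresentationTimeStamp(second)
    return CMTimeSubtract(secondPTS, firstPTS)  // duration of the first frame
}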
I don't know of anything in AVPlayer that can help you calculate the frame rate.
The rate property is the playback rate and has nothing to do with the frame rate.
The easiest option is to obtain an AVAssetTrack and read its nominalFrameRate property. Just create an AVAsset and you'll get an array of tracks.
Or use AVAssetReader to read the video frame by frame, get each frame's presentation time, and count how many frames fall in the same second, then average over a few seconds or the whole video.
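A rough sketch of that counting approach, assuming a local (non-streamed) asset; the function name is mine:
import AVFoundation

// Count every video sample and divide by the track duration to get an
// average frame rate for the whole file.
func averageFrameRate(for asset: AVAsset) throws -> Double? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)  // no decoding needed just to count
    reader.add(output)
    reader.startReading()

    var frameCount = 0
    while output.copyNextSampleBuffer() != nil { frameCount += 1 }

    let seconds = CMTimeGetSeconds(track.timeRange.duration)
    return seconds > 0 ? Double(frameCount) / seconds : nil
}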
This is not going to work anymore; the API has changed, and this post is old. :(
The Swift 4 answer is also cool; this answer is similar.
You get the video track from the AVPlayerItem, and you check the FPS there. :)
private var numberOfRenderingFailures = 0

func isVideoRendering() -> Bool {
    guard let currentItem = player.currentItem else { return false }
    // Check if we are playing video tracks
    let isRendering = currentItem.tracks.contains { ($0.assetTrack?.mediaType == .video) && ($0.currentVideoFrameRate > 5) }
    if isRendering {
        numberOfRenderingFailures = 0
        return true
    }
    numberOfRenderingFailures += 1
    if numberOfRenderingFailures < 5 {
        return true
    }
    return false
}
