AVPlayerItem gets a NaN duration (invalid) - iOS

I am using the AVAsset class and passing a URL, but it does not provide the duration. I only get NaN.
CMTime audioDuration = audioAsset.duration;
float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
double duration = CMTimeGetSeconds(asset.duration);

Try the code below:
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:info[UIImagePickerControllerMediaURL] options:nil]; // info[UIImagePickerControllerMediaURL] is a file URL here; pass your own URL
CMTime videoDuration = asset1.duration;
float videoDurationInSeconds = CMTimeGetSeconds(videoDuration);
and make sure that your URL is a file URL.
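If the URL is correct and you still get NaN, it can also help to check whether the returned CMTime is actually valid before converting it. A minimal sketch, assuming your file URL is in a variable named fileURL (a hypothetical name):
// Hypothetical fileURL; AVURLAssetPreferPreciseDurationAndTimingKey is optional but gives an accurate duration.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL
                                        options:@{AVURLAssetPreferPreciseDurationAndTimingKey : @YES}];
CMTime duration = asset.duration;
if (CMTIME_IS_INVALID(duration) || CMTIME_IS_INDEFINITE(duration))
{
    // CMTimeGetSeconds returns NaN for these values; the URL is not a readable file URL,
    // or the asset's duration has not been loaded yet.
}
else
{
    NSLog(@"duration: %f seconds", CMTimeGetSeconds(duration));
}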

Related

No rendering with AVMutableVideoComposition

I have 3 videos that I am sequencing using an AVMutableComposition and then playing the video using an AVPlayer and grabbing the frames using an AVPlayerItemVideoOutput. The video sequence is as follows:
[Logo Video - n seconds][Main video - m seconds][Logo Video - l seconds]
The code looks like this:
// Build the composition.
pComposition = [AVMutableComposition composition];
// Fill in the assets that make up the composition
AVMutableCompositionTrack* pCompositionVideoTrack = [pComposition addMutableTrackWithMediaType: AVMediaTypeVideo preferredTrackID: 1];
AVMutableCompositionTrack* pCompositionAudioTrack = [pComposition addMutableTrackWithMediaType: AVMediaTypeAudio preferredTrackID: 2];
CMTime time = kCMTimeZero;
CMTimeRange keyTimeRange = kCMTimeRangeZero;
for( AVAsset* pAssetsAsset in pAssets )
{
    AVAssetTrack* pAssetsAssetVideoTrack = [pAssetsAsset tracksWithMediaType: AVMediaTypeVideo].firstObject;
    AVAssetTrack* pAssetsAssetAudioTrack = [pAssetsAsset tracksWithMediaType: AVMediaTypeAudio].firstObject;
    CMTimeRange timeRange = CMTimeRangeMake( kCMTimeZero, pAssetsAsset.duration );
    NSError* pVideoError = nil;
    NSError* pAudioError = nil;
    if ( pAssetsAssetVideoTrack )
    {
        [pCompositionVideoTrack insertTimeRange: timeRange ofTrack: pAssetsAssetVideoTrack atTime: time error: &pVideoError];
    }
    if ( pAssetsAssetAudioTrack )
    {
        [pCompositionAudioTrack insertTimeRange: timeRange ofTrack: pAssetsAssetAudioTrack atTime: time error: &pAudioError];
    }
    if ( pAssetsAsset == pKeyAsset )
    {
        keyTimeRange = CMTimeRangeMake( time, timeRange.duration );
    }
    NSLog( @"%@", [pVideoError description] );
    NSLog( @"%@", [pAudioError description] );
    time = CMTimeAdd( time, pAssetsAsset.duration );
}
The logo videos are silent and merely display my logo. I create these videos manually, so everything is perfect there. The "Main Video", however, can end up with the wrong orientation. To combat this, an AVMutableVideoComposition looks like the perfect way forward. So I set up a simple video composition that does a simple setTransform as follows:
pAsset = pComposition;
pPlayerItem = [AVPlayerItem playerItemWithAsset: pAsset];
pPlayer = [AVPlayer playerWithPlayerItem: pPlayerItem];
NSArray* pPlayerTracks = [pAsset tracksWithMediaType: AVMediaTypeVideo];
AVAssetTrack* pPlayerTrack = pPlayerTracks[0];
pVideoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
[pVideoCompositionLayerInstruction setTransform: [[pKeyAsset tracksWithMediaType: AVMediaTypeVideo].firstObject preferredTransform] atTime: kCMTimeZero];
pVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
pVideoCompositionInstruction.backgroundColor = [[UIColor blackColor] CGColor];
pVideoCompositionInstruction.timeRange = keyTimeRange;
pVideoCompositionInstruction.layerInstructions = @[ pVideoCompositionLayerInstruction ];
pVideoComposition = [AVMutableVideoComposition videoComposition];
pVideoComposition.renderSize = [[pKeyAsset tracksWithMediaType: AVMediaTypeVideo].firstObject naturalSize];
pVideoComposition.frameDuration = [[pKeyAsset tracksWithMediaType: AVMediaTypeVideo].firstObject minFrameDuration];
pVideoComposition.instructions = @[ pVideoCompositionInstruction ];
pPlayerItem.videoComposition = pVideoComposition;
However, when I come to play the video sequence I get no output. AVPlayerItemVideoOutput hasNewPixelBufferForItemTime: always returns NO. If I comment out the last line in the code above (i.e. setting the videoComposition), then everything works as before (with videos in the wrong orientation). Does anybody know what I'm doing wrong? Any thoughts much appreciated!
The issue here is that keyTimeRange may not start at time zero if your logo video has a nonzero duration. pVideoCompositionInstruction will start at keyTimeRange.start rather than kCMTimeZero (where the AVMutableComposition starts), which violates the rules for video composition instructions. According to the docs: "For the first instruction in the array, timeRange.start must be less than or equal to the earliest time for which playback or other processing will be attempted (typically kCMTimeZero)."
To solve this, set pVideoComposition.instructions to an array containing three AVMutableVideoCompositionInstruction objects, each with its own AVMutableVideoCompositionLayerInstruction carrying the corresponding AVAsset's transform. The time range of each instruction should match the range at which that asset appears in the composition track. Make sure they line up exactly, with no gaps or overlaps.
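A minimal sketch of that fix, assuming the three source assets and their positions in the composition are available under the hypothetical names pLogoAsset1, pKeyAsset, pLogoAsset2 and logoRange1, keyTimeRange, logoRange2 (the ranges would be recorded in the composition loop above):
// The three instructions must tile the whole composition with no gaps or overlaps.
NSArray* pSourceAssets = @[ pLogoAsset1, pKeyAsset, pLogoAsset2 ];        // hypothetical asset names
NSArray* pSourceRanges = @[ [NSValue valueWithCMTimeRange: logoRange1],   // hypothetical range names
                            [NSValue valueWithCMTimeRange: keyTimeRange],
                            [NSValue valueWithCMTimeRange: logoRange2] ];
NSMutableArray* pInstructions = [NSMutableArray array];
for ( NSUInteger i = 0; i < pSourceAssets.count; ++i )
{
    AVAsset* pSourceAsset = pSourceAssets[ i ];
    AVAssetTrack* pSourceVideoTrack = [pSourceAsset tracksWithMediaType: AVMediaTypeVideo].firstObject;

    AVMutableVideoCompositionLayerInstruction* pLayerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack: pCompositionVideoTrack];
    [pLayerInstruction setTransform: pSourceVideoTrack.preferredTransform atTime: kCMTimeZero];

    AVMutableVideoCompositionInstruction* pInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    pInstruction.timeRange = [pSourceRanges[ i ] CMTimeRangeValue];
    pInstruction.layerInstructions = @[ pLayerInstruction ];
    [pInstructions addObject: pInstruction];
}
pVideoComposition.instructions = pInstructions;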

Processing all frames in an AVAsset

I am trying to go through each frame in an AVAsset and process each frame as if it were an image. I have not been able to find anything from my searches.
The task I am trying to accomplish would look like this in pseudo-code:
for each frame in asset
take the frame as an image and convert to a cvMat
Process and store data of center points
Store center points in array
The only part of that pseudo-code I do not know how to write is going through each frame and capturing it as an image.
Can anyone help?
One answer is to use AVAssetImageGenerator.
1) Load the movie file into an AVAsset object.
2) Create an AVAssetImageGenerator object.
3) Pass in an estimated time of the frame where you want to get an image back from the movie.
Setting the two properties requestedTimeToleranceBefore and requestedTimeToleranceAfter on the AVAssetImageGenerator object to kCMTimeZero makes it more likely that you get the exact individual frames you request, but it increases the processing time.
However, this method is slow, and I have not found a faster way.
//Load the Movie from a URL
self.movieAsset = [AVAsset assetWithURL:self.movieURL];
NSArray *movieTracks = [self.movieAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *movieTrack = [movieTracks objectAtIndex:0];
//Make the image Generator
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:self.movieAsset];
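//Optionally ask for frame-accurate images (slower, as noted above)
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;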
//Create variables for the time estimation
Float64 durationSeconds = CMTimeGetSeconds(self.movieAsset.duration);
Float64 timePerFrame = 1.0 / (Float64)movieTrack.nominalFrameRate;
Float64 totalFrames = durationSeconds * movieTrack.nominalFrameRate;
//Step through the frames
for (int counter = 0; counter <= totalFrames; counter++){
    CMTime actualTime;
    Float64 secondsIn = ((float)counter/totalFrames)*durationSeconds;
    CMTime imageTimeEstimate = CMTimeMakeWithSeconds(secondsIn, 600);
    NSError *error;
    CGImageRef image = [imageGenerator copyCGImageAtTime:imageTimeEstimate actualTime:&actualTime error:&error];
    //...Do some processing on the image
    CGImageRelease(image);
}
You could simply read each frame using AVAssetReaderTrackOutput:
let asset = AVAsset(url: inputUrl)
let reader = try! AVAssetReader(asset: asset)
let videoTrack = asset.tracks(withMediaType: .video).first!
let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]
let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack,
                                                 outputSettings: outputSettings)
reader.add(trackReaderOutput)
reader.startReading()
while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
    if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        // do what you want
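        // For example (a sketch, assuming the BGRA format requested above): lock the
        // buffer before reading the raw bytes, then unlock it when done.
        CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
        // ... read CVPixelBufferGetBaseAddress(imageBuffer), CVPixelBufferGetWidth(imageBuffer),
        //     CVPixelBufferGetHeight(imageBuffer) and CVPixelBufferGetBytesPerRow(imageBuffer) here ...
        CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)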
    }
}

Precise timing with AVMutableComposition

I'm trying to use AVMutableComposition to play a sequence of sound files at precise times.
When the view loads, I create the composition with the intent of playing 4 sounds evenly spaced over 1 second. It shouldn't matter how long or short the sounds are; I just want to fire them at exactly 0, 0.25, 0.5 and 0.75 seconds:
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
for (NSInteger i = 0; i < 4; i++)
{
    AVMutableCompositionTrack* track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    NSURL *url = [[NSBundle mainBundle] URLForResource:[NSString stringWithFormat:@"sound_file_%i", i] withExtension:@"caf"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:options];
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    CMTimeRange timeRange = [assetTrack timeRange];
    Float64 t = i * 0.25;
    NSError *error;
    BOOL success = [track insertTimeRange:timeRange ofTrack:assetTrack atTime:CMTimeMakeWithSeconds(t, 1) error:&error];
    if (!success)
    {
        NSLog(@"unsuccessful creation of composition");
    }
    if (error)
    {
        NSLog(@"composition creation error: %@", error);
    }
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:playerItem];
The composition is created successfully with no errors. Later, when I want to play the sequence I do this:
[self.avPlayer seekToTime:CMTimeMakeWithSeconds(0, 1)];
[self.avPlayer play];
For some reason, the sounds are not evenly spaced at all, but play almost all at once. I tried the same thing spaced over 4 seconds, replacing the time calculation like this:
Float64 t = i * 1.0;
And this plays perfectly. Any time interval under 1 second seems to generate unexpected results. What am I missing? Are AVCompositions not supposed to be used for time intervals under 1 second? Or perhaps I'm misunderstanding the time intervals?
Your CMTimeMakeWithSeconds(t, 1) is in whole second 'slices' because your timescale is set to 1. No matter what fraction t is, the atTime: will always end up as 0. This is why it works when you increase it to 1 second (t=i*1).
You need to set the timescale to 4 to get your desired 0.25-second slices. Since the CMTime is now in 0.25-second slices, you won't need the i * 0.25 calculation. Just use i directly: atTime:CMTimeMake(i, 4).
If you might need to get more precise in the future, you should account for it now so you won't have to adjust your code later. Apple recommends using a timescale of 600, as it is a multiple of the common video frame rates (24, 25, and 30 FPS), but it works fine for audio-only compositions too. In your situation, each 0.25-second step is 150 slices at that timescale: atTime:CMTimeMake(i * 150, 600).
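A minimal sketch of the corrected insertion, assuming the same loop and variables as in the question's code:
for (NSInteger i = 0; i < 4; i++)
{
    // ... build track, assetTrack and timeRange exactly as before ...
    CMTime insertTime = CMTimeMake(i * 150, 600);   // 150/600 = 0.25 seconds per step
    BOOL success = [track insertTimeRange:timeRange ofTrack:assetTrack atTime:insertTime error:&error];
    // ... error checking as before ...
}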
As for your issue of all 4 sounds playing almost all at once, be aware of this unanswered SO question where it only happens on the first play. Even with the changes above, you might still run into this issue.
Unless each track is exactly 0.25 seconds long, this is your problem:
Float64 t = i * 0.25;
NSError *error;
BOOL success = [track insertTimeRange:timeRange ofTrack:assetTrack atTime:CMTimeMakeWithSeconds(t, 1) error:&error];
You need to keep track of the cumulative duration added so far and insert the next track at that time:
CMTime currentTime = kCMTimeZero;
for (NSInteger i = 0; i < 4; i++) {
    /* Code to create track for insertion */
    CMTimeRange trackTimeRange = [assetTrack timeRange];
    BOOL success = [track insertTimeRange:trackTimeRange
                                  ofTrack:assetTrack
                                   atTime:currentTime
                                    error:&error];
    /* Error checking code */
    //Update time range for insertion
    currentTime = CMTimeAdd(currentTime, trackTimeRange.duration);
}
I changed your code a bit; sorry, I had no time to test it.
AVMutableComposition *composition = [AVMutableComposition composition];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
CMTime totalDuration = kCMTimeZero;
for (NSInteger i = 0; i < 4; i++)
{
    AVMutableCompositionTrack* track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"Record_%i", i] ofType:@"caf"]];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    CMTimeRange timeRange = [assetTrack timeRange];
    NSError *error;
    CMTime insertTime = CMTIME_COMPARE_INLINE(totalDuration, >, kCMTimeZero) ? CMTimeAdd(totalDuration, CMTimeMake(1, 4)) : totalDuration;
    BOOL success = [track insertTimeRange:timeRange ofTrack:assetTrack atTime:insertTime error:&error];
    if (!success)
    {
        NSLog(@"unsuccessful creation of composition");
    }
    if (error)
    {
        NSLog(@"composition creation error: %@", error);
    }
    totalDuration = CMTimeAdd(CMTimeAdd(totalDuration, CMTimeMake(1, 4)), asset.duration);
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:playerItem];
P.S. use kCMTimeZero instead of CMTimeMakeWithSeconds(0, 1).

Determine whether an MP4 is audio or video in Objective-C

Since MP4 is a container file format, it can store audio as well as video. What I am struggling to find out is its true media type (whether it is audio or video). Can this be done in iOS (Objective-C)?
AVAsset *asset = [AVAsset assetWithURL:<URL to mp4>];
BOOL hasVideo = [asset tracksWithMediaType:AVMediaTypeVideo].count > 0;
BOOL hasAudio = [asset tracksWithMediaType:AVMediaTypeAudio].count > 0;
BOOL isMP4VideoType; // Global variable
BOOL isMP4AudioType; // Global variable
//Create an object of the AVAsset class with your MP4 URL.
AVAsset *asset = [AVAsset assetWithURL:YOUR_MP4_URL];
NSArray *aryVideoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
NSArray *aryAudioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
if ([aryVideoTracks count] != 0)
{
    isMP4VideoType = YES;
}
else if ([aryAudioTracks count] != 0)
{
    isMP4AudioType = YES;
}
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:_assetUrl options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *audioTrack = [audioTracks firstObject];
if ([audioTrack hasMediaCharacteristic:AVMediaCharacteristicAudible])
{
    // The asset has an audible track.
}
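Note that for a remote MP4 these track arrays may be empty until the asset has loaded its track information. A minimal sketch (assuming an AVAsset named asset, as in the answers above) that loads the tracks key before checking:
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusLoaded)
    {
        if ([asset tracksWithMediaType:AVMediaTypeVideo].count > 0)
        {
            // Treat the file as video.
        }
        else if ([asset tracksWithMediaType:AVMediaTypeAudio].count > 0)
        {
            // Treat the file as audio only.
        }
    }
}];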

Why doesn't my AVURLAsset have a duration?

In my iOS app (SDK 5.6), I create an AVURLAsset from a short video, like so:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
This seems to work fine. If I examine the asset in gdb, it looks like this:
(gdb) po asset
<AVURLAsset: 0x1a1c40, URL = file://localhost/var/mobile/Applications/A2142D8E-BC19-4E0B-A4C8-ABED4F7D4970/Documents/sample_iTunes.mov>
I use a sample .mov file from the Apple website, stored in the app's Documents directory.
I would like to know the duration of this video, for further processing, but when I examine the duration property, I get this:
(gdb) po asset.duration
There is no member named duration.
What am I doing wrong? Is there any other way to determine the duration of an AVAsset or AVAssetTrack?
TIA: John
Creating the asset, however, does not necessarily mean that it's ready for use. To be used, an asset must have loaded its values (such as its tracks and duration). Load them with:
NSArray *keys = @[@"duration", @"tracks"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded)
    {
        // The completion block goes here; the duration is now safe to read.
    }
}];
And there should be a @"duration" key in the keys array, as above.
When loading completes, get the duration with:
Float64 durationSeconds = CMTimeGetSeconds(asset.duration);
For Swift 3.0 and above
import AVFoundation
let asset = AVAsset(url: URL(fileURLWithPath: "Your File Path here"))
let totalSeconds = Int(CMTimeGetSeconds(asset.duration))
let minutes = totalSeconds / 60
let seconds = totalSeconds % 60
let mediaTime = String(format:"%02i:%02i",minutes, seconds)
