In my iOS app (SDK 5.6), I create an AVURLAsset from a short video, like so:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
This seems to work fine. If I examine the asset in gdb, it looks like this:
(gdb) po asset
<AVURLAsset: 0x1a1c40, URL = file://localhost/var/mobile/Applications/A2142D8E-BC19-4E0B-A4C8-ABED4F7D4970/Documents/sample_iTunes.mov>
I use a sample .mov file from the Apple website, stored in the app's Documents directory.
I would like to know the duration of this video, for further processing, but when I examine the duration property, I get this:
(gdb) po asset.duration
There is no member named duration.
What am I doing wrong? Is there any other way to determine the duration of an AVAsset or AVAssetTrack?
TIA: John
Creating the asset, however, does not necessarily mean that it’s ready for use. To be used, an asset must have loaded its tracks. Load the asset with
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
    // The completion block goes here.
}];
Include a @"duration" key in that array as well.
When loading completes, get the duration with:
Float64 durationSeconds = CMTimeGetSeconds([asset duration]);
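Putting the pieces together, a minimal sketch (assuming assetURL is the file URL from the question):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    NSError *error = nil;
    // Only read the property once its value has actually loaded
    if ([asset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        Float64 durationSeconds = CMTimeGetSeconds(asset.duration);
        NSLog(@"duration: %f seconds", durationSeconds);
    } else {
        NSLog(@"duration not loaded: %@", error);
    }
}];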
For Swift 3.0 and above
import AVFoundation
let asset = AVAsset(url: URL(fileURLWithPath: "Your File Path here"))
let totalSeconds = Int(CMTimeGetSeconds(asset.duration))
let minutes = totalSeconds / 60
let seconds = totalSeconds % 60
let mediaTime = String(format: "%02i:%02i", minutes, seconds)
Essentially I am looking to concatenate AVAsset files. I've got a rough idea of what to do but I'm struggling with loading the audio files.
I can play the files with an AVAudioPlayer, and I can see them in the directory via my terminal, but when I attempt to load them with AVURLAsset, it always returns an empty array for tracks.
The URL I am using:
NSURL *firstAudioFileLocation = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", workingDirectory, @"/temp.pcm"]];
Which results in:
file:///Users/evolve/Library/Developer/CoreSimulator/Devices/8BF465E8-321C-47E6-BF2E-049C5E900F3C/data/Containers/Data/Application/4A2D29B2-E5B4-4D07-AE6B-1DD15F5E59A3/Documents/temp.pcm
The asset being loaded:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
However when calling this:
NSLog(#" total tracks %#", test.tracks);
My output is always total tracks ().
My subsequent calls to add them to my AVMutableCompositionTrack end up crashing the app as the AVAsset seems to not have loaded correctly.
I have played with other variations for loading the asset including:
NSURL *alternativeLocation = [[NSBundle mainBundle] URLForResource:@"temp" withExtension:@"pcm"];
As well as trying to load AVAsset with the options from the documentation:
NSDictionary *assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
How do I load the tracks from a local resource, recently created by the AVAudioRecorder?
EDIT
I had a poke around and found that I can record and load a file with the .caf extension.
It seems .pcm is unsupported by AVAsset; this page was also a great help: https://developer.apple.com/documentation/avfoundation/avfiletype
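For anyone hitting the same wall, here is a minimal recorder setup that produces a .caf file AVAsset can load. The settings values are illustrative assumptions, not the exact ones from my project:
//Record linear PCM into a CAF container instead of a bare .pcm file
NSURL *cafURL = [NSURL fileURLWithPath:[workingDirectory stringByAppendingString:@"/temp.caf"]];
NSDictionary *settings = @{AVFormatIDKey: @(kAudioFormatLinearPCM), //raw PCM samples, CAF wrapper
                           AVSampleRateKey: @44100.0,
                           AVNumberOfChannelsKey: @1,
                           AVLinearPCMBitDepthKey: @16};
NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:cafURL settings:settings error:&error];
[recorder record];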
An AVAsset load is not instantaneous. You need to wait for the data to be available. Example:
AVAsset *test = [AVURLAsset URLAssetWithURL:firstAudioFileLocation options:nil];
[test loadValuesAsynchronouslyForKeys:@[@"playable", @"tracks"] completionHandler:^{
    // Now tracks is available
    NSLog(@"total tracks %@", test.tracks);
}];
A more detailed example can be found in the documentation.
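Since the load can fail (missing file, unsupported type), it is also worth checking the key's status before reading the property. A small sketch along the same lines:
[test loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [test statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        NSLog(@"total tracks %@", test.tracks);
    } else {
        //Failed, cancelled, or still loading
        NSLog(@"tracks not loaded: %@", error);
    }
}];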
I am using the AVAsset class and passing it a URL, but it does not provide the duration; I only get NaN.
CMTime audioDuration = audioAsset.duration;
float audioDurationSeconds = CMTimeGetSeconds(audioDuration);
double duration = CMTimeGetSeconds(asset.duration);
Try the code below:
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:info[UIImagePickerControllerMediaURL] options:nil]; // info[UIImagePickerControllerMediaURL] is a file URL; pass your own URL here
CMTime videoDuration = asset1.duration;
float videoDurationInSeconds = CMTimeGetSeconds(videoDuration);
And make sure that your URL is a file URL!
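Building the URL from a bare path string is the usual culprit; a quick sketch with a hypothetical path:
NSString *path = @"/path/to/video.mov"; //hypothetical local path
NSURL *wrongURL = [NSURL URLWithString:path]; //not a file URL; the duration comes back invalid (NaN)
NSURL *fileURL = [NSURL fileURLWithPath:path]; //correct: file:///path/to/video.mov
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];
float videoDurationInSeconds = CMTimeGetSeconds(asset.duration);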
I am trying to go through each frame in an AVAsset and process each frame as if it were an image. I have not been able to find anything from my searches.
The task I am trying to accomplish would look like this in pseudo-code
for each frame in asset
take the frame as an image and convert to a cvMat
Process and store data of center points
Store center points in array
The only part of that pseudo-code I do not know how to write is going through each frame and capturing it as an image.
Can anyone help?
One answer is to use AVAssetImageGenerator.
1) Load the movie file into an AVAsset object.
2) Create an AVAssetImageGenerator object.
3) Pass in an estimated time of the frame where you want to get an image back from the movie.
Setting the two properties requestedTimeToleranceBefore and requestedTimeToleranceAfter on the AVAssetImageGenerator object to kCMTimeZero improves the chances of getting the exact frame you asked for, but increases the processing time (see the comment in the code below).
However, this method is slow, and I have not found a faster way.
//Load the Movie from a URL
self.movieAsset = [AVAsset assetWithURL:self.movieURL];
NSArray *movieTracks = [self.movieAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *movieTrack = [movieTracks objectAtIndex:0];
//Make the image Generator
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:self.movieAsset];
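//Optional: request exact frames rather than nearby keyframes
//(frame-accurate but slower, per the note above)
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;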
//Create variables for the time estimation
Float64 durationSeconds = CMTimeGetSeconds(self.movieAsset.duration);
Float64 timePerFrame = 1.0 / (Float64)movieTrack.nominalFrameRate;
Float64 totalFrames = durationSeconds * movieTrack.nominalFrameRate;
//Step through the frames
for (int counter = 0; counter < (int)totalFrames; counter++){
CMTime actualTime;
Float64 secondsIn = ((float)counter/totalFrames)*durationSeconds;
CMTime imageTimeEstimate = CMTimeMakeWithSeconds(secondsIn, 600);
NSError *error;
CGImageRef image = [imageGenerator copyCGImageAtTime:imageTimeEstimate actualTime:&actualTime error:&error];
//...Do some processing on the image here
CGImageRelease(image);
}
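If stepping through frames one at a time with copyCGImageAtTime: is too slow, AVAssetImageGenerator also has a batch call that processes a whole list of times in one request. A rough sketch reusing the variables above (error handling kept minimal):
NSMutableArray *times = [NSMutableArray array];
for (int counter = 0; counter < (int)totalFrames; counter++) {
    Float64 secondsIn = ((float)counter / totalFrames) * durationSeconds;
    [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(secondsIn, 600)]];
}
[imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                     completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        //...process the image; do not CGImageRelease it here, the generator owns it
    }
}];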
You could instead read each frame using an AVAssetReaderTrackOutput:
let asset = AVAsset(url: inputUrl)
let reader = try! AVAssetReader(asset: asset)
let videoTrack = asset.tracks(withMediaType: .video).first!
let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]
let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
reader.add(trackReaderOutput)
reader.startReading()
while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
// do what you want
}
}
I have a video whose duration in the Mac player is 31 seconds, but when I load that file in my app, the duration of the AVAsset is 28.03.
AVAsset *videoAsset = [AVAsset assetWithURL:videoUrl];
Float64 time = CMTimeGetSeconds(videoAsset.duration);
For some types of assets, the duration is an approximation. If you need the exact duration (which should be a rare case), use:
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoUrl options:options];
You can find more information in the documentation. Calculating the duration may take some time, so remember to use asynchronous loading:
[videoAsset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    switch ([videoAsset statusOfValueForKey:@"duration" error:nil]) {
        case AVKeyValueStatusLoaded: {
            Float64 time = CMTimeGetSeconds(videoAsset.duration);
            // ...
            break;
        }
        default:
            // other cases, like cancellation or failure
            break;
    }
}];
You can find some more tips on using the AVFoundation API in the video Discovering AV Foundation (WWDC 2010, Session 405).
This was working fine in iOS 6, but in iOS 7, after I export a portion of a song using AVAssetExportSession to a file, the exported file's duration is wrong in AVAudioPlayer, but correct in AVURLAsset.
AVAudioPlayer incorrectly reports the duration as the whole song duration.
I'm exporting files using steps from https://developer.apple.com/library/ios/qa/qa1730/_index.html
and checking the durations as below:
AVURLAsset* audioAsset = [AVURLAsset URLAssetWithURL:outputURL options:nil];
CMTime audioDuration = audioAsset.duration; // shows correct
and
AVAudioPlayer* avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:outputURL error:nil];
NSTimeInterval duration = avAudioPlayer.duration; // shows wrong
Interestingly, if I play the exported file in iTunes, it also shows the wrong (whole) duration.
I'm not sure how to fix this problem. Could this be a bug in iOS7?