AVMutableCompositionTrack - Using insertEmptyTimeRange to insert silence between two WAV files - iOS

The problem I am having is getting a variable amount of silence placed between two WAV files.
My approach thus far is as follows:
First, I create an AVMutableComposition and an AVMutableCompositionTrack:
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *appendedAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
Then, using AVURLAsset, I load my conveniently named first.wav file.
AVURLAsset *firstComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"first" ofType:@"wav"]] options:nil];
I then insert this into the mutable composition track I named appendedAudioTrack earlier
NSArray *firstTrack = [firstComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, firstComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange
                            ofTrack:[firstTrack objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&error];
The second.wav file is inserted in the exact same way:
AVURLAsset *secondComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"second" ofType:@"wav"]] options:nil];
NSArray *secondTrack = [secondComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange2 = CMTimeRangeMake(kCMTimeZero, secondComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange2
                            ofTrack:[secondTrack objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&error];
This so far successfully joins the two wav files end to end.
I then try to insert a variable amount of silence using insertEmptyTimeRange like this:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration,CMTimeAdd(firstComponent.duration,CMTimeMake(interval,1)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
The silence duration is a float named interval; it represents the desired silence in seconds, and for testing purposes it is currently 0.49.
An assumption I've made when building the CMTimeRange is that AVURLAsset's duration property can be treated as the end CMTime of the first audio track.
When I download the Documents directory from Organizer in Xcode and look at the resulting .m4a file in Audacity, the silence is in the file, but it's at the start, not between the two .wav files as desired.
Incorrectly it goes: SILENCE, first.wav, second.wav
I would like to know how to properly use insertEmptyTimeRange to produce
first.wav, SILENCE, second.wav.
Note: I have seen this question (and others) presenting a very similar problem, but the approach there was to use a constant silence file; my silence is variable and determined at run time. I am also aware that another answer claims to provide a solution, but it has not worked for me. I have tried all the different methods I found online, but it seems I'm misunderstanding something, because it never works correctly for me.
Just in case it matters, I export the file like so:
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (!exportSession)
{
// do something
return nil;
}
//This gives me an output URL to a file name that doesn't yet exist
NSString *path2;
NSArray *paths2 = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
path2 = [paths2 objectAtIndex:0];
path2 = [path2 stringByAppendingPathComponent:[string stringByAppendingString:@".m4a"]];
NSString* appendedAudioPath= path2;
exportSession.outputURL = [NSURL fileURLWithPath:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"%@", exportSession.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            break;
        case AVAssetExportSessionStatusWaiting:
            break;
        default:
            break;
    }
}];
Thanks

It turns out the interval was not of the correct duration; when checked in Audacity it was actually 0.049, not 0.49.
The problem I was having was with this:
CMTimeMake(interval,1)
I thought this would yield interval/1 seconds but I was wrong.
Instead I used CMTimeMakeWithSeconds along with NSEC_PER_SEC, like so:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration, CMTimeAdd(firstComponent.duration, CMTimeMakeWithSeconds(interval, NSEC_PER_SEC)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
This then gave me the desired silence of interval seconds, in the right place.
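For anyone else hitting this, here is a minimal comparison of the two calls (interval is the float from the question; the names imprecise/precise are just illustrative):
// CMTimeMake(value, timescale) takes an int64_t value, so passing a fractional
// float such as 0.49 does not give you 0.49 seconds:
CMTime imprecise = CMTimeMake(interval, 1);
// CMTimeMakeWithSeconds(seconds, preferredTimescale) takes a Float64, so the
// fractional seconds are preserved at the chosen timescale:
CMTime precise = CMTimeMakeWithSeconds(interval, NSEC_PER_SEC);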

Related

AVAssetExportSession exporting too slow

I am trying to export an AVMutableComposition using AVAssetExportSession.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = mainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
    switch (exporter.status)
    {
        case AVAssetExportSessionStatusCompleted:
        {
            NSLog(@"Video merge successful");
        }
            break;
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed: %@", exporter.error.description);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Cancelled: %@", exporter.error);
            break;
        case AVAssetExportSessionStatusExporting:
            NSLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"Waiting");
            break;
        default:
            break;
    }
}];
But exporting even a 1 minute video takes around 30 seconds, which is too much considering that the iPad's built-in camera app takes less than 2 seconds.
Also, if I remove the videoComposition from the exporter, the time drops to 7 seconds, which is still bad considering the video is only 1 minute long.
So, I want to know how to decrease the export time to a minimum.
Also, does AVAssetExportSession generally take this much time, or is it just my case?
Update:
Merge Code:
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableVideoCompositionLayerInstruction *videoTrackLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;
CMTime time = kCMTimeZero;
for (AVURLAsset *asset in assets)
{
AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
NSError *error;
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration )
ofTrack:assetTrack
atTime:time
error:&error];
[videoTrackLayerInstruction setTransform:assetTrack.preferredTransform atTime:time];
if (error) {
    NSLog(@"asset url :: %@", assetTrack.asset);
    NSLog(@"Error1 - %@", error.debugDescription);
}
[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAssetTrack.timeRange.duration)
ofTrack:audioAssetTrack
atTime:time
error:&error];
if (error) {
    NSLog(@"Error2 - %@", error.debugDescription);
}
time = CMTimeAdd(time, assetTrack.timeRange.duration);
if (CGSizeEqualToSize(size, CGSizeZero)) {
size = assetTrack.naturalSize;
}
}
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);
mainInstruction.layerInstructions = [NSArray arrayWithObject:videoTrackLayerInstruction];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = size;
I think the issue here is AVAssetExportPresetHighestQuality: it causes conversion or upsampling and slows things WAY down. Try using AVAssetExportPresetPassthrough instead. It took my export times from ~35+ seconds down to less than a second.
I also disabled optimizing for network use, since all of our videos are only used within the app and never streamed or passed over a network.
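A minimal sketch of those two changes applied to the exporter from the question (an assumption on my part: since passthrough copies samples rather than re-encoding, the videoComposition is left unset here, as its instructions would not be rendered):
// Passthrough export: no re-encode, network optimization off for local-only use.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                     presetName:AVAssetExportPresetPassthrough];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = NO;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"Export failed: %@", exporter.error);
    }
}];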
I built an app that merges together different video fragments, and I can safely say it is just your case. My video files are ~10 MB, so maybe they are a little smaller, but it takes less than a second to merge them all together, even with 10 or 20 segments.
Now as for what is actually happening: I have checked my configuration against yours, and the differences are the following:
I use export.outputFileType = AVFileTypeMPEG4
I have network optimization disabled, and if you are not planning to stream the video from your device, you should disable it too
Other than that, it should be the same. I can't really compare further without the code showing how you actually create the composition. There are some things to check, though:
If you are using AVURLAssetPreferPreciseDurationAndTimingKey when creating the AVURLAsset and you don't have enough keyframes, it can actually take quite some time to seek through and find the keyframes, which slows things down (see the sketch after this list)
Consider whether you really need the highest quality for the video
Consider the resolution of the video and possibly lower it
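A small sketch of that first point (fileURL is a placeholder for your clip's URL):
// Only ask for precise duration/timing when you need frame-accurate editing;
// requesting it forces extra parsing of the file before the asset is usable.
NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @NO };
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:options];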
I should be able to help you more if you provide more information, but maybe some of this stuff will work. Give it a try and then report back.
Hope it helps a little!
Edit 1:
I forgot to mention that if you are out of options, you should try the FFmpeg library, as it is very high performance, though due to licensing it might not be suitable for you.
Keep in mind that the asset you're trying to export might not be stored locally; in that case it first downloads the content and then exports your asset.
If you don't want to download any content:
let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
videoRequestOptions.isNetworkAccessAllowed = false
You are also going to receive, within the requestExportSession completion handler, an info dictionary with a couple of useful values:
https://developer.apple.com/documentation/photokit/phimagemanager/image_result_info_keys
Otherwise, if you want to download your asset from iCloud and make it as fast as possible, you can play with the following parameters:
let videoRequestOptions: PHVideoRequestOptions = PHVideoRequestOptions()
// automatic, highQualityFormat, mediumQualityFormat, fastFormat
videoRequestOptions.deliveryMode = .fastFormat
videoRequestOptions.isNetworkAccessAllowed = true
Another important property is the export preset; there are a number of available presets:
let lowQualityPreset1 = AVAssetExportPresetLowQuality
let lowQualityPreset2 = AVAssetExportPreset640x480
let lowQualityPreset3 = AVAssetExportPreset960x540
let lowQualityPreset4 = AVAssetExportPreset1280x720
let manager = PHImageManager()
manager.requestExportSession(forVideo: asset,
                             options: videoRequestOptions,
                             exportPreset: lowQualityPreset1) { (session, info) in
    session?.outputURL = outputUrl
    session?.outputFileType = .mp4
    session?.shouldOptimizeForNetworkUse = true
    session?.exportAsynchronously {
    }
}

Merging Clips with Different Resolutions

I have a set of video clips that I would like to merge together and then put a watermark on it.
I am able to do both individually; however, problems arise when I perform them together.
All clips that will be merged are either 1920x1080 or 960x540.
For some reason, AVAssetExportSession does not display them well together.
Here are the 2 bugs based on 3 different scenarios:
This image is a result of:
Merging Clips together
As you can see, there is nothing wrong here, the output video produces the desired effect.
However, when I then try to add a watermark, it creates the following issue:
This image is a result of:
Merging Clips together
Putting a watermark on it
BUG 1: Some clips in the video get resized for whatever reason while other clips do not.
This image is a result of:
Merging Clips together
Resizing clips that are 960x540 to 1920x1080
Putting a watermark on it
BUG 2: Now the clips that need to be resized get resized; however, the old unresized clip is still there.
Merging/Resizing Code:
-(void) mergeClips{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *mutableVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *mutableAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// loop through the list of videos and add them to the track
CMTime currentTime = kCMTimeZero;
NSMutableArray* instructionArray = [[NSMutableArray alloc] init];
if (_clipsArray){
for (int i = 0; i < (int)[_clipsArray count]; i++){
NSURL* url = [_clipsArray objectAtIndex:i];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CGSize size = videoTrack.naturalSize;
CGFloat widthScale = 1920.0f/size.width;
CGFloat heightScale = 1080.0f/size.height;
// lines that performs resizing
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mutableVideoTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(widthScale,heightScale);
CGAffineTransform move = CGAffineTransformMakeTranslation(0,0);
[layerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:currentTime];
[instructionArray addObject:layerInstruction];
[mutableVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofTrack:videoTrack
atTime:currentTime error:nil];
[mutableAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
ofTrack:audioTrack
atTime:currentTime error:nil];
currentTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) + CMTimeGetSeconds(currentTime), asset.duration.timescale);
}
}
AVMutableVideoCompositionInstruction * mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, currentTime);
mainInstruction.layerInstructions = instructionArray;
// 4 - Get path
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *lastPostedDayPath = [documentsDirectory stringByAppendingPathComponent:@"lastPostedDay"];
//Check if folder exists, if not create folder
if (![[NSFileManager defaultManager] fileExistsAtPath:lastPostedDayPath]){
[[NSFileManager defaultManager] createDirectoryAtPath:lastPostedDayPath withIntermediateDirectories:NO attributes:nil error:nil];
}
NSString *fileName = [NSString stringWithFormat:@"%li_%li_%li.mov", (long)_month, (long)_day, (long)_year];
NSString *finalDayPath = [lastPostedDayPath stringByAppendingPathComponent:fileName];
NSURL *url = [NSURL fileURLWithPath:finalDayPath];
BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:finalDayPath];
if (fileExists){
NSLog(#"file exists");
[[NSFileManager defaultManager] removeItemAtURL:url error:nil];
}
AVMutableVideoComposition *mainComposition = [AVMutableVideoComposition videoComposition];
mainComposition.instructions = [NSArray arrayWithObject:mainInstruction];
mainComposition.frameDuration = CMTimeMake(1, 30);
mainComposition.renderSize = CGSizeMake(1920.0f, 1080.0f);
// 5 - Create exporter
_exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
_exportSession.outputURL=url;
_exportSession.outputFileType = AVFileTypeQuickTimeMovie;
_exportSession.shouldOptimizeForNetworkUse = YES;
_exportSession.videoComposition = mainComposition;
[_exportSession exportAsynchronouslyWithCompletionHandler:^{
[merge_timer invalidate];
merge_timer = nil;
switch (_exportSession.status) {
    case AVAssetExportSessionStatusFailed:
        NSLog(@"Export failed -> Reason: %@, User Info: %@",
              _exportSession.error.localizedDescription,
              _exportSession.error.userInfo.description);
        [self showSavingFailedDialog];
        break;
    case AVAssetExportSessionStatusCancelled:
        NSLog(@"Export cancelled");
        [self showSavingFailedDialog];
        break;
    case AVAssetExportSessionStatusCompleted:
        NSLog(@"Export finished");
        [self addWatermarkToExportSession:_exportSession];
        break;
    default:
        break;
}
}];
});
}
Once this finishes, I run it through a different export session that simply adds a watermark.
Is there something I am doing wrong in my code or process?
Is there an easier way for achieving this?
Thank you for your time!
I was able to solve my issue.
For some reason, AVAssetExportSession does not actually create a 'flat' video file of the merged clips, so when adding the watermark it still recognized the lower-resolution clips and their positions, which caused them to resize.
What I did to solve this was to first use AVAssetWriter to merge my clips and create one 'flat' file. I could then add the watermark without the resizing issue; a rough sketch of that flattening step is below.
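A rough sketch of that flattening step, assuming mixComposition and mainComposition from the merge code above and a hypothetical flatURL for the intermediate file (video only; an audio reader output and writer input would be added the same way):
NSError *error = nil;
// Read the composition, rendering the scaling instructions while reading.
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:mixComposition error:&error];
AVAssetReaderVideoCompositionOutput *readerOutput =
    [AVAssetReaderVideoCompositionOutput assetReaderVideoCompositionOutputWithVideoTracks:
        [mixComposition tracksWithMediaType:AVMediaTypeVideo] videoSettings:nil];
readerOutput.videoComposition = mainComposition;
[reader addOutput:readerOutput];
// Write a single flat 1920x1080 H.264 file.
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:flatURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
AVAssetWriterInput *writerInput = [AVAssetWriterInput
    assetWriterInputWithMediaType:AVMediaTypeVideo
                   outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @1920,
                                     AVVideoHeightKey : @1080 }];
[writer addInput:writerInput];
[writer startWriting];
[reader startReading];
[writer startSessionAtSourceTime:kCMTimeZero];
dispatch_queue_t queue = dispatch_queue_create("flatten.video", DISPATCH_QUEUE_SERIAL);
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.readyForMoreMediaData) {
        CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];
        if (sample) {
            [writerInput appendSampleBuffer:sample];
            CFRelease(sample);
        } else {
            [writerInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{
                // flatURL now holds the flattened movie; run the watermark pass on it.
            }];
            break;
        }
    }
}];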
Hope this helps anyone who may come across this problem in the future!
I also encountered the same problem. You can set the opacity to 0.0 after one video ends, like this:
[layerInstruction setOpacity:0.0 atTime:duration];

iOS video to audio file conversion [duplicate]

This question already has an answer here:
iPhone - Separating audio from a video file and saving it to a separate file
(1 answer)
Closed 9 years ago.
I managed to download a YouTube video using NSURLConnection and save it to the device. Now I want to convert this (I guess .mp4) file to an .mp3 audio file. Does anyone know a solution for this problem? Maybe there's a way to only download the audio from the video? That would save a lot of time.
First of all, you don't want to convert anything; that's slow. Instead, you want to extract the audio stream from the .mp4 file. You can do this by creating an AVMutableComposition containing only the audio track of the original file and then exporting the composition with an AVAssetExportSession. The code below is m4a-centric. If you want to handle both m4a and mp3 output, check the audio track type, make sure to set the right file extension, and choose between AVFileTypeMPEGLayer3 and AVFileTypeAppleM4A in the export session (see the sketch after the code).
NSURL* dstURL = [NSURL fileURLWithPath:dstPath];
[[NSFileManager defaultManager] removeItemAtURL:dstURL error:nil];
AVMutableComposition* newAudioAsset = [AVMutableComposition composition];
AVMutableCompositionTrack* dstCompositionTrack;
dstCompositionTrack = [newAudioAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset* srcAsset = [AVURLAsset URLAssetWithURL:srcURL options:nil];
AVAssetTrack* srcTrack = [[srcAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTimeRange timeRange = srcTrack.timeRange;
NSError* error;
if(NO == [dstCompositionTrack insertTimeRange:timeRange ofTrack:srcTrack atTime:kCMTimeZero error:&error]) {
NSLog(#"track insert failed: %#\n", error);
return;
}
AVAssetExportSession* exportSesh = [[AVAssetExportSession alloc] initWithAsset:newAudioAsset presetName:AVAssetExportPresetPassthrough];
exportSesh.outputFileType = AVFileTypeAppleM4A;
exportSesh.outputURL = dstURL;
[exportSesh exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = exportSesh.status;
    NSLog(@"exportAsynchronouslyWithCompletionHandler: %ld\n", (long)status);
    if (AVAssetExportSessionStatusFailed == status) {
        NSLog(@"FAILURE: %@\n", exportSesh.error);
    } else if (AVAssetExportSessionStatusCompleted == status) {
        NSLog(@"SUCCESS!\n");
    }
}];
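To make the m4a/mp3 remark above concrete, here is a hedged sketch of choosing the output type from the source codec (reusing srcTrack and exportSesh from the code; note that with a passthrough preset, AVFileTypeMPEGLayer3 only makes sense when the source audio is already MP3):
// Inspect the source audio codec and pick a matching container/extension.
CMFormatDescriptionRef desc =
    (__bridge CMFormatDescriptionRef)srcTrack.formatDescriptions.firstObject;
FourCharCode codec = CMFormatDescriptionGetMediaSubType(desc);
NSString *fileType;
NSString *extension;
if (codec == kAudioFormatMPEGLayer3) {
    fileType  = AVFileTypeMPEGLayer3;   // source is already MP3, passthrough keeps it
    extension = @"mp3";
} else {
    fileType  = AVFileTypeAppleM4A;     // AAC (and most other codecs) go into .m4a
    extension = @"m4a";
}
exportSesh.outputFileType = fileType;
NSLog(@"exporting as .%@", extension);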

Export 2 audio tracks (running at the same time) to one file (m4a)

I'm developing an iOS audio app in Xcode, and I'm trying to take 2 audio files I have recorded, which play at the same time, and export them to one audio file.
All I have managed to do is merge the 2 audio files into one, but the 2 recordings play one after another rather than in sync at the same time.
Does anyone have a clue how I can sort it out?
Thanks
You should take a look at this for AAC conversion (http://atastypixel.com/blog/easy-aac-compressed-audio-conversion-on-ios/). It's super useful.
Another thing you might want to consider... combining two audio signals is as easy as adding the samples together. So what you could do is:
Open both recordings and get an array for each of the recordings that holds the audio samples.
Make a for() loop that adds each sample and puts it in an output array
for (int i = 0; i < numberOfSamples; i++) {
    exportBuffer[i] = firstTrack[i] + secondTrack[i];
}
and then write the exportBuffer to an m4a file.
This code will only work if the two files are the same exact length, so adjust it to your needs. You'll need to add a conditional that fires if you've reached the end of one of the arrays. In that case, just add 0's.
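A rough sketch of that adjusted loop, assuming both recordings were decoded to 16-bit PCM sample buffers (firstTrack/secondTrack and firstLength/secondLength are hypothetical names):
NSUInteger outLength = MAX(firstLength, secondLength);
SInt16 *exportBuffer = malloc(outLength * sizeof(SInt16));
for (NSUInteger i = 0; i < outLength; i++) {
    // Treat samples past the end of the shorter recording as silence (0).
    SInt32 a = (i < firstLength)  ? firstTrack[i]  : 0;
    SInt32 b = (i < secondLength) ? secondTrack[i] : 0;
    // Sum in 32 bits and clamp, so the mix can't wrap around and distort.
    SInt32 mixed = a + b;
    if (mixed > INT16_MAX) mixed = INT16_MAX;
    if (mixed < INT16_MIN) mixed = INT16_MIN;
    exportBuffer[i] = (SInt16)mixed;
}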
Try Apple's MixerHost sample app.
/* Implement this method if you have already saved your recorded audio file */
-(void)mixAudio{
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack setPreferredVolume:0.8];
NSString *soundOne = [[NSBundle mainBundle] pathForResource:@"RecordAudio1" ofType:@"wav"];
NSURL *url = [NSURL fileURLWithPath:soundOne];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack = [tracks objectAtIndex:0];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionAudioTrack1 = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioTrack1 setPreferredVolume:0.8];
NSString *soundOne1 = [[NSBundle mainBundle] pathForResource:@"RecordAudio2" ofType:@"wav"];
NSURL *url1 = [NSURL fileURLWithPath:soundOne1];
AVAsset *avAsset1 = [AVURLAsset URLAssetWithURL:url1 options:nil];
NSArray *tracks1 = [avAsset1 tracksWithMediaType:AVMediaTypeAudio];
AVAssetTrack *clipAudioTrack1 = [tracks1 objectAtIndex:0];
[compositionAudioTrack1 insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset1.duration) ofTrack:clipAudioTrack1 atTime: kCMTimeZero error:nil];
AVAssetExportSession *exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession) return;
NSString *soundOneNew = [documentsDirectory stringByAppendingPathComponent:@"combined10.m4a"];
//NSLog(@"Output file path - %@", soundOneNew);
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:soundOneNew]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
// perform the export
[exportSession exportAsynchronouslyWithCompletionHandler:^{
if (AVAssetExportSessionStatusCompleted == exportSession.status) {
    NSLog(@"AVAssetExportSessionStatusCompleted");
} else if (AVAssetExportSessionStatusFailed == exportSession.status) {
    // a failure may happen because of an event out of your control
    // for example, an interruption like a phone call coming in
    // make sure and handle this case appropriately
    NSLog(@"AVAssetExportSessionStatusFailed");
} else {
    NSLog(@"Export Session Status: %ld", (long)exportSession.status);
}
}];
}

iOS Extracting Audio from .mov file

I've been trying to extract audio from a .mov file for a while now and I just can't seem to get it working. Specifically, I need to extract the audio and save it as an .aif or .aiff file.
I've tried using an AVMutableComposition, loading the .mov file as an AVAsset, adding only the audio track to the AVMutableComposition, and finally using an AVAssetExportSession (setting the output file type to AVFileTypeAIFF, which is the format I need) to write the file to an .aif.
I get an error saying that this output file type is invalid; I'm unsure why:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Invalid output file type'
AVAssetExportSession *exporter;
exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality] ;
exporter.audioMix = audioMix;
exporter.outputURL=[NSURL fileURLWithPath:filePath];
exporter.outputFileType=AVFileTypeAIFF; //Error occurs on this line
I'm not sure if the above approach can work, but I'm open to the possibility that I'm just doing something wrong. However, if anyone knows another way to accomplish what I'm trying to achieve, any help would be greatly appreciated.
I can post more detailed code if needed, but at the moment I'm trying a few other approaches, so it's a bit messy right now.
Thanks for the help!
-(void)getAudioFromVideo {
float startTime = 0;
float endTime = 10;
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *audioPath = [documentsDirectory stringByAppendingPathComponent:@"soundOneNew.m4a"];
AVAsset *myasset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"]];
AVAssetExportSession *exportSession=[AVAssetExportSession exportSessionWithAsset:myasset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL=[NSURL fileURLWithPath:audioPath];
exportSession.outputFileType=AVFileTypeAppleM4A;
CMTime vocalStartMarker = CMTimeMake((int)(floor(startTime * 100)), 100);
CMTime vocalEndMarker = CMTimeMake((int)(ceil(endTime * 100)), 100);
CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(vocalStartMarker, vocalEndMarker);
exportSession.timeRange= exportTimeRange;
if ([[NSFileManager defaultManager] fileExistsAtPath:audioPath]) {
[[NSFileManager defaultManager] removeItemAtPath:audioPath error:nil];
}
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
    else {
        NSLog(@"AudioLocation : %@", audioPath);
    }
}];
}
That's because the outputFileType is wrong. For a .mov file it is often @"com.apple.quicktime-movie", and the extracted audio will be in .mov format. To be sure, you can list the supported output types like this:
NSArray *supportedTypeArray = exportSession.supportedFileTypes;
for (NSString *str in supportedTypeArray) {
    NSLog(@"%@", str);
}
And you can export the audio in an audio format (.m4a, which can be played by AVAudioPlayer) using something like this:
AVAssetExportSession *exportSession=[AVAssetExportSession exportSessionWithAsset:self.mAsset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL=myUrl;
exportSession.outputFileType=AVFileTypeAppleM4A;
exportSession.timeRange=timeRange;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
}];
Just a small piece of info for others (I did not know this): if you have a video file (in my case it was .mp4), you can play just the audio with AVAudioPlayer.
You don't need to extract (or convert) the audio from the video.
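A tiny sketch of that tip (the resource name is just a placeholder):
// Play only the audio of a bundled movie file with AVAudioPlayer.
NSError *error = nil;
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:movieURL error:&error];
[player prepareToPlay];
[player play];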
