iOS: Extracting audio from a .mov file

I've been trying to extract audio from a .mov file for a while now and I just can't seem to get it working. Specifically, I need to extract the audio and save it as an .aif or .aiff file.
I've tried using an AVMutableComposition, loading the .mov file as an AVAsset, adding only the audio track to the AVMutableComposition, and finally using an AVAssetExportSession (setting the output file type to AVFileTypeAIFF, the format I need) to write the file to an .aif.
I get an error saying that the output file type is invalid, and I'm unsure why:
* Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Invalid output file type'
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exporter.audioMix = audioMix;
exporter.outputURL = [NSURL fileURLWithPath:filePath];
exporter.outputFileType = AVFileTypeAIFF; // Error occurs on this line
I'm not sure if the above approach would work, and I'm open to the possibility that I'm just doing something wrong. However, if anyone knows another way to accomplish what I'm trying to achieve, any help would be greatly appreciated.
I can post more detailed code if it is needed, but at the moment I'm trying a few other approaches, so it's a bit messy right now.
Thanks for the help!

- (void)getAudioFromVideo {
    float startTime = 0;
    float endTime = 10;
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *audioPath = [documentsDirectory stringByAppendingPathComponent:@"soundOneNew.m4a"];
    AVAsset *myasset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"]];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:myasset presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = [NSURL fileURLWithPath:audioPath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    CMTime vocalStartMarker = CMTimeMake((int)(floor(startTime * 100)), 100);
    CMTime vocalEndMarker = CMTimeMake((int)(ceil(endTime * 100)), 100);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(vocalStartMarker, vocalEndMarker);
    exportSession.timeRange = exportTimeRange;
    if ([[NSFileManager defaultManager] fileExistsAtPath:audioPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:audioPath error:nil];
    }
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusFailed) {
            NSLog(@"failed");
        }
        else {
            NSLog(@"AudioLocation : %@", audioPath);
        }
    }];
}

The outputFileType is wrong. For a .mov file it is often @"com.apple.quicktime-movie", and the extracted audio will be in .mov format. To be sure, you can list the supported output types like this:
NSArray *supportedTypeArray = exportSession.supportedFileTypes;
for (NSString *str in supportedTypeArray)
    NSLog(@"%@", str);
And you can export the audio in an audio format (.m4a, playable by AVAudioPlayer) like this:
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:self.mAsset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL = myUrl;
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.timeRange = timeRange;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
}];
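If you still need an .aif/.aiff file specifically, a minimal sketch (reusing composition and filePath from the question) is to ask AVFoundation up front whether that combination is possible. The assumption here is that AIFF output generally requires uncompressed PCM audio, which is why the passthrough preset is used:
// Check whether AIFF output is possible before committing to it.
[AVAssetExportSession determineCompatibilityOfExportPreset:AVAssetExportPresetPassthrough
                                                 withAsset:composition
                                            outputFileType:AVFileTypeAIFF
                                         completionHandler:^(BOOL compatible) {
    if (!compatible) {
        // AIFF needs PCM audio; a compressed source would have to be
        // transcoded (e.g. with AVAssetReader/AVAssetWriter) instead.
        NSLog(@"AIFF export not supported for this asset");
        return;
    }
    AVAssetExportSession *aiffExporter =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetPassthrough];
    aiffExporter.outputURL = [NSURL fileURLWithPath:filePath];
    aiffExporter.outputFileType = AVFileTypeAIFF;
    [aiffExporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"AIFF export status: %ld", (long)aiffExporter.status);
    }];
}];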

Just a small note for others (I did not know this):
If you have a video file (in my case it was an .mp4), you can play just its audio with AVAudioPlayer.
You don't need to extract (or convert) the audio from the video.
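A minimal sketch of that idea, assuming a bundled video named VideoName.mp4 (the file name is hypothetical):
// Play only the audio of a bundled video file; no extraction step needed.
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"VideoName" withExtension:@"mp4"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:videoURL error:&error];
if (player != nil) {
    [player play];
} else {
    NSLog(@"Could not create player: %@", error);
}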

iOS Convert AVI to MP4

(Please note I've already looked at this other SO post.)
The Problem
I'm trying to convert an .avi video to an .mp4 so that I can play it natively in an iOS app, using Objective-C.
What I've Tried
I'm trying the following code to do that conversion:
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL outputURL:(NSURL *)outputURL handler:(void (^)(AVAssetExportSession *))handler {
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        handler(exportSession);
    }];
}
The error returned from the exportSession is "Cannot Open".
Extra Information
When I run the video that I'm trying to convert through Mediainfo I get the following for the video:
7 332 kb/s, 1920*1080 (16:9), at 25.000 FPS, AVC (Baseline@L3.1) (CABAC / 1 Ref Frames)
And this for the audio:
128 kb/s, 8 000 Hz, 16 bits, 1 channel, PCM (Little / Signed)
I also used the exportPresetsCompatibleWithAsset: method on AVAssetExportSession and got the following results:
AVAssetExportPreset1920x1080,
AVAssetExportPresetLowQuality,
AVAssetExportPresetAppleM4A,
AVAssetExportPresetHEVCHighestQuality,
AVAssetExportPreset640x480,
AVAssetExportPreset3840x2160,
AVAssetExportPresetHEVC3840x2160,
AVAssetExportPresetHighestQuality,
AVAssetExportPreset1280x720,
AVAssetExportPresetMediumQuality,
AVAssetExportPreset960x540,
AVAssetExportPresetHEVC1920x1080
Another thing to note: while experimenting with the preset and output combinations, I managed to get an audio-only file that was basically white noise. This was using the preset AVAssetExportPresetAppleM4A.
I hope that I've jotted down enough information.
Update
Using the comment by Ashley, I've created a function to return the export settings compatible with the asset.
- (void)determineCompatibleExportForAsset:(AVURLAsset *)asset completion:(void (^)(NSArray<ExportSettings *> *exports))handler {
    NSArray<NSString *> *presets = @[
        AVAssetExportPresetLowQuality,
        AVAssetExportPresetMediumQuality,
        AVAssetExportPresetHighestQuality,
        AVAssetExportPresetHEVCHighestQuality,
        AVAssetExportPreset640x480,
        AVAssetExportPreset960x540,
        AVAssetExportPreset1280x720,
        AVAssetExportPreset1920x1080,
        AVAssetExportPreset3840x2160,
        AVAssetExportPresetHEVC1920x1080,
        AVAssetExportPresetHEVC3840x2160,
        AVAssetExportPresetAppleM4A,
        AVAssetExportPresetPassthrough
    ];
    NSArray<NSString *> *outputs = @[
        AVFileTypeQuickTimeMovie,
        AVFileTypeMPEG4,
        AVFileTypeAppleM4V,
        AVFileTypeAppleM4A,
        AVFileType3GPP,
        AVFileType3GPP2,
        AVFileTypeCoreAudioFormat,
        AVFileTypeWAVE,
        AVFileTypeAIFF,
        AVFileTypeAIFC,
        AVFileTypeAMR,
        AVFileTypeMPEGLayer3,
        AVFileTypeSunAU,
        AVFileTypeAC3,
        AVFileTypeEnhancedAC3,
        AVFileTypeJPEG,
        AVFileTypeDNG,
        AVFileTypeHEIC,
        AVFileTypeAVCI,
        AVFileTypeHEIF,
        AVFileTypeTIFF
    ];
    __block int counter = 0;
    int totalCount = (int)presets.count * (int)outputs.count;
    NSMutableArray<ExportSettings *> *exportSettingsArray = [@[] mutableCopy];
    for (NSString *preset in presets) {
        for (NSString *output in outputs) {
            [AVAssetExportSession determineCompatibilityOfExportPreset:preset withAsset:asset outputFileType:output completionHandler:^(BOOL compatible) {
                if (compatible) {
                    ExportSettings *exportSettings = [[ExportSettings alloc] initWithPreset:preset outputType:output];
                    [exportSettingsArray addObject:exportSettings];
                }
                counter++;
                if (counter == totalCount) {
                    if (handler) {
                        handler([exportSettingsArray copy]);
                    }
                }
            }];
        }
    }
}
The results of this are as follows:
"Preset: AVAssetExportPresetAppleM4A Output: com.apple.m4a-audio",
"Preset: AVAssetExportPresetPassthrough Output: com.microsoft.waveform-audio",
"Preset: AVAssetExportPresetPassthrough Output: public.aifc-audio",
"Preset: AVAssetExportPresetPassthrough Output: public.aiff-audio",
"Preset: AVAssetExportPresetPassthrough Output: com.apple.coreaudio-format",
"Preset: AVAssetExportPresetPassthrough Output: com.apple.quicktime-movie"
From this I deduced that the preset AVAssetExportPresetPassthrough with output type AVFileTypeQuickTimeMovie would be compatible.
However, when running the following code (I've tried .mp4, .mov and .qt for the file extension):
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *filePath = [documentsDirectory stringByAppendingPathComponent:@"MyVideo.mov"];
NSURL *outputURL = [NSURL fileURLWithPath:filePath];
NSURL *localURL = [NSBundle URLForResource:@"20180626_145233-v" withExtension:@"avi" subdirectory:nil inBundleWithURL:[NSBundle mainBundle].bundleURL];
[self convertVideoToLowQuailtyWithInputURL:localURL outputURL:outputURL handler:^(AVAssetExportSession *exportSession) {
    switch ([exportSession status]) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [exportSession error]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"Successfully");
            NSLog(@"OutputURL: %@", outputURL.absoluteString);
            break;
        default:
            break;
    }
}];
Which calls:
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL outputURL:(NSURL *)outputURL handler:(void (^)(AVAssetExportSession *))handler {
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        handler(exportSession);
    }];
}
I get this error:
Export failed: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12842), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x60400024def0 {Error Domain=NSOSStatusErrorDomain Code=-12842 "(null)"}}
The reason you cannot open the file is that iOS does not support AVI files.
Here is the link to the Apple documentation on supported file types within AVFoundation.
You can also inspect the definition of AVFileType from within Xcode to see the list for yourself.
For what it is worth, a cursory Google search on AVI indicates that there are limitations in the AVI container that have been rectified in newer formats supported by iOS, so there is likely little incentive to add support at a later date.
Another potential issue is that Microsoft created AVI; I cannot locate any restrictive licensing that would prohibit someone from using it, but IANAL.
You should use the FFmpeg library for this kind of task.
Here is an open-source video player based on FFmpeg: https://github.com/kolyvan/kxmovie
I don't know whether this player still works on the latest iOS, but in any case you can discover many interesting things in its source code that can help with your problem (such as how to use the FFmpeg library).
Very easy to do with MobileFFMpeg
https://stackoverflow.com/a/59325680/1466453
Once you have MobileFFMpeg, make a call as follows:
[MobileFFMpeg execute:@"-i infile.avi youroutput.mp4"];

AVExportSession with a streaming AVAsset -11800 Error

I have been using AVPlayer to play video whose underlying asset is backed by a streaming URL.
I am trying to create a snip of the video (just a 9-second clip) in order to save the clip locally.
I tried AVAssetExportSession with the following code, but the session returns AVAssetExportSessionStatusFailed every time. The specific error is Error Domain=AVFoundationErrorDomain Code=-11800, which means an unknown error.
AVURLAsset *otherAsset = [[AVURLAsset alloc] initWithURL:streamURL options:options];
NSArray *exportPresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:otherAsset];
NSLog(@"%@", exportPresets.description);
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:otherAsset presetName:AVAssetExportPresetMediumQuality];
NSArray *supportedFileTypes = session.supportedFileTypes;
NSLog(@"%@", supportedFileTypes.description);
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *dstPath = [documentsDirectory stringByAppendingString:@"/sample.mov"];
NSURL *saveUrl = [NSURL fileURLWithPath:dstPath];
session.outputFileType = @"com.apple.quicktime-movie";
session.outputURL = saveUrl;
session.shouldOptimizeForNetworkUse = YES;
[session exportAsynchronouslyWithCompletionHandler:^(void) {
    switch ([session status]) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[session error] localizedDescription]);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export canceled");
            break;
        default:
            NSLog(@"Export Success, File Saved.");
            break;
    }
}];
I've checked to make sure that I am not overwriting a file, and checked other causes of AVAssetExportSessionStatusFailed.
I am guessing that AVAssetExportSession is not meant to be used with streaming assets. (I could be wrong).
Is my guess right, or is there something else I need to do? Is there an easier way to snip and cache a streaming URL?
Thanks for your time!
As far as I know, there is no way to export anything other than local files under iOS 7.
It seems to work fine under iOS 8, though.
I suggest you just use NSData's writeToFile:atomically: if you don't have to set an export time range.
Hope this helps!
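A minimal sketch of that suggestion, assuming a hypothetical remote URL (note that dataWithContentsOfURL: blocks the calling thread, so in practice you would dispatch this to a background queue):
// Download the remote media wholesale and write it to Documents.
// The URL and file name here are hypothetical placeholders.
NSURL *remoteURL = [NSURL URLWithString:@"https://example.com/clip.mov"];
NSData *mediaData = [NSData dataWithContentsOfURL:remoteURL];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *dstPath = [documentsDirectory stringByAppendingPathComponent:@"sample.mov"];
BOOL saved = [mediaData writeToFile:dstPath atomically:YES];
NSLog(@"Saved: %@", saved ? @"YES" : @"NO");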

AVMutableCompositionTrack - Using insertEmptyTimeRange to insert silence between two WAV files

The problem I am having is getting a variable amount of silence placed in between two .wav files.
My approach thus far is as follows:
Firstly I create an AVMutableComposition and an AVMutableCompositionTrack
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *appendedAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
Then using AVURLAsset I allocate my conveniently named first.wav file.
AVURLAsset *firstComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"first" ofType:@"wav"]] options:nil];
I then insert this into the mutable composition track I named appendedAudioTrack earlier
NSArray *firstTrack = [firstComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, firstComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange
                            ofTrack:[firstTrack objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&error];
The second.wav file is inserted in the exact same way:
AVURLAsset *secondComponent = [[AVURLAsset alloc]
    initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"second" ofType:@"wav"]] options:nil];
NSArray *secondTrack = [secondComponent tracksWithMediaType:AVMediaTypeAudio];
CMTimeRange timeRange2 = CMTimeRangeMake(kCMTimeZero, secondComponent.duration);
[appendedAudioTrack insertTimeRange:timeRange2
                            ofTrack:[secondTrack objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&error];
This so far successfully joins the two wav files end to end.
I then try to insert a variable amount of silence using insertEmptyTimeRange like this:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration, CMTimeAdd(firstComponent.duration, CMTimeMake(interval, 1)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
The silence duration is a float and for testing purposes is currently 0.49. Its variable name is interval and it represents the desired silence in seconds.
An assumption I've made using CMTimeRange is that AVURLAsset's duration property can be treated as the finishing CMTime for the first audio track.
When I download the Documents directory from the Organizer in Xcode and look at the resulting .m4a file in Audacity, the silence is in the file, but it's at the start, not between the two .wav files as desired.
Incorrectly, it goes: SILENCE, first.wav, second.wav
I would like to know how to properly use insertEmptyTimeRange to produce
first.wav, SILENCE, second.wav.
Note: I have seen this question (and others) presenting a very similar problem; however, the approach they went with was to use a constant silence file. My silence is variable and determined at run time. I am also aware that another answer provides what they say is a solution, but it has not worked for me. I have tried all the different methods I found on the internet, but it seems I'm misunderstanding something, as it never works correctly for me.
Just in case it matters, I export the file like so:
// Create a new audio file using the appendedAudioTrack
AVAssetExportSession *exportSession = [AVAssetExportSession
    exportSessionWithAsset:composition
                presetName:AVAssetExportPresetAppleM4A];
if (!exportSession)
{
    // do something
    return nil;
}
// This gives me an output URL to a file name that doesn't yet exist
NSString *path2;
NSArray *paths2 = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
path2 = [paths2 objectAtIndex:0];
path2 = [path2 stringByAppendingPathComponent:[string stringByAppendingString:@".m4a"]];
NSString *appendedAudioPath = path2;
exportSession.outputURL = [NSURL fileURLWithPath:appendedAudioPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"%@", exportSession.error);
            break;
        case AVAssetExportSessionStatusCompleted:
            break;
        case AVAssetExportSessionStatusWaiting:
            break;
        default:
            break;
    }
}];
Thanks
It turns out the interval was not of the correct duration: when checked in Audacity it was actually 0.049, not 0.49.
The problem I was having was with this:
CMTimeMake(interval, 1)
I thought this would yield interval/1 seconds, but I was wrong; CMTimeMake takes an integer value, so a float interval is not handled the way I expected.
Instead I used CMTimeMakeWithSeconds along with NSEC_PER_SEC, like so:
CMTimeRange timeRange3 = CMTimeRangeFromTimeToTime(firstComponent.duration, CMTimeAdd(firstComponent.duration, CMTimeMakeWithSeconds(interval, NSEC_PER_SEC)));
[appendedAudioTrack insertEmptyTimeRange:timeRange3];
This then provided me with the desired silence of interval seconds long, and in the right place.

iOS: How to trim silence from start and end of .aif audio recording?

My app includes the ability for the user to record a brief message; I'd like to trim off any silence (or, to be more precise, any audio whose volume falls below a given threshold) from the beginning and end of the recording.
I'm recording the audio with an AVAudioRecorder, and saving it to an .aif file. I've seen some mention elsewhere of methods by which I could have it wait to start recording until the audio level reaches a threshold; that'd get me halfway there, but won't help with trimming silence off the end.
If there's a simple way to do this, I'll be eternally grateful!
Thanks.
This project takes audio from the microphone, triggers on loud noise and untriggers when quiet. It also trims and fades in/fades out around the ends.
https://github.com/fulldecent/FDSoundActivatedRecorder
Relevant code you are seeking:
- (NSString *)recordedFilePath
{
    // Prepare output
    NSString *trimmedAudioFileBaseName = [NSString stringWithFormat:@"recordingConverted%x.caf", arc4random()];
    NSString *trimmedAudioFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:trimmedAudioFileBaseName];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:trimmedAudioFilePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:trimmedAudioFilePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error:%@", trimmedAudioFilePath, error);
        }
    }
    NSLog(@"Saving to %@", trimmedAudioFilePath);
    AVAsset *avAsset = [AVAsset assetWithURL:self.audioRecorder.url];
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *track = [tracks objectAtIndex:0];
    AVAssetExportSession *exportSession = [AVAssetExportSession
        exportSessionWithAsset:avAsset
                    presetName:AVAssetExportPresetAppleM4A];
    // create trim time range
    CMTime startTime = CMTimeMake(self.recordingBeginTime * SAVING_SAMPLES_PER_SECOND, SAVING_SAMPLES_PER_SECOND);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, kCMTimePositiveInfinity);
    // create fade-in time range
    CMTime startFadeInTime = startTime;
    CMTime endFadeInTime = CMTimeMake(self.recordingBeginTime * SAVING_SAMPLES_PER_SECOND + RISE_TRIGGER_INTERVALS * INTERVAL_SECONDS * SAVING_SAMPLES_PER_SECOND, SAVING_SAMPLES_PER_SECOND);
    CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);
    // set up audio mix
    AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                                                      timeRange:fadeInTimeRange];
    exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:trimmedAudioFilePath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    exportSession.timeRange = exportTimeRange;
    exportSession.audioMix = exportAudioMix;
    // MAKE THE EXPORT SYNCHRONOUS
    dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_semaphore_signal(semaphore);
    }];
    dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusCompleted");
        return trimmedAudioFilePath;
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        // a failure may happen because of an event out of your control
        // for example, an interruption like a phone call coming in
        // make sure to handle this case appropriately
        NSLog(@"AVAssetExportSessionStatusFailed %@", exportSession.error.localizedDescription);
    } else {
        NSLog(@"Export Session Status: %ld", (long)exportSession.status);
    }
    return nil;
}
I'm recording the audio with an AVAudioRecorder, and saving it to an .aif file. I've seen some mention elsewhere of methods by which I could have it wait to start recording until the audio level reaches a threshold; that'd get me halfway there
Without adequate buffering, that would truncate the start.
I don't know of an easy way. You would have to write a new audio file after recording and analyzing it for the desired start and end points. Modifying the existing file would be straightforward if you knew the AIFF format well (not many people do) and had an easy way to read the file's sample data.
The analysis stage is pretty easy for a basic implementation: evaluate the average power of the sample data until your threshold is exceeded, then repeat in reverse for the end.
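To make that concrete, here is a minimal sketch of the threshold scan, under stated assumptions: the recording has already been decoded to 16-bit signed PCM in memory (e.g. via AVAssetReader), and the window size and threshold are illustrative values, not ones taken from this answer.
// Return the index of the first sample window whose RMS level exceeds
// the threshold; running the same scan backwards finds the end point.
static NSUInteger firstLoudSampleIndex(const int16_t *samples,
                                       NSUInteger sampleCount,
                                       double threshold) {
    const NSUInteger window = 1024; // ~23 ms at 44.1 kHz
    for (NSUInteger start = 0; start + window <= sampleCount; start += window) {
        double sumOfSquares = 0;
        for (NSUInteger i = 0; i < window; i++) {
            double s = samples[start + i] / 32768.0; // normalize to [-1, 1]
            sumOfSquares += s * s;
        }
        double rms = sqrt(sumOfSquares / window); // average power of this window
        if (rms > threshold) {
            return start; // first window above the threshold
        }
    }
    return sampleCount; // nothing exceeded the threshold
}
Once you have the start and end indices, you can convert them to CMTimes and hand them to exportSession.timeRange, as in the code above.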

iOS video to audio file conversion [duplicate]

This question already has an answer here:
iPhone - Separating audio from a video file and saving it to a separate file
(1 answer)
Closed 9 years ago.
I managed to download a YouTube video using NSURLConnection and save it to the device. Now I want to convert this (I guess .mp4) file to an .mp3 audio file. Does anyone know a solution to this problem? Maybe there's a way to download only the audio from the video? That would save a lot of time.
First of all, you don't want to convert anything; that's slow. Instead, you want to extract the audio stream from the .mp4 file. You can do this by creating an AVMutableComposition containing only the audio track of the original file and then exporting the composition with an AVAssetExportSession. The code below is m4a-centric. If you want to handle both m4a and mp3 output, check the audio track type, make sure to set the right file extension, and choose between AVFileTypeMPEGLayer3 and AVFileTypeAppleM4A in the export session.
NSURL *dstURL = [NSURL fileURLWithPath:dstPath];
[[NSFileManager defaultManager] removeItemAtURL:dstURL error:nil];
AVMutableComposition *newAudioAsset = [AVMutableComposition composition];
AVMutableCompositionTrack *dstCompositionTrack;
dstCompositionTrack = [newAudioAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset *srcAsset = [AVURLAsset URLAssetWithURL:srcURL options:nil];
AVAssetTrack *srcTrack = [[srcAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTimeRange timeRange = srcTrack.timeRange;
NSError *error;
if (NO == [dstCompositionTrack insertTimeRange:timeRange ofTrack:srcTrack atTime:kCMTimeZero error:&error]) {
    NSLog(@"track insert failed: %@\n", error);
    return;
}
AVAssetExportSession *exportSesh = [[AVAssetExportSession alloc] initWithAsset:newAudioAsset presetName:AVAssetExportPresetPassthrough];
exportSesh.outputFileType = AVFileTypeAppleM4A;
exportSesh.outputURL = dstURL;
[exportSesh exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = exportSesh.status;
    NSLog(@"exportAsynchronouslyWithCompletionHandler: %ld\n", (long)status);
    if (AVAssetExportSessionStatusFailed == status) {
        NSLog(@"FAILURE: %@\n", exportSesh.error);
    } else if (AVAssetExportSessionStatusCompleted == status) {
        NSLog(@"SUCCESS!\n");
    }
}];
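Since this uses the passthrough preset, the output container has to match the source codec. A hedged sketch of the format check mentioned above, reusing the variable names from the snippet (run this before calling exportAsynchronouslyWithCompletionHandler:; whether your sources are MP3 or AAC is an assumption you'd verify for your own files):
// Inspect the source track's codec to pick a matching output type.
CMFormatDescriptionRef desc =
    (__bridge CMFormatDescriptionRef)srcTrack.formatDescriptions.firstObject;
FourCharCode codec = CMFormatDescriptionGetMediaSubType(desc);
if (codec == kAudioFormatMPEGLayer3) {
    exportSesh.outputFileType = AVFileTypeMPEGLayer3; // write .mp3
} else {
    exportSesh.outputFileType = AVFileTypeAppleM4A;   // write .m4a
}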
