Trying to split a video with `AVAssetExportSession` - iOS

I am trying to split a video into 4-second chunks with AVAssetExportSession. The initial split works and returns an 8 MB, 4-second chunk, but the second returns 12 MB, which seems wrong when the original video is only 18 MB.
- (void)splitVideo {
    AVURLAsset *vidAsset = [AVURLAsset URLAssetWithURL:output options:nil];
    CMTime duration = vidAsset.duration;
    NSLog(@"File size is : %.2f MB And Duration: %f", (float)[NSData dataWithContentsOfURL:output].length / 1024.0f / 1024.0f, CMTimeGetSeconds(duration));

    splitArray = [[NSMutableArray alloc] init];
    CMTime end = CMTimeMake(4, 1);
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, end);
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output0.mp4"];
    totalSeconds = 4.0f;
    [self cutVideo:output withRange:range withOutput:outputPath];
}
- (void)cutVideo:(NSURL *)url withRange:(CMTimeRange)range withOutput:(NSString *)path {
    AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
    if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        NSURL *finalUrl = [NSURL fileURLWithPath:path];
        [[NSFileManager defaultManager] removeItemAtURL:finalUrl error:NULL];

        exportSession.outputURL = finalUrl;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.timeRange = range;
        NSLog(@"start: %f end: %f", CMTimeGetSeconds(range.start), CMTimeGetSeconds(range.duration));

        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
            });
            if ([exportSession status] == AVAssetExportSessionStatusCompleted) {
                NSData *videoData = [[NSData alloc] initWithContentsOfURL:exportSession.outputURL];
                NSLog(@"DL: %f", (float)videoData.length / 1024.0f / 1024.0f);
                [self makeFile:finalUrl];

                AVURLAsset *fullVid = [AVURLAsset URLAssetWithURL:output options:nil];
                CMTime start = CMTimeMake(totalSeconds, 1);
                totalSeconds = totalSeconds + 4.0f;
                CMTime end;
                if ((CMTimeGetSeconds(start) + 4) > CMTimeGetSeconds(fullVid.duration)) {
                    end = fullVid.duration;
                } else {
                    end = CMTimeMake(CMTimeGetSeconds(start) + 4, 1);
                }
                CMTimeRange range2 = CMTimeRangeMake(start, end);
                NSLog(@"%f < %f\n\n", CMTimeGetSeconds(start), CMTimeGetSeconds(fullVid.duration));

                if (CMTimeGetSeconds(start) < CMTimeGetSeconds(fullVid.duration)) {
                    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"output%lu.mp4", splitArray.count]];
                    [self cutVideo:output withRange:range2 withOutput:outputPath];
                } else {
                    [self saveVideo:true];
                }
            } else if ([exportSession status] == AVAssetExportSessionStatusFailed) {
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            } else if ([exportSession status] == AVAssetExportSessionStatusCancelled) {
                NSLog(@"Export canceled");
            }
        }];
    }
}
File size is : 18.86 MB And Duration: 9.171667
first
start: 0.000000 end: 4.000000
DL: 8.194733
4.000000 < 9.171667
second
start: 4.000000 end: 8.000000
DL: 12.784523

It's not incorrect: video encoders store changes from the previous frame, not just a set of "images". Your video probably has more changes in the second chunk, which is why that chunk takes more space.
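If you need chunks of roughly equal size rather than equal duration, AVAssetExportSession can cap the output size. A minimal sketch, reusing the asset, finalUrl, and range variables from the question's cutVideo: method; the 8 MB cap is an illustrative value, not something from the question:

// Sketch only: cap the size of each exported chunk.
// Note that a size cap may force the encoder to lower quality for busy segments.
AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = finalUrl;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.timeRange = range;
exportSession.fileLengthLimit = 8 * 1024 * 1024; // ~8 MB per chunk (illustrative)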

Related

Mute Video + Transform Video Using single export operation of AVAssetExportSession

I have the following code to fix the transform of a video:
- (AVVideoComposition *)squareVideoCompositionFor:(AVAsset *)asset {
    AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    CGFloat length = MAX(track.naturalSize.width, track.naturalSize.height);
    CGSize size = track.naturalSize;
    CGFloat scale = 0;

    CGAffineTransform transform = track.preferredTransform;
    if (transform.a == 0 && transform.b == 1 && transform.c == -1 && transform.d == 0) {
        scale = -1;
    }
    else if (transform.a == 0 && transform.b == -1 && transform.c == 1 && transform.d == 0) {
        scale = -1;
    }
    else if (transform.a == 1 && transform.b == 0 && transform.c == 0 && transform.d == 1) {
        scale = 1;
    }
    else if (transform.a == -1 && transform.b == 0 && transform.c == 0 && transform.d == -1) {
        scale = -1;
    }
    transform = CGAffineTransformTranslate(transform, scale * -(size.width - length) / 2, scale * -(size.height - length) / 2);

    AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
    [transformer setTransform:transform atTime:kCMTimeZero];

    // CGAffineTransform finalTransform = t2;
    // [transformer setTransform:finalTransform atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, kCMTimePositiveInfinity);
    instruction.layerInstructions = @[transformer];

    AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
    composition.frameDuration = CMTimeMake(1, 30);
    composition.renderSize = CGSizeMake(length, length);
    composition.instructions = @[instruction];
    composition.renderScale = 1.0;
    return composition;
}
And the following code to mute the audio:
- (AVMutableComposition *)removeAudioFromVideoFileFor:(AVAsset *)asset {
    AVMutableComposition *composition_Mix = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition_Mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    BOOL ok = NO;
    AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    NSError *error;
    ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error];
    return composition_Mix;
}
Here is how I call the methods:
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVMutableComposition *composition = [self removeAudioFromVideoFileFor:asset];
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
session.videoComposition = [self squareVideoCompositionFor:asset];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = true;
session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
But it throws an error if I use both the composition and [self squareVideoCompositionFor:asset]:
Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I omit one of them it works fine, meaning a single AVAssetExportSession can either mute the audio or make the video square, but not both.
Is there a way I can achieve both in a single export pass of AVAssetExportSession?
Your code looks good, but I have made a few changes to get it working.
The inputURL and outputURL must be proper URLs, prefixed with either file:// or https:// (in your case file://, since they are local files).
If a URL is not valid you will not get the desired output.
// FOR OUTPUT URL
NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];

// The output video will be written to the file final.mp4
NSURL *outputURL = [NSURL fileURLWithPath:path];
outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
NSLog(@"outputURL = %@", outputURL);
// FOR INPUT URL
// This is the path of the bundle resource that is going to be used
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
NSLog(@"inputURL = %@", inputURL);
Export the composition
// This will export the composition with the specified configuration
[session exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Success");
}];
When you see the "Success" log in the console, check the document directory of your application; the video will be written at outputURL.
NOTE: Use Cmd + Shift + G in Finder and paste the outputURL to be taken to the document folder of your app (simulator only). For a device, you need to download the app container and inspect the package contents.
Complete code
The removeAudioFromVideoFileFor: and squareVideoCompositionFor: methods look good; you just need to change the following.
Here "video" is the name of the resource file in the app bundle.
- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
    NSURL *outputURL = [NSURL fileURLWithPath:path];
    outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
    NSLog(@"outputURL = %@", outputURL);

    NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
    NSLog(@"inputURL = %@", inputURL);

    AVAsset *asset = [AVAsset assetWithURL:inputURL];
    AVMutableComposition *composition = [self removeAudioFromVideoFileFor:asset];

    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    session.videoComposition = [self squareVideoCompositionFor:asset];
    session.outputURL = outputURL;
    session.outputFileType = AVFileTypeMPEG4;
    session.shouldOptimizeForNetworkUse = true;
    session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    [session exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Success:");
    }];
}
Hope it helps.

Videos shot from android phones get ruined after editing with AVFoundation iOS

I am working on an app that requires editing videos (adding overlays). Videos shot on iPhones are edited fine, but the ones shot on Android phones come out blank after editing.
I can't figure out what the problem could be and would appreciate any help.
This is one of the methods (trim functionality):
- (IBAction)cutButtonTapped:(id)sender {
    hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    hud.mode = MBProgressHUDModeText;
    hud.labelText = @"Encoding...";
    [self.playButton setBackgroundImage:[UIImage imageNamed:@"video_pause.png"] forState:UIControlStateNormal];

    NSString *uniqueString = [[NSProcessInfo processInfo] globallyUniqueString];

    // do this to export video
    NSURL *videoFileUrl = [NSURL fileURLWithPath:[AppHelper userDefaultsForKey:@"videoURL"]];
    AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
    if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
        self.exportSession_ = [[AVAssetExportSession alloc] initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
        // Implementation continues.
        // NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:@".mov"]];
        NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
        self.exportSession_.outputURL = furl;
        self.exportSession_.outputFileType = AVFileTypeMPEG4;

        CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
        CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
        CMTimeRange range = CMTimeRangeMake(start, duration);
        CMTimeShow(self.exportSession_.timeRange.duration);
        self.exportSession_.timeRange = range;
        CMTimeShow(self.exportSession_.timeRange.duration);

        [self.exportSession_ exportAsynchronouslyWithCompletionHandler:^{
            switch ([self.exportSession_ status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [[self.exportSession_ error] localizedDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                default:
                    NSLog(@"NONE");
                    dispatch_async(dispatch_get_main_queue(), ^{
                        // [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:@".mov"]];
                        [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
                    });
            }
        }];
    }
}
Could anyone please help me with this?
First of all, I recommend checking the duration and range values; it looks like an issue with CMTime and decoding.
Second, try initialising your AVURLAsset with an option that forces precise duration extraction:
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
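Building on that suggestion, here is a minimal sketch (assuming the same videoFileUrl as in the question) that loads the precise duration asynchronously before configuring the export session's time range:

AVURLAsset *preciseAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl
                                                   options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
[preciseAsset loadValuesAsynchronouslyForKeys:@[@"duration", @"tracks"] completionHandler:^{
    NSError *loadError = nil;
    if ([preciseAsset statusOfValueForKey:@"duration" error:&loadError] == AVKeyValueStatusLoaded) {
        NSLog(@"Precise duration: %f", CMTimeGetSeconds(preciseAsset.duration));
        // Configure the AVAssetExportSession time range here, now that the
        // duration is known to be accurate.
    } else {
        NSLog(@"Could not load duration: %@", loadError);
    }
}];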

Fade In, Fade Out effect in audio By using AVAssetExportSession in ios

I have trimmed an audio file to a particular duration using AVAssetExportSession, and I do get the trimmed audio.
My problem is that I also want to add a fade-in and fade-out effect to the audio.
How can I solve this?
Any help will be appreciated.
The code for trimming the audio is here:
- (void)trimAudio:(NSString *)inputAudioPath audioStartTime:(float)sTime audioEndTime:(float)eTime outputPath:(NSString *)outputFilePath mode:(NSInteger)kSelectionMode
{
    @try
    {
        AVAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputAudioPath] options:nil];

        // Create the session with assets
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];

        // Set the output url
        exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];

        // Trim to a particular duration
        CMTime startTime = CMTimeMake((int)(floor(sTime * 100)), 100);
        CMTime stopTime = CMTimeMake((int)(ceil(eTime * 100)), 100);
        CMTimeRange range = CMTimeRangeFromTimeToTime(startTime, stopTime);
        exportSession.timeRange = range;
        exportSession.outputFileType = AVFileTypeAppleM4A;

        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch (exportSession.status) {
                case AVAssetExportSessionStatusCompleted: {
                    NSLog(@"Export Complete");
                    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(exportSession.outputURL.path)) {
                        UISaveVideoAtPathToSavedPhotosAlbum(exportSession.outputURL.path, nil, nil, nil);
                        if ([self.delegate respondsToSelector:@selector(trimDidSucceed:mode:)]) {
                            [self.delegate trimDidSucceed:outputFilePath mode:kTrimAudio];
                        }
                        else {
                            [self.delegate trimDidFail:exportSession.error];
                        }
                    }
                    break;
                }
                case AVAssetExportSessionStatusFailed: {
                    NSLog(@"Export Error: %@", [exportSession.error description]);
                    break;
                }
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    break;
                default:
                    break;
            }
        }];
        exportSession = nil;
    }
    @catch (NSException *e)
    {
        NSLog(@"Exception Name:%@ Reason:%@", [e name], [e reason]);
    }
}
// Fade in / fade out
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:asset.tracks.lastObject];

int start = 2, length = 3;
[exportAudioMixInputParameters setVolume:0.0 atTime:CMTimeMakeWithSeconds(start - 1, 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds(start, 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds(start + 1, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds(start + 2, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds((start + length - 2), 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds((start + length - 1), 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds((start + length), 1)];

exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
exportSession.audioMix = exportAudioMix; // fade in audio mix
Just add these few lines before the call to [exportSession exportAsynchronouslyWithCompletionHandler:^{ ... }];
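Note that AVMutableAudioMixInputParameters also supports explicit volume ramps, which give a smoother fade than stepped setVolume: calls. A minimal sketch, assuming the asset and exportSession from the trim method above and an illustrative 2-second fade:

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:
        [asset tracksWithMediaType:AVMediaTypeAudio].firstObject];

CMTimeRange exportRange = exportSession.timeRange;  // the trimmed range, in the asset's timeline
CMTime fade = CMTimeMakeWithSeconds(2, 600);

// Fade in over the first 2 seconds of the trimmed range.
[params setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                           timeRange:CMTimeRangeMake(exportRange.start, fade)];
// Fade out over the last 2 seconds of the trimmed range.
[params setVolumeRampFromStartVolume:1.0 toEndVolume:0.0
                           timeRange:CMTimeRangeMake(CMTimeSubtract(CMTimeRangeGetEnd(exportRange), fade), fade)];

audioMix.inputParameters = @[params];
exportSession.audioMix = audioMix;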
Alternatively, you can adjust the volume while the audio is playing, driven by an NSTimer: when playback reaches a particular second, raise the volume, and when it nears the end, lower it back to the initial level.
Check this link for reference.
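A minimal sketch of that playback-time approach, assuming a hypothetical AVAudioPlayer instance named player (not part of the question's code); the exported file is untouched and only the live playback volume changes:

// Ramp the playback volume up during the first 2 seconds and down during the last 2.
// (The block-based NSTimer API requires iOS 10+.)
[NSTimer scheduledTimerWithTimeInterval:0.1 repeats:YES block:^(NSTimer *timer) {
    float step = 0.05f;
    if (player.currentTime < 2.0) {
        player.volume = MIN(1.0f, player.volume + step);          // fade in
    } else if (player.duration - player.currentTime < 2.0) {
        player.volume = MAX(0.0f, player.volume - step);          // fade out
    }
    if (!player.isPlaying) {
        [timer invalidate];
    }
}];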

AVAssetExportSession exportAsynchronouslyWithCompletionHandler not work sometimes

I want to export and compress a video from the iPod library, but exportAsynchronouslyWithCompletionHandler works correctly for some videos and not for others; it simply does not run and no exception is thrown.
Strangely, if I comment out the call to setVideoComposition:, exportAsynchronouslyWithCompletionHandler works normally.
Here is my code:
AVAsset *_videoAsset = [AVAsset assetWithURL:[NSURL URLWithString:filmElementModel.alassetUrl]];
CMTime assetTime = [_videoAsset duration];
AVAssetTrack *avAssetTrack = [[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
Float64 duration = CMTimeGetSeconds(assetTime);

AVMutableComposition *avMutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *avMutableCompositionTrack = [avMutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
[avMutableCompositionTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(0.0f, 30), CMTimeMakeWithSeconds(duration > 8.0f ? 8.0f : duration, 30))
                                   ofTrack:avAssetTrack
                                    atTime:kCMTimeZero
                                     error:&error];

AVMutableVideoComposition *avMutableVideoComposition = [AVMutableVideoComposition videoComposition];
avMutableVideoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionLayerInstruction *layerInstruciton = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:avMutableComposition.tracks[0]];
[layerInstruciton setTransform:[[[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform] atTime:kCMTimeZero];
[layerInstruciton setOpacity:0.0f atTime:[_videoAsset duration]];

AVMutableVideoCompositionInstruction *avMutableVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[avMutableVideoCompositionInstruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [avMutableComposition duration])];
avMutableVideoCompositionInstruction.layerInstructions = [NSArray arrayWithObject:layerInstruciton];

if (avAssetTrack.preferredTransform.a) {
    NSLog(@"landscape");
    avMutableVideoComposition.renderSize = CGSizeMake(avAssetTrack.naturalSize.width, avAssetTrack.naturalSize.height);
} else {
    avMutableVideoComposition.renderSize = CGSizeMake(avAssetTrack.naturalSize.height, avAssetTrack.naturalSize.width);
}
avMutableVideoComposition.instructions = [NSArray arrayWithObject:avMutableVideoCompositionInstruction];

// The url to save the video to
NSString *outUrlString = ITTPathForBabyShotResource([NSString stringWithFormat:@"%@/%@.mp4", DATA_ENV.userModel.userId, filmElementModel.filmElementId]);
NSFileManager *fm = [[NSFileManager alloc] init];
if ([fm fileExistsAtPath:outUrlString]) {
    NSLog(@"video already exists, deleting it");
    if ([fm removeItemAtPath:outUrlString error:&error]) {
        NSLog(@"delete is ok");
    } else {
        NSLog(@"delete failed, error = %@", error.description);
    }
}

CGSize renderSize = CGSizeMake(1280, 720);
if (MIN(avAssetTrack.naturalSize.width, avAssetTrack.naturalSize.height) < 720) {
    renderSize = avAssetTrack.naturalSize;
}
long long fileLimite = renderSize.width * renderSize.height * (duration > 8.0f ? 8.0f : duration) / 2;

_avAssetExportSession = [[AVAssetExportSession alloc] initWithAsset:avMutableComposition presetName:AVAssetExportPreset1280x720];
[_avAssetExportSession setVideoComposition:avMutableVideoComposition];
[_avAssetExportSession setOutputURL:[NSURL fileURLWithPath:outUrlString]];
[_avAssetExportSession setOutputFileType:AVFileTypeQuickTimeMovie];
[_avAssetExportSession setFileLengthLimit:fileLimite];
[_avAssetExportSession setShouldOptimizeForNetworkUse:YES];
[_avAssetExportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (_avAssetExportSession.status) {
        case AVAssetExportSessionStatusFailed:
        {
        }
            break;
        case AVAssetExportSessionStatusCompleted:
        {
        }
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
        case AVAssetExportSessionStatusExporting:
            NSLog(@"AVAssetExportSessionStatusExporting");
            break;
        case AVAssetExportSessionStatusWaiting:
            NSLog(@"AVAssetExportSessionStatusWaiting");
            break;
    }
}];
if (_avAssetExportSession.status != AVAssetExportSessionStatusCompleted) {
    NSLog(@"Retry export");
}
I solved this problem!
[avMutableCompositionTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(0.0f, 30), CMTimeMakeWithSeconds(duration > 8.0f ? 8.0f : duration, 30))
                                   ofTrack:avAssetTrack
                                    atTime:kCMTimeZero
                                     error:&error];
I replaced CMTimeMakeWithSeconds(0.0f, 30) with CMTimeMakeWithSeconds(0.1f, 30) in this call.
But I don't know why this makes it work properly.
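For clarity, the modified call looks like this; only the start time differs from the code in the question:

[avMutableCompositionTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(0.1f, 30), CMTimeMakeWithSeconds(duration > 8.0f ? 8.0f : duration, 30))
                                   ofTrack:avAssetTrack
                                    atTime:kCMTimeZero
                                     error:&error];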

AVAssetExportSession fails if start time is not 0

I am trying to export an audio file from the iPod library. If the start value is 0.0 the export works just fine, but if the start value is greater than 0.0 the resulting audio file is empty: the status is completed, yet the file size is 0 KB. Can someone please give me a solution for this, or is this another Apple bug?
float vocalStartMarker = startValue;
float vocalEndMarker = endValue;

NSURL *audioFileInput = url;
NSString *voicePath = [path stringByAppendingPathComponent:@"audio"];
NSString *uniquePath = [mainViewController uniqueFilename:voicePath withExtension:@"caf"];
NSURL *audioFileOutput = [NSURL fileURLWithPath:uniquePath];

if (!audioFileInput || !audioFileOutput)
{
    return NO;
}

[[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioFileInput options:nil];

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
if (exportSession == nil)
{
    return NO;
}

CMTime startTime = CMTimeMake((int)(floor(vocalStartMarker * 100)), 100);
CMTime stopTime = CMTimeMake((int)(ceil(vocalEndMarker * 100)), 100);
CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

exportSession.outputURL = audioFileOutput;
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.timeRange = exportTimeRange;

[exportSession exportAsynchronouslyWithCompletionHandler:^
{
    if (AVAssetExportSessionStatusCompleted == exportSession.status)
    {
        NSLog(@"it worked:%@", uniquePath);
        // It worked!
        [music replaceObjectAtIndex:currentPage withObject:uniquePath];
    }
    else if (AVAssetExportSessionStatusFailed == exportSession.status)
    {
        NSLog(@"it failed");
        // It failed...
    }
}];

[self.popoverController dismissPopoverAnimated:YES];
self.popoverController = nil;
return YES;
