Videos shot on Android phones get ruined after editing with AVFoundation on iOS - ios

I am working on an app that requires editing videos (adding overlays). Videos shot on iPhones are edited fine, but videos shot on Android phones come out blank after editing.
I can't imagine what the problem could be. I would appreciate immediate help.
This is one of the methods (the trim functionality):
- (IBAction)cutButtonTapped:(id)sender {
    hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    hud.mode = MBProgressHUDModeText;
    hud.labelText = @"Encoding...";
    [self.playButton setBackgroundImage:[UIImage imageNamed:@"video_pause.png"] forState:UIControlStateNormal];
    NSString *uniqueString = [[NSProcessInfo processInfo] globallyUniqueString];
    // Do this to export the video.
    NSURL *videoFileUrl = [NSURL fileURLWithPath:[AppHelper userDefaultsForKey:@"videoURL"]];
    AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
    if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
        self.exportSession_ = [[AVAssetExportSession alloc] initWithAsset:anAsset
                                                               presetName:AVAssetExportPresetPassthrough];
        // Implementation continues.
        // NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:@".mov"]];
        NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
        self.exportSession_.outputURL = furl;
        self.exportSession_.outputFileType = AVFileTypeMPEG4;
        CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
        CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
        CMTimeRange range = CMTimeRangeMake(start, duration);
        CMTimeShow(self.exportSession_.timeRange.duration);
        self.exportSession_.timeRange = range;
        CMTimeShow(self.exportSession_.timeRange.duration);
        [self.exportSession_ exportAsynchronouslyWithCompletionHandler:^{
            switch ([self.exportSession_ status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [[self.exportSession_ error] localizedDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                default:
                    NSLog(@"NONE");
                    dispatch_async(dispatch_get_main_queue(), ^{
                        // [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:@".mov"]];
                        [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
                    });
            }
        }];
    }
}
Could anyone please help me with this?

First of all, I recommend checking the duration and range values. It seems like an issue with CMTime and decoding.
Second, try initialising your AVURLAsset with the option that forces precise duration extraction:
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];
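Putting both suggestions together, here is a minimal sketch of the trim setup with precise timing, deriving the CMTimeRange from the asset's own timescale (self.startTime and self.stopTime are carried over from the question; using a re-encoding preset instead of passthrough is my assumption, not part of the answer, since passthrough combined with a forced MPEG-4 output type can misbehave on files recorded elsewhere):

```objectivec
// Sketch, not a drop-in fix. Assumes videoFileUrl, self.startTime and
// self.stopTime exist as in the question.
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoFileUrl
                                            options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];

// Use the asset's own timescale so the trim points land on real media
// timestamps; files recorded on other platforms often use unusual timescales.
int32_t timescale = asset.duration.timescale;
CMTime start = CMTimeMakeWithSeconds(self.startTime, timescale);
CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, timescale);

// Re-encoding (MediumQuality) is more tolerant of non-iPhone sources than
// AVAssetExportPresetPassthrough with a forced container type (assumption).
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                 presetName:AVAssetExportPresetMediumQuality];
session.outputFileType = AVFileTypeMPEG4;
session.timeRange = CMTimeRangeMake(start, duration);
```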

Related

iOS how to export video from PHAsset faster?

I need to upload a video from the album, so I use PHAsset to fetch the video source. I export the video from the PHAsset to the sandbox, where it is compressed at different quality presets. I found that exporting at a high preset takes too long, so I am looking for a faster way to export the video without losing quality.
My video is about 500 MB and takes roughly half a minute to export at the highest preset.
Or is there another way to upload the album video directly, without exporting?
Here is my way to export:
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionCurrent;
options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *avasset, AVAudioMix *audioMix, NSDictionary *info) {
    // NSLog(@"Info:\n%@", info);
    AVURLAsset *videoAsset = (AVURLAsset *)avasset;
    // NSLog(@"AVAsset URL: %@", myAsset.URL);
    [self startExportVideoWithVideoAsset:videoAsset presetName:presetName success:success failure:failure];
}];
- (void)startExportVideoWithVideoAsset:(AVURLAsset *)videoAsset presetName:(NSString *)presetName success:(void (^)(NSString *outputPath))success failure:(void (^)(NSString *errorMessage, NSError *error))failure {
    // Find the presets compatible with this video asset.
    NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:videoAsset];
    // Begin to compress the video.
    // For now we just compress to a low resolution if it is supported.
    // If you need to upload to a server that doesn't support streaming uploads,
    // you can compress to a lower resolution, or keep a higher one.
    if ([presets containsObject:presetName]) {
        AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:presetName];
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"yyyy-MM-dd-HH:mm:ss-SSS"];
        // NSLog(@"video outputPath = %@", outputPath);
        NSString *outputPath = [NSString stringWithFormat:@"%@output-%@.mp4",
                                [UserCenterUtiles TempVideoPath],
                                [formatter stringFromDate:[NSDate date]]];
        session.outputURL = [NSURL fileURLWithPath:outputPath];
        // Optimize for network use.
        session.shouldOptimizeForNetworkUse = YES;
        NSArray *supportedTypeArray = session.supportedFileTypes;
        if ([supportedTypeArray containsObject:AVFileTypeMPEG4]) {
            session.outputFileType = AVFileTypeMPEG4;
        } else if (supportedTypeArray.count == 0) {
            if (failure) {
                failure(@"No supported file types", nil);
            }
            NSLog(@"No supported file types");
            return;
        } else {
            session.outputFileType = [supportedTypeArray objectAtIndex:0];
        }
        if (![[NSFileManager defaultManager] fileExistsAtPath:[NSHomeDirectory() stringByAppendingFormat:@"/tmp"]]) {
            [[NSFileManager defaultManager] createDirectoryAtPath:[NSHomeDirectory() stringByAppendingFormat:@"/tmp"] withIntermediateDirectories:YES attributes:nil error:nil];
        }
        AVMutableVideoComposition *videoComposition = [self fixedCompositionWithAsset:videoAsset];
        if (videoComposition.renderSize.width) {
            session.videoComposition = videoComposition;
        }
        // Export the video to the output path asynchronously.
        [session exportAsynchronouslyWithCompletionHandler:^(void) {
            dispatch_async(dispatch_get_main_queue(), ^{
                switch (session.status) {
                    case AVAssetExportSessionStatusUnknown: {
                        NSLog(@"AVAssetExportSessionStatusUnknown");
                    } break;
                    case AVAssetExportSessionStatusWaiting: {
                        NSLog(@"AVAssetExportSessionStatusWaiting");
                    } break;
                    case AVAssetExportSessionStatusExporting: {
                        NSLog(@"AVAssetExportSessionStatusExporting");
                    } break;
                    case AVAssetExportSessionStatusCompleted: {
                        NSLog(@"AVAssetExportSessionStatusCompleted");
                        if (success) {
                            success(outputPath);
                        }
                    } break;
                    case AVAssetExportSessionStatusFailed: {
                        NSLog(@"AVAssetExportSessionStatusFailed");
                        if (failure) {
                            failure(@"export failed", session.error);
                        }
                        [[NSNotificationCenter defaultCenter] postNotificationName:@"XMVideoOutputNoticeKey" object:[NSNumber numberWithFloat:session.progress]];
                    } break;
                    case AVAssetExportSessionStatusCancelled: {
                        NSLog(@"AVAssetExportSessionStatusCancelled");
                        if (failure) {
                            failure(@"export canceled", nil);
                        }
                        [[NSNotificationCenter defaultCenter] postNotificationName:@"XMVideoOutputNoticeKey" object:[NSNumber numberWithFloat:session.progress]];
                    } break;
                    default: break;
                }
            });
        }];
    } else {
        if (failure) {
            NSString *errorMessage = [NSString stringWithFormat:@"the device does not support this export preset"];
            failure(errorMessage, nil);
        }
    }
}

Fade-in, fade-out effect in audio by using AVAssetExportSession in iOS

I have trimmed an audio file to a particular duration using AVAssetExportSession, and I do get the trimmed audio.
But my problem is that I want to add a fade-in and fade-out effect to the audio.
Let me know how I can solve this.
Any help will be appreciated.
The code for trimming the audio is here:
- (void)trimAudio:(NSString *)inputAudioPath audioStartTime:(float)sTime audioEndTime:(float)eTime outputPath:(NSString *)outputFilePath mode:(NSInteger)kSelectionMode
{
    @try
    {
        AVAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputAudioPath] options:nil];
        // Create the session with the asset.
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
        // Set the output URL.
        exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];
        // Trim to the requested duration.
        CMTime startTime = CMTimeMake((int)(floor(sTime * 100)), 100);
        CMTime stopTime = CMTimeMake((int)(ceil(eTime * 100)), 100);
        CMTimeRange range = CMTimeRangeFromTimeToTime(startTime, stopTime);
        exportSession.timeRange = range;
        exportSession.outputFileType = AVFileTypeAppleM4A;
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch (exportSession.status) {
                case AVAssetExportSessionStatusCompleted: {
                    NSLog(@"Export Complete");
                    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(exportSession.outputURL.path)) {
                        UISaveVideoAtPathToSavedPhotosAlbum(exportSession.outputURL.path, nil, nil, nil);
                        if ([self.delegate respondsToSelector:@selector(trimDidSucceed:mode:)]) {
                            [self.delegate trimDidSucceed:outputFilePath mode:kTrimAudio];
                        }
                        else {
                            [self.delegate trimDidFail:exportSession.error];
                        }
                    }
                    break;
                }
                case AVAssetExportSessionStatusFailed: {
                    NSLog(@"Export Error: %@", [exportSession.error description]);
                    break;
                }
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    break;
                default:
                    break;
            }
        }];
        exportSession = nil;
    }
    @catch (NSException *e)
    {
        NSLog(@"Exception Name:%@ Reason:%@", [e name], [e reason]);
    }
}
// Fade in / fade out.
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:asset.tracks.lastObject];
int start = 2, length = 3;
[exportAudioMixInputParameters setVolume:0.0 atTime:CMTimeMakeWithSeconds(start - 1, 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds(start, 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds(start + 1, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds(start + 2, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds(start + length - 2, 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds(start + length - 1, 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds(start + length, 1)];
exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
exportSession.audioMix = exportAudioMix; // fade in/out audio mix
Just add these few lines before [exportSession exportAsynchronouslyWithCompletionHandler:^{ ... }];
Alternatively, you can handle it at playback time with an NSTimer: when playback reaches a particular second, step the volume up, and when it reaches the fade-out point, step it back down to the initial volume.
Check this link for reference.
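If you want a smooth ramp rather than the discrete volume steps shown above, AVMutableAudioMixInputParameters also provides setVolumeRampFromStartVolume:toEndVolume:timeRange:. A minimal sketch (the one-second fade durations are an assumption; asset and exportSession are the variables from the trim code above):

```objectivec
// Sketch: smooth fade-in/fade-out via volume ramps instead of discrete steps.
// Assumes `asset` and `exportSession` are set up as in the trim code above.
AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

CMTime fadeDuration = CMTimeMakeWithSeconds(1.0, 600); // 1-second fades (assumption)
CMTimeRange fadeIn = CMTimeRangeMake(kCMTimeZero, fadeDuration);
CMTimeRange fadeOut = CMTimeRangeMake(CMTimeSubtract(asset.duration, fadeDuration), fadeDuration);

// Ramp 0 -> 1 at the start and 1 -> 0 at the end of the asset.
[params setVolumeRampFromStartVolume:0.0 toEndVolume:1.0 timeRange:fadeIn];
[params setVolumeRampFromStartVolume:1.0 toEndVolume:0.0 timeRange:fadeOut];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[params];
exportSession.audioMix = audioMix;
```

Note that if you set a timeRange on the export session, the ramp time ranges are expressed in the source asset's timeline, so place them inside the trimmed range.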

Trying to split a video with `AVAssetExportSession`

I am trying to split a video into 4-second chunks with AVAssetExportSession. The initial split works and returns an 8 MB, 4-second chunk. But the second returns 12 MB, which seems incorrect when the original video is only 18 MB.
- (void)splitVideo {
    AVURLAsset *vidAsset = [AVURLAsset URLAssetWithURL:output options:nil];
    CMTime duration = vidAsset.duration;
    NSLog(@"File size is : %.2f MB And Duration: %f", (float)[NSData dataWithContentsOfURL:output].length / 1024.0f / 1024.0f, CMTimeGetSeconds(duration));
    splitArray = [[NSMutableArray alloc] init];
    CMTime end = CMTimeMake(4, 1);
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, end);
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output0.mp4"];
    totalSeconds = 4.0f;
    [self cutVideo:output withRange:range withOutput:outputPath];
}

- (void)cutVideo:(NSURL *)url withRange:(CMTimeRange)range withOutput:(NSString *)path {
    AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
    if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        NSURL *finalUrl = [NSURL fileURLWithPath:path];
        [[NSFileManager defaultManager] removeItemAtURL:finalUrl error:NULL];
        exportSession.outputURL = finalUrl;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.timeRange = range;
        NSLog(@"start: %f end: %f", CMTimeGetSeconds(range.start), CMTimeGetSeconds(range.duration));
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
            });
            if ([exportSession status] == AVAssetExportSessionStatusCompleted) {
                NSData *videoData = [[NSData alloc] initWithContentsOfURL:exportSession.outputURL];
                NSLog(@"DL: %f", (float)videoData.length / 1024.0f / 1024.0f);
                [self makeFile:finalUrl];
                AVURLAsset *fullVid = [AVURLAsset URLAssetWithURL:output options:nil];
                CMTime start = CMTimeMake(totalSeconds, 1);
                totalSeconds = totalSeconds + 4.0f;
                CMTime end;
                if ((CMTimeGetSeconds(start) + 4) > CMTimeGetSeconds(fullVid.duration)) {
                    end = fullVid.duration;
                } else {
                    end = CMTimeMake(CMTimeGetSeconds(start) + 4, 1);
                }
                CMTimeRange range2 = CMTimeRangeMake(start, end);
                NSLog(@"%f < %f\n\n", CMTimeGetSeconds(start), CMTimeGetSeconds(fullVid.duration));
                if (CMTimeGetSeconds(start) < CMTimeGetSeconds(fullVid.duration)) {
                    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"output%lu.mp4", splitArray.count]];
                    [self cutVideo:output withRange:range2 withOutput:outputPath];
                } else {
                    [self saveVideo:true];
                }
            } else if ([exportSession status] == AVAssetExportSessionStatusFailed) {
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            } else if ([exportSession status] == AVAssetExportSessionStatusCancelled) {
                NSLog(@"Export canceled");
            }
        }];
    }
}
File size is : 18.86 MB And Duration: 9.171667
first
start: 0.000000 end: 4.000000
DL: 8.194733
4.000000 < 9.171667
second
start: 4.000000 end: 8.000000
DL: 12.784523
It's not incorrect: video encoders store changes from the previous frame, not just a set of "images". I guess your video has more color changes in the second chunk, which is why it takes more space.

Strange Issue loading audio files within app

I am having issues loading certain audio files in my app: some songs/audio files work fine, others do not, and I can't figure out how or why.
Cloud-based iTunes music definitely does not work. If the music has been downloaded to the device from iTunes, it will mostly load, but even then some files simply refuse to load.
Here is the code for opening:
#define EXPORT_NAME @"exported.m4a"

- (void)openSoundFile:(ProductInfo *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    if (currentProduct) {
        [currentProduct release];
        currentProduct = nil;
    }
    currentProduct = [[ProductInfo alloc] init];
    currentProduct.index = info.index;
    currentProduct.parent = info.parent;
    currentProduct.title = [info.title copy];
    currentProduct.name = [info.name copy];
    currentProduct.tempo = info.tempo;
    currentProduct.pitch = info.pitch;
    currentProduct.key = info.key;
    [m_lblName setText:info.title];
    [m_viewIsLoading setHidden:NO];
    [self loadSoundFileToM4A];
}
Here is the loadSoundFileToM4A method:
- (void)loadSoundFileToM4A
{
    NSURL *assetURL = [NSURL URLWithString:currentProduct.name];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                      initWithAsset:songAsset
                                         presetName:AVAssetExportPresetAppleM4A];
    exporter.outputFileType = @"com.apple.m4a-audio";
    // Set up the export (hang on to exportURL so the PCM conversion can find it).
    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [[documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME] retain];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    exporter.outputURL = exportURL;
    // Do the export.
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        int exportStatus = exporter.status;
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed: {
                // Log the error to the text view.
                NSError *exportError = exporter.error;
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
                [self loadSoundFileFail];
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSLog(@"AVAssetExportSessionStatusCompleted");
                // Set up AVPlayer.
                [self loadSoundFile];
                break;
            }
            case AVAssetExportSessionStatusUnknown: { NSLog(@"AVAssetExportSessionStatusUnknown"); break; }
            case AVAssetExportSessionStatusExporting: { NSLog(@"AVAssetExportSessionStatusExporting"); break; }
            case AVAssetExportSessionStatusCancelled: { NSLog(@"AVAssetExportSessionStatusCancelled"); break; }
            case AVAssetExportSessionStatusWaiting: { NSLog(@"AVAssetExportSessionStatusWaiting"); break; }
            default: { NSLog(@"didn't get export status"); break; }
        }
    }];
}
When it fails, this method is called:
- (void)loadSoundFileFail
{
    [m_viewIsLoading setHidden:YES];
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failed"
                                                    message:@"Couldn't load file"
                                                   delegate:self
                                          cancelButtonTitle:nil
                                          otherButtonTitles:@"OK", nil];
    [alert show];
}
And on success, this one:
- (void)loadSoundFile
{
    [m_viewIsLoading setHidden:YES];
    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [[documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME] retain];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        if (m_pPlayer != NULL)
        {
            if (m_pPlayer->IsRunning())
            {
                OSStatus result = m_pPlayer->StopQueue();
                if (result == noErr)
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"soundQueueResumed" object:self];
            }
            m_pPlayer->DisposeQueue(TRUE);
            CFStringRef strFilePath = (CFStringRef)exportPath;
            m_pPlayer->CreateQueueForFile(strFilePath);
            nTempo = (float)currentProduct.tempo;
            nPitch = (float)currentProduct.pitch / 100.0f;
            nKey = (float)currentProduct.key;
            [self setKeyValue];
            [self setTempoValue];
            [self setPitchValue];
            [m_sldTempo setValue:nTempo];
            [m_sldPitch setValue:nPitch];
            [m_sldKey setValue:nKey];
            [self setRepeatSwitchState];
            [self setReverseSwitchState];
            // Set the button's state back to "record".
            if (m_pPlayer->Queue() == NULL)
                m_btnPlay.enabled = NO;
            else
                m_btnPlay.enabled = YES;
        }
    }
}
I capture this error in the debugger:
AVAssetExportSessionStatusFailed: Error Domain=NSURLErrorDomain Code=-1 "unknown error"
UserInfo=0x1700ffa80 {NSUnderlyingError=0x17804edf0 "The operation couldn’t be completed.
(OSStatus error -12935.)", NSErrorFailingURLStringKey=(null), NSErrorFailingURLKey=(null), NSURL=(null),
NSLocalizedDescription=unknown error}

Using [exporter exportAsynchronouslyWithCompletionHandler:] to export an audio file from an AVAsset takes a long time

I am using the code below to get the audio from the library for trimming.
My problem is that I am able to get the audio, but it takes a long time.
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:appDelegate.librarySongURL options:nil];
NSLog(@"Core Audio %@ directly open library URL %@",
      coreAudioCanOpenURL(appDelegate.librarySongURL) ? @"can" : @"cannot",
      appDelegate.librarySongURL);
NSLog(@"compatible presets for songAsset: %@",
      [AVAssetExportSession exportPresetsCompatibleWithAsset:songAsset]);
AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                  initWithAsset:songAsset
                                     presetName:AVAssetExportPresetAppleM4A];
NSLog(@"created exporter. supportedFileTypes: %@", exporter.supportedFileTypes);
[self handlePlayPauseDefault:0];
exporter.outputFileType = @"com.apple.m4a-audio";
NSString *exportFile = [myDocumentsDirectory() stringByAppendingPathComponent:@"exported.m4a"];
myDeleteFile(exportFile);
appDelegate.libraryUrl = [NSURL fileURLWithPath:exportFile];
exporter.outputURL = appDelegate.libraryUrl;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    int exportStatus = exporter.status;
    switch (exportStatus) {
        case AVAssetExportSessionStatusFailed: {
            NSError *exportError = exporter.error;
            NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            NSLog(@"AVAssetExportSessionStatusCompleted");
            self.navigationController.navigationBar.userInteractionEnabled = YES;
            break;
        }
        case AVAssetExportSessionStatusUnknown: { NSLog(@"AVAssetExportSessionStatusUnknown"); break; }
        case AVAssetExportSessionStatusExporting: { NSLog(@"AVAssetExportSessionStatusExporting"); break; }
        case AVAssetExportSessionStatusCancelled: { NSLog(@"AVAssetExportSessionStatusCancelled"); break; }
        case AVAssetExportSessionStatusWaiting: { NSLog(@"AVAssetExportSessionStatusWaiting"); break; }
        default: { NSLog(@"didn't get export status"); break; }
    }
}];
Can anyone help me?
You're probably triggering some kind of format conversion, and that will be slow (not much faster than real time). Make sure you're using the passthrough preset, AVAssetExportPresetPassthrough.
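A minimal sketch of the passthrough variant (the output path is hypothetical; since passthrough copies samples without re-encoding, the output file type has to come from the session's supportedFileTypes rather than being forced to M4A):

```objectivec
// Sketch: export without re-encoding. Assumes `songAsset` is the AVURLAsset
// from the code above and the source is a QuickTime/MPEG-4 family file.
AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                  initWithAsset:songAsset
                                     presetName:AVAssetExportPresetPassthrough];
// With passthrough there is no transcode, so pick an output container that
// the session reports as supported instead of forcing com.apple.m4a-audio.
NSLog(@"passthrough supportedFileTypes: %@", exporter.supportedFileTypes);
exporter.outputFileType = exporter.supportedFileTypes.firstObject;
exporter.outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"exported-passthrough.mov"]]; // hypothetical path
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"status: %ld error: %@", (long)exporter.status, exporter.error);
}];
```

The trade-off is that the output keeps the source codec, so if you need an .m4a specifically, you cannot avoid the (slower) AppleM4A re-encode.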
