Strange issue loading audio files within app - iOS

I am having issues loading certain audio files in my app: some songs load fine, others do not, and I can't figure out why.
Cloud-based iTunes music never works. Music that has been downloaded to the device from iTunes usually loads, but even then some files simply refuse to.
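Aside (my note, not part of the original question): if currentProduct.name comes from the iPod library via MPMediaItemPropertyAssetURL, cloud and DRM-protected items can be filtered out before the exporter ever runs, since neither has an exportable local file. A minimal sketch under that assumption:
#import <MediaPlayer/MediaPlayer.h>
// Returns NO for iTunes Match / cloud items and for DRM-protected tracks,
// both of which lack an exportable local asset.
static BOOL MediaItemLooksExportable(MPMediaItem *item)
{
    NSNumber *isCloud = [item valueForProperty:MPMediaItemPropertyIsCloudItem];
    if ([isCloud boolValue])
        return NO; // cloud item: nothing on disk to export
    NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
    return (assetURL != nil); // nil for protected or missing media
}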
Here is the code for opening a file:
#define EXPORT_NAME @"exported.m4a"

- (void)openSoundFile:(ProductInfo *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    if (currentProduct) {
        [currentProduct release];
        currentProduct = nil;
    }
    currentProduct = [[ProductInfo alloc] init];
    currentProduct.index = info.index;
    currentProduct.parent = info.parent;
    currentProduct.title = [info.title copy];
    currentProduct.name = [info.name copy];
    currentProduct.tempo = info.tempo;
    currentProduct.pitch = info.pitch;
    currentProduct.key = info.key;
    [m_lblName setText:info.title];
    [m_viewIsLoading setHidden:NO];
    [self loadSoundFileToM4A];
}
Then the loadSoundFileToM4A method:
- (void)loadSoundFileToM4A
{
    NSURL *assetURL = [NSURL URLWithString:currentProduct.name];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                      initWithAsset:songAsset
                                      presetName:AVAssetExportPresetAppleM4A];
    exporter.outputFileType = @"com.apple.m4a-audio";

    // set up export (hang on to exportURL so convert to PCM can find it)
    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [[documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME] retain];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    exporter.outputURL = exportURL;

    // do the export
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        AVAssetExportSessionStatus exportStatus = exporter.status;
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed: {
                // log error to text view
                NSError *exportError = exporter.error;
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
                [self loadSoundFileFail];
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSLog(@"AVAssetExportSessionStatusCompleted");
                // set up AVPlayer
                [self loadSoundFile];
                break;
            }
            case AVAssetExportSessionStatusUnknown:   { NSLog(@"AVAssetExportSessionStatusUnknown"); break; }
            case AVAssetExportSessionStatusExporting: { NSLog(@"AVAssetExportSessionStatusExporting"); break; }
            case AVAssetExportSessionStatusCancelled: { NSLog(@"AVAssetExportSessionStatusCancelled"); break; }
            case AVAssetExportSessionStatusWaiting:   { NSLog(@"AVAssetExportSessionStatusWaiting"); break; }
            default: { NSLog(@"didn't get export status"); break; }
        }
    }];
}
When it fails, this method is called:
- (void)loadSoundFileFail
{
    [m_viewIsLoading setHidden:YES];
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failed"
                                                    message:@"Couldn't load file"
                                                   delegate:self
                                          cancelButtonTitle:nil
                                          otherButtonTitles:@"OK", nil];
    [alert show];
}
And on success, this method:
- (void)loadSoundFile
{
    [m_viewIsLoading setHidden:YES];
    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [[documentsDirectoryPath stringByAppendingPathComponent:EXPORT_NAME] retain];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        if (m_pPlayer != NULL)
        {
            if (m_pPlayer->IsRunning())
            {
                OSStatus result = m_pPlayer->StopQueue();
                if (result == noErr)
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"soundQueueResumed" object:self];
            }
            m_pPlayer->DisposeQueue(TRUE);
            CFStringRef strFilePath = (CFStringRef)exportPath;
            m_pPlayer->CreateQueueForFile(strFilePath);
            nTempo = (float)currentProduct.tempo;
            nPitch = (float)currentProduct.pitch / 100.0f;
            nKey = (float)currentProduct.key;
            [self setKeyValue];
            [self setTempoValue];
            [self setPitchValue];
            [m_sldTempo setValue:nTempo];
            [m_sldPitch setValue:nPitch];
            [m_sldKey setValue:nKey];
            [self setRepeatSwitchState];
            [self setReverseSwitchState];
            // Set the button's state back to "record"
            if (m_pPlayer->Queue() == NULL)
                m_btnPlay.enabled = NO;
            else
                m_btnPlay.enabled = YES;
        }
    }
}
This is the error I capture in the debugger:
AVAssetExportSessionStatusFailed: Error Domain=NSURLErrorDomain Code=-1 "unknown error"
UserInfo=0x1700ffa80 {NSUnderlyingError=0x17804edf0 "The operation couldn’t be completed.
(OSStatus error -12935.)", NSErrorFailingURLStringKey=(null), NSErrorFailingURLKey=(null), NSURL=(null),
NSLocalizedDescription=unknown error}
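
One pre-flight check that would narrow this down (my suggestion, not part of the original code): AVAsset exposes exportable and hasProtectedContent, and DRM-protected iTunes purchases fail with exactly this kind of opaque OSStatus error. A sketch of a guard at the top of -loadSoundFileToM4A:
NSURL *assetURL = [NSURL URLWithString:currentProduct.name];
if (assetURL == nil) {
    // Cloud items report a nil MPMediaItemPropertyAssetURL, so the string is empty/nil
    [self loadSoundFileFail];
    return;
}
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
if (![songAsset isExportable] || [songAsset hasProtectedContent]) {
    // DRM-protected or otherwise non-exportable: AVAssetExportSession cannot process it
    [self loadSoundFileFail];
    return;
}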

Related

iOS how to export video from PHAsset faster?

I need to upload videos from the album, so I use PHAsset to fetch the video source and export it from the PHAsset to the sandbox, where the video is compressed at different rates. I found that exporting at a high rate takes too long, so I am looking for a faster way to export the video without losing quality.
For example, a 500 MB video takes about half a minute to export at the highest rate.
Or is there another way to upload a video from the album directly, without exporting?
Here is how I export:
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionCurrent;
options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *avasset, AVAudioMix *audioMix, NSDictionary *info) {
    // NSLog(@"Info:\n%@", info);
    AVURLAsset *videoAsset = (AVURLAsset *)avasset;
    // NSLog(@"AVAsset URL: %@", myAsset.URL);
    [self startExportVideoWithVideoAsset:videoAsset presetName:presetName success:success failure:failure];
}];
- (void)startExportVideoWithVideoAsset:(AVURLAsset *)videoAsset presetName:(NSString *)presetName success:(void (^)(NSString *outputPath))success failure:(void (^)(NSString *errorMessage, NSError *error))failure
{
    // Find the presets compatible with this video asset.
    NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:videoAsset];
    // Begin to compress the video.
    // For now we just compress to a low resolution if the asset supports it.
    // If you need to upload to a server that doesn't support streaming uploads,
    // you can compress to a lower resolution, or support higher resolutions as needed.
    if ([presets containsObject:presetName]) {
        AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:presetName];
        NSDateFormatter *formater = [[NSDateFormatter alloc] init];
        [formater setDateFormat:@"yyyy-MM-dd-HH:mm:ss-SSS"];
        // NSLog(@"video outputPath = %@", outputPath);
        NSString *outputPath = [NSString stringWithFormat:@"%@output-%@.mp4",
                                [UserCenterUtiles TempVideoPath],
                                [formater stringFromDate:[NSDate date]]];
        session.outputURL = [NSURL fileURLWithPath:outputPath];
        // Optimize for network use.
        session.shouldOptimizeForNetworkUse = true;
        NSArray *supportedTypeArray = session.supportedFileTypes;
        if ([supportedTypeArray containsObject:AVFileTypeMPEG4]) {
            session.outputFileType = AVFileTypeMPEG4;
        } else if (supportedTypeArray.count == 0) {
            if (failure) {
                failure(@"No supported file types", nil);
            }
            NSLog(@"No supported file types");
            return;
        } else {
            session.outputFileType = [supportedTypeArray objectAtIndex:0];
        }
        if (![[NSFileManager defaultManager] fileExistsAtPath:[NSHomeDirectory() stringByAppendingFormat:@"/tmp"]]) {
            [[NSFileManager defaultManager] createDirectoryAtPath:[NSHomeDirectory() stringByAppendingFormat:@"/tmp"] withIntermediateDirectories:YES attributes:nil error:nil];
        }
        AVMutableVideoComposition *videoComposition = [self fixedCompositionWithAsset:videoAsset];
        if (videoComposition.renderSize.width) {
            session.videoComposition = videoComposition;
        }
        // Begin to export the video to the output path asynchronously.
        [session exportAsynchronouslyWithCompletionHandler:^(void) {
            dispatch_async(dispatch_get_main_queue(), ^{
                switch (session.status) {
                    case AVAssetExportSessionStatusUnknown: {
                        NSLog(@"AVAssetExportSessionStatusUnknown");
                    } break;
                    case AVAssetExportSessionStatusWaiting: {
                        NSLog(@"AVAssetExportSessionStatusWaiting");
                    } break;
                    case AVAssetExportSessionStatusExporting: {
                        NSLog(@"AVAssetExportSessionStatusExporting");
                    } break;
                    case AVAssetExportSessionStatusCompleted: {
                        NSLog(@"AVAssetExportSessionStatusCompleted");
                        if (success) {
                            success(outputPath);
                        }
                    } break;
                    case AVAssetExportSessionStatusFailed: {
                        NSLog(@"AVAssetExportSessionStatusFailed");
                        if (failure) {
                            failure(@"export failed", session.error);
                        }
                        [[NSNotificationCenter defaultCenter] postNotificationName:@"XMVideoOutputNoticeKey" object:[NSNumber numberWithFloat:session.progress]];
                    } break;
                    case AVAssetExportSessionStatusCancelled: {
                        NSLog(@"AVAssetExportSessionStatusCancelled");
                        if (failure) {
                            failure(@"export canceled", nil);
                        }
                        [[NSNotificationCenter defaultCenter] postNotificationName:@"XMVideoOutputNoticeKey" object:[NSNumber numberWithFloat:session.progress]];
                    } break;
                    default: break;
                }
            });
        }];
    } else {
        if (failure) {
            NSString *errorMessage = [NSString stringWithFormat:@"the device does not support this export preset"];
            failure(errorMessage, nil);
        }
    }
}
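If the upload doesn't actually need re-encoding, the transcode can be skipped entirely. One option (an assumption about the requirements, not from the question) is to copy the original file out of the photo library with PHAssetResourceManager, which streams the bytes without decoding; outputURL below is a hypothetical destination in the sandbox:
// Copy the asset's original video resource to a sandbox file without re-encoding.
PHAssetResource *resource = [[PHAssetResource assetResourcesForAsset:asset] firstObject];
PHAssetResourceRequestOptions *resourceOptions = [[PHAssetResourceRequestOptions alloc] init];
resourceOptions.networkAccessAllowed = YES; // fetch from iCloud if needed
[[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                            toFile:outputURL
                                                           options:resourceOptions
                                                 completionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"copy failed: %@", error);
    } else {
        // No re-encode happened, so file size and quality are untouched.
        NSLog(@"original copied to %@", outputURL);
    }
}];
The passthrough preset (AVAssetExportPresetPassthrough) is the other common way to avoid the encode, though it is not guaranteed to be compatible with every asset and output file type combination.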

Cannot download file from Google Drive in iOS

I am very new to iOS development. I am developing an app in which I want to download a file from Google Drive (pdf, docx, doc, and xlsx) and then send it to our server.
First I installed the client in my Podfile using:
pod 'GoogleAPIClient/Drive', '~> 1.0.2'
pod 'GTMOAuth2', '~> 1.1.0'
Second, I created a project in the Google Developer Console with my bundle ID, got a client ID, and saved it into my project.
When the user taps the select-file button:
- (IBAction)onclickfileFromDrive:(id)sender
{
    self.service = [[GTLServiceDrive alloc] init];
    self.service.authorizer = [GTMOAuth2ViewControllerTouch authForGoogleFromKeychainForName:kKeychainItemName clientID:kClientID clientSecret:nil];
    NSLog(@"service reply %hhd", self.service.authorizer.canAuthorize);
    if (!self.service.authorizer.canAuthorize)
    {
        [self presentViewController:[self createAuthController] animated:YES completion:nil];
    }
    else
    {
        [self fetchFiles];
    }
}
- (GTMOAuth2ViewControllerTouch *)createAuthController
{
    GTMOAuth2ViewControllerTouch *authController;
    NSArray *scopes = [NSArray arrayWithObjects:kGTLAuthScopeDrive, nil];
    authController = [[GTMOAuth2ViewControllerTouch alloc]
                      initWithScope:[scopes componentsJoinedByString:@" "]
                      clientID:kClientID
                      clientSecret:nil
                      keychainItemName:kKeychainItemName
                      delegate:self
                      finishedSelector:@selector(viewController:finishedWithAuth:error:)];
    return authController;
}
- (void)viewController:(GTMOAuth2ViewControllerTouch *)viewController finishedWithAuth:(GTMOAuth2Authentication *)authResult error:(NSError *)error
{
    if (error != nil)
    {
        [self showAlert:@"Authentication Error" message:error.localizedDescription];
        self.service.authorizer = nil;
    }
    else
    {
        self.service.authorizer = authResult;
        [self dismissViewControllerAnimated:YES completion:nil];
    }
    self.service.authorizer = [GTMOAuth2ViewControllerTouch authForGoogleFromKeychainForName:kKeychainItemName clientID:kClientID clientSecret:nil];
    NSLog(@"service reply %hhd", self.service.authorizer.canAuthorize);
    if (self.service.authorizer.canAuthorize)
    {
        [self fetchFiles];
    }
}
The login process works fine and I get an access token and refresh token. The problem comes when I try to download a file:
- (void)fetchFiles
{
    GTLQueryDrive *query = [GTLQueryDrive queryForFilesList];
    query.q = @"mimeType ='application/pdf' or mimeType ='text/plain' or mimeType ='application/msword' or mimeType ='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' or mimeType = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'";
    [self.service executeQuery:query delegate:self didFinishSelector:@selector(displayResultWithTicket:finishedWithObject:error:)];
}
- (void)displayResultWithTicket:(GTLServiceTicket *)ticket finishedWithObject:(GTLDriveFileList *)response error:(NSError *)error
{
    if (error == nil)
    {
        NSMutableString *filesString = [[NSMutableString alloc] init];
        if (response.files.count > 0)
        {
            for (GTLDriveFile *file in response.files)
            {
                [filesString appendFormat:@"%@ (%@)\n", file.name, file.identifier];
                NSString *url = [NSString stringWithFormat:@"https://www.googleapis.com/drive/v3/files/%@?key=%@", file.identifier, kClientID];
                GTMSessionFetcher *fetcher = [_service.fetcherService fetcherWithURLString:url];
                [fetcher beginFetchWithCompletionHandler:^(NSData *datav, NSError *error)
                {
                    if (error == nil)
                    {
                        NSLog(@" file web link %@", file.webViewLink);
                        NSLog(@"file extension %@", file.fileExtension);
                        NSLog(@"Retrieved file content %@", datav);
                        NSString *prefixString = @"EMR File";
                        NSString *guid = [[NSProcessInfo processInfo] globallyUniqueString];
                        NSString *uniqueFileName = [NSString stringWithFormat:@"%@_%@", prefixString, guid];
                        NSString *filePath = [[NSString alloc] init];
                        /***************** save the file in the Documents directory ***********************/
                        if (datav != nil)
                        {
                            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
                            // Get the docs directory
                            NSString *documentsDirectoryPath = [paths objectAtIndex:0];
                            NSString *folderPath = [documentsDirectoryPath stringByAppendingPathComponent:@"emrFiles"]; // subdirectory
                            if (![[NSFileManager defaultManager] fileExistsAtPath:folderPath])
                                [[NSFileManager defaultManager] createDirectoryAtPath:folderPath withIntermediateDirectories:NO attributes:nil error:nil];
                            // Add the file name to the file path
                            filePath = [folderPath stringByAppendingPathComponent:[uniqueFileName stringByAppendingFormat:@".%@", file.fileExtension]];
                            // Write the file to the Documents directory
                            [datav writeToFile:filePath atomically:YES];
                        }
                    }
                    else
                    {
                        NSLog(@"An error occurred: %@", error.localizedDescription);
                    }
                }];
            }
        }
        else
        {
            [filesString appendString:@"No files found."];
        }
    }
    else
    {
        [self showAlert:@"Error" message:error.localizedDescription];
    }
}
The file details such as:
NSLog(@" file web link %@", file.webViewLink);
NSLog(@"file extension %@", file.fileExtension);
file.fileExtension
come back null every time. However, I do get values for
file.name
file.identifier
Finally, if I try to download a file using this URL:
NSString *url = [NSString stringWithFormat:@"https://www.googleapis.com/drive/v3/files/%@?key=%@", file.identifier, kClientID];
or
NSString *url = [NSString stringWithFormat:@"https://www.googleapis.com/drive/v3/files/%@?key=My API key from google console",
myfile.identifier];
I get back an NSData with very little content:
Printing description of datav:
<7b0a2022 6b696e64 223a2022 64726976 65236669 6c65222c 0a202269
64223a20 22304236 6a6e576c 46497868 434f566e 6c586546 4a496344
526d596d 68694e44 6c735432 557a516b 7057566c 70716431 5246222c
0a20226e 616d6522 3a202253 41544859 412e646f 6378222c 0a20226d
696d6554 79706522 3a202261 70706c69 63617469 6f6e2f76 6e642e6f
70656e78 6d6c666f 726d6174 732d6f66 66696365 646f6375 6d656e74
2e776f72 6470726f 63657373 696e676d 6c2e646f 63756d65 6e74220a 7d0a>
Because the file extension comes back null, I tried giving a static extension, saved the file in the directory, and tried to view it, and I get this error:
“ File_B85DABFF-BCA5-4B5E-8FDC-59D2907F0509-1053-000003B0E6E4C9A9.docx” can’t be opened for some reason.
I don't know where I am going wrong. I have searched a lot but could not find anything that helps; please, can anyone help me with this?
Thanks in advance.
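
For what it's worth, the hex dump above decodes to the file's JSON metadata ({"kind": "drive#file", "id": ..., "name": "SATHYA.docx", "mimeType": ...}), which suggests the request is hitting the metadata endpoint rather than downloading content. In Drive API v3, file content is only returned when alt=media is appended, and fields such as fileExtension and webViewLink are only populated when requested explicitly. A sketch under those assumptions (the fields string is illustrative):
// Ask the list query for the extra metadata fields; v3 omits them by default.
GTLQueryDrive *query = [GTLQueryDrive queryForFilesList];
query.fields = @"files(id,name,fileExtension,webViewLink,mimeType)";

// Download the actual bytes with alt=media; the service's fetcherService
// attaches the OAuth2 authorization header.
NSString *url = [NSString stringWithFormat:
    @"https://www.googleapis.com/drive/v3/files/%@?alt=media", file.identifier];
GTMSessionFetcher *fetcher = [_service.fetcherService fetcherWithURLString:url];
[fetcher beginFetchWithCompletionHandler:^(NSData *datav, NSError *error) {
    // datav should now be the file content rather than JSON metadata
}];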

Videos shot on Android phones get ruined after editing with AVFoundation on iOS

I am working on an app that requires editing videos (adding overlays). While videos shot on iPhones are edited fine, the ones shot on Android phones come out blank after editing.
I can't imagine what the problem could be and would appreciate any help.
This is one of the methods (trim functionality):
- (IBAction)cutButtonTapped:(id)sender {
    hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    hud.mode = MBProgressHUDModeText;
    hud.labelText = @"Encoding...";
    [self.playButton setBackgroundImage:[UIImage imageNamed:@"video_pause.png"] forState:UIControlStateNormal];
    NSString *uniqueString = [[NSProcessInfo processInfo] globallyUniqueString];
    // do this to export video
    NSURL *videoFileUrl = [NSURL fileURLWithPath:[AppHelper userDefaultsForKey:@"videoURL"]];
    AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
    if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
        self.exportSession_ = [[AVAssetExportSession alloc]
                               initWithAsset:anAsset presetName:AVAssetExportPresetPassthrough];
        // Implementation continues.
        // NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:@".mov"]];
        NSURL *furl = [self newURLWithName:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
        self.exportSession_.outputURL = furl;
        self.exportSession_.outputFileType = AVFileTypeMPEG4;
        CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
        CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
        CMTimeRange range = CMTimeRangeMake(start, duration);
        CMTimeShow(self.exportSession_.timeRange.duration);
        self.exportSession_.timeRange = range;
        CMTimeShow(self.exportSession_.timeRange.duration);
        [self.exportSession_ exportAsynchronouslyWithCompletionHandler:^{
            switch ([self.exportSession_ status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [[self.exportSession_ error] localizedDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                default:
                    NSLog(@"NONE");
                    dispatch_async(dispatch_get_main_queue(), ^{
                        // [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:@".mov"]];
                        [self playDocumentDirectoryVideoWithURLString:[uniqueString stringByAppendingString:[NSString stringWithFormat:@".%@", [videoFileUrl pathExtension]]]];
                    });
            }
        }];
    }
}
Could anyone please help me with this?
First of all, I recommend checking the duration and range values; it looks like an issue with CMTime and decoding.
Second, try initialising your AVURLAsset with the option that forces precise duration extraction:
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
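Building on that (my addition, not the original answerer's): once precise timing is available, the trim range can also be clamped to the asset's real duration, so the export never overruns the file when the container's nominal duration is off:
// Clamp the requested trim range to the asset's actual duration before export.
CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
CMTimeRange requested = CMTimeRangeMake(start, duration);
CMTimeRange whole = CMTimeRangeMake(kCMTimeZero, anAsset.duration);
self.exportSession_.timeRange = CMTimeRangeGetIntersection(requested, whole);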

AWS SDK for iOS from v1 to v2: about "S3ObjectSummary"

Here is a progress-bar function. In SDK v1 I could use S3ObjectSummary to get a summary of a file, but in SDK v2 I cannot find S3ObjectSummary.
Which class is the equivalent in v2? An example would be great.
I have the same question about
S3GetObjectRequest, S3GetObjectResponse, S3PutObjectRequest, and AmazonClientException.
This code uses the iOS SDK v1:
- (AmazonS3Client *)s3
{
    [self validateCredentials];
    return s3;
}

- (void)validateCredentials
{
    NSLog(@"validating credentials.");
    if (s3 == nil) {
        [self clearCredentials];
        s3 = [[AmazonS3Client alloc] initWithAccessKey:ACCESS_KEY_ID withSecretKey:SECRET_KEY];
    }
}

- (void)setProgressBar
{
    [delegate setProgressStatus:progPercent];
}

- (void)downloadPlists
{
    @try {
        NSArray *Plists = [[self s3] listObjectsInBucket:@"~~~~"];
        float numfile = 1;
        float totalfiles = [Plists count];
        for (S3ObjectSummary *file in Plists) {
            float percent = numfile / totalfiles;
            progPercent = [NSNumber numberWithFloat:percent];
            [self performSelectorOnMainThread:@selector(setProgressBar) withObject:progPercent waitUntilDone:YES];
            numfile++;
            NSString *key = [file key];
            NSLog(@"key: %@", key);
            if ([key rangeOfString:@".plist"].location != NSNotFound) {
                NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                NSString *plistFilePath = [NSString stringWithFormat:@"%@/Plists/%@", docDir, key];
                NSLog(@"plistFilePath: %@", plistFilePath);
                S3GetObjectRequest *plist = [[S3GetObjectRequest alloc] initWithKey:key withBucket:@"~~~~~"];
                S3GetObjectResponse *getObjectResponse = [[self s3] getObject:plist];
                NSData *data2 = [NSData dataWithData:getObjectResponse.body];
                NSString *courseFilePath = [plistFilePath substringToIndex:[plistFilePath rangeOfString:@"/" options:NSBackwardsSearch].location];
                bool testDirectoryCreated = [[NSFileManager defaultManager] createDirectoryAtPath:courseFilePath
                                                                       withIntermediateDirectories:YES
                                                                                        attributes:nil
                                                                                             error:NULL];
                if (!testDirectoryCreated)
                    NSLog(@"error creating test directory.");
                if (![data2 writeToFile:plistFilePath atomically:YES])
                    NSLog(@"error writing to path.");
            }
        }
    }
    @catch (NSException *exception) {
        UIAlertView *failureAlert = [[UIAlertView alloc] initWithTitle:@"Oops!" message:[NSString stringWithFormat:@"There was an error performing this operation. Please try again later. Error: %@", exception] delegate:nil cancelButtonTitle:@"Okay" otherButtonTitles:nil];
        [failureAlert show];
    }
}
I try to do the same thing in v2 with the following code; is it right?
- (void)downloadPlists
{
    AWSS3 *s3 = [AWSS3 defaultS3];
    AWSS3ListObjectsRequest *listObjectReq = [AWSS3ListObjectsRequest new];
    listObjectReq.bucket = @"PLists";
    [[[s3 listObjects:listObjectReq] continueWithBlock:^id(BFTask *task) {
        if (task.error) {
            NSLog(@"the request failed. error %@", task.error);
        }
        if (task.result) {
            AWSS3ListObjectsOutput *listObjectsOutput = task.result;
            NSArray *Plists = task.result; // Is the result of the task in listObjectsOutput an NSArray?
            float numfile = 1;
            float totalfiles = [Plists count];
            for (AWSS3Object *file in listObjectsOutput.contents) {
                float percent = numfile / totalfiles;
                progPercent = [NSNumber numberWithFloat:percent];
                [self performSelectorOnMainThread:@selector(setProgressBar) withObject:progPercent waitUntilDone:YES];
                numfile++;
                NSString *key = [file key];
                NSLog(@"key: %@", key);
                if ([key rangeOfString:@".plist"].location != NSNotFound) {
                    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                    NSString *plistFilePath = [NSString stringWithFormat:@"%@/Plists/%@", docDir, key];
                    NSLog(@"plistFilePath: %@", plistFilePath);
                    AWSS3TransferManager *transferManager = [AWSS3TransferManager defaultS3TransferManager];
                    AWSS3TransferManagerDownloadRequest *downloadRequest = [AWSS3TransferManagerDownloadRequest new];
                    downloadRequest.bucket = @"PLists";
                    downloadRequest.key = key;
                    // downloadRequest.downloadingFileURL = [NSURL fileURLWithPath:@"???"]; I'm not sure about the path; in SDK v1 there was no URL?
                    [[transferManager download:downloadRequest] continueWithBlock:^id(BFTask *task) {
                        if (task.error) {
                            UIAlertView *failureAlert = [[UIAlertView alloc] initWithTitle:@"Oops!"
                                                                                   message:[NSString stringWithFormat:@"There was an error performing this operation. Please try again later. Error: %@", task.error]
                                                                                  delegate:nil
                                                                         cancelButtonTitle:@"Okay"
                                                                         otherButtonTitles:nil];
                            [failureAlert show];
                        }
                        if (task.result) {
                            AWSS3TransferManagerDownloadOutput *downloadOutput = task.result;
                            NSData *data2 = [NSData dataWithData:downloadOutput.body];
                            NSString *courseFilePath = [plistFilePath substringToIndex:[plistFilePath rangeOfString:@"/" options:NSBackwardsSearch].location];
                            bool testDirectoryCreated = [[NSFileManager defaultManager] createDirectoryAtPath:courseFilePath
                                                                                   withIntermediateDirectories:YES
                                                                                                    attributes:nil
                                                                                                         error:NULL];
                            if (!testDirectoryCreated)
                                NSLog(@"error creating test directory.");
                            if (![data2 writeToFile:plistFilePath atomically:YES])
                                NSLog(@"error writing to path.");
                        }
                        return nil;
                    }];
                }
            }
            return nil;
        }
        return nil;
    }] waitUntilFinished]; // The integration test still uses waitUntilFinished, but "Working with BFTask" says continueWithBlock won't execute until the previous asynchronous call has finished executing?
}
AWSS3Object in v2 is equivalent to S3ObjectSummary in v1.
You are not invoking - listObjects:, so your v2 code snippet does not work. You should take a look at the integration test as an example. Note that you should avoid calling - waitUntilFinished in your production app. See Working with BFTask for further details.
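To make that concrete, here is a minimal sketch of a v2 download (my reading of the TransferManager API, not code from the original answer): setting downloadingFileURL tells the transfer manager where to write the object, so the manual writeToFile: step from v1 disappears.
AWSS3TransferManagerDownloadRequest *downloadRequest = [AWSS3TransferManagerDownloadRequest new];
downloadRequest.bucket = @"PLists";
downloadRequest.key = key;
downloadRequest.downloadingFileURL = [NSURL fileURLWithPath:plistFilePath]; // destination on disk
[[transferManager download:downloadRequest] continueWithBlock:^id(BFTask *task) {
    if (task.error) {
        NSLog(@"download failed: %@", task.error);
    } else {
        // The object has already been written to plistFilePath by the SDK.
        NSLog(@"downloaded to %@", downloadRequest.downloadingFileURL);
    }
    return nil;
}];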

How to programmatically perform fastest 'trans-wrap' of mov to mp4 on iPhone/iPad application?

I want to change the container of .mov video files that I pick using
UIImagePickerController and compress them via AVAssetExportSession with AVAssetExportPresetMediumQuality and shouldOptimizeForNetworkUse = YES into an .mp4 container.
I need a programmatic way (sample code) to perform the fastest possible trans-wrap in an iPhone/iPad application.
I tried to set the AVAssetExportSession.outputFileType property to AVFileTypeMPEG4, but it is not supported and I got an exception.
I tried to do this transform using AVAssetWriter by specifying fileType:AVFileTypeMPEG4. I did get an .mp4 output file, but it was not a trans-wrap: the output file was 3x bigger than the source, and the conversion took 128 seconds for a video with a 60-second duration.
I need a solution that runs quickly and keeps the file size.
This is the code I use to convert .mov to .mp4 (I set the assetWriter options in the setUpReaderAndWriterReturningError method):
#import "MCVideoConverter.h"
#import <AVFoundation/AVAsset.h>
#import <AVFoundation/AVAssetTrack.h>
#import <AVFoundation/AVAssetReader.h>
#import <AVFoundation/AVAssetReaderOutput.h>
#import <AVFoundation/AVAssetWriter.h>
#import <AVFoundation/AVAssetWriterInput.h>
#import <AVFoundation/AVMediaFormat.h>
#import <AVFoundation/AVAudioSettings.h>
#import <AVFoundation/AVVideoSettings.h>
#import <AVFoundation/AVAssetImageGenerator.h>
#import <AVFoundation/AVTime.h>
#import <CoreMedia/CMSampleBuffer.h>
@protocol RWSampleBufferChannelDelegate;

@interface RWSampleBufferChannel : NSObject
{
@private
    AVAssetReaderOutput *assetReaderOutput;
    AVAssetWriterInput *assetWriterInput;
    dispatch_block_t completionHandler;
    dispatch_queue_t serializationQueue;
    BOOL finished; // only accessed on serialization queue
}
- (id)initWithAssetReaderOutput:(AVAssetReaderOutput *)assetReaderOutput assetWriterInput:(AVAssetWriterInput *)assetWriterInput;
- (void)startWithDelegate:(id <RWSampleBufferChannelDelegate>)delegate completionHandler:(dispatch_block_t)completionHandler; // delegate is retained until completion handler is called. Completion handler is guaranteed to be called exactly once, whether reading/writing finishes, fails, or is cancelled. Delegate may be nil.
- (void)cancel;
@property (nonatomic, readonly) NSString *mediaType;
@end

@protocol RWSampleBufferChannelDelegate <NSObject>
@required
- (void)sampleBufferChannel:(RWSampleBufferChannel *)sampleBufferChannel didReadSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end

@interface MCVideoConverter () <RWSampleBufferChannelDelegate>
// These three methods are always called on the serialization dispatch queue
- (BOOL)setUpReaderAndWriterReturningError:(NSError **)outError; // make sure "tracks" key of asset is loaded before calling this
- (BOOL)startReadingAndWritingReturningError:(NSError **)outError;
- (void)readingAndWritingDidFinishSuccessfully:(BOOL)success withError:(NSError *)error;
@end
@implementation MCVideoConverter

+ (NSArray *)readableTypes
{
    return [AVURLAsset audiovisualTypes];
}

+ (BOOL)canConcurrentlyReadDocumentsOfType:(NSString *)typeName
{
    return YES;
}

- (id)init
{
    self = [super init];
    if (self)
    {
        NSString *serializationQueueDescription = [NSString stringWithFormat:@"%@ serialization queue", self];
        serializationQueue = dispatch_queue_create([serializationQueueDescription UTF8String], NULL);
    }
    return self;
}

- (void)dealloc
{
    [asset release];
    [outputURL release];
    [assetReader release];
    [assetWriter release];
    [audioSampleBufferChannel release];
    [videoSampleBufferChannel release];
    if (serializationQueue)
        dispatch_release(serializationQueue);
    [super dealloc];
}

@synthesize asset=asset;
@synthesize timeRange=timeRange;
@synthesize writingSamples=writingSamples;
@synthesize outputURL=outputURL;
@synthesize propgerssView;

- (void)convertVideo:(NSURL *)inputURL outputURL:(NSURL *)_outputURL progress:(UIProgressView *)_propgerssView
{
    self.asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    self.propgerssView = _propgerssView;
    cancelled = NO;
    [self performSelector:@selector(startProgressSheetWithURL:) withObject:_outputURL afterDelay:0.0]; // avoid starting a new sheet while in
}
- (void)startProgressSheetWithURL:(NSURL *)localOutputURL
{
    [self setOutputURL:localOutputURL];
    [self setWritingSamples:YES];
    AVAsset *localAsset = [self asset];
    [localAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObjects:@"tracks", @"duration", nil] completionHandler:^
    {
        // Dispatch the setup work to the serialization queue, to ensure this work is serialized with potential cancellation
        dispatch_async(serializationQueue, ^{
            // Since we are doing these things asynchronously, the user may have already cancelled on the main thread. In that case, simply return from this block
            if (cancelled)
                return;
            BOOL success = YES;
            NSError *localError = nil;
            success = ([localAsset statusOfValueForKey:@"tracks" error:&localError] == AVKeyValueStatusLoaded);
            if (success)
                success = ([localAsset statusOfValueForKey:@"duration" error:&localError] == AVKeyValueStatusLoaded);
            if (success)
            {
                [self setTimeRange:CMTimeRangeMake(kCMTimeZero, [localAsset duration])];
                // AVAssetWriter does not overwrite files for us, so remove the destination file if it already exists
                NSFileManager *fm = [NSFileManager defaultManager];
                NSString *localOutputPath = [localOutputURL path];
                if ([fm fileExistsAtPath:localOutputPath])
                    success = [fm removeItemAtPath:localOutputPath error:&localError];
            }
            // Set up the AVAssetReader and AVAssetWriter, then begin writing samples or flag an error
            if (success)
                success = [self setUpReaderAndWriterReturningError:&localError];
            if (success)
                success = [self startReadingAndWritingReturningError:&localError];
            if (!success)
                [self readingAndWritingDidFinishSuccessfully:success withError:localError];
        });
    }];
}
- (BOOL)setUpReaderAndWriterReturningError:(NSError **)outError
{
    BOOL success = YES;
    NSError *localError = nil;
    AVAsset *localAsset = [self asset];
    NSURL *localOutputURL = [self outputURL];

    // Create asset reader and asset writer
    assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&localError];
    success = (assetReader != nil);
    if (success)
    {
        //changed assetWriter = [[AVAssetWriter alloc] initWithURL:localOutputURL fileType:AVFileTypeQuickTimeMovie error:&localError];
        assetWriter = [[AVAssetWriter alloc] initWithURL:localOutputURL fileType:AVFileTypeMPEG4 error:&localError];
        success = (assetWriter != nil);
    }

    // Create asset reader outputs and asset writer inputs for the first audio track and first video track of the asset
    if (success)
    {
        AVAssetTrack *audioTrack = nil, *videoTrack = nil;
        // Grab first audio track and first video track, if the asset has them
        NSArray *audioTracks = [localAsset tracksWithMediaType:AVMediaTypeAudio];
        if ([audioTracks count] > 0)
            audioTrack = [audioTracks objectAtIndex:0];
        NSArray *videoTracks = [localAsset tracksWithMediaType:AVMediaTypeVideo];
        if ([videoTracks count] > 0)
            videoTrack = [videoTracks objectAtIndex:0];

        if (audioTrack)
        {
            // Decompress to Linear PCM with the asset reader
            NSDictionary *decompressionAudioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                        [NSNumber numberWithUnsignedInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                                        nil];
            AVAssetReaderOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:decompressionAudioSettings];
            [assetReader addOutput:output];

            AudioChannelLayout stereoChannelLayout = {
                .mChannelLayoutTag = kAudioChannelLayoutTag_Stereo,
                .mChannelBitmap = 0,
                .mNumberChannelDescriptions = 0
            };
            NSData *channelLayoutAsData = [NSData dataWithBytes:&stereoChannelLayout length:offsetof(AudioChannelLayout, mChannelDescriptions)];

            // Compress to 128kbps AAC with the asset writer
            NSDictionary *compressionAudioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                      [NSNumber numberWithUnsignedInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                                      [NSNumber numberWithInteger:128000], AVEncoderBitRateKey,
                                                      [NSNumber numberWithInteger:44100], AVSampleRateKey,
                                                      channelLayoutAsData, AVChannelLayoutKey,
                                                      [NSNumber numberWithUnsignedInteger:2], AVNumberOfChannelsKey,
                                                      nil];
            AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:[audioTrack mediaType] outputSettings:compressionAudioSettings];
            [assetWriter addInput:input];

            // Create and save an instance of RWSampleBufferChannel, which will coordinate the work of reading and writing sample buffers
            audioSampleBufferChannel = [[RWSampleBufferChannel alloc] initWithAssetReaderOutput:output assetWriterInput:input];
        }

        if (videoTrack)
        {
            // Decompress to ARGB with the asset reader
            NSDictionary *decompressionVideoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB], (id)kCVPixelBufferPixelFormatTypeKey,
                                                        [NSDictionary dictionary], (id)kCVPixelBufferIOSurfacePropertiesKey,
                                                        nil];
            AVAssetReaderOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:decompressionVideoSettings];
            [assetReader addOutput:output];

            // Get the format description of the track, to fill in attributes of the video stream that we don't want to change
            CMFormatDescriptionRef formatDescription = NULL;
            NSArray *formatDescriptions = [videoTrack formatDescriptions];
            if ([formatDescriptions count] > 0)
                formatDescription = (CMFormatDescriptionRef)[formatDescriptions objectAtIndex:0];

            // Grab track dimensions from format description
            CGSize trackDimensions = {
                .width = 0.0,
                .height = 0.0,
            };
            if (formatDescription)
                trackDimensions = CMVideoFormatDescriptionGetPresentationDimensions(formatDescription, false, false);
            else
                trackDimensions = [videoTrack naturalSize];

            // Grab clean aperture, pixel aspect ratio from format description
            // NOTE: compressionSettings is left nil here (the dictionary below is
            // commented out), so the setObject: calls further down are no-ops.
            NSMutableDictionary *compressionSettings = nil;
            // [NSMutableDictionary dictionaryWithObjectsAndKeys:
            //  AVVideoProfileLevelH264Baseline30, AVVideoProfileLevelKey,
            //  [NSNumber numberWithInt:960000], AVVideoAverageBitRateKey,
            //  [NSNumber numberWithInt:1], AVVideoMaxKeyFrameIntervalKey,
            //  nil];
            //NSDictionary *videoSettings = nil;
            if (formatDescription)
            {
                NSDictionary *cleanAperture = nil;
                NSDictionary *pixelAspectRatio = nil;
                CFDictionaryRef cleanApertureFromCMFormatDescription = CMFormatDescriptionGetExtension(formatDescription, kCMFormatDescriptionExtension_CleanAperture);
                if (cleanApertureFromCMFormatDescription)
                {
                    cleanAperture = [NSDictionary dictionaryWithObjectsAndKeys:
                                     CFDictionaryGetValue(cleanApertureFromCMFormatDescription, kCMFormatDescriptionKey_CleanApertureWidth), AVVideoCleanApertureWidthKey,
                                     CFDictionaryGetValue(cleanApertureFromCMFormatDescription, kCMFormatDescriptionKey_CleanApertureHeight), AVVideoCleanApertureHeightKey,
                                     CFDictionaryGetValue(cleanApertureFromCMFormatDescription, kCMFormatDescriptionKey_CleanApertureHorizontalOffset), AVVideoCleanApertureHorizontalOffsetKey,
                                     CFDictionaryGetValue(cleanApertureFromCMFormatDescription, kCMFormatDescriptionKey_CleanApertureVerticalOffset), AVVideoCleanApertureVerticalOffsetKey,
                                     nil];
                }
                CFDictionaryRef pixelAspectRatioFromCMFormatDescription = CMFormatDescriptionGetExtension(formatDescription, kCMFormatDescriptionExtension_PixelAspectRatio);
                if (pixelAspectRatioFromCMFormatDescription)
                {
                    pixelAspectRatio = [NSDictionary dictionaryWithObjectsAndKeys:
                                        CFDictionaryGetValue(pixelAspectRatioFromCMFormatDescription, kCMFormatDescriptionKey_PixelAspectRatioHorizontalSpacing), AVVideoPixelAspectRatioHorizontalSpacingKey,
                                        CFDictionaryGetValue(pixelAspectRatioFromCMFormatDescription, kCMFormatDescriptionKey_PixelAspectRatioVerticalSpacing), AVVideoPixelAspectRatioVerticalSpacingKey,
                                        nil];
                }
                if (cleanAperture || pixelAspectRatio)
                {
                    if (cleanAperture)
                        [compressionSettings setObject:cleanAperture forKey:AVVideoCleanApertureKey];
                    if (pixelAspectRatio)
                        [compressionSettings setObject:pixelAspectRatio forKey:AVVideoPixelAspectRatioKey];
                }
            }

            // Compress to H.264 with the asset writer
            NSMutableDictionary *videoSettings = [NSMutableDictionary dictionaryWithObjectsAndKeys:
                                                  AVVideoCodecH264, AVVideoCodecKey,
                                                  [NSNumber numberWithDouble:trackDimensions.width], AVVideoWidthKey,
                                                  [NSNumber numberWithDouble:trackDimensions.height], AVVideoHeightKey,
                                                  nil];
            if (compressionSettings)
                [videoSettings setObject:compressionSettings forKey:AVVideoCompressionPropertiesKey];
            AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:[videoTrack mediaType] outputSettings:videoSettings];
            [assetWriter addInput:input];

            // Create and save an instance of RWSampleBufferChannel, which will coordinate the work of reading and writing sample buffers
            videoSampleBufferChannel = [[RWSampleBufferChannel alloc] initWithAssetReaderOutput:output assetWriterInput:input];
        }
    }

    if (outError)
        *outError = localError;
    return success;
}
- (BOOL)startReadingAndWritingReturningError:(NSError **)outError
{
    BOOL success = YES;
    NSError *localError = nil;

    // Instruct the asset reader and asset writer to get ready to do work
    success = [assetReader startReading];
    if (!success)
        localError = [assetReader error];
    if (success)
    {
        success = [assetWriter startWriting];
        if (!success)
            localError = [assetWriter error];
    }

    if (success)
    {
        dispatch_group_t dispatchGroup = dispatch_group_create();
        // Start a sample-writing session
        [assetWriter startSessionAtSourceTime:[self timeRange].start];
        // Start reading and writing samples
        if (audioSampleBufferChannel)
        {
            // Only set audio delegate for audio-only assets, else let the video channel drive progress
            id <RWSampleBufferChannelDelegate> delegate = nil;
            if (!videoSampleBufferChannel)
                delegate = self;
            dispatch_group_enter(dispatchGroup);
            [audioSampleBufferChannel startWithDelegate:delegate completionHandler:^{
                dispatch_group_leave(dispatchGroup);
            }];
        }
        if (videoSampleBufferChannel)
        {
            dispatch_group_enter(dispatchGroup);
            [videoSampleBufferChannel startWithDelegate:self completionHandler:^{
                dispatch_group_leave(dispatchGroup);
            }];
        }
        // Set up a callback for when the sample writing is finished
        dispatch_group_notify(dispatchGroup, serializationQueue, ^{
            BOOL finalSuccess = YES;
            NSError *finalError = nil;
            if (cancelled)
            {
                [assetReader cancelReading];
                [assetWriter cancelWriting];
            }
            else
            {
                if ([assetReader status] == AVAssetReaderStatusFailed)
                {
                    finalSuccess = NO;
                    finalError = [assetReader error];
                }
                if (finalSuccess)
                {
                    finalSuccess = [assetWriter finishWriting];
                    if (!finalSuccess)
                        finalError = [assetWriter error];
                }
            }
            [self readingAndWritingDidFinishSuccessfully:finalSuccess withError:finalError];
        });
        dispatch_release(dispatchGroup);
    }

    if (outError)
        *outError = localError;
    return success;
}
- (void)cancel
{
    self.propgerssView = nil;
    // Dispatch cancellation tasks to the serialization queue to avoid races with setup and teardown
    dispatch_async(serializationQueue, ^{
        [audioSampleBufferChannel cancel];
        [videoSampleBufferChannel cancel];
        cancelled = YES;
    });
}

- (void)readingAndWritingDidFinishSuccessfully:(BOOL)success withError:(NSError *)error
{
    NSLog(@"%s[%d] - success = %d error = %@", __FUNCTION__, __LINE__, success, error);
    if (!success)
    {
        [assetReader cancelReading];
        [assetWriter cancelWriting];
    }
    // Tear down ivars
    [assetReader release];
    assetReader = nil;
    [assetWriter release];
    assetWriter = nil;
    [audioSampleBufferChannel release];
    audioSampleBufferChannel = nil;
    [videoSampleBufferChannel release];
    videoSampleBufferChannel = nil;
    cancelled = NO;
    // Dispatch UI-related tasks to the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!success)
        {
        }
        [self setWritingSamples:NO];
    });
}

static double progressOfSampleBufferInTimeRange(CMSampleBufferRef sampleBuffer, CMTimeRange timeRange)
{
    CMTime progressTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    progressTime = CMTimeSubtract(progressTime, timeRange.start);
    CMTime sampleDuration = CMSampleBufferGetDuration(sampleBuffer);
    if (CMTIME_IS_NUMERIC(sampleDuration))
        progressTime = CMTimeAdd(progressTime, sampleDuration);
    return CMTimeGetSeconds(progressTime) / CMTimeGetSeconds(timeRange.duration);
}

static void removeARGBColorComponentOfPixelBuffer(CVPixelBufferRef pixelBuffer, size_t componentIndex)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    static const size_t bytesPerPixel = 4; // constant for ARGB pixel format
    unsigned char *base = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    for (size_t row = 0; row < bufferHeight; ++row)
    {
        for (size_t column = 0; column < bufferWidth; ++column)
        {
            unsigned char *pixel = base + (row * bytesPerRow) + (column * bytesPerPixel);
            pixel[componentIndex] = 0;
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

+ (size_t)componentIndexFromFilterTag:(NSInteger)filterTag
{
    return (size_t)filterTag; // we set up the tags in the popup button to correspond directly with the index they modify
}
- (void)sampleBufferChannel:(RWSampleBufferChannel *)sampleBufferChannel didReadSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pixelBuffer = NULL;
    // Calculate progress (scale of 0.0 to 1.0)
    double progress = progressOfSampleBufferInTimeRange(sampleBuffer, [self timeRange]);
    NSLog(@"%s[%d] - progress = %f", __FUNCTION__, __LINE__, progress);
    // Grab the pixel buffer from the sample buffer, if possible
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer && (CFGetTypeID(imageBuffer) == CVPixelBufferGetTypeID()))
    {
        pixelBuffer = (CVPixelBufferRef)imageBuffer;
        if (filterTag >= 0) // -1 means "no filtering, please"
            removeARGBColorComponentOfPixelBuffer(pixelBuffer, [[self class] componentIndexFromFilterTag:filterTag]);
    }
}
@end
@interface RWSampleBufferChannel ()
- (void)callCompletionHandlerIfNecessary; // always called on the serialization queue
@end

@implementation RWSampleBufferChannel

- (id)initWithAssetReaderOutput:(AVAssetReaderOutput *)localAssetReaderOutput assetWriterInput:(AVAssetWriterInput *)localAssetWriterInput
{
    self = [super init];
    if (self)
    {
        assetReaderOutput = [localAssetReaderOutput retain];
        assetWriterInput = [localAssetWriterInput retain];
        finished = NO;
        NSString *serializationQueueDescription = [NSString stringWithFormat:@"%@ serialization queue", self];
        serializationQueue = dispatch_queue_create([serializationQueueDescription UTF8String], NULL);
    }
    return self;
}

- (void)dealloc
{
    [assetReaderOutput release];
    [assetWriterInput release];
    if (serializationQueue)
        dispatch_release(serializationQueue);
    [completionHandler release];
    [super dealloc];
}

- (NSString *)mediaType
{
    return [assetReaderOutput mediaType];
}

- (void)startWithDelegate:(id <RWSampleBufferChannelDelegate>)delegate completionHandler:(dispatch_block_t)localCompletionHandler
{
    completionHandler = [localCompletionHandler copy]; // released in -callCompletionHandlerIfNecessary
    [assetWriterInput requestMediaDataWhenReadyOnQueue:serializationQueue usingBlock:^{
        if (finished)
            return;
        BOOL completedOrFailed = NO;
        // Read samples in a loop as long as the asset writer input is ready
        while ([assetWriterInput isReadyForMoreMediaData] && !completedOrFailed)
        {
            CMSampleBufferRef sampleBuffer = [assetReaderOutput copyNextSampleBuffer];
            if (sampleBuffer != NULL)
            {
                if ([delegate respondsToSelector:@selector(sampleBufferChannel:didReadSampleBuffer:)])
                    [delegate sampleBufferChannel:self didReadSampleBuffer:sampleBuffer];
                BOOL success = [assetWriterInput appendSampleBuffer:sampleBuffer];
                CFRelease(sampleBuffer);
                sampleBuffer = NULL;
                completedOrFailed = !success;
            }
            else
            {
                completedOrFailed = YES;
            }
        }
        if (completedOrFailed)
            [self callCompletionHandlerIfNecessary];
    }];
}

- (void)cancel
{
    dispatch_async(serializationQueue, ^{
        [self callCompletionHandlerIfNecessary];
    });
}

- (void)callCompletionHandlerIfNecessary
{
    // Set state to mark that we no longer need to call the completion handler, grab the completion handler, and clear out the ivar
    BOOL oldFinished = finished;
    finished = YES;
    if (oldFinished == NO)
    {
        [assetWriterInput markAsFinished]; // let the asset writer know that we will not be appending any more samples to this input
        dispatch_block_t localCompletionHandler = [completionHandler retain];
        [completionHandler release];
        completionHandler = nil;
        if (localCompletionHandler)
        {
            localCompletionHandler();
            [localCompletionHandler release];
        }
    }
}
@end
It took a long while, but I ended up with a good solution that may help someone in the future.
My code:
- (void)compressVideo
{
    asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
    NSLog(@" %@", [AVAssetExportSession exportPresetsCompatibleWithAsset:asset]);
    NSLog(@" %@", exportSession.supportedFileTypes);
    NSLog(@"----------------------------------------- convert to mp4");
    NSLog(@" %@", exportSession.supportedFileTypes);
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = [self outputVideoPath:@"outPut" ext:@"mp4"];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        ICQLog(@" exportSession.status = %d exportSession.error = %@", exportSession.status, exportSession.error);
        if (exportSession && (exportSession.status == AVAssetExportSessionStatusCompleted))
        {
            ICQLog(@" exportSession.outputURL = %@", exportSession.outputURL);
            // we need to remove temporary files
            [[NSFileManager defaultManager] removeItemAtURL:videoUrl error:NULL];
            [videoUrl release];
            videoUrl = [exportSession.outputURL retain];
        }
        else
        {
            // TODO - report error
        }
        [exportSession release], exportSession = nil;
        [asset release], asset = nil;
    }];
}
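One detail worth making explicit (my addition, not part of the answer above): the exception mentioned in the question comes from assigning an output file type the session does not support, so it is safer to consult supportedFileTypes before assigning:
// Only assign AVFileTypeMPEG4 when the passthrough session supports it;
// assigning an unsupported type raises an exception.
if ([exportSession.supportedFileTypes containsObject:AVFileTypeMPEG4]) {
    exportSession.outputFileType = AVFileTypeMPEG4;
} else {
    // Fall back to QuickTime; some source encodings cannot be rewrapped into MP4.
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
}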
I can't help with the trans-wrap stuff; I haven't got my head into this.
Is the main priority to get the file output as .mp4 without having to reprocess it? If so, just use .mp4 as the file extension of the movie clip output by your code and this should work fine. I used this approach today and it works. I didn't have to convert from .mov to .mp4, because essentially an .mp4 file is the same as a .mov file with some additional standards-based functionality.
Hope this is of help.
This is the code I used.
- (BOOL)encodeVideo:(NSURL *)videoURL
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];

    // Create the composition and tracks
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    NSArray *assetVideoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (assetVideoTracks.count <= 0)
    {
        NSLog(@"Error reading the transformed video track");
        return NO;
    }

    // Insert the tracks in the composition's tracks
    AVAssetTrack *assetVideoTrack = [assetVideoTracks firstObject];
    [videoTrack insertTimeRange:assetVideoTrack.timeRange ofTrack:assetVideoTrack atTime:CMTimeMake(0, 1) error:nil];
    [videoTrack setPreferredTransform:assetVideoTrack.preferredTransform];
    AVAssetTrack *assetAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    [audioTrack insertTimeRange:assetAudioTrack.timeRange ofTrack:assetAudioTrack atTime:CMTimeMake(0, 1) error:nil];

    // Export to mp4
    NSString *mp4Quality = [MGPublic isIOSAbove:@"6.0"] ? AVAssetExportPresetMediumQuality : AVAssetExportPresetPassthrough;
    NSString *exportPath = [NSString stringWithFormat:@"%@/%@.mp4",
                            [NSHomeDirectory() stringByAppendingString:@"/tmp"],
                            [BSCommon uuidString]];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:mp4Quality];
    exportSession.outputURL = exportUrl;
    CMTime start = kCMTimeZero; // (was CMTimeMakeWithSeconds(0.0, 0), which yields an invalid CMTime)
    CMTimeRange range = CMTimeRangeMake(start, [asset duration]);
    exportSession.timeRange = range;
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([exportSession status])
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"MP4 Successful!");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                break;
        }
    }];
    return YES;
}
