I am trying to export an audio file from the iPod library. If the start value is 0.0, the export works just fine. If the start value is greater than 0.0, the audio file is just empty: the status is completed, but the file size is 0 KB. Can someone please give me a solution, or is this another Apple bug?
float vocalStartMarker = startValue;
float vocalEndMarker = endValue;
NSURL *audioFileInput = url;
NSString *voicePath = [path stringByAppendingPathComponent:@"audio"];
NSString *uniquePath = [mainViewController uniqueFilename:voicePath withExtension:@"caf"];
NSURL *audioFileOutput = [NSURL fileURLWithPath:uniquePath];
if (!audioFileInput || !audioFileOutput)
{
return NO;
}
[[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioFileInput options:nil];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
presetName:AVAssetExportPresetAppleM4A];
if (exportSession == nil)
{
return NO;
}
CMTime startTime = CMTimeMake((int)(floor(vocalStartMarker * 100)), 100);
CMTime stopTime = CMTimeMake((int)(ceil(vocalEndMarker * 100)), 100);
CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);
exportSession.outputURL = audioFileOutput;
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.timeRange = exportTimeRange;
[exportSession exportAsynchronouslyWithCompletionHandler:^
{
if (AVAssetExportSessionStatusCompleted == exportSession.status)
{
NSLog(#"it worked:%#",uniquePath);
// It worked!
[music replaceObjectAtIndex:currentPage withObject:uniquePath];
}
else if (AVAssetExportSessionStatusFailed == exportSession.status)
{
NSLog(#"it failed");
// It failed...
}
}];
[self.popoverController dismissPopoverAnimated:YES];
self.popoverController=nil;
return YES;
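For reference, here is a diagnostic sketch (not a fix) I can add right after creating the export session, just to confirm the source asset allows export at all; exportable and hasProtectedContent are standard AVAsset properties, and DRM-protected iPod-library items typically cannot be exported:
// Diagnostic sketch: a protected or non-exportable asset would explain an empty output file.
NSLog(@"exportable: %d, protected: %d", asset.exportable, asset.hasProtectedContent);
// Worth logging even when the status is completed but the file is 0 KB:
NSLog(@"export error: %@", exportSession.error);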
I have the following code to fix the transform of a video:
- (AVVideoComposition *)squareVideoCompositionFor:(AVAsset *)asset {
AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
CGFloat length = MAX(track.naturalSize.width, track.naturalSize.height);
CGSize size = track.naturalSize;
CGFloat scale = 0;
CGAffineTransform transform = track.preferredTransform;
if (transform.a == 0 && transform.b == 1 && transform.c == -1 && transform.d == 0) {
scale = -1;
}
else if (transform.a == 0 && transform.b == -1 && transform.c == 1 && transform.d == 0) {
scale = -1;
}
else if (transform.a == 1 && transform.b == 0 && transform.c == 0 && transform.d == 1) {
scale = 1;
}
else if (transform.a == -1 && transform.b == 0 && transform.c == 0 && transform.d == -1) {
scale = -1;
}
transform = CGAffineTransformTranslate(transform, scale * -(size.width - length) / 2, scale * -(size.height - length) / 2);
AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
[transformer setTransform:transform atTime:kCMTimeZero];
// CGAffineTransform finalTransform = t2;
// [transformer setTransform:finalTransform atTime:kCMTimeZero];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, kCMTimePositiveInfinity);
instruction.layerInstructions = @[transformer];
AVMutableVideoComposition *composition = [AVMutableVideoComposition videoComposition];
composition.frameDuration = CMTimeMake(1, 30);
composition.renderSize = CGSizeMake(length, length);
composition.instructions = @[instruction];
composition.renderScale = 1.0;
return composition;
}
And the following code to mute the audio:
- (AVMutableComposition *) removeAudioFromVideoFileFor:(AVAsset *)asset {
AVMutableComposition *composition_Mix = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition_Mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
BOOL ok = NO;
AVAssetTrack * sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTimeRange x = CMTimeRangeMake(kCMTimeZero, [asset duration]);
NSError *error;
ok = [compositionVideoTrack insertTimeRange:x ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error];
return composition_Mix;
}
Here is how I call the functions:
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVMutableComposition *composition = [self removeAudioFromVideoFileFor:asset];
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
session.videoComposition = [self squareVideoCompositionFor:asset];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = true;
session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
But it throws an error if I use both the composition and [self squareVideoCompositionFor:asset]:
Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I omit one of them it works fine, meaning one AVAssetExportSession can either mute the audio or square the video, but not both.
Is there a way I can achieve both in a single export pass of AVAssetExportSession?
Your code looks good, but I have made a few changes to get it working.
The inputURL and outputURL should be prefixed with either file:// or https:// (since they are URLs; in your case they should start with file://).
If your URL is not valid, you will not get the desired output.
//FOR OUTPUT URL
NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
//the output video will be written to file final.mp4
NSURL *outputURL = [NSURL fileURLWithPath:path];
outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
NSLog(@"outputURL = %@", outputURL);
//FOR INPUT URL
//This is the path of the bundle resource that is going to be used
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
NSLog(@"inputURL = %@", inputURL);
Export the composition
//this will export the composition with the specified configuration
[session exportAsynchronouslyWithCompletionHandler:^{
NSLog(#"Success");
}];
When you see the "Success" log in the console, check the document directory of your application. The video will be written at outputURL.
NOTE: Use CMD + SHIFT + G in Finder and paste the outputURL. You will be redirected to the document folder of your app (simulator only). For a device, you need to download the app container and inspect the package contents.
Complete code
The removeAudioFromVideoFileFor: and squareVideoCompositionFor: methods look good; you just need to change the following.
Here "video" is the name of the resource file in the app bundle.
- (void)viewDidLoad {
[super viewDidLoad];
NSString *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
path = [path stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
NSURL *outputURL = [NSURL fileURLWithPath:path];
outputURL = [outputURL URLByAppendingPathComponent:@"final.mp4"];
NSLog(@"outputURL = %@", outputURL);
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
NSLog(@"inputURL = %@", inputURL);
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVMutableComposition *composition = [self removeAudioFromVideoFileFor: asset];
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
session.videoComposition = [self squareVideoCompositionFor:asset];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = true;
session.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
[session exportAsynchronouslyWithCompletionHandler:^{
NSLog(#"Success:");
}];
}
Hope it helps.
I am trying to create a square video using AVCaptureSession, and I can capture video successfully. The issue is orientation: if my device is in portrait mode when I capture a video, its orientation is recorded correctly, but if my device is in landscape when I capture, I want to change the video's orientation to portrait. I use the following code to crop a video after capture:
-(void)cropView:(NSURL*)outputfile
{
AVAsset *asset = [AVAsset assetWithURL:outputfile];
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize =CGSizeMake(clipVideoTrack.naturalSize.height, clipVideoTrack.naturalSize.height);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
// rotate to portrait
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - clipVideoTrack.naturalSize.height) /2 );
CGAffineTransform t2 = CGAffineTransformRotate(t1, M_PI_2);
CGAffineTransform finalTransform = t2;
[transformer setTransform:finalTransform atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
NSString *outputPath = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"video.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:outputPath];
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
//Export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
exporter.videoComposition = videoComposition;
exporter.outputURL = exportUrl;
exporter.outputFileType = AVFileTypeMPEG4;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
//Call when finished
[self exportDidFinish:exporter];
});
}];
}
I fixed the issue using the following code and steps:
First, my device orientation is locked and my app only supports portrait orientation, so the interface always reports portrait even when I capture video in landscape mode. So I use Core Motion to detect the actual device orientation with the following code:
#import <CoreMotion/CoreMotion.h>
@interface ViewController ()
{
AVCaptureVideoOrientation orientationLast, orientationAfterProcess;
CMMotionManager *motionManager;
}
@implementation ViewController
- (void)initializeMotionManager{
motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = .2;
motionManager.gyroUpdateInterval = .2;
[motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
if (!error) {
[self outputAccelerationData:accelerometerData.acceleration];
}
else{
NSLog(#"%#", error);
}
}];
}
- (void)outputAccelerationData:(CMAcceleration)acceleration{
AVCaptureVideoOrientation orientationNew;
if (acceleration.x >= 0.75) {
orientationNew = AVCaptureVideoOrientationLandscapeLeft;
}
else if (acceleration.x <= -0.75) {
orientationNew =AVCaptureVideoOrientationLandscapeRight;
}
else if (acceleration.y <= -0.75) {
orientationNew =AVCaptureVideoOrientationPortrait;
}
else if (acceleration.y >= 0.75) {
orientationNew =AVCaptureVideoOrientationPortraitUpsideDown;
}
else {
// Consider same as last time
return;
}
if (orientationNew == orientationLast)
return;
orientationLast = orientationNew;
}
Based on the device rotation, orientationLast tracks the current device orientation. After that, when I tap the button to record video, I set the AVCaptureConnection orientation:
AVCaptureConnection *CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
if ([CaptureConnection isVideoOrientationSupported])
{
[CaptureConnection setVideoOrientation:orientationLast];
}
Now, after capturing the video, I crop it with the following code, and that works perfectly:
-(void)cropView:(NSURL*)outputfile
{
AVAsset *asset = [AVAsset assetWithURL:outputfile];
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30));
CGSize videoSize = [[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
float scaleFactor;
if (videoSize.width > videoSize.height) {
scaleFactor = videoSize.height/320;
}
else if (videoSize.width == videoSize.height){
scaleFactor = videoSize.height/320;
}
else{
scaleFactor = videoSize.width/320;
}
CGFloat cropOffX = 0;
CGFloat cropOffY = 0;
CGFloat cropWidth = 320 *scaleFactor;
CGFloat cropHeight = 320 *scaleFactor;
videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];
CGAffineTransform t1 = CGAffineTransformIdentity;
CGAffineTransform t2 = CGAffineTransformIdentity;
switch (videoOrientation) {
case UIImageOrientationUp:
t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY );
t2 = CGAffineTransformRotate(t1, M_PI_2 );
break;
case UIImageOrientationDown:
t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY ); // not fixed width is the real height in upside down
t2 = CGAffineTransformRotate(t1, - M_PI_2 );
break;
case UIImageOrientationRight:
t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY );
t2 = CGAffineTransformRotate(t1, 0 );
break;
case UIImageOrientationLeft:
t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY );
t2 = CGAffineTransformRotate(t1, M_PI );
break;
default:
NSLog(#"no supported orientation has been found in this video");
break;
}
CGAffineTransform finalTransform = t2;
[transformer setTransform:finalTransform atTime:kCMTimeZero];
//add the transformer layer instructions, then add to video composition
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
NSString *outputPath = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"video.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:outputPath];
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
//Export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
exporter.videoComposition = videoComposition;
exporter.outputURL = exportUrl;
exporter.outputFileType = AVFileTypeMPEG4;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
dispatch_async(dispatch_get_main_queue(), ^{
//Call when finished
[self exportDidFinish:exporter];
});
}];
}
- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize size = [videoTrack naturalSize];
CGAffineTransform txf = [videoTrack preferredTransform];
if (size.width == txf.tx && size.height == txf.ty)
return UIImageOrientationLeft; //return UIInterfaceOrientationLandscapeLeft;
else if (txf.tx == 0 && txf.ty == 0)
return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight;
else if (txf.tx == 0 && txf.ty == size.width)
return UIImageOrientationDown; //return UIInterfaceOrientationPortraitUpsideDown;
else
return UIImageOrientationUp; //return UIInterfaceOrientationPortrait;
}
I am working on a video-making app.
I need to record a video in the first view and then display it in a second view. For recording the video I followed this tutorial.
I have made some changes as per my needs in the didFinishRecordingToOutputFileAtURL: method.
Here is my updated method.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
NSLog(#"didFinishRecordingToOutputFileAtURL - enter");
BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
// A problem occurred: Find out if the recording was successful.
id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
if (value)
{
RecordedSuccessfully = [value boolValue];
}
}
else {
NSLog(#"didFinishRecordingToOutputFileAtURL error:%#",error);
}
if (RecordedSuccessfully)
{
//----- RECORDED SUCESSFULLY -----
NSLog(#"didFinishRecordingToOutputFileAtURL - success");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
{
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *track = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVAsset *asset = [AVAsset assetWithURL:outputFileURL];
[track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:CMTimeMake(0, 1) error:nil];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:#"%#%d.mov",NSBundle.mainBundle.infoDictionary[#"CFBundleExecutable"],++videoCounter]];
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:track];
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
BOOL isVideoAssetPortrait_ = NO;
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationRight;
isVideoAssetPortrait_ = YES;
if ([[[NSUserDefaults standardUserDefaults] stringForKey:@"orientation"] isEqualToString:@"landscape"]) {
videoAssetOrientation_ = UIImageOrientationUp;
}
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
videoAssetOrientation_ = UIImageOrientationLeft;
isVideoAssetPortrait_ = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
videoAssetOrientation_ = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
videoAssetOrientation_ = UIImageOrientationDown;
}
CGSize naturalSize;
if(isVideoAssetPortrait_){
naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
} else {
naturalSize = videoAssetTrack.naturalSize;
}
float renderWidth, renderHeight;
if (![self.ratioLabel.text isEqualToString:@"16:9"]) {
renderWidth = naturalSize.width;
renderHeight = naturalSize.width;
NSLog(#"Video:: width=%f height=%f",naturalSize.width,naturalSize.height);
}
else {
renderWidth = naturalSize.width;
renderHeight = naturalSize.height;
NSLog(#"Video:: width=%f height=%f",naturalSize.width,naturalSize.height);
}
if (![self.ratioLabel.text isEqualToString:@"16:9"])
{
CGAffineTransform t1 = CGAffineTransformMakeTranslation(videoAssetTrack.naturalSize.height, -(videoAssetTrack.naturalSize.width - videoAssetTrack.naturalSize.height) /2);
CGAffineTransform t2 = CGAffineTransformRotate(t1, M_PI_2);
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
}
else
{
CGAffineTransform t2 = CGAffineTransformMakeRotation( M_PI_2);
[layerInstruction setTransform:t2 atTime:kCMTimeZero];
}
AVCaptureDevicePosition position = [[VideoInputDevice device] position];
if (position == AVCaptureDevicePositionFront)
{
/* For front camera only */
CGAffineTransform t = CGAffineTransformMakeScale(-1.0f, 1.0f);
t = CGAffineTransformTranslate(t, -videoAssetTrack.naturalSize.width, 0);
t = CGAffineTransformRotate(t, (DEGREES_TO_RADIANS(90.0)));
t = CGAffineTransformTranslate(t, 0.0f, -videoAssetTrack.naturalSize.width);
[layerInstruction setTransform:t atTime:kCMTimeZero];
/* For front camera only */
}
[layerInstruction setOpacity:0.0 atTime:asset.duration];
instruction.layerInstructions = [NSArray arrayWithObjects:layerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:instruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
AVAssetExportSession *exporter;
exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
exporter.videoComposition = mainCompositionInst;
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
self.doneButton.userInteractionEnabled = YES;
if(videoAddr==nil)
{
videoAddr = [[NSMutableArray alloc] init];
}
[videoAddr addObject:exporter.outputURL];
[[PreviewLayer connection] setEnabled:YES];
AVAsset *asset = [AVAsset assetWithURL:exporter.outputURL];
NSLog(#"remaining seconds before:%f",lastSecond);
double assetDuration = CMTimeGetSeconds(asset.duration);
if (assetDuration>3.0)
assetDuration = 3.0;
lastSecond = lastSecond- assetDuration;
NSLog(#"remaining seconds after:%f",lastSecond);
self.secondsLabel.text = [NSString stringWithFormat:#"%0.1fs",lastSecond];
self.secondsLabel.hidden = NO;
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:videoAddr];
[[NSUserDefaults standardUserDefaults] setObject:data forKey:@"videoAddr"];
[[NSUserDefaults standardUserDefaults] synchronize];
videoURL = outputFileURL;
flagAutorotate = NO;
self.cancelButton.hidden = self.doneButton.hidden = NO;
imgCancel.hidden = imgDone.hidden = NO;
if ([[NSUserDefaults standardUserDefaults] boolForKey:@"Vibration"])
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
[[UIApplication sharedApplication] endIgnoringInteractionEvents];
});
}];
}
else {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:[NSString stringWithFormat:@"Video can not be saved\nPlease free some storage space"] delegate:self cancelButtonTitle:nil otherButtonTitles:nil, nil];
[alert show];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[alert dismissWithClickedButtonIndex:0 animated:YES];
});
}
}
}
But here is the issue: the video is not recorded exactly as it is shown in the preview.
See these two screenshots:
Video recording preview
Video playing view
The reason is that your iPad screen's aspect ratio is not the same as the camera's aspect ratio.
You can modify the camera preview size by setting the videoGravity property of AVCaptureVideoPreviewLayer,
which controls how the content is displayed relative to the layer bounds:
layer.videoGravity = AVLayerVideoGravityResizeAspect;
But in that case the preview won't be fullscreen.
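If you would rather keep the preview fullscreen at the cost of cropping its edges, a minimal sketch (assuming layer is the same AVCaptureVideoPreviewLayer as above) is to use the aspect-fill gravity instead:
// Fills the layer bounds; content that does not fit is cropped,
// so the preview stays fullscreen but shows slightly less than the sensor captures.
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;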
If you want the recorded video to have the same aspect ratio as the fullscreen preview, you will have to crop it. The cropping process is explained here:
Exporting AVCaptureSession video in a size that matches the preview layer
Video capture with 1:1 aspect ratio in iOS
I have trimmed an audio file to a particular duration using AVAssetExportSession, and I am getting the trimmed audio.
But my problem is that I want to add fade-in and fade-out effects to my audio.
Let me know how I can solve this.
Any help will be appreciated.
The code for trimming the audio is here:
- (void)trimAudio:(NSString *)inputAudioPath audioStartTime:(float)sTime audioEndTime:(float)eTime outputPath:(NSString *)outputFilePath mode:(NSInteger)kSelectionMode
{
@try
{
AVAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:inputAudioPath] options:nil];
//Create the session with assets
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
//Set the output url
exportSession.outputURL = [NSURL fileURLWithPath:outputFilePath];
// Trim video in a particular duration
CMTime startTime = CMTimeMake((int)(floor(sTime * 100)), 100);
CMTime stopTime = CMTimeMake((int)(ceil(eTime * 100)), 100);
CMTimeRange range = CMTimeRangeFromTimeToTime(startTime, stopTime);
exportSession.timeRange = range;
exportSession.outputFileType = AVFileTypeAppleM4A;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:{
NSLog(#"Export Complete");
if(UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(exportSession.outputURL.path)){
UISaveVideoAtPathToSavedPhotosAlbum(exportSession.outputURL.path, nil, nil, nil);
if ([self.delegate respondsToSelector:@selector(trimDidSucceed:mode:)]) {
[self.delegate trimDidSucceed:outputFilePath mode:kTrimAudio];
}
else{
[self.delegate trimDidFail:exportSession.error];
}
}
break;
}
case AVAssetExportSessionStatusFailed:{
NSLog(#"Export Error: %#", [exportSession.error description]);
break;
}
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export Cancelled");
break;
default:
break;
}
}];
exportSession = nil;
}
@catch (NSException * e)
{
NSLog(#"Exception Name:%# Reason:%#",[e name],[e reason]);
}
}
//fade in /out
AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
AVMutableAudioMixInputParameters *exportAudioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:asset.tracks.lastObject];
int start=2,length=3;
[exportAudioMixInputParameters setVolume:0.0 atTime:CMTimeMakeWithSeconds(start-1, 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds(start, 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds(start+1, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds(start+2, 1)];
[exportAudioMixInputParameters setVolume:1.0 atTime:CMTimeMakeWithSeconds((start+length-2), 1)];
[exportAudioMixInputParameters setVolume:0.5 atTime:CMTimeMakeWithSeconds((start+length-1), 1)];
[exportAudioMixInputParameters setVolume:0.1 atTime:CMTimeMakeWithSeconds((start+length), 1)];
exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];
exportSession.audioMix = exportAudioMix; // fade in audio mix
Just add these few lines before [exportSession exportAsynchronouslyWithCompletionHandler:...].
Alternatively, you can use an NSTimer while playing the audio: when playback reaches a particular second, increase the volume step by step, and when it reaches the peak second, decrease it back to the initial volume.
Check this link for reference.
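A rough sketch of that timer-based approach (assuming self.player is an AVAudioPlayer that is already playing and self.fadeTimer is an NSTimer property; the 0.1 interval and step are illustrative values, not from the original answer):
// Start a repeating timer when the fade-out should begin.
self.fadeTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                  target:self
                                                selector:@selector(fadeOutStep:)
                                                userInfo:nil
                                                 repeats:YES];

- (void)fadeOutStep:(NSTimer *)timer {
    if (self.player.volume > 0.1) {
        self.player.volume -= 0.1; // lower the volume a little on each tick
    } else {
        self.player.volume = 0.0;
        [timer invalidate];        // fade complete
    }
}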
I am trying to split a video into 4-second chunks with AVAssetExportSession. The initial split works and returns an 8 MB, 4-second chunk, but the second returns 12 MB, which seems incorrect when the original video is only 18 MB.
- (void) splitVideo{
AVURLAsset *vidAsset = [AVURLAsset URLAssetWithURL:output options:nil];
CMTime duration = vidAsset.duration;
NSLog(#"File size is : %.2f MB And Duration: %f",(float)[NSData dataWithContentsOfURL:output].length/1024.0f/1024.0f, CMTimeGetSeconds(duration));
splitArray = [[NSMutableArray alloc]init];
CMTime end = CMTimeMake(4, 1);
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, end);
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output0.mp4"];
totalSeconds = 4.0f;
[self cutVideo:output withRange:range withOutput:outputPath];
}
- (void) cutVideo:(NSURL *)url withRange:(CMTimeRange)range withOutput:(NSString*)path{
AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
NSURL *finalUrl = [NSURL fileURLWithPath:path];
[[NSFileManager defaultManager] removeItemAtURL:finalUrl error:NULL];
exportSession.outputURL = finalUrl;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.timeRange = range;
NSLog(#"start: %f end: %f", CMTimeGetSeconds(range.start), CMTimeGetSeconds(range.duration));
[exportSession exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
});
if ([exportSession status] == AVAssetExportSessionStatusCompleted){
NSData *videoData = [[NSData alloc]initWithContentsOfURL:exportSession.outputURL];
NSLog(#"DL: %f", (float)videoData.length/1024.0f/1024.0f);
[self makeFile:finalUrl];
AVURLAsset *fullVid = [AVURLAsset URLAssetWithURL:output options:nil];
CMTime start = CMTimeMake(totalSeconds, 1);
totalSeconds = totalSeconds + 4.0f;
CMTime end;
if ((CMTimeGetSeconds(start) + 4) > CMTimeGetSeconds(fullVid.duration)) {
end = fullVid.duration;
}else{
end = CMTimeMake(CMTimeGetSeconds(start) + 4, 1);
}
CMTimeRange range2 = CMTimeRangeMake(start, end);
NSLog(#"%f < %f\n\n", CMTimeGetSeconds(start), CMTimeGetSeconds(fullVid.duration));
if (CMTimeGetSeconds(start) < CMTimeGetSeconds(fullVid.duration)) {
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"output%lu.mp4", splitArray.count]];
[self cutVideo:output withRange:range2 withOutput:outputPath];
}else{
[self saveVideo:true];
}
}else if ([exportSession status] == AVAssetExportSessionStatusFailed){
NSLog(#"Export failed: %#", [[exportSession error] localizedDescription]);
}else if ([exportSession status] == AVAssetExportSessionStatusCancelled){
NSLog(#"Export canceled");
}
}];
}
}
File size is : 18.86 MB And Duration: 9.171667
first
start: 0.000000 end: 4.000000
DL: 8.194733
4.000000 < 9.171667
second
start: 4.000000 end: 8.000000
DL: 12.784523
It's not incorrect: video encoders store changes from the previous frame, not just a set of "images". I guess your video has more changes between frames in the second chunk, which is why it takes more space.
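If you want to see this for yourself, a small check (assuming chunkURL points at one of your exported chunk files) is to compare the estimated data rate of each chunk's video track; the encoder simply spends more bits where the picture changes more:
AVURLAsset *chunk = [AVURLAsset URLAssetWithURL:chunkURL options:nil];
AVAssetTrack *videoTrack = [chunk tracksWithMediaType:AVMediaTypeVideo].firstObject;
// estimatedDataRate is in bits per second and will differ per chunk.
NSLog(@"estimated data rate: %.0f bits/s", videoTrack.estimatedDataRate);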