I have a requirement where the user is allowed to trim an audio file before submitting it to the server. The trimming function works fine in iOS 6 but not in iOS 7.
This happens in iOS 7 when the user chooses a song from the iTunes library and starts trimming. The file appears to be trimmed, but the new file created after trimming plays only up to the trim point and the rest is blank, and its duration still shows the original song's duration. This doesn't happen for all files, only for some. I also checked exportable and hasProtectedContent; both have the correct values (exportable: YES, hasProtectedContent: NO). What could be the issue in iOS 7?
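For reference, this is roughly how I load and check those properties (a sketch; assetURL stands in for the iTunes library item's asset URL):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"exportable", @"hasProtectedContent", @"duration"]
                     completionHandler:^{
    NSError *error = nil;
    if ([asset statusOfValueForKey:@"exportable" error:&error] == AVKeyValueStatusLoaded)
    {
        // Both values look correct even for the problematic files.
        NSLog(@"exportable = %d, hasProtectedContent = %d",
              asset.exportable, asset.hasProtectedContent);
    }
}];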
I am pasting the audio trimming code below for reference.
- (void)trimAndExportAudio:(AVAsset *)avAsset withDuration:(NSInteger)durationInSeconds withStartTime:(NSInteger)startTime endTime:(NSInteger)endTime toFileName:(NSString *)filename withTrimCompleteBlock:(TrimCompleteBlock)trimCompleteBlock
{
    if (startTime < 0 || startTime > durationInSeconds || startTime >= endTime)
    {
        CGLog(@"start time = %ld endTime = %ld durationInSeconds = %ld", (long)startTime, (long)endTime, (long)durationInSeconds);
        trimCompleteBlock(NO, @"Invalid Start Time");
        return;
    }
    if (endTime > durationInSeconds)
    {
        CGLog(@"start time = %ld endTime = %ld durationInSeconds = %ld", (long)startTime, (long)endTime, (long)durationInSeconds);
        trimCompleteBlock(NO, @"Invalid End Time");
        return;
    }
    // create the export session
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:avAsset presetName:AVAssetExportPresetAppleM4A];
    if (exportSession == nil)
    {
        trimCompleteBlock(NO, @"Could not create an Export Session.");
        return;
    }

    // export file path
    NSError *removeError = nil;
    NSString *filePath = [[CGUtilities applicationLibraryMyRecordingsDirectory] stringByAppendingPathComponent:filename];
    if ([[NSFileManager defaultManager] fileExistsAtPath:filePath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:filePath error:&removeError];
    }
    if (removeError)
    {
        CGLog(@"Error removing existing file = %@", removeError);
    }

    // create trim time range
    CMTime exportStartTime = CMTimeMake(startTime, 1);
    CMTime exportStopTime = CMTimeMake(endTime, 1);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(exportStartTime, exportStopTime);

    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A;          // output file type
    exportSession.timeRange = exportTimeRange;                  // trim time range
    // perform the export
    __weak AVAssetExportSession *weakExportSession = exportSession;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status)
        {
            if (![filename isEqualToString:kLibraryTempFileName])
            {
                // created a new recording
            }
            trimCompleteBlock(YES, nil);
        }
        else if (AVAssetExportSessionStatusFailed == exportSession.status)
        {
            // a failure may happen because of an event out of your control
            // for example, an interruption like a phone call coming in
            // make sure to handle this case appropriately
            trimCompleteBlock(NO, weakExportSession.error.description);
        }
        else
        {
            trimCompleteBlock(NO, weakExportSession.error.description);
        }
    }];
}
Thanks
We can import <AVFoundation/AVFoundation.h> and use AVAssetExportSession to trim the file:
-(BOOL)trimAudiofile
{
    float audioStartTime; // define start time of audio
    float audioEndTime;   // define end time of audio
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
    NSString *libraryCachesDirectory = [paths objectAtIndex:0];
    libraryCachesDirectory = [libraryCachesDirectory stringByAppendingPathComponent:@"Caches"];
    NSString *OutputFilePath = [libraryCachesDirectory stringByAppendingFormat:@"/output_%@.mp4", [dateFormatter stringFromDate:[NSDate date]]];
    NSURL *audioFileOutput = [NSURL fileURLWithPath:OutputFilePath];
    NSURL *audioFileInput; // <path of the original audio file>
    if (!audioFileInput || !audioFileOutput)
    {
        return NO;
    }
    [[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];

    AVAsset *asset = [AVAsset assetWithURL:audioFileInput];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetAppleM4A];
    if (exportSession == nil)
    {
        return NO;
    }

    CMTime startTime = CMTimeMake((int)(floor(audioStartTime * 100)), 100);
    CMTime stopTime = CMTimeMake((int)(ceil(audioEndTime * 100)), 100);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

    exportSession.outputURL = audioFileOutput;
    exportSession.timeRange = exportTimeRange;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^
    {
        if (AVAssetExportSessionStatusCompleted == exportSession.status)
        {
            NSLog(@"Export OK");
        }
        else if (AVAssetExportSessionStatusFailed == exportSession.status)
        {
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
        }
    }];
    return YES;
}
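Since the reported symptom is that the trimmed file still shows the original duration, one quick sanity check is to reload the exported file and log its duration (a sketch; audioFileOutput is the output URL used above):

AVURLAsset *trimmedAsset = [AVURLAsset URLAssetWithURL:audioFileOutput options:nil];
NSLog(@"trimmed duration = %f seconds", CMTimeGetSeconds(trimmedAsset.duration));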
On a device running iOS 13, [exportSession exportAsynchronouslyWithCompletionHandler:] always fails with the message "The operation could not be completed" while converting a .MOV video to MP4. However, the same code runs fine on iOS versions prior to 13, e.g. iOS 12. I am pasting my complete method below.
- (void)encodeVideo:(NSString *)videoURL
{
    // Create the asset url with the video file
    AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoURL] options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];

    // Check if the video is supported for conversion or not
    if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality])
    {
        // Create the export session
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetLowQuality];

        // Create a temp path to save the converted video
        NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *myDocumentPath = [documentsDirectory stringByAppendingPathComponent:@"temp.mp4"];
        NSURL *url = [[NSURL alloc] initFileURLWithPath:myDocumentPath];

        // Check if the file already exists, then remove the previous file
        if ([[NSFileManager defaultManager] fileExistsAtPath:myDocumentPath])
        {
            [[NSFileManager defaultManager] removeItemAtPath:myDocumentPath error:nil];
        }

        exportSession.outputURL = url;
        // Set the output file format if you want to make it in another file format (e.g. .3gp)
        exportSession.outputFileType = AVFileTypeMPEG4;
        exportSession.shouldOptimizeForNetworkUse = YES;

        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch ([exportSession status])
            {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export session failed");
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                case AVAssetExportSessionStatusCompleted:
                {
                    // Video conversion finished
                    NSLog(@"Successful!");
                }
                    break;
                default:
                    break;
            }
        }];
    }
    else
    {
        NSLog(@"Video file not supported!");
    }
}
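To see more than "The operation could not be completed", it helps to log the full NSError in the failed case; something like this inside the completion handler (a sketch against the same exportSession):

if (exportSession.status == AVAssetExportSessionStatusFailed)
{
    // The localized description is generic; the underlying error usually
    // carries the real reason for the failure.
    NSLog(@"Export failed: %@", exportSession.error);
    NSLog(@"Underlying error: %@", exportSession.error.userInfo[NSUnderlyingErrorKey]);
}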
1. Create the folder:
let filePath = documentDirectory.appendingPathComponent("FolderName")
if !fileManager.fileExists(atPath: filePath.path) {
    do {
        try fileManager.createDirectory(atPath: filePath.path, withIntermediateDirectories: true, attributes: nil)
    } catch {
        print(error.localizedDescription)
        return nil
    }
}
2. Copy the picked file into the folder using security-scoped access:
let url = videoURL
let destinationURL = filePath.appendingPathComponent("filename.mp4")
url.startAccessingSecurityScopedResource()
do {
    try FileManager.default.copyItem(at: url, to: destinationURL)
} catch {
    Logging.Log.error("EncodeVideo failed \(error.localizedDescription)")
}
url.stopAccessingSecurityScopedResource()
Then start the MOV-to-MP4 conversion; now it is working.
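If the rest of the pipeline is Objective-C, as in the question, the same copy-before-export step would look roughly like this (pickedURL and destinationURL are placeholders for the picker's security-scoped URL and a file URL inside the app container):

// Copy the picked file into the app's own container before handing it to AVAssetExportSession.
BOOL accessing = [pickedURL startAccessingSecurityScopedResource];
NSError *copyError = nil;
[[NSFileManager defaultManager] copyItemAtURL:pickedURL toURL:destinationURL error:&copyError];
if (copyError)
{
    NSLog(@"Copy failed: %@", copyError);
}
if (accessing)
{
    [pickedURL stopAccessingSecurityScopedResource];
}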
I want to implement audio trimming in iOS. I want something like a two-way slider that the user can drag over a particular period of the recorded audio, and then I will save that portion of the audio, like the image below.
AVAssetExportSession - Exporting a Trimmed Audio Asset
The code below will do the job for you; please refer to this link for further details.
- (BOOL)exportAsset:(AVAsset *)avAsset toFilePath:(NSString *)filePath {
    // we need the audio asset to be at least 50 seconds long for this snippet
    CMTime assetTime = [avAsset duration];
    Float64 duration = CMTimeGetSeconds(assetTime);
    if (duration < 50.0) return NO;

    // get the first audio track
    NSArray *tracks = [avAsset tracksWithMediaType:AVMediaTypeAudio];
    if ([tracks count] == 0) return NO;
    AVAssetTrack *track = [tracks objectAtIndex:0];

    // create the export session
    // no need for a retain here, the session will be retained by the
    // completion handler since it is referenced there
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:avAsset
                                                                            presetName:AVAssetExportPresetAppleM4A];
    if (nil == exportSession) return NO;

    // create trim time range - 20 seconds starting from 30 seconds into the asset
    CMTime startTime = CMTimeMake(30, 1);
    CMTime stopTime = CMTimeMake(50, 1);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

    // create fade in time range - 10 seconds starting at the beginning of the trimmed asset
    CMTime startFadeInTime = startTime;
    CMTime endFadeInTime = CMTimeMake(40, 1);
    CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);

    // set up the audio mix
    AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                                                      timeRange:fadeInTimeRange];
    exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];

    // configure export session output with all our parameters
    exportSession.outputURL = [NSURL fileURLWithPath:filePath]; // output path
    exportSession.outputFileType = AVFileTypeAppleM4A;          // output file type
    exportSession.timeRange = exportTimeRange;                  // trim time range
    exportSession.audioMix = exportAudioMix;                    // fade in audio mix

    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control
            // for example, an interruption like a phone call coming in
            // make sure to handle this case appropriately
            NSLog(@"AVAssetExportSessionStatusFailed");
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
    return YES;
}
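A minimal way to call it, assuming the source audio sits in the Documents directory and is at least 50 seconds long as the snippet requires (file names are placeholders):

NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSURL *sourceURL = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"source.m4a"]]; // placeholder
NSString *outputPath = [documents stringByAppendingPathComponent:@"trimmed.m4a"];                    // placeholder
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
BOOL started = [self exportAsset:asset toFilePath:outputPath];
NSLog(@"Export started: %d", started);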
I want to trim audio recorded with EZAudioRecorder.
I am using the code below to trim the audio. It works fine for audio recorded with AVAudioRecorder, but with EZAudioRecorder it triggers the error block with the error "couldn't open file".
-(BOOL)trimAudiofile
{
    float audioStartTime = 1.0; // define start time of audio
    float audioEndTime = 2.0;   // define end time of audio
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
    NSString *libraryCachesDirectory = [paths objectAtIndex:0];
    libraryCachesDirectory = [libraryCachesDirectory stringByAppendingPathComponent:@"Caches"];
    NSString *OutputFilePath = [libraryCachesDirectory stringByAppendingFormat:@"/output_%@.m4a", [dateFormatter stringFromDate:[NSDate date]]];
    NSURL *audioFileOutput = [NSURL fileURLWithPath:OutputFilePath];
    NSURL *audioFileInput = [self testFilePathURL]; // <path of the original audio file>
    if (!audioFileInput || !audioFileOutput)
    {
        return NO;
    }
    [[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];

    AVAsset *asset = [AVAsset assetWithURL:audioFileInput];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetAppleM4A];
    if (exportSession == nil)
    {
        return NO;
    }

    CMTime startTime = CMTimeMake((int)(floor(audioStartTime * 100)), 100);
    CMTime stopTime = CMTimeMake((int)(ceil(audioEndTime * 100)), 100);
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

    exportSession.outputURL = audioFileOutput;
    exportSession.timeRange = exportTimeRange;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^
    {
        if (AVAssetExportSessionStatusCompleted == exportSession.status)
        {
            NSLog(@"Export OK");
        }
        else if (AVAssetExportSessionStatusFailed == exportSession.status)
        {
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
        }
    }];
    return YES;
}
Note: the audio file exists in the Documents directory, and EZAudioPlayer is able to play it.
Can anyone tell me where I am going wrong?
Any help on this will be appreciated.
Thanks in advance.
1. Check the file format in both cases: the audio recorded by AVAudioRecorder and the audio recorded by EZAudioRecorder.
2. You may be trying to export the file before the audio has finished recording successfully; you have to wait until the recording is complete.
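For point 2, with AVAudioRecorder that would mean only starting the trim from the finish callback; a minimal sketch (EZAudioRecorder signals completion differently, e.g. by closing the file as in the fix below):

// AVAudioRecorderDelegate: only export once the recording has been finalized.
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag
{
    if (flag)
    {
        [self trimAudiofile];
    }
}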
Thanks all, I found the solution to this issue!
I was not closing the audio file. I added this code before trimming the audio, and now everything works as expected.
We need to close the file even though it already exists in the Documents directory.
if (self.recorder)
{
    [self.recorder closeAudioFile];
}
Have a look at this GitHub issue.
I am using AVAssetExportSession for audio recording with an asset, and here is my code to export the AVAsset with an AVAssetExportSession.
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:self.asset presetName:AVAssetExportPresetAppleM4A];
exportSession.outputURL = [NSURL URLWithString:dataPath];
exportSession.outputFileType = AVFileTypeAppleM4A;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@".... Audio... %@", exportSession);
}];
It gives me output like this:
<AVAssetExportSession: 0x177f4b30, asset = <AVURLAsset: 0x18981f60, URL = file:///private/var/mobile/Containers/Data/Application/8BB39AD5-EEFB-4AF1-A913-B26C5C072E61/tmp/1422861622SCVideo.0.m4a>, presetName = AVAssetExportPresetAppleM4A, outputFileType = com.apple.m4a-audio
Here I just want to convert the URL to an NSString.
Please help me with this.
You can only get what you want after the export session has completed, because the export is an asynchronous operation.
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    } else if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"completed!");
        // here you can get the output url.
    }
}];
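Inside the completed branch, getting an NSString from the URL is then just a property access (path vs. absoluteString depending on whether you need a plain filesystem path or a file:// URL string):

NSString *outputURLString = exportSession.outputURL.absoluteString; // "file:///private/var/..."
NSString *outputFilePath  = exportSession.outputURL.path;           // "/private/var/..."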
There's a strange behaviour I've found when trying to merge videos with AVFoundation. I'm pretty sure I've made a mistake somewhere, but I'm too blind to see it. My goal is just to merge 4 videos (later there will be a crossfade transition between them).
Every time I try to export the video I get this error:
Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x7fd94073cc30 {NSLocalizedDescription=Cannot Decode, NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}
The strangest thing is that if I don't provide the AVAssetExportSession with an AVMutableVideoComposition, everything works fine! I can't understand what I'm doing wrong. The source videos are downloaded from YouTube and have the .mp4 extension. I can play them with MPMoviePlayerController. While checking the source code, please look carefully at the AVMutableVideoComposition.
I was testing this code in Xcode 6.0.1 on the iOS Simulator.
#import "VideoStitcher.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@implementation VideoStitcher
{
    VideoStitcherCompletionBlock _completionBlock;
    AVMutableComposition *_composition;
    AVMutableVideoComposition *_videoComposition;
}

- (instancetype)init
{
    self = [super init];
    if (self)
    {
        _composition = [AVMutableComposition composition];
        _videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}
- (void)compileVideoWithAssets:(NSArray *)assets completion:(VideoStitcherCompletionBlock)completion
{
    _completionBlock = [completion copy];

    if (assets == nil || assets.count < 2)
    {
        // We need at least two videos to make a stitch, right?
        NSAssert(NO, @"VideoStitcher: assets parameter is nil or has not enough items in it");
    }
    else
    {
        [self composeAssets:assets];
        if (_composition != nil) // if stitching went well and no errors were found
            [self exportComposition];
    }
}
- (void)composeAssets:(NSArray *)assets
{
    AVMutableCompositionTrack *compositionVideoTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *compositionError = nil;
    CMTime currentTime = kCMTimeZero;
    AVAsset *asset = nil;
    for (int i = (int)assets.count - 1; i >= 0; i--) // For some reason videos are compiled in reverse order. Find the bug later. 06.10.14
    {
        asset = assets[i];
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.timeRange.duration)
                                                      ofTrack:assetVideoTrack
                                                       atTime:currentTime
                                                        error:&compositionError];
        if (success)
        {
            CMTimeAdd(currentTime, asset.duration);
        }
        else
        {
            NSLog(@"VideoStitcher: something went wrong during inserting time range in composition");
            if (compositionError != nil)
            {
                NSLog(@"%@", compositionError);
                _completionBlock(nil, compositionError);
                _composition = nil;
                return;
            }
        }
    }

    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration);
    videoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
    _videoComposition.instructions = @[videoCompositionInstruction];
    _videoComposition.renderSize = [self calculateOptimalRenderSizeFromAssets:assets];
    _videoComposition.frameDuration = CMTimeMake(1, 600);
}
- (void)exportComposition
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:@"testVideo.mov"];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    NSString *filePath = [url path];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:filePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error: %@", filePath, error);
        }
    }

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_composition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _videoComposition;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        [self exportDidFinish:exporter];
    }];
}
- (void)exportDidFinish:(AVAssetExportSession *)session
{
    NSLog(@"%li", (long)session.status);
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSURL *outputURL = session.outputURL;
        // time to call delegate methods, but for testing purposes we save the video in the 'Photos' app
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error) {
                if (error == nil)
                {
                    NSLog(@"successfully saved video");
                }
                else
                {
                    NSLog(@"saving video failed.\n%@", error);
                }
            }];
        }
    }
    else if (session.status == AVAssetExportSessionStatusFailed)
    {
        NSLog(@"VideoStitcher: exporting failed.\n%@", session.error);
    }
}
- (CGSize)calculateOptimalRenderSizeFromAssets:(NSArray *)assets
{
    AVAsset *firstAsset = assets[0];
    AVAssetTrack *firstAssetVideoTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGFloat maxWidth = firstAssetVideoTrack.naturalSize.height;
    CGFloat maxHeight = firstAssetVideoTrack.naturalSize.width;
    for (AVAsset *asset in assets)
    {
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (assetVideoTrack.naturalSize.width > maxWidth)
            maxWidth = assetVideoTrack.naturalSize.width;
        if (assetVideoTrack.naturalSize.height > maxHeight)
            maxHeight = assetVideoTrack.naturalSize.height;
    }
    return CGSizeMake(maxWidth, maxHeight);
}

@end
Thank you for your attention. I am really tired; I've been trying to find the bug for four hours straight. I'll go to sleep now.
I've finally found the solution. The error description led me in the wrong direction: "Cannot Decode. The media data could not be decoded. It may be damaged." From that description you might think something is wrong with your video files. I spent 5 hours experimenting with formats, debugging, etc.
Well, THE ANSWER IS COMPLETELY DIFFERENT!
My mistake was that I forgot that CMTimeAdd() returns a value. I thought it changed the value of its first argument, and in the code you can see this:
CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    CMTimeAdd(currentTime, asset.duration); // HERE!! I don't actually increment the value! currentTime is always kCMTimeZero
}
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration); // And that's where everything breaks!
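The fix is simply to assign the result back:

currentTime = CMTimeAdd(currentTime, asset.duration); // CMTimeAdd returns the sum; it does not mutate its arguments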
The lesson I've learned: when working with AVFoundation, always check your time values! It's very important; otherwise you'll get a lot of bugs.
Error:
domain: "AVFoundationErrorDomain" - code: 18446744073709539816
Solution (Swift 5.5):
Stop running multiple AVPlayer instances on a background thread.