Memory Increases When Merging and Playing Audio - ios

I'm trying to merge multiple pieces of audio into one simultaneous sound and then play it. I can merge and play them, but my app's memory usage keeps increasing over time.
Looking online, it seems there are known ARC/memory issues with AVPlayer.
I've added all relevant code below for setting up the files, merging them and then playing them.
Setting up sound files
- (void) setUpSoundFiles {
    AVURLAsset *songAsset = nil;
    AVAssetTrack *sourceAudioTrack = nil;
    for (int i = 0; i < numberOfSoundsToMerge; i++) {
        NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:[@(i) stringValue] ofType:@"mp3"]];
        songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
        sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [songTracks insertObject:sourceAudioTrack atIndex:i];
        [songAssets insertObject:[AVURLAsset URLAssetWithURL:url options:nil] atIndex:i];
        //[sourceAudioTrack.asset cancelLoading];
    }
}
Merging Sound Files
- (void) mergeAudio:(AVMutableComposition *)composition {
    for (int i = 0; i < numberOfSoundsToMerge; i++) {
        AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        NSError *error = nil;
        BOOL ok = NO;
        CMTime startTime = CMTimeMakeWithSeconds(0, 1);
        CMTime trackDuration = ((AVURLAsset *)[songAssets objectAtIndex:i]).duration;
        CMTimeRange tRange = CMTimeRangeMake(startTime, trackDuration);

        // Set volume
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        [trackMix setVolume:0.8f atTime:startTime];
        [audioMixParams addObject:trackMix];

        // Insert audio into track
        ok = [track insertTimeRange:tRange ofTrack:(AVAssetTrack *)[songTracks objectAtIndex:i] atTime:CMTimeMake(0, 1) error:&error];
        [((AVAssetTrack *)[songTracks objectAtIndex:i]).asset cancelLoading];
    }
}
Playing Sound Files
- (void) playSounds {
    AVMutableComposition *composition = [AVMutableComposition composition];
    audioMixParams = [[NSMutableArray alloc] init];
    [self mergeAudio:composition];

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithArray:audioMixParams];

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    [playerItem setAudioMix:audioMix];
    [player play];
}
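One pattern worth trying (a sketch, not a confirmed fix for this exact leak): keep a single AVPlayer alive and swap items with `-replaceCurrentItemWithPlayerItem:` instead of allocating a new AVPlayer on every call, and drain temporaries with an autorelease pool. The `self.player` property is an assumption here, not part of the original code.

```objc
// Sketch: reuse one AVPlayer rather than creating a new one per -playSounds call.
// Assumes a strong property exists: @property (nonatomic, strong) AVPlayer *player;
- (void) playSounds {
    @autoreleasepool {
        AVMutableComposition *composition = [AVMutableComposition composition];
        audioMixParams = [[NSMutableArray alloc] init];
        [self mergeAudio:composition];

        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        audioMix.inputParameters = [NSArray arrayWithArray:audioMixParams];

        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
        [playerItem setAudioMix:audioMix];

        if (self.player == nil) {
            self.player = [AVPlayer playerWithPlayerItem:playerItem];
        } else {
            // Reuse the existing player instead of leaking one per call.
            [self.player replaceCurrentItemWithPlayerItem:playerItem];
        }
        [self.player play];
    }
}
```

Under ARC the locally created players should eventually be released, but a player that is actively playing is retained by the playback machinery, so creating one per call can make memory grow for the lifetime of each playback.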

Related

Build AVMutableComposition from AVURLAssets in loop

I'm working on an app that needs to concatenate a group of videos recorded from the camera. Ultimately I'll have an array of URLs to work with, but I can't figure out how to get two movie assets to concatenate properly. Here's some standalone code:
- (void)buildComposition {
    NSString *path1 = [[NSBundle mainBundle] pathForResource:@"IMG_1049" ofType:@"MOV"];
    NSString *path2 = [[NSBundle mainBundle] pathForResource:@"IMG_1431" ofType:@"MOV"];
    NSURL *url1 = [NSURL fileURLWithPath:path1];
    NSURL *url2 = [NSURL fileURLWithPath:path2];

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoCompositionInstruction *compositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    NSMutableArray *layerInstructions = [NSMutableArray array];
    CGSize renderSize = CGSizeZero;
    NSUInteger count = 0;

    for (NSURL *url in @[url1, url2]) {
        NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey: @(YES) };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
        CMTimeRange editRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(1.0, 600));
        NSError *error = nil;
        CMTime insertionTime = composition.duration;

        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = videoTracks.firstObject;
        AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoCompositionTrack insertTimeRange:editRange ofTrack:videoTrack atTime:insertionTime error:&error];

        if (count == 0) {
            AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            CGAffineTransform scale = CGAffineTransformMakeScale(0.6, 0.6);
            [layerInstruction setTransform:CGAffineTransformConcat(videoTrack.preferredTransform, scale) atTime:kCMTimeZero];
            [layerInstructions addObject:layerInstruction];
        }
        else {
            AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            CGAffineTransform scale = CGAffineTransformMakeScale(0.9, 0.9);
            [layerInstruction setTransform:CGAffineTransformConcat(videoTrack.preferredTransform, scale) atTime:kCMTimeZero];
            [layerInstructions addObject:layerInstruction];
        }

        // set the render size (CGRectMakeWithCGSize and CGSizeUnion are project helpers, not SDK functions)
        CGRect transformed = CGRectApplyAffineTransform(CGRectMakeWithCGSize(videoTrack.naturalSize), videoTrack.preferredTransform);
        renderSize = CGSizeUnion(renderSize, transformed.size);

        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        AVAssetTrack *audioTrack = audioTracks.firstObject;
        AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioCompositionTrack insertTimeRange:editRange ofTrack:audioTrack atTime:insertionTime error:&error];
        ++count;
    }

    // set the composition instructions
    compositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
    compositionInstruction.layerInstructions = layerInstructions;

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:composition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.instructions = @[compositionInstruction];
    videoComposition.renderSize = renderSize;

    // export the composition
    NSTimeInterval time = [NSDate timeIntervalSinceReferenceDate];
    NSString *filename = [[NSString stringWithFormat:@"video-export-%f", time] stringByAppendingPathExtension:@"mov"];
    NSString *pathTo = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/%@", filename]];
    NSURL *fileUrl = [NSURL fileURLWithPath:pathTo];

    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    assetExport.videoComposition = videoComposition;
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.shouldOptimizeForNetworkUse = YES;
    assetExport.outputURL = fileUrl;
    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (assetExport.status) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"\n\nFailed: %@\n\n", assetExport.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"\n\nCancelled: %@\n\n", assetExport.error);
                break;
            default:
                NSLog(@"\n\nExported: %@\n\n", fileUrl);
                break;
        }
    }];
}
What I expect to happen is the first video plays for 1 second at 60% scale, and then the second video plays for 1 second at 90% scale.
What actually happens is the first video plays at both 60% and 90% at the start of the video. After 1 second, the video goes black but the audio plays correctly.
Any ideas? Thanks!
Figured it out for anyone who is curious. In my layer instructions, I was mistakenly building them using the AVURLAsset's videoTrack, not the AVMutableComposition's compositionTrack!
This line:
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
Should be:
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

AVAsset "tracksWithMediaType" returns an empty array

This is the beginning of a method I am using to merge videos together:
- (void) mergeVideosAndAudio:(AVAsset *)audioAsset {
    // Load video assets
    NSError *error;
    NSArray *dirFiles;
    if ((dirFiles = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:[self documentsDirectory] error:&error]) == nil) {
        // handle the error
    }

    // find all the temp files
    NSArray *movFiles = [dirFiles filteredArrayUsingPredicate:[NSPredicate predicateWithFormat:@"self BEGINSWITH 'temp'"]];
    NSLog(@"There are %lu temp files", (unsigned long)movFiles.count);

    // Create assets array
    NSMutableArray *assets = [[NSMutableArray alloc] init];
    for (int i = 0; i < movFiles.count; i++) {
        NSString *videoURL = [[self documentsDirectory] stringByAppendingPathComponent:
                              [NSString stringWithFormat:@"temp%i.mov", i]];
        NSURL *url = [NSURL fileURLWithPath:videoURL];
        AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:url options:nil];
        [assets addObject:videoAsset];
    }
    NSLog(@"assets: %lu", (unsigned long)assets.count);

    // a second way
    for (id obj in assets)
        NSLog(@"obj: %@", obj);

    // Create the composition
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 1 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime videoTrackDuration = kCMTimeZero; // was uninitialized; CMTimeAdd on an uninitialized CMTime is undefined
    for (int j = 0; j < assets.count; j++) {
        AVURLAsset *currentAsset = assets[j];
        videoTrackDuration = CMTimeAdd(videoTrackDuration, currentAsset.duration);
        CMTime time;
        if (j == 0) {
            time = kCMTimeZero;
        } else {
            // note: this offsets only by the previous clip's duration, not the running total
            AVURLAsset *previousAsset = assets[j - 1];
            time = previousAsset.duration;
        }
        AVAssetTrack *assetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:assetTrack atTime:time error:nil];
    }
The problem I am having is that the array returned by tracksWithMediaType: for currentAsset is empty.
Here is the console output (the screenshot is not reproduced here).
Any help will be greatly appreciated.
Thanks
Have you seen this link?
I'm working on a solution now that KVO's the tracks key:
[item addObserver:self forKeyPath:kTracksKey options:opts context:nil];
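For reference, a minimal sketch of loading the `tracks` key asynchronously before calling `tracksWithMediaType:`, which is the block-based alternative to the KVO approach above (the `url` variable stands in for one of the temp-file URLs from the question):

```objc
// Sketch: ensure the asset's tracks are loaded before reading them.
// `url` is assumed to be a file URL to one of the temp .mov files.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // Safe to call -tracksWithMediaType: now.
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSLog(@"video tracks: %lu", (unsigned long)videoTracks.count);
    } else {
        NSLog(@"tracks failed to load: %@", error);
    }
}];
```

If the array is still empty after the keys report loaded, the file itself most likely has no video track (for example, a zero-byte or partially written temp file).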

AVPlayer from AVMutableComposition with audio and video won't play

I'm trying to show a video from a composition of both video and audio. However, I seem to have a problem: the player status never reaches AVPlayerStatusReadyToPlay.
If I add the video asset or the audio asset directly to the player item, it will play. So I know there is no problem with the assets themselves.
This is my code:
- (void) loadPlayer {
    NSURL *videoURL = **;
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    NSURL *audioURL = **;
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

    NSArray *keys = [NSArray arrayWithObject:@"duration"];
    [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
        NSError *error = nil;
        AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];
        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _videoDuration = videoAsset.duration;
                if (_audioDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];
    [audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {
        NSError *error = nil;
        AVKeyValueStatus durationStatus = [audioAsset statusOfValueForKey:@"duration" error:&error];
        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _audioDuration = audioAsset.duration;
                if (_videoDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];
}

- (void) loadPlayWithVideoAsset:(AVURLAsset *)videoAsset withDuration:(CMTime)videoDuration andAudioAsset:(AVURLAsset *)audioAsset withDuration:(CMTime)audioDuration {
    AVMutableComposition *composition = [AVMutableComposition composition];

    // Video
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    NSError *videoError = nil;
    if (![compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                                        ofTrack:videoTrack
                                         atTime:kCMTimeZero
                                          error:&videoError]) {
        NSLog(@"videoError: %@", videoError);
    }

    // Audio
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    NSError *audioError = nil;
    if (![compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                        ofTrack:audioTrack
                                         atTime:kCMTimeZero
                                          error:&audioError]) {
        NSLog(@"audioError: %@", audioError);
    }

    NSInteger compare = CMTimeCompare(videoDuration, audioDuration);
    if (compare == 1) {
        // The video is longer
        CMTime timeDiff = CMTimeSubtract(videoDuration, audioDuration);
        [compositionAudioTrack insertEmptyTimeRange:CMTimeRangeMake(audioDuration, timeDiff)];
    }
    else {
        CMTime timeDiff = CMTimeSubtract(audioDuration, videoDuration);
        [compositionVideoTrack insertEmptyTimeRange:CMTimeRangeMake(videoDuration, timeDiff)];
    }

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    self.mPlaybackView = [[AVPlayerPlaybackView alloc] initWithFrame:CGRectZero];
    [self.view addSubview:self.mPlaybackView];
    [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerPlaybackViewControllerStatusObservationContext];
}

- (void)observeValueForKeyPath:(NSString *)path ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.mPlayer.status == AVPlayerStatusReadyToPlay) {
        [self.mPlaybackView setPlayer:self.mPlayer];
        isReadyToPlay = YES;
        _playVideoBtn.hidden = NO;
    }
}

- (void) playVideo {
    if (YES || isReadyToPlay) {
        [self.mPlayer play];
    }
}
In my experience, AVPlayer works with an AVMutableComposition only if the resource/video is bundled with the app. If the video resource is on the network, AVPlayer won't play the AVMutableComposition, even though AVPlayerItem and AVPlayer report their status as "ready to play".

iOS - how to play sound file using avfoundation

I want to play a sound file (which I dragged into Xcode and copied into the project) using AVFoundation with the following code, but it fails.
I think the line NSURL *url = [[NSURL alloc] initWithString:@"sound.caf"]; is where it goes wrong, but I don't know any other way to instantiate an AVAsset with this sound file (of course, the problem could be somewhere else). Anyway, can someone offer me some help? Thanks.
AVMutableComposition *composition = [AVMutableComposition composition];
CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:trackID];

NSURL *url = [[NSURL alloc] initWithString:@"sound.caf"];
AVAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetTrack *assetTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

CMTime startTime = CMTimeMakeWithSeconds(0, 1);
CMTime endTime = songAsset.duration;
CMTimeRange tRange = CMTimeRangeMake(startTime, endTime);
NSError *error = nil;
[compositionTrack insertTimeRange:tRange ofTrack:assetTrack atTime:CMTimeMake(0, 44100) error:&error];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[self.player play];
This code should help you:
NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"myfile" ofType:@"wav"];
NSURL *soundURL = [NSURL fileURLWithPath:soundPath];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
if (error) {
    NSLog(@"%@", [error localizedDescription]);
}
[player play];
Try this:
NSString *shutterplayerPath =
    [[NSBundle mainBundle] pathForResource:@"shutter" ofType:@"mp3"];
NSString *tickplayerPath =
    [[NSBundle mainBundle] pathForResource:@"tick" ofType:@"wav"];
shutterAudioPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:[[NSURL alloc]
        initFileURLWithPath:shutterplayerPath] error:NULL];
tickAudioPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:[[NSURL alloc]
        initFileURLWithPath:tickplayerPath] error:NULL];
[shutterAudioPlayer play];
[tickAudioPlayer play];
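If you want to keep the original AVAsset/AVMutableComposition approach rather than switching to AVAudioPlayer, the likely culprit is the URL itself: `-initWithString:` with a bare file name does not produce a usable file URL for a bundled resource. A short sketch (assuming the file is `sound.caf` in the main bundle):

```objc
// Sketch: build a proper file URL for a bundled resource before creating the asset.
NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];
NSURL *url = [NSURL fileURLWithPath:soundPath]; // file URL, not -initWithString:@"sound.caf"
AVAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
// The rest of the composition code from the question should then find the audio track.
```

With an invalid URL, `tracksWithMediaType:` returns an empty array, so the `objectAtIndex:0` call in the question would also throw before playback ever starts.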

Merging NSData video files into one video file

I have a bunch of video files that I want to merge into one video file; I am using NSMutableData to achieve this:
NSMutableData *concatenatedData = [[NSMutableData alloc] init];
for (int i = 0; i < [videoArray count]; i++) {
    [concatenatedData appendData:[videoArray objectAtIndex:i]];
}
[concatenatedData writeToFile:[[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"outputConct.mov"] atomically:YES];
UISaveVideoAtPathToSavedPhotosAlbum([[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"outputConct.mov"], nil, nil, nil);
After the video is saved to my camera roll I try to play it, but only the first NSData video is in it, and I am not sure why.
edit
I tried AVMutableComposition, even then I am having the same issues
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVURLAsset *firstAsset;
AVMutableCompositionTrack *firstTrack;
CMTime time = kCMTimeZero;
for (int i = 0; i < [videoArray count]; i++) {
    firstAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:[[NSString alloc] initWithFormat:@"%@%d%@", NSTemporaryDirectory(), i, @"output.mov"]] options:nil];
    firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:time error:nil];
    time = firstAsset.duration;
}
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                        [NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
// 5 - Create exporter
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                  presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            //NSURL *outputURL = exporter.outputURL;
            UISaveVideoAtPathToSavedPhotosAlbum(myPathDocs, nil, nil, nil);
        }
    });
}];
edit
I also tried this, but it gives me an error: "[__NSArrayM insertObject:atIndex:]: object cannot be nil".
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
NSMutableArray *timeRanges = [NSMutableArray arrayWithCapacity:[videoArray count]];
NSMutableArray *tracks = [NSMutableArray arrayWithCapacity:[videoArray count]];
for (int i = 0; i < [videoArray count]; i++) {
    AVURLAsset *assetClip = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:[[NSString alloc] initWithFormat:@"%@%d%@", NSTemporaryDirectory(), i, @"output.mov"]] options:nil];
    AVAssetTrack *clipVideoTrackB = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:clipVideoTrackB];
}
[compositionTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:&error];
This is the line where the program crashes:
[compositionTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:&error];
I am not sure why, as both timeRanges and tracks have values in them.
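The "object cannot be nil" message suggests that at least one asset has no video track, so `clipVideoTrackB` comes back nil and the crash actually happens at `[tracks addObject:clipVideoTrackB]` inside the loop. A hedged sketch of guarding against that, using the same file-naming scheme as the question:

```objc
// Sketch: skip assets that expose no video track instead of adding nil to the array.
for (int i = 0; i < [videoArray count]; i++) {
    NSString *path = [NSString stringWithFormat:@"%@%d%@", NSTemporaryDirectory(), i, @"output.mov"];
    AVURLAsset *assetClip = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path] options:nil];
    NSArray *videoTracks = [assetClip tracksWithMediaType:AVMediaTypeVideo];
    if (videoTracks.count == 0) {
        NSLog(@"no video track in %@ - skipping", path); // likely the source of the nil object
        continue;
    }
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:videoTracks[0]];
}
```

Logging which file has no tracks should also reveal whether one of the temp files was written incompletely, which would explain the earlier NSData experiment producing a file with only the first clip playable.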
You can merge multiple videos into a single video, appending one after the other, using the AVFoundation classes AVURLAsset, AVMutableComposition, AVMutableCompositionTrack, etc.
You can check this: Append/merge multiple video files into one final output video
There are nice tutorials as well:
1. Tutorial 1
2. Tutorial 2
Hope it helps you.
