No sound when I play an AVMutableComposition with AVPlayer - iOS

I'm trying to create an AVMutableComposition and play it using AVPlayer. This is what I'm doing:
//Define the AVComposition
self.composition = [[AVMutableComposition alloc] init];
//Define Mutable Track
self.mainVideoTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//Define the asset
AVAsset *asset = [AVAsset assetWithURL:url];
//Insert the asset in the track
[self.mainVideoTrack insertTimeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:[self startingTimeForSegment:videoSegment] error:&error];
//Define player item with the composition
self.playerItem = [[AVPlayerItem alloc] initWithAsset:self.composition];
//Define the player, player layer & add the layer
self.avPlayer = [[AVPlayer alloc]initWithPlayerItem:self.playerItem];
self.layerVideo = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
self.layerVideo.frame = self.previewContainer.bounds;
[self.previewContainer.layer addSublayer:self.layerVideo];
However, I hear no sound from the player layer. If I initialize the player item directly with the asset, without using an AVMutableComposition, the sound plays. What am I doing wrong? I'm stuck on this.

It looks like you are only adding one track to your AVMutableComposition. You need to add the audio track from the asset as well. This can be done exactly the same way as adding the video track.
self.mainAudioTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
Then you can insert from your asset's audio track. This assumes you want your audio to start at the same point your video does.
[self.mainAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:[self startingTimeForSegment:videoSegment] error:&error];
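One defensive detail: tracksWithMediaType: returns an empty array when the asset carries no audio, so it is safer to guard the lookup before indexing into it. A minimal sketch, reusing the helper from the question:
//Only insert audio if the asset actually has an audio track
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
if (audioTracks.count > 0) {
[self.mainAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[audioTracks firstObject] atTime:[self startingTimeForSegment:videoSegment] error:&error];
}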

Related

MP3 Queue Player - Load in background thread?

I have an AVQueuePlayer that is used to play a list of MP3 songs from the internet (over HTTP). I also need to know which song is currently playing. The current problem is that loading a song causes a delay that blocks the main thread while waiting for the song to load (for the first song, and again for each subsequent song once the previous one has completed playback).
The following code blocks the main thread:
queuePlayer = [[AVQueuePlayer alloc] init];
[queuePlayer insertItem: [AVPlayerItem playerItemWithURL:url] afterItem: nil]; // etc.
[queuePlayer play];
I am looking for a way to create a playlist of MP3s where the next file to be played back is preloaded in the background.
I tried the following code:
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
for (NSString* trackName in tracks)
{
AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:[NSURL URLWithString:trackName]
options:nil];
AVMutableCompositionTrack* audioTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
NSError* error;
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:[_composition duration]
error:&error];
if (error)
{
NSLog(@"%@", [error localizedDescription]);
}
// Store the track IDs as track name -> track ID
[_audioMixTrackIDs setValue:[NSNumber numberWithInteger:audioTrack.trackID]
forKey:trackName];
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:_composition];
_player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[_player play];
The issue with this is that I am not sure how to detect when the next song starts playing. Also, the docs don't specify whether this will pre-load the MP3 files.
I am looking for a solution that:
Plays MP3s, pre-loading them in the background prior to playback (ideally, loading of the next song starts before the current song finishes, so it is ready for immediate playback once the current song ends)
Lets me see which song is currently playing.
AVFoundation has some classes designed to do exactly what you're looking for.
It looks like your current solution is to build a single AVPlayerItem that concatenates all of the MP3 files that you want to play. A better solution is to create an AVQueuePlayer with an array of the AVPlayerItem objects that you want to play.
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
NSMutableArray *playerItems = [[NSMutableArray alloc] init];
for (NSString* trackName in tracks)
{
NSURL *assetURL = [NSURL URLWithString:trackName];
if (!assetURL) {
continue;
}
AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:assetURL
options:nil];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
[playerItems addObject:playerItem];
}
_player = [[AVQueuePlayer alloc] initWithItems:playerItems];
[_player play];
In answer to your final wrap-up questions:
Yes, AVQueuePlayer DOES preload the next item in the playlist while it's playing the current one.
You can access the currentItem property to determine which AVPlayerItem is currently playing.
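If you need an explicit notification when the queue advances, one option is to key-value observe currentItem. A minimal sketch, assuming _player is the AVQueuePlayer created above:
//Register once, e.g. right after creating the player:
//[_player addObserver:self forKeyPath:@"currentItem" options:NSKeyValueObservingOptionNew context:NULL];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
if ([keyPath isEqualToString:@"currentItem"]) {
//The items were built from AVURLAssets above, so the asset URL identifies the song
AVPlayerItem *item = [(AVQueuePlayer *)object currentItem];
NSLog(@"Now playing: %@", [(AVURLAsset *)item.asset URL]);
}
}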

AVMutableComposition - scaleTimeRange causing high bit-rate video to stutter on fast motion

I am using scaleTimeRange:toDuration: to produce a fast-motion effect of up to 10x the original video speed. However, I have noticed that videos with a higher bit-rate (say 20 Mbit/s and above) begin to stutter when played through an AVPlayer at anything above 4x normal speed, enough to make the AVPlayerLayer fail (disappear) if it runs for a while.
The code:
//initialize the player
self.player = [[AVPlayer alloc] init];
//load up the asset
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[[NSBundle mainBundle] URLForResource:@"sample-video" withExtension:@"mov"] options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"playable", @"hasProtectedContent", @"tracks"] completionHandler:
^{
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// composition set to play at 60fps
videoComposition.frameDuration = CMTimeMake(1,60);
//add video track to composition
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//1080p composition
CGSize renderSize = CGSizeMake(1920.0, 1080.0);
CMTime currentTime = kCMTimeZero;
CGFloat scale = 1.0;
AVAssetTrack *assetTrack = nil;
assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:currentTime error:nil];
CMTimeRange scaleTimeRange = CMTimeRangeMake(currentTime, asset.duration);
//Speed it up to 8x.
CMTime scaledDuration = CMTimeMultiplyByFloat64(asset.duration,1.0/8.0);
[videoTrack scaleTimeRange:scaleTimeRange toDuration:scaledDuration];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
//ensure video is scaled up/down to match the composition
scale = renderSize.width/assetTrack.naturalSize.width;
[layerInstruction setTransform:CGAffineTransformMakeScale(scale, scale) atTime:currentTime];
AVMutableVideoCompositionInstruction *videoInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
videoInstruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[videoInstruction];
videoComposition.renderSize = renderSize;
//pass the stuff to AVPlayer for playback
self.playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.playerItem.videoComposition = videoComposition;
[self.player replaceCurrentItemWithPlayerItem:self.playerItem];
//playerView is a custom view with AVPlayerLayer, picked up from https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html#//apple_ref/doc/uid/TP40010188-CH3-SW11
[self.playerView setPlayer:self.player];
//call [self.player play] when ready.
}];
Some notes:
All testing done on iPhone 6
I am deliberately not adding any audio tracks to rule out the possibility of audio playing a part here.
Normal-bitrate videos (16 Mbit/s on average) play fine at 10x
The same composition code produces smooth playback in an OS X application
The stutter is more obvious with higher bitrates.
All videos being tested are 1080p 60fps
A high-bitrate video behaves well if it is first re-exported at 1080p, which tones down the bit-rate while maintaining the FPS.
There is no rendering/exporting of video involved.
Has anyone else run into this and found a way around it?

iOS AVFoundation audio/video out of sync

The Problem:
During every playback, the audio is 1–2 seconds behind the video.
The Setup:
The assets are loaded with AVURLAssets from a media stream.
To write the composition, I'm using AVMutableCompositions and AVMutableCompositionTracks with asymmetric timescales. The audio and video are both streamed to the device. The timescale for audio is 44100; the timescale for video is 600.
The playback is done with AVPlayer.
Attempted Solutions:
Using videoAssetTrack.timeRange for [composition insertTimeRange].
Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration);
Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration);
The Code:
+(AVMutableComposition*)overlayAudio:(AVURLAsset*)audioAsset
withVideo:(AVURLAsset*)videoAsset
{
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVAssetTrack* audioTrack = [self getTrackFromAsset:audioAsset withMediaType:AVMediaTypeAudio];
AVAssetTrack* videoTrack = [self getTrackFromAsset:videoAsset withMediaType:AVMediaTypeVideo];
CMTime duration = videoTrack.timeRange.duration;
AVMutableCompositionTrack* audioComposition = [self composeTrack:audioTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeAudio];
AVMutableCompositionTrack* videoComposition = [self composeTrack:videoTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeVideo];
[self makeAssertionAgainstAudio:audioComposition andVideo:videoComposition];
return mixComposition;
}
+(AVAssetTrack*)getTrackFromAsset:(AVURLAsset*)asset withMediaType:(NSString*)mediaType
{
return [[asset tracksWithMediaType:mediaType] objectAtIndex:0];
}
+(AVAssetExportSession*)configureExportSessionWithAsset:(AVMutableComposition*)composition toUrl:(NSURL*)url
{
AVAssetExportSession* exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputFileType = @"com.apple.quicktime-movie";
exportSession.outputURL = url;
exportSession.shouldOptimizeForNetworkUse = YES;
return exportSession;
}
-(IBAction)playVideo
{
[avPlayer pause];
avPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
avPlayer = [[AVPlayer alloc]initWithPlayerItem:avPlayerItem];
avPlayerLayer =[AVPlayerLayer playerLayerWithPlayer:avPlayer];
[avPlayerLayer setFrame:CGRectMake(0, 0, 305, 283)];
[avPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[playerView.layer addSublayer:avPlayerLayer];
[avPlayer seekToTime:kCMTimeZero];
[avPlayer play];
}
Comments:
I don't understand much of the AVFoundation framework. It is entirely probable that I am simply misusing the snippets I have provided. (i.e. why "insertTimeRange" for composition?)
I can provide any other information needed for resolution - including debug asset track property values, network telemetry, streaming information, etc.
If the delay is consistent, it appears to be an enforced delay so that the audio can be sampled properly.
Apple's guides are usually easier to read than their accompanying books; here is the specific note on the delay:
https://developer.apple.com/library/ios/technotes/tn2258/_index.html
The programming guides detail the why and the what.
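A minimal sketch of the insertion approach the question's "Attempted Solutions" hint at, assuming audioComposition and videoComposition are the AVMutableCompositionTracks from the overlayAudio:withVideo: method above: derive a single CMTimeRange from the video track itself and use it for both insertions, so the audio and video edits cover exactly the same span.
//Use one range, based on the video track's own timeRange, for both tracks
CMTimeRange commonRange = CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration);
NSError* error = nil;
[videoComposition insertTimeRange:commonRange ofTrack:videoTrack atTime:kCMTimeZero error:&error];
[audioComposition insertTimeRange:commonRange ofTrack:audioTrack atTime:kCMTimeZero error:&error];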

Reduce AVPlayer Video Memory Usage

We have a video player where we play videos inside an AVPlayer (about 1GB of content, in .mov files of roughly 8MB each). We load the AVPlayer using an AVMutableComposition of a video track and an audio track that are on local disk, bundled with the app.
We do something like:
AVAsset* videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
AVAsset* voiceAsset = useVoice ? [[AVURLAsset alloc] initWithURL:voiceUrl options:nil] : nil;
AVMutableComposition* composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack* videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack* audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack* voiceTrack = useVoice ? [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid] : nil;
NSError* error = nil;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
if ([videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) {
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
}
if (useVoice) {
[voiceTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, voiceAsset.duration) ofTrack:[[voiceAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
}
And we load each subsequent item using replaceCurrentItemWithPlayerItem: (except for the first one).
[self.player replaceCurrentItemWithPlayerItem:nextItem];
We never create a playlist, and there is no going back; we simply replace the item when a new video needs to be played.
What we're noticing is that VM Tracker shows our dirty size going through the roof. Once we play the first 8MB file, we approach about 80MB of dirty memory. As we replace more and more videos, the dirty size easily reaches 200MB+. Within about 20-30 videos the app is usually killed, and we get a low-memory crash log.
Is there something special we should be doing to reduce AVPlayer's memory usage as we replace clips in the player?
I have found that setting:
[someAssetWriterInput setExpectsMediaDataInRealTime:NO];
…has some effect on the memory pressure experienced during AVComposition-oriented export sessions; it would appear to be at least one way to govern internal memory usage within the framework.
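For context, that flag lives on AVAssetWriterInput. A minimal sketch of where it would be set, assuming an AVAssetWriter-based export (assetWriter and videoOutputSettings here are placeholders):
//Configure a writer input for an offline (non-real-time) export
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoOutputSettings];
//NO lets the writer throttle and buffer input rather than keep everything resident for real-time delivery
writerInput.expectsMediaDataInRealTime = NO;
[assetWriter addInput:writerInput];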
self.player?.pause()
self.activityIndicator?.startAnimating()
self.removeObserverPlayerItem()
let item = AVPlayerItem(url: fileurl)
player?.replaceCurrentItem(with: item)
self.addObserverPlayerItem()
self.player?.play()
This keeps the memory under control and takes only the memory that is needed. This resolved my problem.

Black frames in AVMutableComposition

This question is closely related to AVMutableComposition - Blank/Black frame between videos assets, but since I am not using an AVAssetExportSession, the answers don't fit my problem.
I'm using an AVMutableComposition to create a video composition, and I'm reading it using an AVAssetReader (I need the frame data, so I can't use an AVPlayer), but I often get black frames between my video chunks (there is no noticeable glitch in the audio).
I create my composition as follows:
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSMutableArray* durationList = [NSMutableArray array];
NSMutableArray* videoList= [NSMutableArray array];
NSMutableArray* audioList= [NSMutableArray array];
for (NSInteger i = 0; i < [clips count]; i++)
{
AVURLAsset *myasset = [clips objectAtIndex:i];
AVAssetTrack *clipVideoTrack = [[myasset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoList addObject:clipVideoTrack];
AVAssetTrack *clipAudioTrack = [[myasset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioList addObject:clipAudioTrack];
CMTime clipDuration = [myasset duration];
CMTimeRange clipRange = CMTimeRangeMake(kCMTimeZero, clipDuration);
[durationList addObject:[NSValue valueWithCMTimeRange:clipRange]];
}
[compositionVideoTrack insertTimeRanges:durationList ofTracks:videoList atTime:kCMTimeZero error:nil];
[compositionAudioTrack insertTimeRanges:durationList ofTracks:audioList atTime:kCMTimeZero error:nil];
I also tried inserting each track into the composition manually, but I observe the same phenomenon.
Thanks
I managed to solve this problem after many hours of testing. There are two things I needed to change.
1) Add the 'precise' option when creating the asset.
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
AVURLAsset *videoAsset3 = [AVURLAsset URLAssetWithURL:clip3URL options:options];
2) Don't use
CMTime duration3 = [videoAsset3 duration];
use instead
AVAssetTrack *videoAsset3Track = [[videoAsset3 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTime duration3 = videoAsset3Track.timeRange.duration;
I found this out after I set the AVPlayer background color to blue: I then noticed blue frames appearing, so the problem had to do with timing. Once I made the changes above, the different videos lined up fine when using:
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration3)
ofTrack:videoAsset3Track
atTime:kCMTimeZero error:&error];
Try creating two video tracks, and then alternate between the two when adding the clips.
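A rough sketch of that two-track idea (identifiers are placeholders; error handling omitted): even-numbered clips go on one track, odd-numbered clips on the other, so each edit can butt up against the next without rounding gaps.
AVMutableCompositionTrack *trackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *trackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime cursor = kCMTimeZero;
for (NSUInteger i = 0; i < [clips count]; i++) {
AVAssetTrack *clipTrack = [[[clips objectAtIndex:i] tracksWithMediaType:AVMediaTypeVideo] firstObject];
//Alternate the destination track per clip
AVMutableCompositionTrack *target = (i % 2 == 0) ? trackA : trackB;
[target insertTimeRange:CMTimeRangeMake(kCMTimeZero, clipTrack.timeRange.duration) ofTrack:clipTrack atTime:cursor error:nil];
cursor = CMTimeAdd(cursor, clipTrack.timeRange.duration);
}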
Calculating the last CMTime is very important when calling insertTimeRange: for a new AVAsset.
I had the same problem and solved it with:
func addChunk(media: AVAsset) throws {
let duration = self.mutableComposition.tracks.last?.timeRange.end
try mutableComposition.insertTimeRange(CMTimeRange(start: .zero, duration: media.duration), of: media, at: duration ?? .zero)
}
