I've created an AVMutableComposition that consists of a bunch of audio tracks that start at specific times. From there, following Apple's recommendations, I turned it into an AVComposition before playing it with AVPlayer.
Playing this AVPlayer item works fine, but if I pause it and then continue, all the tracks in the composition appear to slip back about 0.2 seconds relative to each other (i.e., they bunch up). Hitting pause and continuing several times compounds the effect and the overlap becomes more significant (basically, if I do it enough times, I end up with all 8 tracks playing simultaneously).
if (self.player.rate > 0.0) {
    // if the player is already playing, pause it
    [self.player pause];
} else {
    if (self.player) {
        [self.player play];
        return;
    }

    /* CODE CREATING COMPOSITION - missed out a big chunk of code relating to
       finding the track and retrieving its position and scale */
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                         forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:url options:options];

    // calculate times
    NSNumber *time = [soundArray1 objectAtIndex:1]; // this is the time scale - e.g. 96 or 120 etc.
    double timenow = [time doubleValue];
    double insertTime = (240 * y);

    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                  preferredTrackID:kCMPersistentTrackID_Invalid];

    // insert the audio track from the asset into the track added to the mutable composition
    AVAssetTrack *myTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    CMTimeRange myTrackRange = myTrack.timeRange;
    NSError *error = nil;
    [track insertTimeRange:myTrackRange
                   ofTrack:myTrack
                    atTime:CMTimeMake(insertTime, timenow)
                     error:&error];
    [sourceAsset release];
    }
}
AVComposition *immutableSnapshotOfMyComposition = [composition copy];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:immutableSnapshotOfMyComposition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
NSLog(@"here");
[self.player play];
Thanks
OK, this feels a little hacky, but it definitely works if anybody is stuck. If someone has a better answer, do let me know!
Basically, I just save the player.currentTime of the track when I hit pause and remake the track when I hit play, starting from the point at which I paused it. There's no discernible delay, but I'd still be happier not wasting the extra processing.
Make sure you properly release your player item after you hit pause, otherwise you'll end up with a giant stack of AVPlayers!
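A rough sketch of what that looks like (pausedTime and the -playerItemStartingAtTime: helper that rebuilds the composition are hypothetical names, not code from this answer):
// On pause: remember the position, then drop the player entirely.
self.pausedTime = self.player.currentTime;
[self.player pause];
self.player = nil; // don't let old AVPlayers pile up

// On play: rebuild the composition-backed item so playback starts at pausedTime.
AVPlayerItem *item = [self playerItemStartingAtTime:self.pausedTime];
self.player = [[AVPlayer alloc] initWithPlayerItem:item];
[self.player play];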
I have a solution that is a bit less hacky but still hacky.
It comes from noticing that if you seek on the player, the latency between audio and video introduced by pausing disappears.
Hence: save player.currentTime just before pausing, then call seekToTime with that saved time just before playing again. It works pretty well on iOS 6; I haven't tested on other versions yet.
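In code, the idea is roughly this (a sketch; the zero tolerances request an exact seek):
// Just before pausing:
CMTime resumeTime = self.player.currentTime;
[self.player pause];

// Just before playing again:
[self.player seekToTime:resumeTime
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero];
[self.player play];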
Related
I have an AVQueuePlayer that is used to play a list of MP3 songs from the internet (http). I also need to know which song is currently playing. The current problem is that loading a song causes a delay that blocks the main thread while waiting for it to load (for the first song as well as for subsequent songs once the previous one has finished playing).
The following code blocks the main thread:
queuePlayer = [[AVQueuePlayer alloc] init];
[queuePlayer insertItem: [AVPlayerItem playerItemWithURL:url] afterItem: nil]; // etc.
[queuePlayer play];
I am looking for a way to create a playlist of MP3s where the next file to be played back is preloaded in the background.
I tried the following code:
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
for (NSString* trackName in tracks)
{
AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:[NSURL URLWithString:trackName]
options:nil];
AVMutableCompositionTrack* audioTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
NSError* error;
[audioTrack insertTimeRange:CMTimeRangeMake([_composition duration], audioAsset.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio]objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
if (error)
{
NSLog(@"%@", [error localizedDescription]);
}
// Store the track IDs as track name -> track ID
[_audioMixTrackIDs setValue:[NSNumber numberWithInteger:audioTrack.trackID]
forKey:trackName];
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:_composition]; // play the whole composition
_player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
[_player play];
The issue with this is that I am not sure how to detect when the next song starts playing. Also, the docs don't specify whether this will pre-load the MP3 files.
I am looking for a solution that:
Plays MP3s by pre-loading them in the background prior to playback (ideally, loading of the next song starts before the current one finishes, so it is ready for immediate playback)
Lets me see which song is currently playing.
AVFoundation has some classes designed to do exactly what you're looking for.
It looks like your current solution is to build a single AVPlayerItem that concatenates all of the MP3 files that you want to play. A better solution is to create an AVQueuePlayer with an array of the AVPlayerItem objects that you want to play.
NSArray* tracks = [NSArray arrayWithObjects:@"http://example.com/song1.mp3", @"http://example.com/song2.mp3", @"http://example.com/song3.mp3", nil];
NSMutableArray *playerItems = [[NSMutableArray alloc] init];
for (NSString* trackName in tracks)
{
NSURL *assetURL = [NSURL URLWithString:trackName];
if (!assetURL) {
continue;
}
AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:assetURL
options:nil];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
[playerItems addObject:playerItem];
}
_player = [[AVQueuePlayer alloc] initWithItems:playerItems];
[_player play];
In answer to your final wrap-up questions:
Yes, AVQueuePlayer DOES preload the next item in the playlist while it's playing the current one.
You can access the currentItem property to determine which AVPlayerItem is currently playing.
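To detect when the queue advances to the next song, one straightforward option (a sketch, not part of the original answer) is to key-value observe the player's currentItem:
static void *CurrentItemContext = &CurrentItemContext;

// After creating the AVQueuePlayer:
[_player addObserver:self
          forKeyPath:@"currentItem"
             options:NSKeyValueObservingOptionNew
             context:CurrentItemContext];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context != CurrentItemContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    // currentItem changes each time the queue moves on to the next track.
    AVURLAsset *asset = (AVURLAsset *)_player.currentItem.asset;
    NSLog(@"Now playing: %@", asset.URL); // map the URL back to your track name
}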
[Edit: I was able to figure out a workaround for this, see below.]
I'm trying to stream multiple remote MP4 clips from S3, playing them in a sequence as one continuous video (to enable scrubbing within and between clips) with no stuttering, without explicitly downloading them to the device first. However, I find that the clips buffer very slowly (even on a fast network connection) and have been unable to find an adequate way to address that.
I've been trying to use AVPlayer for this, since AVPlayer with AVMutableComposition plays the supplied video tracks as one continuous track (unlike AVQueuePlayer, which I gather plays each video separately and thus doesn't support continuous scrubbing between the clips).
When I stick one of the assets directly into an AVPlayerItem and play that (with no AVMutableComposition), it buffers fast. But using AVMutableComposition, the video starts stuttering very badly on the second clip (my test case has 6 clips, each around 6 seconds), while the audio keeps going. After it plays through once, it plays perfectly smoothly if I rewind to the beginning, so I assume the problem lies in the buffering.
My current attempt to fix this feels convoluted, given that this seems like a rather basic use case for AVPlayer; I do hope there's a simpler solution that works properly. Somehow I doubt that the buffering player I use below is really necessary, but I'm running out of ideas.
Here's the main code that sets up the AVMutableComposition:
// Build an AVAsset for each of the source URIs
- (void)prepareAssetsForSources:(NSArray *)sources
{
NSMutableArray *assets = [[NSMutableArray alloc] init]; // the assets to be used in the AVMutableComposition
NSMutableArray *offsets = [[NSMutableArray alloc] init]; // for tracking buffering progress
CMTime currentOffset = kCMTimeZero;
for (NSDictionary* source in sources) {
bool isNetwork = [RCTConvert BOOL:[source objectForKey:@"isNetwork"]];
bool isAsset = [RCTConvert BOOL:[source objectForKey:@"isAsset"]];
NSString *uri = [source objectForKey:@"uri"];
NSString *type = [source objectForKey:@"type"];
NSURL *url = isNetwork ?
[NSURL URLWithString:uri] :
[[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:uri ofType:type]];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
currentOffset = CMTimeAdd(currentOffset, asset.duration);
[assets addObject:asset];
[offsets addObject:[NSNumber numberWithFloat:CMTimeGetSeconds(currentOffset)]];
}
_clipAssets = assets;
_clipEndOffsets = offsets;
}
// Called with _clipAssets
- (AVPlayerItem*)playerItemForAssets:(NSMutableArray *)assets
{
AVMutableComposition* composition = [AVMutableComposition composition];
for (AVAsset* asset in assets) {
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
NSError *editError;
[composition insertTimeRange:editRange
ofAsset:asset
atTime:composition.duration
error:&editError];
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
return playerItem; // this is used to initialize the main player
}
My initial thought was: Since it buffers fast with a vanilla AVPlayerItem, why not maintain a separate buffering player that's loaded with each asset in turn (with no AVMutableComposition) to buffer the assets for the main player?
- (void)startBufferingClips
{
_bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
automaticallyLoadedAssetKeys:@[@"tracks"]];
_bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
_currentlyBufferingIndex = 0;
}
// called every 250 msecs via an addPeriodicTimeObserverForInterval on the main player
- (void)updateBufferingProgress
{
// If the playable (loaded) range is within 100 milliseconds of the clip
// currently being buffered, load the next clip into the buffering player.
float playableDuration = [[self calculateBufferedDuration] floatValue];
CMTime totalDurationTime = [self playerItemDuration:_bufferingPlayer];
Float64 totalDurationSeconds = CMTimeGetSeconds(totalDurationTime);
bool bufferingComplete = totalDurationSeconds - playableDuration < 0.1;
float bufferedSeconds = [self bufferedSeconds:playableDuration];
float playerTimeSeconds = CMTimeGetSeconds([_player currentTime]);
__block NSUInteger playingClipIndex = 0;
// find the index of _player's currently playing clip
[_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
if (playerTimeSeconds < [offset floatValue]) {
playingClipIndex = idx;
*stop = YES;
}
}];
// TODO: if bufferedSeconds - playerTimeSeconds <= 0, pause the main player
if (bufferingComplete && _currentlyBufferingIndex < [_clipAssets count] - 1) {
// We're done buffering this clip, load the buffering player with the next asset
_currentlyBufferingIndex += 1;
_bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[_currentlyBufferingIndex]
automaticallyLoadedAssetKeys:@[@"tracks"]];
_bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
}
}
- (float)bufferedSeconds:(float)playableDuration {
__block float seconds = 0.0; // total duration of clips already buffered
if (_currentlyBufferingIndex > 0) {
[_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
if (idx + 1 >= _currentlyBufferingIndex) {
seconds = [offset floatValue];
*stop = YES;
}
}];
}
return seconds + playableDuration;
}
- (NSNumber *)calculateBufferedDuration {
AVPlayerItem *video = _bufferingPlayer.currentItem;
if (video.status == AVPlayerItemStatusReadyToPlay) {
__block float longestPlayableRangeSeconds = 0.0f; // initialize so the comparison below is well-defined
[video.loadedTimeRanges enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
CMTimeRange timeRange = [obj CMTimeRangeValue];
float seconds = CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange));
if (seconds > 0.1 && seconds > longestPlayableRangeSeconds) {
longestPlayableRangeSeconds = seconds;
}
}];
if (longestPlayableRangeSeconds > 0) {
return [NSNumber numberWithFloat:longestPlayableRangeSeconds];
}
}
return [NSNumber numberWithInteger:0];
}
It initially seemed that this worked like a charm, but when I switched to another set of test clips the buffering was very slow again (the buffering player helped, but not enough). It seems the loadedTimeRanges for the assets loaded into the buffering player didn't match the loadedTimeRanges for the same assets inside the AVMutableComposition: even after the loadedTimeRanges for each item loaded into the buffering player indicated that the whole asset had been buffered, the main player's video continued stuttering (while the audio played seamlessly to the end). Again, playback was seamless after rewinding once the main player had run through all the clips.
I hope the answer to this, whatever it is, will prove useful as a starting point for other iOS developers trying to implement this basic use-case. Thanks!
Edit: Since I posted this question, I came up with the following workaround. Hopefully it will save whoever runs into this some headache.
What I ended up doing was maintaining two buffering players (both AVPlayers) that started buffering the first two clips, moving on to the lowest-indexed unbuffered clip after their loadedTimeRanges indicated that buffering for their current clip was done. I made the logic pause/unpause playback based on the clips currently buffered, and the loadedTimeRanges of the buffering players, plus a small margin. This needed a few bookkeeping variables, but wasn't too complicated.
This is how the buffering players were initialized (I'm omitting the bookkeeping logic here):
- (void)startBufferingClips
{
_bufferingPlayerItemA = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
automaticallyLoadedAssetKeys:@[@"tracks"]];
_bufferingPlayerA = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemA];
_currentlyBufferingIndexA = [NSNumber numberWithInt:0];
if ([_clipAssets count] > 1) {
_bufferingPlayerItemB = [AVPlayerItem playerItemWithAsset:_clipAssets[1]
automaticallyLoadedAssetKeys:@[@"tracks"]];
_bufferingPlayerB = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemB];
_currentlyBufferingIndexB = [NSNumber numberWithInt:1];
_nextIndexToBuffer = [NSNumber numberWithInt:2];
} else {
_nextIndexToBuffer = [NSNumber numberWithInt:1];
}
}
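And this is a rough sketch of the pause/unpause check driven by the periodic time observer (the _pausedForBuffering flag and the -totalBufferedSeconds helper, which adds up the end offsets of fully buffered clips plus the loaded range of the clip currently buffering, are hypothetical stand-ins for the omitted bookkeeping):
- (void)updatePlaybackForBufferingProgress
{
    float playerTimeSeconds = CMTimeGetSeconds([_player currentTime]);
    float bufferedSeconds = [self totalBufferedSeconds]; // hypothetical helper
    float margin = 0.5f; // small safety margin

    if (!_pausedForBuffering && bufferedSeconds - playerTimeSeconds < margin) {
        // The playhead has caught up with the buffered data: hold playback.
        [_player pause];
        _pausedForBuffering = YES;
    } else if (_pausedForBuffering && bufferedSeconds - playerTimeSeconds > 2 * margin) {
        // Enough data is buffered ahead again: resume.
        _pausedForBuffering = NO;
        [_player play];
    }
}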
In addition, I needed to make sure that the video and audio tracks weren't being merged as they were added to the AVMutableComposition, as this apparently interfered with the buffering (perhaps they didn't register as the same video/audio tracks as those the buffering players were loading, and thus didn't receive the new data). Here's the code where the AVMutableComposition is built from an array of AVAssets:
- (AVPlayerItem*)playerItemForAssets:(NSMutableArray *)assets
{
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime timeOffset = kCMTimeZero;
for (AVAsset* asset in assets) {
CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
NSError *editError;
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
if ([videoTracks count] > 0) {
AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
[compVideoTrack insertTimeRange:editRange
ofTrack:videoTrack
atTime:timeOffset
error:&editError];
}
if ([audioTracks count] > 0) {
AVAssetTrack *audioTrack = [audioTracks objectAtIndex:0];
[compAudioTrack insertTimeRange:editRange
ofTrack:audioTrack
atTime:timeOffset
error:&editError];
}
if ([videoTracks count] > 0 || [audioTracks count] > 0) {
timeOffset = CMTimeAdd(timeOffset, asset.duration);
}
}
AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:composition];
return playerItem;
}
With this approach, buffering while using AVMutableComposition for the main player is nice and fast, at least in my setup.
I'm trying to implement a fade-in effect based on AVPlayer + AVAudioMix + AVAudioMixInputParameters. It basically works except when playing the audio for the first time after starting my app there is a click in the beginning. Subsequent plays work perfect though, but the first-time glitch is pretty stable and reproducible.
My Play button is enabled only after the AVPlayerItem's status is set to ready, so it's impossible to fire a play method while the player is not ready. In fact it doesn't matter how long I wait after loading the audio file and constructing all the objects.
This happens on OS X, I haven't tested it on iOS (yet).
Note that for this test you need an audio file that starts with sound and not silence. Here is my stripped down code without the GUI part (testFadeIn is the entry point):
static AVPlayer* player;
static void* PlayerItemStatusObserverContext = &PlayerItemStatusObserverContext;
- (void)testFadeIn
{
AVURLAsset* asset = [AVURLAsset.alloc initWithURL:[NSURL fileURLWithPath:@"Helicopter.m4a"] options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
AVPlayerItem* item = [AVPlayerItem playerItemWithAsset:asset];
player = [AVPlayer playerWithPlayerItem:item];
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:PlayerItemStatusObserverContext];
}
- (void)observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
if (context == PlayerItemStatusObserverContext)
{
AVPlayerItemStatus status = (AVPlayerItemStatus)[[change objectForKey:NSKeyValueChangeNewKey] integerValue];
if (status == AVPlayerItemStatusReadyToPlay)
{
[self applyFadeIn];
[self performSelector:@selector(play:) withObject:nil afterDelay:1.0];
}
}
}
- (void)applyFadeIn
{
assert(player.currentItem.tracks.firstObject);
AVMutableAudioMixInputParameters* fadeIn = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:player.currentItem.tracks.firstObject];
[fadeIn setVolume:0 atTime:kCMTimeZero];
[fadeIn setVolume:1 atTime:CMTimeMake(2, 1)];
NSMutableArray* paramsArray = [NSMutableArray new];
[paramsArray addObject:fadeIn];
AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = paramsArray;
player.currentItem.audioMix = audioMix;
}
- (void)play:(id)unused
{
[player play];
}
Click! What is wrong with this?
Edit:
An obvious workaround that I use at the moment: when the player reports it's ready, I do a short 100 ms playback with volume = 0, then restore currentTime and volume, and only then report to the main app that the player is ready. This way there are no clicks. Interestingly, anything less than 100 ms still gives the click.
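For reference, that pre-roll workaround looks roughly like this (a sketch; the 0.1 s delay matches the 100 ms mentioned above):
- (void)prerollSilentlyThenReportReady
{
    CMTime savedTime = player.currentTime;
    player.volume = 0.0;
    [player play];
    // Anything shorter than ~100 ms still produced the click for me.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [player pause];
        [player seekToTime:savedTime];
        player.volume = 1.0;
        // ...now tell the rest of the app that the player is actually ready.
    });
}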
This seems like an issue with something being cached by AVFoundation after the first playback. It's neither the tracks, as they are available when I set the fade-in parameters, nor the seek status.
I have an 8-second mp3 file and I'm trying to loop it as main menu music. I do this with SKAction repeatActionForever; however, every time it starts over there is a small pause between the loops. This is very annoying, since it doesn't sound like one long song this way. How can I fix this?
EDIT:
It also doesn't work with AVAudioPlayer :(
Use AVAudioPlayer and set the numberOfLoops property to -1 in order to loop the sound indefinitely.
For example:
NSError *error;
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"SomeSound.wav" withExtension:nil];
myPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
myPlayer.numberOfLoops = -1;
myPlayer.volume = 0.4;
myPlayer.delegate = self;
[myPlayer prepareToPlay];
[myPlayer play];
I have two different views that are meant to play the same video. I am creating an app that will switch several times between the two views while the video is running.
I currently load the first view with the video as follows:
NSURL *url = [NSURL URLWithString:@"http://[URL TO VIDEO HERE]"];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:avasset];
player = [[AVPlayer alloc] initWithPlayerItem:item];
playerLayer = [[AVPlayerLayer playerLayerWithPlayer:player] retain];
CGSize size = self.bounds.size;
float x = size.width/2.0-202.0;
float y = size.height/2.0 - 100;
//[player play];
playerLayer.frame = CGRectMake(x, y, 404, 200);
playerLayer.backgroundColor = [UIColor blackColor].CGColor;
[self.layer addSublayer:playerLayer];
NSString *tracksKey = @"tracks";
[avasset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
^{
dispatch_async(dispatch_get_main_queue(),
^{
NSError *error = nil;
AVKeyValueStatus status = [avasset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
//videoInitialized = YES;
[player play];
}
else {
// You should deal with the error appropriately.
NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
}
});
}];
In my second view I want to load the video from the dispatch_get_main_queue block so that the video in both views stays in sync.
I was hoping someone could help me out with loading the data of the video from the first view into the second view.
It is very simple:
Init the first player:
AVAsset *asset = [AVAsset assetWithURL:URL];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Then create the second player in the same way, BUT use the same asset from the first one.
I have verified, it works.
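In Objective-C the second player is just this (the view that hosts the duplicate layer is an assumption here):
// Build the second player from the SAME AVAsset instance as the first.
AVPlayerItem *secondItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *secondPlayer = [AVPlayer playerWithPlayerItem:secondItem];
AVPlayerLayer *secondLayer = [AVPlayerLayer playerLayerWithPlayer:secondPlayer];
secondLayer.frame = secondView.bounds; // secondView: whatever view shows the duplicate
[secondView.layer addSublayer:secondLayer];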
There is all the info you need on the Apple page:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
This abstraction means that you can play a given asset using different
players simultaneously
This quote is from that page.
I don't think you will be able to get this approach to work. Videos are decoded in hardware and then the graphics buffer is sent to the graphics card. What you seem to want to do is decode a video in one view but then capture the contents of the first view and show it in a second view. That will not stay in sync because it would take time to capture the contents of the first window back into main memory and then those contents would need to be sent to the video card again. Basically, that is not going to work. You also cannot decode two h.264 videos streams and expect them to be in sync.
You could implement this with another approach entirely: decode the h.264 video to frames on disk (saving each frame as a PNG), then write your own loop that decodes the Nth PNG in the series and displays the result in the two different windows. That will run fast enough to be an effective implementation on the newer iPhone 4 and 5 and iPad 2 and 3. If you want a more advanced implementation, take a look at my AVAnimator library for iOS; you could get this approach working in 20 minutes if you use the existing code.
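To make the idea concrete, a very rough sketch of such a loop might look like the following (the frame file naming, the frames directory, and the two image views are assumptions, and a real implementation such as AVAnimator does considerably more to keep decoding fast):
// Drive both views from the same pre-decoded PNG frame so they cannot drift.
- (void)startFrameLoop
{
    _frameIndex = 0;
    _displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(showNextFrame:)];
    _displayLink.frameInterval = 2; // roughly 30 fps on a 60 Hz display
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)showNextFrame:(CADisplayLink *)link
{
    NSString *path = [NSString stringWithFormat:@"%@/frame_%04lu.png",
                      _framesDirectory, (unsigned long)_frameIndex];
    UIImage *frame = [UIImage imageWithContentsOfFile:path];
    if (!frame) { [link invalidate]; return; } // no more frames
    _firstImageView.image = frame;  // both views get the exact same bitmap
    _secondImageView.image = frame;
    _frameIndex++;
}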
For this ten-year-old question, which has only ten-year-old answers that are now out of date, here's the up-to-date answer.
var leadPlayer: AVPlayer // ... the lead player you want to dupe
This does not work:
let leadPlayerItem: AVPlayerItem = leadPlayer.currentItem!
yourPlayer = AVPlayer(playerItem: leadPlayerItem)
yourPlayer.play()
Apple does not allow that (try it, see error).
This works. You must create a new item from the same asset:
let dupeItem: AVPlayerItem = AVPlayerItem(asset: leadPlayer.currentItem!.asset)
yourPlayer = AVPlayer(playerItem: dupeItem)
yourPlayer.play()
Fortunately it's now that easy.