AVPlayerLooper accepts a template AVPlayerItem and an AVQueuePlayer as setup parameters; it then internally manipulates the items of the queue, so the player is constantly changing its currentItem.
This works perfectly with AVPlayerLayer, which accepts the looped player as a parameter and just renders it. But how can I use it with AVPlayerItemVideoOutput, which is attached to a single AVPlayerItem, when the player holds multiple items internally? How do I reproduce what AVPlayerLayer does internally?
AVPlayerLooper setup example from the docs:
NSString *videoFile = [[NSBundle mainBundle] pathForResource:@"example" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:videoFile];
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [AVQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
_playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:_playerLayer];
[_player play];
This is how AVPlayerItemVideoOutput is supposed to be used:
[item addOutput:_videoOutput];
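where _videoOutput would typically have been created beforehand with the pixel format your renderer expects; a minimal sketch (the BGRA format is an assumption, not something AVFoundation mandates):
NSDictionary *attributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];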
The only workaround I came up with is to observe changes of currentItem and, each time, detach the video output from the old item and attach it to the new one, as in the example below. But this apparently defeats the gapless playback I'm trying to achieve.
- (void)observeValueForKeyPath:(NSString*)path
ofObject:(id)object
change:(NSDictionary*)change
context:(void*)context {
if (context == currentItemContext) {
AVPlayerItem* newItem = [change objectForKey:NSKeyValueChangeNewKey];
AVPlayerItem* oldItem = [change objectForKey:NSKeyValueChangeOldKey];
if(oldItem.status == AVPlayerItemStatusReadyToPlay) {
[oldItem removeOutput:_videoOutput]; // detach from the item that just finished
}
if(newItem.status == AVPlayerItemStatusReadyToPlay) {
[newItem addOutput:_videoOutput];
}
[self removeItemObservers:oldItem];
[self addItemObservers:newItem];
}
}
For more context, I'm trying to come up with a fix for Flutter's video_player plugin: https://github.com/flutter/flutter/issues/72878
The plugin's code can be found here: https://github.com/flutter/plugins/blob/172338d02b177353bf517e5826cf6a25b5f0d887/packages/video_player/video_player/ios/Classes/FLTVideoPlayerPlugin.m
You can do this by subclassing AVQueuePlayer (yay OOP) and creating and adding AVPlayerItemVideoOutputs there, as needed. I've never seen multiple AVPlayerItemVideoOutputs before, but memory consumption seems reasonable and everything works.
@interface OutputtingQueuePlayer : AVQueuePlayer
@end
@implementation OutputtingQueuePlayer
- (void)insertItem:(AVPlayerItem *)item afterItem:(nullable AVPlayerItem *)afterItem;
{
if (item.outputs.count == 0) {
NSLog(#"Creating AVPlayerItemVideoOutput");
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithOutputSettings:nil]; // or whatever
[item addOutput:videoOutput];
}
[super insertItem:item afterItem:afterItem];
}
@end
The current output is accessed like so:
AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:_player.currentTime itemTimeForDisplay:nil];
// do something with pixelBuffer here
CVPixelBufferRelease(pixelBuffer);
and configuration becomes:
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [OutputtingQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
[self.view.layer addSublayer:_playerLayer];
[_player play];
Edit 3:
I have found the root cause: CADisplayLink holds a strong reference to its target, so it creates a retain cycle.
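A minimal sketch of breaking that cycle (the property name and placement are assumptions; the point is that -invalidate releases the display link's strong reference to its target):
- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    [self.displayLink invalidate]; // releases the target; without this, self is never deallocated
    self.displayLink = nil;
}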
Edit 2:
Now I think a memory issue is causing the crash.
What I am doing is capturing the output of the player and drawing it on an OpenGL layer.
AVPlayerItem *item = ...;
if (!self.player) {
self.player = [AVPlayer playerWithPlayerItem:item];
} else {
[self.player replaceCurrentItemWithPlayerItem:item];
}
NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
[self.player.currentItem addOutput:self.videoOutput];
[self.player seekToTime:kCMTimeZero];
[self.player play];
In the callback of the CADisplayLink:
CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
BOOL hasNewContent = [self.videoOutput hasNewPixelBufferForItemTime:itemTime];
if (hasNewContent) {
CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
// create texture with pixelBuffer
// display texture on opengl surface
if (pixelBuffer != NULL) {
CFRelease(pixelBuffer);
}
}
Instruments shows no memory leak, but memory keeps rising.
Edit 1:
I have found a workaround. The resolution of "video_1" and "video_3" is 3840x1920, and the resolution of "video_2" is 2160x1080.
When I use ffmpeg to change all the resolutions to 2160x1080, it works.
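For reference, the rescale was a one-liner along these lines (file names are placeholders; -vf scale does the resizing):
ffmpeg -i video_1.mp4 -vf scale=2160:1080 video_1_2160.mp4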
Original question:
I want to play several videos in sequence and have encountered very strange behavior.
AVPlayerItem *item = ...;
if (!self.player) {
self.player = [AVPlayer playerWithPlayerItem:item];
} else {
[self.player replaceCurrentItemWithPlayerItem:item];
}
[self.player seekToTime:kCMTimeZero];
[self.player play];
For example, I have three video files: video_1, video_2, and video_3.
First, I set the player item to "video_1", then I replace it with "video_2". That's OK.
But when I replace it with "video_3", the app crashes. I can't find any device log on my iPhone. Even stranger, when I was debugging and replaced with "video_3", the debugger disconnected with no exception!
More information:
"video_2" can replace "video_1"
"video_1" can replace "video_2"
"video_3" can replace "video_2"
"video_3" can't replace "video_1"
"video_1" can't replace "video_3"
all videos play normally on their own.
Try the code below:
if ([playerItemVideoOutput hasNewPixelBufferForItemTime:currentTime]) {
__weak ViewController *weakSelf = self; // hold the view controller weakly to avoid a retain cycle
CVPixelBufferRef pixelBuffer = [playerItemVideoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
if (pixelBuffer) { // the output can return NULL, so check before using
[weakSelf.metalView.realTimeRender setPixelBuffer:pixelBuffer]; //use weakSelf here
CFRelease(pixelBuffer);
}
}
In my iOS application, I'm trying to play a list of videos downloaded to the application's Documents directory. To achieve that, I used AVQueuePlayer. Following is my code, which leads to an app crash after looping 6 or 7 times.
@interface PlayYTVideoViewController () <NSURLConnectionDataDelegate, UITableViewDataSource, UITableViewDelegate>
{
AVQueuePlayer *avQueuePlayer;
}
- (void)playlistLoop
{
NSLog(#"%s - %d", __PRETTY_FUNCTION__, __LINE__);
lastPlayedVideoNumber = 0;
_loadingVideoLabel.hidden = YES;
avPlayerItemsMutArray = [[NSMutableArray alloc] init];
for (NSString *videoPath in clipUrlsMutArr)
{
NSURL *vidPathUrl = [NSURL fileURLWithPath:videoPath];
AVPlayerItem *avpItem = [AVPlayerItem playerItemWithURL:vidPathUrl];
[avPlayerItemsMutArray addObject:avpItem];
}
avPlayerItemsArray = [avPlayerItemsMutArray copy];
for(AVPlayerItem *item in avPlayerItemsArray)
{
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidPlayToEndTime:) name:AVPlayerItemDidPlayToEndTimeNotification object:item];
}
avQueuePlayer = [AVQueuePlayer queuePlayerWithItems:avPlayerItemsArray];
avQueuePlayer.actionAtItemEnd = AVPlayerActionAtItemEndAdvance;
introVideoLayer = [AVPlayerLayer playerLayerWithPlayer:avQueuePlayer];
introVideoLayer.frame = _mpIntroVideoView.bounds;
[_mpContainerView.layer addSublayer:introVideoLayer];
[avQueuePlayer play];
}
- (void)itemDidPlayToEndTime:(NSNotification *)notification
{
NSLog(#"%s - %d", __PRETTY_FUNCTION__, __LINE__);
AVPlayerItem *endedAVPlayerItem = [notification object];
[endedAVPlayerItem seekToTime:kCMTimeZero];
for (AVPlayerItem *item in avPlayerItemsArray)
{
if (item == endedAVPlayerItem)
{
lastPlayedVideoNumber++;
break;
}
}
[self reloadVideoClipsTable];
if ([endedAVPlayerItem isEqual:[avPlayerItemsArray lastObject]])
{
[self playlistLoop];
}
}
After getting the memory issue, I tried to make some changes to the above code.
First I made the avQueuePlayer variable public and declared it strong:
@property (strong, nonatomic) AVQueuePlayer *avQueuePlayer;
By doing that I expected the avQueuePlayer variable to remain in memory until manually set to nil, but that didn't solve the problem.
Then I tried setting the player, related arrays, and layers to nil and re-creating them for each new loop session:
if (avPlayerItemsMutArray != nil)
{
avPlayerItemsMutArray = nil;
}
avPlayerItemsMutArray = [[NSMutableArray alloc] init];
if (avPlayerItemsArray != nil)
{
avPlayerItemsArray = nil;
}
avPlayerItemsArray = [avPlayerItemsMutArray copy];
if (avQueuePlayer != nil)
{
avQueuePlayer = nil;
}
avQueuePlayer = [AVQueuePlayer queuePlayerWithItems:avPlayerItemsArray];
if(introVideoLayer != nil)
{
[introVideoLayer removeFromSuperlayer];
introVideoLayer = nil;
}
introVideoLayer = [AVPlayerLayer playerLayerWithPlayer:avQueuePlayer];
But that also didn't solve the issue.
Next I tried removing the observer before re-initializing it in a new loop:
if (avPlayerItemsArray != nil)
{
avPlayerItemsArray = nil;
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:nil];
}
But that also didn't help.
Next I used Instruments to examine memory usage and leaks. The application does not exceed 18 MB when it crashes, and more than 200 MB remains free. Instruments is a little complicated, but I still didn't find any memory leaks related to this code.
Actually, the error was not with the AVQueuePlayer. In my application I'm listing all the videos in a table below the video-playing view. In that table, each row contains a video thumbnail that I generate with the code below.
+ (UIImage *)createThumbForVideo:(NSString *)vidFileName
{
NSString *videoFolder = [Video getVideoFolder];
NSString *videoFilePath = [videoFolder stringByAppendingFormat:@"/trickbook/videos/edited/%@", vidFileName];
NSURL *url = [NSURL fileURLWithPath:videoFilePath];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generateImg.appliesPreferredTrackTransform = YES;
NSError *error = NULL;
CMTime time = CMTimeMake(1, 65);
CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
UIImage *frameImage = [[UIImage alloc] initWithCGImage:refImg];
CGImageRelease(refImg); // copyCGImageAtTime returns a +1 CGImage; release it so each thumbnail doesn't leak
return frameImage;
}
Every time a video clip finishes playing, and every time the playlist begins a new loop, I update the table view. So I call the above method each time, and that's the reason for the memory issue.
As the solution, I now call this method only once per video clip and store the returned UIImage in a mutable collection. That solved the issue.
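A minimal sketch of that caching idea (I use a dictionary keyed by file name rather than an array; the property and method names are made up):
@property (strong, nonatomic) NSMutableDictionary<NSString *, UIImage *> *thumbCache;

- (UIImage *)cachedThumbForVideo:(NSString *)vidFileName
{
    UIImage *thumb = self.thumbCache[vidFileName];
    if (!thumb) {
        thumb = [[self class] createThumbForVideo:vidFileName];
        if (thumb) self.thumbCache[vidFileName] = thumb; // generate once, reuse on every table reload
    }
    return thumb;
}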
The heading of the question and the tags may not match the answer, but I thought this was worth keeping as a Q&A rather than deleting the post.
I am trying to play a video in a loop on an AVSampleBufferDisplayLayer. I can get it to play through once with no problem, but when I try to loop it, it doesn't keep playing.
Per the answer to "AVFoundation to reproduce a video loop", there isn't a way to rewind the AVAssetReader, so I re-create it. (I did see the answer for "Looping a video with AVFoundation AVPlayer?", but AVPlayer is more full-featured. I am reading from a file, but still want the AVSampleBufferDisplayLayer.)
One hypothesis is that I need to strip some of the H.264 headers, but I have no idea whether that would help (or how). Another is that it has something to do with the CMTimebase, but I've tried several things to no avail.
Code below, based on Apple's WWDC talk on Direct Access to Video Encoding:
- (void)viewDidLoad {
[super viewDidLoad];
NSString *filepath = [[NSBundle mainBundle] pathForResource:@"sample-mp4" ofType:@"mp4"];
NSURL *fileURL = [NSURL fileURLWithPath:filepath];
AVAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
UIView *view = self.view;
self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
self.videoLayer.bounds = view.bounds;
self.videoLayer.position = CGPointMake(CGRectGetMidX(view.bounds), CGRectGetMidY(view.bounds));
self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];
CMTimebaseRef controlTimebase;
CMTimebaseCreateWithMasterClock( CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase );
self.videoLayer.controlTimebase = controlTimebase;
CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
[[view layer] addSublayer:_videoLayer];
dispatch_queue_t assetQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); //??? right queue?
__block AVAssetReader *assetReaderVideo = [self createAssetReader:asset];
__block AVAssetReaderTrackOutput *outVideo = [assetReaderVideo outputs][0];
if( [assetReaderVideo startReading] )
{
[_videoLayer requestMediaDataWhenReadyOnQueue: assetQueue usingBlock: ^{
while( [_videoLayer isReadyForMoreMediaData] )
{
CMSampleBufferRef sampleVideo;
if ( ([assetReaderVideo status] == AVAssetReaderStatusReading) && ( sampleVideo = [outVideo copyNextSampleBuffer]) ) {
[_videoLayer enqueueSampleBuffer:sampleVideo];
CFRelease(sampleVideo);
CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
}
else {
[_videoLayer stopRequestingMediaData];
//CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
//CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
//CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
assetReaderVideo = [self createAssetReader:asset];
outVideo = [assetReaderVideo outputs][0];
[assetReaderVideo startReading];
//sampleVideo = [outVideo copyNextSampleBuffer];
//[_videoLayer enqueueSampleBuffer:sampleVideo];
}
}
}];
}
}
-(AVAssetReader *)createAssetReader:(AVAsset*)asset {
NSError *error=nil;
AVAssetReader *assetReaderVideo = [[AVAssetReader alloc] initWithAsset:asset error:&error];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTracks[0] outputSettings:nil]; //dic];
[assetReaderVideo addOutput:outVideo];
return assetReaderVideo;
}
Thanks so much.
Try making the loop with Swift, then bridge the Objective-C files with the Swift files. Google has many answers on bridging and looping, so just search for them with Swift.
I am implementing an HTTP Live Streaming player on OS X using AVPlayer.
I am able to stream it properly, seek within the stream, get the duration, and so on.
Now I want to take screenshots and process the frames using OpenCV.
I went with AVAssetImageGenerator, but there are no audio or video tracks on the AVAsset associated with player.currentItem.
The tracks do appear in player.currentItem.tracks.
So I am not able to use AVAssetImageGenerator. Can anybody help me find a solution for extracting screenshots and individual frames in such a scenario?
Please find below the code showing how I initiate the HTTP live stream.
Thanks in advance.
NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:#"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:#"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:#"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];
[self.player play];
Following is how I am checking whether a video track is present in the asset:
case AVPlayerItemStatusReadyToPlay:
[self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
[[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
NSLog(#"%f,%f,%f",[self currentTime],[self duration],[[self player] rate]);
AVPlayerItem *item = playeritem;
if(item.status == AVPlayerItemStatusReadyToPlay)
{
AVAsset *asset = (AVAsset *)item.asset;
long audiotracks = [[asset tracks] count];
long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions]count];
NSLog(#"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
}
}]];
AVPlayerItem *item = self.player.currentItem;
if(item.status != AVPlayerItemStatusReadyToPlay)
return;
AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio]count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo]count];
NSLog(#"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
This is an older question, but in case someone needs help with it, I have an answer.
AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */ ; i++){
#autoreleasepool {
CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);
NSError *err;
CMTime actualTime;
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
// Do what you want with the image, for example save it as UIImage
UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
}
}
You can easily get the FPS of a video by using this code:
float fps=0.00;
if (asset) {
AVAssetTrack * videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
if(videoATrack)
{
fps = [videoATrack nominalFrameRate];
}
}
Hope that helps someone who is asking how to get all frames from a video, or just some specific frames (with CMTime, for example). Please bear in mind that saving all frames to an array can heavily impact memory!
How can I play music from the iPod music library (like user-defined playlists, etc.) at a different volume than the system volume?
This is for anyone who is trying to play music / playlists from the iPod music library at a different volume than the system volume. There are several posts out there saying that [MPMusicPlayerController applicationMusicPlayer] can do this, but I have found that anytime I change the volume of the applicationMusicPlayer, the system volume changes too.
There is a more involved method of playing music using the AVAudioPlayer class, but it requires you to copy music files from the iPod library to the application bundle, and that can get tricky when you're playing dynamic things, like user-generated playlists. That technique does give you access to the bytes, though, and is the way to go if you want to do processing on the data (like a DJ app). Link to that solution HERE.
The solution I went with uses the AVPlayer class, there are several good posts out there about how to do it. This post is basically a composite of several different solutions I found on Stackoverflow and elsewhere.
I have the following Frameworks linked:
AVFoundation
MediaPlayer
AudioToolbox
CoreAudio
CoreMedia
(I'm not sure if all of those are critical, but that's what I have. I have some OpenAL stuff implemented too that I don't show in the following code)
// Presumably in your SoundManager.m file (or whatever you call it) ...
#import <CoreAudio/CoreAudioTypes.h>
#import <AudioToolbox/AudioToolbox.h>
@interface SoundManager()
@property (retain, nonatomic) AVPlayer* audioPlayer;
@property (retain, nonatomic) AVPlayerItem* currentItem;
@property (retain, nonatomic) MPMediaItemCollection* currentPlaylist;
@property (retain, nonatomic) MPMediaItem* currentTrack;
@property (assign, nonatomic) MPMusicPlaybackState currentPlaybackState;
@end
@implementation SoundManager
@synthesize audioPlayer;
@synthesize currentItem = m_currentItem;
@synthesize currentPlaylist;
@synthesize currentTrack;
@synthesize currentPlaybackState;
- (id) init
{
...
//Define an AVPlayer instance
AVPlayer* tempPlayer = [[AVPlayer alloc] init];
self.audioPlayer = tempPlayer;
[tempPlayer release];
...
//load the playlist you want to play
MPMediaItemCollection* playlist = [self getPlaylistWithName: @"emo-pop-unicorn-blood-rage-mix-to-the-max"];
if(playlist)
[self loadPlaylist: playlist];
...
//initialize the playback state
self.currentPlaybackState = MPMusicPlaybackStateStopped;
//start the music playing
[self playMusic];
...
}
//Have a way to get a playlist reference (as an MPMediaItemCollection in this case)
- (MPMediaItemCollection*) getPlaylistWithName:(NSString *)playlistName
{
MPMediaQuery* query = [[MPMediaQuery alloc] init];
MPMediaPropertyPredicate* mediaTypePredicate = [MPMediaPropertyPredicate predicateWithValue: [NSNumber numberWithInteger: MPMediaTypeMusic] forProperty:MPMediaItemPropertyMediaType];
[query addFilterPredicate: mediaTypePredicate];
[query setGroupingType: MPMediaGroupingPlaylist];
NSArray* playlists = [query collections];
[query release];
for(MPMediaItemCollection* testPlaylist in playlists)
{
NSString* testPlaylistName = [testPlaylist valueForProperty: MPMediaPlaylistPropertyName];
if([testPlaylistName isEqualToString: playlistName])
return testPlaylist;
}
return nil;
}
//Override the setter on currentItem so that you can add/remove
//the notification listener that will tell you when the song has completed
- (void) setCurrentItem:(AVPlayerItem *)currentItem
{
if(m_currentItem)
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
[m_currentItem release];
}
if(currentItem)
m_currentItem = [currentItem retain];
else
m_currentItem = nil;
if(m_currentItem)
{
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleMusicTrackFinished) name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
}
}
//handler that gets called when the name:AVPlayerItemDidPlayToEndTimeNotification notification fires
- (void) handleMusicTrackFinished
{
[self skipSongForward]; //or something similar
}
//Have a way to load a playlist
- (void) loadPlaylist:(MPMediaItemCollection *)playlist
{
self.currentPlaylist = playlist;
self.currentTrack = [playlist.items objectAtIndex: 0];
}
//Play the beats, yo
- (void) playMusic
{
//check the current playback state and exit early if we're already playing something
if(self.currentPlaybackState == MPMusicPlaybackStatePlaying)
return;
if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
{
[self.audioPlayer play];
}
else if(self.currentTrack)
{
//Get the system url of the current track, and use that to make an AVAsset object
NSURL* url = [self.currentTrack valueForProperty:MPMediaItemPropertyAssetURL];
AVAsset* asset = [AVURLAsset URLAssetWithURL:url options:nil];
//Get the track object from the asset object - we'll need to trackID to tell the
//AVPlayer that it needs to modify the volume of this track
AVAssetTrack* track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
//Build the AVPlayerItem - this is where you modify the volume, etc. Not the AVPlayer itself
AVPlayerItem* playerItem = [[AVPlayerItem alloc] initWithAsset: asset]; //initWithURL:url];
self.currentItem = playerItem;
//Set up some audio mix parameters to tell the AVPlayer what to do with this AVPlayerItem
AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
[audioParams setVolume: 0.5 atTime:kCMTimeZero]; //replace 0.5 with your volume
[audioParams setTrackID: track.trackID]; //here's the track id
//Set up the actual AVAudioMix object, which aggregates effects
AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
//apply your AVAudioMix object to the AVPlayerItem
[playerItem setAudioMix:audioMix];
//refresh the AVPlayer object, and play the track
[self.audioPlayer replaceCurrentItemWithPlayerItem: playerItem];
[self.audioPlayer play];
}
self.currentPlaybackState = MPMusicPlaybackStatePlaying;
}
- (void) pauseMusic
{
if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
return;
[self.audioPlayer pause];
self.currentPlaybackState = MPMusicPlaybackStatePaused;
}
- (void) skipSongForward
{
//adjust self.currentTrack to be the next object in self.currentPlaylist
//start the new track in a manner similar to that used in -playMusic (one way is sketched below)
}
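//One way to fill in that stub (a sketch; the index math and wrap-around
//behavior are my assumptions, not part of the original answer):
//    NSUInteger index = [self.currentPlaylist.items indexOfObject: self.currentTrack];
//    NSUInteger next = (index + 1) % self.currentPlaylist.items.count;
//    self.currentTrack = [self.currentPlaylist.items objectAtIndex: next];
//    self.currentPlaybackState = MPMusicPlaybackStateStopped; //so -playMusic doesn't early-exit
//    [self playMusic];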
- (void) skipSongBackward
{
float currentTime = CMTimeGetSeconds(self.audioPlayer.currentItem.currentTime); //CMTimeGetSeconds handles the value/timescale math without integer truncation
//if we're more than a second into the song, just skip back to the beginning of the current track
if(currentTime > 1.0)
{
[self.audioPlayer seekToTime: CMTimeMake(0, 1)];
}
else
{
//otherwise, adjust self.currentTrack to be the previous object in self.currentPlaylist
//start the new track in a manner similar to that used in -playMusic
}
}
//Set volume mid-song - more or less the same process we used in -playMusic
- (void) setMusicVolume:(float)vol
{
AVPlayerItem* item = self.audioPlayer.currentItem;
AVAssetTrack* track = [[item.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
[audioParams setVolume: vol atTime:kCMTimeZero];
[audioParams setTrackID: track.trackID];
AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
[item setAudioMix:audioMix];
}
#end
Please forgive any errors you see - let me know in the comments and I'll fix them. Otherwise, I hope this helps if anyone runs into the same challenge I did!
Actually, I found a really easy way to do this by loading iPod URLs from MPMusicPlayer, but then doing playback through AVAudioPlayer.
// Get-da iTunes player thing
MPMusicPlayerController* iTunes = [MPMusicPlayerController iPodMusicPlayer];
// whazzong
MPMediaItem *currentSong = [iTunes nowPlayingItem];
// whazzurl
NSURL *currentSongURL = [currentSong valueForProperty:MPMediaItemPropertyAssetURL];
info( "AVAudioPlayer playing %s", [currentSongURL.absoluteString UTF8String] ) ;
// mamme AVAudioPlayer
NSError *err;
avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:currentSongURL error:&err] ;
if( err!=nil )
{
error( "AVAudioPlayer couldn't load %s", [currentSongURL.absoluteString UTF8String] ) ;
}
avAudioPlayer.numberOfLoops = -1; //infinite
// Play that t
[avAudioPlayer prepareToPlay] ;
[avAudioPlayer play];
[avAudioPlayer setVolume:0.5]; // set the AVAUDIO PLAYER's volume to only 50%. This
// does NOT affect system volume. You can adjust this music volume anywhere else too.