I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet; it plays normally in the AVPlayerLayer without any problems.
What have I tried:
AVAssetImageGenerator. It is not working; the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to an answer here, AVAssetImageGenerator doesn't work for streaming videos.
Taking a snapshot of the player view. I first tried renderInContext: on AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck; I still got a UI snapshot with a blank black area where the video is shown.
AVPlayerItemVideoOutput. I have added a video output to my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
AVAssetReader. I was thinking of trying it but decided not to lose time after finding a related question here.
So isn't there any way to get a snapshot of something that I am already seeing right now on the screen? I can't believe it.
AVAssetImageGenerator is the best way to snapshot a video; this method returns a UIImage asynchronously:
import AVFoundation
// ...
var player: AVPlayer? = // ...

func screenshot(handler: @escaping (UIImage) -> Void) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let img = image {
            handler(UIImage(cgImage: img))
        }
    }
}
(It's Swift 4.2)
AVPlayerItemVideoOutput works fine for me with an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to convert that (a sketch follows the code below).
This answer is mostly cribbed from here.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#interface ViewController ()
#property (nonatomic) AVPlayer *player;
#property (nonatomic) AVPlayerItem *playerItem;
#property (nonatomic) AVPlayerItemVideoOutput *playerOutput;
#end
#implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = #{ (id)kCVPixelBufferPixelFormatTypeKey : #(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];
[self.player play];
}
- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(#"The image: %#", buffer);
}
- (void)viewDidLoad {
[super viewDidLoad];
NSURL *someUrl = [NSURL URLWithString:#"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:#"tracks"] completionHandler:^{
NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:#"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(#"%# Failed to load the tracks.", self);
}
}];
}
#end
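If you need a UIImage rather than the raw buffer, here is a minimal sketch of one common conversion using Core Image, assuming a 32BGRA pixel buffer as configured above (the helper name is illustrative; also note that the buffer returned by copyPixelBufferForItemTime: follows the Create/Copy rule, so release it when you are done):

#import <CoreImage/CoreImage.h>

// Illustrative helper: convert a CVPixelBufferRef from AVPlayerItemVideoOutput into a UIImage.
- (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil]; // reuse a shared context in production
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}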
AVPlayerLooper accepts a template AVPlayerItem and an AVQueuePlayer as setup parameters; it then internally manipulates the items of the queue, and the player is constantly changing its currentItem.
This works perfectly with AVPlayerLayer, which accepts this looped player as a parameter and just renders it, but how can I use it with AVPlayerItemVideoOutput, which is attached to an AVPlayerItem, when the player has multiple items inside it? How do I reproduce what AVPlayerLayer does internally?
AVPlayerLooper setup example from docs
NSString *videoFile = [[NSBundle mainBundle] pathForResource:@"example" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:videoFile];
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [AVQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
_playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:_playerLayer];
[_player play];
This is how AVPlayerItemVideoOutput is supposed to be used:
[item addOutput:_videoOutput];
The only workaround I came up with is to observe changes of the currentItem and each time detach the video output from the old item and attach it to the new one, as in the example below, but this apparently defeats the gapless playback I'm trying to achieve.
- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == currentItemContext) {
        AVPlayerItem *newItem = [change objectForKey:NSKeyValueChangeNewKey];
        AVPlayerItem *oldItem = [change objectForKey:NSKeyValueChangeOldKey];
        // Detach the output from the outgoing item and attach it to the incoming one.
        if (oldItem.status == AVPlayerItemStatusReadyToPlay) {
            [oldItem removeOutput:_videoOutput];
        }
        if (newItem.status == AVPlayerItemStatusReadyToPlay) {
            [newItem addOutput:_videoOutput];
        }
        [self removeItemObservers:oldItem];
        [self addItemObservers:newItem];
    }
}
For more context, I'm trying to come up with a fix for Flutter's video_player plugin: https://github.com/flutter/flutter/issues/72878
The plugin's code can be found here: https://github.com/flutter/plugins/blob/172338d02b177353bf517e5826cf6a25b5f0d887/packages/video_player/video_player/ios/Classes/FLTVideoPlayerPlugin.m
You can do this by subclassing AVQueuePlayer (yay OOP) and creating and adding AVPlayerItemVideoOutputs there, as needed. I've never seen multiple AVPlayerItemVideoOutputs before, but memory consumption seems reasonable and everything works.
@interface OutputtingQueuePlayer : AVQueuePlayer
@end

@implementation OutputtingQueuePlayer

- (void)insertItem:(AVPlayerItem *)item afterItem:(nullable AVPlayerItem *)afterItem;
{
    if (item.outputs.count == 0) {
        NSLog(@"Creating AVPlayerItemVideoOutput");
        AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithOutputSettings:nil]; // or whatever
        [item addOutput:videoOutput];
    }
    [super insertItem:item afterItem:afterItem];
}

@end
The current output is accessed like so:
AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:_player.currentTime itemTimeForDisplay:nil];
// do something with pixelBuffer here
CVPixelBufferRelease(pixelBuffer);
and configuration becomes:
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [OutputtingQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
[self.view.layer addSublayer:_playerLayer];
[_player play];
Edit 3
I have found the root cause: CADisplayLink keeps a strong reference to its target, so it creates a retain cycle.
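A minimal sketch of one common way to break that cycle (the proxy class and the names below are illustrative, not from the original code): route the display link callback through an object that holds the real target weakly, and still invalidate the link during teardown.

#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>

// Illustrative weak proxy: CADisplayLink retains this tiny object instead of the controller.
@interface WeakDisplayLinkProxy : NSObject
@property (nonatomic, weak) id target;
@end

@implementation WeakDisplayLinkProxy
// Forward the display link callback (and any other message) to the weakly held target.
- (id)forwardingTargetForSelector:(SEL)selector {
    return self.target;
}
@end

// Usage (illustrative):
// WeakDisplayLinkProxy *proxy = [WeakDisplayLinkProxy new];
// proxy.target = self;
// self.displayLink = [CADisplayLink displayLinkWithTarget:proxy selector:@selector(displayLinkFired:)];
// [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
// ...
// [self.displayLink invalidate]; // still required when the owner goes away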
Edit 2
Now I think a memory issue is causing the crash.
What I am doing is capturing the output of the player and drawing it on an OpenGL layer.
AVPlayerItem *item = ...;
if (!self.player) {
self.player = [AVPlayer playerWithPlayerItem:item];
} else {
[self.player replaceCurrentItemWithPlayerItem:item];
}
NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
[self.player.currentItem addOutput:self.videoOutput];
[self.player seekToTime:kCMTimeZero];
[self.player play];
In the callback of the CADisplayLink:
CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
BOOL hasNewContent = [self.videoOutput hasNewPixelBufferForItemTime:itemTime];
if (hasNewContent) {
    CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    // create a texture from the pixel buffer
    // display the texture on the OpenGL surface
    if (pixelBuffer != NULL) {
        CFRelease(pixelBuffer);
    }
}
Instruments reports no memory leak, but memory keeps rising.
Edit 1:
I have found a workaround. The resolution of "video_1" and "video_3" is 3840 × 1920, and the resolution of "video_2" is 2160 × 1080.
When I use ffmpeg to change all the resolutions to 2160 × 1080, it works.
Origin
I want to play several videos in sequence, and I have run into very strange behavior.
AVPlayerItem *item = ...;
if (!self.player) {
self.player = [AVPlayer playerWithPlayerItem:item];
} else {
[self.player replaceCurrentItemWithPlayerItem:item];
}
[self.player seekToTime:kCMTimeZero];
[self.player play];
For example, I have three video files: video_1, video_2 and video_3.
First, I set the player item to "video_1", then I replace it with "video_2". That's OK.
But when I replace it with "video_3", the app crashes. I can't find any device log on my iPhone. What's more, when I was debugging and replaced the item with "video_3", the debugger disconnected with no exception!
More information:
"video_2" can replace "video_1"
"video_1" can replace "video_2"
"video_3" can replace "video_2"
"video_3" can't replace "video_1"
"video_1" can't replace "video_3"
all videos can be played normally on their own.
Try the code below:
if ([playerItemVideoOutput hasNewPixelBufferForItemTime:currentTime]) {
    __unsafe_unretained ViewController *weakSelf = self; // create a weak reference to your view controller
    CVPixelBufferRef pixelBuffer = [playerItemVideoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
    if (pixelBuffer) { // check that the buffer exists
        [weakSelf.metalView.realTimeRender setPixelBuffer:pixelBuffer]; // use weakSelf here
        CFRelease(pixelBuffer);
    }
}
I have the following code in my app:
NSURL *url = [NSURL fileURLWithPath: [self.DocDir stringByAppendingPathComponent: self.FileName] isDirectory: NO];
self.avPlayer = [AVPlayer playerWithURL: url];
Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
This worked fine with iOS 6, but with iOS 7 for some reason it returns NaN. When inspecting self.avPlayer.currentItem.duration, the CMTime object has zeros with a flags value of 17.
Interestingly, the player works fine; it is just the duration that is wrong.
Has anyone else experienced the same issues? I am importing the following:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVAsset.h>
After playing around with different ways of initializing the objects I arrived at a working solution:
AVURLAsset *asset = [AVURLAsset assetWithURL: url];
Float64 duration = CMTimeGetSeconds(asset.duration);
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem: item];
It appears the duration value isn't always immediately available from an AVPlayerItem, but it is available right away from an AVAsset.
In iOS 7, for an AVPlayerItem that is already created, you can also get the duration from the underlying asset:
CMTimeGetSeconds([[[[self player] currentItem] asset] duration]);
instead of getting it directly from the AVPlayerItem, which gives you NaN:
CMTimeGetSeconds([[[self player] currentItem] duration]);
The recommended way of doing this, as described in the documentation, is by observing the player item's status:
[self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionInitial context:nil];
Then, inside observeValueForKeyPath:ofObject:change:context:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    // TODO: use either keyPath or context to differentiate between value changes
    if (self.avPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay) {
        Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
        // ...
    }
}
Also, make sure that you remove the observer when you change the player item:
if (self.avPlayer.currentItem) {
    [self.avPlayer.currentItem removeObserver:self forKeyPath:@"status"];
}
Btw, you can also observe the duration property directly; however, it's been my personal experience that the results aren't as reliable as they should be ;-)
Swift version
You can get the duration using the AVAsset, which is a property of the AVPlayerItem:
func getVideoDuration(from player: AVPlayer) -> Double? {
    guard let duration = player.currentItem?.asset.duration else { return nil }
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
or by creating an AVAsset from scratch:
func getVideoDuration(for videoUrl: URL) -> Double {
    let asset = AVAsset(url: videoUrl)
    let duration = asset.duration
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
I have two different views that are meant to play the same video; I am creating an app that will switch several times between the two views while the video is running.
I currently load the first view with the video as follows:
NSURL *url = [NSURL URLWithString:@"http://[URL TO VIDEO HERE]"];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:avasset];
player = [[AVPlayer alloc] initWithPlayerItem:item];
playerLayer = [[AVPlayerLayer playerLayerWithPlayer:player] retain];
CGSize size = self.bounds.size;
float x = size.width/2.0-202.0;
float y = size.height/2.0 - 100;
//[player play];
playerLayer.frame = CGRectMake(x, y, 404, 200);
playerLayer.backgroundColor = [UIColor blackColor].CGColor;
[self.layer addSublayer:playerLayer];
NSString *tracksKey = @"tracks";
[avasset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [avasset statusOfValueForKey:tracksKey error:&error];
        if (status == AVKeyValueStatusLoaded) {
            //videoInitialized = YES;
            [player play];
        }
        else {
            // You should deal with the error appropriately.
            NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
        }
    });
}];
In my second view I want to load the video from the same dispatch_get_main_queue block so that the video in both views is in sync.
I was hoping someone could help me out with loading the data of the video from the first view into the second view.
It is very simple:
Init the first player:
AVAsset *asset = [AVAsset assetWithURL:URL];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
And create the second player in the same way, BUT use the same asset from the first one.
I have verified, it works.
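For example, a minimal sketch of the second player, reusing the asset variable from the snippet above (layer placement is up to you):

// Second player: a NEW AVPlayerItem built from the SAME asset, with its own player and layer.
AVPlayerItem *secondItem = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *secondPlayer = [AVPlayer playerWithPlayerItem:secondItem];
AVPlayerLayer *secondLayer = [AVPlayerLayer playerLayerWithPlayer:secondPlayer];
// add secondLayer to the second view's layer, then call -play on both players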
All the info you need is on the Apple page:
https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
This abstraction means that you can play a given asset using different
players simultaneously
This quote is from that page.
I don't think you will be able to get this approach to work. Videos are decoded in hardware and then the graphics buffer is sent to the graphics card. What you seem to want to do is decode a video in one view but then capture the contents of the first view and show it in a second view. That will not stay in sync because it would take time to capture the contents of the first window back into main memory and then those contents would need to be sent to the video card again. Basically, that is not going to work. You also cannot decode two h.264 videos streams and expect them to be in sync.
You could implement this with another approach entirely: decode the h.264 video to frames on disk (save each frame as a PNG), then write your own loop that decodes the Nth PNG in the series and displays the result in the two different windows. That will work fast enough to be an effective implementation on the newer iPhone 4 and 5 and iPad 2 and 3. If you want to make use of a more advanced implementation, take a look at my AVAnimator library for iOS; you could get this approach working in 20 minutes if you use existing code.
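As a rough sketch of that loop (everything here is an assumption for illustration: the frame file names, the frameIndex property, and the two image views are not from the original answer), a CADisplayLink callback could push the same decoded frame into both views:

// Illustrative only: show the same pre-extracted PNG frame in two views at once.
// Assumes frames were saved to Documents as frame_0.png, frame_1.png, ...
- (void)displayLinkFired:(CADisplayLink *)link {
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *path = [docs stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"frame_%ld.png", (long)self.frameIndex]];
    UIImage *frame = [UIImage imageWithContentsOfFile:path];
    if (frame) {
        self.firstImageView.image = frame;   // both views receive the exact same frame,
        self.secondImageView.image = frame;  // so they cannot drift out of sync
        self.frameIndex = self.frameIndex + 1;
    }
}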
For this ten-year-old question, which has only ten-year-old answers that are now out of date, here's the up-to-date answer.
var leadPlayer: AVPlayer ... the lead player you want to dupe
This does not work:
let leadPlayerItem: AVPlayerItem = leadPlayer.currentItem!
yourPlayer = AVPlayer(playerItem: leadPlayerItem)
yourPlayer.play()
Apple does not allow that (try it, see error).
This works. You must create a new item from the asset:
let dupeItem: AVPlayerItem = AVPlayerItem(asset: leadPlayer.currentItem!.asset)
yourPlayer = AVPlayer(playerItem: dupeItem)
yourPlayer.play()
Fortunately it's now that easy.
How can I play music from the iPod music library (like user-defined playlists, etc.) at a different volume than the system volume?
This is for anyone who is trying to play music / playlists from the iPod music library at a different volume than the system volume. There are several posts out there saying that [MPMusicPlayerController applicationMusicPlayer] can do this, but I have found that any time I change the volume of the applicationMusicPlayer, the system volume changes too.
There is a more involved method of playing music using the AVAudioPlayer class, but it requires you to copy music files from the iPod library into your application's storage, and that can get tricky when you're playing dynamic things, like user-generated playlists. That technique does give you access to the bytes, though, and is the way to go if you want to do processing on the data (like a DJ app). Link to that solution HERE.
The solution I went with uses the AVPlayer class; there are several good posts out there about how to do it. This post is basically a composite of several different solutions I found on Stack Overflow and elsewhere.
I have the following Frameworks linked:
AVFoundation
MediaPlayer
AudioToolbox
CoreAudio
CoreMedia
(I'm not sure if all of those are critical, but that's what I have. I have some OpenAL stuff implemented too that I don't show in the following code)
// Presumably in your SoundManager.m file (or whatever you call it) ...
#import <CoreAudio/CoreAudioTypes.h>
#import <AudioToolbox/AudioToolbox.h>
@interface SoundManager()

@property (retain, nonatomic) AVPlayer *audioPlayer;
@property (retain, nonatomic) AVPlayerItem *currentItem;
@property (retain, nonatomic) MPMediaItemCollection *currentPlaylist;
@property (retain, nonatomic) MPMediaItem *currentTrack;
@property (assign, nonatomic) MPMusicPlaybackState currentPlaybackState;

@end
@implementation SoundManager

@synthesize audioPlayer;
@synthesize currentItem = m_currentItem;
@synthesize currentPlaylist;
@synthesize currentTrack;
@synthesize currentPlaybackState;
- (id) init
{
...
//Define an AVPlayer instance
AVPlayer* tempPlayer = [[AVPlayer alloc] init];
self.audioPlayer = tempPlayer;
[tempPlayer release];
...
//load the playlist you want to play
MPMediaItemCollection* playlist = [self getPlaylistWithName: @"emo-pop-unicorn-blood-rage-mix-to-the-max"];
if(playlist)
[self loadPlaylist: playlist];
...
//initialize the playback state
self.currentPlaybackState = MPMusicPlaybackStateStopped;
//start the music playing
[self playMusic];
...
}
//Have a way to get a playlist reference (as an MPMediaItemCollection in this case)
- (MPMediaItemCollection*) getPlaylistWithName:(NSString *)playlistName
{
MPMediaQuery* query = [[MPMediaQuery alloc] init];
MPMediaPropertyPredicate* mediaTypePredicate = [MPMediaPropertyPredicate predicateWithValue: [NSNumber numberWithInteger: MPMediaTypeMusic] forProperty:MPMediaItemPropertyMediaType];
[query addFilterPredicate: mediaTypePredicate];
[query setGroupingType: MPMediaGroupingPlaylist];
NSArray* playlists = [query collections];
[query release];
for(MPMediaItemCollection* testPlaylist in playlists)
{
NSString* testPlaylistName = [testPlaylist valueForProperty: MPMediaPlaylistPropertyName];
if([testPlaylistName isEqualToString: playlistName])
return testPlaylist;
}
return nil;
}
//Override the setter on currentItem so that you can add/remove
//the notification listener that will tell you when the song has completed
- (void) setCurrentItem:(AVPlayerItem *)currentItem
{
if(m_currentItem)
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
[m_currentItem release];
}
if(currentItem)
m_currentItem = [currentItem retain];
else
m_currentItem = nil;
if(m_currentItem)
{
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleMusicTrackFinished) name:AVPlayerItemDidPlayToEndTimeNotification object:m_currentItem];
}
}
//handler that gets called when the name:AVPlayerItemDidPlayToEndTimeNotification notification fires
- (void) handleMusicTrackFinished
{
[self skipSongForward]; //or something similar
}
//Have a way to load a playlist
- (void) loadPlaylist:(MPMediaItemCollection *)playlist
{
self.currentPlaylist = playlist;
self.currentTrack = [playlist.items objectAtIndex: 0];
}
//Play the beats, yo
- (void) playMusic
{
//check the current playback state and exit early if we're already playing something
if(self.currentPlaybackState == MPMusicPlaybackStatePlaying)
return;
if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
{
[self.audioPlayer play];
}
else if(self.currentTrack)
{
//Get the system url of the current track, and use that to make an AVAsset object
NSURL* url = [self.currentTrack valueForProperty:MPMediaItemPropertyAssetURL];
AVAsset* asset = [AVURLAsset URLAssetWithURL:url options:nil];
//Get the track object from the asset object - we'll need to trackID to tell the
//AVPlayer that it needs to modify the volume of this track
AVAssetTrack* track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
//Build the AVPlayerItem - this is where you modify the volume, etc. Not the AVPlayer itself
AVPlayerItem* playerItem = [[AVPlayerItem alloc] initWithAsset: asset]; //initWithURL:url];
self.currentItem = playerItem;
//Set up some audio mix parameters to tell the AVPlayer what to do with this AVPlayerItem
AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
[audioParams setVolume: 0.5 atTime:kCMTimeZero]; //replace 0.5 with your volume
[audioParams setTrackID: track.trackID]; //here's the track id
//Set up the actual AVAudioMix object, which aggregates effects
AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
//apply your AVAudioMix object to the AVPlayerItem
[playerItem setAudioMix:audioMix];
//refresh the AVPlayer object, and play the track
[self.audioPlayer replaceCurrentItemWithPlayerItem: playerItem];
[self.audioPlayer play];
}
self.currentPlaybackState = MPMusicPlaybackStatePlaying;
}
- (void) pauseMusic
{
if(self.currentPlaybackState == MPMusicPlaybackStatePaused)
return;
[self.audioPlayer pause];
self.currentPlaybackState = MPMusicPlaybackStatePaused;
}
- (void) skipSongForward
{
//adjust self.currentTrack to be the next object in self.currentPlaylist
//start the new track in a manner similar to that used in -playMusic
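    // A possible sketch of that logic (illustrative, not from the original post): it assumes
    // currentPlaylist.items is in play order and reuses the -playMusic flow shown earlier.
    NSUInteger index = [self.currentPlaylist.items indexOfObject: self.currentTrack];
    if(index != NSNotFound && index + 1 < [self.currentPlaylist.items count])
    {
        self.currentTrack = [self.currentPlaylist.items objectAtIndex: index + 1];
        self.currentPlaybackState = MPMusicPlaybackStateStopped; //so -playMusic rebuilds the player item
        [self playMusic];
    }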
}
- (void) skipSongBackward
{
float currentTime = CMTimeGetSeconds(self.audioPlayer.currentItem.currentTime); //avoids integer division on the CMTime fields
//if we're more than a second into the song, just skip back to the beginning of the current track
if(currentTime > 1.0)
{
[self.audioPlayer seekToTime: CMTimeMake(0, 1)];
}
else
{
//otherwise, adjust self.currentTrack to be the previous object in self.currentPlaylist
//start the new track in a manner similar to that used in -playMusic
}
}
//Set volume mid-song - more or less the same process we used in -playMusic
- (void) setMusicVolume:(float)vol
{
AVPlayerItem* item = self.audioPlayer.currentItem;
AVAssetTrack* track = [[item.asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableAudioMixInputParameters* audioParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
[audioParams setVolume: vol atTime:kCMTimeZero];
[audioParams setTrackID: track.trackID];
AVMutableAudioMix* audioMix = [AVMutableAudioMix audioMix];
[audioMix setInputParameters: [NSArray arrayWithObject: audioParams]];
[item setAudioMix:audioMix];
}
@end
Please forgive any errors you see - let me know in the comments and I'll fix them. Otherwise, I hope this helps if anyone runs into the same challenge I did!
Actually, I found a really easy way to do this by loading iPod URLs from MPMusicPlayerController, but then doing playback through AVAudioPlayer.
// Get-da iTunes player thing
MPMusicPlayerController* iTunes = [MPMusicPlayerController iPodMusicPlayer];
// whazzong
MPMediaItem *currentSong = [iTunes nowPlayingItem];
// whazzurl
NSURL *currentSongURL = [currentSong valueForProperty:MPMediaItemPropertyAssetURL];
info( "AVAudioPlayer playing %s", [currentSongURL.absoluteString UTF8String] ) ;
// mamme AVAudioPlayer
NSError *err;
avAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:currentSongURL error:&err] ;
if( err!=nil )
{
error( "AVAudioPlayer couldn't load %s", [currentSongURL.absoluteString UTF8String] ) ;
}
avAudioPlayer.numberOfLoops = -1; //infinite
// Play that t
[avAudioPlayer prepareToPlay] ;
[avAudioPlayer play];
[avAudioPlayer setVolume:0.5]; // set the AVAUDIO PLAYER's volume to only 50%. This
// does NOT affect system volume. You can adjust this music volume anywhere else too.