I have an application in which the user can select from local video files. When one of those thumbnails gets tapped, the user is presented with a new view which has a custom video player I've made that presents the video.
This works flawlessly, but only up to a point. The funny thing is that if the user selects a new video (thus getting presented a new view and initializing a new custom video player object) exactly 5 times, the underlying AVPlayerLayer that is used to present the visuals from the player renders black, even though the underlying asset still seems to load correctly (the player interface still holds the correct duration for the video and so forth).
When a new custom media player object gets initialized (which happens when the view controller for the media player's containing view gets loaded), this is the part of the initializer method that sets up the AVPlayer and its associated item:
// Start to load the specified asset
mediaAsset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
if (mediaAsset == nil)
    NSLog(@"The media asset is zero!!!");

// Now we need to asynchronously load in the tracks of the specified asset (like audio and video tracks). We load them asynchronously to avoid having the entire app UI freeze while loading occurs
NSString *keyValueToLoad = @"tracks";

// When loading the tracks asynchronously we also specify a completionHandler, which is the block of code that should be executed once the loading has either finished or for some reason failed
[mediaAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:keyValueToLoad] completionHandler:^
{
    // When this block gets executed we check for potential errors or see if the asset loaded successfully
    NSError *error = nil;
    AVKeyValueStatus trackStatus = [mediaAsset statusOfValueForKey:keyValueToLoad error:&error];
    if (error != nil)
    {
        NSLog(@"Error: %@", error.description);
    }

    //switch (trackStatus) {
    //case AVKeyValueStatusLoaded:
    if (trackStatus == AVKeyValueStatusLoaded)
    {
        NSLog(@"Did load properly!");

        mediaItem = [AVPlayerItem playerItemWithAsset:mediaAsset];
        if (mediaItem.error == nil)
            NSLog(@"Everything went fine!");
        if (mediaItem == nil)
            NSLog(@"THE MEDIA ITEM WAS NIL");

        [mediaItem addObserver:self forKeyPath:@"status" options:0 context:&itemStatusContext];
        mediaContentPlayer = [[AVPlayer alloc] initWithPlayerItem:mediaItem];
        [mediaContentView setPlayer:mediaContentPlayer];
        //mediaContentView = [AVPlayerLayer playerLayerWithPlayer:mediaContentPlayer];

        [activeModeViewBlocked configurePlaybackSliderWithDuration:mediaItem.duration];
        originalDuration = mediaItem.duration.value / mediaItem.duration.timescale;

        // We will subscribe to a timeObserver on the player to check for the current playback time of the movie within a specified interval. Doing so will allow us to frequently update the user interface with correct information
        playbackTimeObserver = [mediaContentPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 50) queue:dispatch_get_main_queue() usingBlock:^(CMTime time)
        {
            NSLog(@"TIME UPDATED!");
            [activeModeViewBlocked updatePlaybackSlider:time];
        }];

        [self syncUI];
    }

    if (trackStatus == AVKeyValueStatusFailed)
    {
        NSLog(@"Something failed!");
    }

    if (trackStatus == AVKeyValueStatusCancelled)
    {
        NSLog(@"Something was cancelled!");
    }
}];
Now if I initialize this custom media player object 5 times exactly, it always starts to render black screens.
Does anyone have any idea of why this could be happening?
This bit me too. There is a limit on the number of concurrent video players that AVFoundation will allow. That number is four (for iOS 4.x; more recently the number seems to have increased - for example, on iOS 7 I've had up to eight on one screen with no issue). That is why it is going black on the fifth one. You can't even assume you'll get four, as other apps may need a 'render pipeline'.
This API causes a render pipeline:
+[AVPlayer playerWithPlayerItem:]
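One way to stay under that limit, assuming only one playback view needs to be on screen at a time, is to keep a single AVPlayer alive for the whole app and swap its item instead of allocating a fresh player every time a view is presented. A minimal sketch (the SharedVideoPlayer class and its method names are made up for illustration, not part of the question's code):

#import <AVFoundation/AVFoundation.h>

// Sketch: reuse one AVPlayer (one render pipeline) instead of creating a new one per view.
@interface SharedVideoPlayer : NSObject
@property (strong, nonatomic) AVPlayer *player;
+ (instancetype)sharedPlayer;
- (void)showAsset:(AVURLAsset *)asset inLayer:(AVPlayerLayer *)layer;
@end

@implementation SharedVideoPlayer

+ (instancetype)sharedPlayer {
    static SharedVideoPlayer *shared = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        shared = [[SharedVideoPlayer alloc] init];
        shared.player = [[AVPlayer alloc] init]; // created once, reused for every video
    });
    return shared;
}

- (void)showAsset:(AVURLAsset *)asset inLayer:(AVPlayerLayer *)layer {
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    [self.player replaceCurrentItemWithPlayerItem:item]; // swap the item, keep the player
    layer.player = self.player;
}

@end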
To understand this question, return with me now through the WWDC time machine to the distant past, 2014, when Action extensions were introduced and explained in this video:
https://developer.apple.com/videos/play/wwdc2014/217/
About halfway through, in slide 71, about minute 23:30, the presenter gives instructions for returning a value back to the calling app (the app where the user tapped our Action extension's icon in an activity view):
- (IBAction)done:(id)sender {
NSData *data = self.contents;
NSItemProvider *itemProvider =
[[NSItemProvider alloc] initWithItem:data typeIdentifier:MyDocumentUTI];
NSExtensionItem *item = [[NSExtensionItem alloc] init];
item.attachments = @[itemProvider];
}
A moment later, slide 75, about minute 26, we see how the app that put up the activity view controller is supposed to unwrap that envelope to retrieve the result data:
- (void)setupActivityViewController {
UIActivityViewController *controller;
controller.completionWithItemsHandler =
^(NSString *activityType, BOOL completed,
NSArray *returnedItems, NSError *error) {
if (completed && (returnedItems.count > 0)) {
// process the result items
}
};
}
So my question is: is that for real? Has anyone within the sound of my voice ever done either of those things? Namely:
Does your app have an Action extension that returns a value to the caller?
Does your app put up an activity view controller that receives the result of some arbitrary unknown Action extension and does something with the value?
I ask because (1) I have never seen (on my iPhone) an Action extension that actually returns a value, and (2) the code elided in "process the result items" seems to me to be complete hand-waving, because how would my app even know what kind of data to expect?
I have come to believe that this code is an aspirational pipe dream with no corresponding reality. But I would be delighted to be told I'm wrong.
I'm building an app with multiple view controllers, and I have coded an audio file to play at startup. That works fine, and when I tap the button to view a different screen the audio file still plays without skipping a beat, just like it's supposed to. My problem arises when I tap the button to go back to the main screen: the audio file plays over itself, reminding me of the song Row, Row, Row Your Boat. The app re-reads the code that tells it to play the audio file, thus playing it all over again, and I can't figure out how to make it not do that. I have coded the app to stop the audio when the start game button is tapped, which is what I want it to do, but not until then. I just need help getting the app not to play the audio file over itself when going back to the main screen. The audio file is coded to play infinitely until the "start" button is tapped. If anyone can make sense of what I'm trying to say, please help me code this thing correctly. Thanks to anyone who can make it work right.
Here's my code:
- (void)viewDidLoad
{
    NSString *introMusic = [[NSBundle mainBundle] pathForResource:@"invadingForces" ofType:@"mp3"];
    audioPlayer0 = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:introMusic] error:NULL];
    audioPlayer0.delegate = self;
    audioPlayer0.numberOfLoops = -1;
    [audioPlayer0 play];
}
The problem is that you start a sound in a local variable when your view is loaded, start it playing on endless repeat, and then forget about it. Then you close the view controller, leaving the now-forgotten audio player playing. The next time you invoke the view controller, its viewDidLoad method creates another audio player and starts that one playing too, and then forgets about that one as well. Every time you open a new copy of that view controller, you start yet another sound player, adding another voice to your round of "Row, Row, Row Your Boat."
The naive solution is to put the code that starts the sound player in the app delegate. Set up the AVAudioPlayer as a property of your app delegate, create a startPlaying method and a stopPlaying method, and in your didFinishLaunchingWithOptions method call startPlaying.
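For that naive approach, a rough sketch of what the app delegate could look like (property and method names such as introPlayer, startPlaying, and stopPlaying are only placeholders matching the description above, not code from the question):

// AppDelegate.h (sketch)
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h> // link AVFoundation to the target

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (strong, nonatomic) UIWindow *window;
@property (strong, nonatomic) AVAudioPlayer *introPlayer; // lives as long as the app, not a view controller
- (void)startPlaying;
- (void)stopPlaying;
@end

// AppDelegate.m (sketch)
- (void)startPlaying {
    if (self.introPlayer == nil) {
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"invadingForces" withExtension:@"mp3"];
        self.introPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
        self.introPlayer.numberOfLoops = -1; // loop forever until stopPlaying is called
    }
    if (!self.introPlayer.isPlaying) {
        [self.introPlayer play]; // safe to call repeatedly; it won't start a second copy
    }
}

- (void)stopPlaying {
    [self.introPlayer stop];
}

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [self startPlaying]; // start the looping intro music once for the whole app
    return YES;
}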
It's cleaner app design not to put app functionality in your app delegate, but instead to create a singleton that manages sound playback. (Search on "iOS singleton design pattern" to learn more.) Create an appDidLaunch method in the singleton, and call appDidLaunch from didFinishLaunchingWithOptions to start playing your sound. That way the app delegate doesn't need app-specific logic in it; it simply calls appDidLaunch and goes on its way.
EDIT:
If you want to call a method in the app delegate, and your app delegate is declared as:
@interface AppDelegate : UIResponder <UIApplicationDelegate>
Then you'd call it from another file like this:
First, import the app delegate's header:
#import "AppDelegate.h"
And the actual code to call your app delegate's stopPlaying method:
//Get a pointer to the application object.
UIApplication *theApp = [UIApplication sharedApplication];
//ask the application object for a pointer to the app delegate, and cast it
//to our custom "AppDelegate" class. If your app delegate uses a different
//class name, use that name here instead of "AppDelegate"
AppDelegate *theAppDelegate = (AppDelegate *)theApp.delegate;
[theAppDelegate stopPlaying];
Here's some example code to wrap an AVAudioPlayer in a singleton -
BWBackgroundMusic.h
@interface BWBackgroundMusic : NSObject
// singleton getter
+ (instancetype)sharedMusicPlayer;
/* public interface required to control the AVAudioPlayer instance is as follows -
start - plays from start - if playing stops and plays from start
stop - stops and returns play-head to start regardless of state
pause - stops and leaves play-head where it is - if already paused or stopped does nothing
continue - continues playing from where the play-head was left - if playing does nothing
replace audio track with new file - replaceBackgroundMusicWithFileOfName:
set background player to nil - destroyBackgroundMusic
NOTE:- change default track filename in .m #define */
// transport like methods
- (void)startBackgroundMusic;
- (void)stopBackgroundMusic;
- (void)pauseBackgroundMusic;
- (void)continueBackgroundMusic;
// audio source management
- (void)replaceBackgroundMusicWithFileOfName:(NSString*)audioFileName startPlaying:(BOOL)startPlaying;
- (void)destroyBackgroundMusic;
@end
BWBackgroundMusic.m
#import "BWBackgroundMusic.h"
#import <AVFoundation/AVFoundation.h> // must link to project first
#define DEFAULT_BACKGROUND_AUDIO_FILENAME @"invadingForces.mp3"
@interface BWBackgroundMusic ()
@property (strong, nonatomic) AVAudioPlayer *backgroundMusicPlayer;
@end
@implementation BWBackgroundMusic
#pragma mark Singleton getter
+ (instancetype)sharedMusicPlayer {
static BWBackgroundMusic *musicPlayer = nil;
static dispatch_once_t onceToken;
dispatch_once (&onceToken, ^{
musicPlayer = [[self alloc] init];
});
//NSLog(@"sample rate of file is %f",[musicPlayer currentSampleRate]);
return musicPlayer;
}
#pragma mark Initialiser
- (id)init {
//NSLog(@"sharedMusicPlayer from BWBackgroundMusic.h init called...");
if (self = [super init]) {
// self setup _backgroundMusicPlayer here...
// configure the audio player
NSURL *musicURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] resourcePath], DEFAULT_BACKGROUND_AUDIO_FILENAME]];
NSError *error;
if (_backgroundMusicPlayer == nil) {
_backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL error:&error];
}
if (_backgroundMusicPlayer == nil) {
NSLog(@"%@",[error description]);
} else {
[self makePlaybackInfinite];
[_backgroundMusicPlayer play];
}
}
return self;
}
#pragma mark Selfish methods
- (void)makePlaybackInfinite {
// access backing ivar directly because this is also called from init method
if (_backgroundMusicPlayer) {
_backgroundMusicPlayer.numberOfLoops = -1;
}
}
- (CGFloat)currentSampleRate {
NSDictionary *settingsDict = [self.backgroundMusicPlayer settings];
NSNumber *sampleRate = [settingsDict valueForKey:AVSampleRateKey];
return [sampleRate floatValue];
}
#pragma mark Transport like methods
- (void)startBackgroundMusic {
// plays from start - if playing stops and plays from start
if (self.backgroundMusicPlayer.isPlaying) {
[self.backgroundMusicPlayer stop];
self.backgroundMusicPlayer.currentTime = 0;
[self.backgroundMusicPlayer prepareToPlay];// this is not required as play calls this implicitly if not already prepared
[self.backgroundMusicPlayer play];
}
else {
self.backgroundMusicPlayer.currentTime = 0;
[self.backgroundMusicPlayer prepareToPlay];
[self.backgroundMusicPlayer play];
}
}
- (void)stopBackgroundMusic {
// stops and returns play-head to start regardless of state and prepares to play
if (self.backgroundMusicPlayer.isPlaying) {
[self.backgroundMusicPlayer stop];
self.backgroundMusicPlayer.currentTime = 0;
[self.backgroundMusicPlayer prepareToPlay];
}
else {
self.backgroundMusicPlayer.currentTime = 0;
[self.backgroundMusicPlayer prepareToPlay];
}
}
- (void)pauseBackgroundMusic {
// stops and leaves play-head where it is - if already paused or stopped does nothing
if (self.backgroundMusicPlayer.isPlaying) {
[self.backgroundMusicPlayer pause];
}
}
- (void)continueBackgroundMusic {
// continues playing from where the play-head was left - if playing does nothing
if (!self.backgroundMusicPlayer.isPlaying) {
[self.backgroundMusicPlayer play];
}
}
#pragma mark Content management
- (void)replaceBackgroundMusicWithFileOfName:(NSString*)audioFileName startPlaying:(BOOL)startPlaying {
// construct filepath
NSString *filePath = [NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] resourcePath], audioFileName];
// make a url from the filepath
NSURL *fileUrl = [NSURL fileURLWithPath:filePath];
// construct player and prepare
NSError *error;
self.backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:&error];
[self.backgroundMusicPlayer prepareToPlay];
[self makePlaybackInfinite];
// if startplaying then play
if (startPlaying) {
[self.backgroundMusicPlayer play];
}
}
- (void)destroyBackgroundMusic {
// stop playing if playing
[self stopBackgroundMusic];
// destroy by setting background player to nil
self.backgroundMusicPlayer = nil;
}
@end
To use it, simply call [BWBackgroundMusic sharedMusicPlayer]; This will instantiate the singleton if it isn't already instantiated, start the player automatically, and loop infinitely by default.
Furthermore, you can control it from any class that imports BWBackgroundMusic.h.
For example, to pause the player, use
[[BWBackgroundMusic sharedMusicPlayer] pauseBackgroundMusic];
In my game (I'm using SpriteKit, and therefore only support iOS 7), when a player reaches his first 10 points, he is awarded an achievement. I've implemented the achievement method as follows:
-(void) First10Points
{
GKAchievement *achievement = [[GKAchievement alloc] initWithIdentifier:@"Achievement_First10Points"];
if (achievement)
{
achievement.showsCompletionBanner = YES;
achievement.percentComplete = 100.0;
NSArray *achievements = [NSArray arrayWithObjects:achievement, nil];
[GKAchievement reportAchievements:achievements withCompletionHandler:^(NSError *error) {
if (error != nil) {
NSLog(@"Error in reporting achievements: %@", error);
}
}];
}
}
This works fine and the achievement is indeed earned at 10 points, with the Game Center banner indicating this to the player during the game. However, when the banner disappears it reappears after a second or so, and it continues to do so until I terminate the game. The game can still be played while it does this loop. I can't seem to understand why this happens, and I have not come across this problem while searching the web.
Does anyone have an idea? Or should I implement my achievements in another way?
One possibility is that you're calling the First10Points method multiple times. You should check whether the player has already reached the 10-point achievement before reporting it again; if they have, don't call the method.
Try adding a variable like BOOL first10 = NO; When your (score == 10) check passes, set first10 = YES; and every time before you call First10Points, make sure that first10 == NO.
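A minimal sketch of that guard (the property and method names here are placeholders, not from the question's code):

// In the class that tracks the score (sketch)
@property (nonatomic) BOOL first10Reported;

- (void)playerScoreDidChange:(NSInteger)score {
    if (score >= 10 && !self.first10Reported) {
        self.first10Reported = YES; // make sure the achievement is reported only once per session
        [self First10Points];
    }
}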
When I switch views, the music keeps playing in the background, which is fine for my app. But when the user comes back to the initial view, the music starts again over the original, so the user hears the music doubled. I already have some code to check whether the sound is already playing, but it doesn't work. :(
Any thoughts?
if (audioPlayer.playing == 0) {
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/CheeZeeLab.mp3", [[NSBundle mainBundle] resourcePath]]];
    NSError *error;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.numberOfLoops = 0;
    audioPlayer.volume = 1;
    [audioPlayer prepareToPlay];
    [audioPlayer stop];
    if (audioPlayer == nil)
        NSLog(@"werkt niet"); // "doesn't work"
    else
        [audioPlayer play];
}
else {
}
I bet your problem is in how you transition back to your initial view.
If you're pushing (or doing a segue from the child back to the parent... i.e. a circular segue), you're creating a new instance of your parent view controller.
And creating a new instance starts a fresh version of the audio player playing that sound.
You need to properly pop the view to go back to the previous view controller.
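For example, if the child was pushed onto a navigation stack, go back with a pop (or a dismiss for a modal presentation) rather than with another segue to a new copy of the parent:

// Pushed onto a UINavigationController: return to the existing parent instance
[self.navigationController popViewControllerAnimated:YES];

// Presented modally: dismiss instead
[self dismissViewControllerAnimated:YES completion:nil];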
Depending on how you have set up the AVAudioPlayer (with a manual setter and getter, with a property, or even in the constructor), you are confronted with some issues.
When you initialize the ViewController you allocate an AVAudioPlayer, and as long as it keeps playing it is retained and not deallocated. The ViewController won't be deallocated either until all of its properties have been released.
After going back to the main view and setting up the same ViewController again, my guess is that you are not pushing the same instance onto the stack, but rather making a new instance and pushing that one, which also creates a new AVAudioPlayer.
This is of course a bit of guesswork, as I can't see your entire code. But if this is the case, I think you could set things up a bit better: make sure the ViewController gets properly deallocated, releasing all of its properties, before making a new instance to push onto the stack.
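If you want to verify that assumption, a quick sketch is to log from dealloc in the initial view controller; if the log never fires when you navigate away and back, the old instance (and its audio player) is still alive:

// In the initial view controller (sketch, for diagnosis only)
- (void)dealloc {
    NSLog(@"Initial view controller deallocated, along with its audio player");
}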
I'd like to crossfade from one track to the next in a Spotify enabled app. Both tracks are Spotify tracks, and since only one data stream at a time can come from Spotify, I suspect I need to buffer (I think I can read ahead 1.5 x playback speed) the last few seconds of the first track, start the stream for track two, fade out one and fade in two using an AudioUnit.
I've reviewed the sample apps Viva (https://github.com/iKenndac/Viva) and SimplePlayer with EQ (https://github.com/iKenndac/SimplePlayer-with-EQ), and tried to get my mind around the SPCircularBuffer, but I still need help. Could someone point me to another example or help bullet-point a track crossfade game plan?
Update: Thanks to iKenndac, I'm about 95% there. I'll post what I have so far:
In SPPlaybackManager.m, in initWithPlaybackSession:(SPSession *)aSession, I added:
self.audioController2 = [[SPCoreAudioController alloc] init];
self.audioController2.delegate = self;
and in
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
...
self.audioController.audioOutputEnabled = self.playbackSession.isPlaying;
// for crossfade, add
self.audioController2.audioOutputEnabled = self.playbackSession.isPlaying;
and added a new method based on playTrack
-(void)crossfadeTrack:(SPTrack *)aTrack callback:(SPErrorableOperationCallback)block {
// switch audiocontroller from current to other
if (self.playbackSession.audioDeliveryDelegate == self.audioController)
{
self.playbackSession.audioDeliveryDelegate = self.audioController2;
self.audioController2.delegate = self;
self.audioController.delegate = nil;
}
else
{
self.playbackSession.audioDeliveryDelegate = self.audioController;
self.audioController.delegate = self;
self.audioController2.delegate = nil;
}
if (aTrack.availability != SP_TRACK_AVAILABILITY_AVAILABLE) {
if (block) block([NSError spotifyErrorWithCode:SP_ERROR_TRACK_NOT_PLAYABLE]);
self.currentTrack = nil;
}
self.currentTrack = aTrack;
self.trackPosition = 0.0;
[self.playbackSession playTrack:self.currentTrack callback:^(NSError *error) {
if (!error)
self.playbackSession.playing = YES;
else
self.currentTrack = nil;
if (block) {
block(error);
}
}];
}
this starts a timer for crossfade
crossfadeTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                  target:self
                                                selector:@selector(crossfadeCountdown)
                                                userInfo:nil
                                                 repeats:YES];
And in order to keep the first track playing after its data has loaded in SPCoreAudioController.m I changed target buffer length:
static NSTimeInterval const kTargetBufferLength = 20;
and in SPSession.m : end_of_track(sp_session *session) {
I removed
// sess.playing = NO;
I call preloadTrackForPlayback: about 15 seconds before end of track, then crossfadeTrack: at 10 seconds before.
Then set crossfadeCountdownTime = [how many seconds you want the crossfade]*2;
I fade the volume over the crossfade with:
- (void) crossfadeCountdown
{
[UIAppDelegate.playbackSPManager setVolume:(1- (((float)crossfadeCountdownTime/ (thisCrossfadeSeconds*2.0)) *0.2) )];
crossfadeCountdownTime -= 0.5;
if (crossfadeCountdownTime == 1.0)
{
NSLog(@"Crossfade countdown done");
crossfadeCountdownTime = 0;
[crossfadeTimer invalidate];
crossfadeTimer = nil;
[UIAppDelegate.playbackSPManager setVolume:1.0];
}
}
I'll keep working on it, and update if I can make it better. Thanks again to iKenndac for his always spot-on help!
There isn't a pre-written crossfade example that I'm aware of that uses CocoaLibSpotify. However, a (perhaps not ideal) game plan would be:
Make two separate audio queues. SPCoreAudioController is an encapsulation of an audio queue, so you should just be able to instantiate two of them.
Play music as normal to one queue. When you're approaching the end of the track, call SPSession's preloadTrackForPlayback:callback: method with the next track to get it ready.
When all audio data for the playing track has been delivered, SPSession will fire the audio delegate method sessionDidEndPlayback:. This means that all audio data has been delivered. However, since CocoaLibSpotify buffers the audio from libspotify, there's still some time before audio stops.
At this point, start playing the new track but divert the audio data to the second audio queue. Start ramping down the volume of the first queue while ramping up the volume of the next one. This should give a pleasing crossfade.
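As a rough sketch of that ramp, assuming you keep references to both SPCoreAudioController instances and that they expose a 0.0-1.0 volume the way SPPlaybackManager's volume setting does (if not, apply the gain wherever you control output level); the crossfadeTimer/crossfadeElapsed properties and method names are made up for illustration:

// Sketch of a linear crossfade between the two audio queues.
static NSTimeInterval const kCrossfadeDuration = 5.0; // seconds of overlap
static NSTimeInterval const kCrossfadeStep = 0.1;     // timer tick

- (void)beginCrossfade {
    self.crossfadeElapsed = 0.0;
    self.crossfadeTimer = [NSTimer scheduledTimerWithTimeInterval:kCrossfadeStep
                                                            target:self
                                                          selector:@selector(crossfadeTick:)
                                                          userInfo:nil
                                                           repeats:YES];
}

- (void)crossfadeTick:(NSTimer *)timer {
    self.crossfadeElapsed += kCrossfadeStep;
    double progress = MIN(self.crossfadeElapsed / kCrossfadeDuration, 1.0);
    self.audioController.volume  = 1.0 - progress; // outgoing queue fades out
    self.audioController2.volume = progress;       // incoming queue fades in
    if (progress >= 1.0) {
        [timer invalidate];
        self.crossfadeTimer = nil;
    }
}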
A few pointers:
In SPCoreAudioController.m, you'll find the following line, which defines how much audio CocoaLibSpotify buffers, in seconds. If you want a bigger crossfade, you'll need to increase it.
static NSTimeInterval const kTargetBufferLength = 0.5;
Since you get audio data at a maximum of 1.5x actual playback speed, be careful not to do, for example, a 5 second crossfade when the user has just skipped near to the end of the track. You might not have enough audio data available to pull it off.
Take a good look at SPPlaybackManager.m. This class is the interface between CocoaLibSpotify and Core Audio. It's not too complicated, and understanding it will get you a long way. SPCoreAudioController and SPCircularBuffer are pretty much implementation details of getting the audio into Core Audio, and you shouldn't need to understand their implementations to achieve what you want.
Also, make sure you understand the various delegates SPSession has. The audio delivery delegate only has one job - to receive audio data. The playback delegate gets all other playback events - when audio has finished being delivered to the audio delivery delegate, etc. There's nothing stopping one class being both, but in the current implementation, SPPlaybackManager is the playback delegate, which creates an instance of SPCoreAudioController to be the audio delivery delegate. If you modify SPPlaybackManager to have two Core Audio controllers and alternate which one is the audio delivery delegate, you should be golden.