cocos2d audio file issue - ios

I am using a CDLongAudioSource to play music in my cocos2d game. The problem is that the game sometimes crashes when the audio file is loaded but never played: if I leave the scene for another one and then return to the scene where the audio file needs to play, the app crashes.
Function to play file:
-(void)playMyEffect:(NSString *)audioFile {
    CDLongAudioSource *currentSound = [[CDAudioManager sharedManager] audioSourceForChannel:kASC_Right];
    //[currentSound load:@""];
    NSLog(@"file path to play %@", audioFile);
    [currentSound load:audioFile];
    currentSound.delegate = self;
    currentSound.backgroundMusic = NO;
    self.isSpeechComplete = NO;
    [currentSound play];
}

Make sure you stop the player and release its memory reference before navigating to the other scene. I think this should work.
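A minimal sketch of that clean-up, assuming the same kASC_Right channel used in playMyEffect: (the method name stopMyEffect is just illustrative). Call it, for example from the scene's onExit, before replacing the scene:
-(void)stopMyEffect {
    // Hypothetical clean-up: stop the channel and detach the delegate so no
    // callbacks arrive after the scene has been torn down.
    CDLongAudioSource *currentSound = [[CDAudioManager sharedManager] audioSourceForChannel:kASC_Right];
    [currentSound stop];
    currentSound.delegate = nil;
    self.isSpeechComplete = YES;
}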

Related

Is it possible to play multiple sound files simultaneously (like a mix) on Apple Watch?

This is the code I am using to play sounds with AVAudioPlayerNode. It plays only the last sound in the beats array. On iPhone, all sounds from the beats array play simultaneously with the same function.
-(void)playMix {
    for (int i = 0; i < mix.beatsArray.count; i++) {
        beat = [[WatchBeatObject alloc] initWithFileName:mix.beatsArray[i].fileName];
        [beat.audioPlayerNode play];
        [beat.audioPlayerNode scheduleBuffer:beat.buffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:^{
        }];
    }
}
N.B: Method initWithFileName: handles initializing and creating AVAudioPlayerNode and everything needed.
Thank you in advance.
Does that WatchBeatObject have any option to set the AVAudioSession category to AVAudioSessionCategoryAmbient? If yes, set it; that category lets the different sounds mix.
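For reference, a minimal sketch of setting that category on the shared session before playback starts; where exactly this belongs depends on how WatchBeatObject sets up its engine, so treat it as an assumption rather than that class's own code:
#import <AVFoundation/AVFoundation.h>

// Sketch: an ambient category lets concurrently scheduled players mix with each other.
NSError *sessionError = nil;
if (![[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&sessionError]) {
    NSLog(@"Could not set audio session category: %@", sessionError);
}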

AVPlayer - Detect action in default control player

I use AVPlayer with its default controls, not a custom control.
AVPlayer *player = [AVPlayer playerWithURL:[NSURL URLWithString:@"https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4"]];
AVPlayerViewController *avPlayerVC = [[AVPlayerViewController alloc] init];
avPlayerVC.player = player;
avPlayerVC.view.translatesAutoresizingMaskIntoConstraints = NO;
[self addChildViewController:avPlayerVC];
[self.view addSubview:avPlayerVC.view];
[avPlayerVC didMoveToParentViewController:self]; // complete the child view controller containment
[player play];
... (auto layout here)
The AVPlayer works normally, but I can't handle any events from it.
When the user taps the previous or next button, I want to detect the event so I can switch to other URLs. What I have tried so far does not work.
I tried AVQueuePlayer with 3 URLs, but the video only moves to the next URL when the current item finishes; the next/previous buttons don't work.
How do I receive events from the default AVPlayer controls? Thanks so much! Sorry for my bad English.
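No answer is recorded here, but one commonly used building block, sketched below under the assumption that the player is held in a property, is to key-value observe the player's currentItem. This reports when the queue advances to a different item (whatever triggered the change); it does not intercept the next/previous buttons themselves.
// Register once, e.g. after creating the AVQueuePlayer (self.player is an assumed property).
[self.player addObserver:self
              forKeyPath:@"currentItem"
                 options:NSKeyValueObservingOptionNew
                 context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"currentItem"]) {
        AVPlayerItem *newItem = change[NSKeyValueChangeNewKey];
        NSLog(@"Player moved to item: %@", newItem);
        // update any related URLs / UI here
    }
}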

how to pause youtubevideo playing inside app?

Hi, I'm using XCDYouTubeVideoPlayer to play YouTube videos inside my app, and everything works fine. The problem is that after navigating to the next view controller the video keeps playing. How do I pause or stop it?
My code to play:
UIView *videoContainerView = [[UIView alloc] initWithFrame:CGRectMake(0, 200, 320, 200)];
[self.view addSubview:videoContainerView];
XCDYouTubeVideoPlayerViewController *videoPlayerViewController = [[XCDYouTubeVideoPlayerViewController alloc] initWithVideoIdentifier:vd];
// NSLog(@"%@", inVideosObj.video);
[videoPlayerViewController presentInView:videoContainerView];
[videoPlayerViewController.moviePlayer play];
My code to pause:
XCDYouTubeVideoPlayerViewController *videoPlayerViewController = [[XCDYouTubeVideoPlayerViewController alloc] initWithVideoIdentifier:vd];
[videoPlayerViewController.moviePlayer pause];
I used the above code to pause the video, but it's not working. Please help; I've been stuck on this for a long time.
Thanks.
You are creating a new instance of XCDYouTubeVideoPlayerViewController. Instead, you should store your XCDYouTubeVideoPlayerViewController instance in a property when you first create it and use this one when you need to stop the video. See DemoInlineViewController.m from the demo project for a concrete example.
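A minimal sketch of that suggestion (the property name is illustrative):
// Keep a strong reference so the same instance can be paused later.
@property (nonatomic, strong) XCDYouTubeVideoPlayerViewController *videoPlayerViewController;

// Playing:
self.videoPlayerViewController = [[XCDYouTubeVideoPlayerViewController alloc] initWithVideoIdentifier:vd];
[self.videoPlayerViewController presentInView:videoContainerView];
[self.videoPlayerViewController.moviePlayer play];

// Pausing before navigating away, e.g. in viewWillDisappear:
[self.videoPlayerViewController.moviePlayer pause];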

ios SystemSound will not respond to volume buttons

I've used SystemSound in my app to play simple sound effects. In addition, I play a music video through MPMoviePlayerController; when I turn the volume up or down, the music from the video responds as intended.
But the system sounds do not respond to the volume buttons. I'm playing the system sounds when the user taps certain areas in the app. Here's a snippet of my code for that:
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    SystemSoundID completeSound = 0;
    // yellow folder in Xcode doesn't need the subdirectory param
    // blue folder (true folder) will need to use subdirectory:@"dirname"
    NSURL *sound_path = [[NSBundle mainBundle] URLForResource:target_sound_filename withExtension:@"wav"];
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)sound_path, &completeSound);
    AudioServicesPlaySystemSound(completeSound);
}
P.S. I double-checked that Settings -> Sounds -> Ringer and Alerts -> Change With Buttons is set to ON (as I read in some other SO answers that leaving this option OFF will cause system sounds to not respond to the volume buttons).
The reason for using SystemSound is that it gave the most accurate and responsive results when playing multiple sounds (like in a game).
I'd prefer not to use OpenAL if possible (even through 3rd-party sound libraries like Finch or CocosDenshion).
Any ideas?
Use the AVAudioPlayer class for playing sounds that are controlled by the user's volume settings (non-system sounds).
You can retain instances of AVAudioPlayer for each sound file that you use regularly and simply call the play method. Use prepareToPlay to preload the buffers.
Cheers to Marcus for suggesting that I could retain instances of AVAudioPlayer for each sound file and use prepareToPlay to preload the sounds. It might be of help to others looking for the same solution, so here is how I did it (feel free to comment if anyone has suggestions for improvements):
// top of ViewController.m
@property (nonatomic, strong) NSMutableDictionary *audioPlayers;
@synthesize audioPlayers = _audioPlayers;

// in viewDidLoad
self.audioPlayers = [NSMutableDictionary new];

// creating the instances and adding them to the NSMutableDictionary in order to retain them
// soundFile is just an NSString containing the name of the wav file
NSString *soundFile = [[NSBundle mainBundle] pathForResource:s ofType:@"wav"];
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:soundFile] error:nil];
//audioPlayer.numberOfLoops = -1;
[audioPlayer prepareToPlay];
// add to dictionary with the filename (omitting the extension) as key
[self.audioPlayers setObject:audioPlayer forKey:s];

// then I use the following to play the sound later on (I have it on a tap event)
// get the correct AVAudioPlayer instance for this sound, and play it
AVAudioPlayer *foo = [self.audioPlayers objectForKey:target_sound_filename];
[foo play];

// also, I'm not sure how ARC will treat the strong property; I'm setting it to nil in dealloc at the moment
-(void)dealloc {
    self.audioPlayers = nil;
}

How do I audio crossfade using cocoalibspotify?

I'd like to crossfade from one track to the next in a Spotify-enabled app. Both tracks are Spotify tracks, and since only one data stream at a time can come from Spotify, I suspect I need to buffer the last few seconds of the first track (I think I can read ahead at 1.5× playback speed), start the stream for track two, then fade out track one and fade in track two using an AudioUnit.
I've reviewed these sample apps:
Viva - https://github.com/iKenndac/Viva
SimplePlayer with EQ - https://github.com/iKenndac/SimplePlayer-with-EQ
and tried to get my mind around the SPCircularBuffer, but I still need help. Could someone point me to another example or help bullet-point a track crossfade game plan?
Update: Thanks to iKenndac, I'm about 95% there. I'll post what I have so far:
in SPPlaybackManager.m: initWithPlaybackSession:(SPSession *)aSession {
added:
self.audioController2 = [[SPCoreAudioController alloc] init];
self.audioController2.delegate = self;
and in
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
...
self.audioController.audioOutputEnabled = self.playbackSession.isPlaying;
// for crossfade, add
self.audioController2.audioOutputEnabled = self.playbackSession.isPlaying;
and added a new method based on playTrack
-(void)crossfadeTrack:(SPTrack *)aTrack callback:(SPErrorableOperationCallback)block {
    // switch the audio controller from current to other
    if (self.playbackSession.audioDeliveryDelegate == self.audioController)
    {
        self.playbackSession.audioDeliveryDelegate = self.audioController2;
        self.audioController2.delegate = self;
        self.audioController.delegate = nil;
    }
    else
    {
        self.playbackSession.audioDeliveryDelegate = self.audioController;
        self.audioController.delegate = self;
        self.audioController2.delegate = nil;
    }

    if (aTrack.availability != SP_TRACK_AVAILABILITY_AVAILABLE) {
        if (block) block([NSError spotifyErrorWithCode:SP_ERROR_TRACK_NOT_PLAYABLE]);
        self.currentTrack = nil;
    }

    self.currentTrack = aTrack;
    self.trackPosition = 0.0;

    [self.playbackSession playTrack:self.currentTrack callback:^(NSError *error) {
        if (!error)
            self.playbackSession.playing = YES;
        else
            self.currentTrack = nil;
        if (block) {
            block(error);
        }
    }];
}
This starts a timer for the crossfade:
crossfadeTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                  target:self
                                                selector:@selector(crossfadeCountdown)
                                                userInfo:nil
                                                 repeats:YES];
And in order to keep the first track playing after its data has loaded, in SPCoreAudioController.m I changed the target buffer length:
static NSTimeInterval const kTargetBufferLength = 20;
and in SPSession.m : end_of_track(sp_session *session) {
I removed
// sess.playing = NO;
I call preloadTrackForPlayback: about 15 seconds before end of track, then crossfadeTrack: at 10 seconds before.
Then set crossfadeCountdownTime = [how many seconds you want the crossfade]*2;
I fade the volume over the crossfade with:
- (void)crossfadeCountdown
{
    [UIAppDelegate.playbackSPManager setVolume:(1 - (((float)crossfadeCountdownTime / (thisCrossfadeSeconds * 2.0)) * 0.2))];
    crossfadeCountdownTime -= 0.5;
    if (crossfadeCountdownTime == 1.0)
    {
        NSLog(@"Crossfade countdown done");
        crossfadeCountdownTime = 0;
        [crossfadeTimer invalidate];
        crossfadeTimer = nil;
        [UIAppDelegate.playbackSPManager setVolume:1.0];
    }
}
I'll keep working on it, and update if I can make it better. Thanks again to iKenndac for his always spot-on help!
There isn't a pre-written crossfade example that I'm aware of that uses CocoaLibSpotify. However, a (perhaps not ideal) game plan would be:
Make two separate audio queues. SPCoreAudioController is an encapsulation of an audio queue, so you should just be able to instantiate two of them.
Play music as normal to one queue. When you're approaching the end of the track, call SPSession's preloadTrackForPlayback:callback: method with the next track to get it ready.
When all audio data for the playing track has been delivered, SPSession will fire the delegate method sessionDidEndPlayback:. However, since CocoaLibSpotify buffers the audio from libspotify, there is still some time before the audio actually stops.
At this point, start playing the new track but divert the audio data to the second audio queue. Start ramping down the volume of the first queue while ramping up the volume of the next one. This should give a pleasing crossfade.
A few pointers:
In SPCoreAudioController.m, you'll find the following line, which defines how much audio CocoaLibSpotify buffers, in seconds. If you want a bigger crossfade, you'll need to increase it.
static NSTimeInterval const kTargetBufferLength = 0.5;
Since you get audio data at a maximum of 1.5x actual playback speed, be careful not to do, for example, a 5 second crossfade when the user has just skipped near to the end of the track. You might not have enough audio data available to pull it off.
Take a good look at SPPlaybackManager.m. This class is the interface between CocoaLibSpotify and Core Audio. It's not too complicated, and understanding it will get you a long way. SPCoreAudioController and SPCircularBuffer are pretty much implementation details of getting the audio into Core Audio, and you shouldn't need to understand their implementations to achieve what you want.
Also, make sure you understand the various delegates SPSession has. The audio delivery delegate only has one job - to receive audio data. The playback delegate gets all other playback events - when audio has finished being delivered to the audio delivery delegate, etc. There's nothing stopping one class being both, but in the current implementation, SPPlaybackManager is the playback delegate, which creates an instance of SPCoreAudioController to be the audio delivery delegate. If you modify SPPlaybackManager to have two Core Audio controllers and alternate which one is the audio delivery delegate, you should be golden.
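As a back-of-envelope check of the read-ahead constraint mentioned above: if data arrives at roughly 1.5× playback speed while being consumed at 1×, the buffer grows at about 0.5 seconds of audio per second of playback, so filling a 20-second buffer (the kTargetBufferLength used in the update above) takes on the order of 40 seconds of uninterrupted playback before a long crossfade is safe.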
