I am working on an iOS app that will be compatible with iOS 6/7 and stream .mp3 audio files from a website.
I have already gotten this to work using the following code:
- (NSString *)documentsFolder
{
    NSString *dataPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath])
        [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:NULL];
    return dataPath;
}

- (NSString *)createURLFile:(NSString *)songURL
{
    NSString *M3U_FILE = @"song.m3u";
    NSString *path = [NSString stringWithFormat:@"%@", [[self documentsFolder] stringByAppendingPathComponent:M3U_FILE]];
    if ([[NSFileManager defaultManager] createFileAtPath:path contents:nil attributes:nil])
    {
        NSFileHandle *outFile = [NSFileHandle fileHandleForWritingAtPath:path];
        if (outFile != nil)
        {
            NSData *buffer = [songURL dataUsingEncoding:NSUTF8StringEncoding];
            [outFile writeData:buffer];
            return path;
        }
    }
    return nil;
}
- (void)createStreamer
{
    // Remove any previous references.
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    // Create a new player.
    NSString *fileURL = [self createURLFile:self.aSong.songpath];
    self.songPlayer = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:fileURL]];
    NSAssert(self.songPlayer != nil, @"NIL AVPlayer Created!!!");
    // Observer for when the song ends...
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:[self.songPlayer currentItem]];
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];
}
I store the URL for the .mp3 file in a local .m3u file and use that file to load up the AVPlayer. In earlier versions of iOS, I was told the AVPlayer would download the whole song before playing it rather than streaming it immediately. While that no longer appears to be true in iOS 6/7 (the song starts streaming almost immediately), the .m3u file is still created in case skipping it would cause problems.
With this in place, a loop monitors the status of the AVPlayer, and after a few seconds the audio starts to play out of the phone without a problem.
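As an aside, instead of a polling loop the player's readiness could also be watched with key-value observing. This is only a minimal sketch (not my actual monitoring code), assuming the self.songPlayer property from the code above:
// Hedged sketch: observe the AVPlayer's status via KVO instead of polling.
// Register right after creating the player, e.g. at the end of -createStreamer:
//   [self.songPlayer addObserver:self forKeyPath:@"status"
//                        options:NSKeyValueObservingOptionNew context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if (object == self.songPlayer && [keyPath isEqualToString:@"status"]) {
        if (self.songPlayer.status == AVPlayerStatusReadyToPlay) {
            [self.songPlayer play];   // the stream is ready; start playback
        } else if (self.songPlayer.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer failed: %@", self.songPlayer.error);
        }
    }
}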
For testing purposes, I set up an MPVolumeView on the page which plays songs:
MPVolumeView *volumeView = [[[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 310, 20)] autorelease];
volumeView.center = CGPointMake(160,62);
[volumeView sizeToFit];
[self.view addSubview:volumeView];
The reason for this is that the volume slider will also show an indicator when Bluetooth is connected as an audio output source and will let me change the audio route between the phone and the Bluetooth device. So far, so good.
I connected my phone to my Jawbone Jambox via Bluetooth, started the AVPlayer on a song, and the song came out of the Jambox as expected. The volume control shows the small "rectangle with arrow" indicating that I can switch the audio output, and indeed, while the song is playing, I can switch between the phone and the Jambox. Happiness.
The problem occurs when I try to connect to a car. I have two experiences with this:
The car is already paired with the phone for making/receiving calls. The phone even indicates it is paired when I get into the car. But when I use the same code to play the same audio files, they only come out of the phone. The volume slider does not show the "bluetooth route" indicator at all (as if it does not recognize the car as an audio output route).
In another car, audio was streaming from another app (some radio streaming app). That app was stopped and mine was started. The same song tested above started playing but stopped after a second or two. Again, there was no indicator on the volume slider that Bluetooth was connected at this point.
Can somebody explain why the audio streams fine to one Bluetooth device but not to another?
Have I missed something (an entitlement?) in my app's profile that would allow it to stream audio via Bluetooth to a car?
There is this project on GitHub.
The Play iOS project is a streaming client for Play that runs on your iPhone/iPad. It supports background audio as well as the media keys when backgrounded.
It supports:
Playing SHOUTcast streams
Displaying the currently playing track
Background audio
Lock screen album art & play controls
AirPlay (along with Bluetooth) streaming, including sending metadata and album art
You can download the project here.
I have not tested this with a car's Bluetooth audio system, though. I hope it is of some help to you.
In the first example, your car may simply be acting as a remote player. You would need to register for remote-control events like this (also consider using an AVAudioPlayer instead of an AVPlayer).
Set up the audio session so that it recognizes a Bluetooth audio route:
- (BOOL)prepareAudioSession {
    // deactivate the existing session
    NSError *setCategoryError = nil;
    NSError *activationError = nil;
    BOOL success = [[AVAudioSession sharedInstance] setActive:NO error:nil];
    if (!success) {
        NSLog(@"deactivationError");
    }
    // set the category to AVAudioSessionCategoryPlayAndRecord with the
    // AVAudioSessionCategoryOptionAllowBluetooth option (this option is only
    // honoured for the PlayAndRecord / Record categories)
    success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                                                     error:&setCategoryError];
    if (!success) {
        NSLog(@"setCategoryError %@", setCategoryError);
    }
    // activate the audio session
    success = [[AVAudioSession sharedInstance] setActive:YES error:&activationError];
    if (!success) {
        NSLog(@"activationError");
    }
    return success;
}
You can check the routes:
AVAudioSessionRouteDescription *mAVASRD = [AVAudioSession sharedInstance].currentRoute;
NSLog(@"the array is %@", mAVASRD.outputs);
for (int ctr = 0; ctr < [mAVASRD.outputs count]; ctr++)
{
    AVAudioSessionPortDescription *myPortDescription = [mAVASRD.outputs objectAtIndex:ctr];
    NSLog(@"the type is %@", myPortDescription.portType);
    NSLog(@"the name is %@", myPortDescription.portName);
    NSLog(@"the UID is %@", myPortDescription.UID);
    NSLog(@"the data sources are %@", myPortDescription.dataSources);
}
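As an illustration (not part of the original answer), the same route information can be used to check whether a Bluetooth output is currently active; the port-type constants below are standard AVFoundation ones:
// Hedged sketch: returns YES if any current output is a Bluetooth port.
- (BOOL)isBluetoothOutputActive
{
    AVAudioSessionRouteDescription *route = [AVAudioSession sharedInstance].currentRoute;
    for (AVAudioSessionPortDescription *output in route.outputs) {
        if ([output.portType isEqualToString:AVAudioSessionPortBluetoothA2DP] ||
            [output.portType isEqualToString:AVAudioSessionPortBluetoothHFP] ||
            [output.portType isEqualToString:AVAudioSessionPortBluetoothLE]) {
            return YES;
        }
    }
    return NO;
}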
Then initialize your AVAudioPlayer and turn on remote-control events (so you can use the console in your car to send play/pause/etc.):
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
Then implement something like the delegate method for AVAudioPlayer in this Stack Overflow question to capture the received events and react accordingly in your code:
AVAudioPlayer on Lock Screen
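For illustration only (this is an assumption about the linked answer, not a quote from it), a handler for those remote-control events usually looks something like this; the view controller also has to be able to become first responder:
// Hedged sketch: react to remote-control events from the car console / lock screen.
- (BOOL)canBecomeFirstResponder
{
    return YES;   // required for a view controller to receive remote-control events
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl) return;

    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            [self.songPlayer play];   // songPlayer is the AVPlayer from the question
            break;
        case UIEventSubtypeRemoteControlPause:
        case UIEventSubtypeRemoteControlStop:
            [self.songPlayer pause];
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if (self.songPlayer.rate > 0.0) {
                [self.songPlayer pause];
            } else {
                [self.songPlayer play];
            }
            break;
        default:
            break;
    }
}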
In scenario 2, when you moved the other app (the radio streaming app) to the background and started yours, the likely culprit is the same: your app has to recognize the Bluetooth route for audio.
By the way, for phone calls and Siri, iOS uses a different Bluetooth channel than the default one used for remote control (which is the one I am describing for your car).
When you set up this route and remote-control events, you also get a bonus byproduct: your app becomes controllable from the lock screen. Check out this technical note from Apple to configure your app to play in the background as well, if that is something you also need when the screen locks: Technical Q&A QA1668
Finally, for added integration via your Bluetooth route, look at MPNowPlayingInfoCenter: put the title, artist, artwork, and other good stuff on the lock screen and on most in-car Bluetooth screens that display that information.
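A minimal sketch of that last point (the title, artist, and artwork values are placeholders):
// Hedged sketch: publish now-playing metadata for the lock screen / car display.
#import <MediaPlayer/MediaPlayer.h>

UIImage *cover = [UIImage imageNamed:@"cover.png"];   // placeholder artwork
MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:cover];

[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{
    MPMediaItemPropertyTitle   : @"Song Title",       // placeholder
    MPMediaItemPropertyArtist  : @"Artist Name",      // placeholder
    MPMediaItemPropertyArtwork : artwork
};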
I'm pretty sure MPVolumeView can only address Bluetooth devices that conform to the newer low-power Bluetooth spec (Bluetooth Low Energy, or BLE).
I know the Phone app doesn't use MPVolumeView, and this other audio player probably doesn't either. You may need to look into CoreBluetooth and implement your own. Good luck; there may be a solution on GitHub.
A Bluetooth speaker designed as a speaker will be no problem.
However, a car will usually present itself as a "phone" Bluetooth speaker and will only accept a "phone" type of communication.
My guess is that you would have to trick it by setting up a "phone audio" connection, sending the incoming audio into the void, and streaming the outgoing music as a phone signal.
Mind you, signal quality might degrade, and there probably won't be a fix for that.
Related
I have created a camera using AVCaptureSession. I have configured it for both photo and video recording modes.
The camera and the app run fine. I also allow background music to keep playing (if the user is playing a song in the Music app) while the camera is open or a video is being recorded. That works fine as well. (Attached image 2)
I allow background music playback with the help of this code:
AVAudioSession *session1 = [AVAudioSession sharedInstance];
[session1 setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionMixWithOthers|AVAudioSessionCategoryOptionDefaultToSpeaker|AVAudioSessionCategoryOptionAllowBluetooth error:nil];
[session1 setActive:YES error:nil];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
Now if I receive a call, minimize the phone-call screen by tapping the Home button, open the app, and go to the camera screen to capture an image or record a video, the screen opens but freezes on a single image (attached image 1).
My requirement is to capture images and record video while on a phone call. I looked at other apps: Snapchat does exactly this, and with it I am able to record video while on a call.
Please help me; how can I achieve this?
You need to use the AVCaptureSessionWasInterruptedNotification and AVCaptureSessionInterruptionEndedNotification callbacks and disconnect the audio capture while the session is interrupted:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(sessionWasInterrupted:) name:AVCaptureSessionWasInterruptedNotification object:self.session];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(sessionInterruptionEnded:) name:AVCaptureSessionInterruptionEndedNotification object:self.session];
// note that self.session is an AVCaptureSession
- (void)sessionWasInterrupted:(NSNotification *)notification {
    NSLog(@"session was interrupted");
    AVCaptureDevice *device = [[self audioInput] device];
    if ([device hasMediaType:AVMediaTypeAudio]) {
        [[self session] removeInput:[self audioInput]];
        [self setAudioInput:nil];
    }
}

- (void)sessionInterruptionEnded:(NSNotification *)notification {
    NSLog(@"session interruption ended");
}
// note that [self audioInput] is a getter for an AVCaptureDeviceInput
This will allow the camera to continue running and lets it capture stills / silent video.
Now, as for how to reconnect the audio after the call ends... let me know if you figure it out; it seemed impossible as of iOS 10: Callback when phone call ends? (to resume AVCaptureSession)
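For completeness, a hedged sketch of what re-adding the audio input in sessionInterruptionEnded: could look like, assuming the same self.session and setAudioInput: properties as above (in practice, as noted, the end-of-interruption notification does not arrive reliably after an answered call):
// Hedged sketch: try to restore the audio capture input once the interruption ends.
- (void)sessionInterruptionEnded:(NSNotification *)notification
{
    NSLog(@"session interruption ended");
    if ([self audioInput] == nil) {
        AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (input && [[self session] canAddInput:input]) {
            [[self session] addInput:input];
            [self setAudioInput:input];
        } else {
            NSLog(@"Could not restore audio input: %@", error);
        }
    }
}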
I have added an observer for the interruption notification when recording audio.
This works fine when performing an outgoing call, getting an incoming call and not answering, Siri, etc.
My app then keeps running in the background with the red bar at the top of the screen, and continuing the recording in the states described above is not a problem.
But when I actually answer an incoming call, I get another AVAudioSessionInterruptionTypeBegan notification, and then when I end the call, I never get a notification of type AVAudioSessionInterruptionTypeEnded.
I have tried using CTCallCenter to detect call state changes, but I am unable to restart the recording from that callback.
Does anyone know how to get the interruption mechanism to work with an incoming call that is actually answered?
This is (part of) the code I am using:
CFNotificationCenterAddObserver(
    CFNotificationCenterGetLocalCenter(),
    this,
    &handle_interrupt,
    (__bridge CFStringRef)AVAudioSessionInterruptionNotification,
    NULL,
    CFNotificationSuspensionBehaviorDeliverImmediately);
...
static void handle_interrupt(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo)
{
    au_recorder *recorder = static_cast<au_recorder *>(observer);
    NSNumber *interruptionType = [((__bridge NSDictionary *)userInfo) objectForKey:AVAudioSessionInterruptionTypeKey];
    switch ([interruptionType unsignedIntegerValue])
    {
        case AVAudioSessionInterruptionTypeBegan:
        {
            // pause the recorder without stopping the recording (the OS has already stopped it)
            recorder->pause();
            break;
        }
        case AVAudioSessionInterruptionTypeEnded:
        {
            NSNumber *interruptionOption = [((__bridge NSDictionary *)userInfo) objectForKey:AVAudioSessionInterruptionOptionKey];
            if (interruptionOption.unsignedIntegerValue == AVAudioSessionInterruptionOptionShouldResume)
            {
                recorder->resume();
            }
            break;
        }
    }
}
I have tried binding the notification to the AppDelegate, a UIViewController, and a separate class, but that doesn't appear to help.
Edit
This is what I tried using CTCallCenter, but it is very flaky. When recorder->resume() is called from the callback, it either works, crashes violently, or does nothing at all until the app is manually brought back to the foreground.
callCenter = [[CTCallCenter alloc] init];
callCenter.callEventHandler = ^(CTCall *call)
{
    if ([call.callState isEqualToString:CTCallStateDisconnected])
    {
        recorder->resume();
    }
};
UPDATE
If you hackily wait for a second or so, you can restart the recording from your callEventHandler (you haven't described your violent crashes, but it works fine for me):
if ([call.callState isEqualToString:CTCallStateDisconnected])
{
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
        self.recorder->resume();
    });
}
This is all without a background task or changing to an exclusive audio session. The delay works around the fact that the call-ended notification comes in before Phone.app deactivates its audio session, and 1 second seems to be small enough to fall into some kind of background descheduling grace period. I've lost count of how many implementation details this assumes, so maybe you'd like to read on to the
Solution that seems to be backed by documentation:
N.B. While there's a link to "suggestive" documentation for this solution, it was mostly guided by these two errors popping up and by their accompanying comments in the header files:
AVAudioSessionErrorCodeCannotInterruptOthers
The app's audio session is non-mixable and is trying to go active while in the background. This is allowed only when the app is the Now Playing app.
AVAudioSessionErrorInsufficientPriority
The app was not allowed to set the audio category because another app (Phone, etc.) is controlling it.
You haven't shown how your AVAudioSession is configured, but the fact that you're not getting an AVAudioSessionInterruptionTypeEnded notification suggests you're not using an exclusive audio session (i.e. you're setting "mix with others"):
[session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&error];
The obsolete CTCallCenter and the newer CXCallObserver callbacks both seem to fire too early, before the interruption ends; maybe with some further work you could find a way to run a background task that restarts the recording a little while after the call ends (my initial attempts at this failed).
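For reference, registering a CXCallObserver (CallKit, iOS 10+) looks roughly like the sketch below; the class name CallWatcher is made up for the example, and as noted above the callback fires before the phone's audio session is fully released, so by itself it doesn't solve the resume problem:
// Hedged sketch: observe call state changes with CallKit.
#import <CallKit/CallKit.h>

@interface CallWatcher : NSObject <CXCallObserverDelegate>
@property (nonatomic, strong) CXCallObserver *callObserver;
@end

@implementation CallWatcher
- (instancetype)init
{
    if ((self = [super init])) {
        _callObserver = [[CXCallObserver alloc] init];
        [_callObserver setDelegate:self queue:nil];   // nil queue = deliver on the main queue
    }
    return self;
}

- (void)callObserver:(CXCallObserver *)callObserver callChanged:(CXCall *)call
{
    if (call.hasEnded) {
        NSLog(@"Call ended; the recorder could try to resume here (possibly after a delay)");
    }
}
@end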
So right now, the only way I know to restart the recording after receiving a phone call is:
Use an exclusive audio session (this gives you an end interruption):
if (!([session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]
      && [session setActive:YES error:&error])) {
    NSLog(@"Audio session error: %@", error);
}
and integrate with "Now Playing" (not joking, this is a structural part of the solution):
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.playCommand addTargetWithHandler:^(MPRemoteCommandEvent *event) {
    NSLog(@"PLAY");
    return MPRemoteCommandHandlerStatusSuccess;
}];
[commandCenter.pauseCommand addTargetWithHandler:^(MPRemoteCommandEvent *event) {
    NSLog(@"PAUSE");
    return MPRemoteCommandHandlerStatusSuccess;
}];

// isn't this mutually exclusive with mixWithOthers? or maybe that's remote control keys
// not seeing anything. maybe it needs actual playback?
NSDictionary *nowPlaying = @{
    MPMediaItemPropertyTitle: @"foo",
    MPMediaItemPropertyArtist: @"Me",
    MPMediaItemPropertyAlbumTitle: @"brown album",
};
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = nowPlaying;
Pieces of this solution were cherry picked from the Audio Guidelines for User-Controlled Playback and Recording Apps documentation.
P.S. Maybe the lock-screen integration or giving up the non-exclusive audio session is a deal breaker for you, but there are other situations where you'll never get an end-interruption notification, e.g. launching another exclusive app like Music (although that may not be an issue with mixWithOthers), so maybe you should go with the code-smell delay solution.
I am having issues playing a video through a URL using AVPlayer and AVPlayerViewController.
It plays fine in the simulator but doesn't work on the device.
Is it a problem with the static IP I have been using, as follows:
https://ipaddress(111.111.11.11:443)/mc/images/wall/small.mp4
The same video from the following URL plays fine on the device:
http://techslides.com/demos/sample-videos/small.mp4
This is the code I am using:
NSURL *url = [NSURL URLWithString:@"http://techslides.com/demos/sample-videos/small.mp4"]; // https://180.179.77.47:443/mc/images/wall/small.mp4
AVAsset *videoAsset = [AVAsset assetWithURL:url];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:videoAsset];
playerVideo = [AVPlayer playerWithURL:url];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionModeMoviePlayback error:nil];
movieController = [[AVPlayerViewController alloc] init];
[movieController setAllowsPictureInPicturePlayback:YES];
movieController.player = playerVideo;
movieController.delegate = self;
movieController.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self presentViewController:movieController animated:YES completion:nil];
I was also facing the same issue a short while ago, but just now noticed that my device was in silent mode and hence it was playing the video without sound.
But that should not happen. If I run the YouTube app and play a video, it works fine even with the device in silent mode.
I found the answer. Following is the Swift 3 code. If you set the category to AVAudioSessionCategoryPlayback, the app plays the video with sound even when the device is in silent mode.
If you set the category to AVAudioSessionCategoryAmbient and run the app on a device with silent mode on, the app plays the video without sound.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayback, with: .duckOthers)
} catch {
    print("AVAudioSession cannot be set")
}
I think your video and your device are not on the same network. Use a common network for both, or put the video at a remote location that everyone can access.
You can play the video in the simulator because your Mac is connected to the network on which the video is available, but I think your phone is on a different (mobile) network.
Hope this helps :)
I've created my own custom controls for use with MPMoviePlayerController. So far everything works except the mute button.
I've configured the AVAudioSession using the following code before I create my instance of MPMoviePlayerController:
NSError *modeError = nil;
[self.audioSession setMode:AVAudioSessionModeMoviePlayback error:&modeError];
if (modeError != nil) {
    NSLog(@"Error setting mode for AVAudioSession: %@", modeError);
}

NSError *categoryError = nil;
[self.audioSession setCategory:AVAudioSessionCategoryPlayback error:&categoryError];
if (categoryError != nil) {
    NSLog(@"Error setting category for AVAudioSession: %@", categoryError);
}
Then in my mute button callback method I have the following code:
NSError *activeError = nil;
[self.audioSession setActive:NO error:&activeError];
if (activeError != nil) {
    NSLog(@"Error setting inactive state for AVAudioSession: %@", activeError);
}
When clicking the mute button I get the following unhelpful error:
Error Domain=NSOSStatusErrorDomain Code=560030580 "The operation couldn’t be completed. (OSStatus error 560030580.)"
I am linking against the AVFoundation framework.
This is really starting to bug me, as I can't for the life of me work out a way to reduce or mute the playback audio of my application.
I don't want to change the global system volume, just the application-level volume as defined by the AVAudioSession AVAudioSessionCategoryPlayback category.
It seems you can set the volume of an AVAudioPlayer but not of MPMoviePlayerController. I've seen other posts here on SO that say to just create an instance of AVAudioPlayer and set its volume, but that just crashes my app; I expect it has something to do with the fact that I'm not using initWithContentsOfURL:error: or initWithData:error: and am instead using init.
Any help would be appreciated.
After speaking to an Apple technician, it turns out that it's not possible to control or mute the audio using MPMoviePlayerController.
Instead, you have to create your own controller using AVFoundation's AVPlayer class.
Once you're using that, it's a matter of creating a custom audio mix and setting the volume level. It actually works very well.
Sample code:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[self localMovieURL] options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

// Mute all the audio tracks
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in audioTracks) {
    AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
    [audioInputParams setVolume:0.0 atTime:kCMTimeZero];
    [audioInputParams setTrackID:[track trackID]];
    [allAudioParams addObject:audioInputParams];
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];

// Create a player item
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[playerItem setAudioMix:audioZeroMix]; // Mute the player item

// Create a new player, and set the player to use the player item
// with the muted audio mix
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
self.mPlayer = player;
[self.mPlayer play];
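The same audio-mix approach can be used for an arbitrary application-level volume rather than a full mute; here is a sketch building on the code above, with 0.5 as a placeholder level:
// Hedged sketch: apply a partial volume instead of muting.
for (AVMutableAudioMixInputParameters *params in allAudioParams) {
    [params setVolume:0.5 atTime:kCMTimeZero];   // 50% volume on every audio track
}
AVMutableAudioMix *halfVolumeMix = [AVMutableAudioMix audioMix];
[halfVolumeMix setInputParameters:allAudioParams];
[playerItem setAudioMix:halfVolumeMix];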
I've written an MPMoviePlayerController replacement class that adds support for a volume level. I will upload it to GitHub shortly and add the link to this post.
I know this is an old post, but I managed to find a way to successfully control the volume of MPMoviePlayerController in iOS 6 & iOS 7 using an MPVolumeView. One gotcha is that it does NOT work in the simulator, only on a physical device. For just controlling the volume, adding a hidden MPVolumeView works fine. However, if you use a hidden one, the native OS volume display that appears when you change the volume with the physical volume buttons will still appear centre-screen. If you want to prevent this, make sure your MPVolumeView is not hidden; instead, give it a very low alpha and place it behind other views so the user can't see it.
Here's the code I've used:
MPVolumeView *volumeView = [[MPVolumeView alloc] initWithFrame:CGRectZero];
[volumeView setShowsVolumeSlider:YES];
[volumeView setShowsRouteButton:NO];
// the control must be VISIBLE if you want to prevent the default OS volume display
// from appearing when you change the volume level
[volumeView setHidden:NO];
volumeView.alpha = 0.1f;
volumeView.userInteractionEnabled = NO;
// to hide it from view, just insert it behind all other views
[self.view insertSubview:volumeView atIndex:0];
This allows you to control the volume by calling:
[[MPMusicPlayerController applicationMusicPlayer] setVolume:0.0];
But I was still getting the native OS volume display the first time I tried to change the volume; on subsequent loads it did not appear. Figuring it was something to do with the stage of the view controller life cycle, I moved the call from my viewDidLoad method to viewDidAppear. That worked: the volume was muted and the native volume display did not appear, but I could now hear a split second of audio before the video started playing. So I hooked into the playback-state-did-change notification of MPMoviePlayerController. In viewDidLoad I added:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(videoPlaybackStateDidChange:)
                                             name:MPMoviePlayerPlaybackStateDidChangeNotification
                                           object:nil];
And the delegate callback method:
- (void)videoPlaybackStateDidChange:(NSNotification *)aNotification
{
    // Note, this doesn't work in the simulator (even in iOS 7), only on an actual device!
    if ([moviePlayerController playbackState] == MPMoviePlaybackStatePlaying)
    {
        [[MPMusicPlayerController applicationMusicPlayer] setVolume:0.0];
    }
}
This muted the video's audio before it started playing, but after viewDidLoad in the life cycle, so the native OS volume display never appeared.
In my app, I retrieved and stored the current volume level before muting (using the [MPMusicPlayerController applicationMusicPlayer].volume property), then restored the volume to that level when the view controller was closed, so the user would be unaware that their device volume had been modified and reverted.
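A minimal sketch of that save/restore pattern (the previousVolume property name is hypothetical; the actual muting happens in videoPlaybackStateDidChange: as described above):
// Hedged sketch: remember the user's volume level, then restore it on exit.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    self.previousVolume = [MPMusicPlayerController applicationMusicPlayer].volume;
}

- (void)viewWillDisappear:(BOOL)animated
{
    [[MPMusicPlayerController applicationMusicPlayer] setVolume:self.previousVolume];
    [super viewWillDisappear:animated];
}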
Also, if your MPMoviePlayerController is using a non-standard audio route in iOS 7, calling [[MPMusicPlayerController applicationMusicPlayer] setVolume:0.0] may not work for you. In that case you can loop through the subviews of your MPVolumeView until you find a view that subclasses UISlider, and then call [sliderView setValue:0 animated:NO], which should work. This method isn't using any private Apple APIs, so it shouldn't get your app rejected; after all, there are plenty of legitimate reasons to offer this functionality, and it was possible in iOS 6 without going to these lengths. In fact, I was bamboozled to discover that Apple had removed the ability to set the volume on MPMoviePlayerController in iOS 7 in the first place... enforced migration to AVPlayer?
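That subview loop could look roughly like this (a sketch; the slider's exact position in the hierarchy is an implementation detail of MPVolumeView):
// Hedged sketch: find MPVolumeView's internal slider and drag it to zero.
UISlider *volumeSlider = nil;
for (UIView *subview in volumeView.subviews) {
    if ([subview isKindOfClass:[UISlider class]]) {
        volumeSlider = (UISlider *)subview;
        break;
    }
}
[volumeSlider setValue:0.0f animated:NO];
[volumeSlider sendActionsForControlEvents:UIControlEventValueChanged];   // often needed for the change to take effect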
Update: My iPad app has now been approved using this method and is live on the app store.
I have a piano application. It's working fine, with one small problem: if I play several keys at the same time very quickly, the sound disappears for a couple of seconds and I receive the following message in the console:
AudioQueueStart posting message to kill mediaserverd
Here is the relevant code:
- (IBAction)playNoteFromKeyTouch:(id)sender
{
    [NSThread detachNewThreadSelector:@selector(playNote:) toTarget:self withObject:[NSString stringWithFormat:@"Piano.mf.%@", [sender currentTitle]]];
}

- (void)playNote:(NSString *)note
{
    NSError *err;
    NSString *path = [[NSBundle mainBundle] pathForResource:note ofType:@"aiff"];
    AVAudioPlayer *p = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&err];
    p.delegate = self;
    if (err) {
        NSLog(@"%@", err);
    } else {
        [p prepareToPlay];
        [p play];
    }
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    [player release];
}
I have tested with Instruments and I don't have any memory leaks. If somebody has an idea of how to avoid this error, it would be appreciated.
I suffer from a similar issue.
I spent ages trying to solve it, and I think my particular issue occurs when:
I'm in an audio category that doesn't allow sound to play while the mute switch is on.
I start to play a sound (I don't actually do this in the app, but this is how I can reproduce it reliably).
With the sound still playing, I switch to another audio category that doesn't allow sound to play while the mute switch is on.
From that point onwards, I get 'posting message to kill mediaserverd' from what looks like various calls into the AudioSession API. The app hangs, the device hangs, and I struggle to get the device back to a normal running state.
According to this message, it's the device's mute switch:
It turns out that having the iPad muted via the device's physical switch was causing the problem with my app. As long as the switch is not on, the problem does not occur.
Sheesh. How do you programmatically override that?
I "solved" the issue by using SoundBankPlayer instead of AVAudioPlayer. SoundBankPlayer info.