How to programmatically sense the iPhone mute switch? - ios

I can't seem to find in the SDK how to programmatically sense the mute button/switch on the iPhone. When my app plays background music, it responds properly to the volume buttons without any code on my part, but when I use the mute switch, it just keeps playing away.
How do I test the position of the mute switch?
(NOTE: My program has its own mute switch, but I'd like the physical switch to override that.)

Thanks, JPM. Indeed, the link you provided leads to the correct answer (eventually ;-). For completeness (because S.O. should be a source of QUICK answers!):
// "Ambient" makes it respect the mute switch
// Must call this once to init session
if (!gAudioSessionInited)
{
AudioSessionInterruptionListener inInterruptionListener = NULL;
OSStatus error;
if ((error = AudioSessionInitialize (NULL, NULL, inInterruptionListener, NULL)))
{
NSLog(#"*** Error *** error in AudioSessionInitialize: %d.", error);
}
else
{
gAudioSessionInited = YES;
}
}
SInt32 ambient = kAudioSessionCategory_AmbientSound;
if (AudioSessionSetProperty (kAudioSessionProperty_AudioCategory, sizeof (ambient), &ambient))
{
NSLog(#"*** Error *** could not set Session property to ambient.");
}
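For newer SDKs, the same idea can be expressed with AVAudioSession instead of the deprecated C AudioSession calls above. A minimal sketch (not from the original answer):

// The Ambient category is what makes playback respect the ring/silent switch.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryAmbient error:&error]) {
    NSLog(@"Could not set Ambient category: %@", error);
}
if (![session setActive:YES error:&error]) {
    NSLog(@"Could not activate audio session: %@", error);
}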

I answered a similar question here (link). The relevant code:
- (BOOL)silenced {
#if TARGET_IPHONE_SIMULATOR
    // return NO in simulator. Code causes crashes for some reason.
    return NO;
#endif
    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    if (CFStringGetLength(state) > 0)
        return NO;
    else
        return YES;
}

Some of the code in other answers (including the accepted answer) may not work if you aren't in the ambient mode, where the mute switch is respected.
I wrote the routine below to switch to ambient, read the switch, and then return to the settings I need in my app.
- (BOOL)muteSwitchEnabled {
#if TARGET_IPHONE_SIMULATOR
    // return NO in simulator. Code causes crashes for some reason.
    return NO;
#endif
    // go back to Ambient to detect the switch
    AVAudioSession *sharedSession = [AVAudioSession sharedInstance];
    [sharedSession setCategory:AVAudioSessionCategoryAmbient error:nil];

    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    BOOL muteSwitch = (CFStringGetLength(state) <= 0);
    NSLog(@"Mute switch: %d", muteSwitch);

    // code below here is just restoring my own audio state, YMMV
    _hasMicrophone = [sharedSession inputIsAvailable];
    NSError *setCategoryError = nil;
    if (_hasMicrophone) {
        [sharedSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&setCategoryError];
        // By default PlayAndRecord plays out over the internal speaker. We want the external speaker, thanks.
        UInt32 ASRoute = kAudioSessionOverrideAudioRoute_Speaker;
        AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                                sizeof(ASRoute),
                                &ASRoute);
    } else {
        // Devices with no mic don't support PlayAndRecord, so fall back to plain Playback
        [sharedSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError];
    }
    if (setCategoryError)
        NSLog(@"Error setting audio category! %@", setCategoryError);
    return muteSwitch;
}

To find out the state of the mute switch and the volume control I wrote these two functions. These are ideal if you wish to warn the user before they try creating audio output.
- (NSString *)audioRoute
{
    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    OSStatus n = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    if (n)
    {
        // TODO: Throw an exception
        NSLog(@"AudioSessionGetProperty: %@", osString(n));
    }
    NSString *result = (NSString *)state;
    [result autorelease];
    return result;
}

- (Float32)audioVolume
{
    Float32 state;
    UInt32 propertySize = sizeof(Float32);
    OSStatus n = AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareOutputVolume, &propertySize, &state);
    if (n)
    {
        // TODO: Throw an exception
        NSLog(@"AudioSessionGetProperty: %@", osString(n));
    }
    return state;
}

- (BOOL)isDeviceMuted
{
    CFStringRef state;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &state);
    return (CFStringGetLength(state) > 0 ? NO : YES);
}

I followed the general theory here and got this to work:
http://inforceapps.wordpress.com/2009/07/08/detect-mute-switch-state-on-iphone/
Here is a recap: play a short silent sound and time how long it takes to play. If the mute switch is on, playback will finish much sooner than the sound's actual duration. I used a 500 ms sound, and if it played back in less than that time, the mute switch was on. I use Audio Services to play the silent sound (which always honors the mute switch). The article says that you can use AVAudioPlayer to play this sound; if you do, I assume you'll need to set your AVAudioSession's category so that it honors the mute switch, but I have not tried it.
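For illustration, here is a minimal Objective-C sketch of that timing approach. The bundled file name "silence_500ms.caf", the 0.25 s threshold, and the helper names are my own assumptions, not from the linked article:

#import <AudioToolbox/AudioToolbox.h>

static NSTimeInterval gSilentSoundStart;

// Completion callback: if the 500 ms silent file "finished" far sooner than
// its real duration, the ring/silent switch is presumably set to silent.
static void SilentSoundFinished(SystemSoundID soundID, void *clientData) {
    NSTimeInterval elapsed = [NSDate timeIntervalSinceReferenceDate] - gSilentSoundStart;
    BOOL muted = (elapsed < 0.25); // threshold is an assumption, tune it for your sound file
    AudioServicesRemoveSystemSoundCompletion(soundID);
    AudioServicesDisposeSystemSoundID(soundID);
    NSLog(@"Mute switch appears to be %@", muted ? @"ON" : @"OFF");
}

- (void)checkMuteSwitch {
    // System Sound Services always honors the ring/silent switch.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"silence_500ms" withExtension:@"caf"];
    SystemSoundID soundID = 0;
    if (url && AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &soundID) == kAudioServicesNoError) {
        AudioServicesAddSystemSoundCompletion(soundID, CFRunLoopGetMain(), kCFRunLoopDefaultMode,
                                              SilentSoundFinished, NULL);
        gSilentSoundStart = [NSDate timeIntervalSinceReferenceDate];
        AudioServicesPlaySystemSound(soundID);
    }
}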

Using Ambient mode for playing a video and PlayAndRecord mode for recording a video on the camera screen resolved the issue in our case.
The code in application:didFinishLaunchingWithOptions:
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&error];
[[AVAudioSession sharedInstance] setMode:AVAudioSessionModeVideoRecording error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
The code in viewWillAppear of cameraController, if your app uses the camera or recording:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
The code in viewWillDisappear of cameraController:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];
With these lines our application records and plays video, and the mute switch works perfectly under both iOS 8 and iOS 7!

For Swift
The framework below works perfectly on a real device:
https://github.com/akramhussein/Mute
Install it with CocoaPods or download it from GitHub:
pod 'Mute'
and use it as in the code below:
import UIKit
import Mute

class ViewController: UIViewController {

    @IBOutlet weak var label: UILabel! {
        didSet {
            self.label.text = ""
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        // Notify every 2 seconds
        Mute.shared.checkInterval = 2.0

        // Always notify on interval
        Mute.shared.alwaysNotify = true

        // Update label when notification received
        Mute.shared.notify = { m in
            self.label.text = m ? "Muted" : "Not Muted"
        }

        // Stop after 5 seconds
        DispatchQueue.main.asyncAfter(deadline: .now() + 5.0) {
            Mute.shared.isPaused = true
        }

        // Re-start after 10 seconds
        DispatchQueue.main.asyncAfter(deadline: .now() + 10.0) {
            Mute.shared.isPaused = false
        }
    }
}

Related

Audio recording using AVAudioEngine with setting output audio port

In order to make playback and recording available at the same time, we use these methods for setting the AVAudioSession category:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
By doing so, the output audio port switches from the line-out speaker to the built-in speaker. In the loop-recording window we need playback through the line-out speaker and audio recording from the microphone to work simultaneously. To play sound through the line-out speaker after setting the AVAudioSession category, we use a method for overriding the output audio port:
[[AVAudioSession sharedInstance]
overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
We try to arrange both recording and playback using AVAudioEngine.
Structure of AVAudioEngine connections:
// input->inputMixer->mainEqualizer ->tap
// 0 0 |0
// |
// |
// |0 0 0
// recordPlayerNode→recordMixer→meteringMixer→|
// 0 1 0 0 |
// |->mainMixer->out
// |
// volumePlayer→|
// 0 1
After executing overrideOutputAudioPort, the recording feature stops working on iPhone 6s and newer. We perform recording in this manner:
if (self.isHeadsetPluggedIn)
{
    volumePlayer.volume = 1;
}
else
{
    volumePlayer.volume = 0.000001;
}

[volumePlayer play];
[mainEqualizer installTapOnBus:0 bufferSize:0 format:tempAudioFile.processingFormat block:^(AVAudioPCMBuffer *buf, AVAudioTime *when)
{
    if (self.isRecord)
    {
        [volumePlayer scheduleBuffer:buf completionHandler:nil];
        recordedFrameCount += buf.frameLength;
        if (self.isLimitedRecord && recordedFrameCount >= [AVAudioSession sharedInstance].sampleRate * 90)
        {
            self.isRecord = false;
            [self.delegate showAlert:RecTimeLimit];
        }
        NSError *error;
        [tempAudioFile writeFromBuffer:buf error:&error];
        if (error)
        {
            NSLog(@"Alert while writing to file: %@", error.localizedDescription);
        }
        [self updateMetersForMicro];
    }
    else
    {
        [mainEqualizer removeTapOnBus:0];
        [self.delegate recordDidFinish];
        callbackBlock(recordUrl);
        [mainEngine stop];
    }
}];
During the investigation we discovered an interesting fact: if
volumePlayer.volume = 1;
when headphones are not connected, then the buffer coming from the microphone starts to fill and the sound keeps recording, but a very loud repetition of the sound can be heard from the speaker. Otherwise, the PCMBuffer is filled with zeros.
The question is: how can we set up AVAudioSession, or the recording process, so that we can record audio using the microphone and play audio through the line-out speaker?
P.S. Recording with AVAudioRecorder works correctly with these settings.

AVAudioRecorder not recording in background after audio session interruption ended

I am recording audio in my app, both in the foreground and in the background. I also handle AVAudioSessionInterruptionNotification to stop recording when an interruption begins and start again when it ends. Although it works as expected in the foreground, when the app is recording in the background and I receive a call, it doesn't start recording again after the call ends. My code is the following:
- (void)p_handleAudioSessionInterruptionNotification:(NSNotification *)notification
{
    NSUInteger interruptionType = [[[notification userInfo] objectForKey:AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];

    if (interruptionType == AVAudioSessionInterruptionTypeBegan) {
        if (self.isRecording && !self.interruptedWhileRecording) {
            [self.recorder stop];
            self.interruptedWhileRecording = YES;
            return;
        }
    }

    if (interruptionType == AVAudioSessionInterruptionTypeEnded) {
        if (self.interruptedWhileRecording) {
            NSError *error = nil;
            [[AVAudioSession sharedInstance] setActive:YES error:&error];
            NSDictionary *settings = @{
                AVEncoderAudioQualityKey: @(AVAudioQualityMax),
                AVSampleRateKey: @8000,
                AVFormatIDKey: @(kAudioFormatLinearPCM),
                AVNumberOfChannelsKey: @1,
                AVLinearPCMBitDepthKey: @16,
                AVLinearPCMIsBigEndianKey: @NO,
                AVLinearPCMIsFloatKey: @NO
            };
            _recorder = [[AVAudioRecorder alloc] initWithURL:fileURL settings:settings error:nil];
            [self.recorder record];
            self.interruptedWhileRecording = NO;
            return;
        }
    }
}
Note that fileURL points to a new .caf file in an NSDocumentDirectory subdirectory. The audio background mode is configured. I also tried voip and playing silence, both to no avail.
The NSError in the AVAudioSessionInterruptionTypeEnded block is OSStatus error 560557684, which I haven't been able to track down.
Any help would be much appreciated.
Error 560557684 is for AVAudioSessionErrorCodeCannotInterruptOthers. This happens when your background app is trying to activate an audio session that doesn't mix with other audio sessions. Background apps cannot start audio sessions that don't mix with the foreground app's audio session because that would interrupt the audio of the app currently being used by the user.
To fix this, make sure to set your session category to one that is mixable, such as AVAudioSessionCategoryPlayback. Also be sure to set the category option AVAudioSessionCategoryOptionMixWithOthers (required) and AVAudioSessionCategoryOptionDuckOthers (optional). For example:
// background audio *must* mix with other sessions (or setActive will fail)
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:AVAudioSessionCategoryOptionMixWithOthers | AVAudioSessionCategoryOptionDuckOthers
                                       error:&sessionError];
if (sessionError) {
    NSLog(@"ERROR: setCategory %@", [sessionError localizedDescription]);
}
The error code 560557684 is actually the four ASCII characters '!int' packed into a 32-bit integer. The error codes are listed in the AVAudioSession.h header (see also AVAudioSession):
@enum AVAudioSession error codes
@abstract These are the error codes returned from the AVAudioSession API.
...
@constant AVAudioSessionErrorCodeCannotInterruptOthers
    The app's audio session is non-mixable and trying to go active while in the background.
    This is allowed only when the app is the NowPlaying app.
typedef NS_ENUM(NSInteger, AVAudioSessionErrorCode)
{
    ...
    AVAudioSessionErrorCodeCannotInterruptOthers = '!int', /* 0x21696E74, 560557684 */
    ...
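For reference, a small helper along these lines (the name FourCharCodeString is my own, not from the header) converts such an OSStatus back into its four-character code:

#import <Foundation/Foundation.h>
#import <CoreFoundation/CoreFoundation.h>

// Turns an OSStatus such as 560557684 back into its four-character code, e.g. "!int".
static NSString *FourCharCodeString(OSStatus status) {
    uint32_t bigEndian = CFSwapInt32HostToBig((uint32_t)status);
    char code[5] = {0};
    memcpy(code, &bigEndian, 4);
    return [NSString stringWithFormat:@"%s", code];
}

// FourCharCodeString(560557684) => @"!int"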
I added the following
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
before configuring AVAudioSession, and it worked. I still don't know what bugs this may introduce.
I had this same error when trying to use AVSpeechSynthesizer().speak() while my app was in the background. @progmr's answer solved the problem for me, though I also had to call AVAudioSession.sharedInstance().setActive(true).
For completeness, here's my code in Swift 5.
In application(_:didFinishLaunchingWithOptions:):
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playback,
                                                    options: [.mixWithOthers, .duckOthers])
} catch let error as NSError {
    print("Error setting up AVAudioSession : \(error.localizedDescription)")
}
Then in my view controller:
do {
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    print("Error : \(error.localizedDescription)")
}

let speechUtterance = AVSpeechUtterance(string: "Hello, World")
let speechSynth = AVSpeechSynthesizer()
speechSynth.speak(speechUtterance)
Note: When setActive(true) is called, it reduces the volume of anything else playing at the time. To turn the volume back up afterwards, you need to call setActive(false) - for me the best time to do that was once I'd been notified that the speech had finished in the corresponding AVSpeechSynthesizerDelegate method.

Why can't I record from RemoteIOUnit after changing AudioSession category from SoloAmbient to PlayAndRecord?

My app has both audio play and record features, and I want to only set the audio session's category to PlayAndRecord when the user initiates recording, so the standard audio playback will be muted by the mute switch, etc.
I'm having a problem though, where my call to AudioUnitRender to record audio input is failing with errParam (-50) after I change the audio session category to PlayAndRecord. If I start my app using the PlayAndRecord category, then recording works correctly.
@implementation MyAudioSession

- (instancetype)init {
    if ((self = [super init])) {
        NSError *error = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategorySoloAmbient error:&error];
        [session setActive:YES error:&error];
    }
    return self;
}

- (void)enableRecording {
    void (^setCategory)(void) = ^{
        NSError *error;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    };
    // Do I need to set the category from the main thread?
    if ([NSThread isMainThread]) {
        setCategory();
    } else {
        dispatch_sync(dispatch_get_main_queue(), ^{
            setCategory();
        });
    }
}
@end
@interface MyRecorder : NSObject {
    AudioUnit ioUnit_;
    AudioBufferList *tmpRecordListPtr_;
}
@end

@implementation MyRecorder

- (instancetype)init {
    if ((self = [super init])) {
        // Sets up AUGraph with just a RemoteIOUnit node, recording enabled, callback, etc.
        // Set up audio buffers
        tmpRecordListPtr_ = malloc(sizeof(AudioBufferList) + 64 * sizeof(AudioBuffer));
    }
    return self;
}

- (OSStatus)doRecordCallback:(AudioUnitRenderActionFlags *)ioActionFlags
                   timeStamp:(AudioTimeStamp *)inTimeStamp
                   busNumber:(UInt32)busNumber
                   numFrames:(UInt32)numFrames
                   bufferOut:(AudioBufferList *)ioData {
    // Set up buffers... All this works fine if I initialize the audio session to
    // PlayAndRecord in -[MyAudioSession init]
    OSStatus status = AudioUnitRender(ioUnit_, ioActionFlags, inTimeStamp, busNumber,
                                      numFrames, tmpRecordListPtr_);
    // This fails with errParam, but only if I start my app in SoloAmbient and then
    // later change it to PlayAndRecord
    return status;
}
@end

OSStatus MyRecorderCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags,
                            AudioTimeStamp *inTimestamp, UInt32 inBusNumber,
                            UInt32 inNumberFrames, AudioBufferList *ioData) {
    MyRecorder *recorder = (MyRecorder *)inRefCon;
    return [recorder doRecordCallback:ioActionFlags
                            timeStamp:inTimestamp
                            busNumber:inBusNumber
                            numFrames:inNumberFrames
                            bufferOut:ioData];
}
I'm testing on an iPod touch (5th gen) running iOS 7.1.2.
Has anybody else encountered this issue? Any suggestions for fixes or more info I can post?
EDIT: Object lifecycle is similar to:
- (void)startRecording {
    [mySession enableRecording];
    [myRecorder release];
    myRecorder = [[MyRecorder alloc] init];
    [myRecorder start]; // starts the AUGraph
}
Without looking at your code it is difficult to comment. But I am doing something similar in my app, and I found that it is important to pay careful attention to what audio session settings can be changed only when the audio session is inactive.
// Get the app's audioSession singleton object
AVAudioSession *session = [AVAudioSession sharedInstance];

// error handling
NSError *audioSessionError = nil;
SDR_DEBUGPRINT(("Setting session not active!\n"));
[session setActive:NO error:&audioSessionError]; // shut down the audio session if it is active
It is important to setActive to "NO" prior to changing the session category, for instance. Failure to do so might allow render callbacks to occur while the session is being configured.
Looking at the lifecycle flow, I'm trying to see where you stop the AUGraph prior to setting up the audio session for recording. The code I use for stopping the AUGraph follows. I call it prior to any attempts to reconfigure the audio session.
- (void)stopAUGraph {
    if (mAudioGraph != nil) {
        Boolean isRunning = FALSE;
        OSStatus result = AUGraphIsRunning(mAudioGraph, &isRunning);
        if (result) {
            DEBUGPRINT(("AUGraphIsRunning result %d %08X %4.4s\n", (int)result, (int)result, (char *)&result));
            return;
        }
        if (isRunning) {
            result = AUGraphStop(mAudioGraph);
            if (result) {
                DEBUGPRINT(("AUGraphStop result %d %08X %4.4s\n", (int)result, (int)result, (char *)&result));
                return;
            } else {
                DEBUGPRINT(("mAudioGraph has been stopped!\n"));
            }
        } else {
            DEBUGPRINT(("mAudioGraph is already stopped!\n"));
        }
    }
}
You need to make sure the RemoteIO Audio Unit (the audio graph) is stopped before deactivating and/or changing the audio session type. Then (re)initialize the RemoteIO Audio Unit after setting the new session type and before (re)starting the graph, as the new session type or options may change some of the allowable settings. Also, it helps to check all the prior audio unit and audio session call error codes before any graph (re)start.
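As a rough sketch of that order of operations, reusing the stopAUGraph method and myRecorder ivar from the question (the wrapper method name and error handling are my own assumptions):

// Sketch only: stop rendering, deactivate, switch category, reactivate,
// then rebuild and restart the RemoteIO graph.
- (void)switchToRecordingSession {
    [self stopAUGraph];                                      // 1. stop the AUGraph first
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setActive:NO error:&error];                     // 2. deactivate the session
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setActive:YES error:&error];                    // 3. set the new category, then reactivate
    if (error) {
        NSLog(@"Audio session reconfiguration error: %@", error);
    }
    [myRecorder release];                                    // 4. rebuild the RemoteIO unit / AUGraph
    myRecorder = [[MyRecorder alloc] init];
    [myRecorder start];                                      // 5. restart the graph
}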

How to play audio with OutputAudioQueue in silent mode? [duplicate]

I want to play a sound even in silent mode in iPhone.
Can it be done by using AVAudioPlayer (without using AVAudioSession)?
(For iOS 3.0+)
Thanks in advance.
Actually, you can do this. It is controlled via the Audio Session and has nothing to do with AVAudioPlayer itself. Why don't you want to use AudioSession? They play nice together...
In your app, you should initialize the Audio Session, and then you can also indicate what kind of audio you intend to play. If you're a music player, then it makes sense that the user would want to hear the audio even with the ring/silent switch set to silent.
AudioSessionInitialize(NULL, NULL, NULL, NULL);
AudioSessionSetActive(true);

// Allow playback even if Ring/Silent switch is on mute
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                        sizeof(sessionCategory), &sessionCategory);
I have an app that does this very thing, using AVAudioPlayer to play audio, and with the ring/silent switch set to silent I can still hear the audio.
UPDATE (11/6/2013)
In the app I mentioned above, where I used the code above successfully, I have (for some time) been using the following code instead to achieve the same result:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *error = nil;
BOOL result = NO;

if ([audioSession respondsToSelector:@selector(setActive:withOptions:error:)]) {
    result = [audioSession setActive:YES withOptions:0 error:&error]; // iOS 6+
} else {
    result = [audioSession setActive:YES withFlags:0 error:&error]; // iOS 5 and below
}
if (!result && error) {
    // deal with the error
}

error = nil;
result = [audioSession setCategory:AVAudioSessionCategoryPlayback error:&error];
if (!result && error) {
    // deal with the error
}
I thought I'd post this as an alternative, in light of the most recent comment to this answer. :-)
MarkGranoff's solution is correct. However, if you prefer to do it in Obj-c instead of C, the following works as well:
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];
The above answers are correct. Following is the Swift version.
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    //print("AVAudioSession Category Playback OK")
    do {
        try AVAudioSession.sharedInstance().setActive(true)
        //print("AVAudioSession is Active")
    } catch _ as NSError {
        //print(error.localizedDescription)
    }
} catch _ as NSError {
    //print(error.localizedDescription)
}
Swift 4 simple version:
try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
This will simply do the trick (using AVAudioSession)
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
try AVAudioSession.sharedInstance().setActive(true)

Bluetooth output with AVAudioSessionCategoryPlayAndRecord session category doesn't work

Does anyone have any idea why I can't get Bluetooth output to work on an iPad (iOS 5) under the AVAudioSessionCategoryPlayAndRecord category?
Below is the code I'm using to set up the AVAudioSessionCategoryPlayAndRecord category and enable Bluetooth. If I switch the category to AVAudioSessionCategoryPlayback, Bluetooth output works just fine with a variety of headsets and headphones. As soon as I switch to PlayAndRecord, no Bluetooth output.
What am I doing wrong?
Suggestions, hypotheses appreciated. Thanks.
- (BOOL)setAudioSessionCategoryToPlayAndRecordWithBluetoothEnabled {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err;
    if (![audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err])
    {
        fprintf(stderr, "oh no...");
        return NO;
    }

    // make active
    [audioSession setActive:YES error:nil];

    // allow bluetooth
    UInt32 allowBluetoothInput = 1;
    OSStatus stat = AudioSessionSetProperty(
        kAudioSessionProperty_OverrideCategoryEnableBluetoothInput,
        sizeof(allowBluetoothInput),
        &allowBluetoothInput
    );
    if (stat) {
        fprintf(stderr, "not good...");
        return FALSE;
    }

    // check for bluetooth route
    // (NB: this never works if category is AVAudioSessionCategoryPlayAndRecord)
    UInt32 size = sizeof(CFStringRef);
    CFStringRef route;
    stat = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &size, &route);
    NSLog(@"route = %@", route);
    return TRUE;
}
