AVAudioRecorder blocks AVSpeechSynthesizer

I'm using Cordova 3.2 for text to speech and speech to text. Under iOS 7, AVSpeechSynthesizer is available and works very well. Here is the critical bit of the plugin:
self.synthesizer = [[AVSpeechSynthesizer alloc] init];
self.synthesizer.delegate = self;

NSString *toBeSpoken = [command.arguments objectAtIndex:0];
NSNumber *rate = [command.arguments objectAtIndex:1];
NSString *voice = [command.arguments objectAtIndex:2];
NSNumber *volume = [command.arguments objectAtIndex:3];
NSNumber *pitchMult = [command.arguments objectAtIndex:4];

AVSpeechUtterance *utt = [[AVSpeechUtterance alloc] initWithString:toBeSpoken];
utt.rate = [rate floatValue] / 4; // divided by 4 to slow down the speech rate
utt.volume = [volume floatValue];
utt.pitchMultiplier = [pitchMult floatValue];
utt.voice = [AVSpeechSynthesisVoice voiceWithLanguage:voice];

[self.synthesizer speakUtterance:utt];
The problem occurs after the text is spoken. Using the Cordova Media call (AVAudioRecorder) to record the response for voice-to-text conversion somehow disrupts the synthesizer output.
Some things that I've noticed during my attempts to figure this out:
Running in the simulator, there is no problem. In fact, I have to be careful to wait for the speech to end before recording, otherwise the recording will pick up the synthesizer output through the microphone.
On an iPad 3 with iOS 7+, starting the recording pauses the synthesizer output until the recorded file is played back. The Media reference to the file is released after a successful recording.
After recording, the synthesizer delegate receives responses:
TTSPlugin did start speaking
TTSPlugin will speak in range of speech string.
TTSPlugin will speak in range of speech string.
TTSPlugin will speak in range of speech string.
TTSPlugin did cancel speaking
Canceling the speech synthesis clears the utterance queue.
My goal is to be able to have a conversation with the app. I'm not able to find where the interference is. Help?
EDIT
I solved the problem. The culprit was AVAudioSession, which Cordova's Media plugin was managing. I hadn't worked with multiple audio sources before, so this was a stumper.
I added these to my TTS plugin to manage AVAudioSession and activate it as needed. Everything is fine now.
// Returns whether or not the audioSession is available - creates it if necessary
- (BOOL)hasAudioSession
{
    BOOL bSession = YES;
    if (!self.avSession) {
        NSError* error = nil;
        self.avSession = [AVAudioSession sharedInstance];
        if (error) {
            // not fatal if we can't get the AVAudioSession, just log the error
            NSLog(@"error creating audio session: %@", [[error userInfo] description]);
            self.avSession = nil;
            bSession = NO;
        }
    }
    return bSession;
}
- (void)setAudioSession
{
    if ([self hasAudioSession]) {
        NSError* __autoreleasing err = nil;
        NSNumber* playAudioWhenScreenIsLocked = 0;
        BOOL bPlayAudioWhenScreenIsLocked = YES;
        if (playAudioWhenScreenIsLocked != nil) {
            bPlayAudioWhenScreenIsLocked = [playAudioWhenScreenIsLocked boolValue];
        }
        NSString* sessionCategory = bPlayAudioWhenScreenIsLocked ? AVAudioSessionCategoryPlayback : AVAudioSessionCategorySoloAmbient;
        [self.avSession setCategory:sessionCategory error:&err];
        if (![self.avSession setActive:YES error:&err]) {
            // other audio with higher priority that does not allow mixing could cause this to fail
            NSLog(@"Unable to play audio: %@", [err localizedFailureReason]);
        }
    }
}
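For context, a minimal sketch of how the plugin's speak handler can use these helpers before queuing an utterance (the method names are the ones defined above; the utterance code is the snippet from the top of the question):
// Reclaim the session for playback, then speak (a sketch, not the exact plugin code).
[self setAudioSession];
[self.synthesizer speakUtterance:utt];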

The startRecordingAudio method in Media (CDVSound.m) sets the AVAudioSession category to AVAudioSessionCategoryRecord.
In my case, I just added the following line before speaking, and it works for me.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[self.synthesizer speakUtterance:utt];

Related

Upload audio clip in realtime while it's recording?

How do I upload an audio clip to a server in realtime while it's recording? Basically, my requirement is to upload the audio clip as chunks/packets while it is still being recorded.
I have already done the recording part using IQAudioRecorderController (https://github.com/hackiftekhar/IQAudioRecorderController). It records the audio and saves it to the temporary directory.
I want to know how to upload in realtime, without first saving the audio clip.
This is the recording part
// Unique recording URL
NSString *fileName = [[NSProcessInfo processInfo] globallyUniqueString];
_recordingFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.m4a", fileName]];

// Initiate and prepare the recorder
_audioRecorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:_recordingFilePath] settings:recordSetting error:nil];
_audioRecorder.delegate = self;
_audioRecorder.meteringEnabled = YES;
// Recording start
- (void)recordingButtonAction:(UIBarButtonItem *)item
{
    if (_isRecording == NO)
    {
        _isRecording = YES;

        // UI update
        {
            [self showNavigationButton:NO];
            _recordButton.tintColor = _recordingTintColor;
            _playButton.enabled = NO;
            _trashButton.enabled = NO;
        }

        // Create the recorder
        if ([[NSFileManager defaultManager] fileExistsAtPath:_recordingFilePath])
        {
            [[NSFileManager defaultManager] removeItemAtPath:_recordingFilePath error:nil];
        }

        _oldSessionCategory = [[AVAudioSession sharedInstance] category];
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
        [_audioRecorder prepareToRecord];
        [_audioRecorder record];
    }
    else
    {
        _isRecording = NO;

        // UI update
        {
            [self showNavigationButton:YES];
            _recordButton.tintColor = _normalTintColor;
            _playButton.enabled = YES;
            _trashButton.enabled = YES;
        }

        [_audioRecorder stop];
        [[AVAudioSession sharedInstance] setCategory:_oldSessionCategory error:nil];
    }
}
// Recording done
- (void)doneAction:(UIBarButtonItem *)item
{
    if ([self.delegate respondsToSelector:@selector(audioRecorderController:didFinishWithAudioAtPath:)])
    {
        IQAudioRecorderController *controller = (IQAudioRecorderController *)[self navigationController];
        [self.delegate audioRecorderController:controller didFinishWithAudioAtPath:_recordingFilePath];
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}
There are various ways of solving this. One way is to create your own audio graph (AUGraph). The audio graph can grab samples from the microphone or from a file. You then proceed to an output unit, but install a callback to get the sampled frames. You push these to your own network class, which can then upload them packet by packet.
A good example that shows how to write these captured packets to disk is AVCaptureAudioDataOutput.
In that example, packets are written using ExtAudioFileWriteAsync. You have to replace this with your own logic for uploading to a server. Note that while you can do that easily, one problem is that it gives you raw audio samples. If you need them as a WAV file or similar, you may need to wait until recording is finished, since the file header needs information about the contained audio samples.
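To make the callback idea concrete, here is a minimal sketch that uses an AVAudioEngine input tap instead of a hand-built AUGraph (assumes iOS 8 or later; the uploadChunk block is a placeholder for whatever network layer you use):
#import <AVFoundation/AVFoundation.h>

// Placeholder for your own upload logic; replace with your network class.
void (^uploadChunk)(NSData *) = ^(NSData *chunk) {
    NSLog(@"would upload %lu bytes", (unsigned long)chunk.length);
};

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioInputNode *input = engine.inputNode;
AVAudioFormat *format = [input outputFormatForBus:0];

// The tap block is called repeatedly with live PCM buffers from the microphone.
[input installTapOnBus:0 bufferSize:4096 format:format
                 block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    // Only the first channel is sent here, as a simplification.
    NSData *chunk = [NSData dataWithBytes:buffer.floatChannelData[0]
                                   length:buffer.frameLength * sizeof(float)];
    uploadChunk(chunk); // push each chunk to the server as it arrives
}];

NSError *error = nil;
[engine startAndReturnError:&error];
These are raw PCM samples, so the same caveat about container headers applies if the server expects a finished WAV/M4A file.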
The code you are currently using will work if you only need to upload the recorded file after recording is done, since it gives you the final recorded file.
If you want to upload a live audio recording to the server, then I think you have to go with a combination of:
AudioSession for the recording side
ffmpeg for uploading your live audio to the server
You can get good help for recording audio and managing audio buffers from here.
For ffmpeg, I think you have to learn a lot. It is easy to send a static/saved audio file to a server using ffmpeg, but sending a live audio buffer to a server will be a tricky job.

How do you record audio directly from iOS

My app involves creating and recording sounds you make within the app. I have tried multiple ways of recording, like AVAudioRecorder. So far these only offer a way to record through the microphone, but I want to record the actual sounds made within the app. How on earth do I achieve this? I've been on this for days, can someone please help me, thanks!
I also use an audio unit for playback in my application.
This is what I have so far, using AERecorder from The Amazing Audio Engine rather than AVAudioRecorder:
_audioController = [[AEAudioController alloc] initWithAudioDescription:[AEAudioController nonInterleavedFloatStereoAudioDescription] inputEnabled:YES];

- (void)beginRecording {
    // Init recorder
    self.recorder = [[AERecorder alloc] initWithAudioController:_audioController];

    NSString *documentsFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *filePath = [documentsFolder stringByAppendingPathComponent:@"Myaudio.aiff"];
    NSLog(@"%@", filePath);

    // Start the recording process
    NSError *error = NULL;
    if ( ![_recorder beginRecordingToFileAtPath:filePath
                                       fileType:kAudioFileAIFFType
                                          error:&error] ) {
        // Report error
        return;
    }

    // Receive both audio input and audio output. Note that if you're using
    // AEPlaythroughChannel, mentioned above, you may not need to receive the input again.
    [_audioController addInputReceiver:_recorder];
    [_audioController addOutputReceiver:_recorder];
}
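For comparison, a minimal sketch of capturing the app's own output without The Amazing Audio Engine, assuming iOS 8+ and that the app's sounds are routed through an AVAudioEngine; a tap on the main mixer records what the app plays rather than the microphone:
#import <AVFoundation/AVFoundation.h>

AVAudioEngine *engine = [[AVAudioEngine alloc] init]; // in practice, the engine your app already plays through
AVAudioMixerNode *mixer = engine.mainMixerNode;
AVAudioFormat *format = [mixer outputFormatForBus:0];

NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"AppOutput.caf"]];
NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForWriting:url
                                               settings:format.settings
                                                  error:&error];

// Every buffer that reaches the main mixer (i.e. the app's own output) is appended to the file.
[mixer installTapOnBus:0 bufferSize:4096 format:format
                 block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    NSError *writeError = nil;
    [file writeFromBuffer:buffer error:&writeError];
}];

[engine startAndReturnError:&error];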

iOS - Unable to play a sound in background

I'm currently trying to implement a system like the alarm one, to alert the user when an event occurs (in my case a Bluetooth event). I want this alert to fire even if the phone is in silent mode and the app is in the background.
I created a local notification, but I can't get a sound to play if the phone is in silent mode (which seems to be normal, since we put the phone in silent).
So I tried to manage the sound myself, and I'm struggling with playing sound in the background. So far I have added the "App plays audio or streams audio/video using AirPlay" key to my plist, and I'm using this code:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];

NSError *error = nil;
BOOL result = [audioSession setActive:YES error:&error];
if (!result && error) {
    NSLog(@"Error For AudioSession Activation: %@", error);
}

error = nil;
result = [audioSession setCategory:AVAudioSessionCategoryPlayback error:&error];
if (!result && error) {
    NSLog(@"Error For AudioSession Category: %@", error);
}

if (player == nil) {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"bell" ofType:@"wav"];
    NSURL *url = [NSURL fileURLWithPath:path];
    NSError *err = nil;
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&err];
    if (err) {
        NSLog(@"Error Player == %@", err);
    }
    else {
        NSLog(@"Play ready");
    }
}

[player prepareToPlay];
[player setVolume:1.0];
if ([player play]) {
    NSLog(@"YAY sound");
}
else {
    NSLog(@"Error sound");
}
The sound works great in the foreground, even in silent mode, but I get no sound at all in the background. Any ideas?
Thanks
EDIT:
Finally I got it working with the code above. The only missing point was that I was trying to play the sound in what appeared to be a different thread; when I play it directly in my Bluetooth event handler, rather than through my separate function call, it works.
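A minimal sketch of what that fix might look like, assuming the event arrives in a CBPeripheralDelegate callback; playAlertSound is a hypothetical helper that wraps the AVAudioSession/AVAudioPlayer code above:
#import <CoreBluetooth/CoreBluetooth.h>

// Hypothetical sketch: trigger playback directly from the Bluetooth delegate
// callback, hopping to the main queue before touching the player.
- (void)peripheral:(CBPeripheral *)peripheral
    didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
                              error:(NSError *)error
{
    dispatch_async(dispatch_get_main_queue(), ^{
        [self playAlertSound]; // hypothetical helper wrapping the code above
    });
}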
An app that plays audio continuously (even while the app is running in the background) can register as a background audio app by including the UIBackgroundModes key (with the value audio) in its Info.plist file. Apps that include this key must play audible content to the user while in the background.
Apple reference "Playing Background Audio"
See also: Ensuring That Audio Continues When the Screen Locks.
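For reference, the UIBackgroundModes entry described in the quote above looks like this in the raw Info.plist (it is the same setting the question refers to as "App plays audio or streams audio/video using AirPlay"):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>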
I'm not sure if your problem is that you have not configured the audio session output correctly. I had a similar problem while playing .wav files in my app: I could only hear them through earphones. This method helped me:
- (void)configureAVAudioSession {
    // get your app's audioSession singleton object
    AVAudioSession *session = [AVAudioSession sharedInstance];

    // error handling
    BOOL success;
    NSError *error;

    // set the audioSession category.
    // Needs to be Record or PlayAndRecord to use audioRouteOverride:
    success = [session setCategory:AVAudioSessionCategoryPlayAndRecord
                             error:&error];
    if (!success) NSLog(@"AVAudioSession error setting category:%@", error);

    // set the audioSession override
    success = [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker
                                         error:&error];
    if (!success) NSLog(@"AVAudioSession error overrideOutputAudioPort:%@", error);

    // activate the audio session
    success = [session setActive:YES error:&error];
    if (!success) NSLog(@"AVAudioSession error activating: %@", error);
    else NSLog(@"audioSession active");
}
Try it out!
You need to make a couple of changes in the plist file to enable sound when the app enters the background:
1) Set "Required background modes" to "App plays audio".
2) Set "Application does not run in background" to YES.
NSError *setCategoryErr = nil;
NSError *activationErr = nil;
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error:&setCategoryErr];
[[AVAudioSession sharedInstance] setActive:YES error:&activationErr];
Then you just need to write this much code, in the AppDelegate.
Now you can easily play audio while the phone screen is locked or the app is in the background.
It works fine for me. :)
For more, please refer to Playing Audio in Background Mode and the Audio Session Programming Guide.
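A minimal sketch of where that might live, assuming application:didFinishLaunchingWithOptions: is an acceptable place to configure the session once at launch:
#import <AVFoundation/AVFoundation.h>

// Hypothetical placement in the AppDelegate, per the answer above.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    NSError *setCategoryErr = nil;
    NSError *activationErr = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&setCategoryErr];
    [[AVAudioSession sharedInstance] setActive:YES error:&activationErr];
    return YES;
}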

Sound does not play on iPad speaker but works fine on headphones and on the iPod Touch/iPhone speakers

I know this question has been asked many times, and I have checked most of the related answers on SO, but I have had no luck finding the right answer to my problem.
Here is the problem:
I am trying to play an mp3 file (2 seconds at most) in a game when some event is triggered, and I use AVAudioPlayer to do so; below is the code:
NSError *error;
AVAudioPlayer *audioPlayer = [[[AVAudioPlayer alloc] initWithContentsOfURL:[[NSBundle mainBundle] URLForResource:@"ding" withExtension:@"mp3"] error:&error] autorelease];
if (error) {
    NSLog(@"Error creating audio player: %@", [error userInfo]);
}
else {
    BOOL success = [audioPlayer play];
    // This always logs "Play sound succeeded"
    NSLog(@"Play sound %@", success ? @"succeeded" : @"failed");
}
When I run this code on an iPhone 4S or iPod Touch 3/4, the sound always plays loud and clear, but on an iPad 1 or iPad 2 there is no sound from the speaker. Yet when I plug in my headphones, a weird thing happens: there is sound from the headphones! The iPad is not in mute mode and the URL is correct.
I am confused about why this happens.
PS: I tried the following code (got from HERE) to output the audio output port type:
CFDictionaryRef asCFType = nil;
UInt32 dataSize = sizeof(asCFType);
AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &dataSize, &asCFType);
NSDictionary *easyPeasy = (NSDictionary *)asCFType;
NSDictionary *firstOutput = (NSDictionary *)[[easyPeasy valueForKey:@"RouteDetailedDescription_Outputs"] objectAtIndex:0];
NSString *portType = (NSString *)[firstOutput valueForKey:@"RouteDetailedDescription_PortType"];
NSLog(@"first output port type is: %@!", portType);
When I plugged in my headphones, the output was "first output port type is headphone!", and when I unplugged them, the output turned out to be "first output port type is speaker!"
It would be great if someone could offer some help or advice.
There is a code-change solution to this, but also an end-user solution: turn the Ring/Silent switch to on. Specifically, the problem is that the default AVAudioSession category, AVAudioSessionCategorySoloAmbient, is silenced when the phone is in silent mode.
As mentioned by the original poster, you can override this behavior by calling:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
AVAudioSession Reference recommends setting the AVAudioSessionCategoryPlayback category:
For playing recorded music or other sounds that are central to the successful use of your app.
To update @JonBrooks' answer above with Swift 2 syntax, I put the following in my viewDidLoad() to override the silent switch and get my sound to play:
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch let error as NSError {
    print(error)
}
Solution found!
After I added this line of code before playing, the sound finally plays out of the speaker:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
I still do not understand why; I will dig into it. :)

iOS avaudiosession not starting on certain iOS hardware

I have a strange situation with an AVAudioRecorder used to record audio in my iOS app. I have found some rare iPad 2 devices on which the audio session will not start recording. The same code records fine on all other iPod Touches, iPhones, and iPads, as well as in the simulator; it only seems unable to record in a very few cases. Does anyone know why this might be happening?
Here is the coding information.
In my .h file I have the AVAudioRecorder set up...
RVC.h
@interface RVC : UIViewController <AVAudioRecorderDelegate, AVAudioPlayerDelegate> {
    AVAudioRecorder *audioRecorder;
}
@property (nonatomic, retain) AVAudioRecorder *audioRecorder;
RVC.m
@implementation RVC
@synthesize audioRecorder;

// Set up the file path and name, where the path is to the temporary directory
soundFile = [NSURL fileURLWithPath:[tempDir stringByAppendingFormat:@"theSoundFile.m4a"]];
// Set up the settings for the recording
soundSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                [NSNumber numberWithInt:AVAudioQualityHigh], AVEncoderAudioQualityKey, nil];
// Make the AVAudioRecorder with the filename and the sound settings.
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:soundFile settings:soundSetting error:&error];
// Check to make sure the iOS device has audio hardware that can make a recording
BOOL audioHWAvailable = [[AVAudioSession sharedInstance] inputIsAvailable];
if (!audioHWAvailable) {
    NSLog(@"No audio hardware is available.");
    return;
}
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
// Make sure any output sound is played through the speakers.
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride);
[[AVAudioSession sharedInstance] setActive:YES error:nil];
// If the audio recorder is properly set up, then start recording.
if (audioRecorder) {
    [audioRecorder prepareToRecord];
    // Start the recording
    isRecording = [audioRecorder record];
}
else {
    isRecording = FALSE;
    NSLog(@"The audio recorder was not properly set up.");
}

if (isRecording) {
    // The AVAudioSession is recording.
}
else {
    // The AVAudioSession did not start, so make some kind of log file.
    NSLog(@"The recording was not started properly.");
    // This is where THE PROBLEM SHOWS UP: it has trouble on some iPad 2 devices,
    // which I cannot replicate in the simulators.
    return;
}
You are passing in an error in the init call:
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:soundFile settings:soundSetting error:&error];
but you are never reading it. I would bet that on the failed cases, you are getting an error here, causing the audioRecorder to be nil.
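A minimal sketch of the check being suggested, reusing the soundFile and soundSetting variables from the question:
// Inspect the NSError and the recorder itself instead of ignoring them.
NSError *error = nil;
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:soundFile
                                                 settings:soundSetting
                                                    error:&error];
if (!self.audioRecorder) {
    NSLog(@"Could not create AVAudioRecorder: %@", error);
    return;
}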
