Unable to play audio / speech from iOS Swift Safari Extension on device (OSStatus error 2003329396)

I'm currently trying to use AVSpeechSynthesizer to speak text from within an iOS Safari extension:
let synthesizer = AVSpeechSynthesizer()
...
let utterance = AVSpeechUtterance(string: self.text)
utterance.rate = 0.55
self.synthesizer.speak(utterance)
On a simulator this works fine. However, on a physical device, I get the following error (even when the device is unmuted/volume-up):
NSURLConnection finished with error - code -1002
NSURLConnection finished with error - code -1002
NSURLConnection finished with error - code -1002
[AXTTSCommon] Failure starting audio queue alp!
[AXTTSCommon] Run loop timed out waiting for free audio buffer
[AXTTSCommon] _BeginSpeaking: speech cancelled error: Error Domain=TTSErrorDomain Code=-4001 "(null)"
[AXTTSCommon] _BeginSpeaking: couldn't begin playback
I have looked through quite a few SO and Apple Dev Forums threads and have tried many of the proposed solutions with no luck. Here are the things I've tried:
Linking AVFAudio.framework and AVFoundation.framework to the extension.
Starting an AVAudioSession prior to playing the utterance:
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [.mixWithOthers, .allowAirPlay])
    try session.setActive(true, options: .notifyOthersOnDeactivation)
} catch let error {
    print("Error starting audio: \(error.localizedDescription)")
}
This actually results in another error being thrown right before the same errors above:
Error starting audio: The operation couldn’t be completed. (OSStatus error 2003329396.)
Playing a plain mp3 audio file:
guard let url = Bundle.main.url(forResource: "sample", withExtension: "mp3") else {
    print("Couldn't find file")
    return
}
do {
    self.player = try AVAudioPlayer(contentsOf: url)
    self.player.play()
    print("**playing sound")
} catch let error as NSError {
    print("Error playing sound: \(error.localizedDescription)")
}
This prints the following:
**playing sound
[aqsrv] AQServer.cpp:72 Exception caught in AudioQueueInternalNotifyRunning - error -66671
Enabling Audio, AirPlay, and Picture in Picture in Background Modes for the main target app (not available for the extension).
Any help would be appreciated.

EDIT:
The solution below gets rejected due to a validation error when submitting to App Store Connect.
I filed a Technical Support Incident with Apple, and this was their response:
Safari extensions are very short-lived, hence not fit for audio playback or speech synthesis. Not being able to validate an app extension in Xcode with a manually-added plist entry for background audio is the designed behavior. The general recommendation is to synthesize speech using JavaScript in conjunction with the Web Speech API.
TLDR: Use the Web Speech API for text-to-speech in Safari extensions, not AVSpeechSynthesizer.
Original answer:
Adding the following to the extension's Info.plist allowed the audio to play as expected:
<dict>
    ...
    <key>UIBackgroundModes</key>
    <array>
        <string>audio</string>
    </array>
    ...
</dict>
Interestingly, it actually shows the same errors in the console as before, but it does play the audio.

Related

Audio Recording is not working during Screen-share with Zoom or other app

I am trying to record voice with AVAudioRecorder. It works fine when screen sharing is not enabled, but when I share my device's screen with Zoom or another app, the AVAudioSession is not active.
Here is the code I added for audio recording:
UIApplication.shared.beginReceivingRemoteControlEvents()
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
    try session.setActive(true)
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    audioRecorder = try AVAudioRecorder(url: getFileUrl(), settings: settings)
    audioRecorder.delegate = self
    audioRecorder.isMeteringEnabled = true
    audioRecorder.prepareToRecord()
    self.nextBtn.isHidden = true
} catch let error {
    print("Error \(error)")
}
When I hit the record button it shows me the error NSOSStatusErrorDomain Code=561017449 "Session activation failed".
Here I attach a video:
https://share.icloud.com/photos/0a09o5DCNip6Rx_GnTpht7K3A
I don't have the reputation to comment or I would. (Almost there lol!) Have you tried AVAudioSession.CategoryOptions.overrideMutedMicrophoneInterruption?
Edit
The more I looked into this, the more it seems that if Zoom is using the audio hardware, the iPhone won't be able to record that stream. I think that's the idea behind AVAudioSession.sharedInstance() being a singleton.
From the docs:
Type Property
overrideMutedMicrophoneInterruption: An option that indicates whether the system interrupts the audio session when it mutes the built-in microphone.
Declaration
static var overrideMutedMicrophoneInterruption: AVAudioSession.CategoryOptions { get }
Discussion
Some devices include a privacy feature that mutes the built-in microphone at the hardware level in certain conditions, such as when you close the Smart Folio cover of an iPad. When this occurs, the system interrupts the audio session that's capturing input from the microphone. Attempting to start audio input after the system mutes the microphone results in an error. If your app uses an audio session category that supports input and output, such as playAndRecord, you can set this option to disable the default behavior and continue using the session. Disabling the default behavior may be useful to allow your app to continue playback when recording or monitoring a muted microphone doesn't lead to a poor user experience. When you set this option, playback continues as normal, and the microphone hardware produces sample buffers, but with values of 0.
Important
Attempting to use this option with a session category that doesn't support audio input results in an error.
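A minimal sketch of how that option could be added to the session setup above (my assumption, not tested against Zoom; note that overrideMutedMicrophoneInterruption requires iOS 14.5 or later):
import AVFoundation

// Sketch only: configure the shared session so that a hardware-muted
// microphone does not interrupt the recording session (iOS 14.5+).
func configureSessionIgnoringMutedMic() throws {
    let session = AVAudioSession.sharedInstance()
    if #available(iOS 14.5, *) {
        try session.setCategory(.playAndRecord,
                                options: [.defaultToSpeaker, .overrideMutedMicrophoneInterruption])
    } else {
        try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
    }
    try session.setActive(true)
}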

How to use internal mic for input and bluetooth for output

I'm currently trying to have my device record audio for a capture session through the device mic while sending audio output to a Bluetooth device (AirPods).
The reason I'm doing this is that with Bluetooth headphones, and especially AirPods, the playback quality is horrible when the Bluetooth mic is active.
I tried using setPreferredInput, but it changes both input and output. Here's what I have so far:
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth, .mixWithOthers])
    print(session.currentRoute.outputs)
    try session.setAllowHapticsAndSystemSoundsDuringRecording(true)
    try session.setActive(true, options: .notifyOthersOnDeactivation)
    if let mic = session.availableInputs?.first(where: { $0.portType == AVAudioSession.Port.builtInMic }) {
        try session.setPreferredInput(mic)
    }
} catch let err {
    print("Audio session err", err.localizedDescription)
}
I also saw an older API that could have helped, but it seems to be long deprecated (kAudioSessionProperty_OverrideCategoryEnableBluetoothInput for AudioSession).
There are other apps on the App Store that seem to have achieved this split recording, so it appears to be possible.
Get rid of allowBluetooth and use allowBluetoothA2DP. You also don't want defaultToSpeaker here.
"Allow Bluetooth" actually means "prefer HFP" which is why the audio is so bad. HFP is a low-bandwidth bidirectional protocol used generally for phone calls. The enum name is very confusing IMO. People get confused about it all the time.
A2DP is a high-bandwidth unidirectional protocol (it doesn't support a microphone). When you request that, the headset's microphone will be disabled, and you'll get the iPhone's microphone by default (provided there isn't some other wired microphone available, but that's very unlikely).
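A minimal sketch of the session setup that answer describes (the function name and exact option set are my assumptions, not the answerer's code):
import AVFoundation

// Sketch only: play back over A2DP (high quality, output-only) and keep
// the built-in microphone as the input.
func configureSplitRouting() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetoothA2DP, .mixWithOthers])
    try session.setActive(true)
    // With A2DP the headset mic is unavailable, so the built-in mic is
    // normally chosen automatically; setting it explicitly is optional.
    if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
        try session.setPreferredInput(builtInMic)
    }
}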

Very quiet audio samples from AVCaptureMultiCamSession after disconnecting Lightning earphones

I've encountered an issue with some AVFoundation video and audio capture code that only happens when changing AVCaptureSession() to AVCaptureMultiCamSession() with no other code changes.
It only reproduces intermittently, and the steps are:
Boot the app and capture session with no external mic connected
Connect a pair of Apple Lightning earphones
Start recording video and audio - the audio comes from the mic on the Lightning earphones as expected
Unplug the Apple Lightning earphones
The mic input switches to a built-in mic on the device; however, the average power level on the audio samples coming in via AVCaptureAudioDataOutputSampleBufferDelegate is now much quieter, by about 25-30 dB.
This was observed by graphing the values of AVCaptureAudioChannel.averagePowerLevel from inside AVCaptureAudioDataOutputSampleBufferDelegate.captureOutput(_:didOutput:from:).
The issue only reproduces intermittently; sometimes after unplugging the Lightning earphones the level coming from the built-in mic is completely normal. It only occurs when instantiating an AVCaptureMultiCamSession instead of an AVCaptureSession; the rest of the code is unchanged.
There is no handling of changes to the audio routing in the app. It's using the default audio device so the switch to the built-in mic is all being handled by AVFoundation. The code for setting up the audio capture is:
guard let audioDevice = AVCaptureDevice.default(for: .audio) else {
    fatalError("could not create default audio device")
}
let audioIn = try! AVCaptureDeviceInput(device: audioDevice)
if self.session.canAddInput(audioIn) {
    self.session.addInput(audioIn)
}
let audioOut = AVCaptureAudioDataOutput()
let audioCaptureQueue = DispatchQueue(label: "com.test.capturepipeline.audio", attributes: [])
audioOut.setSampleBufferDelegate(self, queue: audioCaptureQueue)
if self.session.canAddOutput(audioOut) {
    self.session.addOutput(audioOut)
}
self.audioConnection = audioOut.connection(with: .audio)
if let aSettings = audioOut.recommendedAudioSettingsForAssetWriter(writingTo: .mov) {
    self.audioCompressionSettings = aSettings as? [String: Any]
}
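For reference, the power levels described above are read from the sample buffer delegate roughly like this (a sketch of my own, not the asker's actual measurement code):
// Sketch: inside the existing AVCaptureAudioDataOutputSampleBufferDelegate
// conformance, log the average power level of each audio channel for every
// audio sample buffer the capture session delivers.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard output is AVCaptureAudioDataOutput else { return }
    let levels = connection.audioChannels.map { $0.averagePowerLevel }
    print("average power levels (dB):", levels)
}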
One theory is that this is a bug with audio beamforming that AVFoundation is configuring internally. It's strange how the audio is still there but is just very quiet. Maybe AVFoundation's weighted processing of audio samples from the multiple built-in microphones done for beamforming is somehow missing some audio data, resulting in a very quiet final signal?
Perhaps someone else has encountered similar sorts of problems with AVCaptureMultiCamSession? It's new in iOS 13 and is only supported on fairly recent hardware.
Any ideas?
Thanks!

AVAssetDownloadTask iOS13

I tried iOS 13.0 and iOS 13.1 and it's still not working. I tried both AVAggregateAssetDownloadTask and AVAssetDownloadURLSession, but neither works: no delegate is called to report an error or completion, and the downloaded cache was only 25 KB, which is not the right size.
The error is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedDescription=The operation could not be completed, _NSURLErrorFailingURLSessionTaskErrorKey=BackgroundAVAssetDownloadTask <AFDCA3CC-FA49-488B-AB16-C74425345EE4>.<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"BackgroundAVAssetDownloadTask <AFDCA3CC-FA49-488B-AB16-C74425345EE4>.<1>"
), NSLocalizedFailureReason=An unknown error occurred (-16654)}
It turns out that on iOS 13+, AVAssetDownloadURLSession can only download HLS streams whose master playlist contains a CODECS attribute in its #EXT-X-STREAM-INF entries.
I have no idea whether this is a bug or an intentional restriction.
(An m3u8 without a CODECS attribute can still be played with AVFoundation, but it can't be downloaded with AVAssetDownloadURLSession.)
Anyway, the solution is:
If you have an HLS master playlist:
Add a CODECS attribute to the #EXT-X-STREAM-INF line in your m3u8.
e.g.
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=63701,CODECS="mp4a.40.34"
playlist.m3u8
If you don't have an HLS master playlist yet:
You have to make a master playlist even if you're not supporting adaptive streaming.
The master playlist is the only m3u8 that can contain #EXT-X-STREAM-INF, and hence the CODECS attribute.
So, I found out that AVAssetDownloadTask had a problem calling its delegates on iOS 13 (13.1, 13.2, 13.3). Finally, in iOS 13.4.1, Apple fixed this, and the delegates are now called after setting the delegate and starting the task. Below is what I used to start downloading the m3u8 file from the server and saving it as an asset to play offline later.
func downloadVideo(_ url: URL) {
    let configuration = URLSessionConfiguration.background(withIdentifier: currentFileName)
    let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                    assetDownloadDelegate: self,
                                                    delegateQueue: OperationQueue.main)
    // HLS Asset URL
    let asset = AVURLAsset(url: url)
    // Create new AVAssetDownloadTask for the desired asset
    let downloadTask = downloadSession.makeAssetDownloadTask(asset: asset,
                                                             assetTitle: currentFileName,
                                                             assetArtworkData: nil,
                                                             options: nil)
    // Start task and begin download
    downloadTask?.resume()
}
I tried this on iOS 12 and iOS 13.4.1 and it is working as expected. Also, it was already on the Apple Developer Forums here. Hope this helps someone.
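For completeness, here is a minimal sketch of the delegate side, which the answer above doesn't show (DownloadManager is a hypothetical class name, and persisting the path in UserDefaults is just one option):
import AVFoundation

final class DownloadManager: NSObject {}

extension DownloadManager: AVAssetDownloadDelegate {
    // Called periodically with download progress.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didLoad timeRange: CMTimeRange,
                    totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                    timeRangeExpectedToLoad: CMTimeRange) {
        let loaded = loadedTimeRanges.reduce(0.0) { $0 + $1.timeRangeValue.duration.seconds }
        let expected = timeRangeExpectedToLoad.duration.seconds
        print("progress: \(loaded / expected)")
    }

    // Called when the asset finishes downloading; save this relative path
    // so the asset can be located for offline playback later.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        UserDefaults.standard.set(location.relativePath,
                                  forKey: assetDownloadTask.urlAsset.url.absoluteString)
    }

    // Called when the task completes, with an error if the download failed.
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        if let error = error {
            print("download failed: \(error)")
        }
    }
}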

How to add url in avaudioplayer in swift

I just want to ask how to play audio with AVAudioPlayer using a URL link, and I'm getting a bad request error. Can someone help me?
I already tried AVPlayer, but it is not suitable for me; it works, but I'd prefer to use AVAudioPlayer.
let mp3URL = NSURL(fileURLWithPath: "https://s3.amazonaws.com/kargopolov/kukushka.mp3")
do {
    // 2
    audioPlayer = try AVAudioPlayer(contentsOf: mp3URL as URL)
    audioPlayer.play()
    // 3
    Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(updateAudioProgressView), userInfo: nil, repeats: true)
    progressView.setProgress(Float(audioPlayer.currentTime / audioPlayer.duration), animated: false)
}
catch {
    print("An error occurred while trying to extract audio file")
}
According to the AVAudioPlayer documentation, if you want to play audio over the internet (that's not already downloaded to disk or memory) you should use AVPlayer.
The docs describe AVAudioPlayer as
An audio player that provides playback of audio data from a file or memory.
A little further down, in the overview section, emphasis added:
Use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
Notes
You can get some more information from the runtime by accessing the error object that is thrown when the connection fails. Change catch { to catch let error { and then you can log out the error as part of your message, like so:
catch let error {
    print("An error occurred while trying to extract audio file: \(error)")
}
When I run your sample code in Xcode playgrounds with the change noted above, I see the following:
An error occurred while trying to extract audio file: Error Domain=NSOSStatusErrorDomain Code=2003334207 "(null)"
Notice the error's Domain and Code. Pasting that error into Google yields some results that indicate that the URL might not be resolving correctly. (Indeed, clicking on that mp3 link shows a bad access error message.)
The first result is another StackOverflow post with a similar issues. The second one is an Apple Developer Forum post which has some more information.
Let's try to change the URL to a publicly accessible sample mp3 file. (I found this one by searching the web for "test mp3 file" on Google.)
You'll want to change NSURL to URL, and instead of fileURLWithPath, you're going to want to use another initializer. (Say, string:.)
Whenever you see contentsOf...: in a media or file API, there's a good chance it expects data or a file, to the exclusion of a network stream. Similarly, when you see an initializer or method that takes a fileURL..., the system expects to be pointing to a local resource, not a network URL.
if let mp3URL = URL(string: "https://s3.amazonaws.com/kargopolov/kukushka.mp3") {
    // do something with the URL here
}
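Since the documentation points you to AVPlayer for network streams, here is a minimal sketch of that approach (my own illustration, using the same remote URL):
import AVFoundation

// Sketch: stream the remote mp3 with AVPlayer instead of AVAudioPlayer.
// Keep a strong reference to the player (e.g. a property), or playback stops.
let player = AVPlayer(url: URL(string: "https://s3.amazonaws.com/kargopolov/kukushka.mp3")!)
player.play()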
Anyway, you should know the basics of iOS.
