Starting AudioKit results in AudioHAL_Client errors 50% of the time

I've been using AudioKit for over 8 months, but recently I've run into a weird issue.
When I start AudioKit, in (roughly) 50% of the cases the audio stops playing after a few seconds and I get a stream of lower-level AudioHAL_Client errors:
2019-03-14 17:17:15.567027+0100 TestApp[68164:1626512] [AudioHAL_Client] HALC_ProxyIOContext.cpp:1399:IOWorkLoop: HALC_ProxyIOContext::IOWorkLoop: failed to send the final message to the server, Error: 0x10000003
2019-03-14 17:17:16.104180+0100 TestApp[68164:1626365] [AudioHAL_Client] HALC_ShellPlugIn.cpp:817:HAL_HardwarePlugIn_ObjectHasProperty: HAL_HardwarePlugIn_ObjectHasProperty: no object
or:
2019-03-15 08:15:33.756244+0100 macOSDevelopment[47186:2925180] [AudioHAL_Client] HALC_ProxyIOContext.cpp:1399:IOWorkLoop: HALC_ProxyIOContext::IOWorkLoop: failed to send the final message to the server, Error: 0x10000003
2019-03-15 08:15:34.290366+0100 macOSDevelopment[47186:2925038] [AudioHAL_Client] HALC_ShellPlugIn.cpp:817:HAL_HardwarePlugIn_ObjectHasProperty: HAL_HardwarePlugIn_ObjectHasProperty: no object
2019-03-15 08:15:34.290431+0100 macOSDevelopment[47186:2925038] [AudioHAL_Client] HALC_ShellPlugIn.cpp:817:HAL_HardwarePlugIn_ObjectHasProperty: HAL_HardwarePlugIn_ObjectHasProperty: no object
It is not related to my specific app: the same thing happens when I build the AudioKit macOS development app, and I've also reproduced it in a clean macOS project.
This is enough to trigger the bug:
AudioKit.output = AKMixer()
try AudioKit.start()
The same happens when I connect an AKOscillator instead of an AKMixer.
I've tried to debug this, but I cannot figure out what's going wrong.

Related

How do I stop AudioKit Inputs and Outputs efficiently and definitively?

I've been using AudioKit.stop() to reasonable effect, with some minor issues that I'd like to resolve now. Since updating to AudioKit 4.2, I'm getting a (very useful) error message from the following code:
do {
    try AudioKit.stop()
} catch {
    AKLog("AudioKit did not stop!")
}
The error output is this:
2018-04-24 16:14:48.099606+0100 MyProject[517:62626] [avas] AVAudioSession.mm:1142:
-[AVAudioSession setActive:withOptions:error:]: Deactivating an audio session that has
running I/O. All I/O should be stopped or paused prior to deactivating the audio session.
AudioKit.swift:stop():299:couldn't stop session
Error Domain=NSOSStatusErrorDomain Code=560030580 "(null)"
GameScene.swift:update:755:AudioKit did not stop!
So my question is - how do I properly stop or pause all inputs and outputs (I/O) as suggested by the error message?
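For what it's worth, 560030580 decodes to the four-character code '!act', which appears to be AVAudioSession's "session is busy" error and matches the message: the session can't be deactivated while I/O is still running. Below is a minimal sketch of the order the message asks for, assuming AudioKit 4 and hypothetical oscillator/player nodes standing in for whatever is currently producing audio:

// Stop every node that is producing or consuming audio first,
// so no I/O is running when the session gets deactivated.
oscillator.stop()   // hypothetical generator node
player.stop()       // hypothetical playback node

do {
    // AudioKit.stop() stops the engine and then deactivates the session;
    // with all I/O already stopped, the '!act' error should not appear.
    try AudioKit.stop()
} catch {
    AKLog("AudioKit did not stop: \(error)")
}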

Scrubbing does not work when playing a network stream video with vlcj

We are having the same issue that is described here: https://trac.videolan.org/vlc/ticket/4888
When we try to scrub the video, it does not work: vlcj buffers every time we scrub, and we get the error logs below:
[163bdd44] core input error: ES_OUT_SET_(GROUP_)PCR is called too late (pts_delay increased to 101 ms)
[163bdd44] core input error: ES_OUT_RESET_PCR called
[2403dc04] core vout display error: Failed to set on top
[164a8284] http access error: error: HTTP/1.1 500 Object not found
[15c5f1d4] core input error: input control fifo overflow, trashing type=3
Does anyone know if we can fix this with a VLC configuration option?
Thanks
Francisco

AVAudioEngine with Today Extension

Hello, I want to add the Speech framework to an iOS 10 Today Extension.
I'm trying to use the SpeakToMe sample (https://developer.apple.com/library/content/samplecode/SpeakToMe/Introduction/Intro.html) to record audio with AVAudioEngine.
But I get an exception when I try to start the audio engine:
try audioEngine.start()
2016-10-04 22:21:24.658964 VoiceReco[4225:1230467] [aurioc] 1316: AUIOClient_StartIO failed (561145187)
2016-10-04 22:21:24.663743 VoiceReco[4225:1230467] [central] 54: ERROR: [0x1af158c40] >avae> AVAudioEngineGraph.mm:1175: Start: error 561145187
The sample code from Apple works fine when I use a normal view controller without my extension. Does anyone know a workaround for this?
You can't record audio in an extension.
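For what it's worth, the status 561145187 decodes to the four-character code '!rec', AVAudioSession's cannot-start-recording error, which is consistent with the answer above. If you want the extension to fail gracefully rather than crash, here is a minimal sketch, assuming the SpeakToMe-style setup where a tap is already installed on the engine's input node:

audioEngine.prepare()
do {
    try audioEngine.start()
} catch {
    // Inside a Today Extension this path is hit with status 561145187
    // ('!rec', cannot start recording) instead of the engine starting.
    print("Audio engine failed to start: \(error)")
}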

Passing mono audio data to AVAudioEnvironmentNode

I am attempting to use an AVAudioEnvironmentNode to produce 3D spatialized sound for a game I'm working on. The AVAudioEnvironmentNode documentation states, "It is important to note that only inputs with a mono channel connection format to the environment node are spatialized. If the input is stereo, the audio is passed through without being spatialized. Currently inputs with connection formats of more than 2 channels are not supported." I have indeed found this to be the case. When I load audio buffers with two channels into an AVAudioPlayerNode and connect the node to an AVAudioEnvironmentNode, the output sound is not spatialized. My question is: how can I send mono data to the AVAudioEnvironmentNode?
I've tried creating a mono .wav file using Audacity as well as loading an AVAudioPCMBuffer with sine wave data programmatically. Either way, when I create a single-channel audio buffer and attempt to load it into an AVAudioPlayerNode, my program crashes with the following error:
2016-02-17 06:36:07.695 Test Scene[1577:751557] 06:36:07.694 ERROR: [0x1a1691000] AVAudioPlayerNode.mm:423: ScheduleBuffer: required condition is false: _outputFormat.channelCount == buffer.format.channelCount
2016-02-17 06:36:07.698 Test Scene[1577:751557] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: _outputFormat.channelCount == buffer.format.channelCount'
Checking the AVAudioPlayerNode output bus does indeed reveal that it expects 2 channels. It's unclear to me how this can be changed, or even if it should be.
I should add that I have very little experience working with AVFoundation or audio data in general, so any help you can provide would be greatly appreciated.
I hope you have solved this already, but for anyone else:
When connecting your AVAudioPlayerNode to an AVAudioMixerNode or any other node that it can connect to, you need to specify the number of channels there:
audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: AVAudioFormat(standardFormatWithSampleRate: 96000, channels: 1))
You can check your file's sample rate in Audacity, or via 'Get Info' in Finder.
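To put the whole graph together, here is a minimal sketch assuming a mono buffer at 44100 Hz (substitute your file's actual rate). The key point is that the player-to-environment connection uses a one-channel format, so scheduling a mono buffer no longer trips the channelCount assertion:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let environment = AVAudioEnvironmentNode()

engine.attach(player)
engine.attach(environment)

// Only mono inputs are spatialized by AVAudioEnvironmentNode, so the
// player is connected with an explicit one-channel format.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode,
               format: environment.outputFormat(forBus: 0))

do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}

// A one-channel AVAudioPCMBuffer can now be scheduled without hitting
// _outputFormat.channelCount == buffer.format.channelCount:
// player.scheduleBuffer(monoBuffer, completionHandler: nil)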

Prime failed ('nope'); will stop (1/5925 frames)

When started, the audio queue works well and plays back audio. But when I manually change the player's position, AudioQueuePrime fails with the output Prime failed ('nope'); will stop (1/5925 frames); the error code returned is 1852797029. The next AudioQueueStart nevertheless succeeds, so the player keeps working; AudioQueuePrime just fails sometimes. Why?
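Not a full answer, but one piece: 1852797029 is simply the four-character code 'nope' from the log line, expressed as a decimal OSStatus. A small helper like the sketch below (the names are my own) makes such CoreAudio status codes readable when debugging:

import Foundation

// Render a CoreAudio OSStatus as its four-character code when printable,
// e.g. 1852797029 -> "nope", 561145187 -> "!rec".
func fourCharCode(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [24, 16, 8, 0].map { UInt8((n >> $0) & 0xFF) }
    guard bytes.allSatisfy({ $0 >= 0x20 && $0 < 0x7F }) else { return "\(status)" }
    return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
}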

Resources