AVAudioUnitSampler generates sinewaves after headphones route change, iOS 11 iPhone - ios

I'm facing a strange issue on iPhone (iOS 11) when using AVAudioUnitSampler.
Let's say I have an AVAudioUnitSampler initialised with a piano sound. Every time I connect or disconnect the headphones, I hear the piano sound plus a sine-wave tone added to it, which gets louder the more times I connect/disconnect the headphones.
To me it feels as if, every time the headphones are plugged or unplugged, a new sampler audio unit is internally attached to the sound output (and, since it is uninitialised, it generates plain sine tones).
The following class already shows the problem. Note that I'm using AudioKit to handle MIDI signals and trigger the sampler (although on that end everything seems to work fine, i.e. startNote() and stopNote() get called properly):
class MidiController: NSObject, AKMIDIListener {
    var midi = AKMIDI()
    var engine = AVAudioEngine()
    var samplerUnit = AVAudioUnitSampler()

    override public init() {
        super.init()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleRouteChange),
            name: .AVAudioSessionRouteChange,
            object: nil)
        midi.openInput()
        midi.addListener(self)
        engine.attach(samplerUnit)
        engine.connect(samplerUnit, to: engine.outputNode)
        startEngine()
    }

    func startEngine() {
        if (!engine.isRunning) {
            do {
                try self.engine.start()
            } catch {
                fatalError("couldn't start engine.")
            }
        }
    }

    @objc func handleRouteChange(notification: NSNotification) {
        let deadlineTime = DispatchTime.now() + .milliseconds(100)
        DispatchQueue.main.asyncAfter(deadline: deadlineTime) {
            self.startEngine()
        }
    }

    func receivedMIDINoteOn(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        if velocity > 0 {
            samplerUnit.startNote(noteNumber: noteNumber, velocity: 127, channel: 0)
        } else {
            samplerUnit.stopNote(noteNumber: noteNumber, channel: 0)
        }
    }

    func receivedMIDINoteOff(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        samplerUnit.stopNote(noteNumber: noteNumber, channel: 0)
    }
}
I have forked AudioKit and replaced the HelloWorld example with a minimal project in which I can reproduce this problem.
Also, I couldn't reproduce the problem on iPad under either iOS 9.3 or iOS 11, so this might be an iPhone-specific issue.
Any help or suggestion on how to continue debugging this would be very welcome; I'm quite puzzled by this, and I'm not really an expert on iOS audio development.
Thanks!

You might try checking whether the engine is already running in handleRouteChange and, if it is, bouncing it (stop and restart) instead of just starting it. Let us know if that works.
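Roughly, that could look like this (a sketch based on the question's handleRouteChange; untested, so treat it as a starting point only):
@objc func handleRouteChange(notification: NSNotification) {
    let deadlineTime = DispatchTime.now() + .milliseconds(100)
    DispatchQueue.main.asyncAfter(deadline: deadlineTime) {
        // If the engine is still flagged as running after the route change,
        // stop it first so it rebuilds its render chain, then start it again.
        if self.engine.isRunning {
            self.engine.stop()
        }
        self.startEngine()
    }
}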

Related

How to 10 second forward or backward in Spotify player

I am trying to add (move forward) 10 seconds to the song position, or subtract (move backward) 10 seconds, in the Spotify player, but I am really confused about how to do it.
When I try to use this code, the song's position doesn't change:
// forward button action
@IBAction func moveFrdBtnAction(_ sender: Any) {
    SpotifyManager.shared.audioStreaming(SpotifyManager.shared.player, didSeekToPosition: TimeInterval(10))
}

// spotify delegate method seekToPosition
func audioStreaming(_ audioStreaming: SPTAudioStreamingController!, didSeekToPosition position: TimeInterval) {
    player?.seek(to: position, callback: { (error) in
        let songDuration = audioStreaming.metadata.currentTrack?.duration as Any as! Double
        self.delegate?.getSongTime(timeCount: Int(songDuration)+1)
    })
}
We are making a music application using the same SDK on both platforms (Android & iOS). The seekToPosition method of the Spotify SDK works correctly in the Android version; however, it is not working in the iOS one. The delegate method does get called, but the music stops.
Can you kindly let us know why this scenario is happening, and what we should do to make it work on iOS devices as well?
Can someone please explain to me how to solve this? I've tried to solve it but no results yet.
Any help would be greatly appreciated.
Thanks in advance.
I don't use this API, so my answer is based on your code and Spotify's reference documentation.
I think there are a few things wrong with your flow:
As Robert Dresler commented, you should (approximately) never call a delegate directly; a delegate calls you.
I'm pretty sure your action currently results in jumping to exactly 10 seconds, not by 10 seconds.
(As an aside, I'd suggest changing the name of your function moveFrdBtnAction to at least add more vowels.)
Anyway, here's my best guess at what you want:
// forward button action
@IBAction func moveForwardButtonAction(_ sender: Any) {
    skipAudio(by: 10)
}

@IBAction func moveBackButtonAction(_ sender: Any) {
    skipAudio(by: -10)
}

func skipAudio(by interval: TimeInterval) {
    if let player = player {
        let position = player.playbackState.position // The documentation alludes to milliseconds but examples don't.
        player.seek(to: position + interval, callback: { (error) in
            // Handle the error (if any)
        })
    }
}

// spotify delegate method seekToPosition
func audioStreaming(_ audioStreaming: SPTAudioStreamingController!, didSeekToPosition position: TimeInterval) {
    // Update your UI
}
Note that I have not handled seeking before the start of the track, nor past the end, either of which could happen with a simple position + interval. The API may handle this for you, or it may not.
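For illustration, a clamped variant of skipAudio(by:) might look roughly like this (a sketch assuming the track duration is reachable through the player's metadata, as in the question's code; untested):
func skipAudio(by interval: TimeInterval) {
    guard let player = player else { return }
    let position = player.playbackState.position
    // Clamp the target to the track bounds; the duration is assumed to come from
    // the same metadata the question's code reads (currentTrack?.duration).
    let duration = player.metadata?.currentTrack?.duration ?? TimeInterval.greatestFiniteMagnitude
    let target = min(max(position + interval, 0), duration)
    player.seek(to: target, callback: { (error) in
        // Handle the error (if any)
    })
}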
You could take a look at the examples here: spotify/ios-sdk. In the NowPlayingView example they use 'seekForward15Seconds'; maybe you could use that? If you still need 10 seconds, I have added a function below. The position is in milliseconds.
"position: The position to seek to in milliseconds"
docs
ViewController.swift
var appRemote: SPTAppRemote {
    get {
        return AppDelegate.sharedInstance.appRemote
    }
}

fileprivate func seekForward15Seconds() {
    appRemote.playerAPI?.seekForward15Seconds(defaultCallback)
}

fileprivate func seekBackward15Seconds() {
    appRemote.playerAPI?.seekBackward15Seconds(defaultCallback)
}
// TODO: Or you could try this function
func seekForward(seconds: Int) {
    appRemote.playerAPI?.getPlayerState({ (result, error) in
        // playback position is in milliseconds; playerState is assumed to have been
        // stored from an earlier getPlayerState callback, as in the NowPlayingView example
        guard let currentPosition = self.playerState?.playbackPosition else { return }
        let secondsInMilliseconds = seconds * 1000
        self.appRemote.playerAPI?.seek(toPosition: currentPosition + secondsInMilliseconds, callback: { (result, error) in
            guard error == nil else {
                print(error!)
                return
            }
        })
    })
}
var defaultCallback: SPTAppRemoteCallback {
    get {
        return { [weak self] _, error in
            if let error = error {
                self?.displayError(error as NSError)
            }
        }
    }
}

AppDelegate.swift

lazy var appRemote: SPTAppRemote = {
    let configuration = SPTConfiguration(clientID: self.clientIdentifier, redirectURL: self.redirectUri)
    let appRemote = SPTAppRemote(configuration: configuration, logLevel: .debug)
    appRemote.connectionParameters.accessToken = self.accessToken
    appRemote.delegate = self
    return appRemote
}()

class var sharedInstance: AppDelegate {
    get {
        return UIApplication.shared.delegate as! AppDelegate
    }
}
Edit 1:
For this to work you need to follow the Prepare Your Environment steps:
Add the SpotifyiOS.framework to your Xcode project.
Hope it helps!

AudioKit : AKNodeOutputPlot and AKMicrophone not working, potentially due to Lifecycle or MVVM architecture decisions

Early in my learning with AudioKit, and scaling it up in a larger app, I took the standard advice that AudioKit should effectively be a global singleton. I managed to build a really sophisticated prototype and all was well in the world.
Once I started to scale up and get closer to an actual release, we decided to go MVVM for our architecture and to avoid having one monstrous AudioKit singleton handle every aspect of our audio needs in the app. In short, MVVM has been incredibly elegant and has demonstrably cleaned up our code base.
In direct relation to our AudioKit structure, it goes something like this:
AudioKit and AKMixer reside in a singleton instance, which has public functions that allow the various view models and our other audio models to attach and detach the various nodes (AKPlayer, AKSampler, etc...). In the minimal testing I have done, I can confirm that this works: I tried it with my AKPlayer module and it works great.
I'm running into an issue where I cannot, for the life of me, get AKNodeOutputPlot and AKMicrophone to work with each other, despite the actual code implementation being identical to my working prototypes.
My concern is whether I did the wrong thing in thinking I could modularize AudioKit and the various nodes and components that need to connect to it, or whether AKNodeOutputPlot has special requirements I am not aware of.
Here is the briefest snippets of Code I can provide without overwhelming the question:
AudioKit Singleton (called in AppDelegate):
import Foundation
import AudioKit
class AudioKitConfigurator
{
static let shared: AudioKitConfigurator = AudioKitConfigurator()
private let mainMixer: AKMixer = AKMixer()
private init()
{
makeMainMixer()
configureAudioKitSettings()
startAudioEngine()
}
deinit
{
stopAudioEngine()
}
private func makeMainMixer()
{
AudioKit.output = mainMixer
}
func mainMixer(add node: AKNode)
{
mainMixer.connect(input: node)
}
func mainMixer(remove node: AKNode)
{
node.detach()
}
private func configureAudioKitSettings()
{
AKAudioFile.cleanTempDirectory()
AKSettings.defaultToSpeaker = true
AKSettings.playbackWhileMuted = true
AKSettings.bufferLength = .medium
do
{
try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
}
catch
{
AKLog("Could not set session category.")
}
}
private func startAudioEngine()
{
do
{
try AudioKit.start()
}
catch
{
AKLog("Fatal Error: AudioKit did not start!")
}
}
private func stopAudioEngine()
{
do
{
try AudioKit.stop()
}
catch
{
AKLog("Fatal Error: AudioKit did not stop!")
}
}
}
Microphone Component:
import Foundation
import AudioKit
import AudioKitUI
enum MicErrorsToThrow: String, Error
{
case recordingTooShort = "The recording was too short, just silently failing"
case audioFileFailedToUnwrap = "The Audio File failed to Unwrap from the recorder"
case recorderError = "The Recorder was unable to start recording."
case recorderCantReset = "In attempt to reset the recorder, it was unable to"
}
class Microphone
{
private var mic: AKMicrophone = AKMicrophone()
private var micMixer: AKMixer = AKMixer()
private var micBooster: AKBooster = AKBooster()
private var recorder: AKNodeRecorder!
private var recordingTimer: Timer
init()
{
micMixer = AKMixer(mic)
micBooster = AKBooster(micMixer)
micBooster.gain = 0
recorder = try? AKNodeRecorder(node: micMixer)
//TODO: Need to finish the recording timer implementation, leaving blank for now
recordingTimer = Timer(timeInterval: 120, repeats: false, block: { (timer) in
})
AudioKitConfigurator.shared.mainMixer(add: micBooster)
}
deinit {
// removeComponent()
}
public func removeComponent()
{
AudioKitConfigurator.shared.mainMixer(remove: micBooster)
}
public func reset() throws
{
if recorder.isRecording
{
recorder.stop()
}
do
{
try recorder.reset()
}
catch
{
AKLog("Recorder can't reset!")
throw MicErrorsToThrow.recorderCantReset
}
}
public func setHeadphoneMonitoring()
{
// microphone will be monitored while recording
// only if headphones are plugged
if AKSettings.headPhonesPlugged {
micBooster.gain = 1
}
}
/// Start recording from mic, call this function when using in conjunction with a AKNodeOutputPlot so that it can display the waveform in realtime while recording
///
/// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
/// - Throws: Only error to throw is from recorder property can't start recording, something wrong with microphone. Enum is MicErrorsToThrow.recorderError
public func record(waveformPlot: AKNodeOutputPlot) throws
{
waveformPlot.node = mic
do
{
try recorder.record()
// self.recordingTimer.fire()
}
catch
{
print("Error recording!")
throw MicErrorsToThrow.recorderError
}
}
/// Stop the recorder, and get the recording as an AKAudioFile, necessary to call if you are using AKNodeOutputPlot
///
/// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
/// - Returns: AKAudioFile
/// - Throws: Two possible errors, recording was too short (right now is 0.0, but should probably be like 0.5 secs), or could not retrieve audio file from recorder, MicErrorsToThrow.audioFileFailedToUnwrap, MicErrorsToThrow.recordingTooShort
public func stopRecording(waveformPlot: AKNodeOutputPlot) throws -> AKAudioFile
{
waveformPlot.pause()
waveformPlot.node = nil
recordingTimer.invalidate()
if let tape = recorder.audioFile
{
if tape.duration > 0.0
{
recorder.stop()
AKLog("Printing tape: CountOfFloatChannelData:\(tape.floatChannelData?.first?.count) | maxLevel:\(tape.maxLevel)")
return tape
}
else
{
//TODO: This should be more gentle than an NSError, it's just that they managed to tap the buttona and tap again to record nothing, honestly duration should probbaly be like 0.5, or 1.0 even. But let's return some sort of "safe" error that doesn't require UI
throw MicErrorsToThrow.recordingTooShort
}
}
else
{
//TODO: need to return error here, could not recover audioFile from recorder
AKLog("Can't retrieve or unwrap audioFile from recorder!")
throw MicErrorsToThrow.audioFileFailedToUnwrap
}
}
}
Now, in my VC, the AKNodeOutputPlot is a view on the Storyboard hooked up via an IBOutlet. It renders on screen, it's styled to my liking, and it's definitely connected and working. Also in the VC/VM is an instance property of my Microphone component. My thinking was that upon recording, we would pass the plot object to the ViewModel, which would then call the record(waveformPlot: AKNodeOutputPlot) function of Microphone, where waveformPlot.node = mic should be sufficient to hook them up. Sadly, this is not the case.
View:
class ComposerVC: UIViewController, Storyboarded
{
var coordinator: MainCoordinator?
let viewModel: ComposerViewModel = ComposerViewModel()
@IBOutlet weak var recordButton: RecordButton!
@IBOutlet weak var waveformPlot: AKNodeOutputPlot! // Here is our waveformPlot object, again confirmed rendering and styled
// MARK:- VC Lifecycle Methods
override func viewDidLoad()
{
super.viewDidLoad()
setupNavigationBar()
setupConductorButton()
setupRecordButton()
}
func setupWaveformPlot() {
waveformPlot.plotType = .rolling
waveformPlot.gain = 1.0
waveformPlot.shouldFill = true
}
override func viewDidAppear(_ animated: Bool)
{
super.viewDidAppear(animated)
setupWaveformPlot()
self.didDismissComposerDetailToRootController()
}
// Upon touching the Record Button, it in turn will talk to ViewModel which will then call Microphone module to record and hookup waveformPlot.node = mic
@IBAction func tappedRecordView(_ sender: Any)
{
self.recordButton.recording.toggle()
self.recordButton.animateToggle()
self.viewModel.tappedRecord(waveformPlot: waveformPlot)
{ (waveformViewModel, error) in
if let waveformViewModel = waveformViewModel
{
self.segueToEditWaveForm()
self.performSegue(withIdentifier: "composerToEditWaveForm", sender: waveformViewModel)
//self.performSegue(withIdentifier: "composerToDetailSegue", sender: self)
}
}
}
}
ViewModel:
import Foundation
import AudioKit
import AudioKitUI
class ComposerViewModel: ViewModelProtocol
{
//MARK:- Instance Variables
var recordingState: RecordingState
var mic: Microphone = Microphone()
init()
{
self.recordingState = .readyToRecord
}
func resetViewModel()
{
self.resetRecorder()
}
func resetRecorder()
{
do
{
try mic.reset()
}
catch let error as MicErrorsToThrow
{
switch error {
case .audioFileFailedToUnwrap:
print(error)
case .recorderCantReset:
print(error)
case .recorderError:
print(error)
case .recordingTooShort:
print(error)
}
}
catch {
print("Secondary catch in start recording?!")
}
recordingState = .readyToRecord
}
func tappedRecord(waveformPlot: AKNodeOutputPlot, completion: ((EditWaveFormViewModel?, Error?) -> ())? = nil)
{
switch recordingState
{
case .readyToRecord:
self.startRecording(waveformPlot: waveformPlot)
case .recording:
self.stopRecording(waveformPlot: waveformPlot, completion: completion)
case .finishedRecording: break
}
}
func startRecording(waveformPlot: AKNodeOutputPlot)
{
recordingState = .recording
mic.setHeadphoneMonitoring()
do
{
try mic.record(waveformPlot: waveformPlot)
}
catch let error as MicErrorsToThrow
{
switch error {
case .audioFileFailedToUnwrap:
print(error)
case .recorderCantReset:
print(error)
case .recorderError:
print(error)
case .recordingTooShort:
print(error)
}
}
catch {
print("Secondary catch in start recording?!")
}
}
I'm happy to provide more code, but I just don't want to overwhelm anyone. The logic seems sound; I just feel I'm missing something obvious, or I have a complete misunderstanding of AudioKit + AKNodeOutputPlot + AKMicrophone.
Any ideas are very welcome, thank you!
EDIT
AudioKit 4.6 fixed all the issues! Highly encourage MVVM/Modularization of AudioKit for your projects!
====
So after a lot of experimenting, I have come to a few conclusions:
In a separate project, I brought over my AudioKitConfigurator and Microphone classes, initialized them, hooked them up to an AKNodeOutputPlot, and it worked flawlessly (see the sketch below).
In my very large project, no matter what I do, I cannot get the same classes to work at all.
For now, I am reverting to an old build and slowly adding components until it breaks again, and I will update the architecture piece by piece, as this problem is too complex and might be interacting with some other libraries. I have also downgraded from AudioKit 4.5.6 to AudioKit 4.5.3.
This is not a solution, but it is the only workable one right now. The good news is that it is entirely possible to structure AudioKit to work with an MVVM architecture.
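For reference, the standalone hookup in the separate project was roughly the following (a sketch reusing only the calls already shown above, run from a view controller once the view is on screen; lifecycle details omitted):
// AudioKitConfigurator.shared has already been created in the AppDelegate, as in the question.
let mic = Microphone()                        // attaches its silent booster to the shared mixer
// waveformPlot is an AKNodeOutputPlot, normally an @IBOutlet from the storyboard
waveformPlot.plotType = .rolling
try? mic.record(waveformPlot: waveformPlot)   // sets waveformPlot.node = mic and starts the recorder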

Trying to set playback speed on youtube-ios-player-helper but nothing happens

I am building an iOS app that uses the youtube-ios-player-helper library to play YouTube videos. I'm trying to set the playback speed, but nothing happens:
class PlayerViewController: UIViewController {
    private let player = YTPlayerView()

    override func viewDidLoad() {
        super.viewDidLoad()
        player.delegate = self
        view.addSubview(player)
        setupPlayerConstraints()
        player.load(withVideoId: "123456")
        // Tried here, but nothing change:
        print("Available playback rates: \(String(describing: player.availablePlaybackRates()))")
        player.setPlaybackRate(2.0)
    }
}

extension PlayerViewController: YTPlayerViewDelegate {
    func playerViewDidBecomeReady(_ playerView: YTPlayerView) {
        player.playVideo()
    }

    func playerView(_ playerView: YTPlayerView, didChangeTo state: YTPlayerState) {
        switch (state) {
        case .playing:
            // Tried here, but again, nothing change:
            print("Available playback rates: \(String(describing: player.availablePlaybackRates()))")
            player.setPlaybackRate(2.0)
        default:
            break;
        }
    }
}
As shown by the code above, I've tried setting the playback speed after loading the video and also when the player changes its state to playing. In neither case was the speed changed.
Also, player.availablePlaybackRates returns nil in both cases (which is strange, because when I watch the same video using the YouTube app, I can change the playback speed).
I know that setting the playback speed is only a suggestion to the player, but in the official YouTube app, changing the speed works for the same video that I'm trying to watch in my app.
Is there anything that I'm missing here?
I had the same problem, but I solved it like this and it works well (Swift 4).
Change the function from private to public:
public func stringFromEvaluatingJavaScript(jsToExecute: String) -> String {
    guard let result = self.webView.stringByEvaluatingJavaScript(from: jsToExecute) else {
        return ""
    }
    return result
}
Then execute the JavaScript directly:
playerView.stringFromEvaluatingJavaScript(jsToExecute: "player.setPlaybackRate(1.5);")

How to test background audio in iOS UI Tests

I want to test various aspects of background music and speech in iOS UI testing:
is background music playing?
did the music change on a new scene?
does the music mute on interruptions from a phone call, FaceTime, etc.?
is speech uttered when required?
I can't seem to find any API documentation or tutorials that show how to do these types of tests.
I have tried the following code in my UI test class just to test whether music is playing. The code attempts to either detect the audio session state (audioSession.isOtherAudioPlaying) or events related to playback (AVAudioSessionInterruption). Neither approach detects the music.
override func setUp() {
    super.setUp()
    continueAfterFailure = false
    XCUIApplication().launch()
    app = XCUIApplication();
    NotificationCenter.default.addObserver(self, selector: #selector(self.handleInterruption(_:)), name: NSNotification.Name.AVAudioSessionInterruption, object: nil)
}

@objc func handleInterruption(_ notification: Notification) {
    print("Received audio interrupt", notification);
}

override func tearDown() {
    super.tearDown()
}

func testIsBackgroundMusicPlaying() {
    let audioSession: AVAudioSession = AVAudioSession.sharedInstance();
    if (audioSession.isOtherAudioPlaying) {
        print ("Other audio is playing");
    }
}
Any suggestions?

Getting MPRemoteCommandCenter.shared() to work in tvOS

Ok, maybe I missed something here. I want to use the black remote with my app, and I got this code essentially from the WWDC 2017 talk on the issue. It says ...
Consistent and intuitive control of media playback is key to many apps on tvOS, and proper use and configuration of MPNowPlayingInfoCenter and MPRemoteCommandCenter are critical to delivering a great user experience. Dive deeper into these frameworks and learn how to ensure a seamless experience whether your app is being controlled using Siri, the Siri Remote, or the iOS Remote app.
So I added these lines to viewDidLoad of my tvOS app, and, well, they basically do nothing:
var commandCenter = MPRemoteCommandCenter.shared()

override func viewDidLoad() {
    super.viewDidLoad()
    commandCenter.playCommand.isEnabled = true
    commandCenter.pauseCommand.isEnabled = true
    commandCenter.playCommand.addTarget { (commandEvent) -> MPRemoteCommandHandlerStatus in
        print("You Pressed play")
        return .success
    }
    commandCenter.pauseCommand.addTarget { (commandEvent) -> MPRemoteCommandHandlerStatus in
        print("You Pressed pause")
        return .success
    }
}
I run the app and try the play/pause button on the black remote, and nothing is printed to the debugging console. I also added some code to the plist related to background mode... Should this work, or did I miss the point here somewhere?
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <string>external-accessory</string>
</array>
The commands in MPRemoteCommandCenter aren't triggered by the Siri Remote when your app is in the foreground. To get events from the remote while you're in the foreground, use UIGestureRecognizer like you're probably already used to (a small sketch follows the examples below).
These commands in MPRemoteCommandCenter are for other ways the system may want to interact with your playback, such as:
Your app is playing audio in the background, and the user presses the pause button on the remote: your app will be asked to pause playback.
The user is using the TV Remote app for iOS and is using that app's playback control screen.
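For the foreground case, a minimal sketch (assuming a plain UIViewController on tvOS, e.g. in viewDidLoad; the handler name is made up for illustration) could look like this:
// React to the Siri Remote's play/pause button while the app is in the foreground.
let tap = UITapGestureRecognizer(target: self, action: #selector(playPausePressed))
tap.allowedPressTypes = [NSNumber(value: UIPressType.playPause.rawValue)]
view.addGestureRecognizer(tap)

@objc func playPausePressed() {
    print("Play/pause pressed on the Siri Remote")
}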
I posted the question to Apple support, who pointed me in the right direction: you need to use the GCMicroGamepad controller from the related GameController framework. I then found a 2015 example posted by blauzahn, who most certainly deserves the credit for this post. Here is his code, slightly modified for Swift 3.0 / iOS 10.x:
import GameController
..
var gamePad: GCMicroGamepad? = nil

NotificationCenter.default.addObserver(self,
                                       selector: #selector(gameControllerDidConnect),
                                       name: NSNotification.Name.GCControllerDidConnect,
                                       object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(gameControllerDidDisconnect),
                                       name: NSNotification.Name.GCControllerDidDisconnect,
                                       object: nil)

func gameControllerDidConnect(notification: NSNotification) {
    if let controller = notification.object as? GCController {
        if let mGPad = controller.microGamepad {
            // Some setup
            gamePad = mGPad
            gamePad!.allowsRotation = true
            gamePad!.reportsAbsoluteDpadValues = true
            print("MicroGamePad connected...")

            // Add valueChangedHandler for each control element
            if gamePad?.buttonA.isPressed == true {
                print("button A pressed")
            }
            if gamePad?.buttonX.isPressed == true {
                print("button X pressed")
            }
            gamePad!.dpad.valueChangedHandler = { (dpad: GCControllerDirectionPad, xValue: Float, yValue: Float) -> Void in
                print("dpad xValue = \(xValue), yValue = \(yValue)")
            }
            gamePad!.buttonA.valueChangedHandler = { (buttonA: GCControllerButtonInput, value: Float, pressed: Bool) -> Void in
                print("\(buttonA)")
            }
            gamePad!.buttonX.valueChangedHandler = { (buttonX: GCControllerButtonInput, value: Float, pressed: Bool) -> Void in
                print("\(buttonX)")
            }
        }
    }
}

// Game controller disconnected
func gameControllerDidDisconnect(notification: NSNotification) {
    if let controller = notification.object as? GCController {
        if controller.microGamepad != nil {
            self.gamePad = nil
            print("MicroGamePad disconnected...")
        }
    }
}
