Sound recording click at the beginning - iOS

I'm getting a click sound right when recording starts in the app. I have no clue where it comes from; my guess is the tap on the button, but that seems far-fetched.
This is the code that I'm using:
private var recorderSettings: [String: Any] {
    [
        AVFormatIDKey: NSNumber(value: kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
}

audioRecorder = try AVAudioRecorder(url: fileUrl, settings: recorderSettings)
try audioSession.setCategory(.playAndRecord, mode: .default, options: [])
audioRecorder?.isMeteringEnabled = true
audioRecorder?.record()

meteringTimer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { _ in
    self.audioRecorder?.updateMeters()
    self.soundSamples[self.currentSample] = self.audioRecorder?.averagePower(forChannel: 0) ?? 0
    self.currentSample = (self.currentSample + 1) % self.numberOfSamples
}

A "click" is usually an indication of either an impulse in the audio values (e.g., a single 1 in a stream of 0s) or a significant discontinuity (e.g., jumping into or out of a full-volume signal from 0). It's probably not the actual sound of a key or mouse click! (Experiment: put a fixed delay between the tap and the recording onset, and see if the click remains audible.)
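For the delay experiment, AVAudioRecorder can schedule the onset directly; a minimal sketch against the question's recorder (the half-second delay is arbitrary):
if let recorder = audioRecorder {
    recorder.prepareToRecord()
    // Start half a second from now, measured on the recorder's device clock,
    // so a UI tap sound can't land at the head of the file.
    recorder.record(atTime: recorder.deviceCurrentTime + 0.5)
}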
If Apple doesn't have built-in functions for this common problem, perhaps you can start your recording by gradually fading up the volume at the onset. A fade-up over the course of 1/50th of a second might be sufficient to avoid audible signal discontinuities. I often use fades of 1/40th of a second or less on 44100 fps data as an alternative to jumping from volume A to volume B instantaneously in my Java audio coding. (In your case, volume A is 0 and volume B is your recording volume.)
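Since AVAudioRecorder doesn't expose a per-sample gain, one way to apply such a fade is a post-processing pass over the finished file. A minimal sketch, assuming the recording decodes to float PCM (the helper name and output path are mine):
func applyFadeIn(to fileUrl: URL, fadeDuration: Double = 0.02) throws {
    let file = try AVAudioFile(forReading: fileUrl)
    let format = file.processingFormat
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(file.length)) else { return }
    try file.read(into: buffer)

    // Ramp the first ~20 ms (1/50 s) linearly from silence to full gain.
    let fadeFrames = min(Int(format.sampleRate * fadeDuration), Int(buffer.frameLength))
    if let channels = buffer.floatChannelData {
        for ch in 0..<Int(format.channelCount) {
            for i in 0..<fadeFrames {
                channels[ch][i] *= Float(i) / Float(fadeFrames)
            }
        }
    }

    // Write the faded audio out to a sibling file.
    let outUrl = fileUrl.deletingPathExtension().appendingPathExtension("faded.caf")
    let outFile = try AVAudioFile(forWriting: outUrl, settings: format.settings)
    try outFile.write(from: buffer)
}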

Related

Audio from haptic engine only playing through speakers

I'm working on an app that uses CoreHaptics to play a synchronised pattern of vibrations and audio.
The problem is that the audio only plays through the iPhone's speakers (if the mute switch is not on). As soon as I connect my AirPods to the phone, the audio stops playing, but the haptics continue.
My code looks something like this:
let engine = CHHapticEngine()
...
var events = [CHHapticEvent]()
...
let volume: Float = 1
let decay: Float = 0.5
let sustained: Float = 0.5
let audioParameters = [
    CHHapticEventParameter(parameterID: .audioVolume, value: volume),
    CHHapticEventParameter(parameterID: .decayTime, value: decay),
    CHHapticEventParameter(parameterID: .sustained, value: sustained)
]
let breathingTimes = pacer.breathingTimeInSeconds
let combinedTimes = breathingTimes.inhale + breathingTimes.exhale
let audioEvent = CHHapticEvent(
    audioResourceID: selectedAudio,
    parameters: audioParameters,
    relativeTime: 0,
    duration: combinedTimes
)
events.append(audioEvent)
...
let pattern = try CHHapticPattern(events: events, parameterCurves: [])
let player = try engine.makeAdvancedPlayer(with: pattern)
...
try player.start(atTime: CHHapticTimeImmediate)
My idea of activating an audio session before the player starts, to signal to the system that audio is playing, also didn't change the outcome:
try AVAudioSession.sharedInstance().setActive(true)
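One variant that might be worth trying (an assumption on my part, not a confirmed fix): give the session an explicit playback category before activating it, since a bare setActive(true) leaves the default category in place:
let session = AVAudioSession.sharedInstance()
// With .playback, Bluetooth A2DP routing (e.g. to AirPods) is the default route;
// whether CoreHaptics' audio respects it is exactly what's in question here.
try session.setCategory(.playback, mode: .default)
try session.setActive(true)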
Is there a way to route the audio from CoreHaptics to an output other than the built-in speakers?

Sound volume decreasing for no apparent reason

I have an iOS app using SwiftUI. It handles a few sound files and performs some audio recording. This is the function doing the recording work:
func recordAudio(to file: String, for duration: TimeInterval) {
    let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord, mode: .default)
        try audioSession.setActive(true)
        let audioURL = getDocumentsDirectory().appendingPathComponent(file + ".m4a")
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 2,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        audioRecorder = try AVAudioRecorder(url: audioURL, settings: settings)
        audioRecorder.delegate = self
        // Use the passed-in duration rather than a hard-coded 2 seconds.
        audioRecorder.record(forDuration: duration)
    } catch let error as NSError {
        print("Failed -- Recording !!! -> \(error)")
    }
}
At this point, it basically works, but there is a strange behaviour that I neither understand nor like.
Here is the problem:
When I start the app and play a sound file, the volume is right for my taste.
Then without ever adjusting the volume I perform some recording (using the function above).
Finally after the recording is done, I go back to the file I played just before and play it again; the volume has mysteriously gone down, without me knowing why.
Is there something in my function that could explain that?
Or some other cause that someone could think of?
If I restart the app, the volume automatically goes back to normal.
For information, I am using iOS 14.4.2 and Xcode 12.4.
While the session stays in .playAndRecord, playback after a recording happens at a reduced volume. Once recording is done, explicitly set the category back to something like .playback to get the volume you're expecting.
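A sketch of that fix wired into the question's setup, using the AVAudioRecorderDelegate the code already assigns (the exact category/mode choice is the suggestion above, not verified against this app):
// Restore a playback-only category once the recording finishes,
// so subsequent playback happens at the expected volume.
func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Failed to restore playback category: \(error)")
    }
}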

AudioKit empty file using renderToFile with AKSequencer

I'm trying to use AudioKit.renderToFile() to export short MIDI passages to audio (m4a):
// renderSequencer is an instance of AKSequencer
self.renderSequencer.loadMIDIFile(fromURL: midiURL)
Conductor.sharedInstance.setInstrument(renderItem.soundID, forOfflineRender: true)

// we only have one track with note content
for track in self.renderSequencer.tracks {
    if track.isNotEmpty {
        track.setMIDIOutput(Conductor.sharedInstance.midiIn)
    }
}

let audioCacheDir = self.module.stateManager.audioCacheDirectory

// strip the name off the MIDI file
let midiFileName = String(midiURL.lastPathComponent.split(separator: ".")[0])
audioFileName = midiFileName
audioFileURL = audioCacheDir.appendingPathComponent("\(midiFileName).m4a")

if let audioFileURL = audioFileURL {
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    let audioFile: AVAudioFile = try! AVAudioFile(forWriting: audioFileURL, settings: settings)

    // get the length in seconds of the sequence (with a 4-beat tail)
    var duration: Float64 = 0.0
    MusicSequenceGetSecondsForBeats(seq, (16.0 + 4), &duration)

    // render the sequence offline
    do {
        try AudioKit.renderToFile(audioFile, duration: duration) {
            self.renderSequencer.setRate(60.0)
            self.renderSequencer.play()
        }
    } catch {
        print("Error performing offline file render!")
    }
}
This does produce an audio file of the expected duration, but it is silent. I've also tried logging from my MIDI output and can see that the events "played" from inside the preload closure are actually being sent/handled.
Mostly, I suppose, I'm curious to know whether this is actually expected to work. I've seen a couple of posts suggesting that renderToFile from MIDI is not supported (while others have suggested they have it working).
I did, btw, also post an issue on the AudioKit GitHub.

Playing scheduled audio in the background

I am having a really difficult time with playing audio in the background of my app. The app is a countdown timer that plays bells, and everything originally worked using a timer. Since you cannot run a timer for more than 3 minutes in the background, I need to play the bells another way.
The user has the ability to choose bells and set the time for these bells to play (e.g. play bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc).
So far I have tried scheduling with notifications via DispatchQueue.main, and this works fine if the user does not pause the timer. If they re-enter the app and pause, though, I cannot seem to cancel or pause this queue in any way.
Next I tried AVAudioEngine and created a set of nodes. These play while the app is in the foreground but seem to stop upon backgrounding. Additionally, when I pause the engine and resume later, it won't pause the sequence properly: it squishes the bells into playing one after the other, or not at all.
If anyone has any ideas on how to solve this, that would be great. Technically I could remove everything from the engine and recreate it from the paused time when the user pauses/resumes, but this seems quite costly, and it doesn't solve the problem of the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under Background Modes in Capabilities.
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several other times to create the chain of nodes (or is this done incorrectly?). The code is kinda messy at the moment because I have been trying many ways to get this to work.
func setupSounds() {
    if attached {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true

    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))

    var bell = ""
    do {
        try engine.start()
    } catch {
        return
    }

    if currentSession.bellObject?.startBell != nil {
        bell = (currentSession.bellObject?.startBell)!
        guard let url = Bundle.main.url(forResource: bell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        // AVAudioPCMBuffer's initializer is failable, so unwrap it.
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(capacity)) else {
            return
        }
        try file.read(into: buffer)

        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        // Note: this is an absolute sample time on the player's timeline, so it
        // only means "delay seconds from now" if the player's timeline starts at 0.
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)
        player.scheduleBuffer(buffer, at: futureTime, options: [], completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
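For what it's worth, one commonly cited prerequisite for keeping an AVAudioEngine rendering in the background, besides the 'audio' background mode, is an active playback audio session; a sketch (an assumption here, not a verified fix for the scheduling problem above):
// Configure and activate the session before starting the engine;
// without this, engine output is typically silenced on backgrounding.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default)
try session.setActive(true)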

I'm trying to use AVQueuePlayer to create a seamless audio loop, but there is a small silent pause between loops that I can't explain

I have a simple audio file in .wav format (cut perfectly to loop). I've tried different methods to loop it. My first attempt was simply using AVPlayer and NSNotification to detect when the audio item ended, then seeking to time zero and playing again. However, there was clearly a gap.
I've been looking at different solutions online and found people using AVQueuePlayer to switch between two items:
Looping AVPlayer seamlessly
However, when implemented, this still produces a gap.
Here's my current notification code:
weak var weakSelf = self
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: nil, queue: nil, usingBlock: { (note: NSNotification) -> Void in
    if weakSelf?.currentQueuePlayer.currentItem == weakSelf?.currentAudioItemOne {
        weakSelf?.currentQueuePlayer.insertItem((weakSelf?.currentAudioItemTwo)!, afterItem: nil)
        weakSelf?.currentAudioItemTwo.seekToTime(kCMTimeZero)
    } else {
        weakSelf?.currentQueuePlayer.insertItem((weakSelf?.currentAudioItemOne)!, afterItem: nil)
        weakSelf?.currentAudioItemOne.seekToTime(kCMTimeZero)
    }
})
Here's my code to set up the current QueuePlayer.
let audioPlayerItem = AVPlayerItem(URL: url)
currentAudioItemOne = audioPlayerItem
currentAudioItemTwo = audioPlayerItem
currentQueuePlayer = AVQueuePlayer()
currentQueuePlayer.insertItem(currentAudioItemOne, afterItem: nil)
currentQueuePlayer.play()
I've been working on this problem for several days now. Any leads or new things to try would be appreciated. The only thing I haven't tried so far is lower-quality audio files. These .wav files are all over 1 MB, which had me suspecting that the file size could be affecting the seamless looping.
EDIT:
Using AVPlayerLooper to create the 'Treadmill' effect:
let url = URL(fileURLWithPath: path)
let audioPlayerItem = AVPlayerItem(url: url)
currentAudioItemOne = audioPlayerItem
currentQueuePlayer = AVQueuePlayer()
currentAudioPlayerLayer = AVPlayerLayer(player: currentQueuePlayer)
currentAudioLooper = AVPlayerLooper(player: currentQueuePlayer, templateItem: currentAudioItemOne)
currentQueuePlayer.play()
EDIT 2:
afinfo on one of my wav files:
Num Tracks: 1
----
Data format: 2 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
no channel layout.
estimated duration: 11.302336 sec
audio bytes: 1993732
audio packets: 498433
bit rate: 1411200 bits per second
packet size upper bound: 4
maximum packet size: 4
audio data file offset: 44
not optimized
source bit depth: I16
----
You are inserting the next item too late in your current solution. You need to queue up more than one item initially, so there's always a primed AVPlayerItem ready to go.
This is the AVQueuePlayer "treadmill pattern", as described in more detail in this WWDC 2016 session. If you're targeting iOS 10, you can use the new AVPlayerLooper class, which does it for you (also described in the same link). Apple has also provided a sample project demonstrating both strategies.
Lower-level solutions include queueing audio buffers on an AVAudioEngine instance, using an AudioQueue, or mashing the buffers together yourself with an Audio Unit.
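As a sketch of that AVAudioEngine route (the file name is a placeholder): a single PCM buffer scheduled with the .loops option repeats gaplessly without any re-queueing:
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

let file = try AVAudioFile(forReading: Bundle.main.url(forResource: "loop", withExtension: "wav")!)
guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                    frameCapacity: AVAudioFrameCount(file.length)) else { fatalError() }
try file.read(into: buffer)

try engine.start()
// .loops repeats this buffer seamlessly until the player is stopped.
player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
player.play()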
