I'm trying to build an iOS application using Swift 4 that would play different sounds. Part of what I need to do is to limit the duration of the sound file played depending on some settings.
What I'd like to know is if it's possible to set the duration of the sound prior to playing it so it can stop after the set duration. If so, how?
I'm currently using AVFoundation's AVAudioPlayer, but I don't know whether it supports this. My current code is shown below:
// resName set depending on settings
url = Bundle.main.url(forResource: resName, withExtension: "mp3")
do {
    audioPlayer = try AVAudioPlayer(contentsOf: url!)
    audioPlayer.prepareToPlay()
    audioPlayer.currentTime = 0
} catch let error as NSError {
    print(error.debugDescription)
}
audioPlayer.play()
Thanks in advance for the help :)
Well, since you are in control of playing the sounds, one way to deal with this is to start a Timer when you begin playback and have it stop the player after the given time period:
let timeLimit = 1.6 // get it from settings

player.play()
Timer.scheduledTimer(withTimeInterval: timeLimit, repeats: false) { (timer) in
    player.stop()
}
And in case you need to cancel the scheduled stop, keep a reference to the timer and call timer.invalidate() on it.
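A minimal sketch of that cancellation pattern, continuing the snippet above (stopTimer is just an illustrative property name):

var stopTimer: Timer?

// Start playback and schedule the stop, keeping a reference to the timer
player.play()
stopTimer = Timer.scheduledTimer(withTimeInterval: timeLimit, repeats: false) { _ in
    player.stop()
}

// Later, if playback needs to end early for some other reason:
stopTimer?.invalidate()
stopTimer = nil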
As an audio engineer, I recommend editing your audio files to exactly the length you need; that avoids any unwanted artifacts that can come from cutting the sound off in code.
Related
I am creating and playing an AVAudioPlayer as follows:
playerOne = try AVAudioPlayer(contentsOf: URL.init(fileURLWithPath: path))
playerOne.numberOfLoops = -1
playerOne.prepareToPlay()
I am playing an AAC file. I am using
playerOne.play(atTime: startTime)
to schedule playback in the future and keep multiple AVAudioPlayers in sync.
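For reference, play(atTime:) expects a time on the players' shared deviceCurrentTime timeline, so startTime is derived roughly like this (a simplified sketch, not my exact code; playerTwo is just illustrative):

// Give all players the same future start time on the shared device clock
let startDelay = 0.5
let startTime = playerOne.deviceCurrentTime + startDelay
playerOne.play(atTime: startTime)
playerTwo.play(atTime: startTime)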
All works fine but my problem is when sounds loop they go out of sync, this is due to loops not being seamless.
I believe what happens is that, because of the AAC decoder, a small amount of extra silence is added to the decoded audio data, which makes the audio players drift out of sync. I expected the loop to be perfect, with zero gap when it wraps from the end back to the beginning.
How may I achieve seamless looping with AVAudioPlayer?
Using AVAudioEngine will give you a lot of flexibility, but it's overkill if all you need is to keep your tracks in sync.
In this case you can try using a single player with an AVComposition containing all your tracks, something like this:
func generateComposition(urls: [URL]) throws -> AVComposition {
    let composition = AVMutableComposition()
    let audioTracks = urls
        .map(AVAsset.init(url:))
        .flatMap { $0.tracks(withMediaType: .audio) }

    for audioTrack in audioTracks {
        guard let compositionTrack = composition.addMutableTrack(
            withMediaType: .audio,
            preferredTrackID: kCMPersistentTrackID_Invalid
        ) else { continue }

        try compositionTrack.insertTimeRange(
            audioTrack.timeRange,
            of: audioTrack,
            at: .zero
        )
    }

    return composition
}
And play it using AVPlayer:
AVPlayer(playerItem: AVPlayerItem(asset: composition))
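If you also need the combined composition to loop, one option to try (not covered above, just a suggestion) is AVPlayerLooper with an AVQueuePlayer, which exists specifically for looping a single player item. A sketch, using the composition returned by generateComposition(urls:):

let item = AVPlayerItem(asset: composition)
let queuePlayer = AVQueuePlayer()
// Keep a strong reference to the looper, otherwise looping stops
let looper = AVPlayerLooper(player: queuePlayer, templateItem: item)
queuePlayer.play()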
due to loops not being seamless.
I'm not sure what you mean by "not seamless", but if the problem is that you are experiencing latency, that's pretty much inherent to AVAudioPlayer. If the goal is to loop with minimal latency, use AVAudioEngine.
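A minimal sketch of a gapless loop with AVAudioEngine, assuming the whole file fits in memory (the names here are illustrative, not a drop-in solution):

import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

func loopSeamlessly(url: URL) throws {
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

    // Read the whole file into a single PCM buffer
    let file = try AVAudioFile(forReading: url)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else { return }
    try file.read(into: buffer)

    try engine.start()
    // .loops repeats the buffer back-to-back with no gap between iterations
    playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
    playerNode.play()
}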
This is a continuation of the discussion in here.
I'm building a voice recorder app for iOS in Swift, and I have a custom waveform graphic that I feed with data from an AKFFTTap object. I had a problem where the FFT started generating all zeros after a while. In order to diagnose and solve this, I'm trying to re-initialize all the nodes and taps whenever the user starts recording (assuming that would solve the issue). Previously, AudioKit was initialized and started when the view was loaded, and that was it.
So now I try to re-allocate everything on each recording, and it works, except that on every re-recording (not the first one, but the ones after it), the FFT doesn't work. This time it's consistent and reproducible.
So, here's what I'm doing, and if anyone can show me where I'm going wrong, I'll be very grateful:
When recording starts, I'm doing:
mic = AKMicrophone() //needs to be started
fft = AKFFTTap.init(mic) //will start when mic starts
//now, let's define a mixer, and add the mic node to it, and initialize the recorder to it
micMixer = AKMixer(mic)
recorder = try AKNodeRecorder(node: micMixer)
micBooster = AKBooster(micMixer, gain: 0)
AudioKit.output = micBooster
try AudioKit.start()
mic.start()
micBooster.start()
try recorder.record()
When recording stops:
//now go back deallocating stuff
recorder.stop()
micBooster.stop()
micMixer.stop()
mic.stop()
//now set player file to recorder file, since I want to play it later
do {
    if let file = recorder.audioFile {
        player = try AKAudioPlayer(file: file, looping: false, lazyBuffering: false, completionHandler: playingEnded)
        try AudioKit.stop()
    } else {
        //handle no file error
    }
} catch {
    //handle error
}
So, can anyone please help me figure out why the FFT doesn't work the second time around?
Thanks!
I am trying to repeat a song in my app. However, it just plays it until the end, then it stops altogether. How can I put in a loop feature for this?
This is my code in my viewDidLoad:
do {
    let audioPath = Bundle.main.path(forResource: "APP4", ofType: "mp3")
    player = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: audioPath!))
} catch {
    //catch error
}

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayback)
} catch {
    //catch error
}

player.play()
I'm using Xcode 8.
Use AVAudioPlayer's numberOfLoops property to get the repeat behavior.
From the Apple doc's Discussion section:
A value of 0, which is the default, means to play the sound once. Set a positive integer value to specify the number of times to return to the start and play again. For example, specifying a value of 1 results in a total of two plays of the sound. Set any negative integer value to loop the sound indefinitely until you call the stop() method.
So use:
player.numberOfLoops = n - 1 // here n (positive integer) denotes how many times you want to play the sound
Or, for an infinite loop, use:
player.numberOfLoops = -1
// But somewhere in your code, you need to stop this
To stop the playing:
player.stop()
You need to set the numberOfLoops property:
player.numberOfLoops = 2 // or whatever
From Apple doc:
var numberOfLoops: Int
The number of times a sound will return to the beginning, upon reaching the end, to repeat playback.
To repeat the song, you can set the numberOfLoops property to -1. It will then loop indefinitely.
player.numberOfLoops = -1
With SwiftySound, you can do it with a single line of code. All you have to do is pass -1 for the numberOfLoops parameter.
Sound.play(file: "dog", fileExtension: "wav", numberOfLoops: -1)
You can find SwiftySound on GitHub.
I am having a really difficult time with playing audio in the background of my app. The app is a timer that counts down and plays bells, and everything originally worked using the timer. Since you cannot run a timer for more than about 3 minutes in the background, I need to play the bells another way.
The user has the ability to choose bells and set the time for these bells to play (e.g. play bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc).
So far I have tried scheduling notifications via DispatchQueue.main, and this works fine if the user does not pause the timer. If they re-enter the app and pause, though, I cannot seem to cancel or pause that queued work in any way.
Next I tried using AVAudioEngine and created a set of nodes. These play while the app is in the foreground but seem to stop upon backgrounding. Additionally, when I pause the engine and resume later, it doesn't pause the sequence properly: it either squashes the bells into playing one right after the other or doesn't play them at all.
If anyone has any ideas of how to solve my issue, that would be great. Technically I could remove everything from the engine and recreate it from the paused time when the user pauses/resumes, but this seems quite costly, and it also doesn't solve the problem of the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under Background Modes in Capabilities.
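I'm not sure whether my audio session setup matters here; from what I've read, the background audio mode only helps if the session uses the playback category and is active, something like this (simplified):

let session = AVAudioSession.sharedInstance()
do {
    // Playback category is required for audio to keep running in the background
    try session.setCategory(AVAudioSessionCategoryPlayback)
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}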
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several more times to build up the chain of sounds (or is this done incorrectly?). The code is a bit messy at the moment because I have tried many different approaches to get this working.
func setupSounds() {
    if attached {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true

    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))

    var bell = ""
    do {
        try engine.start()
    } catch {
        return
    }

    if currentSession.bellObject?.startBell != nil {
        bell = (currentSession.bellObject?.startBell)!
        guard let url = Bundle.main.url(forResource: bell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(capacity)) else {
            return
        }
        try file.read(into: buffer)

        // Convert the delay in seconds into a sample time for the player node
        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)

        player.scheduleBuffer(buffer, at: futureTime, options: AVAudioPlayerNodeBufferOptions(rawValue: 0), completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
My app has a "click" sound feature. I used
import AVFoundation
and then the following function to play the "click" sound:
var audioPlayer = AVAudioPlayer()

func playSound() {
    let soundPath = NSBundle.mainBundle().pathForResource("tick", ofType: "wav")
    let soundURL = NSURL.fileURLWithPath(soundPath!)
    self.audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: nil)
    self.audioPlayer.play()
}
Now if the user is running a music player, my app causes the music player to stop. I read about the Audio Session Default Behavior in the documentation, but I don't know how to apply it.
Can you please help?
Thank you!
If you are wondering about the syntax for Swift 2, here it is:
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: .DuckOthers)
} catch {
    print("AVAudioSession cannot be set: \(error)")
}
Depending on how you want the app to behave, i.e. how your app's sound effects or music should interact with other apps' background audio sessions, you might need to tweak both the audio session category and the category options.
If you just want to play a short sound effect, like the "tick" sound, then AVAudioSessionCategoryAmbient with the DuckOthers option should be used, for example:
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryAmbient, withOptions: .DuckOthers)
} catch {
    print("AVAudioSession cannot be set: \(error)")
}
However, since I suppose you are actually just trying to play a short sound effect, the AudioServices API is a more suitable choice in this case. You can check func AudioServicesPlaySystemSound(inSystemSoundID: SystemSoundID) in the AudioToolbox framework for more details.
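A rough sketch of that approach in current Swift syntax (assuming a bundled tick.wav; the property and function names are just illustrative):

import AudioToolbox

var tickSoundID: SystemSoundID = 0

func playTick() {
    if tickSoundID == 0, let soundURL = Bundle.main.url(forResource: "tick", withExtension: "wav") {
        // Register the file once and reuse the sound ID for subsequent plays
        AudioServicesCreateSystemSoundID(soundURL as CFURL, &tickSoundID)
    }
    AudioServicesPlaySystemSound(tickSoundID)
}

System sounds play without touching your AVAudioSession category, which is usually what you want for short UI sounds.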
Another common scenario: if you want your app to play audio exclusively, even if other apps are playing music in the background, you need to set the category to AVAudioSessionCategorySoloAmbient, for example:
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategorySoloAmbient)
} catch {
    print("AVAudioSession cannot be set: \(error)")
}
I hope you've got what you're looking for.