How to implement a sender.selected if statement with a UISlider? - ios

I am trying to create a slider that changes the audio depending on the value of the slider. So, for example, if the slider value is 0, no audio plays; if the value is 1, 'song a' plays; if the value is 2, 'song b' plays, and so on.
I am using AVFoundation and have created my sound player:
var audioSoundPlayer: AVAudioPlayer = AVAudioPlayer()
I have added outlets to my slider:
@IBOutlet weak var audioSlider: UISlider!
@IBOutlet weak var audioSliderPlay: UISlider!
Defined the file location of my audio:
let audioLocation = NSBundle.mainBundle().pathForResource("song a", ofType: "mp3")
Created a do/catch block for my player:
do {
    audioSoundPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: audioLocation!))
} catch {
    print(error)
}
And created my action:
@IBAction func audioSlider(sender: UISlider) {
    if !sender.selected {
        audioSoundPlayer.play()
        audioSoundPlayer.numberOfLoops = -1
    }
}
Now, the slider currently starts 'song a' whenever any value is chosen, and once the audio has started it cannot be stopped. I need to read sender.value and create variables that map each value to one of the other songs I am going to add.
Help please?!
Edit regarding the issue with Amit's answer:
Amit, I have changed the maximum value of the slider to 3 for three audio files.
After adding the following to the IBAction, the slider only plays one audio file once it reaches the maximum value:
@IBAction func audioSlider(sender: UISlider) {
    let sliderValue = sender.value
    if sliderValue == 1 {
        soundPlayer1.play()
        soundPlayer1.numberOfLoops = -1
    } else if sliderValue == 2 {
        soundPlayer2.play()
        soundPlayer2.numberOfLoops = -1
    } else if sliderValue == 3 {
        soundPlayer3.play()
        soundPlayer3.numberOfLoops = -1
    } else if sliderValue == 0 {
        soundPlayer1.stop()
        soundPlayer2.stop()
        soundPlayer3.stop()
    }
}

Read your slider's value property as sender.value, typecast it to Int, and change the audio according to your needs.
You do need to set the minimum and maximum values of the slider before using it.
Set the default value to 0, the minimum to 0, and the maximum to the number of songs you have.
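For example, a minimal sketch of that idea (it assumes the three players soundPlayer1…soundPlayer3 from the edit above, and rounds the Float value so in-between positions still match a track):
@IBAction func audioSlider(sender: UISlider) {
    // sender.value is a Float; round it so in-between positions still match a track.
    let sliderValue = Int(round(sender.value))

    // Stop whatever is currently playing before (possibly) starting a new track.
    soundPlayer1.stop()
    soundPlayer2.stop()
    soundPlayer3.stop()

    switch sliderValue {
    case 1:
        soundPlayer1.numberOfLoops = -1
        soundPlayer1.play()
    case 2:
        soundPlayer2.numberOfLoops = -1
        soundPlayer2.play()
    case 3:
        soundPlayer3.numberOfLoops = -1
        soundPlayer3.play()
    default:
        break // value 0 (or anything unexpected): stay silent
    }
}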
You can use the "pattern-match" operator ~=:
let sliderValue = Int(slider.value)
if 100 ... 199 ~= sliderValue {
    print("play audio 1")
} else if 200 ... 299 ~= sliderValue {
    print("play audio 2")
}

1. Set the slider's maximum value to the number of audio files present in the bundle.
2. Create an array containing the file name or resource path of each audio file.
Like:
slider.maximumValue = 5
let audioPaths = ["1.mp3", "2.mp3", "3.mp3", "4.mp3", "5.mp3"]
@IBAction func audioSlider(sender: UISlider) {
    // Note: size the array (or clamp the index) so the slider's maximum value stays in bounds.
    let index = Int(sender.value)
    let audioPath = audioPaths[index]
    do {
        // Load the file selected by the slider instead of the fixed audioLocation.
        let path = NSBundle.mainBundle().pathForResource(audioPath, ofType: nil)
        audioSoundPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: path!))
        audioSoundPlayer.play()
    } catch {
        print(error)
    }
}

Related

Swift: Trying to control time in AVAudioPlayerNode using UISlider

I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound.
To get the current time of the player I'm doing this:
extension AVAudioPlayerNode {
var currentTime: TimeInterval {
get {
if let nodeTime: AVAudioTime = self.lastRenderTime, let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
return Double(playerTime.sampleTime) / playerTime.sampleRate
}
return 0
}
}
}
I have a slider that indicates the current time of the audio. When the user changes the slider value, on .ended event I have to change the current time of the player to that indicated in the slider.
To do so:
extension AVAudioPlayerNode {
func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
if let nodetime = self.lastRenderTime{
let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodetime)!
let sampleRate = self.outputFormat(forBus: 0).sampleRate
let newsampletime = AVAudioFramePosition(Int(sampleRate * Double(value)))
let length = duration - value
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
self.stop()
if framestoplay > 1000 {
self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil,completionHandler: nil)
}
}
self.play()
}
}
However, my seekTo function is not working correctly (printing currentTime before and after the call always shows a negative value of roughly -0.02). What am I doing wrong, and is there a simpler way to change the currentTime of the player?
I ran into the same issue. Apparently framestoplay was always 0, which happened because of the sample rate: the value of playerTime.sampleRate was always 0 in my case.
So,
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
must be replaced with
let framestoplay = AVAudioFrameCount(Float(sampleRate) * length)
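Putting that replacement into context, here is a sketch of the corrected extension (same parameters as in the question; only the source of the sample rate changes):
import AVFoundation

extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        if self.lastRenderTime != nil {
            // Use the output format's sample rate; playerTime.sampleRate can be 0 here.
            let sampleRate = self.outputFormat(forBus: 0).sampleRate
            let newSampleTime = AVAudioFramePosition(sampleRate * Double(value))
            let framesToPlay = AVAudioFrameCount(Float(sampleRate) * (duration - value))
            self.stop()
            if framesToPlay > 1000 {
                self.scheduleSegment(audioFile, startingFrame: newSampleTime,
                                     frameCount: framesToPlay, at: nil,
                                     completionHandler: nil)
            }
        }
        self.play()
    }
}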

Swift 4 The UISlider with AVPlayer doesn't slide

I am trying to control the audio with a UISlider. The print() statement prints the correct value and the app doesn't crash, but when I try to grab the thumb and move the slider, it doesn't slide; it only jumps a little when I try (like a tap).
When I comment out the 6th row, the slider responds correctly (but of course nothing happens).
var playerItem : AVPlayerItem?
var player : AVPlayer?
@IBAction func adjustSongProgress(_ sender: UISlider) {
player?.pause()
let floatTime = Float(CMTimeGetSeconds(player!.currentTime()))
sliderProgressOutlet.value = floatTime
print(floatTime)
player?.play()
}
Fixed it by deleting AVPlayer and changing AVPlayerItem to AVAudioPlayer, then putting the song URL into data:
//DOWNLOADS THE SONG IN THE URL AND PUTS IT IN DATA
var task: URLSessionTask? = nil
if let songUrl = songUrl {
task = URLSession.shared.dataTask(with: songUrl, completionHandler: { data, response, error in
// SELF.PLAYER IS STRONG PROPERTY
if let data = data {
self.playerItem = try? AVAudioPlayer(data: data)
self.playPause()
DispatchQueue.main.async {
self.sliderProgress()
}
}
})
task?.resume()
}
Then I changed the UISlider IBAction from Value Changed to Touch Down and Touch Up Inside when I connected it to the view controller:
// TOUCH DOWN
@IBAction func SliderTouchDown(_ sender: UISlider) {
    playerItem?.pause()
}
// TOUCH UP INSIDE
@IBAction func SliderTouchUpInside(_ sender: UISlider) {
    playerItem?.currentTime = TimeInterval(sliderProgressOutlet.value)
    playerItem?.play()
}
By default an iOS slider takes a value between 0 and 1, so if CMTimeGetSeconds returns a value outside 0 to 1 it will not be set properly.
Therefore you have to convert your time range to the slider's range.
For example: your video/audio length is 120 seconds and you want to move the slider to 30 seconds; you can calculate the new value using the formula below.
OldRange = OldMax - OldMin
NewRange = NewMax - NewMin
NewValue = (((OldValue - OldMin) * NewRange) / OldRange) + NewMin
oldRange = 120 - 0
newRange = 1 - 0
newValue = ((30 - 0) * 1 / 120) + 0 = 0.25
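As a sketch, the same mapping written as a Swift helper (the names are illustrative, not from the original post):
// Maps a value from one range to another, e.g. playback seconds -> the 0...1 slider range.
func mapToSliderValue(_ value: Float, oldMin: Float, oldMax: Float,
                      newMin: Float = 0, newMax: Float = 1) -> Float {
    let oldRange = oldMax - oldMin
    let newRange = newMax - newMin
    return ((value - oldMin) * newRange) / oldRange + newMin
}

// 30 seconds into a 120-second track -> 0.25
let sliderValue = mapToSliderValue(30, oldMin: 0, oldMax: 120)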

Updating a variable in a function driven by Timer

I have the function below, which works properly when a switch activates it. I want to add a variable that gives each message its current message number, starting from 1. I've tried different ways of setting and updating the value, but none of them helped.
I'm very new to Swift / iOS development, so I'm sure there is something I'm missing. What I do know is that the message prints to the console repeatedly until the switch is turned off, and the Timer is what makes it run continuously.
@IBOutlet weak var stateLabel: UILabel!
// Starts / stops recording of sensor data via a switch
@IBAction func stateChange(_ sender: UISwitch) {
if sender.isOn == true {
startSensorData()
stateLabel.text = "Stop"
} else {
stopSensorData()
stateLabel.text = "Start"
}
}
func startSensorData() {
print("Start Capturing Sensor Data")
// Making sure sensors are available
if self.motionManager.isAccelerometerAvailable, self.motionManager.isGyroAvailable {
// Setting the frequency required for data session
self.motionManager.accelerometerUpdateInterval = 1.0 / 3.0
self.motionManager.gyroUpdateInterval = 1.0 / 3.0
// Start sensor updates
self.motionManager.startAccelerometerUpdates()
self.motionManager.startGyroUpdates()
// Configure a timer to fetch the data.
self.motionUpdateTimer = Timer.scheduledTimer(withTimeInterval: 1.0/3.0, repeats: true, block: { (timer1) in
// Get the motion data.
var loggingSample = 1
if let accelData = self.motionManager.accelerometerData, let gyroData = self.motionManager.gyroData {
let accelX = accelData.acceleration.x
let accelY = accelData.acceleration.y
let accelZ = accelData.acceleration.z
let gyroX = gyroData.rotationRate.x
let gyroY = gyroData.rotationRate.y
let gyroZ = gyroData.rotationRate.z
let message = "\(Date().timeIntervalSince1970),\(self.device_id),\(loggingSample),\(accelX),\(accelY),\(accelZ),\(gyroX),\(gyroY),\(gyroZ),Processing"
print(message)
loggingSample += 1
}
})
}
}
You keep getting a value of 1 for loggingSample because you are using a local variable that is created with the value 1 each time the timer fires.
All you need to do is move the declaration of loggingSample outside the function so it becomes a class property.
Move the line:
var loggingSample = 1
outside the function so it is next to your outlets and other properties.
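A minimal sketch of the idea (the class and names here are illustrative, not the asker's full view controller):
import UIKit

class SensorViewController: UIViewController {
    // Moved out of the function: a property keeps its value between timer fires.
    var loggingSample = 1
    var motionUpdateTimer: Timer?

    func startSensorData() {
        motionUpdateTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 3.0,
                                                 repeats: true) { _ in
            // The property now increments on every fire instead of resetting to 1.
            print("sample \(self.loggingSample)")
            self.loggingSample += 1
        }
    }
}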

How to detect max dB Swift

I'm trying to detect dB on an iOS device; however, I am new to AVFoundation and can't really figure it out. I have come across this post: iOS - Detect Blow into Mic and convert the results! (swift), but it is not working for me.
My current code is this:
import Foundation
import UIKit
import AVFoundation
import CoreAudio
class ViewController: UIViewController {
var recorder: AVAudioRecorder!
var levelTimer = NSTimer()
var lowPassResults: Double = 0.0
override func viewDidLoad() {
super.viewDidLoad()
//make an AudioSession, set it to PlayAndRecord and make it active
var audioSession:AVAudioSession = AVAudioSession.sharedInstance()
audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: nil)
audioSession.setActive(true, error: nil)
//set up the URL for the audio file
var documents: AnyObject = NSSearchPathForDirectoriesInDomains( NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
var str = documents.stringByAppendingPathComponent("recordTest.caf")
var url = NSURL.fileURLWithPath(str as String)
// make a dictionary to hold the recording settings so we can instantiate our AVAudioRecorder
var recordSettings: [NSObject : AnyObject] = [AVFormatIDKey:kAudioFormatAppleIMA4,
AVSampleRateKey:44100.0,
AVNumberOfChannelsKey:2,AVEncoderBitRateKey:12800,
AVLinearPCMBitDepthKey:16,
AVEncoderAudioQualityKey:AVAudioQuality.Max.rawValue
]
//declare a variable to store the returned error if we have a problem instantiating our AVAudioRecorder
var error: NSError?
//Instantiate an AVAudioRecorder
recorder = AVAudioRecorder(URL:url, settings: recordSettings, error: &error)
//If there's an error, print otherwise, run prepareToRecord and meteringEnabled to turn on metering (must be run in that order)
if let e = error {
print(e.localizedDescription)
} else {
recorder.prepareToRecord()
recorder.meteringEnabled = true
//start recording
recorder.record()
//instantiate a timer to be called with whatever frequency we want to grab metering values
self.levelTimer = NSTimer.scheduledTimerWithTimeInterval(0.02, target: self, selector: #selector(ViewController.levelTimerCallback), userInfo: nil, repeats: true)
}
}
//This selector/function is called every time our timer (levelTime) fires
func levelTimerCallback() {
//we have to update meters before we can get the metering values
recorder.updateMeters()
//print to the console if we are beyond a threshold value. Here I've used -7
if recorder.averagePowerForChannel(0) > -7 {
print("Dis be da level I'm hearin' you in dat mic ")
print(recorder.averagePowerForChannel(0))
print("Do the thing I want, mofo")
}
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
}
I was building a movie-making app and learned something about how to meter the sound level in dB.
The raw value from recorder.averagePowerForChannel is not really the dB level of the sound; it is a level in the range -160 to 0 (dBFS), so we need some conversion to make this data more usable.
So I found something that converts this value into a normalized level.
(Sorry, I forgot where I found it!)
Here is the code:
/**
Format dBFS to dB
- author: RÅGE_Devil_Jåmeson
- date: (2016-07-13) 20:07:03
- parameter dBFSValue: raw value of averagePowerOfChannel
- returns: formatted value
*/
func dBFS_convertTo_dB (dBFSValue: Float) -> Float
{
var level:Float = 0.0
let peak_bottom:Float = -60.0 // dBFS -> -160..0 so it can be -80 or -60
if dBFSValue < peak_bottom
{
level = 0.0
}
else if dBFSValue >= 0.0
{
level = 1.0
}
else
{
let root:Float = 2.0
let minAmp:Float = powf(10.0, 0.05 * peak_bottom)
let inverseAmpRange:Float = 1.0 / (1.0 - minAmp)
let amp:Float = powf(10.0, 0.05 * dBFSValue)
let adjAmp:Float = (amp - minAmp) * inverseAmpRange
level = powf(adjAmp, 1.0 / root)
}
return level
}
I noticed that you are recording with 2 channels, so it will be a little different from my code; I hope this helps you out or gives you some ideas :D
LAST UPDATE
Changed the code to Swift.
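For example, you could feed the metered value through that helper inside the question's timer callback (a hedged sketch in the same Swift 2 style as the question; the 0.5 threshold is just an illustration):
func levelTimerCallback() {
    recorder.updateMeters()
    // averagePowerForChannel returns dBFS (-160...0); convert it to a 0...1 level.
    let level = dBFS_convertTo_dB(recorder.averagePowerForChannel(0))
    if level > 0.5 {
        print("Loud enough: \(level)")
    }
}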

iOS Adjust Pitch Whilst Playing via AVAudioUnitTimePitch

I’m trying to get some audio to be able to have the pitch adjusted whilst playing. I’m very new to Swift and iOS, but my initial attempt was to just change timePitchNode.pitch whilst it was playing; however, it wouldn’t update whilst playing. My current attempt is to reset audioEngine, and have it just resume from where it was playing (below). How do I determine where the audio currently is, and how do I get it to resume from there?
var audioFile: AVAudioFile?
var audioEngine: AVAudioEngine?
var audioPlayerNode: AVAudioPlayerNode?
var pitch: Int = 1 {
didSet {
playResumeAudio()
}
}
…
func playResumeAudio() {
var currentTime: AVAudioTime? = nil
if audioPlayerNode != nil {
let nodeTime = audioPlayerNode!.lastRenderTime!
currentTime = audioPlayerNode!.playerTimeForNodeTime(nodeTime)
}
if audioEngine != nil {
audioEngine!.stop()
audioEngine!.reset()
}
audioEngine = AVAudioEngine()
audioPlayerNode = AVAudioPlayerNode()
audioEngine!.attachNode(audioPlayerNode!)
let timePitchNode = AVAudioUnitTimePitch()
timePitchNode.pitch = Float(pitch * 100)
timePitchNode.rate = rate
audioEngine!.attachNode(timePitchNode)
audioEngine!.connect(audioPlayerNode!, to: timePitchNode, format: nil)
audioEngine!.connect(timePitchNode, to: audioEngine!.outputNode, format: nil)
audioPlayerNode!.scheduleFile(audioFile!, atTime: nil, completionHandler: nil)
let _ = try? audioEngine?.start()
audioPlayerNode!.playAtTime(currentTime)
}
I was being dumb apparently. You can modify the pitch during playback, and it does update. No need to reset any audio, just mutate the node as it’s playing, and it’ll work.
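For example (a minimal sketch; it assumes timePitchNode is kept as a property instead of being recreated inside playResumeAudio):
var pitch: Int = 1 {
    didSet {
        // Mutating the node while the engine is running takes effect immediately.
        timePitchNode.pitch = Float(pitch * 100)
    }
}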
