I've figured out how to trigger a sound when the button is pressed. I'm stuck on triggering a random sound when that same button is pressed. Since the audio player will take a string, I've generated a random number, but don't know how to insert that random number in the player. The code below is really broken so please bear with me there, I'm just stuck.
import UIKit
import AVFoundation
class ViewController: UIViewController {
    @IBOutlet weak var mainButton: UIButton!

    var sound1: AVAudioPlayer?
    var sound2: AVAudioPlayer?
    var sound3: AVAudioPlayer?
    var sound4: AVAudioPlayer?
    // There are many more sounds, but this is short for the example

    var audioPlayer: AVAudioPlayer = AVAudioPlayer()

    func setupAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer? {
        let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
        let url = NSURL.fileURLWithPath(path!)
        var audioPlayer: AVAudioPlayer?
        do {
            try audioPlayer = AVAudioPlayer(contentsOfURL: url)
        } catch {
            print("Player not available")
        }
        return audioPlayer
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        if let sound1 = self.setupAudioPlayerWithFile("sound1", type: "aif") { self.sound1 = sound1 }
        if let sound2 = self.setupAudioPlayerWithFile("sound2", type: "aif") { self.sound2 = sound2 }
        if let sound3 = self.setupAudioPlayerWithFile("sound3", type: "aif") { self.sound3 = sound3 }
        if let sound4 = self.setupAudioPlayerWithFile("sound4", type: "aif") { self.sound4 = sound4 }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func buttonPressed(sender: AnyObject) {
        let audioArray: NSArray = ["1, 2, 3, 4"]
        let range: UInt32 = UInt32(audioArray.count)
        number = Int(arc4random_uniform(range))
        sound(number)?.play()
    }
}
Here is my second attempt, where I keep the sound names as strings in an array and try to pick one at random:

class ViewController: UIViewController {

    @IBOutlet weak var mainButton: UIButton!

    // PUT SOUNDS AS STRINGS IN ARRAY
    var arrayOfSounds = ["sound1", "sound2", "sound3", "sound4"]

    var audioPlayer: AVAudioPlayer = AVAudioPlayer()

    func setupAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer? {
        let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
        let url = NSURL.fileURLWithPath(path!)
        var audioPlayer: AVAudioPlayer?
        do {
            try audioPlayer = AVAudioPlayer(contentsOfURL: url)
        } catch {
            print("Player not available")
        }
        return audioPlayer
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func buttonPressed(sender: AnyObject) {
        let range: UInt32 = UInt32(arrayOfSounds.count)
        let number = Int(arc4random_uniform(range))
        // FIND OUT WHICH SOUND HERE
        let sound = self.setupAudioPlayerWithFile(arrayOfSounds[number], type: "aif")
        sound?.play()
    }
}
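For reference, here is a minimal sketch of one way the random pick could be wired up, reusing the already-loaded sound1…sound4 players from the first version so the file is not re-read on every tap. The player names are just the ones used above; treat this as an illustration, not the definitive fix.

    @IBAction func buttonPressed(sender: AnyObject) {
        // Collect whichever of the preloaded players actually loaded.
        let players = [sound1, sound2, sound3, sound4].flatMap { $0 }
        guard !players.isEmpty else { return }
        // Pick a random index and play the matching player.
        let index = Int(arc4random_uniform(UInt32(players.count)))
        players[index].play()
    }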
I have always worked in Obj-C, but I want to get my head around Swift. I am nearly there with my code, but I just don't know how to loop my wav: it stops after playing once. I have found some instructions, but I haven't found the solution for my code yet. I hope someone knows the answer and can help me. So the question is: what do I have to do to make my wav loop when @IBAction func playButtonTapped is pressed? I will give all my code, just to be sure. Thanks in advance :-)
class ViewController: UIViewController {
    @IBOutlet var startAllButton: UIButton!

    var audioEngine = AVAudioEngine()
    var playerNode = AVAudioPlayerNode()
    let timeShift = AVAudioUnitTimePitch()
    let pedoMeter = CMPedometer()
    let bpm: Float = 110
    var avgStarted: Bool = false
    var steps: Int = 0
    var timer = Timer()
    var adjustedBpm: Float = 110
    var timerCount = 10 {
        didSet {
            if timerCount == 0 {
                stopCountingSteps()
            }
        }
    }
    var lastTap: Date? = nil

    @IBOutlet weak var tempoTap: UIButton!
    @IBOutlet weak var slider: UISlider!
    @IBOutlet weak var label: UILabel!
    @IBOutlet weak var playButton: UIButton!
    @IBOutlet weak var timerLabel: UILabel!
    @IBOutlet weak var stepCountLabel: UILabel!
    @IBOutlet weak var avgLabel: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        setup()
    }

    func setup() {
        label.text = "110"
        audioEngine.attach(playerNode)
        audioEngine.attach(timeShift)
        audioEngine.connect(playerNode, to: timeShift, format: nil)
        audioEngine.connect(timeShift, to: audioEngine.mainMixerNode, format: nil)
        audioEngine.prepare()
        timerLabel.text = ""
        stepCountLabel.text = ""
        do {
            try audioEngine.start()
        } catch {
            print("Could not start audio engine")
        }
    }

    @IBAction func sliderValueChanged(_ sender: UISlider) {
        label.text = String(sender.value)
        self.label.text = String(format: "%.f", sender.value)
        adjustedBpm = sender.value
        timeShift.rate = adjustedBpm/bpm
    }

    @IBAction func playButtonTapped(_ sender: Any) {
        let url = Bundle.main.url(forResource: "25loop110", withExtension: ".wav")
        if let url = url {
            do {
                let audioFile = try AVAudioFile(forReading: url)
                timeShift.rate = adjustedBpm/bpm
                playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
            } catch {
                print("could not load audio file")
            }
        } else {
            print("could not load audio file")
        }
        playerNode.play()
    }
The problem is these lines:
let audioFile = try AVAudioFile(forReading: url)
playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
Where’s the loop? Nowhere. That code plays the file once.
You cannot loop with a file in AVAudioEngine. You loop with a buffer. You read the file into a buffer and call scheduleBuffer(buffer, at: nil, options: .loops).
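A minimal sketch of that buffer-based version, reusing the playerNode, timeShift, adjustedBpm, and bpm from the question. Note that on recent SDKs the AVAudioPCMBuffer initializer is failable, hence the if let; treat this as an illustration of the approach, not a drop-in replacement.

    @IBAction func playButtonTapped(_ sender: Any) {
        guard let url = Bundle.main.url(forResource: "25loop110", withExtension: "wav") else {
            print("could not find audio file")
            return
        }
        do {
            let audioFile = try AVAudioFile(forReading: url)
            // Read the whole file into a PCM buffer so it can be looped.
            if let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                             frameCapacity: AVAudioFrameCount(audioFile.length)) {
                try audioFile.read(into: buffer)
                timeShift.rate = adjustedBpm/bpm
                // .loops keeps replaying the buffer until the node is stopped.
                playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
                playerNode.play()
            }
        } catch {
            print("could not load audio file")
        }
    }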
How can I get AVPlayer to work like AVAudioPlayer? What I mean is, I cannot seem to get duration to work at all. Here is what I need to convert to AVPlayer:
import UIKit
import AVKit
import AVFoundation
class ModalViewController: UIViewController {
    var audioPlayer = AVAudioPlayer()
    let varSend = VarSend.sharedInstance
    var timer: NSTimer!
    var toggleState = 2

    @IBOutlet weak var slider: UISlider!
    @IBOutlet weak var sermonImage: UIImageView!
    @IBOutlet weak var sermont: UILabel!
    @IBOutlet weak var sermond: UILabel!
    @IBOutlet weak var sermonE: UILabel!
    @IBOutlet weak var sermonL: UILabel!
    @IBOutlet weak var play: UIButton!
    @IBOutlet var layer: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let url = varSend.url
        print("Setting up.")
        do {
            let data1 = NSData(contentsOfURL: NSURL(string: url)!)
            audioPlayer = try AVAudioPlayer(data: data1!)
            audioPlayer.prepareToPlay()
            audioPlayer.volume = 1.0
            audioPlayer.play()
        } catch {
            print("Error getting the audio file")
        }
        slider.maximumValue = Float(audioPlayer.duration)
        timer = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector("updateSlider"), userInfo: nil, repeats: true)
        slider.setThumbImage(UIImage(named: "circle"), forState: .Normal)
        slider.setThumbImage(UIImage(named: "circle"), forState: .Highlighted)
        let title = varSend.sermonName
        self.sermont.text = title
        let date = varSend.sermonDate
        self.sermond.text = date
        let image = varSend.sermonPic
        ImageLoader.sharedLoader.imageForUrl(image, completionHandler: { (image: UIImage?, url: String) in
            self.sermonImage.image = image!
        })
    }

    @IBAction func ChangeAudioTime(sender: AnyObject) {
        audioPlayer.stop()
        audioPlayer.currentTime = NSTimeInterval(slider.value)
        audioPlayer.prepareToPlay()
        audioPlayer.volume = 1.0
        audioPlayer.play()
    }

    func updateSlider() {
        slider.value = Float(audioPlayer.currentTime)
        let currentTime = Int(audioPlayer.currentTime)
        let minutes = currentTime / 60
        let seconds = currentTime - minutes * 60
        let con = Int(audioPlayer.duration)
        let currentItem = con - currentTime
        let minutesU = Int(currentItem / 60)
        let secondsU = Int(currentItem % 60)
        sermonE.text = NSString(format: "%02d:%02d", minutes, seconds) as String
        let timeLeft = NSString(format: "%02d:%02d", minutesU, secondsU) as String
        sermonL.text = "-\(timeLeft)"
        if currentItem == 1 {
            //audioPlayer.pause()
            toggleState = 1
            print(toggleState)
            play.setImage(UIImage(named: "play.png"), forState: UIControlState.Normal)
        }
    }

    @IBAction func playPauseButton(sender: AnyObject) {
        let playBtn = sender as! UIButton
        if toggleState == 1 {
            audioPlayer.play()
            toggleState = 2
            playBtn.setImage(UIImage(named: "pause.png"), forState: UIControlState.Normal)
        } else {
            audioPlayer.pause()
            toggleState = 1
            playBtn.setImage(UIImage(named: "play.png"), forState: UIControlState.Normal)
        }
    }

    @IBAction func play(sender: AnyObject) {
        audioPlayer.play()
    }

    @IBAction func pause(sender: AnyObject) {
        audioPlayer.pause()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func close(sender: AnyObject) {
        self.dismissViewControllerAnimated(true, completion: nil)
        audioPlayer.stop()
    }
}
And here is what I have tried already:
import UIKit
import Alamofire
import SwiftyJSON
import AVKit
import AVFoundation
import CoreMedia
class test: UIViewController {
    var audioPlayer: AVPlayer!
    var playerItem: AVPlayerItem!
    var timer: NSTimer!

    @IBOutlet weak var sermonE: UILabel!
    @IBOutlet weak var sermonL: UILabel!
    @IBOutlet weak var slider: UISlider!

    override func viewDidLoad() {
        super.viewDidLoad()
        //let playerItem:AVPlayerItem!;
        let playerURL = "example.com"
        let streamingURL: NSURL = NSURL(string: playerURL)!
        audioPlayer = AVPlayer(URL: streamingURL)
        timer = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector("updateSlider"), userInfo: nil, repeats: true)
    }

    func updateSlider() {
        let item = audioPlayer?.currentItem
        let durationInSeconds = CMTimeGetSeconds(item!.duration)
        print(durationInSeconds)
    }

    @IBAction func ChangeAudioTime(sender: AnyObject) {
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func Play(sender: AnyObject) {
        audioPlayer.play()
    }
}
I have been searching for days and Apple's docs are very hard to make out.
I even tried
self.player.currentItem.asset.duration
from: How to get Duration from AVPlayer (Not AVAudioPlayer)?
Any help would be much appreciated.
Swift 2.0

From the documentation for loadedTimeRanges: "The array contains NSValue objects containing a CMTimeRange value indicating the time ranges for which the player item has media data readily available. The time ranges returned may be discontinuous."

So I call myplayer.getCurrentTrackDuration() every second and noticed that, when streaming, I get the correct final duration after 3-4 seconds.
extension AVPlayer {
    // Run this every second of streaming (or use KVO).
    // In an HTTP stream the reported duration keeps growing and usually settles
    // once roughly 7% of the total duration of the song has loaded.
    func getCurrentTrackDuration() -> Float64 {
        guard let currentItem = self.currentItem else { return 0.0 }
        guard currentItem.loadedTimeRanges.count > 0 else { return 0.0 }
        let timeInSecond = CMTimeGetSeconds((currentItem.loadedTimeRanges[0].CMTimeRangeValue).duration)
        return timeInSecond >= 0.0 ? timeInSecond : 0.0
    }
}
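For completeness, here is a hedged usage sketch for the test controller above (Swift 2 syntax, matching the rest of the question). The one-second timer and the refreshDuration method are illustrative additions, not part of the original code.

    override func viewDidLoad() {
        super.viewDidLoad()
        let playerURL = "example.com"
        audioPlayer = AVPlayer(URL: NSURL(string: playerURL)!)
        // Poll once per second; the reported value grows while the stream buffers.
        timer = NSTimer.scheduledTimerWithTimeInterval(1.0, target: self, selector: Selector("refreshDuration"), userInfo: nil, repeats: true)
    }

    func refreshDuration() {
        // getCurrentTrackDuration() comes from the AVPlayer extension above.
        let duration = audioPlayer.getCurrentTrackDuration()
        if duration > 0 {
            slider.maximumValue = Float(duration)
        }
    }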
I just opened my old project in Xcode 7 beta. The code works perfectly fine in Xcode 6, but now it's showing many errors. I don't know what they are. Can anybody explain why this happened, and how to fix it? Thank you! Here is the code:
import UIKit
import AVFoundation
class ViewController: UIViewController {
    var player: AVAudioPlayer = AVAudioPlayer()

    @IBOutlet weak var firstCardImageView: UIImageView!
    @IBOutlet weak var secondCardImageView: UIImageView!
    @IBOutlet weak var label: UILabel!

    var cardNamesArray: [String] = ["dice1", "dice2", "dice3", "dice4", "dice5", "dice6"]

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func rollaction(sender: AnyObject) {
        updateAction()
    }

    func updateAction() {
        var firstRandomNumber = Int(arc4random_uniform(5))
        var firstCardString: String = String(self.cardNamesArray[firstRandomNumber])
        var secondRandomNumber = Int(arc4random_uniform(5))
        var secondCardString: String = String(self.cardNamesArray[secondRandomNumber])
        self.firstCardImageView.image = UIImage(named: firstCardString)
        self.secondCardImageView.image = UIImage(named: secondCardString)
        var fileLocation = NSBundle.mainBundle().pathForResource("sound", ofType: ".mp3")
        var error: NSError? = nil
        player = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: fileLocation!), error: &error) // Error: Cannot find an initializer for type 'AVAudioPlayer' that accepts an argument list of type '(contentsOfURL: NSURL, error: inout NSError?)'
        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient, error: nil) // Error: Extra argument 'error' in call
        AVAudioSession.sharedInstance().setActive(true, error: nil) // Error: Extra argument 'error' in call
        player.play()
        let num = firstRandomNumber + secondRandomNumber + 2
        self.label.text = "The sum is \(num)"
    }

    override func motionEnded(motion: UIEventSubtype, withEvent event: UIEvent) {
        if event.subtype == UIEventSubtype.MotionShake { // Error: Method does not override any method from its superclass
            updateAction()
        }
    }
}
Xcode 7 ships Swift 2, which replaces the old NSError out-parameter APIs with throwing initializers and methods, so those calls no longer compile. Here's your updateAction() function with Swift 2.0's do/try/catch implementation:
func updateAction() {
    var firstRandomNumber = Int(arc4random_uniform(5))
    var firstCardString: String = String(self.cardNamesArray[firstRandomNumber])
    var secondRandomNumber = Int(arc4random_uniform(5))
    var secondCardString: String = String(self.cardNamesArray[secondRandomNumber])
    self.firstCardImageView.image = UIImage(named: firstCardString)
    self.secondCardImageView.image = UIImage(named: secondCardString)
    let fileLocation = NSBundle.mainBundle().pathForResource("sound", ofType: ".mp3")
    do {
        player = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: fileLocation!))
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
        try AVAudioSession.sharedInstance().setActive(true)
    }
    catch {
        print("Something bad happened. Try catching specific errors to narrow things down")
    }
    player.play()
    let num = firstRandomNumber + secondRandomNumber + 2
    self.label.text = "The sum is \(num)"
}
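One small design note: player.play() sits outside the do block, so it still runs even when setup fails. A hedged variant that only plays on success and surfaces the concrete error (the structure here is an assumption, not part of the original answer) could look like this:

    do {
        player = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: fileLocation!))
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
        try AVAudioSession.sharedInstance().setActive(true)
        player.play()  // only reached if everything above succeeded
    } catch let error as NSError {
        // Print the actual error instead of a generic message.
        print("Audio setup failed: \(error.localizedDescription)")
    }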
I have created a class that wraps an AVAudioPlayer (below) in order to play a sound:
class MusicAudio: NSObject, AVAudioPlayerDelegate {
    var bassAudio: AVAudioPlayer! = AVAudioPlayer()

    override init() {
        super.init()
        let bassAudioPath = NSBundle.mainBundle().pathForResource("Raw", ofType: "mp3")
        let fileURL = NSURL(fileURLWithPath: bassAudioPath!)
        bassAudio = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
        bassAudio.currentTime = 0
        bassAudio.volume = 1.0
        bassAudio.delegate = self
    }

    func adjustAudioLayerState(volumeOn: Bool, layer: AudioLayerState) {
        if layer == AudioLayerState.Bass {
            self.bassAudio.prepareToPlay()
            self.bassAudio.play()
        }
    }
}
To play the sound, I call it like this:

@IBAction func testSound() {
    var sound: MusicAudio = MusicAudio()
    sound.adjustAudioLayerState(true, layer: AudioLayerState.Bass)
}
However, I am not getting any audio played back. Can anyone spot a problem with my implementation?
Edit: after rdelmar's answer:
var bassAudio : AVAudioPlayer! = AVAudioPlayer()
class MusicAudio: NSObject, AVAudioPlayerDelegate {
    override init() {
        super.init()
        let bassAudioPath = NSBundle.mainBundle().pathForResource("Raw", ofType: "mp3")
        let fileURL = NSURL(fileURLWithPath: bassAudioPath!)
        bassAudio = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
        bassAudio.currentTime = 0
        bassAudio.volume = 1.0
        bassAudio.delegate = self
    }

    func adjustAudioLayerState(volumeOn: Bool, layer: AudioLayerState) {
        if layer == AudioLayerState.Bass {
            self.bassAudio.prepareToPlay()
            self.bassAudio.play()
        }
    }
}
You don't get any sound because your MusicAudio instance is being deallocated: sound is a local variable, so ARC releases it as soon as testSound() returns, before the audio has a chance to play. You need to make sound a property of the view controller, not a local variable.
class ViewController: UIViewController {
    var sound = MusicAudio()

    @IBAction func testSound() {
        self.sound.adjustAudioLayerState(true, layer: AudioLayerState.Bass)
    }
}