iOS Swift 2.0 - AVAudioPlayer is not playing any sound

Lately I have run into an issue while using the beta version of Xcode (7.0): I am not able to hear the sound that I play through this code.
(It is a ViewController from Main.storyboard, with a button connected to buttonTouchUpInside().)
import UIKit
import AVFoundation

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func setupAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer {
        let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
        let url = NSURL.fileURLWithPath(path!)
        var audioPlayer: AVAudioPlayer?
        do {
            try audioPlayer = AVAudioPlayer(contentsOfURL: url)
        } catch {
            print("NO AUDIO PLAYER")
        }
        return audioPlayer!
    }

    @IBAction func buttonTouchUpInside(sender: AnyObject) {
        let backMusic = setupAudioPlayerWithFile("sound", type: "wav")
        backMusic.play()
    }
}

You just have to move the declaration of backMusic out of your IBAction: as a local variable it is the only strong reference to the player, so it gets deallocated as soon as buttonTouchUpInside() returns and the sound never plays. Keep it in a property instead.
Try it like this:
class ViewController: UIViewController {
    var backMusic: AVAudioPlayer!
    // ...
    @IBAction func buttonTouchUpInside(sender: AnyObject) {
        backMusic = setupAudioPlayerWithFile("sound", type: "wav")
        backMusic.play()
    }
}
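As a side note, setupAudioPlayerWithFile force-unwraps both the path and the player, so a missing or unreadable file will crash at runtime. A slightly safer variant (untested sketch, same resource names assumed, with backMusic declared as an optional) returns nil instead:

func setupAudioPlayerWithFile(file: String, type: String) -> AVAudioPlayer? {
    // Return nil instead of crashing when the resource or the player can't be created
    guard let path = NSBundle.mainBundle().pathForResource(file, ofType: type) else {
        print("Missing resource: \(file).\(type)")
        return nil
    }
    do {
        return try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: path))
    } catch {
        print("Could not create player: \(error)")
        return nil
    }
}

@IBAction func buttonTouchUpInside(sender: AnyObject) {
    backMusic = setupAudioPlayerWithFile("sound", type: "wav")
    backMusic?.play()
}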

Related

Cannot Play "m4a" in swift 2.

I am new to Swift. I am trying to play an audio file using AVFoundation in Swift 2. This code runs for the "mp3" file format, but Xcode crashes for "m4a". What is my mistake? I am using Xcode 7.3.1.
class ViewController: UIViewController {
    var myAudioPlayer = AVAudioPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let myFilePathString = NSBundle.mainBundle().pathForResource("Par30", ofType: "m4a")
        if let myFilePathString = myFilePathString {
            let myFilePathURL = NSURL(fileURLWithPath: myFilePathString)
            do {
                try myAudioPlayer = AVAudioPlayer(contentsOfURL: myFilePathURL)
                //myAudioPlayer.play()
            } catch {
                print("error")
            }
        }
    }

    @IBAction func Play(sender: AnyObject) {
        myAudioPlayer.play()
    }

    @IBAction func Stop(sender: AnyObject) {
        myAudioPlayer.stop()
        myAudioPlayer.currentTime = 0
    }
}
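No accepted fix is shown here, but a quick way to narrow the crash down is to log the resolved path and the actual error that AVAudioPlayer throws (untested sketch; it assumes the file is really named Par30.m4a and belongs to the app target):

override func viewDidLoad() {
    super.viewDidLoad()
    // If this prints nil, the file never made it into the bundle
    // (check Target Membership / Copy Bundle Resources for Par30.m4a)
    let myFilePathString = NSBundle.mainBundle().pathForResource("Par30", ofType: "m4a")
    print("Resolved path: \(myFilePathString)")
    if let myFilePathString = myFilePathString {
        do {
            try myAudioPlayer = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: myFilePathString))
        } catch {
            // The concrete error is far more useful than a generic "error" message
            print("AVAudioPlayer failed: \(error)")
        }
    }
}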

How to pass a button title to a multimedia filename in Swift?

I am trying to pass a button's label into the filename of my multimedia, but unfortunately it's not working.
The idea is that when I press the button named "cat", it plays the file named "cat", "mp3".
If I press the button labeled "cow", it plays the sound with the filename "cow".
I have already tried different variants but I can't make it work. If you have any ideas, please help.
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var audioPlayer: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    func playAudio() {
        do {
            self.audioPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("buttonName", ofType: "mp3")!))
            self.audioPlayer.play()
        } catch {
            print("Error")
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playSound(sender: AnyObject) {
        let buttonName = sender.currentTitle!
        playAudio()
    }
}
I'm no expert in Swift, but I don't think even LLVM can fix these for you:
You initialized buttonName as a local variable and never passed it to playAudio() (which does not accept parameters anyway), so the function cannot see the button's title at all.
On top of that, you pass the string literal "buttonName" to pathForResource, not the variable. There is no file called buttonName.mp3 in your bundle, so nothing will happen (it will always print("Error")).
Something like this should work (not tested, but it should be similar):
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var audioPlayer: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    func playAudio(buttonName: String!) {
        do {
            self.audioPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(buttonName, ofType: "mp3")!))
            self.audioPlayer.play()
        } catch {
            print("Error")
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playSound(sender: AnyObject) {
        let buttonName = sender.currentTitle!
        playAudio(buttonName)
    }
}
The pathForResource(_:ofType:) method of NSBundle does not take a selector, but a String. Hence, in your case, you will always try to open "buttonName.mp3" rather than the contents of some property called buttonName. Moreover, buttonName in your function playSound(...) lives only locally in the scope of that function, and since you do not pass its value to playAudio(), the latter does not know of it. Finally, the sender of the @IBAction is a UIButton, so you're better off actually using the type UIButton rather than AnyObject.
import UIKit
import AVFoundation

class ViewController: UIViewController {
    var audioPlayer: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    func playAudio(buttonName: String) {
        do {
            self.audioPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(buttonName, ofType: "mp3") ?? ""))
            self.audioPlayer.play()
        } catch {
            print("Error")
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playSound(sender: UIButton) {
        if let buttonName = sender.currentTitle {
            playAudio(buttonName)
        }
    }
}
Notice that you should avoid using forced unwrapping of optionals (!) unless you know specifically that they will not be nil; I've used optional binding instead in the example above (if let ... in playSound(...)).
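In the same spirit, the ?? "" fallback in playAudio still pushes a missing file into the throwing initializer; a variant (untested sketch) that binds the path as well keeps the failure explicit:

func playAudio(buttonName: String) {
    // Bail out early if the resource isn't in the bundle at all
    guard let path = NSBundle.mainBundle().pathForResource(buttonName, ofType: "mp3") else {
        print("No file named \(buttonName).mp3 in the bundle")
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath: path))
        audioPlayer.play()
    } catch {
        print("Could not play \(buttonName): \(error)")
    }
}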

Thread 1: EXC_BAD_INSTRUCTION in Swift

I am getting the error Thread 1: EXC_BAD_INSTRUCTION. I have been trying very hard to fix this. Can anyone help?
import UIKit
import AVFoundation

class PlaySoundsViewController: UIViewController {
    var filePathUrl: NSURL!
    var audioPlayer: AVAudioPlayer?
    var receivedAudio: RecordedAudio!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        audioPlayer = try! AVAudioPlayer(contentsOfURL: receivedAudio.filePathUrl)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func playFastAudio(sender: UIButton) {
        audioPlayer!.stop()
        audioPlayer!.rate = 1.5
        audioPlayer!.currentTime = 0.0
        audioPlayer!.play()
    }

    @IBAction func playSlowAudio(sender: UIButton) {
        // play audio slow
        audioPlayer!.stop()
        audioPlayer!.rate = 0.5
        audioPlayer!.currentTime = 0.0
        audioPlayer!.play()
    }

    @IBAction func StopButtonSound(sender: UIButton) {
        audioPlayer!.stop()
    }
}
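EXC_BAD_INSTRUCTION on that viewDidLoad line is what try! produces when the initializer throws (for example, when the recorded file is missing), and the same crash happens if receivedAudio was never assigned before this controller loads. A defensive rewrite (untested sketch) surfaces the failure instead of trapping:

override func viewDidLoad() {
    super.viewDidLoad()
    // receivedAudio is implicitly unwrapped; fail with a readable message if it was never set
    guard let receivedAudio = receivedAudio else {
        print("receivedAudio was never assigned before this controller loaded")
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(contentsOfURL: receivedAudio.filePathUrl)
        audioPlayer?.enableRate = true // required, otherwise setting rate in the play actions has no effect
    } catch {
        print("Could not create AVAudioPlayer: \(error)")
    }
}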

Error when trying to play video in Xcode 7 / Swift 2

I am trying to play a video inside a view controller when I enter it, but I get one error, listed below. I am not too familiar with video playback, so is it supposed to be in a view controller, and how do I fix that error? The video is stored in Xcode. Thanks.
import UIKit
import MediaPlayer
import AVFoundation

class AuroraViewController: UIViewController {
    @IBOutlet var AuroraViewController: UIView!
    var moviePlayer: AVPlayer?

    private func playVideo() {
        if let path = NSBundle.mainBundle().pathForResource("Aurora", ofType: "mp4") {
            let url = NSURL(fileURLWithPath: path)
The line below is where I get the error "Type of expression is ambiguous without more context":
            moviePlayer = AVPlayer(contentURL: url) {
                self.moviePlayer = moviePlayer
                moviePlayer.AuroraViewController.frame = self.AuroraViewController.bounds
                moviePlayer.prepareToPlay()
                moviePlayer.scalingMode = .AspectFill
                self.AuroraViewController.addSubview(moviePlayer.view)
            }
        } else {
            debugPrint("Ops, something wrong when playing video.m4v")
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        playVideo()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
This was a 'total re-write'.
The following did the job:
Add "AVKit.framework" to the "Build Phases" -> "Link Binary With Libraries" in Project settings.
In Storyboard, change the identity of your UIViewController to be AuroraViewController.
The resulting code then is all that's needed:
import UIKit
import AVFoundation
import AVKit

class AuroraViewController: AVPlayerViewController {
    private func playVideo() {
        if let path = NSBundle.mainBundle().pathForResource("Aurora", ofType: "mp4") {
            let url = NSURL(fileURLWithPath: path)
            player = AVPlayer(URL: url)
        } else {
            print("Oops, could not find resource Aurora.mp4")
        }
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        playVideo()
    }
}
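If you'd rather keep AuroraViewController as a plain UIViewController instead of changing its class in the storyboard, an alternative (untested sketch, same Swift 2 APIs) is to present an AVPlayerViewController on top of it:

import UIKit
import AVFoundation
import AVKit

class AuroraViewController: UIViewController {
    private func playVideo() {
        guard let path = NSBundle.mainBundle().pathForResource("Aurora", ofType: "mp4") else {
            print("Oops, could not find resource Aurora.mp4")
            return
        }
        // Present a system player controller and start playback once it is on screen
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(URL: NSURL(fileURLWithPath: path))
        presentViewController(playerVC, animated: true) {
            playerVC.player?.play()
        }
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        playVideo()
    }
}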

Remote Control event in iOS with Swift

Trying to figure out how to read the Apple headphones' volume buttons to use as a trigger for the camera shutter (as the Apple Camera app does).
From the documentation on Remote Control Events, Remote Control Received With Event, and this git repo, I've pieced together that I'll probably need an AVAudioPlayer object, .beginReceivingRemoteControlEvents(), and remoteControlReceivedWithEvent, along with making this view's canBecomeFirstResponder() return true.
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var player: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        var session: AVAudioSession = AVAudioSession.sharedInstance()
        session.setActive(true, error: nil)
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        println("viewDidAppear worked...")
        self.becomeFirstResponder()
        UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    }

    override func canBecomeFirstResponder() -> Bool {
        return true
    }

    override func remoteControlReceivedWithEvent(event: UIEvent) {
        let rc = event.subtype
        println("does this work? \(rc.rawValue)")
        //takePicture()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
I expected to see "does this work" when hitting the volume buttons on the headphones; instead, they just adjust the headphone volume like normal. So I must be missing something, maybe with a delegate or the audio session?
I cross-posted this on r/swift, where I was told it probably requires playing audio (quoted straight from the documentation).
So while this isn't the ideal solution, it works for my own private use.
import UIKit
import AVFoundation
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var testPlayer: AVAudioPlayer? = nil

    func loadSound(filename: NSString) -> AVAudioPlayer {
        let url = NSBundle.mainBundle().URLForResource(filename as String, withExtension: "caf")
        var error: NSError? = nil
        let player = AVAudioPlayer(contentsOfURL: url, error: &error)
        if error != nil {
            println("Error loading \(url): \(error?.localizedDescription)")
        } else {
            player.prepareToPlay()
        }
        return player
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.testPlayer = self.loadSound("silence")
        self.testPlayer?.numberOfLoops = -1
        self.testPlayer?.play()
    }

    override func canBecomeFirstResponder() -> Bool {
        return true
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        self.becomeFirstResponder()
        UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    }

    override func remoteControlReceivedWithEvent(event: UIEvent) {
        let rc = event.subtype
        println("rc.rawValue: \(rc.rawValue)")
        // take photo
    }
}
I noticed that in Apple's camera app, the +/- volume buttons trigger the camera, and the microphone button pauses/plays any audio running in another app, but in this implementation the volume buttons still control the volume (and any audio has been paused when the app is launched).
An rc.rawValue of 103 corresponds to a single click of the microphone button, a double click returns 104, a triple click returns 105, and sometimes bumping a couple at a time returns 108 or 109.
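Those raw values line up with the UIEventSubtype remote-control cases (103 is RemoteControlTogglePlayPause, 104 RemoteControlNextTrack, 105 RemoteControlPreviousTrack), so instead of matching magic numbers you can switch on the subtype directly, for example:

override func remoteControlReceivedWithEvent(event: UIEvent) {
    switch event.subtype {
    case .RemoteControlTogglePlayPause:   // rawValue 103 – single click on the mic button
        println("play/pause")             // take photo here
    case .RemoteControlNextTrack:         // rawValue 104 – double click
        println("next track")
    case .RemoteControlPreviousTrack:     // rawValue 105 – triple click
        println("previous track")
    default:
        println("other subtype: \(event.subtype.rawValue)")
    }
}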
Based on Cody's answer but updated for 2019 (Swift 5)
import UIKit
import AVFoundation
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var myPlayer: AVAudioPlayer? = nil

    // Needed so becomeFirstResponder() in viewDidAppear actually succeeds
    override var canBecomeFirstResponder: Bool {
        return true
    }

    func loadSound(filename: NSString) -> AVAudioPlayer? {
        let url = Bundle.main.url(forResource: filename as String, withExtension: "mp3")
        do {
            let player = try AVAudioPlayer(contentsOf: url ?? URL(fileURLWithPath: ""))
            player.prepareToPlay()
            return player
        } catch {
            print("Error : \(error)")
            return nil
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let testPlayer = loadSound(filename: "silence") else {
            print("Not able to load the sound")
            return
        }
        testPlayer.delegate = self
        testPlayer.volume = 0.8
        testPlayer.numberOfLoops = -1
        myPlayer = testPlayer
        myPlayer?.play()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.becomeFirstResponder()
        UIApplication.shared.beginReceivingRemoteControlEvents()
    }

    override func remoteControlReceived(with event: UIEvent?) {
        let rc = event?.subtype
        print("rc.rawValue: \(rc?.rawValue)")
        // Do your thing
    }
}
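As the documentation quote above suggests, remote-control events may only arrive while your app is actually playing audio; it can also help to put the audio session into the playback category before starting the looped silence (sketch, Swift 5):

// e.g. at the top of viewDidLoad(), before myPlayer?.play()
do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session error: \(error)")
}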
