AVAudioUnitSampler loading delay issue in iOS 11 but not iOS 10

Running on iOS 10, I am able to load a sample from a file into an AVAudioUnitSampler on demand and it plays correctly.
However, when I run the same code on iOS 11, the audio doesn't play. In order to get it to work I have to preload the audio.
Has something changed in iOS 11 that causes this to no longer work?
The app is designed to play from a selection of about 100 samples, some of them longer than others, so I can't load them all into memory in advance and thus need to load on demand.
I've tried this on iOS 11, 11.0.1 and 11.0.2.
Any suggestions?
This is my initial version (works on iOS 10 but not iOS 11):
var audioEngine: AVAudioEngine!
var mixer: AVAudioMixerNode!
var sampler: AVAudioUnitSampler!

override func viewDidLoad() {
    super.viewDidLoad()
    audioEngine = AVAudioEngine()
    mixer = audioEngine.mainMixerNode
    mixer.volume = 1.0
    do {
        try audioEngine.start()
    } catch {
        print(error)
    }
    sampler = AVAudioUnitSampler()
    audioEngine.attach(sampler)
    audioEngine.connect(sampler, to: mixer, format: nil)
}
@IBAction func audio1Tapped(_ sender: Any) {
    if let audioPath = Bundle.main.url(forResource: "a73", withExtension: "wav") {
        do {
            try sampler.loadAudioFiles(at: [audioPath])
            sampler.startNote(60, withVelocity: 127, onChannel: 0)
        } catch {
            print(error.localizedDescription)
        }
    } else {
        print("Failed to find audio file")
    }
}
The following works on iOS 11:
var audioEngine: AVAudioEngine!
var mixer: AVAudioMixerNode!
var sampler: AVAudioUnitSampler!

override func viewDidLoad() {
    super.viewDidLoad()
    audioEngine = AVAudioEngine()
    mixer = audioEngine.mainMixerNode
    mixer.volume = 1.0
    do {
        try audioEngine.start()
    } catch {
        print(error)
    }
    sampler = AVAudioUnitSampler()
    audioEngine.attach(sampler)
    audioEngine.connect(sampler, to: mixer, format: nil)
    if let audioPath = Bundle.main.url(forResource: "a73", withExtension: "wav") {
        do {
            try sampler.loadAudioFiles(at: [audioPath])
        } catch {
            print(error.localizedDescription)
        }
    } else {
        print("Failed to find audio file")
    }
}

@IBAction func audio1Tapped(_ sender: Any) {
    sampler.startNote(60, withVelocity: 127, onChannel: 0)
}
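One workaround I can suggest (a sketch of my own, not something confirmed by Apple documentation): keep the on-demand design, but perform the load on a background queue and trigger the note only once the load completes, so only the samples a user actually requests are ever held in memory. The playSample(named:) helper below is hypothetical and assumes the sampler and engine set up in viewDidLoad above.
func playSample(named name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "wav") else {
        print("Failed to find audio file")
        return
    }
    // Load off the main thread; on iOS 11 the load apparently must finish
    // before startNote will produce sound.
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try self.sampler.loadAudioFiles(at: [url])
            DispatchQueue.main.async {
                self.sampler.startNote(60, withVelocity: 127, onChannel: 0)
            }
        } catch {
            print(error.localizedDescription)
        }
    }
}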

Related

ShazamKit music detection: "session.match(signature)" finds no match and throws no error

Today I created a small app to try ShazamKit's music detection ability in iOS 15, following a tutorial on YouTube. I have an Apple developer membership and have enabled the ShazamKit service for this app identifier.
In short, I want to detect a song's metadata with ShazamKit from an audio file inside the app.
The problem is that neither of the delegate methods didFind nor didNotFindMatchFor fires, even though I have generated the signature successfully. I would expect at least an error in the didNotFindMatchFor delegate method if no match is found, but none arrives.
It's a pretty new feature and there is not much related material I could find. I'd appreciate any help.
More info: I did find some examples using audioEngine, but those use output from the microphone; if the user listens to music with headphones, that would not work. In my case I want to use the file itself, since my production app is a music player that stores a lot of audio files in its sandbox.
import AVFAudio
import ShazamKit
import UIKit

class ViewController: UIViewController {

    lazy var recoButton: UIButton = {
        let button = UIButton(frame: CGRect(x: 0, y: 0, width: 120, height: 60))
        button.layer.cornerRadius = 8
        button.backgroundColor = .brown
        button.setTitle("Recognize", for: .normal)
        button.addTarget(self, action: #selector(recognizeSong), for: .touchUpInside)
        return button
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(recoButton)
        recoButton.center = view.center
    }

    @objc func recognizeSong(_ sender: UIButton) {
        print("reco button tapped")
        // ShazamKit is available from iOS 15
        if #available(iOS 15, *) {
            // session
            let session = SHSession()
            // delegate
            session.delegate = self
            do {
                // get track
                guard let url = Bundle.main.url(forResource: "Baby One More Time", withExtension: "mp3") else {
                    print("url is NULLLL")
                    return
                }
                // create audio file
                let file = try AVAudioFile(forReading: url)
                let frameCapacity = AVAudioFrameCount(file.length / 26)
                // Audio -> Buffer
                guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: frameCapacity) else {
                    print("Failed to create a buffer")
                    return
                }
                // Read file into buffer
                try file.read(into: buffer)
                // SignatureGenerator
                let generator = SHSignatureGenerator()
                try generator.append(buffer, at: nil)
                // create signature
                let signature = generator.signature()
                // try to match
                session.match(signature)
            } catch {
                print(error)
            }
        } else {
            // unavailable alert
        }
    }
}

extension ViewController: SHSessionDelegate {

    func session(_ session: SHSession, didFind match: SHMatch) {
        print("Match found!")
        // get results
        let items = match.mediaItems
        items.forEach { item in
            print(item.title ?? "title")
            print(item.artist ?? "artist")
            print(item.artworkURL?.absoluteURL ?? "artwork url")
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        if let error = error {
            print(error)
        }
    }
}
extension ViewController: SHSessionDelegate {
func session(_ session: SHSession, didFind match: SHMatch) {
print("Match found!")
// get results
let items = match.mediaItems
items.forEach { item in
print(item.title ?? "title")
print(item.artist ?? "artist")
print(item.artworkURL?.absoluteURL ?? "artwork url")
}
}
func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
if let error = error {
print(error)
}
}
}
Per today's tests and observation, I found that we need to convert the input audio to AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1) with a built-in converter (AVAudioConverter), then create the output buffer; the music is recognized this time.
I picked 10+ music files for a test run, and all of them could be detected except one. The interesting thing is that this one file could be detected by the Shazam app; I have no idea what the reason is, as no error is shown for the undetected song.
Anyway, it now works. The updated code is below. It is just a combination of several functions for testing purposes; you should separate them into different functions for production.
@objc func recognizeSong(_ sender: UIButton) {
    print("reco button tapped")
    // ShazamKit is available from iOS 15
    if #available(iOS 15, *) {
        // session
        let session = SHSession()
        session.delegate = self
        guard let url = Bundle.main.url(forResource: "You Belong With Me", withExtension: "mp3") else {
            return
        }
        guard let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1) else {
            return
        }
        let generator = SHSignatureGenerator()
        do {
            let audioFile = try AVAudioFile(forReading: url)
            guard let inputBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: 44100 * 10),
                  let outputBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: 44100 * 10) else {
                return
            }
            // Read file into buffer
            let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
                do {
                    try audioFile.read(into: inputBuffer)
                    outStatus.pointee = .haveData
                    return inputBuffer
                } catch {
                    if audioFile.framePosition >= audioFile.length {
                        outStatus.pointee = .endOfStream
                        return nil
                    } else {
                        outStatus.pointee = .noDataNow
                        return nil
                    }
                }
            }
            guard let converter = AVAudioConverter(from: audioFile.processingFormat, to: audioFormat) else {
                return
            }
            let status = converter.convert(to: outputBuffer, error: nil, withInputFrom: inputBlock)
            if status == .error || status == .endOfStream {
                return
            }
            try generator.append(outputBuffer, at: nil)
            if status == .inputRanDry {
                return
            }
        } catch {
            print(error)
        }
        // create signature
        let signature = generator.signature()
        // try to match
        session.match(signature)
    } else {
        // unavailable alert
    }
}
Reference: Apple forums
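For longer tracks, a possible refactor (my own sketch under the same iOS 15 assumptions, not code from the thread or the forum post) is to convert the file in one-second chunks and feed every converted chunk to a single SHSignatureGenerator, rather than converting one fixed 10-second buffer:
@available(iOS 15.0, *)
func makeSignature(fromFileAt url: URL) throws -> SHSignature? {
    let audioFile = try AVAudioFile(forReading: url)
    // Matching worked above with 44.1 kHz mono, so convert to that format.
    guard let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1),
          let converter = AVAudioConverter(from: audioFile.processingFormat, to: outputFormat),
          let inputBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: 44100),
          let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: 44100) else {
        return nil
    }
    let generator = SHSignatureGenerator()
    while true {
        try audioFile.read(into: inputBuffer, frameCount: 44100)
        if inputBuffer.frameLength == 0 { break } // reached end of file
        // Hand the converter exactly one chunk per convert() call.
        var consumed = false
        let inputBlock: AVAudioConverterInputBlock = { _, outStatus in
            if consumed {
                outStatus.pointee = .noDataNow
                return nil
            }
            consumed = true
            outStatus.pointee = .haveData
            return inputBuffer
        }
        if converter.convert(to: outputBuffer, error: nil, withInputFrom: inputBlock) == .error {
            return nil
        }
        try generator.append(outputBuffer, at: nil)
        outputBuffer.frameLength = 0 // reset for the next pass
    }
    return generator.signature()
}
The caller can then pass whatever this returns to session.match(_:), exactly as above.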

Replacing audio file for AKAudioPlayer playback not working

I have an app in which I want to have a single audio player, and the ability to switch out what audio clips are in the player. I am currently using AKAudioPlayer and replace(file: audioFile).
I have the following class that gets created on the view controller loading:
class AudioFilePlayer {

    var songFile = Bundle.main
    var player: AKAudioPlayer!

    func play(file: String, type: String) {
        var audioFile: AKAudioFile!
        let song = songFile.path(forResource: file, ofType: type)
        do {
            let url = URL(string: song!)
            audioFile = try AKAudioFile(forReading: url!)
        } catch {
            AKLog(error)
        }
        do {
            player = try AKAudioPlayer(file: audioFile)
        } catch {
            AKLog(error)
        }
    }

    func rePlay(file: String, type: String) {
        var audioFile: AKAudioFile!
        let song = songFile.path(forResource: file, ofType: type)
        do {
            let url = URL(string: song!)
            audioFile = try AKAudioFile(forReading: url!)
        } catch {
            AKLog(error)
        }
        do {
            try player.replace(file: audioFile)
        } catch {
            AKLog(error)
        }
    }

    func pause() {
        player.pause()
    }
}
Once the app starts, I have the following code to set up the AK signal chain and create a player with an audio file, and I immediately pause it:
audioFilePlayer.play(file: "Breathing_01", type: ".mp3")
audioFilePlayer.player.looping = false
AudioKit.output = audioFilePlayer.player
do {
    try AudioKit.start()
} catch {
    AKLog("AudioKit did not start!")
}
audioFilePlayer.player.play()
audioFilePlayer.pause()
Elsewhere in the app, I have the following code to replace the audio file used in the player:
self.audioFilePlayer.player.pause()
self.audioFilePlayer.rePlay(file: "Breathing_01", type: "mp3")
self.audioFilePlayer.player.play()
When I run the app and initiate the process of trying to replace the file, I see this log:
2020-04-05 17:32:13.674413-0700 Mindful[24081:4439478] [general] AKAudioPlayer.swift:replace(file:):397:AKAudioPlayer -> File with "Breathing_01.mp3" Reloaded (AKAudioPlayer.swift:replace(file:):397)
2020-04-05 17:32:13.686119-0700 Mindful[24081:4439478] [general] AKAudioPlayer.swift:startTime:171:AKAudioPlayer.startTime = 0.0, startingFrame: 0 (AKAudioPlayer.swift:startTime:171)
2020-04-05 17:32:14.282632-0700 Mindful[24081:4439478] [general] AKAudioPlayer.swift:updatePCMBuffer():570:read 13359773 frames into buffer (AKAudioPlayer.swift:updatePCMBuffer():570)
But there is no audio output at all. Setting breakpoints, I can confirm that my player is playing, but I hear no audio.
Any help appreciated!
I had the same issue. Doing the following fixed it:
try player.replace(file: audioFile)
DispatchQueue.main.async {
    self.player.play(from: 0)
}
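If it helps, this is how I would fold that fix into the rePlay(file:type:) method from the question (a sketch assuming the same AudioFilePlayer class, not tested beyond the fix above):
func rePlay(file: String, type: String) {
    guard let path = songFile.path(forResource: file, ofType: type) else {
        AKLog("File not found")
        return
    }
    do {
        let audioFile = try AKAudioFile(forReading: URL(fileURLWithPath: path))
        try player.replace(file: audioFile)
        // Defer play to the next run-loop pass so the player can finish
        // rebuilding its internal buffer after replace(file:).
        DispatchQueue.main.async {
            self.player.play(from: 0)
        }
    } catch {
        AKLog(error)
    }
}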

Issue grabbing Audio Data from AVCaptureDataOutputSynchronizer (Swift)

I am trying to use AVCaptureDataOutputSynchronizer to capture media from multiple capture outputs. I have issues grabbing the audio data in dataOutputSynchronizer, as the audioData guard statement always fails. I have no issue grabbing the videoData, and I am able to display the frames using the code below.
I wonder if it's an issue with AVCaptureSynchronizedSampleBufferData, but AVCaptureSynchronizedSampleBufferData is a container for video or audio samples collected using synchronized capture.
I am also not sure whether I am configuring my AVCaptureAudioDataOutput correctly. Here is my ViewController class initializing and retrieving the video and audio data.
Note that I am running this on an iPhone XS with iOS 12. The code is written in Swift. Any help with trying to debug this would be great!
//
//  ViewController.swift
//  AudioVideoSyncTest
//
//  Created by Andrew Mendez on 2/23/19.
//  Copyright © 2019 Andrew Mendez. All rights reserved.
//
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureDataOutputSynchronizerDelegate, AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {

    var capSession: AVCaptureSession!
    var videoDataOutput = AVCaptureVideoDataOutput()
    var audioDataOutput = AVCaptureAudioDataOutput()
    var dataOutputSynchronizer: AVCaptureDataOutputSynchronizer!
    var dataOutputQueue = DispatchQueue(label: "com.amendez.dataOutputQueue")

    @IBOutlet var imageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        initSession()
        capSession.startRunning()
    }

    func initSession() {
        capSession = AVCaptureSession()
        capSession.sessionPreset = .photo
        let dualCameraDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .unspecified)
        let audioDevice = AVCaptureDevice.default(for: .audio)
        let videoInput = try? AVCaptureDeviceInput(device: dualCameraDevice!)
        let audioInput = try? AVCaptureDeviceInput(device: audioDevice!)
        do {
            if capSession.canAddInput(videoInput!) == true {
                capSession.addInput(videoInput!)
            } else { print("Issue input camera") }
            if capSession.canAddInput(audioInput!) == true {
                capSession.addInput(audioInput!)
            } else { print("Issue Adding audio input") }

            // configuring outputs
            // video output
            videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
            videoDataOutput.alwaysDiscardsLateVideoFrames = true
            videoDataOutput.setSampleBufferDelegate(self, queue: dataOutputQueue)
            guard capSession.canAddOutput(videoDataOutput) else { fatalError() }
            capSession.addOutput(videoDataOutput)
            let videoConnection = videoDataOutput.connection(with: .video)
            videoConnection!.videoOrientation = .portrait

            // audio
            guard capSession.canAddOutput(audioDataOutput) else {
                print("FAILED"); fatalError()
            }
            audioDataOutput.setSampleBufferDelegate(self, queue: dataOutputQueue)
            capSession.addOutput(audioDataOutput)

            // synchronizer
            dataOutputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [
                videoDataOutput,
                audioDataOutput])
            dataOutputSynchronizer.setDelegate(self, queue: dataOutputQueue)
        } catch {
            print("Error Config Input")
        }
    }

    @IBAction func startRecord(_ sender: Any) {
    }

    @IBAction func stopRecord(_ sender: Any) {
    }

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        // get video data
        guard let videoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
            return
        }
        guard !videoData.sampleBufferWasDropped else {
            print("Dropped video:\(videoData)")
            return
        }
        let pixBuffer = CMSampleBufferGetImageBuffer(videoData.sampleBuffer)
        DispatchQueue.main.async {
            self.imageView.image = UIImage(ciImage: CIImage(cvImageBuffer: pixBuffer!))
        }

        // get audio data
        guard let audioData = synchronizedDataCollection.synchronizedData(for: audioDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
            print("Error getting Audio Buffer")
            return
        }
        guard !audioData.sampleBufferWasDropped else {
            print("Dropped audio:\(audioData)")
            return
        }
        print(audioData.sampleBuffer)
    }
}
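One observation on the delegate method above (my own reading of the code, not an answer from the thread): a synchronized collection is not guaranteed to contain data for every output, so when a collection carries only audio, the early return in the video guard exits before the audio guard is ever reached. A sketch that handles each output independently instead:
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // Handle video if this collection contains it, without returning early.
    if let videoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData,
       !videoData.sampleBufferWasDropped,
       let pixBuffer = CMSampleBufferGetImageBuffer(videoData.sampleBuffer) {
        DispatchQueue.main.async {
            self.imageView.image = UIImage(ciImage: CIImage(cvImageBuffer: pixBuffer))
        }
    }
    // Handle audio independently of whether video arrived in this collection.
    if let audioData = synchronizedDataCollection.synchronizedData(for: audioDataOutput) as? AVCaptureSynchronizedSampleBufferData,
       !audioData.sampleBufferWasDropped {
        print(audioData.sampleBuffer)
    }
}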

iPhone simulator 10.0 - Speech recognition unavailable

I'm a newbie to Swift! I am trying to implement an app that converts speech to text using the speech recognizer.
Problem
SFSpeechRecognizer().isAvailable is false
private let request = SFSpeechAudioBufferRecognitionRequest()
private var task: SFSpeechRecognitionTask?
private let engine = AVAudioEngine()

func recognize() {
    guard let node = engine.inputNode else {
        return
    }
    let recordingFormat = node.outputFormat(forBus: 0)
    node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { buffer, _ in
        self.request.append(buffer)
    }
    engine.prepare()
    do {
        try engine.start()
    } catch {
        return print(error)
    }
    guard let systemRecognizer = SFSpeechRecognizer() else {
        return
    }
    if !systemRecognizer.isAvailable {
        self.log(.debug, msg: "Entered this condition and stopped!")
        return
    }
}
Question
I am not sure why it stops in the simulator. Does the microphone work in the iPhone simulator?
Update
I tried testing with an audio file, using the code below:
let audioFile = Bundle.main.url(forResource: "create_activity", withExtension: "m4a", subdirectory: "Sample Recordings")
let recognitionRequest = SFSpeechURLRecognitionRequest(url: audioFile!)
and I am getting an error which says: Error Domain=kAFAssistantErrorDomain Code=1101 "(null)"
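For completeness, a minimal file-based recognition sketch (assuming speech recognition authorization has already been granted; the file name matches the update above):
guard let audioFile = Bundle.main.url(forResource: "create_activity", withExtension: "m4a", subdirectory: "Sample Recordings"),
      let recognizer = SFSpeechRecognizer() else {
    return
}
let recognitionRequest = SFSpeechURLRecognitionRequest(url: audioFile)
recognizer.recognitionTask(with: recognitionRequest) { result, error in
    if let result = result {
        print(result.bestTranscription.formattedString)
    } else if let error = error {
        print(error) // errors such as the Code=1101 above are reported here
    }
}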
It looks like the simulator gained access to the microphone with iOS 11.
Unfortunately I was not able to find any documentation confirming this, but I can confirm the functionality with the following code sample. It works perfectly fine on the iOS 11 simulator, but does nothing on the iOS 10 simulator (or earlier).
import UIKit
import Speech

class ViewController: UIViewController {

    private var recognizer = SFSpeechRecognizer()
    private var request = SFSpeechAudioBufferRecognitionRequest()
    private let engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()
        requestPermissions()
    }

    private func requestPermissions() {
        //
        // Do not forget to add `NSMicrophoneUsageDescription` and `NSSpeechRecognitionUsageDescription` to `Info.plist`
        //
        // Request recording permission
        AVAudioSession.sharedInstance().requestRecordPermission { allowed in
            if allowed {
                // Request speech recognition authorization
                SFSpeechRecognizer.requestAuthorization { status in
                    switch status {
                    case .authorized: self.prepareSpeechRecognition()
                    case .notDetermined, .denied, .restricted: print("SFSpeechRecognizer authorization status: \(status).")
                    }
                }
            } else {
                print("AVAudioSession record permission: \(allowed).")
            }
        }
    }

    private func prepareSpeechRecognition() {
        // Check if recognizer is available (has failable initializer)
        guard let recognizer = recognizer else {
            print("SFSpeechRecognizer not supported.")
            return
        }
        // Prepare recognition task
        recognizer.recognitionTask(with: request) { (result, error) in
            if let result = result {
                print("SFSpeechRecognizer result: \(result.bestTranscription.formattedString)")
            } else {
                print("SFSpeechRecognizer error: \(String(describing: error))")
            }
        }
        // Install tap on the audio engine input node
        let inputNode = engine.inputNode
        let busNumber = 0
        let recordingFormat = inputNode.outputFormat(forBus: busNumber)
        inputNode.installTap(onBus: busNumber, bufferSize: 1024, format: recordingFormat) { buffer, time in
            self.request.append(buffer)
        }
        // Prepare and start audio engine
        engine.prepare()
        do {
            try engine.start()
        } catch {
            return print(error)
        }
    }
}
Do not forget to add NSMicrophoneUsageDescription and NSSpeechRecognitionUsageDescription to Info.plist.
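The two Info.plist entries would look something like this (the usage strings are placeholders):
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used to capture speech for transcription.</string>
<key>NSSpeechRecognitionUsageDescription</key>
<string>Speech is sent to Apple to be converted into text.</string>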

AVAudioPlayer not playing audio in Swift

I have this code in a very simple, single view Swift application in my ViewController:
var audioPlayer = AVAudioPlayer()

@IBAction func playMyFile(sender: AnyObject) {
    let fileString = NSBundle.mainBundle().pathForResource("audioFile", ofType: "m4a")
    let url = NSURL(fileURLWithPath: fileString)
    var error: NSError?
    audioPlayer = AVAudioPlayer(contentsOfURL: url, error: &error)
    audioPlayer.delegate = self
    audioPlayer.prepareToPlay()
    if (audioPlayer.isEqual(nil)) {
        println("There was an error: (er)")
    } else {
        audioPlayer.play()
        NSLog("working")
    }
}
I have added import AVFoundation, and audioPlayer is a global variable. When I execute the code, it does print "working", so it makes it through without errors, but no sound is played. The device is not in silent mode.
There's so much wrong with your code that Socratic method breaks down; it will probably be easiest just to throw it out and show you:
var player: AVAudioPlayer! = nil // will be Optional, must supply initializer

@IBAction func playMyFile(sender: AnyObject?) {
    let path = NSBundle.mainBundle().pathForResource("audioFile", ofType: "m4a")
    let fileURL = NSURL(fileURLWithPath: path)
    player = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
    player.prepareToPlay()
    player.delegate = self
    player.play()
}
I have not bothered to do any error checking, but the upside is you'll crash if there's a problem.
One final point, which may or may not be relevant: not every m4a file is playable. A highly compressed file, for example, can fail silently (pun intended).
It's important that the AVAudioPlayer is a class member and not local to the given function; otherwise it goes out of scope... :)
I had to declare a global player variable
var player: AVAudioPlayer!
and set it in viewDidLoad
override func viewDidLoad() {
    super.viewDidLoad()
    player = AVAudioPlayer()
}
Then I could play the audio file from anywhere, like this:
func playAudioFile() {
    do {
        guard let audioFileUrl = audioFileUrl else {
            return
        }
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        /* The following line is required for the player to work on iOS 11. Change the file type accordingly. */
        player = try AVAudioPlayer(contentsOf: audioFileUrl, fileTypeHint: AVFileType.m4a.rawValue)
        /* iOS 10 and earlier require the following line:
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileTypeMPEGLayer3) */
        guard let player = player else { return }
        player.play()
        print("PLAYING::::: \(audioFileUrl)")
    } catch let error {
        print(error.localizedDescription)
    }
}
Here is a working snippet from my Swift project. Replace "audiofile" with your file name.
var audioPlayer = AVAudioPlayer()
let audioPath = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("audiofile", ofType: "mp3")!)
audioPlayer = AVAudioPlayer(contentsOfURL: audioPath, error: nil)
audioPlayer.delegate = self
audioPlayer.prepareToPlay()
audioPlayer.play()
You can download the fully functional Swift audio player application source code from https://github.com/bpolat/Music-Player
For some reason (probably a bug), Xcode can't play certain music files in the .m4a and .mp3 formats. I would recommend converting them all to .wav files to get them to play.
// top of your class
var audioPlayer = AVAudioPlayer()

// where you want to play your sound
let sound = NSURL(fileURLWithPath: Bundle.main.path(forResource: "sound", ofType: "wav")!)
do {
    audioPlayer = try AVAudioPlayer(contentsOf: sound as URL)
    audioPlayer.prepareToPlay()
} catch {
    print("Problem in getting File")
}
audioPlayer.play()
var audioPlayer = AVAudioPlayer()
var alertSound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("KiepRongBuon", ofType: "mp3")!)
AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: nil)
AVAudioSession.sharedInstance().setActive(true, error: nil)
var error:NSError?
audioPlayer = AVAudioPlayer(contentsOfURL: alertSound, error: &error)
audioPlayer.prepareToPlay()
audioPlayer.play()
I used the code below in my app and it works. Hope that's helpful.
var audioPlayer: AVAudioPlayer!
if let filePath = NSBundle.mainBundle().pathForResource("audioFile", ofType: "mp3") {
    let filePathUrl = NSURL.fileURLWithPath(filePath)
    audioPlayer = AVAudioPlayer(contentsOfURL: filePathUrl, error: nil)
    audioPlayer.play()
} else {
    println("Path for audio file not found")
}
In Swift, using try/catch solved this issue and played audio for me. My code is below:
var playerVal = AVAudioPlayer()

@IBAction func btnPlayAction(sender: AnyObject) {
    let fileURL: NSURL = NSURL(string: url)!
    let soundData = NSData(contentsOfURL: fileURL)
    do {
        playerVal = try AVAudioPlayer(data: soundData!)
    } catch {
        print("Something bad happened. Try catching specific errors to narrow things down", error)
    }
    playerVal.delegate = self
    playerVal.prepareToPlay()
    playerVal.play()
}
Based on @matt's answer, but a little more detailed, because the original answer did not completely satisfy me.
import AVFoundation

class YourController: UIViewController {

    private var player: AVAudioPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        prepareAudioPlayer()
    }

    @IBAction func playAudio() {
        player?.play()
    }
}

extension YourController: AVAudioPlayerDelegate {}

private extension YourController {
    func prepareAudioPlayer() {
        guard let path = Bundle.main.path(forResource: "you-audio", ofType: "mp3") else {
            return
        }
        let fileURL = URL(fileURLWithPath: path)
        do {
            player = try AVAudioPlayer(contentsOf: fileURL)
        } catch let ex {
            print(ex.localizedDescription)
        }
        player?.prepareToPlay()
        player?.delegate = self
    }
}
Swift 3.0:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var audioplayer = AVAudioPlayer()

    @IBAction func Play(_ sender: Any) {
        audioplayer.play()
    }

    @IBAction func Pause(_ sender: Any) {
        if audioplayer.isPlaying {
            audioplayer.pause()
        }
    }

    @IBAction func Restart(_ sender: Any) {
        if audioplayer.isPlaying {
            audioplayer.currentTime = 0
            audioplayer.play()
        } else {
            audioplayer.play()
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            audioplayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: Bundle.main.path(forResource: "bahubali", ofType: "mp3")!))
            audioplayer.prepareToPlay()
            let audioSession = AVAudioSession.sharedInstance()
            do {
                try audioSession.setCategory(AVAudioSessionCategoryPlayback)
            } catch {
            }
        } catch {
            print(error)
        }
    }
}
