I have created a subclass of SCNNode. It is made up of a few child nodes.
I have declared a method, soundCasual(), which adds an SCNAudioPlayer to an instance of this class. Everything works as expected and audio plays when this method is called. The method is called on the node whenever that node is tapped (via a gesture recognizer).
Code:
import SceneKit

class MyNode: SCNNode {
    let wrapperNode = SCNNode()
    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    override init() {
        super.init()
        if let virtualScene = SCNScene(named: "MyNode.scn", inDirectory: "Assets.scnassets/Shapes/MyNode") {
            for child in virtualScene.rootNode.childNodes {
                wrapperNode.addChildNode(child)
            }
        }
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.wrapperNode.removeAllAudioPlayers()
                self?.wrapperNode.addAudioPlayer(audioPlayer)
            }
        }
    }
}
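For context, the tap handling lives in the hosting view controller. A minimal sketch of what it might look like (the gesture wiring and the hit-test walk up the node hierarchy are my assumptions, not code from the project):

// Hypothetical tap handler, assuming the recognizer is attached to the SCNView:
// hit-test the view and trigger the sound on the first MyNode found among
// the touched node's ancestors.
@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    guard let scnView = recognizer.view as? SCNView else { return }
    let location = recognizer.location(in: scnView)
    guard let hit = scnView.hitTest(location, options: nil).first else { return }
    var node: SCNNode? = hit.node
    while let current = node {
        if let myNode = current as? MyNode {
            myNode.soundCasual()
            break
        }
        node = current.parent
    }
}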
Issue within Instruments (Allocations)
When I profile my whole app (which involves several other things) in Instruments, I see that every time I tap on that node, the allocation count of SCNAudioPlayer increases by one, and all of that increase is persistent. From the definition of SCNAudioPlayer I assumed the player is removed after playback, so the increments should show up as transient allocations, but that is not what happens. That is why I tried calling removeAllAudioPlayers() before adding an SCNAudioPlayer to the node, as you can see in the code for soundCasual(), but the issue remains.
By the time this snapshot was taken, I had tapped on that node about 17 times, and Instruments shows 17 persistent allocations for SCNAudioPlayer.
Note: SCNAudioSource shows 10, as it should, since I am using ten audio sources in the app.
The same thing happens for every other SCNNode in my application, without fail.
Kindly help, as I am not able to understand what exactly I am missing.
EDIT
As recommended, I changed my init() to use SCNReferenceNode:
let path = Bundle.main.path(forResource: "Keemo", ofType: "scn", inDirectory: "Assets.scnassets/Shapes/Keemo")
if let path = path, let keemo = SCNReferenceNode(url: URL(fileURLWithPath: path)) {
    keemo.load()
}
func soundPlay() {
    DispatchQueue.global(qos: .userInteractive).async { [weak self] in
        if let audioSource = self?.audioSourcePlay {
            audioSource.volume = 0.1
            let audioPlayer = SCNAudioPlayer(source: audioSource)
            self?.removeAllAudioPlayers()
            self?.addAudioPlayer(audioPlayer)
        }
    }
}
Despite this, Instruments still shows the audio players as persistent allocations, even though inspecting node.audioPlayers confirms that only one audio player is attached to the node at any one time.
EDIT
This issue appears even in the simple case of the boilerplate code of a SceneKit app created by default in Xcode. Hence, I have filed it as a bug with Apple: https://bugreport.apple.com/web/?problemID=43482539
WORKAROUND
I am now using AVAudioPlayer instead of SCNAudioPlayer. It is not exactly the same thing, but at least this way the memory growth will not cause a crash.
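For reference, a minimal sketch of this workaround, under the assumption that each node keeps and reuses one AVAudioPlayer per sound (the property and method names here are hypothetical):

import AVFoundation
import SceneKit

class MyNode: SCNNode {
    // One reusable player per sound, instead of a new SCNAudioPlayer per tap.
    private var casualPlayer: AVAudioPlayer?

    func soundCasualWorkaround() {
        if casualPlayer == nil,
            let url = Bundle.main.url(forResource: "audiofile", withExtension: "mp3") {
            casualPlayer = try? AVAudioPlayer(contentsOf: url)
            casualPlayer?.prepareToPlay()
        }
        casualPlayer?.currentTime = 0
        casualPlayer?.play()
    }
}

Note that this gives up SceneKit's positional audio, which is part of why it is only a workaround.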
I am not familiar with SceneKit, but from my experience with UIKit and SpriteKit I suspect that your use of wrapperNode and virtualScene is interfering with reference counting.
I would try removing wrapperNode and adding everything to self (since self is an SCNNode).
Which node is actually used in your scene, self or wrapperNode? In your sample code wrapperNode is never added to self, so it may or may not actually be part of the scene.
Also, you should probably be using SCNReferenceNode instead of the virtual scene approach you're using.
!!! this code has not been tested !!!
class MyNode: SCNReferenceNode {
    let audioSource5 = SCNAudioSource(fileNamed: "audiofile.mp3")

    func soundCasual() {
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            if let audioSource = self?.audioSource5 {
                let audioPlayer = SCNAudioPlayer(source: audioSource)
                self?.removeAllAudioPlayers()
                self?.addAudioPlayer(audioPlayer)
            }
        }
    }
}
// If you programmatically create this node, you'll have to call .load() on it
if let referenceNode = SCNReferenceNode(url: referenceURL) {
    referenceNode.load()
}
HtH!
If you haven't found the answer already, you need to remove the SCNAudioPlayer from the node once it has completed playing:
if let audioSource = self?.audioSourcePlay {
    audioSource.volume = 0.1
    let audioPlayer = SCNAudioPlayer(source: audioSource)
    self?.removeAllAudioPlayers()
    self?.addAudioPlayer(audioPlayer)
    // Reference the local audioPlayer (the original self?.audioplayer did not
    // compile); capture it weakly so the player does not retain itself via its own closure.
    audioPlayer.didFinishPlayback = { [weak audioPlayer] in
        if let audioPlayer = audioPlayer {
            self?.removeAudioPlayer(audioPlayer)
        }
    }
}
I am able to detect when an external screen is connected, associate it with the appropriate UIWindowScene, and add a view to it. Slightly hacky but approximately working (code for disconnection not included here), thanks to this SO question:
class ExternalViewController: UIViewController {
    // Added so this sample compiles: a back-reference that ViewController sets below.
    weak var mainVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .cyan
        print("external frame \(view.frame.width)x\(view.frame.height)")
    }
}

class ViewController: UIViewController {
    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        // nb, Apple documentation seems out of date.
        // https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }
            // Give the system time to update the connected scenes
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find the matching UIWindowScene
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene
                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    //fatalError("Connected scene was not found") // You might want to retry here after some time
                }
                let screenDimensions = newScreen.bounds
                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene
                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
When I do this in the iOS simulator, I see my graphics fill the screen as intended, but when running on my actual device, it appears with a substantial black border around all sides.
Note that this is not the usual border seen with the default display mirroring behaviour: the 16:9 aspect ratio is preserved, and I do see different graphics as expected (flat cyan color in my example code; normally I'm doing some Metal rendering that has some slight anomalies that are out of scope here, although digging into them might turn up different clues on this).
The print messages report the expected 1920x1080 dimensions. I don't know UIKit very well and haven't been doing much active Apple development lately (I'm dusting off a couple of old side projects in the hope of using them to project visuals at a gig in the near future), so I don't know if there's something to do with sizing constraints that I might be missing, but even so it's hard to see why it would behave differently in the simulator.
Other apps I have installed from the app store do indeed show fullscreen graphics on the external display - Netflix shows fullscreen video as you would expect, Concepts shows a different representation of the document than the one you see on the device.
So, in this instance the issue is to do with overscan compensation. Thanks to Jerrot on Discord for pointing me in the right direction.
In the context of my app, it is sufficient to add newScreen.overscanCompensation = .none in the connection notification handler (actually, in the part that is delayed a few ms after the notification; it doesn't work if applied directly in the connection notification). In the question linked above there is further discussion of other aspects that may be important in a different context.
This is my ViewController modified to achieve the desired result:
class ViewController: UIViewController {
    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        // nb, Apple documentation seems out of date.
        // https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }
            // Give the system time to update the connected scenes
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find the matching UIWindowScene
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene
                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    //fatalError("Connected scene was not found") // You might want to retry here after some time
                }
                let screenDimensions = newScreen.bounds
                ////// new code here --->
                newScreen.overscanCompensation = .none
                //////
                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene
                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
In this day and age, I find it pretty peculiar that overscan compensation is enabled by default.
I am very new to Swift and Xcode, and I am trying to use a video as a transition from one scene to another. I have most of the code figured out, but the size of the video does not match the size of the screen, so the background changes size when the transition occurs. I could fix this by changing the AVLayerVideoGravity, but the commands in the documentation give me errors, and I haven't found a way to set it properly. I'm sure this is really simple; I just can't find the syntax I need anywhere, especially for the newest versions of Swift and Xcode, so I thought I would ask. Here is the code I have.
import UIKit
import AVKit

class MainRoom: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    @IBAction func Backpack(_ sender: Any) {
        if let path = Bundle.main.path(forResource: "OpenBackpack", ofType: "avi", inDirectory: "SegueAnimations") {
            let OpenBackpack = AVPlayer(url: URL(fileURLWithPath: path))
            let animPlayer = AVPlayerViewController()
            animPlayer.player = OpenBackpack
            animPlayer.showsPlaybackControls = true
            let avPlayerLayer = AVPlayerLayer(player: OpenBackpack)
            // Set AVLayerVideoGravity to resizeAspectFill
            present(animPlayer, animated: true, completion: {
                OpenBackpack.play()
            })
        }
    }
}
Thank you for your help!
Found it! I misunderstood the code in another post; this works well:
animPlayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
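In context, the relevant part of the IBAction above becomes something like this (the AVPlayerLayer is no longer needed, since AVPlayerViewController exposes videoGravity directly):

let animPlayer = AVPlayerViewController()
animPlayer.player = OpenBackpack
animPlayer.showsPlaybackControls = false // optionally hide controls during the transition
// Scale the video to fill the screen, cropping if the aspect ratios differ.
animPlayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
present(animPlayer, animated: true) {
    OpenBackpack.play()
}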
I have an app that I'm adding sounds to. It has a keypad and when the user taps a button, the number animates to show the user that their press went through.
However, since both happen on the main thread, the play() call in the code below causes a slight delay in the animation. If the user waits ~2 seconds and taps a keypad number again, they see the delay again; as long as they keep hitting keypad numbers within 2 seconds, there is no further lag.
I tried wrapping the code below in a DispatchQueue.main.async {} block with no luck.
if let sound = CashSound(rawValue: "buttonPressMelody\(count)"),
   let url = Bundle.main.url(forResource: sound.rawValue, withExtension: sound.fileType) {
    self.audioPlayer = try? AVAudioPlayer(contentsOf: url)
    self.audioPlayer?.prepareToPlay()
    self.audioPlayer?.play()
}
How can I play this audio and have the animation run without them interfering and with the audio coinciding with the press?
Thanks
I experienced a rather similar problem in my SwiftUI app, and in my case the solution was proper resource loading and AVAudioPlayer initialization and preparation.
I use the following class to load audio from disk (inspired by the ImageStore class from Apple's Landmarks SwiftUI tutorial):
final class AudioStore {
    typealias Resources = [String: AVAudioPlayer]

    fileprivate var audios: Resources = [:]

    static var shared = AudioStore()

    func audio(with name: String) -> AVAudioPlayer {
        let index = guaranteeAudio(for: name)
        return audios.values[index]
    }

    static func loadAudio(with name: String) -> AVAudioPlayer {
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else {
            fatalError("Couldn't find audio \(name).mp3 in main bundle.")
        }
        do { return try AVAudioPlayer(contentsOf: url) }
        catch { fatalError("Couldn't load audio \(name).mp3 from main bundle.") }
    }

    fileprivate func guaranteeAudio(for name: String) -> Resources.Index {
        if let index = audios.index(forKey: name) { return index }
        audios[name] = AudioStore.loadAudio(with: name)
        return audios.index(forKey: name)!
    }
}
In the view's init, I initialize the player instance by calling audio(with:) with the proper resource name.
In onAppear() I call prepareToPlay() on the view's already-initialized player, with proper optional unwrapping, and finally
I play the audio when the gesture fires, as in the sketch below.
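Put together, the wiring might look like this in a SwiftUI view (a sketch; the view and resource names are hypothetical, and AudioStore is the class above):

import SwiftUI
import AVFoundation

struct KeypadButton: View {
    // Initialized once, in the view's init, via the shared store.
    private let player: AVAudioPlayer

    init() {
        player = AudioStore.shared.audio(with: "buttonPressMelody1")
    }

    var body: some View {
        Text("1")
            .onAppear {
                // Prime the audio buffers before the first tap.
                player.prepareToPlay()
            }
            .onTapGesture {
                player.play()
            }
    }
}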
Also, in my case I needed to delay the actual playback by about 0.3 seconds, and for that I dispatched it to the global queue. I should stress that the animation with the audio playback was smooth even without dispatching to a background queue, so I concluded the key was in proper initialization and preparation. To delay the playback, however, you can only use a background thread; otherwise you will get the lag.
DispatchQueue.global(qos: .userInitiated).asyncAfter(deadline: .now() + 0.3) {
    /// your audio.play() analog here
}
Hope that will help some soul out there.
I'm currently developing a small game where balloons float up from the bottom of the screen and the player has to pop them before they reach the top.
Each balloon object contains a method to safely set up its AVAudioPlayer:
func setUpAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer? {
    let path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
    let url = NSURL.fileURLWithPath(path!)
    var audioPlayer: AVAudioPlayer?
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: url)
    } catch {
        print("BALLOON ERROR - AudioPlayer not available.")
    }
    return audioPlayer
}
Then in each balloon's init() method I run the following code:
if let popSound = self.setUpAudioPlayerWithFile("PopSound", type: "wav") {
    self.popSound = popSound
}
Which works absolutely fine until a couple of minutes into the game. At that point I start receiving "BALLOON ERROR - AudioPlayer not available." in the console for every single balloon spawned from then on, indicating that my resource isn't being found?
At this time my SKEmitterNodes start to return nil as well. (2016-08-13 18:59:54.527 Stop & Pop[256:13785] *** -[NSKeyedUnarchiver initForReadingWithData:]: data is NULL)
Is there anything obvious I'm missing that could be causing these errors?
I hope I've provided enough information, thank you for reading.
No... this won't work. You have to declare the audioPlayer variable at the class level.
I tried this myself: I declared var audioPlayer at the class level of another view controller.
Your best bet is to localize this method into your view controller and put the audioPlayer there:
class YourClassName: UIViewController {
    var audioPlayer: AVAudioPlayer?

    func yourMethodForCalling() {
        // set up and play self.audioPlayer here
    }
}
I was quite dubious about this question's title phrasing, but I think that's the whole point as it is.
I've been trying simply to read the Core Motion data on the watch, but as it turns out, I can't get startDeviceMotionUpdatesToQueue to work; my handler is never called.
I tried running it on a custom background queue (NSOperationQueue()), still no luck.
I'm debugging on a real Apple Watch, not the simulator.
In my WKInterfaceController:
let manager = CMMotionManager()

override func awakeWithContext(context: AnyObject?) {
    super.awakeWithContext(context)
    let communicator = SessionDelegate()
    manager.deviceMotionUpdateInterval = 1 / 60
    manager.startDeviceMotionUpdatesToQueue(NSOperationQueue.mainQueue()) {
        (motionerOp: CMDeviceMotion?, errorOp: NSError?) -> Void in
        print("got into handler")
        guard let motion = motionerOp else {
            if let error = errorOp {
                print(error.localizedDescription)
            }
            assertionFailure()
            return
        }
        print("passed guard")
        let roll = motion.attitude.roll
        let pitch = motion.attitude.pitch
        let yaw = motion.attitude.yaw
        let attitudeToSend = ["roll": roll, "pitch": pitch, "yaw": yaw]
        communicator.send(attitudeToSend)
    }
    print("normal stack")
}
the output is
normal stack
normal stack
(Yes, twice! I don't know why that is either, but that is not the point; it must be something else I'm doing wrong.)
I'm posting this here because I have no clue where to look; this is freaking crazy.
Device motion (startDeviceMotionUpdatesToQueue) is not available in watchOS 2 yet (deviceMotionAvailable returns false). The accelerometer can probably help you instead; see startAccelerometerUpdatesToQueue.
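A minimal sketch of that fallback, in the same Swift 2-era style as the question (the update interval and printed fields are just for illustration):

if manager.accelerometerAvailable {
    manager.accelerometerUpdateInterval = 1 / 60
    manager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) {
        (dataOp: CMAccelerometerData?, errorOp: NSError?) -> Void in
        guard let data = dataOp else {
            if let error = errorOp { print(error.localizedDescription) }
            return
        }
        // Raw acceleration in g's; no attitude (roll/pitch/yaw) is available here.
        print("x: \(data.acceleration.x), y: \(data.acceleration.y), z: \(data.acceleration.z)")
    }
}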