Crash when instantiating ARSCNView for the second time - iOS

I have a problem in my iPhone app when trying to instantiate an ARSCNView again, after destroying it.
In my ViewController I programmatically create an ARSCNView for motion capture interaction:
func addARSceneView() {
    arSceneView = ARSCNView(frame: self.view.frame)
    arSceneView.loops = true
    arSceneView.session.delegate = self
    self.view.addSubview(arSceneView)
    arSceneView.session.run(ARBodyTrackingConfiguration())
}
When the user leaves this part of the app, I tear it down like this:
func removeARSceneView() {
    arSceneView.session.pause()
    arSceneView.pause(self)
    arSceneView.session.delegate = nil
    arSceneView.removeFromSuperview()
    arSceneView = nil
}
Later, when I try to instantiate an ARSCNView a second time using the first function above, it crashes with an EXC_BAD_ACCESS in the constructor.
I also tried using a view from a XIB that contains an ARSCNView, but the same problem occurs, in that case inside the view's init(coder:) initializer.
I found nothing about this problem; I guess developers usually only create an ARSCNView once.

TLDR: Turn "Metal API Validation" on in your scheme.
I found the culprit after creating a sample project containing only the ARSCNView, which did not have this problem. I started by stripping everything away from my original project until it was as barebones as the sample. That did not solve it, so I compared every little setting of the two projects, and behold: in the "Run" scheme of the original project, under "Diagnostics", I had unchecked "Metal API Validation". I don't remember when or why I did that; I assume it was some attempt to improve performance at one point. However, enabling this checkbox solved the problem completely.

Related

EAExternalAccessory Bluetooth Accessory Picker doesn't display on screen

I've been trying to use the following method in Swift 5 to display the Bluetooth Accessory Picker:
DispatchQueue.main.async {
    EAAccessoryManager.shared().showBluetoothAccessoryPicker(withNameFilter: nameFilter, completion: nil)
}
But I get the following error:
A constraint factory method was passed a nil layout anchor. This is not allowed, and may cause confusing exceptions. Break on BOOL _NSLayoutConstraintToNilAnchor(void) to debug. This will be logged only once. This may break in the future.
I've tried adding a Symbolic breakpoint, but it doesn't trigger. I've also tried calling this method from other parts of the code with different views, but still no success.
I'd really appreciate some help with this issue!
The picker broke when Apple introduced the new scene-based lifecycle. Roll back to the classic one and it starts working again. More details here: https://stackoverflow.com/a/70823487/415982
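For reference, rolling back to the classic lifecycle usually means removing the UIApplicationSceneManifest entry from Info.plist (and the SceneDelegate) and letting the app delegate own the window again. A minimal sketch, assuming a hypothetical ViewController as the root:
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {

    // With the scene manifest removed, the app delegate owns the window,
    // as in the pre-iOS 13 lifecycle.
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        let window = UIWindow(frame: UIScreen.main.bounds)
        window.rootViewController = ViewController() // hypothetical root view controller
        window.makeKeyAndVisible()
        self.window = window
        return true
    }
}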

Removing default camera movement in SimpleApplication

I'm an absolute beginner in UrhoSharp. I only wanted to add some basic 3D content to my app. Everything mostly works fine with SimpleApplication, but the scene is supposed to be viewed from one fixed position and direction, and when I touch the screen the scene rotates. How can I get rid of this behavior?
I wanted to try overriding some function of SimpleApplication (probably OnUpdate), so I introduced the name CursedApplication and replaced SimpleApplication with it everywhere. When I use
using CursedApplication = Urho.SimpleApplication;
everything still works. But what I assumed to be the equivalent,
class CursedApplication : Urho.SimpleApplication
{
    CursedApplication(ApplicationOptions options) : base(options)
    {
    }
}
breaks the application. Any idea how I can make this work? Or do I have to build my own scene logic without SimpleApplication?
I finally found that this can be done with
app.Input.Enabled = false;
where app is an instance of SimpleApplication.

Buttons in iOS app don't respond to touches around the edge of the screen

I'm relatively new to iOS development, but I'm having a go at working on some open-source code for an old game that used to be pretty popular (Eden World Builder).
I've made quite a lot of progress in cleaning up the codebase and making small changes. But there's an issue I can't seem to fix: no button in the game will respond to taps around the edges of the screen. If a button is in the corner of the screen, you have to tap towards the bottom of the button.
I've tried moving the buttons away from the edges, and they work, but that isn't practical. So something is preventing the edges of the screen from registering button taps, and it doesn't seem to have anything to do with the buttons' target areas themselves.
One thing I've noticed: this game is currently on the App Store, even though it hasn't been updated since 2015. In the App Store version (which is built from the same code that I have), the issue doesn't occur. It must be something to do with building it with a newer version of Xcode, right?
Any assistance would be very helpful; this has been frustrating me for weeks now. Thanks!
The answer to your first question is most likely either that your button is outside its parent view, or a gesture recognizer is interfering.
If a button extends beyond its parent view's boundaries (or any of its ancestors' boundaries), it will still be visible as long as the parent doesn't have clipping enabled. The result is that you can still see the whole button, but it only responds when you touch the parts that are inside the parent view. You can check this visually with Xcode's Debug View Hierarchy tool, found in the Debug navigator.
If it is gesture recognizers that interfere with your button, there are several solutions that might work; some are described in the link you got from @Anbu in the comments.
The answer to your second question is that old apps are linked against old frameworks. Even if they run on the latest iOS version, they still pull in older versions of those frameworks, causing them to (mostly) continue to work as before. This is done to keep compatibility with legacy code.
Try adding this to viewDidAppear:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Disable the window's gesture recognizers so they can no longer
    // delay or swallow button touches.
    if let window = view.window,
       let recognizers = window.gestureRecognizers {
        recognizers.forEach { r in
            r.delaysTouchesBegan = false
            r.cancelsTouchesInView = false
            r.isEnabled = false
        }
    }
}
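If disabling every recognizer outright feels too heavy-handed, another option (a sketch, not taken from the game's code, assuming you can reach the interfering recognizer) is a gesture recognizer delegate that declines touches landing on buttons:
import UIKit

class ButtonFriendlyGestureDelegate: NSObject, UIGestureRecognizerDelegate {
    // Let taps that land on a UIButton go straight to the button instead of
    // being delayed or swallowed by the gesture recognizer.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldReceive touch: UITouch) -> Bool {
        return !(touch.view is UIButton)
    }
}
You would keep a strong reference to this delegate somewhere and assign it to the recognizer's delegate property before the recognizer becomes active.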

App crashes strangely regarding MKMapView memory issues

I've been developing an app that was working perfectly fine until I added an MKMapView. If I pan around the map over a broader area, dismiss the view controller containing the map, and then add a new view controller, the app crashes with Xcode saying "Lost connection to iPhone".
I searched online and found that it is probably a memory issue. So I have tried a number of ways to release the memory used by MKMapView, including storing only one instance of the MKMapView in the AppDelegate and cleaning it up in viewDidDisappear like this:
if let annotations = self.mapView?.annotations {
    self.mapView?.removeAnnotations(annotations)
}
// Toggling the map type is a commonly suggested trick intended to flush cached tiles.
if self.mapView?.mapType == MKMapType.standard {
    self.mapView?.mapType = MKMapType.hybrid
} else if self.mapView?.mapType == MKMapType.hybrid {
    self.mapView?.mapType = MKMapType.standard
}
self.mapView?.delegate = nil
self.mapView?.removeFromSuperview()
self.mapView = nil
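For reference, the "single shared MKMapView instance" approach mentioned above might look roughly like this (names are illustrative, not from the original project):
import UIKit
import MapKit

final class SharedMapProvider {
    static let shared = SharedMapProvider()

    // One MKMapView reused across screens, so its large caches are only
    // allocated once instead of once per view controller.
    private(set) lazy var mapView = MKMapView()

    func attach(to container: UIView) {
        mapView.frame = container.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        container.addSubview(mapView)
    }

    func detach() {
        mapView.removeAnnotations(mapView.annotations)
        mapView.delegate = nil
        mapView.removeFromSuperview()
    }
}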
Even stranger, when I was tracking the memory usage while in the map, it could go up to 300+ MB, and it seemed to clean itself up as I explored more. However, around 200 MB remained after I dismissed that view controller, and when I added another simple view controller with just one UIImageView, it crashed with the same "Lost connection" message.
I'm new to memory management, but I do have a screenshot of the Instruments profile:
Just like here in the image: Generation A is the point where I presented the view controller with the MKMapView, and Generation B is the point where I dismissed that view. The memory usage was apparently dropping, which is good. But as soon as I tap to present a different view controller, it disconnects just like in Xcode.
I have done a lot of research on this and I really don't know what to do at this point. Thanks if you guys can help me out here!!!!
Problem fixed after I switched to Google Maps...

Xcode incorrectly displaying the value of watch variables

I was scratching my head wondering why my rvc was nil when watching it while debugging in Xcode, thinking there must be something wrong with my project. So I created a project from scratch using the Xcode single view app template, and the only change I made was to add the following lines to didFinishLaunchingWithOptions:
Now look at what Xcode is displaying for the value of rvc in the watch window:
What's going on? Why is it reporting that rvc is nil?
It's not actually nil, but Xcode's watch window is reporting it as such. Is this a known issue with Xcode?
I found out the problem: it's the positioning of the breakpoint. Position it a line higher and rvc is not nil.
If you look at the watch window, you can see rvc is shown in light text while the other variables are in bold text.
So my conclusion is that at that point Xcode is reporting that rvc has gone out of scope and been deallocated, yet since the return statement has not yet executed at that point, rvc should not yet have gone out of scope.
Is this the result of some Objective-C optimization, or is it a defect in Xcode?
Either way, it wasted a few hours of my time thinking I had a problem when there wasn't one. In the future I need to make sure never to position breakpoints on a return statement that involves local variables.
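To illustrate the conclusion, here is a hypothetical Swift sketch (not the original Objective-C code) of how the breakpoint position can change what the variables view shows for a local:
import UIKit

func configureRoot(for window: UIWindow) -> Bool {
    let rvc = UIViewController()        // breakpoint here: rvc shows a value
    window.rootViewController = rvc     // breakpoint here: rvc still shows a value
    return true                         // breakpoint on this return: rvc may appear greyed out and be reported as nil
}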
