ARKit: switch between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration - Rear and Front Camera - iOS

In my project I want to switch between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration.
I use two different types of view: an ARSCNView for the rear camera and an ARView for the face tracking.
First I start the ARSCNView; afterwards, if the user wants, they can switch to face tracking.
I set up my view controller like this:
sceneView.delegate = self
sceneView.session.delegate = self

// Set up scene content.
setupCamera()
sceneView.scene.rootNode.addChildNode(focusSquare)

let configurationBack = ARWorldTrackingConfiguration()
configurationBack.isAutoFocusEnabled = true
configurationBack.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configurationBack, options: [.resetTracking, .removeExistingAnchors])
Then I load my object (.scn).
When I want to switch to the front camera and hand off to the ARView, I do this:
let configurationFront = ARFaceTrackingConfiguration()

// Here I stop my ARSCNView session.
self.sceneView.session.pause()

self.myArView = ARView(frame: self.sceneView.frame)
self.myArView!.session.run(configurationFront)
self.myArView!.session.delegate = self
self.view.insertSubview(self.myArView!, aboveSubview: self.sceneView)
Then I load my .rcproject.
My problem begins here, when I try to return to the rear camera and run ARWorldTrackingConfiguration again.
This is my method:
// Remove my ARView with face tracking.
self.myArView?.session.pause()
UIView.animate(withDuration: 0.2, animations: {
    self.myArView?.alpha = 0
}) { _ in
    self.myArView?.removeFromSuperview()
    self.myArView = nil
}

// Here I restart the initial ARSCNView.
let configurationBack = ARWorldTrackingConfiguration()
configurationBack.isAutoFocusEnabled = true
configurationBack.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configurationBack, options: [.resetTracking, .removeExistingAnchors])
When I switch back to the rear camera, tracking no longer detects planes correctly.
How can I fix that - that is, how can I switch correctly between ARWorldTrackingConfiguration and ARFaceTrackingConfiguration?
Thanks in advance

Make sure to also remove all of the nodes you added to the scene when you pause the session. Add the code below right after you pause the session with self.sceneView.session.pause():
self.sceneView.scene.rootNode.enumerateChildNodes { (childNode, _) in
    childNode.removeFromParentNode()
}
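Putting both pieces together, a minimal sketch of the switch-back path could look like this (assuming the sceneView and myArView properties from the question; the method name is hypothetical):
func switchBackToWorldTracking() {
    // Tear down the face-tracking ARView first.
    self.myArView?.session.pause()
    self.myArView?.removeFromSuperview()
    self.myArView = nil

    // Remove every node left over from the previous world-tracking session.
    self.sceneView.scene.rootNode.enumerateChildNodes { (childNode, _) in
        childNode.removeFromParentNode()
    }

    // Re-run world tracking from scratch.
    let configurationBack = ARWorldTrackingConfiguration()
    configurationBack.isAutoFocusEnabled = true
    configurationBack.planeDetection = [.horizontal, .vertical]
    self.sceneView.session.run(configurationBack, options: [.resetTracking, .removeExistingAnchors])
}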

Related

Why am I having trouble filling an external screen on iPad (but not in simulator)?

I am able to detect when a screen is connected, associate it with an appropriate windowScene, and add a view to it. Slightly hacky but approximately working (code for disconnection not included here), thanks to this SO question:
class ExternalViewController: UIViewController {
    // Referenced from ViewController below.
    weak var mainVC: ViewController?

    override func viewDidLoad() {
        view.backgroundColor = .cyan
        print("external frame \(view.frame.width)x\(view.frame.height)")
    }
}
class ViewController: UIViewController {
    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        // nb, Apple documentation seems out of date.
        // https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }
            // Give the system time to update the connected scenes.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find the matching UIWindowScene.
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene
                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    // fatalError("Connected scene was not found") // You might want to retry here after some time
                }
                let screenDimensions = newScreen.bounds
                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene
                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
When I do this in the iOS simulator, I see my graphics fill the screen as intended, but when running on my actual device, they appear with a substantial black border around all sides.
Note that this is not the usual border seen with the default display-mirroring behaviour: the 16:9 aspect ratio is preserved, and I do see different graphics as expected (a flat cyan color in my example code; normally I'm doing some Metal rendering that has some slight anomalies that are out of scope here, although digging into them might turn up different clues on this).
The print messages report the expected 1920x1080 dimensions. I don't know UIKit very well and haven't been doing much active Apple development lately (I'm dusting off a couple of old side projects in the hope of using them to project visuals at a gig in the near future), so I don't know if there's something to do with sizing constraints etc. that I'm missing; but even so, it's hard to see why it would behave differently in the simulator.
Other apps I have installed from the App Store do indeed show fullscreen graphics on the external display: Netflix shows fullscreen video as you would expect, and Concepts shows a different representation of the document than the one you see on the device.
So, in this instance the issue turned out to be Overscan Compensation. Thanks to Jerrot on Discord for pointing me in the right direction.
In the context of my app, it is sufficient to set newScreen.overscanCompensation = .none in the connection notification handler (actually, in the part that runs a few ms after the notification; it doesn't work if applied directly in the notification itself). In the question linked above, there is further discussion of other aspects that may be important in a different context.
This is my ViewController modified to achieve the desired result:
class ViewController: UIViewController {
    var additionalWindows: [UIWindow] = []

    override func viewDidLoad() {
        // nb, Apple documentation seems out of date.
        // https://stackoverflow.com/questions/61191134/setter-for-screen-was-deprecated-in-ios-13-0
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification, object: nil, queue: nil) { [weak self] notification in
            guard let self = self else { return }
            guard let newScreen = notification.object as? UIScreen else { return }
            // Give the system time to update the connected scenes.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
                // Find the matching UIWindowScene.
                let matchingWindowScene = UIApplication.shared.connectedScenes.first {
                    guard let windowScene = $0 as? UIWindowScene else { return false }
                    return windowScene.screen == newScreen
                } as? UIWindowScene
                guard let connectedWindowScene = matchingWindowScene else {
                    NSLog("--- Connected scene was not found ---")
                    return
                    // fatalError("Connected scene was not found") // You might want to retry here after some time
                }
                let screenDimensions = newScreen.bounds
                ////// new code here --->
                newScreen.overscanCompensation = .none
                //////
                let newWindow = UIWindow(frame: screenDimensions)
                NSLog("newWindow \(screenDimensions.width)x\(screenDimensions.height)")
                newWindow.windowScene = connectedWindowScene
                let vc = ExternalViewController()
                vc.mainVC = self
                newWindow.rootViewController = vc
                newWindow.isHidden = false
                self.additionalWindows.append(newWindow)
            }
        }
    }
}
In this day and age, I find it pretty peculiar that overscan compensation is enabled by default.

iOS ARKit4 - how to fix white balance for the ARCamera?

I have an app that generates a point cloud from multiple ARFrames. It appears that the camera used to capture the images has dynamic white balance, and can change it in the middle of a capture session.
How do I configure the ARView, ARSession, or ARCamera to force it to lock white balance for the duration of the session?
I have access to the following parameters, but do not see anything related to white balance:
var arView: ARView!
let session: ARSession = arView.session
var sampleFrame: ARFrame = session.currentFrame!
let camera = sampleFrame.camera

func configureSessionAndRun() {
    arView.automaticallyConfigureSession = false
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .meshWithClassification
    configuration.frameSemantics = .smoothedSceneDepth
    configuration.planeDetection = [.horizontal, .vertical]
    configuration.environmentTexturing = .automatic
    arView.session.run(configuration)
}
There are only two related properties, exposed on the frame's camera, but they are just gettable, not settable:
let frame = arView.session.currentFrame
frame?.camera.exposureDuration // { get }
frame?.camera.exposureOffset // { get }
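No answer is recorded above, but it may be worth noting that the ARKit 4 API surface shown in the question does not expose white balance at all. If targeting iOS 16 or later is an option, ARKit exposes the underlying AVCaptureDevice, which in principle allows locking white balance. A rough sketch of that idea (an assumption, not a tested recipe for this use case):
func runSessionWithLockedWhiteBalance(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    arView.session.run(configuration)

    // iOS 16+: ARKit exposes the capture device behind the primary camera.
    if #available(iOS 16.0, *),
       let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
        do {
            try device.lockForConfiguration()
            // Freeze white balance at its current gains, if the device supports it.
            if device.isWhiteBalanceModeSupported(.locked) {
                device.whiteBalanceMode = .locked
            }
            device.unlockForConfiguration()
        } catch {
            print("Could not lock capture device configuration: \(error)")
        }
    }
}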

Resetting AVPlayer and ARImageTracking

I made an app that allows you to scan an image with an iOS device and play a video using ARKit image tracking. However, once the video is finished, I can't seem to find a way to make it restart. Is there any way I can re-run all the code in the view controller?
if let imageAnchor = anchor as? ARImageAnchor {
    // Create a plane.
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)
    if imageAnchor.referenceImage.name == "skateboard" {
        // Set AVPlayer as the plane's texture and play.
        plane.firstMaterial?.diffuse.contents = self.SkateboardVideoPlayer
        self.SkateboardVideoPlayer.play()
    }
}
Have you tried looping the video with a notification observer?
See the below code from here:
var SkateboardVideoPlayer: AVPlayer!

// In viewDidLoad or similar method
...
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: self.SkateboardVideoPlayer.currentItem, queue: .main) { [weak self] _ in
    self?.SkateboardVideoPlayer?.seek(to: CMTime.zero)
    self?.SkateboardVideoPlayer?.play()
}
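A side note on cleanup: the block-based addObserver(forName:object:queue:using:) returns an observer token, which you generally want to keep and remove when the controller goes away. A small sketch of that housekeeping (the playbackObserver property name is made up here):
var playbackObserver: NSObjectProtocol?

// When registering, keep the returned token:
playbackObserver = NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: SkateboardVideoPlayer.currentItem, queue: .main) { [weak self] _ in
    self?.SkateboardVideoPlayer?.seek(to: CMTime.zero)
    self?.SkateboardVideoPlayer?.play()
}

deinit {
    // Unregister so the closure doesn't outlive the view controller.
    if let token = playbackObserver {
        NotificationCenter.default.removeObserver(token)
    }
}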

ARKit: How to reset world orientation after interruption

I'm trying to get the ARWorldTracking session to re-orient to north after a session interruption. I've gone over the documentation a few times but I'm finding it confusing.
Current Behavior:
When I lock the device and reopen the app, triggering sessionWasInterrupted, the SCNNodes all shift counterclockwise on the compass by roughly 90 degrees.
The documentation for run(_:options:) says:
"When you call the run(_:options:) method with a configuration of a different type than the session's current configuration, the session always resets tracking."
I interpreted that as saying that when I run a new configuration that differs from the one in viewWillAppear, the session will "reset". I'm not sure what is actually happening, but the orientation after the interruption is off (and removeExistingAnchors does nothing).
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    configuration.worldAlignment = .gravityAndHeading
    sceneView.session.run(configuration)
}

func sessionWasInterrupted(_ session: ARSession) {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    configuration.worldAlignment = .gravityAndHeading
    self.sceneView.session.run(configuration, options: [ARSession.RunOptions.removeExistingAnchors, ARSession.RunOptions.resetTracking])
}
Desired Behavior:
When the app detects a session interruption, I'd like it to re-orient itself back to true north.
This issue was killing me, too - you helped me out with half the solution: adding the resetTracking / removeExistingAnchors options was the magic key for me. The other half, I think, is the guidance from this post, where you pause your session, remove all the nodes from the scene, and then re-position them. The combination of both of these things got the compass to reset back to true north after a session interruption for me.
func resetARSession() {
    // Called from sessionInterruptionEnded(_:)
    sceneView.session.pause()
    sceneView.scene.rootNode.enumerateChildNodes { (node, stop) in
        node.removeFromParentNode()
    }
    setupARSession()
    setupSceneView()
}

func setupARSession() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.worldAlignment = .gravityAndHeading
    sceneView.session.run(configuration, options: [ARSession.RunOptions.resetTracking, ARSession.RunOptions.removeExistingAnchors])
}
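For completeness, the delegate callback that drives this is ARSessionObserver's sessionInterruptionEnded(_:). Assuming the view controller is the session delegate (and that setupSceneView() re-adds your nodes), the wiring is just:
func sessionInterruptionEnded(_ session: ARSession) {
    // Rebuild tracking once the interruption (e.g. device lock) is over.
    resetARSession()
}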

AVCaptureMetadataOutput Region of Interest for Barcodes

I am looking to display a UIView subclass within a UIStackView. The subclass is called PLBarcodeScannerView and uses AVCaptureMetadataOutput to detect barcodes within the camera's field of view. Because this view does not fill the entire screen, I need to set the region of interest to match the frame of the PLBarcodeScannerView: the user only sees a portion of the camera view, and we want to be sure that the barcode visible in that view is the one being scanned.
Issue
I cannot seem to set the metadataOutputRectOfInterest properly, nor does the "zoom level" of the preview layer on this view seem correct, although the aspect ratio is correct. The system does receive barcodes successfully, but they are not always visible within the preview window; codes are still scanned when they lie outside the visible preview.
Screenshot:
The colorful photo is the PLBarcodeScannerView. Only codes which are fully visible inside this view should be considered.
Below is the code that initializes the view.
This is called within the init methods of PLBarcodeScannerView (a UIView subclass):
func setupView() {
    session = AVCaptureSession()
    let tap = UITapGestureRecognizer(target: self, action: #selector(self.resume))
    addGestureRecognizer(tap)

    // Set the captureDevice.
    let videoCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

    // Create input object.
    let videoInput: AVCaptureDeviceInput?
    do {
        videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
    } catch {
        return
    }

    // Add input to the session.
    if (session!.canAddInput(videoInput)) {
        session!.addInput(videoInput)
    } else {
        scanningNotPossible()
    }

    // Create output object.
    let metadataOutput = AVCaptureMetadataOutput()

    // Add output to the session.
    if (session!.canAddOutput(metadataOutput)) {
        session!.addOutput(metadataOutput)
        // Send captured data to the delegate object via a serial queue.
        metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        // Set barcode type for which to scan (here: Code 128).
        metadataOutput.metadataObjectTypes = [
            AVMetadataObjectTypeCode128Code
        ]
    } else {
        scanningNotPossible()
    }

    // Determine the size of the region of interest.
    let x = self.frame.origin.x/UIScreen.main.bounds.width
    let y = self.frame.origin.y/UIScreen.main.bounds.height
    let width = self.frame.width/UIScreen.main.bounds.height
    let height = self.frame.height/UIScreen.main.bounds.height
    let scanRectTransformed = CGRect(x: x, y: y, width: 1, height: height)
    metadataOutput.metadataOutputRectOfInterest(for: scanRectTransformed)

    // Add previewLayer and have it show the video data.
    previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = self.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    layer.addSublayer(previewLayer)

    // Begin the capture session.
    session!.startRunning()
}
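No answer is recorded above, but one observation may help future readers: rather than computing the normalized rect by hand, AVCaptureVideoPreviewLayer can perform the conversion itself, and the result belongs in the output's rectOfInterest property (the code above computes a rect but never assigns it to anything). A minimal sketch of that idea, using the modern Swift API names and assuming previewLayer and metadataOutput are stored as properties; note the conversion is only reliable once the session is running:
func updateRegionOfInterest() {
    // Convert the visible preview area from layer coordinates to the
    // output's normalized coordinate space (accounts for videoGravity cropping).
    let visibleRect = previewLayer.metadataOutputRectConverted(fromLayerRect: previewLayer.bounds)
    metadataOutput.rectOfInterest = visibleRect
}

// Example call site, after the session has started:
// session.startRunning()
// DispatchQueue.main.async { self.updateRegionOfInterest() }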
