The provided configuration is not supported on this device - ios

I have an iPhone 7 and I'm working on 3D face filters (like TikTok's), but whenever I run the app from Xcode it shows the error "The provided configuration is not supported on this device" and only displays a black screen.

You can't use all ARKit features on an iPhone 7; some features require at least an A12 processor or newer. For example, on an iPhone 7 you can't use People Occlusion, Body Tracking, simultaneous detection of three faces, or Scene Reconstruction.
And remember: you always have to check with an if or guard statement whether a feature is supported on the current device:
guard let config = arView.session.configuration as? ARWorldTrackingConfiguration else {
    print("You can't run this config on this device.")
    return
}
guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
    print("People Occlusion isn't supported here.")
    return
}
config.frameSemantics.insert(.personSegmentationWithDepth)
arView.session.run(config)
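Since the question is specifically about face filters, here is a minimal sketch of the same kind of check for face tracking, which TikTok-style filters rely on; it assumes arView and its session already exist, as in the snippet above:
import ARKit

// Hedged sketch: ARFaceTrackingConfiguration is unavailable on devices without
// a TrueDepth camera or an A12-or-newer chip, which is why the iPhone 7 fails.
guard ARFaceTrackingConfiguration.isSupported else {
    print("Face tracking isn't supported on this device.")
    return
}

let faceConfig = ARFaceTrackingConfiguration()
arView.session.run(faceConfig)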

Related

Why canSetSessionPreset always returns true when device can't support the preset?

We want to use Apple's API to check if the device camera supports 4K, but have no success so far.
let captureSession = AVCaptureSession()
let supported = captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd4K3840x2160)
if supported {
    print("supported")
} else {
    // Somehow, never comes here.
    print("unsupported")
}
The above code/API always returns true even if the device can't support the 4K preset (e.g. an iPod touch 7th generation).
In case this can't be fixed, is there any other way of checking for a 4K camera?
I highly value your help.
Thank you.
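If canSetSessionPreset(_:) isn't giving a reliable answer, one hedged alternative sketch (not from the original thread) is to inspect the video formats of the capture device itself; this assumes you only care about the built-in back wide-angle camera:
import AVFoundation

// Hedged sketch: check whether any of the back camera's formats offers
// 4K (3840x2160) video, independent of the session preset.
func backCameraSupports4K() -> Bool {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return false }
    return device.formats.contains { format in
        let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        return dimensions.width >= 3840 && dimensions.height >= 2160
    }
}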

Check if iOS device has LiDAR in Swift

Is there a way in Swift to check if the device has a LiDAR sensor? Unfortunately I didn't find anything in the official Apple documentation nor through an internet search.
My current workaround is to determine the device type like described in this post:
How to determine the current iPhone/device model?
Thanks
Use this code:
import ARKit

let supportLiDAR = ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)

guard supportLiDAR else {
    print("LiDAR isn't supported here")
    return
}
Scene reconstruction requires a device with a LiDAR Scanner, such as the fourth-generation iPad Pro.
Reference: https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/3521377-supportsscenereconstruction
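Once that check passes, a minimal sketch of actually turning the feature on could look like this (it assumes an existing RealityKit ARView called arView, which is not part of the original answer):
// Hedged sketch: enable mesh scene reconstruction only after the capability check.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}
arView.session.run(config)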
The accepted answer is fine, and here is another solution:
You can check for the availability of depth data from the LiDAR scanner; we need to check whether the device supports this sensor and, if so, enable the .sceneDepth flag on the ARConfiguration.
Use this function:
func setupARConfiguration() -> ARConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // Add specific configurations
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics = .sceneDepth
    } else {
        print("Device does not support the LiDAR sensor")
    }
    return configuration
}
From the Apple docs:
Call this function before attempting to enable a frame semantic on your app's configuration. For example, if you call supportsFrameSemantics(.sceneDepth) on ARWorldTrackingConfiguration, the function returns true on devices that support the LiDAR scanner's depth buffer.
Reference: https://developer.apple.com/documentation/arkit/arconfiguration/3089122-supportsframesemantics

How to support low data mode in iOS 13?

"Low Data Mode" is introduced in iOS 13. See "Settings" section Apple's iOS 13 overview:
I couldn't find any developer documentation on this.
Is this something third-party app developers can opt into, as suggested by MacRumors? Or will it just suspend background activity when not connected to Wi-Fi, as suggested by AppleInsider?
To determine if iOS is currently in Low Data Mode, you can use the Network framework:
import Network // Put this at the top of your file

let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.isConstrained {
        // Path uses an interface in Low Data Mode.
    } else if path.isExpensive {
        // Path uses an interface that is considered expensive, such as cellular or a Personal Hotspot.
    }
}
monitor.start(queue: DispatchQueue.global(qos: .background))
URLSession supports Low Data Mode in iOS 13.
Steps:
1. Provide two different resources: a high-resolution one and a low-resolution one (for Low Data Mode).
2. If statusCode == 200, Low Data Mode is disabled in Settings and the high-resolution request succeeded.
3. If error.networkUnavailableReason == .constrained, Low Data Mode is enabled in Settings, so fall back to the low-resolution resource.
Check out Advances in Networking, Part 1 from WWDC 2019 from 16:00 for a demo and sample code. You can use Combine to make the code simpler, but this is not required. A sketch of the pattern follows below.
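Here is a minimal, hedged sketch of that fallback pattern; the function name and both URLs are placeholders rather than part of the WWDC sample:
import Foundation

// Hedged sketch of the pattern described above: try the high-resolution resource
// with constrained access disallowed; if the request fails because the network is
// constrained (Low Data Mode), fall back to the low-resolution resource.
func loadImageData(highRes: URL, lowRes: URL, completion: @escaping (Data?) -> Void) {
    var request = URLRequest(url: highRes)
    request.allowsConstrainedNetworkAccess = false  // fail fast in Low Data Mode

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error as? URLError, error.networkUnavailableReason == .constrained {
            // Low Data Mode is on: fetch the smaller resource instead.
            URLSession.shared.dataTask(with: lowRes) { data, _, _ in
                completion(data)
            }.resume()
        } else {
            completion(data)
        }
    }.resume()
}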
First of all, you need to configure your URLSession (NSURLSession) to either allow or disallow expensive or constrained network connections to be opened.
You can do that by setting the respective URLSessionConfiguration (or URLRequest) properties allowsExpensiveNetworkAccess or allowsConstrainedNetworkAccess to false (NO).
If a URLSessionTask fails with an NSURLErrorNotConnectedToInternet error whose userInfo contains an NSURLErrorNetworkUnavailableReasonKey entry (Objective-C), or with a URLError whose networkUnavailableReason property is non-nil (Swift), then you need to act accordingly.
Those reasons can be:
expensive
constrained
cellular
The cellular reason has sort of been there since iOS 7, so it's not new, but the entire reason enumeration is, as Apple streamlined connection type handling a bit this year.
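A minimal, hedged sketch of the session-level approach (the URL is a placeholder):
import Foundation

// Hedged sketch: disallow constrained access at the session level, then inspect
// the unavailability reason when a task fails.
let configuration = URLSessionConfiguration.default
configuration.allowsConstrainedNetworkAccess = false  // respect Low Data Mode
configuration.allowsExpensiveNetworkAccess = true     // cellular / hotspot still allowed
let session = URLSession(configuration: configuration)

let url = URL(string: "https://example.com/resource")!  // placeholder URL
session.dataTask(with: url) { data, response, error in
    if let error = error as? URLError, let reason = error.networkUnavailableReason {
        // reason is .constrained (Low Data Mode), .expensive, or .cellular
        print("Request failed, network unavailable because: \(reason)")
    }
}.resume()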
Here is the solution in Xamarin, for those interested:
NWPathMonitor monitor = new NWPathMonitor();
monitor.SetUpdatedSnapshotHandler(path =>
{
    if (path.Status == NWPathStatus.Satisfied)
    {
        if (path.IsConstrained)
        {
            // Path uses an interface in Low Data Mode.
        }
    }
});
monitor.SetQueue(CoreFoundation.DispatchQueue.DefaultGlobalQueue);
monitor.Start();

ARKit – How to generate a worldMap for big environment?

I'm developing a 4-player game using ARKit. I know how to save and then restore a worldMap; that part isn't difficult.
sceneView.session.getCurrentWorldMap { worldMap, error in
    guard let map = worldMap else {
        self.showAlert(title: "Can't get current world map", message: error!.localizedDescription)
        return
    }
    guard let snapshotAnchor = SnapshotAnchor(capturing: self.sceneView) else {
        fatalError("Can't take snapshot")
    }
    map.anchors.append(snapshotAnchor)
    do {
        let data = try NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
        try data.write(to: self.mapSaveURL, options: [.atomic])
        DispatchQueue.main.async {
            self.loadExperienceButton.isHidden = false
            self.loadExperienceButton.isEnabled = true
        }
    } catch {
        fatalError("Can't save map: \(error.localizedDescription)")
    }
}
But I don't know how to scan a large place (it's 50x50 meters) with an iPhone in order to generate this worldMap.
Could you give me an idea of how to track this space?
If you want to move effectively within a real environment with AR objects around you, you should use the whole developer's arsenal for precise positioning: the Core Location framework (it provides services for determining a device's geographic location, altitude, orientation, or position relative to a nearby iBeacon), the iBeacon framework with certified hardware for it (iBeacon's location awareness is especially useful for indoor navigation), and the Vision and Core ML frameworks (designed for using a trained machine-learning model to classify input data such as signs and images).
Before using the aforementioned frameworks, you should scan the whole environment with the iPhone multiple times, each pass adding new feature points to the existing point cloud.
P.S.
In addition to the above, ARKit 4.0 offers developers tools such as ARGeoTrackingConfiguration with ARGeoAnchors, and the Scene Reconstruction feature (which works when your device is equipped with a LiDAR scanner). A capability check for geo tracking is sketched below.
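A minimal, hedged sketch of checking whether geo tracking can be used at the current location (iOS 14+; arView is assumed to be an existing ARView and is not part of the original answer):
import ARKit

// Hedged sketch: geo tracking needs an A12-or-newer chip, GPS, and Apple's
// localization coverage at the user's location.
guard ARGeoTrackingConfiguration.isSupported else {
    print("Geo tracking isn't supported on this device.")
    return
}

ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else {
        print("Geo tracking isn't available here: \(error?.localizedDescription ?? "unknown reason")")
        return
    }
    let config = ARGeoTrackingConfiguration()
    arView.session.run(config)
}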

ARKit: how to correctly check for supported devices?

I'm using ARKit, but it's not core functionality in my app, so I'm not setting the arkit key in UIRequiredDeviceCapabilities. I'm using the #available(iOS 11.0, *) directive, but ARKit requires an A9 processor or above (that is, iPhone 6S or newer...).
What's the best way to check that? I've found a workaround that involves checking the device model in several places, but it looks a bit complicated. And would this be rejected in an App Store review?
You should check the isSupported boolean provided by the ARConfiguration class for this.
From the Apple Developer Documentation:
isSupported
A Boolean value indicating whether the current device supports this session configuration class.
All ARKit configurations require an iOS device with an A9 or later processor. If your app otherwise supports other devices and offers augmented reality as a secondary feature, use this property to determine whether to offer AR-based features to the user.
Just check ARConfiguration availability.
if ARConfiguration.isSupported {
    // Great! Let's offer the ARKit experience.
} else {
    // Sorry! This device doesn't support ARKit.
}
Here is a solution for checking AR support in React Native (this one checks for the ARCore package on Android):
// Get an array containing the installed apps
var installedApps = require("react-native-installed-packages");
let appArray = installedApps.getApps();

// Check appArray for the ARCore package
var arPackage = appArray.find(function (_package) {
    return _package == "com.google.ar.core";
});

if (typeof arPackage === "undefined") {
    console.log("AR not supported");
} else {
    console.log("AR supported");
}
