ARKit just showing blue screen in Unity, not using camera? - ios

OK, I have no idea what is going on here and can't find any solutions anywhere. Here is what happens when I try to run this ARKit Unity demo (or any AR demo, for that matter) https://github.com/FusedVR/PetAR built to my iPhone:
The UI shows up, but where the camera capture is supposed to appear, I just see a blue screen. This is not what happens in their demo video online, and it seems no one else has this problem.
I am on Unity 5.6.6; I was on Unity 2017 before and that did not work either. I made sure I had some text in my "Camera Usage Description" field so the iPhone would allow camera access, and I am out of ideas at this point.
How can I get ARKit to work in Unity deployed to iOS? What am I doing wrong here?
I am deploying the Unity build via Xcode 9 (the most recent beta).

There are certain hardware and software requirements in order to run ARKit-based applications. From https://developer.apple.com/arkit/:
"High Performance Hardware and Rendering Optimizations: ARKit runs on the Apple A9 and A10 processors."
Practically, this means you need an iPhone 6s or newer.
"Introducing ARKit: iOS 11 introduces ARKit, a new framework..."
So iOS 11 is also required.
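If you want to verify at runtime that the device can actually run ARKit, a quick native-side check is possible. This is a minimal sketch using ARKit's standard availability API; the print statements are only illustrative:

import ARKit

// Minimal sketch: check whether this device supports ARKit world tracking.
// ARKit itself requires iOS 11, and world tracking requires an A9 chip or
// newer (iPhone 6s and up), so this returns false on older hardware.
if ARWorldTrackingConfiguration.isSupported {
    print("ARKit world tracking is available on this device.")
} else {
    print("ARKit world tracking is NOT available; expect no camera-backed AR view.")
}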

Related

ios ARKit 3 with iPad Pro 2020, how to use front camera data with back camera tracking?

The ARKit API supports simultaneous world and face tracking via the back and front cameras, but unfortunately due to hardware limitations, the new iPad Pro 2020 is unable to use this feature (probably because the LIDAR camera takes a lot more power). This is a bit of a step back.
Here is an updated check from the example project:
guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
    fatalError("This sample code requires iOS 13 / iPadOS 13, and an iOS device with a front TrueDepth camera. Note: 2020 iPads do not support user face-tracking while world tracking.")
}
There is also a forum conversation indicating that this is an unintentional hardware limitation.
It looks like the mobile technology is not "there yet" for both. However, for my use case I just wanted to be able to switch between front and back tracking modes seamlessly, without needing to reconfigure the tracking space. For example, I would like a button to toggle between "now you track and see my face" mode and "world tracking" mode.
There are two cases, possible or impossible, and there may be alternative approaches depending on which one it is.
If it's possible: would switching AR tracking modes require setting up the tracking space again, and how would the switch be achieved?
If it's impossible:
Even if I don't get face-tracking during world-tracking, is there a way to get a front-facing camera feed that I can use with the Vision framework, for example?
Specifically: how do I enable back-facing tracking, get front- and back-facing camera feeds simultaneously, and selectively disable one or the other? If that is possible even without front-facing tracking, with only the basic feed, it will work for me.
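For reference, on hardware that does support it, enabling the combined mode typically looks something like the sketch below. This is illustrative rather than taken from the project above, and it assumes an existing ARSession (e.g. arView.session):

import ARKit

// Sketch: back-camera world tracking with the front (TrueDepth) camera
// feeding face data into the same session. Only works where
// supportsUserFaceTracking is true, which excludes the 2020 iPad Pro.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
// `session` is assumed to be an existing ARSession, e.g. arView.session.
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])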

Why is Pokemon Go running on unsupported devices?

If most devices do not support ARCore, then why does Pokemon Go run on every device?
My device is not supported by ARCore, but Pokemon Go runs on it with full performance.
Why?
Until October 2017, Pokemon Go appeared to use a Niantic-made AR engine. At a high level, the game placed the Pokemon globally in space at a server-defined location (the spawn point). The AR engine used the phone's GPS and compass to determine whether the phone should be moved to the left or to the right. Once the phone pointed in the right heading, the AR engine drew the 3D model over the video coming from the camera. At that time there was no attempt to perform mapping of the environment, surface recognition, etc. It was a simple yet very effective technique which created the stunning effects we've all seen.
After that, Niantic showed prototypes of Pokemon GO using ARKit for iOS. The enhancements are easy to notice: missed pokeballs appear to bounce very naturally on the sidewalk and respect physics, and Pikachu seems to walk naturally on the sidewalk instead of floating in the air as in the game released at the time. Most observers expected Niantic to replace the existing engine with ARKit (iOS) and ARCore (Android), possibly via Unity 3D AR APIs.
In early 2018, Niantic improved the look of the game on Android by adding support for ARCore, Google's augmented reality SDK, a similar update to what we had already seen on iOS 11 when the game was updated to support ARKit. The iOS update gave the virtual monsters a much greater sense of presence in the world thanks to camera tracking, allowing them to stand more accurately on real-world surfaces rather than floating in the center of the frame. Android users need a phone compatible with ARCore in order to use the new "AR+" mode.
Prior to AR+, Pokémon Go used rough approximations of where objects were to try to place the Pokémon in your environment, but it was a clunky workaround that functioned mostly as a novelty feature. The new AR+ mode also lets iOS users take advantage of a new capture bonus, called Expert Handler, that involves sneaking up close to a Pokémon so as not to scare it away, in order to capture it more easily. With ARKit, since it's designed to use the camera together with the gyroscope and all the other sensors, the feed runs at 60 fps at full resolution. It's a lot more performant and actually uses less battery than the original AR mode.
For iOS users there's a standard list of supported devices:
iPhone 6s and higher
iPad 2017 and higher
For Android users, not everything is as clear. Even if you have an officially unsupported device with poorly calibrated sensors, you can still use ARCore on your phone; for example, ARCore for All allows you to do it. So it would not be difficult for Niantic to make every Android phone suitable for Pokemon Go either.
Hope this helps.

Xcode 9 beta – Unsupported pixel format [Vuforia ARKit]

I have an iOS project within Unity that accesses the device's camera for Vuforia.
At some point within the app, ARKit will also be accessed. However, when switching from Vuforia to ARKit, I get this error:
//[Sensor] Unsupported pixel format: 875704438
//AR FAIL
This only happens when Vuforia boots up first and then ARKit. If ARKit is used on its own, it works fine.
This only seems to be the case with iOS 11 Beta 3. It worked fine on Beta 2, but I can't downgrade.
Any reasons as to what may be causing this?
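One observation (not from the original post): the number in that log line is an OSType/FourCC, so it can be decoded to see which pixel format was rejected. A small sketch:

import Foundation

// Decode a numeric OSType/FourCC code into its four-character string.
func fourCC(_ code: UInt32) -> String {
    let bytes: [UInt8] = [
        UInt8((code >> 24) & 0xFF),
        UInt8((code >> 16) & 0xFF),
        UInt8((code >> 8) & 0xFF),
        UInt8(code & 0xFF)
    ]
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

print(fourCC(875704438)) // prints "420v"

875704438 decodes to "420v", i.e. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, which at least narrows down which camera pixel format is involved.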

A-Frame library limits on mobile?

This plunker shows a simple VR scene with the A-Frame library (a plane + ~10 lights).
It runs great on desktop regardless of the number of lights.
The mobile iOS version loads at 60fps with 11 lights but shows a blank page with 12+ lights.
The stats display fine, and I used WeInRe to inspect the console, with no particular warning.
Is there a limitation of the complexity of the scene?
Thanks.
Regards,
JD
A-Frame Version: 0.4.0
Platform / Device: iOS 10.2 / iPhone 6s - Chrome & Safari
Reproducible Code Snippet or URL:
Editable :
https://plnkr.co/edit/Am8rjMdeaPzUWnFKX2i1?p=preview
Fullscreen preview :
https://run.plnkr.co/CgcUZgDUuPfeY15R/
Lights are expensive. I believe three.js has a limit on the number of lights, and there might be hardware constraints. It may be 60 fps with just one plane, but each object you add has to factor in all 12 lights, so the scene will quickly degrade.
Check out the deferred renderer for handling many lights; not sure if it works on mobile: https://github.com/takahirox/aframe-deferred-renderer

Use native iOS camera filters with Cordova Camera Plugin

I would like to use the native camera filters and editing options (like red-eye correction), available since iOS 7 on the iPhone 4S and later, in my Cordova app using the Camera Plugin.
I remember I already managed to do it, but I can't remember how...
I've tried the allowEdit: true option but it only allows you to crop your image before uploading...
It's driving me crazy, because I'm 100% sure I already got it working on a former project, without even trying to, and people seem to say it's impossible...
Does somebody have an idea?
