I have an iOS project within Unity that accesses the device's camera for Vuforia.
At some point within the app, ARKit will also be accessed. However, when switching from Vuforia to ARKit, I get this error:
//[Sensor] Unsupported pixel format: 875704438
//AR FAIL
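For what it's worth, the number in that error is a Core Video pixel-format code: a FourCC packed into a 32-bit integer. Decoding it (self-contained Swift, no frameworks needed) shows which format the sensor is complaining about:

```swift
// Decode a FourCC pixel-format code such as the one in the error above.
func fourCC(_ code: UInt32) -> String {
    let chars = [24, 16, 8, 0].map { shift -> Character in
        Character(UnicodeScalar(UInt8((code >> shift) & 0xFF)))
    }
    return String(chars)
}

print(fourCC(875704438)) // "420v", i.e. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
```

So the camera is being handed a bi-planar YCbCr buffer, presumably left over from Vuforia's session, in a format the ARKit sensor path rejects.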
This only happens when Vuforia boots up first and then ARKit. If ARKit is used on its own, it works fine.
This only seems to be the case with iOS 11 Beta 3. It worked fine on Beta 2, but I can't downgrade.
Any idea what may be causing this?
I recently started building a game for iOS using my iPhone 5s for testing.
Recently, I decided to try it on my iPhone SE to see how it looks; all of the materials on all of the objects appear to be missing, but the trails for the particle effects and the line renderers still have their color.
It could be a shader issue or a problem with the graphics API Unity is using. If you're not using the Standard shader, make sure that your shader is compatible with mobile devices. Also make sure that it's included in the build by creating a folder named Resources and moving your shader into it.
If you're using one of the standard shaders that ships with Unity, then the issue is likely not the shader but the graphics API selected. Unity is probably using Metal, which is causing the issue; use OpenGL ES instead.
Disable Auto Graphics API, then change the iOS Graphics API to OpenGLES2 or OpenGLES3 in Unity's Player Settings.
Running a fresh build of the exact same application on iOS 11.2 and 11.3, everything works fine on 11.2, but on 11.3 it appears as if none of the materials are rendered onto the ARKit face node, which uses an SCNMorpher and blend-shape values to update.
I'm going to dig into this more and report back with a fix if I can find one, but I thought I'd drop a beacon here to see if anyone else is having similar issues and, if so, whether they've found any solutions, or if anyone has any ideas in general.
It is my understanding that minor version bumps should be fully backwards compatible. Is that correct?
Thanks
Update 1: This seems to affect both ARKit and SceneKit scenes.
Update 2: This seems to be related to both materials and the way lighting is handled differently in 11.3. By rendering the lighting from further away, a similar effect to 11.2 is achieved. It's almost as if the base units for distance have changed from meters to inches or something. Confusingly, changing the lighting distance now seems to affect only a device running 11.3, not 11.2. The problem now is that the rendering is completely matte. Further isolation reveals that specular renders normally on 11.2, but setting specular on 11.3, either as a UIImage or a UIColor, simply has no effect. I'm going to try setting the specular to something else, like a CALayer or a CGImage, and see if either of those works, despite all of the above having claimed support in the documentation.
iOS 11.3 seems to change the default lightingModel, which is the root cause of all the woes.
Simply setting the materials explicitly to what was previously implicit resolved all pains.
For example,
baseNode.geometry?.materials[0].lightingModel = .blinn
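If the geometry has multiple materials, or the fix needs to reach child nodes as well, a minimal sketch (assuming a root baseNode, as in the one-liner above) that applies the explicit model to the whole subtree:

```swift
import SceneKit

// Walk the node tree and pin every material to the lighting model
// that was the implicit default before iOS 11.3.
func applyBlinn(to node: SCNNode) {
    node.geometry?.materials.forEach { $0.lightingModel = .blinn }
    node.childNodes.forEach { applyBlinn(to: $0) }
}

// Usage: applyBlinn(to: baseNode)
```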
OK, I have no idea what is going on here, and I can't find any solutions anywhere. Here is what happens when I try to run this ARKit Unity demo (or any AR demo, for that matter) https://github.com/FusedVR/PetAR built to my iPhone:
The UI shows up, but where the camera capture is supposed to appear, I just have a blue screen. This is not what happens in their demo video online, and it seems no one else has this problem.
I am on Unity 5.6.6; however, I was on 2017 before and that did not work either. I made sure I had some text written in my "Camera description" field so the iPhone would allow camera access, and I am out of solutions at this point.
How can I get ARKit to work in Unity deployed to iOS? What am I doing wrong here?
I have the Unity build deploying via Xcode 9, the most recent beta.
There are certain hardware and software requirements in order to run ARKit-based applications.
https://developer.apple.com/arkit/
High Performance Hardware and Rendering Optimizations
ARKit runs on the Apple A9 and A10 processors.
Practically, you need an iPhone 6s or newer.
Introducing ARKit
iOS 11 introduces ARKit, a new framework
iOS 11 is also required.
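Beyond checking the device model, you can test for support at runtime; ARKit exposes a flag on its configuration classes (ARWorldTrackingConfiguration is the shipping iOS 11 name; early betas called it ARWorldTrackingSessionConfiguration):

```swift
import ARKit

// Gate the AR flow on actual hardware capability rather than a model list.
if ARWorldTrackingConfiguration.isSupported {
    // A9/A10 or newer: safe to run the ARSession
} else {
    // Older hardware: fall back to a non-AR experience
}
```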
I would like to use the native camera filters and editing options (like red-eye correction) available on iOS 7 with the iPhone 4S and later in my Cordova app using the Camera plugin.
I remember I already managed to do it, but I can't remember how...
I've tried the allowEdit: true option, but it only allows you to crop your image before uploading...
It's driving me crazy because I'm 100% sure I already got it working on a former project, without even trying to, and people seem to say it's impossible...
Does somebody have an idea?
I enabled multisampling support on iOS 4.3.3 with OpenGL ES 2, and the rendering result is awful, as if the color were RGB565, not ARGB8888.
The thing is, if I either turn multisampling off or deploy the same ipa to an iOS 5 device (which indicates that I did turn multisampling on correctly), the problem does not occur, although turning multisampling off makes it very ugly (which also indicates that the multisampling does work in most cases).
The test case is very simple: just render a quad with a texture attached in ortho projection mode; the color format of the texture is RGBA8888.
Has anyone met the same problem before? Is this a bug in the Apple SDK?
BTW, the SDK I used is the one shipped with Xcode 4.3.2, and the iOS deployment target is set to 4.0.
It turns out the root cause is that eaglLayer.drawableProperties was set to kEAGLColorFormatRGB565.
But still, why is the result so different when turning multisampling on/off?
I wondered whether this was caused by the different color formats of the sampling buffer (RGBA8) and the surface buffer (RGB565), so I changed the sampling buffer to GL_RGB565, but the problem was still not solved.
Maybe this is something not well implemented in glResolveMultisampleFramebufferAPPLE? Anyway, the problem does not exist on an iOS 5 device.
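For anyone hitting the same thing, the fix lives where the drawable is configured. A sketch in Swift for illustration (the era-appropriate Objective-C sets the same dictionary; glView standing in for your GL-backed view is an assumption):

```swift
import QuartzCore
import OpenGLES

// Ask for an 8-bit-per-channel drawable instead of RGB565, so the
// multisample resolve does not land in a 16-bit surface.
let eaglLayer = glView.layer as! CAEAGLLayer   // glView: hypothetical GL-backed UIView
eaglLayer.isOpaque = true
eaglLayer.drawableProperties = [
    kEAGLDrawablePropertyRetainedBacking: false,
    kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
]
```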