A-Frame library limits on mobile? - iOS

This plunker shows a simple VR scene built with the A-Frame library (a plane + ~10 lights).
It runs great on desktop regardless of the number of lights.
The mobile iOS version loads at 60fps with 11 lights but shows a blank page with 12 or more lights.
The stats display fine, and I used WeInRe to inspect the console output; there are no particular warnings.
Is there a limit on the complexity of the scene?
Thanks.
Regards,
JD
A-Frame Version: 0.4.0
Platform / Device: iOS 10.2 / iPhone 6s - Chrome & Safari
Reproducible Code Snippet or URL:
Editable:
https://plnkr.co/edit/Am8rjMdeaPzUWnFKX2i1?p=preview
Fullscreen preview:
https://run.plnkr.co/CgcUZgDUuPfeY15R/

Lights are expensive. I believe three.js has a limit on the number of lights, and there may be hardware constraints as well. The scene may hit 60fps with just one plane, but every object you add has to factor in all 12 lights, so performance will quickly degrade.
Check out the deferred renderer for handling many lights; I'm not sure whether it works on mobile. https://github.com/takahirox/aframe-deferred-renderer

Related

MTKView vs GLKView performance on old devices

I am running into weird performance issues on an older device (a 2014 iPad Mini 2) ever since migrating from OpenGLES-based views to MTKView. To summarize, I am rendering a live video preview using MTKView and also optionally recording videos (as in the RosyWriter sample code). My new app version is written in Swift and supports both Metal and OpenGLES, so I have both OpenGL and Metal rendering code to compare. I also have a legacy version of the app, written in Objective-C, that uses OpenGLES for all preview and rendering. Here are the performance issues I see on the iPad Mini 2:
When using MTKView, if I debug the app in Xcode and zoom the video (using the AVCaptureDevice zoom API, with no Metal code to crop or zoom frames), the frame rate drops significantly (from 30 to 10 fps). If I record video while zooming (which adds a video encoding pipeline on another thread), there are frame drops in the recorded video as well.
When I profile the app using the Time Profiler, MTKView is smooth enough with zoom. However, if I then start recording video, the MTKView fps drops again or frames get delayed in the preview, but there is no frame drop in the recorded video -- the final recording is still smooth.
If I select the Release configuration for debugging, the behavior is the same as while profiling.
If I switch back to GLKView instead of MTKView, there are no issues in the preview even in the Debug configuration. There are still issues when recording -- the preview gets delayed when zooming. But if I profile the app, there are no delays or frame drops at all!
Finally, the original legacy version in Objective-C that uses OpenGLES for everything has no issues at all.
Now the question is: which tools in Instruments can I use to nail down the exact cause of the frame drops in preview and recording? Can Metal not match the performance of OpenGLES on older devices?
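One way to start narrowing this down, before reaching for Instruments, is to log per-frame CPU encode time and GPU execution time straight from the command buffer. The sketch below is illustrative, not taken from the original app: it assumes a plain MTKViewDelegate renderer and an iOS 10.3+ deployment target (needed for gpuStartTime/gpuEndTime). Comparing the numbers between Debug and Release, and between preview-only and preview-plus-recording runs, shows whether the stall is in CPU-side encoding or in GPU execution.

```swift
import Metal
import MetalKit
import QuartzCore

// Minimal MTKView renderer that times each frame.
// The actual draw calls from the app in question are omitted.
final class TimedRenderer: NSObject, MTKViewDelegate {
    private let commandQueue: MTLCommandQueue

    init?(view: MTKView) {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue() else { return nil }
        commandQueue = queue
        view.device = device
        super.init()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable,
              let descriptor = view.currentRenderPassDescriptor,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
        else { return }

        let encodeStart = CACurrentMediaTime()
        // ... encode the real draw calls here ...
        encoder.endEncoding()
        let encodeMs = (CACurrentMediaTime() - encodeStart) * 1000

        // gpuStartTime/gpuEndTime are available from iOS 10.3 onward.
        commandBuffer.addCompletedHandler { buffer in
            let gpuMs = (buffer.gpuEndTime - buffer.gpuStartTime) * 1000
            print(String(format: "CPU encode %.2f ms, GPU %.2f ms", encodeMs, gpuMs))
        }
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```

If the GPU time stays flat while the preview stutters, the contention is more likely in the capture and encoding threads than in Metal itself, which the Time Profiler and Xcode's GPU report can then confirm.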

ARKit just showing blue screen in Unity, not using camera?

OK, I have no idea what is going on here and can't find any solutions anywhere. Here is what happens when I try to run this ARKit Unity demo (or any AR demo, for that matter), https://github.com/FusedVR/PetAR , built to my iPhone:
The UI shows up, but where the camera capture is supposed to appear, I just have a blue screen. This is not what happens in their demo video online, and it seems no one else has this problem.
I am on Unity 5.6.6; I was on 2017 before and that did not work either. I made sure I had some text in my "Camera description" field so the iPhone would allow camera access, and I am out of solutions at this point.
How can I get ARKit to work in Unity deployed to iOS? What am I doing wrong here?
I am deploying the Unity build via Xcode 9, the most recent beta.
There are certain hardware and software requirements in order to run ARKit-based applications.
https://developer.apple.com/arkit/
From "High Performance Hardware and Rendering Optimizations": ARKit runs on the Apple A9 and A10 processors. Practically, that means you need an iPhone 6s or newer.
From "Introducing ARKit": iOS 11 introduces ARKit, a new framework. So iOS 11 is also required.
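If you want to rule out the hardware/OS side before digging into the Unity project, a quick native-side check (a sketch in plain Swift, separate from the Unity plugin) is to ask ARKit itself whether world tracking is supported on the device:

```swift
import ARKit

// Prints whether this device/OS combination can run ARKit world tracking.
// Note: in early Xcode 9 betas the class was named ARWorldTrackingSessionConfiguration;
// this check uses the final ARKit API name.
func reportARKitSupport() {
    if #available(iOS 11.0, *) {
        if ARWorldTrackingConfiguration.isSupported {
            print("World tracking supported (A9 or newer chip).")
        } else {
            print("iOS 11 is installed, but this chip is too old for world tracking.")
        }
    } else {
        print("ARKit requires iOS 11 or later.")
    }
}
```

If the device passes this check and Unity still shows a blue screen, the problem is more likely in the Unity/Xcode setup (camera usage description, plugin version matching the installed beta) than in the hardware.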

iOS games with random frame drops (including my game)

I'm almost finished with my iOS game written in Swift + SpriteKit.
It's quite a simple game, 30-32 nodes at most. Only one node has physics; the rest is a few animated clouds (around 6). CPU usage is around 2-3% and maximum RAM usage is 75-80 MB.
On top of that, I also get frame drops when changing from one scene to another. Why could that be?
(I'm pre-loading all the textures and sounds during game init, not in the scenes.)
When I use the simulator, from the 5s up to the 6s Plus, I don't see any frame drops, which is weird. It looks like it's not my game but my iPhone 6s?
Now, I also have other games from different developers installed on the same device, and I frequently get random frame drops in those too: the game lags for 2-3 seconds and then comes back to 60fps.
Does anyone know whether this started happening after a particular iOS update? I was even wondering whether some background service is killing my phone: Facebook, WhatsApp, Messenger, etc.
Is there any way I could check what's going on?
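For what it's worth, here is a minimal sketch of the kind of up-front preloading described above (the asset names are illustrative, not from the actual project): SKTexture.preload uploads textures before they are first drawn, so scene transitions do not stall on lazy texture loading.

```swift
import Foundation
import SpriteKit

// Illustrative asset names; the real game's textures are not specified in the question.
let gameTextures = [
    SKTexture(imageNamed: "cloud"),
    SKTexture(imageNamed: "player"),
    SKTexture(imageNamed: "background")
]

func preloadGameAssets(completion: @escaping () -> Void) {
    // Uploads the textures to the GPU before any scene uses them.
    SKTexture.preload(gameTextures) {
        // The completion handler is not guaranteed to run on the main thread;
        // hop back before presenting a scene.
        DispatchQueue.main.async(execute: completion)
    }
}
```

Presenting the first real scene only from that completion handler (for example, from a lightweight loading scene) keeps the upload cost out of the transition itself.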
Could this be caused by the way newer versions of SpriteKit default to the Metal render mode instead of OpenGL? For example, do your problems go away when PrefersOpenGL=YES is added to Info.plist? I covered a bit of this performance issue in my blog post about a SpriteKit repeat shader. Note that you should only be testing on an actual iOS device, not the simulator.
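To double-check which render path a given build is actually requesting, a small sketch (assuming the flag was added as a Boolean entry) is to read the key back from the bundle at launch:

```swift
import Foundation

// Reads the PrefersOpenGL flag back from the built app's Info.plist, so a log
// line confirms whether this build should be on the OpenGL or Metal path.
let prefersOpenGL = (Bundle.main.object(forInfoDictionaryKey: "PrefersOpenGL") as? Bool) ?? false
print("PrefersOpenGL = \(prefersOpenGL) -> expecting \(prefersOpenGL ? "OpenGL" : "Metal") rendering")
```

That way a Debug run and a Release/TestFlight run can be compared against the same log line.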

Papervision 3D ridiculously slow on AIR for iOS

My game uses AIR for iOS 3.5 in GPU mode, and when creating a sphere with 24x24 segments in Papervision, the game's FPS drops significantly on mobile devices like the iPad. I can't even display a single rough sphere if I want to keep the game's FPS high.
Is this normal, am I doing something wrong, or is Papervision simply not suitable for AIR for iOS? Also, is there any fast-rendering 3D library that can run in GPU rendering mode? At this point development has gone too far to switch back to direct rendering mode.
Papervision is not an actively maintained 3D library; for 60fps rendering you have to use a hardware-accelerated 3D Flash framework like one of these:
Away3D
demo: http://away3d.com/showcase/
Flare3D
demo: http://www.flare3d.com/showcase/
Edit:
Minko
demo: aerys.in/portfolio/mercedes-e500
Alternativa3D
demo: alternativaplatform.github.com/Alternativa3D/Demos/ParticlesDemo/
And yes, they all work on an iPad :)

cacheAsBitmap has no effect on a Sprite masked with a scrollRect in AIR for iOS

I'm developing a simple kinetic menu UI component for AIR on the iPad. It's basically a lightweight stand-in for a combobox that matches the style of iOS. I have a sprite containing anywhere from 2 to 60 item buttons that pops up and lets you flick/scroll through them, showing only about 7 items at any given time.
My first attempt used a mask over my sprite, moving the menu sprite up and down under the stationary mask. This produced lackluster results on the test device (< 20 fps).
I then tried a blitting solution, leaving the menu sprite off the display list and using BitmapData.draw() to render only the part I needed visible. This produced the best results on my Windows dev platform, but this time the framerate dropped below 10 fps on the iPad. I am assuming I was incurring either heavy CPU usage or a GPU readback penalty. Originally I had hoped to run my app at 60 fps, but I've ratcheted my goal down to a more humble 30 fps.
Which brings me to my third attempt at this UI component, using the sprite's .scrollRect masking in conjunction with .cacheAsBitmap. Again, the observed behavior differs wildly between AIR on Windows and on iOS. On Windows it only redraws the part of the menu sprite bounded by the dimensions of the scrollRect, as it should. On iOS I can touch the area of the screen above or below the visible area of the menu sprite and still drag the menu, even though my finger is over "empty" space! The performance here is decent, hovering between 19 and 25 fps, and would almost certainly hit 30 if it worked as it does on Windows.
Does anyone have any ideas about either the scrollRect feature's behavior on AIR for iOS, or a better way of implementing an iOS-native-style gliding menu in AIR for iOS?
Note: the above methods were tried in both CPU and GPU mode, but CPU mode performed vastly better. I used AIR 2.7 installed on top of Flash Pro CS 5.5, with FlashDevelop as my IDE.
http://esdot.ca/site/2011/fast-rendering-in-air-3-0-ios-android#comment-10
A really nice guy from the above link: "Ya, scrollRect is basically a no-go on mobile, basically forget that API even exists. Believe it or not… old school masking is the way to go. Round and round we go!"
