Papervision3D ridiculously slow on AIR for iOS

My game uses AIR for iOS 3.5 in GPU render mode, and when I create a sphere with 24x24 segments using Papervision3D, the game's FPS drops significantly on mobile devices like the iPad. I can't display even a single rough sphere while keeping the game's FPS high.
Is this normal, am I doing something wrong, or is Papervision3D simply not suitable for AIR for iOS? Also, is there a fast 3D rendering library that can run in GPU render mode? At this point, development has gone too far to switch back to direct render mode.

Papervision3D is not an actively maintained 3D library, and it renders in software on the CPU. For 60 fps rendering you have to use a hardware-accelerated (Stage3D) Flash framework, like one of these:
Away3d
demo : http://away3d.com/showcase/
Flare3d
demo : http://www.flare3d.com/showcase/
Edit:
Minko
demo : aerys.in/portfolio/mercedes-e500
Alternativa3D
demo : alternativaplatform.github.com/Alternativa3D/Demos/ParticlesDemo/
And yes, they all work on an iPad :) A minimal Away3D sketch follows below.
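For illustration, here is a minimal sketch of a GPU-rendered sphere, assuming the Away3D 4.x API (View3D, SphereGeometry, ColorMaterial, Mesh); check the class names against the version you download. Note that Stage3D frameworks require <renderMode>direct</renderMode> in the AIR application descriptor rather than gpu mode.

package {
    import away3d.containers.View3D;
    import away3d.entities.Mesh;
    import away3d.materials.ColorMaterial;
    import away3d.primitives.SphereGeometry;
    import flash.display.Sprite;
    import flash.events.Event;

    public class SphereDemo extends Sprite {
        private var view:View3D;
        private var sphere:Mesh;

        public function SphereDemo() {
            view = new View3D();
            addChild(view);

            // Same 24x24 segment count the question uses with Papervision3D.
            sphere = new Mesh(
                new SphereGeometry(150, 24, 24),
                new ColorMaterial(0xCC2222));
            view.scene.addChild(sphere);

            addEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        private function onEnterFrame(e:Event):void {
            sphere.rotationY += 1; // cheap on the GPU, even at 24x24 segments
            view.render();         // one Stage3D render per frame
        }
    }
}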

Related

MTKView vs GLKView performance on old devices

I am running into weird performance issues on an older device (a 2014 iPad Mini 2) ever since migrating from OpenGLES-based views to MTKView. To summarize, I render a live video preview using MTKView and optionally record videos (as in the RosyWriter sample code). My new app version is written in Swift and supports both Metal and OpenGLES, so I have both OpenGL and Metal rendering code to compare. I also have a legacy version of the app, written in Objective-C, that uses OpenGLES for all preview and rendering. Here are the performance issues I see on the iPad Mini 2:
When using MTKView, if I debug the app in Xcode and zoom the video (using the AVCaptureDevice zoom API, with no Metal code to crop/zoom frames), the frame rate drops significantly (from 30 to 10 fps). If I record video while zooming (which adds a video encoding pipeline on another thread), there are frame drops in the recorded video as well.
When I profile the app with Time Profiler, MTKView is smooth enough with zoom. However, if I then start recording video, the MTKView fps drops again or frames get delayed in the preview, but there is no frame-drop issue in the recording -- the final recorded video is still smooth.
If I run the Release build of the app in the debugger, the behavior is the same as while profiling.
If I switch back to GLKView instead of MTKView, there are no issues in the preview, even in the Debug build. There are still issues when recording -- the preview gets delayed when zooming. But if I profile the app, there are NO delays or frame drops!
Finally, the original legacy version in Objective-C, which uses OpenGLES for everything, has no issues at all.
Now the question is: which tools in Instruments can I use to nail down the exact cause of the frame drops in preview and recording? Can Metal not match the performance of OpenGLES on older devices?

Unity game lags when tested on iOS devices

I recently created a small puzzle game in Unity, just a simple one, no fancy effects or anything. It runs really smoothly when I test it in the Unity editor.
FPS normally caps at 200, and at the largest resolution it's around 80-120 FPS, super smooth. After that, I built an iOS version and tested it on iOS devices, and it's quite laggy. I tested on an iPhone 6+, an iPhone X, and a 9.5-inch iPad, and the outcome is still the same: it lags a bit. Maybe I need to adjust some graphics settings in Unity? I need some advice from you guys. Thanks for reading.
You can try a few things.
At the very beginning of your app, set the target frame rate to 30:
Application.targetFrameRate = 30;
Downgrade the quality settings to medium. Within that, also disable or tone down lighting-related settings if yours is a simple 2D game.
Optimize your art. Pack art using packing tags, and on iOS keep its compression at PVRTC. Only textures that look really bad after compression should be RGB24 or RGBA32. Disable options like Generate Physics Shape (if you're not using it) and Generate Mip Maps.
Have a look at your UI. Anything in the UI that is not interactive (simple images or texts that are not buttons, input fields, etc.) should have raycasting turned off. The Rich Text option on texts should also be off if your app doesn't specifically need it.

Take photos with "portrait" mode (bokeh) in iOS 11 / iPhone 7/8plus in 3rd-party app?

The iPhone 7 plus and 8 plus (and X) have an effect in the native camera app called "Portrait mode", which simulates a bokeh-like effect by using depth data to blur the background.
I want to add the capability to take photos with this effect in my own app.
I can see that in iOS 11, depth data is available. But I have no idea how to use this to achieve the effect.
Am I missing something -- is it possible to turn on this effect somewhere and just get the image with it applied, rather than having to try and make this complicated algorithm myself?
cheers
Unfortunately, Portrait mode and Portrait Lighting aren't open to developers as of iOS 11, so you would have to implement a similar effect on your own. Capturing Depth in iPhone Photography and Image Editing with Depth from this year's WWDC go into detail on how to capture and edit images with depth data.
There are two sample projects on the developer site that show how to capture and visualize depth data using a Metal shader, and how to detect faces using AVFoundation. You could definitely use these to get started! If you search for AVCam in the Guides and Sample Code, they should be the first two that come up (I would post the links, but Stack Overflow is only letting me add two).

A-Frame library limits on mobile?

This plunker shows a simple VR scene with the A-Frame library (a plane + ~10 lights).
It runs great on desktop regardless of the number of lights.
The mobile iOS version loads at 60 fps with 11 lights but shows a blank page with 12+ lights.
The stats display fine; I used WeInRe to inspect the console, and there were no particular warnings.
Is there a limit on the complexity of the scene?
Thanks.
Regards,
JD
A-Frame Version: 0.4.0
Platform / Device: iOS 10.2 / iPhone 6s - Chrome & Safari
Reproducible Code Snippet or URL:
Editable :
https://plnkr.co/edit/Am8rjMdeaPzUWnFKX2i1?p=preview
Fullscreen preview :
https://run.plnkr.co/CgcUZgDUuPfeY15R/
Lights are expensive. I believe three.js has a limit on the number of lights, and there may be hardware constraints as well. The scene may run at 60 fps with just one plane, but every object you add has to factor in all 12 lights, so the scene will quickly degrade.
Check out a deferred renderer for handling many lights; I'm not sure whether it works on mobile. https://github.com/takahirox/aframe-deferred-renderer

cacheAsBitmap has no effect on a Sprite masked with a scrollRect in AIR for iOS

I'm developing a simple kinetic menu UI component for AIR on the iPad. It's basically a lightweight fill-in for a combobox that matches the style of iOS. I have a sprite containing anywhere from 2 to 60 item buttons that pops up and lets you flick/scroll through them, showing only about 7 items at any given time.
My first attempt used a mask over my sprite, moving the menu sprite up and down under the stationary mask. This produced lackluster results on the test device (< 20 fps).
I then tried a blitting solution, leaving the menu sprite off the display list and using BitmapData.draw() to render only the part I needed visible. This produced the best results on my Windows dev platform, but this time the frame rate dropped below 10 fps on the iPad. I assume I was incurring either taxing CPU usage or a GPU readback penalty. Originally I had hoped to run my app at 60 fps, but I've ratcheted my goal down to a more humble 30 fps.
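(A rough sketch of that blitting approach, for reference; menuSprite, menuWidth and visibleHeight are hypothetical placeholders, not the actual code.)

import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.geom.Matrix;
import flash.geom.Rectangle;

// The menu sprite stays off the display list; each frame, only the
// visible slice is drawn into an on-screen Bitmap.
var viewport:BitmapData = new BitmapData(menuWidth, visibleHeight, false, 0xFFFFFF);
addChild(new Bitmap(viewport));

function renderMenu(scrollY:Number):void {
    var m:Matrix = new Matrix();
    m.translate(0, -scrollY); // shift so the wanted slice lands at (0, 0)
    viewport.lock();
    viewport.fillRect(viewport.rect, 0xFFFFFF);
    // This draw() call is what gets expensive on iOS: a CPU raster
    // plus a potential GPU readback in gpu render mode.
    viewport.draw(menuSprite, m, null, null,
                  new Rectangle(0, 0, menuWidth, visibleHeight));
    viewport.unlock();
}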
Which brings me to my third attempt at this UI component, using the sprite's .scrollRect masking feature in conjunction with .cacheAsBitmap. Again, the observed behaviors differ wildly between AIR on Windows and on iOS. On Windows it only redraws the part of the menu sprite bounded by the dimensions of the scrollRect, as it should. On iOS I can touch the area of the screen above or below the visible area of the menu sprite and still drag the menu, even though my finger is over "empty" space! The performance here is decent, hovering between 19 and 25 fps, and would almost certainly hit a solid 30 if it worked as it does on Windows.
Does anyone have any ideas, either about the scrollRect feature's behavior in AIR for iOS or about a better way of implementing an iOS-native-style gliding menu in AIR for iOS?
Note: the above methods were tried in both CPU and GPU mode, and CPU mode performed vastly better. I used AIR 2.7 installed on top of Flash Pro CS5.5, with FlashDevelop as my IDE.
http://esdot.ca/site/2011/fast-rendering-in-air-3-0-ios-android#comment-10
A really nice guy from the above link: "Ya, scrollRect is basically a no-go on mobile, basically forget that API even exists. Believe it or not… old school masking is the way to go. Round and round we go!"
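For what it's worth, the "old school masking" he's referring to might look like the sketch below: a plain rectangular Shape assigned as the sprite's mask, with cacheAsBitmap enabled on both objects so the runtime can composite them as cached bitmaps instead of re-rasterizing every frame (menuSprite, menuWidth, visibleHeight and scrollY are again hypothetical names).

import flash.display.Shape;

var maskShape:Shape = new Shape();
maskShape.graphics.beginFill(0x000000);
maskShape.graphics.drawRect(0, 0, menuWidth, visibleHeight);
maskShape.graphics.endFill();
addChild(maskShape);

menuSprite.mask = maskShape;

// Caching both the masked sprite and the mask as bitmaps lets them
// be composited as textures rather than redrawn with the vector
// rasterizer each frame.
menuSprite.cacheAsBitmap = true;
maskShape.cacheAsBitmap = true;

// Scrolling is then just moving the sprite under the fixed mask:
menuSprite.y = -scrollY;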
