MTKView vs GLKView performance on old devices - iOS

I am running into weird performance issues on an older device (a 2014 iPad mini 2) ever since migrating from OpenGL ES-based views to MTKView. To summarize: I render a live video preview using MTKView and optionally record video as well (as in the RosyWriter sample code). The new version of the app is written in Swift and supports both Metal and OpenGL ES, so I have both rendering paths to compare. I also have a legacy version of the app, written in Objective-C, that uses OpenGL ES for all preview and rendering. Here are the performance issues I see on the iPad mini 2:
When using MTKView, if I debug the app in Xcode and zoom the video (using the AVCaptureDevice zoom API; no Metal code crops or scales frames), the frame rate drops significantly (from 30 to 10 fps). If I record video while zooming (which adds a video encoding pipeline on another thread), there are frame drops in the recorded video as well.
When I profile the app with Time Profiler, MTKView is smooth enough while zooming. However, if I then start recording video, the MTKView fps drops again or preview frames get delayed, yet the recorded video has no dropped frames -- the end result is still smooth.
If I run the Release configuration under the Debug scheme, the behavior is the same as while profiling.
If I switch back to GLKView instead of MTKView, there are no issues in the preview, even in a Debug build. There are still issues when recording -- the preview lags while zooming. But if I profile the app, there are no delays or frame drops at all!
Finally, the original legacy version, written in Objective-C and using OpenGL ES throughout, has no issues at all.
Now the question is: which tools in Instruments can I use to pin down the exact causes of the frame drops in the preview and the recording? And can Metal simply not match OpenGL ES performance on older devices?
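One way to get hard numbers out of Instruments (not from the original thread, and assuming an OS new enough for os_signpost) is to bracket the render path with signposts and read the intervals alongside Time Profiler or Metal System Trace. A minimal sketch; the subsystem and category names are made up:

    import os.signpost
    import MetalKit

    let renderLog = OSLog(subsystem: "com.example.camera", category: "render") // hypothetical names

    func draw(in view: MTKView) {
        let spid = OSSignpostID(log: renderLog)
        os_signpost(.begin, log: renderLog, name: "frame", signpostID: spid)
        // ... encode and commit the Metal command buffer for this video frame ...
        os_signpost(.end, log: renderLog, name: "frame", signpostID: spid)
    }

The begin/end pairs show up as intervals in Instruments, which makes it easier to see whether the preview or the encoder is the one missing its deadline.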

Related

Metal App FPS drops in Debug but fine in Instruments while profiling

I am using Metal to render live video frames, plus a custom control (a circular slider) for zooming that I implemented using the Quartz 2D API. When I run the app in the debugger, I see the FPS drop from 30 to sometimes 11, and zooming is not smooth on older devices such as the iPad mini 2. When I then run the code under Time Profiler, surprisingly, there is no fps drop -- the app runs smoothly in the profiler. How do I find out what is causing the fps drop in debug?
It's probably the Metal validation layer that's active for your debug scheme. It's not surprising that performance is generally worse when debugging (lack of optimizations, asserts being enabled, and so on).
If you want similar Metal performance while debugging, you can try disabling Metal API Validation in the scheme settings. But then, of course, you lose the actual debugging benefit of having your use of Metal validated.
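As a rough on-device check that doesn't depend on Xcode at all, you could log frame pacing with a CADisplayLink. A minimal sketch; the 60 Hz target is an assumption:

    import UIKit

    final class FPSMonitor: NSObject {
        private var link: CADisplayLink?
        private var last: CFTimeInterval = 0

        func start() {
            link = CADisplayLink(target: self, selector: #selector(tick(_:)))
            link?.add(to: .main, forMode: .common)
        }

        @objc private func tick(_ link: CADisplayLink) {
            if last > 0 {
                let dt = link.timestamp - last
                if dt > 1.5 / 60.0 { // flag frames that clearly missed a 60 Hz deadline
                    print(String(format: "slow frame: %.1f ms", dt * 1000))
                }
            }
            last = link.timestamp
        }
    }

Logging slow frames this way tells you whether the drops happen with validation on, off, or only under the debugger.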

iOS games with random frame drops (including my game)

I'm almost finished with my iOS game, written in Swift + SpriteKit.
It's quite a simple game: 30-32 nodes at most, and only one thing has physics. The rest is a few animated clouds (around six). CPU usage is around 2-3%, with a max RAM usage of 75-80 MB.
On top of that, I also get frame drops when changing from one scene to another. Why could that be?
(I'm pre-loading all the textures and sounds during game init, not in the scenes.)
When I run the simulator for anything from the 5S up to the 6S Plus, I don't see any frame drops there. So that's weird -- it looks like it's not my game but my iPhone 6S?
I also have other games from different developers installed on the same device, and I frequently get random frame drops in them too: the game lags for 2-3 seconds and then comes back to 60 fps.
Does anyone know if this is something that started after some iOS update? I was even wondering whether some background service is running that's killing my phone -- Facebook, WhatsApp, Messenger, etc.
Is there any way I can check what's going on?
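(For reference, the pre-loading the poster describes is usually done with SKTexture.preload; a minimal sketch with made-up asset names:)

    import SpriteKit

    // Hypothetical texture names; preload once during game init so scene
    // transitions don't stall on texture uploads.
    let textures = ["cloud1", "cloud2", "player"].map { SKTexture(imageNamed: $0) }
    SKTexture.preload(textures) {
        // Textures are resident now; safe to present the first scene.
    }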
Could this be caused by newer versions of SpriteKit defaulting to the Metal renderer instead of OpenGL? For example, do your problems go away when PrefersOpenGL=YES is added to Info.plist? I covered a bit of this performance issue in my blog post about a SpriteKit repeat shader. Note that you should only test on an actual iOS device, not the simulator.

Changes in recorded high-fps video files from iOS 8.0.2 to 8.1?

My app works with 240 fps video from the iPhone 6/6 Plus camera, combining multiple overlaid AVMutableCompositionTracks into one composition for export. Both in-app playback and export can happen either rendered down to 30 fps or at the original fps.
For videos taken on 8.0.2 or earlier, there is no issue whatsoever exporting at the original fps while essentially preserving all characteristics of the original file (except for the overlay). For videos taken on 8.1, there is about a 50% chance that the composition gets corrupted and fails to export, or, when played back in-app, freezes about a second into playback (while the audio keeps playing for a while) before finally stopping with AVFoundationErrorCode -11819, "AVErrorMediaServicesWereReset".
If frameDuration is instead set to 1/30, 1/60, even 1/200, or sometimes up to around 1/220 for a 240 fps (well, 239.84...) file, there is no issue with playback or export. Export also seems a bit more tolerant than playback and sometimes goes through even when playback fails.
I have noticed changes in other apps on 8.1 as well. MoviePro, for example, now records at around 207-209 fps when set to 240 fps, and sometimes 59 when set to 60, while Apple's SloPoke sample is more variable too, usually recording at various values between 235 and 236. Apple's Camera app still records at 239.84, though.
Does anyone know what has changed and how to get around it? It is obviously something about fps variability, but how do I handle it? Or could it simply be a bug?
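For context, the frameDuration being adjusted here is the one on AVMutableVideoComposition. A minimal sketch of the 60 fps workaround described above (a real composition also needs renderSize and instructions, omitted here):

    import AVFoundation

    let composition = AVMutableVideoComposition()
    // 1/60 s per frame instead of the native ~1/239.84 s sidesteps the
    // 8.1 playback/export failures described above.
    composition.frameDuration = CMTime(value: 1, timescale: 60)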
This appears to be a bug in AVPlayer and AVAssetExportSession. The same compositions export with no issue using AVAssetWriter.
Apple's newly updated sample project AVCustomEdit, which uses a custom OpenGL compositor for transitions between clips, demonstrates the same issue when modified to take an iOS 8.1 240 fps video and play it back at frame rates near 240.
I can't believe I spent something like 30 hours trying to figure out what was wrong when sidestepping it was this easy.
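A minimal sketch of the AVAssetWriter route (single video track, audio and error handling omitted; the output settings and dimensions are placeholders):

    import AVFoundation

    func export(asset: AVAsset, to url: URL) throws {
        let reader = try AVAssetReader(asset: asset)
        guard let track = asset.tracks(withMediaType: .video).first else { return }
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ])
        reader.add(output)

        let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecH264, // placeholder settings
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        input.expectsMediaDataInRealTime = false
        writer.add(input)

        reader.startReading()
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)

        let queue = DispatchQueue(label: "export.queue")
        input.requestMediaDataWhenReady(on: queue) {
            while input.isReadyForMoreMediaData {
                guard let sample = output.copyNextSampleBuffer() else {
                    input.markAsFinished()
                    writer.finishWriting {} // completion handling omitted
                    return
                }
                input.append(sample)
            }
        }
    }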
"I have noticed changes in other apps with 8.1, MoviePro for example now records at around 207-209 fps when set to 240 fps"
This is a bug in the MoviePro app that is fixed in an upcoming update, still to be submitted. On iOS 8.1, the pending update records very close to 240 fps at all bit rates.

How to screen record the iOS simulator at 60 fps?

It turns out that capturing video from the screen is a hard task on the Mac. I have a small game running in the simulator and want to make a screencast of the gameplay for YouTube. Since it's a fast-paced scroller game, the video must be recorded at 60 fps to look good.
I know the actual video on YouTube, for example, is just 24 to 30 fps, but there each such slow frame is blended with the next.
When capturing the simulator at a frame rate lower than 60 fps, the result looks very jagged, since every frame is razor sharp with no blending.
I tried a couple of Mac screen recorders, but none of them were able to capture 60 fps video from the simulator; the frames in the resulting video looked as if the app had taken a bunch of screenshots and stitched them together into a video container.
But since there are great demo videos on YouTube showing fast-paced gameplay of iOS apps without just filming the screen with a video camera, I wonder what kind of application they use to get a smooth screen capture.
Hopefully someone who has already been through this problem can point out some solutions.
I've had good results screen recording from the simulator using Snapz Pro X from Ambrosia Software:
http://www.ambrosiasw.com/utilities/snapzprox/
One problem you're likely to have is that the simulator renders iOS's OpenGL graphics in software, so unless you have a really powerful Mac, the simulator probably won't be able to run your game at 60 fps anyway.
It's possible that the videos you've seen used the HDMI video out on the iPhone to mirror the screen from the device into a video capture card on the computer. That would likely perform much better, because the Mac wouldn't have to both generate and record the graphics simultaneously.
I remember watching a video of the Aquaria developers talking about how they recorded their gameplay videos. Essentially, the game recorded the input from the controller/keyboard while the game was played normally. They could then play back the game they had just played, but one frame at a time, with each frame rendered out to a file as it went. All those frames are then composited together and, bam, a full 60 fps video with perfectly rendered graphics. A bit overkill, but it's a nice solution.
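That replay idea as a tiny sketch (every type and callback here is hypothetical, not from Aquaria): record the inputs per frame during play, then re-run the simulation with a fixed timestep and dump each frame to disk:

    struct GameState { /* whatever the game tracks */ }
    struct InputEvent { let frame: Int; let key: String }

    func replayAndCapture(events: [InputEvent],
                          totalFrames: Int,
                          step: (inout GameState, [InputEvent]) -> Void,
                          render: (GameState, Int) -> Void) {
        var state = GameState()
        for frame in 0..<totalFrames {
            let inputs = events.filter { $0.frame == frame }
            step(&state, inputs)  // advance the simulation by exactly 1/60 s
            render(state, frame)  // write frame N to disk, e.g. frame_0042.png
        }
    }

Because rendering no longer has to keep up with real time, every frame comes out perfect no matter how slow the capture is.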
A program that is able to record at 60 fps is Screenflick.

AV Foundation camera preview layer gets zoomed in, how to zoom out?

The application I am currently working on has, as its main functionality, continuous scanning of QR/bar codes using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize the AVCaptureSession, AVCaptureVideoDataOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html.
My problem is that when I run the camera preview, the image I see through the video device is much larger (about 1.5x) than the image seen through the iPhone's still camera. Our customer needs to hold the iPhone around 5 cm from the bar code when scanning, but at that distance the whole QR code isn't visible and the decoding fails.
Why does the video camera on the iPhone 4 enlarge the image (as seen through the AVCaptureVideoPreviewLayer)?
This is a function of the AVCaptureSession video preset, accessible through the sessionPreset property. For example, after configuring your captureSession, but before starting it, you would add:
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
See the AVCaptureSession documentation for the sessionPreset property.
The default preset for video is 1280x720 (I think), which is a lower resolution than the maximum the camera supports. By using the "Photo" preset, you're getting the raw camera data.
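(In Swift, the same configuration looks roughly like this; input/output setup and error handling are omitted:)

    import AVFoundation

    let session = AVCaptureSession()
    // Ask for the full-sensor "photo" field of view instead of the
    // default (cropped) video preset.
    if session.canSetSessionPreset(.photo) {
        session.sessionPreset = .photo
    }
    // ... add inputs and outputs, then:
    session.startRunning()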
You see the same behaviour with the built-in iPhone Camera app. Switch between still and video capture modes and you'll notice that the default zoom level changes: you get a wider view in still mode, whereas video mode zooms in a bit.
My guess is that continuous video capture needs to use a smaller area of the camera sensor to work optimally. If it used the whole sensor, perhaps the system couldn't sustain 30 fps. Using a smaller area of the sensor gives the effect of "zooming in" on the scene.
I am answering my own question again. This was not answered even in the Apple developer forum, so I filed a technical support request with Apple directly, and they replied that it is a known issue that will be fixed in a future release. So there is nothing we can do but wait and see.
