Xcode GPU shader profiler

The Xcode -> GPU Capture Frame -> GPU shader profiler doesn't work. I capture a frame and open the Metal compute shader source in the GPU shader profiler, but I can't see a per-line performance profile for the shader code.

Set your project's "iOS Deployment Target" to a higher version. I had this issue; in my case it was set to 9.x, and when I set it to 11.4 I could see the percentages on source lines.
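If you prefer to keep this in source control rather than in the target editor, the same setting can live in an .xcconfig file. A minimal sketch (IPHONEOS_DEPLOYMENT_TARGET is the standard Xcode build setting behind "iOS Deployment Target"; the 11.4 value simply mirrors the version reported to work above, not a documented minimum):

// Sketch of an .xcconfig entry; 11.4 mirrors the version that worked above.
IPHONEOS_DEPLOYMENT_TARGET = 11.4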

Related

SCNView uses OpenGL on iPhone 6 in simulator

In the simulator, SceneKit uses OpenGL for SCNView on iPhone 6, which changes the lighting model from PBR to Phong:
Error: Physically based lighting model is not supported by the OpenGL renderer, using Phong instead
Is there a way to fix it?
Simulators for newer iPhones render the model correctly with PBR.
SceneKit uses Metal in the simulator since Xcode 11 running on macOS Catalina; on earlier versions of the system, only OpenGL was supported.
In Xcode 11, Simulator adds support for Metal development. You can write iOS and tvOS apps that use Metal and test them in the Simulator, gaining the benefits of hardware acceleration on the Mac during development of your app. If you use frameworks built on top of Metal, such as SceneKit, Core Animation, and UIKit, you'll also see better performance when testing your apps in Simulator.
(https://developer.apple.com/documentation/metal/developing_metal_apps_that_run_in_simulator)
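A quick way to confirm which renderer you will get is to check for a Metal device at runtime. A minimal Swift sketch (my addition, not part of the quoted answer): MTLCreateSystemDefaultDevice() returns nil wherever Metal is unavailable, such as the simulator on pre-Catalina macOS, which is exactly the case where SceneKit falls back to OpenGL and Phong.

import Metal

// Sketch: a nil result means there is no Metal device in this environment
// (e.g. the simulator before Xcode 11 / macOS Catalina), so SceneKit
// will fall back to OpenGL and PBR will be replaced with Phong.
if let device = MTLCreateSystemDefaultDevice() {
    print("Metal available: \(device.name)")
} else {
    print("No Metal device; expect the OpenGL/Phong fallback")
}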

HLSL rendering to a larger-than-screen target fails completely

I'm rendering a 3D scene using an HLSL shader (DirectX 9). If the render target size is less than or equal to my window size, everything works fine, but if I render to a target at 2x scale (render scale), nothing shows up. If I use OpenGL instead, everything works fine. What could cause DirectX/HLSL to render nothing to a target larger than the window?
Looks like the culprit was screen antialiasing. (Likely because in Direct3D 9 a render target needs a depth-stencil surface at least as large as itself, with a matching multisample type; an oversized target paired with the window-sized, antialiased depth buffer draws nothing.)

Does Xcode support the shader debugger on iPhone when debugging Metal shaders?

My macOS version: macOS Mojave, Version 10.14.4
My iPhone device: iPhone X, iOS 11.2.2
My Xcode version: Xcode 10.2.1
In my scheme -> Options:
1. GPU Frame Capture: Automatically Enabled
2. Metal API Validation: Enabled
When I capture a GPU frame to debug a Metal shader, the shader debugger icon is grayed out, with the message: "Shader Debugger is not supported, update your operating system in order to support the shader debugger."
I want to know whether Xcode supports the shader debugger on iPhone, and if it does, how to use it.
Thank you!

Disordered texture displayed on iOS device powered by Unity3D

Screenshot on iPhone 6s: (screenshot not reproduced)
Unity3D version: 5.3.5f1
Texture format: TrueColor (not compressed)
iOS device: iPhone 6s, iOS 9.3.4
This issue only occurred recently, and only on our trunk branch; other branches do not have it.
What we are sure about:
Shaders are correct.
Meshes and the texture coordinates (UVs) of each vertex are correct.
Using the Unity Profiler, we can also confirm:
Memory in use is not too large (compared with other branches).
Using GPU Frame Capture, I found that the texture is already incorrect in the captured frame (capture screenshot not reproduced).
Has anyone ever solved this issue?

AIR iOS Flash: StageQuality.LOW doesn't seem to work

I've read a few articles about optimizing Flash performance in AIR for iOS, and many of them suggest using StageQuality.LOW. But it seems AIR (I'm using 3.6) doesn't allow StageQuality.LOW; it must be either HIGH or BEST quality.
Am I missing something? Is there any way to switch to LOW quality mode?
In fact, there is a difference in rendering performance; StageQuality affects how things like bitmaps and text fields are rendered. At lower quality settings, vector graphics are not antialiased and bitmap graphics are not smoothed. Higher quality settings produce better rendering of scaled bitmaps, but they are also computationally more expensive; in particular, when rendering scaled video, higher quality settings can reduce the frame rate.
What kind of work are you doing? If you are using BitmapData, I recommend the following.
AIR 3.3 added a new drawing method called drawWithQuality, which lets you draw assets at a quality independent of the stage quality setting used in your app.
In the past, you had to modify the stage quality to change the quality of BitmapData.draw. BitmapData.drawWithQuality is an extension of BitmapData.draw that adds an optional parameter to specify the quality of vector rendering (a usage sketch follows the list of supported values below).
function drawWithQuality(source:IBitmapDrawable, matrix:Matrix = null, colorTransform:ColorTransform = null, blendMode:String = null, clipRect:Rectangle = null, smoothing:Boolean = false, quality:String = null):void
Following are the supported quality values:
StageQuality.LOW
StageQuality.MEDIUM
StageQuality.BEST
StageQuality.HIGH_8X8_LINEAR
StageQuality.HIGH_16X16
StageQuality.HIGH_16X16_LINEAR
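As a usage illustration (my own sketch, not part of the original answer), rasterizing vector art at LOW quality without touching stage.quality could look like the following; makeVectorArt() is a hypothetical helper that returns a Sprite:

import flash.display.BitmapData;
import flash.display.Sprite;
import flash.display.StageQuality;

// Sketch: draw a display object into a BitmapData at LOW quality,
// independent of the app's stage quality setting (requires AIR 3.3+).
var sprite:Sprite = makeVectorArt(); // hypothetical helper
var bmp:BitmapData = new BitmapData(int(sprite.width), int(sprite.height), true, 0x00000000);
bmp.drawWithQuality(sprite, null, null, null, null, false, StageQuality.LOW);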
It does work; it just always shows up as HIGH in the AIR simulator. It will change on an actual device. Bitmaps will smooth even at low quality in GPU mode as well.
