I'm trying to build code for older iOS devices that have ARMv6 processors and support only OpenGL ES 1.1. The function glBlendFuncSeparate crashes. I found this post on Stack Overflow:
iPhone OpenGL ES missing functions should be there - glBlendFuncSeparate etc
So I added the OES suffix to the end of the function name, and it still crashes. I've double-checked to make sure I am including OpenGLES/ES1/glext.h, and I am.
Does anyone know what the deal is with this? Is this function supported in OpenGL ES 1.1 on iOS? If so, how do you use it?
After much research and work, I've found that glBlendFuncSeparate is not supported in OpenGL ES 1.1 on older iOS devices.
The easiest workaround we found (though it does involve more draw calls) is to use glColorMask. You mask off the alpha channel, set your color blend mode, and draw once; then you mask off the color channels, set your alpha blend mode, and draw again.
Hope this helps others with the same problem!
So I've got a weird issue. I'm porting a modern source port of an old game engine to iOS and tvOS. It was written using OpenGL, and I found a fork of it that uses OpenGL ES, and I've been able to Frankenstein the two together to the point where it now runs successfully on iOS and tvOS devices.
Here's the weird part - parts of the game do not render correctly on my iPhone X but they do render correctly on my iPad Air 2 and Apple TV (4th gen).
I notice in the flurry of messages in the output window that, at the spot where the engine outputs renderer information, the iPhone X says
OpenGL version: OpenGL ES 2.0 Metal 58.4
whereas on, say, the iPad Air 2 it says
OpenGL version: OpenGL ES 2.0 A8X GPU - 130.1
"OpenGL ES Metal" sounds like "Jumbo Shrimp" to me since those are obviously not the same thing. If I were to guess, I'd say that on the iPhone X the OpenGL ES drivers are running on top of some sort of Metal translation layer, which may be Apple's long-term plan for having some sort of future-proofing in the wake of the OpenGL ES deprecation.
But for whatever reason it's breaking my game engine, and although I'm decent at making code work together, I don't know enough about graphics programming to know where to look to fix things.
Obviously the right answer is to fix whatever is causing the issue but as a short term fix I'm curious if there's any way to get a game on iOS to not use OpenGL ES on top of Metal? (if that is indeed what is happening)
So, like Brad Larson says below, the answers were: yes, OpenGL ES is running on top of Metal, and no, it can't be avoided. However, for future reference, if anyone else runs into this problem, I solved the real underlying issue with the help of another SO answer:
WebGL GLSL fragment shader not working on iOS
Basically, the floating-point precision of the shaders needs to be upgraded from lowp and mediump to highp.
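For example, in a fragment shader (a sketch; the uniform and varying names are placeholders, not from the engine in question):

```glsl
// Before: "precision mediump float;" can quantize or overflow
// intermediate values on some GPUs, producing visibly wrong output.

// After: request full float precision for the whole shader.
precision highp float;

varying highp vec2 v_texcoord;
uniform sampler2D u_texture;

void main() {
    gl_FragColor = texture2D(u_texture, v_texcoord);
}
```

Note that highp support in fragment shaders is optional in the OpenGL ES 2.0 spec, so this is worth guarding with `GL_FRAGMENT_PRECISION_HIGH` if you target a wide range of devices.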
Yes, OpenGL ES is effectively an interface to Metal on recent iOS versions (since iOS 10, if I'm not mistaken).
From the Roblox Metal retrospective:
It is also worth noting that there is no GL driver on iOS 10 on the newest iPhones, apparently. GL is implemented on top of Metal, which means using OpenGL is only really saving you a bit of development effort – not that much, considering that the promise of “write once, run anywhere” that OpenGL has does not really work out on mobile.
You can verify this yourself by profiling OpenGL ES code on modern iOS versions: you'll see Metal-specific operations like -[MTLIOAccelCommandQueue submitCommandBuffers:count:] appearing in profiles of your OpenGL ES applications.
Near as I can tell, there's no way to circumvent this; it's how rendering is architected on modern iOS versions. That said, I've not seen this alter the behavior of my OpenGL ES code. I have seen different iOS GPUs produce slightly different rendering behavior due to their hardware, so it's possible you're encountering something that is device-specific.
Check for the usual suspects involving precision qualifiers, Z-fighting, division-by-zero in shaders, etc. and see if one of those is messing up your rendering. Those are all places where I've seen new device generations disrupt my rendering.
I recently started building a game for iOS using my iPhone 5s for testing.
However, I recently decided to try my iPhone SE to see how it looks: all of the materials on all of the objects appear to be missing, but the trails for the particle effects and the line renderers still have their color.
It could be a shader issue or a problem with the graphics API Unity is using. If you're not using the standard shader, make sure that your shader is compatible with mobile devices. Also make sure that it's included in the build by creating a folder named Resources and moving your shader into it.
If you're using one of the standard shaders that come with Unity, then the issue is likely not the shader but the selected graphics API. It's likely using Metal, which is causing that issue. Use OpenGL ES instead of Metal.
Disable Auto Graphics API, then change the iOS Graphics API to OpenGLES2 or OpenGLES3 in Unity's Player Settings.
Running a fresh build of the exact same application on iOS 11.2 and 11.3: everything works fine on 11.2, but on 11.3 it appears that none of the materials are rendered onto the ARKit face node, which uses an SCNMorpher and blend-shape values to update.
I'm going to dig into this more and report back a fix if I can find one, but I thought I'd drop a beacon here to see if anyone else is having similar issues, and if so, if they've found any solutions, or if anyone has any ideas in general.
It is my understanding that minor version bumps should be fully backwards compatible. Is that correct?
Thanks
Update 1: This seems to affect both ARKit and SceneKit scenes.
Update 2: This seems to be related to both materials and the way lighting is handled differently in 11.3. By changing the lighting to be rendered farther away, a similar effect to 11.2 can be achieved. It's almost as if the base units for distance have changed from meters to inches or something. Confusingly, changing the lighting distance now seems to affect only a device running 11.3, not 11.2. The problem now is that the rendering is completely matte. Further isolation reveals that specular renders normally in 11.2, but when specular is set in 11.3, either as a UIImage or a UIColor, it simply has no effect. I'm going to try setting the specular as something else, like a CALayer or a CGImage, and see if either of those works, despite all of the above having claimed support in the documentation.
iOS 11.3 seems to change the default lightingModel, which is the root cause of all the woes.
Simply setting the materials' lightingModel explicitly to what was previously the implicit default resolved all pains.
For example,
baseNode.geometry?.materials[0].lightingModel = .blinn
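If the scene has many nodes, something like the following (a hypothetical helper, not from the original answer) walks the node tree and pins every material to the old default:

```swift
import SceneKit

// Hypothetical helper: explicitly set the pre-11.3 default lighting
// model on every material in the node's subtree, so the implicit
// default chosen by the OS no longer matters.
func pinLightingModel(of node: SCNNode) {
    node.geometry?.materials.forEach { $0.lightingModel = .blinn }
    node.childNodes.forEach { pinLightingModel(of: $0) }
}

// Usage: pinLightingModel(of: sceneView.scene!.rootNode)
```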
With Metal, does it matter whether or not I have anything attached to the color targets in order to use early_fragment_tests on iOS? When I add early_fragment_tests to my frag shader, then it seems as though all my fragments are rejected on iOS. It works properly on OS X. So far the only thing I can think of is that it might have to do with the fact that I don't have any color targets attached. Would that matter? I'm writing to offscreen buffers instead.
Thanks
Bob
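For reference, the qualifier in question looks like this in Metal Shading Language (a sketch with made-up function and argument names; it only illustrates the attribute, not an answer to the no-color-attachment behavior):

```metal
#include <metal_stdlib>
using namespace metal;

// [[early_fragment_tests]] asks the GPU to run the depth/stencil test
// before this shader executes, so rejected fragments never run it.
// Any depth value computed in the shader is ignored in this mode.
[[early_fragment_tests]]
fragment float4 passthroughFragment(float4 position [[position]])
{
    return float4(1.0, 1.0, 1.0, 1.0);
}
```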
I enabled multisampling support on iOS 4.3.3 with OpenGL ES 2, and the rendering result is awful, as if the color were RGB565, not ARGB8888.
The thing is, if I either turn multisampling off or deploy the same ipa to an iOS 5 device (which indicates that I did turn on multisampling correctly), the problem does not occur, though turning multisampling off makes it very ugly (which also indicates that multisampling did work in most cases).
The test case is very simple: just render a quad with a texture attached in orthographic projection mode; the texture's color format is RGBA8888.
Has anyone met the same problem before? Is this a bug in Apple's SDK?
BTW, the SDK I used is the one shipped with Xcode 4.3.2, and the iOS deployment target is set to 4.0.
Turns out the root cause is that eaglLayer.drawableProperties was set to kEAGLColorFormatRGB565.
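The corresponding fix is to request an RGBA8 drawable when configuring the layer, along these lines (Objective-C sketch, assuming the view's backing layer is a CAEAGLLayer):

```objc
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
// Ask for an 8-bit-per-channel color buffer instead of RGB565.
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking : @NO,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};
```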
But still, why is the result so different when turning multisampling on/off?
I wondered if this was caused by the different color formats of the sampling buffer (RGBA8) and the surface buffer (RGB565), so I changed the sampling buffer to GL_RGB565, but the problem was still not solved.
Maybe something is not well implemented in glResolveMultisampleFramebufferAPPLE? Anyway, the problem does not exist on iOS 5 devices.