iOS fragment shaders with multiple outputs

Is it possible to write GLSL ES fragment shaders under iOS that generate multiple RGB outputs and have (for example) one sent to the screen and one sent to a texture?
Under desktop GLSL I guess this would be done by writing to gl_FragData[i] from the fragment shader. However, it seems that GLSL ES 2.0 only supports a single color output, gl_FragColor.

OpenGL ES 2.0 does not support FBOs with multiple render targets, so GLSL ES has no way to express multiple color outputs either.
NVIDIA has an extension for it (GL_NV_draw_buffers), but obviously that only works on NVIDIA hardware, which iOS devices do not use.
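For reference, on hardware that does expose that extension, a GLSL ES fragment shader with two color outputs would look roughly like the sketch below; the varying and uniform names are illustrative, and this will not run on iOS GPUs:

#extension GL_NV_draw_buffers : require
precision mediump float;

varying vec2 v_texCoord;      // illustrative name
uniform sampler2D u_texture;  // illustrative name

void main()
{
    vec4 color = texture2D(u_texture, v_texCoord);
    gl_FragData[0] = color;                       // first color attachment
    gl_FragData[1] = vec4(1.0 - color.rgb, 1.0);  // second color attachment
}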

Related

What is the difference between shaders with the extensions .shader, .vsh, and .fsh?

What is the difference between files named "name.shader", "name.vsh", and "name.fsh" in SceneKit? When I use some shaders in my project, my model renders as a purple mask. What should I do?
There are three kinds of shaders when working with SceneKit.
As in every OpenGL app, there are vertex shaders and fragment shaders. Vertex shaders often have the .vert or .vsh extension and fragment shaders often have the .frag or .fsh extension. These shaders are used with the SCNProgram class.
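A minimal sketch of that usage in Objective-C, assuming vertexSource and fragmentSource are NSStrings holding the contents of the .vsh and .fsh files (the variable names are illustrative):

SCNProgram *program = [SCNProgram program];
program.vertexShader = vertexSource;      // GLSL ES vertex shader source
program.fragmentShader = fragmentSource;  // GLSL ES fragment shader source
material.program = program;               // material is an SCNMaterial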
In addition, SceneKit exposes the concept of shader modifiers, which often have the .shader extension. Shader modifiers can affect either a vertex or fragment shader and are used with the SCNShadable protocol.
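A shader modifier is just a GLSL snippet attached to one of SceneKit's entry points; a sketch, where the snippet tints the final color:

material.shaderModifiers = @{
    SCNShaderModifierEntryPointFragment :
        @"_output.color.rgb = mix(_output.color.rgb, vec3(1.0, 0.0, 0.0), 0.3);"
};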
These file extensions are just conventions and could really be anything you want. (As for the purple rendering: that is typically what SceneKit draws when a shader fails to compile, so check the console for the shader compiler log.)

What is the subset of OpenGL ES 3.0 that SceneKit supports?

In the documentation of SCNView it is stated that:
SceneKit supports OpenGL ES 3.0, but some features are disabled when rendering in an OpenGL ES 3.0 context
I could not find anywhere which features are disabled. I wanted to use my own shader with SceneKit (assigning an SCNProgram to my material) and I tried to use a 3D texture. But I got the following error:
SceneKit: error, C3DBaseTypeFromString: unknown type name 'sampler3D'
So I'm guessing that 3D textures are among the disabled features, but I could not find confirmation anywhere. Do I have to give up on SceneKit and do all my rendering manually in OpenGL just to use 3D textures?
Bonus question: why would Apple support only a subset of OpenGL ES 3.0 in SceneKit when iOS fully supports it?
Some features of SceneKit don't work in an ES3 context. You should still be able to use all ES3 features in your OpenGL code.
This looks like an error in SceneKit detecting the uniform declaration for use with its higher-level APIs... so you won't be able to, say, bind an SCNMaterialProperty to that uniform with setValue:forKey:. However, you should still be able to use the shader program; you'll just have to bind the texture yourself with glActiveTexture/glBindTexture, inside a block you set up with handleBindingOfSymbol:usingBlock:.
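A sketch of that manual binding, assuming the shader declares uniform sampler3D volume and that volumeTexture is a GLuint you created yourself with glTexImage3D (both names are illustrative):

[program handleBindingOfSymbol:@"volume"
                    usingBlock:^(unsigned int programID,
                                 unsigned int location,
                                 SCNNode *renderedNode,
                                 SCNRenderer *renderer) {
    glActiveTexture(GL_TEXTURE0);                 // select texture unit 0
    glBindTexture(GL_TEXTURE_3D, volumeTexture);  // bind the 3D texture
    glUniform1i(location, 0);                     // point the sampler at unit 0
}];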

Is it possible to read floats out of an OpenGL ES framebuffer via the iOS texture cache API?

This is related to OpenGL ES 2.0: glReadPixels() with float or half_float textures.
I want to read out the float values from a framebuffer object after rendering.
On iOS, the following
GLint ext_format, ext_type;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &ext_format); // typically GL_RGBA
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &ext_type);     // GL_UNSIGNED_BYTE on iOS
really just tells us that glReadPixels only allows GL_UNSIGNED_BYTE data to be read out.
Is there a way to use the texture cache technique described in this article to get around this?
The back story is that I am trying to implement a general matrix multiplication routine for arbitrary-sized matrices (e.g., 100,000 x 100,000) using an OpenGL ES 2.0 fragment shader (similar to Dominik Göddeke's trusty ol' tutorial example). glReadPixels is not being particularly cooperative here because it converts the framebuffer's floats to GL_UNSIGNED_BYTE values, causing a loss of precision.
I asked a similar question, and I think the answer is no, if only because texture caches (as an API) use CoreVideo pixel buffers, and those don't currently support float formats.
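Not part of the original answer, but the usual ES 2.0 fallback is to pack each float into the four channels of an RGBA8 render target in the fragment shader and reassemble it on the CPU after glReadPixels. A sketch of the shader side, assuming values have already been mapped into [0, 1):

// Encode a float in [0, 1) across the four bytes of gl_FragColor.
// On the CPU, decode as dot(rgba / 255.0, vec4(1.0, 1.0/255.0, 1.0/65025.0, 1.0/16581375.0)).
vec4 packFloat(float v)
{
    vec4 enc = fract(vec4(1.0, 255.0, 65025.0, 16581375.0) * v);
    enc -= enc.yzww * vec4(1.0/255.0, 1.0/255.0, 1.0/255.0, 0.0);
    return enc;
}

Each output pixel then carries exactly one value, at roughly 24 bits of precision.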

Why does WebGL blending not support the COLOR_LOGIC_OP mode?

I am new to WebGL. I create an FBO and render a texture into it. Now a second texture is supposed to be rendered at the same coordinates, blending the fragments' R, G, B, and A values in COLOR_LOGIC_OP mode. I checked the WebGL specification and could not find any information about this blending mode. Has it simply not been implemented yet?
The simple answer is that WebGL is based on OpenGL ES 2.0, and OpenGL ES 2.0 doesn't support glLogicOp.
It's not in OpenGL ES 3.0 either, which the next version of WebGL will be based on, so it will most likely not appear in WebGL for the foreseeable future.
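There's no general substitute, but for the specific case of an invert (an XOR against all-ones), ordinary blending comes close; a sketch, not from the original answer:

// Draw a solid white quad with this blend state.
// result = src * (1 - dst) + dst * 0, which is 1 - dst when src is white.
gl.enable(gl.BLEND);
gl.blendFunc(gl.ONE_MINUS_DST_COLOR, gl.ZERO);

Anything fancier generally means rendering to a texture and implementing the logic op yourself in a fragment shader.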

Accessing multiple textures from fragment shader in OpenGL ES 2.0 on iOS

I have N textures in my app, which are 2D slices of 3D volumetric data. In the fragment shader, I need to access all of these textures. My understanding is that I can only access bound textures from the shader, which means I am limited by the number of texture units available.
N can vary from 8 to 512, depending on the data. Is there a way to do this without multi-texturing?
The reason for this approach is that 3D texturing is not available in OpenGL ES 2.0. I'd appreciate suggestions on any other way of doing this.
I also considered texture atlases, but I think the maximum single-texture dimensions will be a problem.
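For what it's worth, the usual workaround (not from the original thread) is to pack the slices into one or more 2D atlases and emulate the 3D lookup in the fragment shader. A sketch, with all uniform names illustrative and no filtering between slices:

uniform sampler2D u_atlas;     // slices packed into a grid
uniform float u_slicesPerRow;  // columns in the grid
uniform float u_numRows;       // rows in the grid
uniform float u_numSlices;     // total number of slices

vec4 sampleVolume(vec3 coord)  // coord in [0, 1]^3
{
    float slice = floor(coord.z * (u_numSlices - 1.0) + 0.5); // nearest slice
    float row = floor(slice / u_slicesPerRow);
    float col = slice - row * u_slicesPerRow;
    vec2 uv = (vec2(col, row) + coord.xy) / vec2(u_slicesPerRow, u_numRows);
    return texture2D(u_atlas, uv);
}

Whether this fits depends on GL_MAX_TEXTURE_SIZE on the target devices; with 512 slices you may need several atlases and a branch (or multiple draw calls) to select among them.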
