Uniform Buffers in WebGL?

Are there WebGL uniform buffers? There doesn't seem to be a gl.UNIFORM_BUFFER anywhere. Are they just named differently than in OpenGL, or do they not exist?

Uniform Buffers are in WebGL2 but not WebGL1.
This is because WebGL1 is based on OpenGL ES 2.0, which does not have uniform buffers, while WebGL2 is based on OpenGL ES 3.0, which does.
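In WebGL2 the setup looks roughly like the following. This is a minimal sketch; the block name `PerFrame` and the uniform names are illustrative, and the GL calls (which need a live WebGL2 context) are shown as comments:

```javascript
// Sketch: filling a std140-layout uniform buffer for WebGL2.
// The GLSL ES 3.00 side would declare:
//   uniform PerFrame { mat4 viewProj; vec4 tint; };
// Under std140 rules a mat4 occupies 16 floats and a vec4 occupies 4,
// so the backing buffer is 20 floats (80 bytes).
const viewProj = new Float32Array([
  1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1, // identity, column-major
]);
const tint = new Float32Array([1, 0.5, 0.25, 1]);

const data = new Float32Array(20);
data.set(viewProj, 0); // mat4 at float offset 0
data.set(tint, 16);    // vec4 at float offset 16

// With a WebGL2 context `gl` and a linked `program`, upload and bind
// the buffer to uniform binding point 0:
//   const ubo = gl.createBuffer();
//   gl.bindBuffer(gl.UNIFORM_BUFFER, ubo);
//   gl.bufferData(gl.UNIFORM_BUFFER, data, gl.DYNAMIC_DRAW);
//   gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo);
//   gl.uniformBlockBinding(program,
//       gl.getUniformBlockIndex(program, "PerFrame"), 0);
console.log(data.byteLength); // 80
```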

Related

Is it possible to read floats out from OpenGL ES framebuffer via the iOS texture cache API?

This is related to "OpenGL ES 2.0: glReadPixels() with float or half_float textures".
I want to read out the float values from a framebuffer object after rendering.
On iOS, the following
GLint ext_type;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &ext_type);
really just tells us that glReadPixels only allows GL_UNSIGNED_BYTEs to be read out.
Is there a way to use the texture cache technique described in this article to get around this?
The back story is that I am trying to implement a general matrix multiplication routine for arbitrary-sized matrices (e.g., 100,000 x 100,000) using an OpenGL ES 2.0 fragment shader (similar to Dominik Göddeke's trusty ol' tutorial example). glReadPixels is not being particularly cooperative here because it converts the framebuffer floats to GL_UNSIGNED_BYTE, causing a loss of precision.
I asked a similar question and I think the answer is NO, if only because texture caches (as an API) use CoreVideo pixel buffers, and those don't currently support float formats.
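A common workaround on ES 2.0 is to pack each float into the four 8-bit channels of an RGBA pixel in the fragment shader, read the pixels back as GL_UNSIGNED_BYTE, and decode on the CPU. The sketch below mirrors one well-known GLSL packing scheme on the JavaScript side so the round trip can be checked; the function names are illustrative, and values must lie in [0, 1):

```javascript
// Workaround sketch: encode a float in [0, 1) across four 8-bit channels.
// The matching GLSL encoder (one common variant) is:
//   vec4 pack(float v) {
//     vec4 e = fract(v * vec4(1.0, 255.0, 65025.0, 16581375.0));
//     return e - e.yzww * vec4(1.0/255.0, 1.0/255.0, 1.0/255.0, 0.0);
//   }
// Each channel keeps 8 bits of the value; the subtraction removes the
// bits that the next channel already carries.

function packFloat(v) { // v in [0, 1) -> [r, g, b, a] bytes
  const scale = [1, 255, 255 * 255, 255 * 255 * 255];
  const e = scale.map(s => (v * s) % 1);
  const carry = [e[1] / 255, e[2] / 255, e[3] / 255, 0];
  return e.map((x, i) => Math.round((x - carry[i]) * 255));
}

function unpackFloat(rgba) { // [r, g, b, a] bytes -> float
  return rgba[0] / 255 +
         rgba[1] / (255 * 255) +
         rgba[2] / (255 * 255 * 255) +
         rgba[3] / (255 * 255 * 255 * 255);
}
```

The cost is one RGBA pixel per float (so four readback passes, or a 4x wider target, for vec4 results), but the recovered values are accurate to roughly 1e-6, which is usually enough for single-precision GPGPU work.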

why does webgl blending not support COLOR_LOGIC_OP mode?

I am new to WebGL. I create an FBO and render a texture into it. Now another texture is supposed to be rendered at the same coordinates by blending the fragments' R, G, B and A values in COLOR_LOGIC_OP mode. I checked the WebGL Specification and could not find any information about this blending mode. Has it not been implemented yet?
The simple answer is that WebGL is based on OpenGL ES 2.0, and OpenGL ES 2.0 doesn't support glLogicOp.
It's also not in OpenGL ES 3.0, which the next version of WebGL will be based on, so it will most likely not appear in WebGL for the foreseeable future.

iOS fragment shaders with multiple outputs

Is it possible to write GLSL ES fragment shaders under iOS that generate multiple RGB outputs and have (for example) one sent to the screen and one sent to a texture?
Under normal GLSL I guess this would be done by writing to gl_FragData[i] from the fragment shader. However, it seems that GLSL ES 2.0 only supports a single color output.
OpenGL ES 2.0 does not support FBOs with multiple render targets. Therefore, GLSL ES also does not support it.
NVIDIA has an extension for it, but obviously that only works on NVIDIA's hardware.

Accessing multiple textures from fragment shader in OpenGL ES 2.0 on iOS

I have N textures in my app, which are 2D slices from 3D volumetric data. In the fragment shader, I need to access all of these textures. My understanding is that I can only access bound textures from the shader, which means I am limited by the number of multi-texturing units allowed.
N can vary from 8 to 512, depending on the data. Is there a way to do this without multi-texturing?
The reason for this approach is that 3D texturing is not available in OpenGL ES 2.0. I'd appreciate suggestions on any other way of doing this.
I also considered texture atlases, but I think the maximum single texture dimensions will be a problem.
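For reference, the index math for the atlas approach is straightforward; the limiting factor, as noted, is MAX_TEXTURE_SIZE. A sketch, assuming the slices are packed into a grid of `slicesPerRow` columns by `rows` rows (all names illustrative):

```javascript
// Sketch: mapping (slice index n, in-slice UV) to atlas-space UV,
// for a 3D volume packed as a grid of 2D tiles in one atlas texture.
function atlasUV(n, u, v, slicesPerRow, rows) {
  const col = n % slicesPerRow;            // tile column of slice n
  const row = Math.floor(n / slicesPerRow); // tile row of slice n
  return [
    (col + u) / slicesPerRow, // atlas-space U
    (row + v) / rows,         // atlas-space V
  ];
}
```

The same arithmetic is easy to express in a fragment shader with floor() and division, so a single bound atlas can stand in for many slices; with 512 slices of, say, 256x256 each, though, the atlas would need to be 8192x2048, which exceeds MAX_TEXTURE_SIZE on many ES 2.0 devices.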

OpenGL ES 2.0 Vertex skinning maximum number of bones?

When drawing a vertex skinned model, what is the maximum number of bones per draw-call/batch for different iOS devices?
On OpenGL ES 1.1 the limit is set by the number of palette matrices, but how about OpenGL ES 2.0, what sets the limit?
OpenGL ES 2.0 uses shaders for all of its vertex processing. So it depends on how many uniform matrices you can create. This is an implementation-defined limit, so it varies from hardware to hardware.
You can also use quaternions+position for bones instead of full matrices to save space.
From iOS 7 you can access the texture units from the vertex shader, so you can make a texture, fill it with your matrices and access the matrices from your vertex shader. This will allow many more matrices to be accessed, at the expense of a more complex implementation.
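The budget implied by that uniform limit can be estimated as follows. This is a sketch: the numbers are illustrative, and on a real device you would query the limit with gl.getParameter(gl.MAX_VERTEX_UNIFORM_VECTORS):

```javascript
// Sketch: estimating the per-draw bone budget from the uniform limit.
// MAX_VERTEX_UNIFORM_VECTORS counts vec4 slots; a mat4 bone costs 4 slots,
// a quaternion + translation bone only 2.
function maxBones(uniformVectors, reservedVectors, vectorsPerBone) {
  // Subtract slots used by other uniforms (MVP matrix, lights, etc.),
  // then divide what remains among the bones.
  return Math.floor((uniformVectors - reservedVectors) / vectorsPerBone);
}

// e.g. a device reporting 128 vec4 slots, with 16 reserved elsewhere:
maxBones(128, 16, 4); // mat4 skinning    -> 28 bones
maxBones(128, 16, 2); // quat + position  -> 56 bones
```

This shows why the quaternion+position representation matters in practice: halving the slots per bone doubles the number of bones per draw call on the same hardware.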

Resources