WebGL blendFunc bug or misunderstanding? - webgl

I'm trying to draw a semi-transparent black triangle on a black background.
Here's my sample: http://codewithoutborders.com/posted/blend.html
The blendFunc is set and enabled (line 151):
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
gl.enable(gl.BLEND);
The buffer is cleared to black (line 129):
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
The shader fills with semi-transparent black (line 10):
gl_FragColor = vec4(0.0, 0.0, 0.0, .5);
I would expect the triangle to be black, since the docs state that
result = {foreground}*sourceAlpha + {background}*(1-sourceAlpha)
= (0,0,0,.5)*.5 + (0,0,0,1)*.5
= (0,0,0,.75)
which is black, right?
so where does the grey come from?

(how embarrassing. answering my own question 3 minutes after posting, but not before significant hair loss)
The grey comes from the white background of the HTML page showing through after 'punching' a partially transparent hole (final alpha = 0.75) through the otherwise-black canvas.
Adding
style="background:black;"
to the <canvas> tag solves the problem.
[gman's comment, below, is a better fix]
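The arithmetic can be reproduced on the CPU. This is a sketch, not actual WebGL: the same source-over formula is used to approximate both the blend inside the canvas and the browser's canvas-to-page compositing.

```javascript
// Simulate gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA) for one
// pixel: result = src * srcAlpha + dst * (1 - srcAlpha), per channel.
function srcOver(src, dst) {
  const a = src[3];
  return src.map((s, i) => s * a + dst[i] * (1 - a));
}

// Semi-transparent black fragment over the opaque black clear color:
const canvasPixel = srcOver([0, 0, 0, 0.5], [0, 0, 0, 1]);
// canvasPixel is [0, 0, 0, 0.75]: black, but only 75% opaque.

// The browser then composites the canvas over the page background.
// Over a white page, the RGB that actually reaches the screen is grey:
const onScreen = srcOver(canvasPixel, [1, 1, 1, 1]);
// onScreen RGB is [0.25, 0.25, 0.25], the mystery grey.
```

So the blend math in the question is right; the grey comes from the compositing step after it.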

Related

Trails effect, clearing a frame buffer with a transparent quad

I want to get a trails effect. I am drawing particles to a frame buffer which is never cleared (it accumulates draw calls). Fading out is done by drawing a black quad with a small alpha, for example (0.0, 0.0, 0.0, 0.1). It is a two-step process, repeated per frame:
- drawing a black quad
- drawing particles at new positions
All works nicely, and the moving particles produce long trails, EXCEPT that the black quad does not clear the FBO down to a perfect zero. Faint trails remain forever (e.g. the buffer's RGBA stays at 4,4,4,255).
I assume the problem starts when the blending function multiplies the small values of the FBO's 8-bit RGBA (destination color) by, for example, (1.0 - 0.1) = 0.9, and rounding prevents any further reduction: 4 * 0.9 = 3.6, which rounds back to 4, forever.
Is my method (drawing a black quad) inherently useless for trails? I cannot find a blend function that could help, since all of them multiply the DST color by some value, which must be very small to produce long trails.
The trails are drawn using this code:
GLint drawableFBO;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &drawableFBO);

// Trails FBO; has a texture attached via glFramebufferTexture2D -> FBOTextureId
glBindFramebuffer(GL_FRAMEBUFFER, FBO);
glEnable(GL_BLEND);

// Step 1: fade the accumulated trails with a translucent black quad
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glUseProgram(fboClearShader);
glUniform4f(fboClearShader.uniforms.color, 0.0, 0.0, 0.0, 0.1);
glUniformMatrix4fv(fboClearShader.uniforms.modelViewProjectionMatrix, 1, GL_FALSE, mtx.m);
glBindVertexArray(fboClearShaderBuffer);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// Step 2: draw the particles additively at their new positions
glUseProgram(particlesShader);
glUniformMatrix4fv(particlesShader.uniforms.modelViewProjectionMatrix, 1, GL_FALSE, mtx.m);
glUniform1f(particlesShader.uniforms.globalAlpha, 0.9);
glBlendFunc(GL_ONE, GL_ONE);
glBindTexture(GL_TEXTURE_2D, particleTextureId);
glBindVertexArray(particlesBuffer);
glDrawArrays(GL_TRIANGLES, 0, 1000*6);

// Back to the drawable buffer: composite the trails FBO to the screen
glBindFramebuffer(GL_FRAMEBUFFER, drawableFBO);
glUseProgram(fullScreenShader);
glBindVertexArray(screenQuad);
glBlendFunc(GL_ONE, GL_ONE);
glBindTexture(GL_TEXTURE_2D, FBOTextureId);
glDrawArrays(GL_TRIANGLES, 0, 6);
Blending is not only defined by the blend function (glBlendFunc); it is also defined by the blend equation (glBlendEquation).
By default, the source and destination values are summed up after they are processed by the blend function.
Use a blend equation which subtracts a tiny value from the destination buffer, so the destination color is decreased slightly in each frame and finally reaches 0.0. The results of the blend equation are clamped to the range [0, 1].
e.g.
dest_color = dest_color - RGB(0.01)
The blend equation which subtracts the source color from the destination color is GL_FUNC_REVERSE_SUBTRACT:
float dec = 0.01f; // should be at least 1.0/256.0
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_REVERSE_SUBTRACT);
glBlendFunc(GL_ONE, GL_ONE);
glUseProgram(fboClearShader);
glUniform4f(fboClearShader.uniforms.color, dec, dec, dec, 0.0);
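A quick CPU simulation of one 8-bit channel shows the rounding stall, and why a fixed subtraction escapes it. This is a sketch; the GPU's exact rounding mode may differ slightly, but the fixed point behaves the same way.

```javascript
// One 8-bit channel, faded each frame and rounded back to an integer,
// roughly as the GPU does when writing to an RGBA8 buffer.

// Multiplicative fade: dst = round(dst * (1 - 0.1)).
let multiplicative = 255;
for (let frame = 0; frame < 1000; frame++) {
  multiplicative = Math.round(multiplicative * (1.0 - 0.1));
}
// Stalls at the first value where round(v * 0.9) === v; it never reaches 0.

// Reverse-subtract fade: a fixed amount is removed every frame,
// so the channel always reaches 0 (results clamp at 0).
let subtractive = 255;
const dec = Math.round(0.01 * 255); // at least one 8-bit step
for (let frame = 0; frame < 1000; frame++) {
  subtractive = Math.max(0, subtractive - dec);
}
```

The multiplicative fade gets stuck at a small nonzero value, which is exactly the "faint trails remain forever" symptom; the subtractive fade decays to zero.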

SpriteKit: SKShader with custom shader not doing `mix` properly

I am using a custom shader to create a sprite in SpriteKit.
The main part of the shader does this...
vec4 col = mix(baseColor, overlayColor, overlayColor.a);
The colours I have are something like...
baseColor = UIColor(red:0.51, green:0.71, blue:0.88, alpha:1.0)
and...
overlay = UIColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 0.5)
According to everywhere on the internet (links to follow), the blend function I'm using above is the same as a Normal blend mode in Photoshop, Pixelmator, Sketch, etc., so this should result in the colour...
col = UIColor(red:0.75, green:0.85, blue:0.95, alpha:1.00)
Which is a bright blue colour. However, what I'm getting is...
col = UIColor(red:0.51, green:0.61, blue:0.69, alpha:1.00)
Which is a murky grey colour.
You can see what it should look like here... https://thebookofshaders.com/glossary/?search=mix
If you enter the code...
#ifdef GL_ES
precision mediump float;
#endif
uniform vec2 u_resolution;
uniform float u_time;
vec4 colorA = vec4(0.5, 0.7, 0.9, 1.0);
vec4 colorB = vec4(1.0, 1.0, 1.0, 0.5);
void main() {
vec4 color = vec4(0.0);
// Mix uses pct (a value from 0-1) to
// mix the two colors
color = mix(colorA, colorB, colorB.a);
color.a = 1.0;
gl_FragColor = color;
}
The ideal output colour looks like this...
But it looks like this...
I'm going to investigate this more tomorrow. I wish I knew why the output was completely different from what everywhere else says it should be.
OK, thanks to @Reaper's comment I started a full investigation of what was going wrong with this.
When mixing the hard-coded colours in our shader, the colour was actually correct.
So that led me to look at the texture images we were using.
The one that had the white with 50% alpha was definitely white (not grey).
But (a quirk in the system? most likely the loader premultiplying the RGB by the alpha) OpenGL was picking it up as 50% grey with 50% alpha.
And so the mix wasn't actually mixing the correct colours in the first place.
We fixed this by using a fully opaque image for the colours and a separate greyscale image (black -> white) as the alpha map.
That fixed the problem we were having.
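The mismatch can be reproduced with the mix() arithmetic alone. This is a sketch, assuming the texture loader premultiplies the image's RGB by its alpha:

```javascript
// GLSL-style mix(a, b, t): a + (b - a) * t, per channel.
const mix = (a, b, t) => a.map((v, i) => v + (b[i] - v) * t);

const base = [0.51, 0.71, 0.88];

// Straight (non-premultiplied) white at 50% alpha: RGB is still white,
// and the mix gives the expected bright blue.
const expected = mix(base, [1.0, 1.0, 1.0], 0.5);
// expected is approximately [0.755, 0.855, 0.94]

// If the loader premultiplies RGB by alpha, the texels arrive as 50%
// grey, and the same mix gives the murky colour from the question.
const actual = mix(base, [0.5, 0.5, 0.5], 0.5);
// actual is approximately [0.505, 0.605, 0.69]
```

The second result matches the reported (0.51, 0.61, 0.69) murky grey, which is consistent with premultiplication being the culprit rather than mix() itself.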

Artefact in shader for iOS

kernel vec4 custom(__sample s1, __sample s2) {
if(s1.a == 0.0)
return s2;
if(s2.a == 0.0)
return s1;
vec4 res;
float temp = 1.0 - s2.a;
res.rgb = s2.rgb * s2.aaa + s1.rgb * vec3(temp);
res.a = 1.0;
return res;
}
I am trying to merge 2 images, but I get artifacts at the bounds, where due to antialiasing the pixels have less than 1.0 alpha. Are there any suggestions as to what I am doing wrong? :/ For example, the cheeks have some kind of boundary, which doesn't appear if I place one image over the other via UIImageViews.
OpenGL ES uses post-multiplied (straight) alpha by default, which means that the channels are blended independently; this is inaccurate for a lot of blending operations.
Imagine blending a source fragment [1.0, 0.0, 0.0, 0.5] (red, half transparent) on to a destination framebuffer [0.0, 0.0, 1.0, 0.0] (blue, fully transparent). You would logically expect "half transparent red" as the original destination color is not visible due to the 0.0 alpha value. However, because the channels are blended independently you would end up with the final RGB color being purple [0.5, 0.0, 0.5].
The fast way to fix this is by propagating the color values out into the transparent region, so that the opaque pink color extends out in to the feather region where you start to fade it out.
A better way to fix this is to use pre-multiplied alpha, but that starts to have side-effects (loss of precision in texel storage, and you need a different blend function: GL_ONE instead of GL_SRC_ALPHA for the source factor).
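The difference between the two schemes can be shown numerically. This is a sketch of both blend formulas, not an API call:

```javascript
// Straight-alpha blending (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
// every channel is weighted by the source alpha independently.
function overStraight(src, dst) {
  const a = src[3];
  return dst.map((d, i) => src[i] * (i < 3 ? a : 1) + d * (1 - a));
}

// Half-transparent red over *fully transparent* blue:
const straight = overStraight([1, 0, 0, 0.5], [0, 0, 1, 0]);
// RGB is [0.5, 0, 0.5]: the invisible blue leaks in as purple.

// Premultiplied blending (GL_ONE, GL_ONE_MINUS_SRC_ALPHA), with all
// RGB values already multiplied by their alpha. The transparent blue
// is stored as [0, 0, 0, 0] and contributes nothing:
function overPremul(src, dst) {
  const a = src[3];
  return dst.map((d, i) => src[i] + d * (1 - a));
}

const premul = overPremul([0.5, 0, 0, 0.5], [0, 0, 0, 0]);
// premul is [0.5, 0, 0, 0.5]; dividing RGB by alpha gives pure red.
```

Straight alpha lets the hidden destination colour bleed into the result; premultiplication zeroes it out before it can contribute.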

CGContextStrokePath color

I used CGContextStrokePath to paint a straight line on a white background picture; the stroke color is red, with alpha 1.0.
After drawing the line, why are the pixels not (255, 0, 0) but (255, 96, 96)?
Why not pure red?
Quartz (the iOS drawing layer) uses antialiasing to make things look smooth. That's why you're seeing non-pure-red pixels.
If you stroke a line of width 1.0 and you want only pure red pixels, the line needs to be horizontal or vertical and it needs to run along the center of the pixels, like this:
CGContextMoveToPoint(gc, 0, 10.5);
CGContextAddLineToPoint(gc, 50, 10.5);
CGContextStrokePath(gc);
The .5 in the y coordinates puts the line along the centers of the pixels.
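A simple coverage model reproduces numbers like these. This is a sketch, assuming plain source-over compositing of the stroke color weighted by how much of the pixel the line covers:

```javascript
// Antialiasing composites the stroke color over the background,
// weighted by the fraction of the pixel the line covers.
function aaPixel(stroke, background, coverage) {
  return stroke.map((s, i) =>
    Math.round(s * coverage + background[i] * (1 - coverage)));
}

const red = [255, 0, 0];
const white = [255, 255, 255];

const fullyCovered = aaPixel(red, white, 1.0); // [255, 0, 0], pure red
const halfCovered = aaPixel(red, white, 0.5);  // [255, 128, 128]
// The reported (255, 96, 96) corresponds to a coverage of roughly
// (255 - 96) / 255, about 0.62, on that pixel.
```

Only pixels the line covers completely come out pure red, which is why centering the stroke on pixel centers matters.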

Overlapping 2 transparent Texture2Ds in OpenGL ES

I'm trying to make a 2D game for the iPad with OpenGL. I'm new to OpenGL in general so this blending stuff is new.
My drawing code looks like this:
static CGFloat r=0;
r+=2.5;
r=remainder(r, 360);
glLoadIdentity();
//you can ignore the rotating and scaling
glRotatef(90, 0,0, -1);
glScalef(1, -1, 1);
glTranslatef(-1024, -768, 0);
glClearColor(0.3,0.8,1, 1.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnable(GL_TEXTURE_2D);
glEnable (GL_BLEND);
glBlendFunc (GL_ONE,GL_ONE_MINUS_SRC_ALPHA);
[texture drawInRect:CGRectMake(512-54, fabs(sin(((r+45)/180)*3.14)*500), 108, 108)];
[texture drawInRect:CGRectMake(512-54, fabs(sin((r/180)*3.14)*500), 108, 108)];
("texture" is a Texture2D that has a transparent background)
All I need to know is how to make it so that the blue box around one texture doesn't cover up the other.
Sounds like you just need to open up the texture image in your favourite image editor and set the blue area to be 0% opaque (i.e. 0) in the alpha channel. The SRC_ALPHA part of GL_ONE_MINUS_SRC_ALPHA means the alpha value in the source texture.
Chances are you're using 32-bit colour, in which case you'll have four channels, 8 bits for Red, Green, Blue and Alpha.
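Once the blue area's alpha is 0 and the texture is premultiplied (which is what the GL_ONE, GL_ONE_MINUS_SRC_ALPHA pair expects), those texels leave the destination untouched. A numeric sketch of that blend, not an OpenGL call:

```javascript
// glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) assumes premultiplied
// source RGB. A texel whose alpha was zeroed out becomes [0, 0, 0, 0]
// after premultiplication, so blending it changes nothing:
function premulOver(src, dst) {
  const a = src[3];
  return dst.map((d, i) => src[i] + d * (1 - a));
}

const background = [0.3, 0.8, 1.0, 1.0]; // the sky-blue clear color
const clearedTexel = [0, 0, 0, 0];       // blue box area with alpha 0
const result = premulOver(clearedTexel, background);
// result equals the background: the box no longer covers anything.
```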
