Multiple-pass SSAO border artifact - WebGL

I am playing with multiple passes in WebGL to apply some visual enhancements to my renders.
I am ping-ponging two textures (taken from here: WebGL Image Processing Continued), and this is basically my workflow:
draw the geometry and store the depth values in a texture buffer
apply the SSAO pass (based on this Fiddle from Rabbid76)
at the end, render the texture to the screen
Now, I am not able to understand why there is a black border around some parts of the final image.
I tried to fine-tune some SSAO parameters, but I wasn't able to get rid of that artifact, so, just guessing, I now believe this black border is due to a wrong blending setup of my texture buffers.
This is the current code in the depth pass:
gl.disable(gl.BLEND);     // no blending while writing depth values
gl.enable(gl.DEPTH_TEST);
gl.depthMask(true);       // depth writes enabled
... drawings
I also tried this:
gl.enable(gl.BLEND);
gl.blendEquation(gl.FUNC_ADD);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA); // WebGL constants carry no GL_ prefix
...but that did not lead me to any result either.
In the picture below, the artifact is clearly visible as a thin black line around the Stanford dragon:
As I am completely lost with this issue, can someone point me in the right direction?
My question: I need to draw geometry over a transparent background - what is the correct blending mode for that when rendering to a back buffer and ping-ponging two textures to apply multiple effects?

For posterity: I was using
gl.clearColor(0, 0, 0, 1);
so I changed the pack/unpack functions as follows:
vec4 PackDepth32_0(float depth) {
    const vec4 bit_shift = vec4(255.0 * 255.0 * 255.0, 255.0 * 255.0, 255.0, 1.0);
    const vec4 bit_mask = vec4(0.0, 1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0);
    vec4 res = fract(depth * bit_shift);
    res -= res.xxyz * bit_mask;
    return res;
}

float UnpackDepth32_0(vec4 color) {
    const vec4 bit_shift = vec4(1.0 / (255.0 * 255.0 * 255.0), 1.0 / (255.0 * 255.0), 1.0 / 255.0, 1.0);
    return dot(color, bit_shift);
}
which solves my problem: with this layout a cleared pixel (0, 0, 0, 1) unpacks to a depth of exactly 1.0, i.e. the far plane, so the SSAO pass no longer treats the background as geometry close to the camera.
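As for the blending question itself: the WebGL canvas is composited assuming premultiplied alpha by default, so a common approach for a transparent background is to keep blending disabled during the intermediate data passes (depth, SSAO) and, in the final on-screen pass, output premultiplied colours while blending with gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA). A minimal sketch of such a final pass, where u_color and v_uv are made-up names for the last ping-pong texture and its coordinates:

precision mediump float;
uniform sampler2D u_color; // result of the last ping-pong pass (hypothetical name)
varying vec2 v_uv;

void main() {
    vec4 c = texture2D(u_color, v_uv);
    // premultiply so the browser's compositor blends the canvas correctly
    gl_FragColor = vec4(c.rgb * c.a, c.a);
}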

Related

GLSL hue shader producing odd results, iOS only?

I have a cross-platform LibGDX app. This GLSL shader code is used to shift the hue of a given texture.
It works great on Android and when debugging on Desktop, but on an iPad this is the result (excuse the photos of the screen; that was the easiest way to get data off this device).
Code:
// GLSL matrix constructors are column-major: the first three values form the
// first column, so rgb2yiq * rgb yields Y = 0.299 r + 0.587 g + 0.114 b.
const mat3 rgb2yiq = mat3(0.299, 0.595716, 0.211456, 0.587, -0.274453, -0.522591, 0.114, -0.321263, 0.311135);
const mat3 yiq2rgb = mat3(1.0, 1.0, 1.0, 0.9563, -0.2721, -1.1070, 0.6210, -0.6474, 1.7046);
vec4 outColor = texture2D(u_texture, v_texCoord) * v_color;
float alpha = outColor.a;
// Hue shift
if (u_hueAdjust > 0.0 && u_hueAdjust < 1.0 && alpha > 0.0)
{
    vec3 unmultipliedRGB = outColor.rgb / alpha;
    vec3 yColor = rgb2yiq * unmultipliedRGB;
    float originalHue = atan(yColor.b, yColor.g);
    float finalHue = originalHue + u_hueAdjust * 6.28318; // convert 0-1 to radians
    float chroma = sqrt(yColor.b * yColor.b + yColor.g * yColor.g);
    vec3 yFinalColor = vec3(yColor.r, chroma * cos(finalHue), chroma * sin(finalHue));
    outColor.rgb = (yiq2rgb * yFinalColor) * alpha;
}
Obviously there are some really weird artifacts that seem to affect certain areas, in particular black/white colors. But there is also a subtle overall change in color that isn't attributable to the desired hue-shift effect.
Overall this shader is wonky on iOS (but works fine on Android/Desktop), and after playing with it for a while I'm completely out of ideas. Can anyone lead me in the right direction?
The documentation for atan says: "The result is undefined if x = 0."
Is it possible that yColor.g is zero for greyscale pixels?
The issue is discussed here: Robust atan(y,x) on GLSL for converting XY coordinate to angle
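A minimal guard in that spirit (the epsilon is an arbitrary choice; for pure greys both I and Q are zero, so chroma is zero and any returned angle is harmless):

// atan(y, x) is undefined for x = 0 on some GPUs; the PowerVR chips in
// iOS devices are a known case. Fall back to +/- pi/2 near x = 0.
float atan2Safe(float y, float x) {
    if (abs(x) < 1e-5) {
        return (y < 0.0) ? -1.5707963 : 1.5707963;
    }
    return atan(y, x);
}

The call above would then become float originalHue = atan2Safe(yColor.b, yColor.g);.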

SpriteKit: SKShader with custom shader not doing `mix` properly

I am using a custom shader to create a sprite in SpriteKit.
The main part of the shader does this...
vec4 col = mix(baseColor, overlayColor, overlayColor.a);
The colours I have are something like...
baseColor = UIColor(red:0.51, green:0.71, blue:0.88, alpha:1.0)
and...
overlay = UIColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 0.5)
According to everywhere on the internet (links to follow), the blend function I'm using above is the same as a Normal blend mode in Photoshop, Pixelmator, Sketch, etc. Since mix(a, b, t) computes a * (1.0 - t) + b * t, this should result in the colour...
col = UIColor(red:0.75, green:0.85, blue:0.95, alpha:1.00)
Which is a bright blue colour. However, what I'm getting is...
col = UIColor(red:0.51, green:0.61, blue:0.69, alpha:1.00)
Which is a murky grey colour.
You can see what it should look like here... https://thebookofshaders.com/glossary/?search=mix
If you enter the code...
#ifdef GL_ES
precision mediump float;
#endif

uniform vec2 u_resolution;
uniform float u_time;

vec4 colorA = vec4(0.5, 0.7, 0.9, 1.0);
vec4 colorB = vec4(1.0, 1.0, 1.0, 0.5);

void main() {
    vec4 color = vec4(0.0);
    // Mix uses pct (a value from 0-1) to
    // mix the two colors
    color = mix(colorA, colorB, colorB.a);
    color.a = 1.0;
    gl_FragColor = color;
}
The ideal output colour looks like this...
But it looks like this...
I’m going to investigate this more tomorrow. I wish I knew why the output is completely different from what everywhere else says it should be.
OK, thanks to @Reaper's comment I started a full investigation of what was going wrong with this.
When mixing hard-coded colours in our shader, the colour was actually correct.
So that led me to look at the texture images we were using.
The one that had white with 50% alpha was definitely white (not grey).
But (a quirk in the system?) OpenGL was picking it up as 50% grey with 50% alpha. That is exactly what premultiplied alpha produces: white premultiplied by 0.5 is 50% grey, which also matches the murky colour above.
So the mix wasn't actually mixing the correct colours in the first place.
We fixed this by using a fully opaque (100% alpha) image for the colours and a separate greyscale image (black -> white) as the alpha map.
Doing this fixed the problem we were having.
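A sketch of that two-texture setup in the fragment shader, where u_overlay and u_mask are made-up uniform names (u_texture and v_tex_coord are the standard SKShader inputs):

// Colours come from a fully opaque texture and alpha from a separate
// greyscale mask, so premultiplication can't turn white into grey.
vec4 baseColor = texture2D(u_texture, v_tex_coord);
vec3 overlayRGB = texture2D(u_overlay, v_tex_coord).rgb; // 100% alpha image
float overlayAlpha = texture2D(u_mask, v_tex_coord).r;   // black -> white map
gl_FragColor = vec4(mix(baseColor.rgb, overlayRGB, overlayAlpha), baseColor.a);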

WebGL: Beveled smooth anti-aliased circles using GL_POINT? [duplicate]

Sorry for such a long title.
This is a follow-up to my old question, which led me to drawing a beveled circle looking like this:
If you look closely at this image, you can see the pixelation around the edge.
I wanted to smooth this circle with anti-aliasing, and I ended up with this:
If you look closely at this image, you can see a white border around the edge.
I'd like to remove this white border if possible, but my shader is a mess at this point:
#extension GL_OES_standard_derivatives : enable

void main(void) {
    lowp vec2 cxy = 2.0 * gl_PointCoord - 1.0;
    lowp float radius = dot(cxy, cxy);
    const lowp vec3 ambient = vec3(0.5, 0.2, 0.1);
    const lowp vec3 lightDiffuse = vec3(1, 0.5, 0.2);
    lowp vec3 normal = vec3(cxy, sqrt(1.0 - radius));
    lowp vec3 lightDir = normalize(vec3(0, -1, -0.5));
    lowp float color = max(dot(normal, lightDir), 0.0);
    lowp float delta = fwidth(radius);
    lowp float alpha = 1.0 - smoothstep(1.0 - delta, 1.0 + delta, radius);
    gl_FragColor = vec4(ambient + lightDiffuse * color, alpha);
}
If anyone could figure out how to remove the white border, that'd be fantastic!
In WebGL the default is to use pre-multiplied alpha (see WebGL and Alpha).
When creating the context, it has to be specified that the canvas has no alpha channel:
gl = document.getElementById("canvas").getContext("webgl", { alpha: false });
See further the answers to these questions:
WebGL: How to correctly blend alpha channel png
Alpha rendering difference between OpenGL and WebGL
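Alternatively, if the canvas does need an alpha channel, premultiplying the shader's own output achieves the same result; a one-line sketch of the change to the last line of the shader above, assuming the default premultiplied compositing:

// Premultiplied output: scale RGB by coverage so the anti-aliased rim
// fades toward transparent instead of blending toward white.
gl_FragColor = vec4((ambient + lightDiffuse * color) * alpha, alpha);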

Rounding error in texture lookup on iPad / OpenGL ES 2.0

I'm using a texture attached to a framebuffer as a custom depth buffer. In a first rendering pass, I render to the texture so it stores the depth values. In the second rendering pass, I do a lookup from this texture to decide whether to render or discard the fragment.
This works well, except that on the device (iPad 3) there are some annoying artifacts that seem to come from a rounding error happening when the depth values are written to the texture. I tried writing a fixed value like 0.5 to the texture, but when it is read back, it is more than 0.03 higher or lower than 0.5.
I 'encode' the depth value into three RGB values (the fourth component, alpha or w, is ignored):
const highp vec4 packFactors = vec4(1.0, 256.0, 256.0 * 256.0, 256.0 * 256.0 * 256.0);
const highp vec4 cutoffMask = vec4(1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0, 0.0);

void main() {
    highp float depth = ...
    ...
    highp vec4 packedVal = fract(packFactors * depth);
    packedVal.x = depth; // undo effect of fract() on x component
    gl_FragColor = packedVal - packedVal.yzww * cutoffMask;
}
This way, I get 3x8 bits precision to store the depth (inspired by http://www.rojtberg.net/348/powervr-sgx-530-does-not-support-depth-textures)
In the other fragment shader (second rendering pass), I read from the texture like this:
highp vec4 depthBufferLookup = texture2D(depthTexture, vDepthTex);
highp float depthFromDepthBuffer = dot(depthBufferLookup, vec4(1.0) / packFactors);
using the same values for packFactors as in the first shader.
I would expect this procedure to give a decent precision, but an error of more than 0.03 at a value of 0.5 makes it pretty unusable.
Any hints?
BTW I'm using the following texture type:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
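One possible culprit (an assumption, not a verified diagnosis): the 256-based factors don't line up with the 1/255-step quantization of a GL_UNSIGNED_BYTE channel, so pack followed by unpack doesn't round-trip exactly. A 255-based pair in the spirit of the PackDepth32_0/UnpackDepth32_0 functions at the top of this page, reduced to the three RGB channels, would look like this (untested sketch):

// Powers of 255 line up with the byte quantization round(v * 255) / 255,
// so the unpack reproduces the packed depth value.
highp vec3 PackDepth24(highp float depth) {
    highp vec3 res = fract(depth * vec3(255.0 * 255.0, 255.0, 1.0));
    res -= res.xxy * vec3(0.0, 1.0 / 255.0, 1.0 / 255.0);
    return res;
}

highp float UnpackDepth24(highp vec3 color) {
    return dot(color, vec3(1.0 / (255.0 * 255.0), 1.0 / 255.0, 1.0));
}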

OpenGL ES Green Screen... But I want to use black

This works great for green screen: my background is green, and with this code green becomes alpha.
lowp vec4 textureColor = texture2D(u_samplers2D[0], vTextu);
lowp float rbAverage = textureColor.r * 0.5 + textureColor.b * 0.5;
lowp float gDelta = textureColor.g - rbAverage;
textureColor.a = 1.0 - smoothstep(0.0, 0.25, gDelta);
textureColor.a = textureColor.a * textureColor.a * textureColor.a;
gl_FragColor = textureColor;
How do I change the code so that it uses a black background instead of green? I'm thinking I could take the values for dark red, green and blue and use that as the alpha? Any pointers would be kind.
You can calculate how close the colour is to black. The simplest way is to take the maximum of the r, g and b channels, pick a threshold above which you consider the colour fully opaque, and map the brightness below it to an alpha in [0..1]:
lowp float brightness = max(max(textureColor.r, textureColor.g), textureColor.b);
lowp float threshold = 0.1;
lowp float alpha = brightness > threshold ? 1.0 : brightness / threshold;
gl_FragColor = vec4(textureColor.rgb, alpha);
lowp vec4 textureColor = texture2D(u_samplers2D, vTextu);
lowp float gtemp = smoothstep(0.0, 0.5, textureColor.r);
gl_FragColor = vec4(1.0, 1.0, 1.0, gtemp);
This works for my situation. I was using a greyscale image with a black background and just needed the white elements. The problem I had was that I was applying the alpha to grey pixels, but now I'm outputting white and applying the alpha to that. I can adjust the smoothstep values to change the effect, and also multiply in another alpha to fade the image in or out.
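For example, the extra fade mentioned above could be a uniform multiplied into the keyed alpha; u_fade is a made-up name, and the rest mirrors the snippet above:

precision mediump float;
uniform sampler2D u_samplers2D;
uniform lowp float u_fade; // hypothetical global fade factor in [0..1]
varying vec2 vTextu;

void main() {
    lowp vec4 textureColor = texture2D(u_samplers2D, vTextu);
    // key off the red channel of the greyscale image, as above
    lowp float gtemp = smoothstep(0.0, 0.5, textureColor.r);
    // white elements, faded in or out by u_fade
    gl_FragColor = vec4(1.0, 1.0, 1.0, gtemp * u_fade);
}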
