Proper way of doing blending in WebGL

I have a couple of problems when doing blending in WebGL. One of them is the way colors are rendered regardless of the alpha value when blending is on: darker colors are always blended with what is underneath, even when alpha is set to 1.0. Brighter colors, though, are rendered differently depending on the alpha value, so I don't think there is a problem in the way I set up my shaders.
Then again, I haven't had a chance to render a full scene yet; I am currently only doing tests with WebGL, so I only draw simple objects on top of the default background. Will these blending problems be "fixed" once I render every bit of the screen using objects, or is this a limitation of WebGL?

Try setting the blend function like this:
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
That should be the default, at least it seems to be in Firefox.
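For completeness, a minimal sketch of the usual setup (assuming gl is your WebGLRenderingContext):

// Classic "over" compositing for non-premultiplied colors:
// result = src.rgb * src.a + dst.rgb * (1 - src.a)
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
// With this function, fragments with alpha = 1.0 fully replace
// what is underneath instead of darkening it.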

Related

WebGL full screen blending slowdown

I've tried to make an "overlay" effect in a 3D scene. After drawing stuff to the buffer, I tried to draw a full-screen quad with blending enabled and the depth test disabled. On some Android devices this seems to have caused a slowdown.
I found this link:
"The particularly slow point is the point where the drawing of a pixel needs to check what the color behind it was."
So instead of drawing a single full-screen quad, I divided it up into tiles and rendered with multiple draw calls, which seems to have brought some gains.
What may be happening here, and how can this be profiled with WebGL? That is, how does one arrive at the conclusion quoted above?
I guess that to profile it, you simply have to test with several blending functions, with and without blending enabled, and so on.
Blending is not a trivial operation, and indeed we can assume that blending functions which need to read pixels back from the buffer can induce a performance loss, like all "read" operations in OpenGL, because they can stall the pipeline. I guess most modern desktop GPUs have specific hardware to optimize this, but on mobile phones it may be more problematic.
Anyway, if you are about to draw a full-screen quad, why not render the quad directly from two source textures and blend them in the fragment shader with a custom equation? That way you don't need blending at all and you avoid any back-buffer read problem.
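For instance, a fragment shader along these lines samples both layers and combines them in one pass (a sketch; the texture names are placeholders and the blend equation is just an example):

precision mediump float;
varying vec2 v_texCoord;     // assumed varying from the vertex shader
uniform sampler2D u_scene;   // assumed: the scene rendered to a texture
uniform sampler2D u_overlay; // assumed: the overlay layer

void main() {
    vec4 scene = texture2D(u_scene, v_texCoord);
    vec4 overlay = texture2D(u_overlay, v_texCoord);
    // Any custom equation goes here; plain "over" compositing shown:
    gl_FragColor = vec4(mix(scene.rgb, overlay.rgb, overlay.a), 1.0);
}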

OpenGL Photoshop overlay blend mode

I'm trying to implement a particle system (using OpenGL ES 2.0), where each particle is made up of a quad with a simple texture:
The red pixels are transparent. Each particle will have a random alpha value from 50% to 100%.
Now the tricky part: I'd like each particle to have a blend mode much like Photoshop's "overlay". I tried many different combinations with glBlendFunc(), but without luck.
I don't understand how I could implement this in a fragment shader, since I need information about the current color of the fragment, so that I can calculate a new color based on the current color and the texture color.
I also thought about using a framebuffer object, but I guess I would need to re-render my framebuffer object into a texture for each particle, every frame, since I need the calculated fragment color where particles overlap each other.
I've found the math and other information regarding the overlay calculation, but I have a hard time figuring out which direction to take to implement this.
http://www.pegtop.net/delphi/articles/blendmodes/
Photoshop blending mode to OpenGL ES without shaders
I'm hoping for an effect like this:
You can get information about the current fragment color in the framebuffer on an iOS device. Programmable blending has been available through the EXT_shader_framebuffer_fetch extension since iOS 6.0 (on every device supported by that release). Just declare that extension in your fragment shader (by putting the directive #extension GL_EXT_shader_framebuffer_fetch : require at the top) and you'll get current fragment data in gl_LastFragData[0].
And then, yes, you can use that in the fragment shader to implement any blending mode you like, including all the Photoshop-style ones. Here's an example of a Difference blend:
// compute srcColor earlier in shader or get from varying
gl_FragColor = abs(srcColor - gl_LastFragData[0]);
You can also use this extension for effects that don't blend two colors. For example, you can convert an entire scene to grayscale -- render it normally, then draw a quad with a shader that reads the last fragment data and processes it:
mediump float luminance = dot(gl_LastFragData[0], vec4(0.30,0.59,0.11,0.0));
gl_FragColor = vec4(luminance, luminance, luminance, 1.0);
You can do all sorts of blending modes in GLSL without framebuffer fetch, but that requires rendering to multiple textures, then drawing a quad with a shader that blends the textures. Compared to framebuffer fetch, that's an extra draw call and a lot of schlepping pixels back and forth between shared and tile memory -- this method is a lot faster.
On top of that, there's no saying that framebuffer data has to be color... if you're using multiple render targets in OpenGL ES 3.0, you can read data from one and use it to compute data that you write to another. (Note that the extension works differently in GLSL 3.0, though. The above examples are GLSL 1.0, which you can still use in an ES3 context. See the spec for how to use framebuffer fetch in a #version 300 es shader.)
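Applied to the overlay mode asked about here, a GLSL ES 1.0 fragment shader using the extension could look like the sketch below (all varying/uniform names are assumptions, not from the question; disable fixed-function blending, since the shader does the compositing itself):

#extension GL_EXT_shader_framebuffer_fetch : require
precision mediump float;

varying vec2 v_texCoord;     // assumed texture coordinate for the quad
varying float v_alpha;       // assumed per-particle alpha (0.5 to 1.0)
uniform sampler2D u_texture; // assumed particle texture

void main() {
    vec4 src = texture2D(u_texture, v_texCoord);
    vec4 dst = gl_LastFragData[0];
    // Photoshop overlay per channel, with a = background, b = foreground:
    // f(a, b) = 2ab if a < 0.5, else 1 - 2(1 - a)(1 - b)
    vec3 low  = 2.0 * dst.rgb * src.rgb;
    vec3 high = 1.0 - 2.0 * (1.0 - dst.rgb) * (1.0 - src.rgb);
    vec3 overlay = mix(low, high, step(0.5, dst.rgb));
    // Fade by the particle's alpha so partially transparent particles
    // blend only partway toward the overlay result.
    gl_FragColor = vec4(mix(dst.rgb, overlay, v_alpha * src.a), dst.a);
}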
I suspect you want this configuration:
Source: GL_SRC_ALPHA
Destination: GL_ONE
Equation: GL_ADD
If not, it might be helpful if you could explain the math of the filter you're hoping to get.
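In WebGL/GL ES 2.0 syntax that configuration would be (note the blend-equation constant is spelled FUNC_ADD, and it is the default anyway):

gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE); // additive, weighted by source alpha
gl.blendEquation(gl.FUNC_ADD);      // the default blend equation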
[EDIT: the answer below is true for OpenGL and OpenGL ES pretty much everywhere except iOS since 6.0. See rickster's answer for information about EXT_shader_framebuffer_fetch which, in ES 3.0 terms, allows a target buffer to be flagged as inout, and introduces a corresponding built-in variable under ES 2.0. iOS 6.0 is over a year old at the time of writing so there's no particular excuse for my ignorance; I've decided not to delete the answer because it's potentially valid to those finding this question based on its opengl-es, opengl-es-2.0 and shader tags.]
To confirm briefly:
the OpenGL blend modes are implemented in hardware and occur after the fragment shader has concluded;
you can't programmatically specify a blend mode;
you're right that the only workaround is to ping pong, swapping the target buffer and a source texture for each piece of geometry (so you draw from the first to the second, then back from the second to the first, etc).
Per Wikipedia and the link you provided, Photoshop's overlay mode is defined so that, for a background value a and a foreground colour b, the output pixel is f(a, b) = 2ab if a < 0.5, and 1 - 2(1 - a)(1 - b) otherwise.
So the blend mode changes per pixel depending on the colour already in the colour buffer. And each successive draw's decision depends on the state the colour buffer was left in by the previous.
So there's no way you can avoid writing that as a ping pong.
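A ping-pong loop in WebGL might be structured like this (a sketch; fboA, fboB and drawParticle are assumed helpers, not from the question, with each FBO wrapping a framebuffer plus an attached colour texture):

// Read from one framebuffer's texture while writing into the other.
var readFbo = fboA, writeFbo = fboB;
for (var i = 0; i < particles.length; i++) {
    gl.bindFramebuffer(gl.FRAMEBUFFER, writeFbo.framebuffer);
    gl.bindTexture(gl.TEXTURE_2D, readFbo.texture); // the "background"
    // The shader samples the bound texture and applies the overlay math.
    // Each pass must also carry over the pixels the particle doesn't
    // cover, e.g. by drawing a full-screen copy first; hence the cost.
    drawParticle(particles[i]);
    var tmp = readFbo; readFbo = writeFbo; writeFbo = tmp; // swap roles
}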
The closest you're going to get without all that expensive buffer swapping is probably, as Sorin suggests, to try to produce something similar using purely additive blending. You could juice that a little by adding a final ping-pong stage that converts all values from their linear scale to the S-curve that you'd see if you overlaid the same colour onto itself. That should give you the big variation where multiple circles overlap.
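That final correction pass could be as simple as the fragment below (a sketch; u_accum is an assumed accumulation texture, and smoothstep is just one convenient S-curve, not the exact overlay response):

// Remap the additively accumulated image through an S-curve:
vec4 accum = texture2D(u_accum, v_texCoord);
gl_FragColor = smoothstep(0.0, 1.0, accum);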

Colors not blending properly in OpenGL ES

I'm trying to render 2 (light) circles in OpenGL ES in 2D. The middle is white, the border is black. It works fine, as long as they don't overlap:
But as soon as they do, I get this artifact:
I'm using glBlendFunc(GL_ONE, GL_ONE) with blending enabled of course.
What could be causing this? Is there a way to fix it?
I'd like them to blend more like this:
Thanks!
Are your circles currently linear gradients? You might get less of an artifact if you have a different curve.
Based on your example, though, it looks like you want the maximum intensity of the two circles, not the sum of the intensities. It appears that Apple's OpenGL ES 2.0 implementation supports the EXT_blend_minmax extension, which lets you specify that the resulting fragment values should be the maximum of the inbound and existing values. Maybe try that?
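In WebGL the same extension looks like this (a sketch; on iOS you would call glBlendEquationEXT(GL_MAX_EXT) instead):

var ext = gl.getExtension('EXT_blend_minmax'); // null if unsupported
if (ext) {
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.ONE, gl.ONE);  // factors are ignored by MIN/MAX anyway
    gl.blendEquation(ext.MAX_EXT); // keep the brighter of src and dst
}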
The result you're seeing is exactly what should come out for linear gradients. Hint: open up Photoshop or the GIMP, draw two radial gradients into two layers, and set them to "Addition" blending mode. It will look exactly like your picture.
An effect like the one you desire can be achieved with squared gradients. If your gradient is in the range 0…1, take the square of the value and draw that. You may apply a sqrt in a later pass if you want to linearize the individual gradients.
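In shader terms the squaring is a one-liner (a sketch; v_gradient is an assumed varying holding the 0…1 gradient value):

// Draw the squared gradient so additive blending gives a max-like falloff:
float g = v_gradient * v_gradient;
gl_FragColor = vec4(vec3(g), 1.0);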
Note that this is not easily done using the blending stage; it can be done with multiple passes, but then it's actually more straightforward to use a shader to combine passes from two FBOs.

DirectX Lighting

Hi. I've got a small game using DirectX 10 and C++. I started making it from the meshloaderfromOBJ10 DirectX sample and have just been building on it. However, my objects all look plain black although they have colour.
I know this is because of the light, but changing any of the lighting code in this sample seemingly does nothing.
I was wondering if anyone knows a simple(ish) method to just light everything, as in make it bright everywhere. I don't care about shadows, reflections, or anything like that; for what I'm doing they are not necessary, but it would be nice to see my objects instead of just silhouettes.
Cheers.
However, my objects all look plain black although they have colour.
If the shader expects a texture to be set and reads material information from that texture, and you bind a null texture to the corresponding texture stage (sampler, or whatever it is called in DX10), the shader will read black from that null texture. And a black material without specular/emissive or reflection mapping will always look black, no matter how many lights you use.
Use a white texture on materials/objects without a texture (assuming your shader understands material colour and multiplies it by the texture colour). Or switch to DX9 and use the fixed-function pipeline, which treats missing textures as white. Or modify the shader to support materials without a texture.
method to just light everything, as in make it bright everywhere
You can use a "global" ambient term (which you'll have to add in your shader, because D3D10 doesn't have a fixed-function pipeline). However, you don't really want it, because
I don't care about shadows
you actually do care about shadows, you just don't know it yet. A global ambient value will make all materials evenly coloured without any gradients. That means that if you have an untextured, complicated mesh, you won't be able to figure out what you're looking at, and everything that is textured will look ugly. Also, black materials will still remain black. So to "make it bright everywhere", you'll need a "sun": a directional light source or a very big point light.
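For illustration, here is that idea sketched in GLSL (the D3D10/HLSL version is structurally the same; all names and the ambient value are assumptions):

precision mediump float;
uniform vec3 u_lightDir;      // normalized direction the light travels
uniform vec3 u_materialColor; // the material's diffuse colour
varying vec3 v_normal;        // interpolated surface normal

void main() {
    float ambient = 0.2; // the "global ambient" floor
    float diffuse = max(dot(normalize(v_normal), -u_lightDir), 0.0);
    gl_FragColor = vec4(u_materialColor * (ambient + diffuse), 1.0);
}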

Is premultiplied alpha the right way to do alpha blending in GL ES?

As far as I know, alpha compositing basically uses straight alpha.
But all the GL ES blending sample code I've seen uses premultiplied alpha.
Is that the correct and right way to do it?
Is premultiplied alpha the usual approach in real-time graphics?
If the blending sample code you’ve seen targets iPhone OS, that’s because Core Graphics and Core Animation both operate on premultiplied alpha, so drawing into a CGBitmapContext produces premultiplied data, and non-opaque view contents need to be in premultiplied alpha format to composite correctly over other views. However, if you’re just trying to blend content into an opaque view, then either approach will work, as long as your blending matches the format of your image data.
There are upsides and downsides to premultiplied alpha data. Tom Forsyth has a good explanation of the many useful properties of premultiplied alpha. The primary downside is that some stages of fixed-function fragment processing (texture environment, fog, etc.) are built around non-premultiplied alpha, and may not produce the results you want if you try to use them with premultiplied data.
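Concretely, the two conventions differ only in the source blend factor (WebGL syntax shown; the GL ES C calls are analogous):

// Straight (non-premultiplied) alpha:
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);

// Premultiplied alpha (RGB was already multiplied by A when the
// image data was produced, e.g. by drawing into a CGBitmapContext):
gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);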
