WebGL Gradient Shader

I am trying to learn WebGL and would like to write a shader that gives a mesh a gradient effect from top to bottom. For example, the bottom of a ball or wall would have no blue and the top would be fully blue. I know I need to modify the fragment color based on the y component of gl_Position, but my implementations so far have given me a black screen. Any help would be appreciated.

Are you sure the fragments are actually being drawn to the screen (disable culling and the depth test), and that there are no GL errors? If so, the only remaining issue could be the alpha value with blending enabled. Try disabling GL_BLEND, or set alpha to 1.0 as below, with RGB set to your colors:
gl_FragColor = vec4(R,G,B, 1.0);
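As for the gradient itself, the per-fragment math is just a linear interpolation on y. A quick CPU sketch of that math (yMin/yMax are illustrative names; in GLSL you would pass the vertex y to the fragment shader in a varying and compute t the same way):

```javascript
// CPU sketch of the gradient: blue goes from 0 at the bottom to 1 at the top.
// In a fragment shader this would be gl_FragColor = vec4(0.0, 0.0, t, 1.0)
// with t = clamp((vY - yMin) / (yMax - yMin), 0.0, 1.0).
function gradientBlue(y, yMin, yMax) {
  const t = Math.min(Math.max((y - yMin) / (yMax - yMin), 0.0), 1.0);
  return [0.0, 0.0, t, 1.0]; // note alpha is 1.0, so the fragment is opaque
}
```

With a mesh spanning y = 0 to 10, the bottom fragments get [0, 0, 0, 1] and the top fragments get [0, 0, 1, 1], which is exactly the "no blue at the bottom, all blue at the top" effect described above.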

Related

Texture Brush (Drawing Application ) Using Metal

I am trying to implement a Metal-backed drawing application where brush strokes are drawn on an MTKView by repeatedly rendering a textured square along the finger position.
I am drawing each square with alpha 0.2. Where the squares overlap, the colors add up. How can I make the whole stroke draw with alpha 0.2?
I think you need to draw the brush squares to a separate texture, initially cleared to transparent, without blending. Then draw that whole texture to your view with blending.
If you draw the brush squares directly to the view, then they will accumulate. After you draw square 1, it's part of the image. Metal can no longer distinguish it from anything else that was already there. So, when you draw square 2 overlapping it, it will blend with what's already there, including square 1.
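The difference between the two approaches comes down to the blend arithmetic. A CPU sketch with illustrative grayscale values (standard source-over blending):

```javascript
// Standard source-over blend of a grayscale value onto a destination.
function over(dst, src, alpha) {
  return src * alpha + dst * (1 - alpha);
}

const bg = 1.0;    // white canvas
const brush = 0.0; // black brush, drawn at alpha 0.2

// Drawing two overlapping squares directly into the view:
// the overlap region is blended twice and gets darker.
const direct = over(over(bg, brush, 0.2), brush, 0.2); // 0.64

// Drawing into an offscreen texture WITHOUT blending first:
// overlapping squares just overwrite each other, so the texture
// still holds the brush color at alpha 0.2; then one blend into the view.
const offscreen = over(bg, brush, 0.2); // 0.8
```

The offscreen route leaves the overlap at the same value as a single stamp, which is the "whole stroke at alpha 0.2" look the question asks for.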

Additive blending of positive and negative color fragments in a single render pass

I'm working on a WebGL project that resembles a particle system. For aesthetic purposes, my single rendering pass is configured to blend additively, and I've disabled depth testing. I'm also clearing my viewport buffer to 50% gray, for the sake of argument.
gl.enable(gl.BLEND);
gl.blendFunc(gl.ONE, gl.ONE);
gl.disable(gl.DEPTH_TEST);
gl.clearColor(0.5, 0.5, 0.5, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
I've uploaded a vertex buffer and index buffer to the GPU representing two partially overlapping triangles. Their vertices have a vec3 color attribute. I've assigned each vertex a color of 50% gray (0.5, 0.5, 0.5).
When I draw the triangles with my shaders, I'm relieved to report that my viewport buffer now looks 50% gray with two overlapping triangular regions that are white. The triangles are white because their fragments' color values were additively blended with those already in the color buffer.
Now, I re-upload the vertex buffer with the following change: the color of the vertices of the second triangle are now -50% gray (-0.5, -0.5, -0.5).
What I hope to accomplish is that my viewport buffer would look 50% gray with two overlapping triangular regions, one white and one black, which intersect and produce 50% gray where they overlap. After all, adding a negative number should be the same as subtracting a positive number of the same magnitude.
What I see instead is a 50% gray viewport with only one triangular region: the white one.
I assume that this is because the output of my fragment shader is being clamped to a range whose lower bound is zero, before it's blended with the color buffer. I would like to know how to circumvent this clamping– ideally in WebGL, without requiring multiple render passes.
I'll be testing solutions in the source of the page at this URL: http://rezmason.net/scourge/issues/positive_negative_fragments.html
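The clamping hypothesis can be checked by emulating fixed-function additive blending into a normalized (0..1) color buffer on the CPU, using the values from the question:

```javascript
// Fixed-function additive blending into a normalized color buffer:
// the fragment output is clamped before blending, and the sum after.
const clamp01 = v => Math.min(Math.max(v, 0), 1);

function additiveBlend(src, dst) {
  return dst.map((d, i) => clamp01(clamp01(src[i]) + d));
}

const gray = [0.5, 0.5, 0.5];
const white = additiveBlend([0.5, 0.5, 0.5], gray);    // [1, 1, 1]
const lost  = additiveBlend([-0.5, -0.5, -0.5], gray); // [0.5, 0.5, 0.5]
```

The negative fragment clamps to zero before the add, so the second triangle leaves the buffer unchanged and is invisible, matching the observed result.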
UPDATE
As an investigation, I've experimented with performing my additive blending in a frame buffer, and then drawing that frame buffer to the viewport buffer by texturing it onto a unit quad. That's two separate draw calls, of course, which I'd ideally like to avoid.
That said, because I can explicitly set the format of the frame buffer to floating point, no clamping occurs with any value while I perform operations within that buffer. When I draw the buffer to the viewport, I assume that clamping finally occurs, but by then all the blending is already done.
My question is now substantially simpler: is there any way to initialize a WebGL or OpenGL context, so that its viewport is formatted as a floating point buffer?
Use gl.blendEquation(gl.FUNC_REVERSE_SUBTRACT), which computes destination minus source, and then use positive values in your shader for the triangle you want to subtract.
If you want to do something in between, you can use a hacky setup:
gl.enable(gl.BLEND);
gl.blendFuncSeparate(gl.ONE_MINUS_SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ONE, gl.ONE);
gl.blendEquation(gl.FUNC_ADD);
It will give you this equation (src = gl_FragColor, dst = the color buffer):

RGB_out = (1 - src.A) * src.RGB + (1 - src.A) * dst.RGB
A_out   = src.A + dst.A
You can now draw a white triangle by setting the color to (0.5, 0.5, 0.5, 0), and a black triangle with the color (0.5, 0.5, 0.5, 1).
If you want different colors, I hope you get the idea. You can compare different blending functions here: http://www.andersriggelsen.dk/glblendfunc.php
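A CPU sketch of this blend setup (emulating the fixed-function math, with clamping to the normalized buffer range) shows both cases:

```javascript
// Emulates blendFuncSeparate(ONE_MINUS_SRC_ALPHA, ONE_MINUS_SRC_ALPHA, ONE, ONE)
// with blendEquation(FUNC_ADD); src and dst are [r, g, b, a] in 0..1.
function blend(src, dst) {
  const clamp01 = v => Math.min(Math.max(v, 0), 1);
  const f = 1 - src[3]; // ONE_MINUS_SRC_ALPHA, applied to both src and dst RGB
  return [
    clamp01(f * src[0] + f * dst[0]),
    clamp01(f * src[1] + f * dst[1]),
    clamp01(f * src[2] + f * dst[2]),
    clamp01(1 * src[3] + 1 * dst[3]), // ONE, ONE for alpha
  ];
}

const gray  = [0.5, 0.5, 0.5, 1.0];         // the cleared buffer
const white = blend([0.5, 0.5, 0.5, 0.0], gray); // [1, 1, 1, 1]
const black = blend([0.5, 0.5, 0.5, 1.0], gray); // [0, 0, 0, 1]
```

With src alpha 0 the factor is 1, so the colors add; with src alpha 1 the factor is 0, so everything is wiped to black. Intermediate alphas interpolate between the two.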
Edit:
Sorry, my mistake. You should change
gl.blendFuncSeparate(gl.ONE_MINUS_DST_ALPHA, gl.ONE_MINUS_DST_ALPHA, gl.ONE, gl.ONE);
to
gl.blendFuncSeparate(gl.ONE_MINUS_SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ONE, gl.ONE);
I've forgotten which one is source and which one is destination. I've updated my answer.

directx - shade in plane surface with lighting off

I have a flat plane surface (every vertex has z equal to 0.0) with many vertices, and I attach a texture to it with lighting turned off:
device->SetRenderState(D3DRS_LIGHTING, false);
Yet there is shading in the rendered result.
I also tried disabling normal normalization:
device->SetRenderState(D3DRS_NORMALIZENORMALS, false);
That doesn't work either.
Does anybody know what's going on?
I just want the texture to show, distorted by the surface, without any lighting effect.
The surface comes from tessellated NURBS control points.
The texture's color seems to be changed by the vertex shader. Is that possible, and how can I fix it?
A pure white texture renders like this:
http://dl.dropbox.com/u/2318704/image/effect.png
The question has been changed to: directx - texture render result is incorrect

xna drawprimitives: flip face normal

I'm drawing triangles, but they're invisible from the side they should be visible from. How can I flip the facing direction?
Thanks
Invert the vertex order in the vertex or index buffer. Or change the backface culling settings. E.g. set the CullMode of the RasterizerState to CullMode.None.
Additionally, make sure that there are no problems with lighting that make your triangles black / invisible.
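Inverting the winding order just means swapping two indices in every triangle of the index buffer. An illustrative helper (sketched in JavaScript for brevity, not an XNA API):

```javascript
// Flips each triangle's winding (CW <-> CCW) by swapping its
// last two indices; the geometry is unchanged, only the facing flips.
function flipWinding(indices) {
  const out = indices.slice();
  for (let i = 0; i + 2 < out.length; i += 3) {
    const tmp = out[i + 1];
    out[i + 1] = out[i + 2];
    out[i + 2] = tmp;
  }
  return out;
}

flipWinding([0, 1, 2, 2, 1, 3]); // -> [0, 2, 1, 2, 3, 1]
```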

Why are there dark edges / halos between transparent and opaque areas of textured surfaces (in opengl es 2.0 fragment shaders)

I'm using a PNG texture image to control the opacity of fragments in an OpenGL ES 2.0 shader (on iOS). The result I am after is light grey text on top of my scene's medium grey background (the fragment shader is applied to a triangle strip in the scene). The problem is that there are dark edges around my text; they look like artifacts. I'm using PNG transparency for the alpha information, but I'm open to other approaches. What is going on, and how can I do this correctly?
First, look at this answer regarding PNG transparency and premultiplied alpha. Long story short, the pixels in the PNG image that have less than 100% opacity are being premultiplied, so they are in effect getting darker as they get more transparent. Hence the dark artifacts around the edges.
Even without PNG and premultiplied transparency, you may still run into the problem if you forget to set your fragment shader's color before applying transparency.
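The darkening can be reproduced with the blend arithmetic alone (illustrative grayscale values, standard source-over blending with a straight-alpha blend function):

```javascript
// Source-over blend that expects a straight (non-premultiplied) source color.
function over(src, alpha, dst) {
  return src * alpha + dst * (1 - alpha);
}

const text = 0.82, bg = 0.5;
const edgeAlpha = 0.5; // a half-transparent pixel on the text's edge

// Correct: straight-alpha source, alpha applied once.
const correct = over(text, edgeAlpha, bg); // 0.66

// Broken: the loader premultiplied the pixel (0.82 * 0.5 = 0.41),
// and the blend multiplies by alpha again, so the edge comes out darker.
const darkened = over(text * edgeAlpha, edgeAlpha, bg); // 0.455
```

Every partially transparent edge pixel ends up below both the text and background values, which is exactly the dark halo described in the question.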
A solution to this problem (where you want text to be a light grey color, and everything in the texture map that's not text to be transparent) would be to create a texture map where your text is white and the background is black.
This texture map will control the alpha of your fragment. The RGB values of your fragment will be set to your light grey color.
For example:
// color used for text
lowp vec4 textColor = vec4(.82,.82,.82,1.0);
gl_FragColor = textColor;
// greyscale texture passed in as uniform
lowp vec4 alphaMap = texture2D(u_alpha_texture,v_texture);
// set the alpha using the texture
gl_FragColor.w = alphaMap.x;
In cases where your color texture varies, this approach would require two separate texture map images (one for the color, and one for the alpha). Clearly, this is less efficient than dealing with one PNG that has the alpha transparency baked in. However, in my experience it is a good tradeoff (premultiplied pixels can be counter-intuitive to deal with, and the other approaches to loading PNG transparency without premultiplication introduce added complexity).
An upside to this approach is that you can vary the color of the text independently of the texture map image. For instance if you wanted red text, you could change the textColor value to:
lowp vec4 textColor = vec4(1.0,0.0,0.0,1.0);
You could even vary the color over time, etc, all independently of the alpha. That's why I find this approach to be flexible.
