Alpha value of color not changing - Metal

fragment half4 fragment_shader(VertexOut vIn [[stage_in]]) {
    return half4(1, 1, 0.75, 0.0);
}
I am passing this fragment shader in the sample triangle-drawing program. Since alpha is zero, there should be no output, yet changing the alpha has no effect: the output stays the same no matter what alpha I return.

Related

Does Metal support anything like glDepthRange()?

I'm writing some metal code that draws a skybox. I'd like for the depth output by the vertex shader to always be 1, but of course, I'd also like the vertices to be drawn in their correct positions.
In OpenGL, you could use glDepthRange(1,1) to have the depth always be written out as 1.0 in this scenario. I don't see anything similar in Metal. Does such a thing exist? If not, is there another way to always output 1.0 as the depth from the vertex shader?
What I'm trying to accomplish is drawing the scenery first and then drawing the skybox to avoid overdraw. If I just set the z component of the outgoing vertex to 1.0, then the geometry doesn't draw correctly, obviously. What are my options here?
Looks like you can specify the fragment shader output (return value) format roughly so:
struct MyFragmentOutput {
    // color attachment 0
    float4 color_att [[color(0)]];
    // depth attachment
    float depth_att [[depth(depth_argument)]];
};
as seen in the section "Fragment Function Output Attributes" on page 88 of the Metal Shading Language Specification (https://developer.apple.com/metal/Metal-Shading-Language-Specification.pdf). Looks like "any" is a working value for depth_argument (see this question for more: In metal how to clear the depth buffer or the stencil buffer?).
Then you would set your fragment shader to use that format
fragment MyFragmentOutput interestingShaderFragment
// instead of: fragment float4 interestingShaderFragment
and finally just write to the depth buffer in your fragment shader:
MyFragmentOutput out;
out.color_att = float4(rgb_color_here, 1.0);
out.depth_att = 1.0;
return out;
Tested and it worked.

OpenGL/GLSL alpha masking

I'm implementing a paint app by using OpenGL/GLSL.
There is a feature where a user draws a "mask" using a brush with a pattern image, while the background changes according to the brush position. Take a look at the video to understand: video
I used CALayer's mask (an iOS feature) to achieve this effect (shown in the video). But this implementation is very costly; the fps is pretty low. So I decided to use OpenGL instead.
For OpenGL implementation, I use the Stencil buffer for masking, i.e.:
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 0);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
// Draw mask (brush pattern)
glStencilFunc(GL_EQUAL, 1, 255);
// Draw gradient background
// Display the buffer
glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER];
The problem: Stencil buffer doesn't work with alpha, that's why I can't use semi-transparent patterns for the brushes.
The question: How can I achieve that effect from video by using OpenGL/GLSL but without Stencil buffer?
Since your background is already generated (from the comments), you can simply use two textures in the shader to draw each of the segments. You will need to redraw all of them until the user lifts his finger, though.
So assume you have a texture with a white footprint and an alpha channel, footprintTextureID, and a background texture, backgroundTextureID. You need to bind both textures to separate texture units with glActiveTexture and pass both samplers as uniforms to the shader.
Now in your vertex shader you will need to generate the relative texture coordinates from the position. There should be a line similar to gl_Position = computedPosition; so you need to add another varying value:
backgroundTextureCoordinates = vec2((computedPosition.x+1.0)*0.5, (computedPosition.y+1.0)*0.5);
or if you need to flip vertically
backgroundTextureCoordinates = vec2((computedPosition.x+1.0)*0.5, (-computedPosition.y+1.0)*0.5);
(The reason for this equation is that the output vertices are in interval [-1,1] but the textures use [0,1]: [-1,1]+1 = [0,2] then [0,2]*0.5 = [0,1]).
Ok so assuming you bound all of these correctly you now only need to multiply the colors in fragment shader to get the blended color:
uniform sampler2D footprintTexture;
varying lowp vec2 footprintTextureCoordinate;
uniform sampler2D backgroundTexture;
varying lowp vec2 backgroundTextureCoordinates;
void main() {
    lowp vec4 footprintColor = texture2D(footprintTexture, footprintTextureCoordinate);
    lowp vec4 backgroundColor = texture2D(backgroundTexture, backgroundTextureCoordinates);
    gl_FragColor = footprintColor*backgroundColor;
}
If you wanted, you could multiply by the alpha value from the footprint instead, but that only loses flexibility. As long as the footprint texture is white it makes no difference, so it is your choice.
Stencil is a boolean on/off test, so as you say it can't cope with alpha.
The only GL technique which works with alpha is blending, but due to the color change between frames you can't simply flatten this into a single layer in a single pass.
To my mind it sounds like you need to maintain multiple independent layers in off-screen buffers, and then blend them together per frame to form what is shown on screen. This gives you complete independence for how you update each layer per frame.

Blending issue porting from OpenGLES 1.0 to 2.0 (iOS)

I'm porting a very simple piece of code from OpenGLES 1.0 to OpenGLES 2.0.
In the original version, I have blending enabled with
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
I'm using the same code in my ES 2.0 implementation as I need to blend the newly rendered quads with what was in the render buffer (I'm retaining the render buffer, I can't re-render the scene).
I'm using a texture (alpha values form a radial gradient from the center outward, going from 1 to 0) that serves as an alpha mask, containing only white pixels with varying alpha values. I give my vertices the same color, say red, with an alpha of 100/255. My background is transparent black. Below that, I have a plain white surface (a UIView). I render 4 quads.
Result with OpenGLES 1.0 (desired result)
My observations tell me that the fragment shader should simply be:
gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut);
(I got to that conclusion by trying different values for my vertices and the texture. That's also what I've read on some tutorials.)
I'm trying to write OpenGL ES 2.0 code (including vertex + fragment shaders) that gives me exactly the same result as in OpenGL ES 1.0, nothing more, nothing less. I don't need/want to do any kind of blending in the fragment shader except applying the vertex color to the texture. Using that simple shader, here's the result I got:
I tried pretty much every combination of *, +, and mix I could think of, but I couldn't reproduce the same result. This is the closest I got so far, but it's definitely not the right one (and it doesn't make much sense either):
varying lowp vec4 DestinationColor;
varying lowp vec2 TexCoordOut;
uniform sampler2D Texture;
void main(void) {
    lowp vec4 texture0Color = texture2D(Texture, TexCoordOut);
    gl_FragColor.rgb = mix(texture0Color.rgb, DestinationColor.rgb, texture0Color.a);
    gl_FragColor.a = texture0Color.a * DestinationColor.a;
}
This shader gives me the following:
By reading this and this, one can construct the blending function.
Since you're using glBlendFunc and not glBlendFuncSeparate, all four channels are blended the same way. Using the GL_FUNC_ADD equation sets the output O to O = s*S + d*D, where s and d are the blending factors, and S and D are the source and destination colors.
The s and d parameters are set by the glBlendFunc. GL_ONE sets s to (1, 1, 1, 1), and GL_ONE_MINUS_SRC_ALPHA sets d to (1-As, 1-As, 1-As, 1-As), where As is the alpha value of the source. Therefore, your blend is doing this (in vector form):
O = S + (1-As) * D. Note that this equals GLSL's mix(D, S, As) only when S is premultiplied by its alpha (the usual reason for choosing GL_ONE as the source factor); in your shader that would read:
gl_FragColor = mix(DestinationColor, texture0Color, texture0Color.a);
If the result doesn't look similar, then please verify that you're not enabling GL_BLEND or using any other OpenGL state that may change the appearance of your final result. If that doesn't help, please post a screenshot of the different outputs.
This can't be done easily with shaders, since blending has to read the current framebuffer state.
You can achieve this by rendering to a texture and passing it to the shader; if you can get the color that would be in the framebuffer, then you are OK.
Your equation is:
gl_FragColor = wouldBeFramebufferColor + (1.0 - wouldBeFramebufferColor.a) * texture0Color;
But why do you want to do it in shaders? AFAIK blending was not removed in OpenGL ES 2.0.
Stupid mistake. I only needed to normalize the vertex color in the vertex shader as I'm passing unsigned bytes in.

Smooth color blending in OpenGL

I'm trying to achieve the following blending when the texture at one vertex merges with another:
Here's what I currently have:
I've enabled blending and am specifying the blending function as:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
I can see that the image drawn in the paper app is made up of a small circle that merges with the same texture before and after and has some blending effect on the color and the alpha.
How do I achieve the desired effect?
UPDATE:
What I think is happening is that the intersected region of the two textures gets its alpha channel modified (either additively or with some other custom function) while the texture itself is not drawn again in the intersected region; the rest of the region gets the rest of the texture. Like so:
I'm not entirely sure of how to achieve this result, though.
You shouldn't need blending for this (and it won't work the way you want).
I think as long as you define your texture coordinate in the screen space, it should be seamless between two separate circles.
To do this, instead of using a texture coordinate passed through the vertex shader, just use the position of the fragment to sample the texture, plus or minus some scaling:
vec2 texcoord = gl_FragCoord.xy / vec2(xresolution_in_pixels, yresolution_in_pixels);
gl_FragColor = texture2D(papertexture, texcoord);
If you don't have access to GLSL, you could do something instead with the stencil buffer. Just draw all your circles into the stencil buffer, use the combined region as a stencil mask, and then draw a fullscreen quad of your texture. The color will be seamlessly deposited at the union of all the circles.
You can achieve this effect with max blending for alpha, or manually (blending off) with a shader (OpenGL ES 2.0):
#extension GL_EXT_shader_framebuffer_fetch : require
precision highp float;
uniform sampler2D texture;
uniform vec4 color;
varying vec2 texCoords;
void main() {
    float res_a = gl_LastFragData[0].a;
    float a = texture2D(texture, texCoords).a;
    res_a = max(a, res_a);
    gl_FragColor = vec4(color.rgb * res_a, res_a);
}

XNA - Render to a texture's alpha channel

I have a texture whose alpha channel I want to modify at runtime.
Is there a way to draw on a texture's alpha channel? Or maybe replace the channel with that of another texture?
Ok, based on your comment, what you should do is use a pixel shader. Your source image doesn't even need an alpha channel - let the pixel shader apply an alpha.
In fact you should probably calculate the values for the alpha channel (ie: run your fluid solver) on the GPU as well.
Your shader might look something like this:
sampler textureSampler;

float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 c = tex2D(textureSampler, uv);
    c.a = /* calculate alpha value here */;
    return c;
}
A good place to start would be the XNA Sprite Effects sample.
There's even an effect similar to what you are doing:
The effect in the sample reads from a second texture to get values for the calculation of the alpha channel of the first texture when it is drawn.
