How to enable alpha blending in DirectX?

How to do this in DirectX?
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
Somehow I just can't seem to get it to work. I'm using this code:
d3ddev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCCOLOR);
d3ddev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCCOLOR);
d3ddev->SetRenderState(D3DRS_ALPHABLENDENABLE, 1);
d3ddev->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);
d3ddev->SetRenderState(D3DRS_ALPHAREF, (DWORD)50);
d3ddev->SetRenderState(D3DRS_ALPHATESTENABLE, 1);
But it renders my polygons with some sort of ghosted effect; I can see through all of them! I just want the texture's alpha channel to make the fully transparent parts of the texture invisible. This works with alpha test, but it still shows black edges, so I guess blending isn't enabled, even though I have set D3DRS_ALPHABLENDENABLE! What am I doing wrong?

Instead of SRCCOLOR, I needed to use SRCALPHA:
d3ddev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
d3ddev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
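For reference, a minimal sketch of the complete D3D9 equivalent of glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) plus glEnable(GL_BLEND) (the D3DRS_BLENDOP line is strictly redundant, since D3DBLENDOP_ADD is already the default):
d3ddev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);           // glEnable(GL_BLEND)
d3ddev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);      // GL_SRC_ALPHA
d3ddev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);  // GL_ONE_MINUS_SRC_ALPHA
d3ddev->SetRenderState(D3DRS_BLENDOP, D3DBLENDOP_ADD);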

Related

Blending in Metal: alpha set to 0 is still opaque

I'm having trouble setting up blending in Metal. Even when starting with the Hello Triangle example provided by Apple, using the following code
pipelineStateDescriptor.colorAttachments[0].blendingEnabled = YES;
pipelineStateDescriptor.colorAttachments[0].sourceAlphaBlendFactor = MTLBlendFactorZero;
pipelineStateDescriptor.colorAttachments[0].destinationAlphaBlendFactor = MTLBlendFactorZero;
and the fragment function
fragment float4 fragmentShader(RasterizerData in [[stage_in]]) {
return float4(in.color.rgb, 0);
}
the triangle still draws completely opaque. What I want to achieve in the end is blending between two shapes by using different blending factors, but I thought I would start with a simple example to understand what is going on. What am I missing?
sourceAlphaBlendFactor and destinationAlphaBlendFactor control how the blend for the alpha channel is constructed, i.e. the alpha that will be written into your destination buffer, which will not really be visible to you. You are probably more interested in the RGB that is written into the frame buffer.
Try setting values for sourceRGBBlendFactor and destinationRGBBlendFactor instead. For traditional alpha blending, set sourceRGBBlendFactor to MTLBlendFactorSourceAlpha and destinationRGBBlendFactor to MTLBlendFactorOneMinusSourceAlpha.
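For example, a minimal sketch in the style of the question's code, using the standard MTLRenderPipelineColorAttachmentDescriptor properties (the rgbBlendOperation line is an assumption; MTLBlendOperationAdd is already the default):
pipelineStateDescriptor.colorAttachments[0].blendingEnabled = YES;
pipelineStateDescriptor.colorAttachments[0].rgbBlendOperation = MTLBlendOperationAdd;
pipelineStateDescriptor.colorAttachments[0].sourceRGBBlendFactor = MTLBlendFactorSourceAlpha;
pipelineStateDescriptor.colorAttachments[0].destinationRGBBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
With these factors, the fragment's alpha (e.g. the 0 returned by the fragment shader above) actually controls how transparent the drawn color is.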

Sketch App Paint Blending OpenGLES

Can anyone suggest why my low-opacity painting does this weird blending, while the SketchBookX app does it perfectly?
In both images attached, the vertical strokes on the left are done at full opacity and the strokes on the right at low opacity. The top image is mine, and as you can see the low-opacity strokes on the right turn an orange-red color and don't blend/mesh with the full-opacity strokes. The SketchBookX app, however, blends perfectly and maintains the same color.
I'm using glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and have tried many variations with no luck, so I'm starting to think there are other things that are giving me this problem.
Do I need to handle this problem in the fragment shader? I currently have this: gl_FragColor = color * rotatedTexture; I'm using PNGs for brush textures.
UPDATE: I'm getting the same results without using a texture: gl_FragColor = color;
I want it to be like mixing ink, not like mixing light :)

Alpha blending in Direct3D 9. Some of the primitives aren't rendering behind the texture

I enabled alpha blending in my game, but some of the primitives aren't rendering behind the transparent texture.
Here are my render states:
d3ddevice->SetRenderState(D3DRS_LIGHTING, true);
d3ddevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_CW);
d3ddevice->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
d3ddevice->SetRenderState(D3DRS_AMBIENT, D3DCOLOR_XRGB(15, 15, 15));
d3ddevice->SetRenderState(D3DRS_NORMALIZENORMALS, TRUE);
d3ddevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
d3ddevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
d3ddevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
d3ddevice->SetRenderState(D3DRS_BLENDOP, D3DBLENDOP_ADD);
d3ddevice->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
d3ddevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
d3ddevice->SetTextureStageState(0, D3DTSS_ALPHAOP, D3DTOP_MODULATE);
d3ddevice->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
d3ddevice->SetTextureStageState( 0, D3DTSS_TEXTURETRANSFORMFLAGS, D3DTTFF_DISABLE );
d3ddevice->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_NONE);
d3ddevice->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_NONE);
d3ddevice->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);
What happens here is that the transparent objects still get written to the depth buffer and therefore block objects which lie behind them from being rendered. There are (at least) two ways to solve this; rough sketches of both follow below.
Sort all transparent objects in your scene such that they are rendered from back to front. You need to do this in your code; D3D won't handle it for you. Here is an MSDN article on this.
It looks like all the objects you are rendering are either fully transparent or fully opaque at any given point. In that case there is a simpler way to deal with transparency: you can use alpha testing to reject all fragments which have 0 alpha (or alpha below a certain threshold, say 128). Here is an article on how to do this. If you are using shaders, you can use the discard command to throw away all transparent fragments.
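Minimal sketches of both options, assuming the same d3ddevice as in the question. Note that disabling depth writes in option 1 is a common companion technique, not something the render states above already do:
// Option 1: draw opaque geometry first, then draw transparent objects
// sorted back to front with depth writes disabled.
d3ddevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
// ... render sorted transparent objects ...
d3ddevice->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
// Option 2: alpha testing; reject fragments with alpha below the threshold (128 here).
d3ddevice->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
d3ddevice->SetRenderState(D3DRS_ALPHAREF, (DWORD)128);
d3ddevice->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);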

GLKit Doesn't draw GL_POINTS or GL_LINES

I am working hard on a new iOS game that is drawn only with procedurally generated lines. All is working well, except for a few strange hiccups with drawing some primitives.
I am at a point where I need to implement text, and the characters are set up to be a series of points in an array. When I go to draw the points (which are CGPoints) some of the drawing modes are working funny.
effect.transform.modelviewMatrix = matrix;
[effect prepareToDraw];
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, 0, 0, &points);
glDrawArrays(GL_POINTS, 0, ccc);
I am using this code to draw from the array, and when the mode is set to GL_LINE_LOOP or GL_LINE_STRIP all works well. But if I set it to GL_POINTS, I get a gpus_ReturnGuiltyForHardwareRestart error. And if I try GL_LINES, it just doesn't draw anything.
What could possibly be going on?
When you draw with GL_POINTS in ES2 or ES3, you need to specify gl_PointSize in the vertex shader or you'll get undefined behavior (ugly rendering on the device at best, the crash you're seeing at worst). The vertex shader GLKBaseEffect uses doesn't write gl_PointSize, so you can't use it with GL_POINTS. You'll need to implement your own shaders. (For a starting point, try the ones in the "OpenGL Game" template you get when creating a new Xcode project, or use the Xcode Frame Debugger to look at the GLSL that GLKBaseEffect generates.)
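For illustration, a minimal GLSL ES vertex shader that writes gl_PointSize (the attribute and uniform names here are hypothetical, not something GLKit provides):
attribute vec2 position;
uniform mat4 modelViewProjectionMatrix;
void main() {
    gl_Position = modelViewProjectionMatrix * vec4(position, 0.0, 1.0);
    gl_PointSize = 4.0; // must be written when rasterizing GL_POINTS in ES2
}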
GL_LINES should work fine as long as you're setting an appropriate width with glLineWidth() in client code.

glBlendFunc transparency in OpenGL with GLKit

I'm using GLKit for an iPad app. With this code I set up blending:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
It works fine, but when I try to get a screenshot the blend mode seems wrong. It doesn't matter if I use GLKit's snapshot or glReadPixels.
[Two screenshots were attached: the first shows the correct blending while working in the app; the second is the captured screenshot, where the blending looks wrong.]
Do I have to change the blend mode or something before I take the screenshot? And if so, to what?
The problem most likely lies in how the image is generated from the RGBA data. To solve it you will need to skip the alpha channel when creating the CGImage (kCGImageAlphaNoneSkipLast), or have the correct alpha values in the buffer in the first place.
To explain what is going on: your GL buffer consists of RGBA values, but only the RGB part is used to present it; when you create the image, however, the alpha channel is used as well, hence the difference. How it comes about is very simple. Let's take a single pixel somewhere in the middle of the screen and go through its events:
You clear the pixel to any color you want.
You overwrite the pixel (all 4 channels, RGBA) with a solid color received from a texture, for instance (.8, .8, .8, 1.0).
You draw a color over that pixel with some smaller alpha value and blend it, for instance (.4, .4, .4, .25). Your blend function says to multiply the source color by the source alpha and the destination by 1 - source alpha. That results in (.4, .4, .4, .25)*.25 + (.8, .8, .8, 1.0)*.75 = (.7, .7, .7, .8125).
Now the result (.7, .7, .7, .8125) is displayed nicely, because your buffer only presents the RGB part, so you see (.7, .7, .7, 1.0). But when you use all 4 components to create the image, you also use the .8125 alpha value, which is then used to blend the image itself. Therefore you need to skip the alpha part at some point.
There is another way: in your case there is really no need to store the alpha value in the render buffer at all, since your blend function only uses the source alpha. Therefore you may just disable alpha writes using glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE); this also means you need to clear the alpha value to 1.0 (glClearColor(x, x, x, 1.0)).
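As a sketch of the first fix (plain C with Core Graphics; width and height are assumed to come from your own capture code, and the vertical flip that glReadPixels output usually needs is omitted):
GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, pixels, width * height * 4, NULL);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// kCGImageAlphaNoneSkipLast makes CG ignore the 4th (alpha) byte of each pixel.
CGImageRef image = CGImageCreate(width, height, 8, 32, width * 4, colorSpace,
                                 kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault,
                                 provider, NULL, false, kCGRenderingIntentDefault);
// 'pixels' must stay valid for the lifetime of 'image' (no release callback was given).
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);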
