Blend overlapping triangles in OpenGL - iOS

I'm building a thick line from triangles. The problem I'm having is that when the curve is semi-transparent and some triangles overlap, I get the effect shown in the picture. I would like the triangles' alpha values not to accumulate where they overlap.
I'm using this blend function:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_CONSTANT_ALPHA, GL_CONSTANT_ALPHA);

You could render the curve to a separate render target at full opacity and then draw that target with your custom alpha. Otherwise you would have to avoid the overlap entirely.
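As a rough sketch of that idea in OpenGL ES 2.0 (curveFBO, curveColorTexture, defaultFBO, curveAlpha, drawCurve() and drawFullscreenQuad() are placeholder names for objects and helpers your app would already have, not anything from the question):
glBindFramebuffer(GL_FRAMEBUFFER, curveFBO);          // offscreen target with a colour texture attached
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);
glDisable(GL_BLEND);                                   // overlapping triangles simply overwrite each other
drawCurve();                                           // draw the whole curve fully opaque

glBindFramebuffer(GL_FRAMEBUFFER, defaultFBO);         // back to the main framebuffer
glEnable(GL_BLEND);
glBlendColor(0.0f, 0.0f, 0.0f, curveAlpha);            // the single alpha you want for the whole curve
glBlendFunc(GL_CONSTANT_ALPHA, GL_ONE_MINUS_CONSTANT_ALPHA);
glBindTexture(GL_TEXTURE_2D, curveColorTexture);       // the FBO's colour attachment
drawFullscreenQuad();                                  // composite the curve once, at that alpha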

You can use the stencil test to block drawing over fragments that have already been drawn (which prevents the blending from happening in the first place).
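For example (a sketch only; it assumes the framebuffer was created with a stencil attachment, and drawCurveTriangles() stands in for your existing draw call):
glClearStencil(0);
glClear(GL_STENCIL_BUFFER_BIT);

glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_EQUAL, 0, 0xFF);        // only pass where nothing has been drawn yet
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);  // mark every fragment that does get drawn

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawCurveTriangles();                    // overlapping fragments now blend with the background only once

glDisable(GL_STENCIL_TEST);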

Related

How to overlap image shapes in a DX9 shader?

I have 3 texture shapes that I want to overlap with each other but I haven't been able to figure out how to do it. I can easily make additive blending happen through:
color1 + color2 + color3;
But I cannot figure out how to do it without the additive effect. I understand I need to do "alpha blending" somehow, but I haven't gotten past finding formulas and code snippets that I cannot apply properly in a DX9 pixel shader.
There are other blending modes, but the basic alpha blending in a shader program is of the form
blended = lerp(bottom_layer, top_layer, value)
where blended equals top_layer when value is 1, equals bottom_layer when value is 0, and is a mix of the two when value is between 0 and 1.
Simply repeat the operation for each additional layer you want to blend in.
The choice of value depends on the application. To overlap shapes without mixing colors, use a masking shape for the value. For example:
blended = lerp( background, color_green, step(length(uv-0.5),0.5));
will draw a green circle on top of the background color without any color mixing.
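To make the formula concrete, here is the same arithmetic written out in plain C rather than HLSL (lerp and step are re-implemented to match their HLSL meaning; the single-channel colours and the UV sample are made-up values, just to show that the result is never additive):
#include <math.h>
#include <stdio.h>

/* C stand-ins for the HLSL intrinsics used above. */
static float lerpf(float a, float b, float t) { return a + (b - a) * t; }
static float stepf(float edge, float x)       { return x >= edge ? 1.0f : 0.0f; }

int main(void) {
    float background = 0.2f, green = 1.0f;   /* one colour channel, for brevity */
    float u = 0.6f, v = 0.5f;                /* a sample point inside the circle */

    /* Mask: 1 inside a radius-0.5 circle centred at (0.5, 0.5), 0 outside. */
    float dist = sqrtf((u - 0.5f) * (u - 0.5f) + (v - 0.5f) * (v - 0.5f));
    float mask = stepf(dist, 0.5f);          /* same as step(length(uv - 0.5), 0.5) */

    /* Inside the circle the result is exactly the green value; outside it is
       exactly the background. Nothing is ever added together. */
    float blended = lerpf(background, green, mask);
    printf("blended = %f\n", blended);       /* prints 1.000000 here */
    return 0;
}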

Texture Brush (Drawing Application ) Using Metal

I am trying to implement a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along the finger's position.
I am drawing with alpha 0.2. Where the squares overlap, the color accumulates. How can I draw the whole stroke at alpha 0.2?
I think you need to draw the brush squares to a separate texture, initially cleared to transparent, without blending. Then draw that whole texture to your view with blending.
If you draw the brush squares directly to the view, then they will accumulate. After you draw square 1, it's part of the image. Metal can no longer distinguish it from anything else that was already there. So, when you draw square 2 overlapping it, it will blend with what's already there, including square 1.

OpenGL point sprites with depth testing - a blending issue?

I am rendering point sprites (using OpenGL ES 2.0 on iOS) as a user's drawing strokes. I am storing these points in vertex buffer objects, so I need to perform depth testing in order for the sprites to appear in the correct order when they're submitted for drawing.
I'm seeing an odd effect when rendering these drawing strokes, as shown by the following screenshot:
Note the background-coloured 'border' around the edge of the blue stroke where it is drawn over the green. The user drew the blue stroke after the green stroke, but when the VBOs are redrawn the blue stroke gets drawn first. When it comes time to draw the green stroke, depth testing kicks in, sees that it should be behind the blue stroke, and does this with some success. It appears to me to be some kind of blending issue, or perhaps an incorrectly calculated colour in the fragment shader. The edges of all strokes should be transparent; however, the fragment shader appears to combine them with the background texture when processing those fragments.
In my app I have created a depth renderbuffer and called glEnable(GL_DEPTH_TEST) with glDepthFunc(GL_LEQUAL). I have experimented with glDepthMask() to no avail. Blending is set to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), and the point sprite colour uses premultiplied alpha values. The drawing routine is very simple:
Bind render-to-texture FBO.
Draw background texture.
Draw point sprites (from a number of VBOs).
Draw this FBO's texture to the main framebuffer.
Present the main framebuffer.
EDIT
Here is some code from the drawing routine.
Setup state prior to drawing:
glDisable(GL_DITHER);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);  // premultiplied alpha
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
Drawing routine:
[drawingView setFramebuffer:drawingView.scratchFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:[self textureForBackgroundType:self.backgroundType]];
glUseProgram(programs[PROGRAM_POINT_SPRITE]);
// ...
// Draw all VBOs containing point sprite data
// ...
[drawingView setFramebuffer:drawingView.defaultFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:drawingView.scratchTexture];
[drawingView presentFramebuffer:drawingView.defaultFramebuffer];
Thanks for any help.
If you want to draw non-opaque geometry, you have to z-sort it from back to front. This has been the only way to get proper blending for many years. These days there are algorithms for order-independent transparency, such as dual depth peeling, but they are not applicable to iOS.
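A sketch of the CPU-side part of that (the Sprite struct is made up for illustration): sort by depth before refilling the VBO, then draw the translucent sprites with depth testing on but depth writes off, which is the usual setup for sorted translucent geometry:
#include <stdlib.h>

/* Hypothetical per-sprite record; only z matters for the sort. */
typedef struct {
    float x, y, z;   /* z: depth into the scene (larger = further from the viewer) */
    float size;
} Sprite;

/* Back-to-front: the furthest sprite (largest z) comes first. */
static int compare_back_to_front(const void *a, const void *b) {
    float za = ((const Sprite *)a)->z;
    float zb = ((const Sprite *)b)->z;
    if (za > zb) return -1;
    if (za < zb) return  1;
    return 0;
}

void sort_sprites_for_blending(Sprite *sprites, size_t count) {
    qsort(sprites, count, sizeof(Sprite), compare_back_to_front);
    /* Upload the sorted array to the VBO, then draw with
       glDepthMask(GL_FALSE) so earlier (further) sprites never
       clip the edges of the later (nearer) ones. */
}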

Drawing lines with rounded endings with Direct3D

Is there any way to draw a line with round endings using ID3DXLine? I am trying to draw a curve from a number of line segments, but I am getting empty areas where the line segments connect.
Performance here is essential.
Thanks!
Is there any other fast way to draw a thick curved line using D3D?
You would be best off using a circular texture (with antialiasing around the edges), drawing one half of the texture at one end of the line, rendering a strip taken through the centre of the texture along the rectangle surrounding the line, and finishing off with the other half of the texture at the other end. This will give you the effect you are after, but it's a tad more involved than simply calling "DrawLine" or whatever...
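A sketch of the geometry described above (the Vec2/Vertex types and the exact quad layout are illustrative, not from the original answer): each segment becomes three quads, the two end quads sampling the left and right halves of the circular texture, and the body quad sampling the strip through its centre:
#include <math.h>

typedef struct { float x, y; } Vec2;
typedef struct { Vec2 pos; Vec2 uv; } Vertex;

static Vertex vert(float px, float py, float u, float v) {
    Vertex r = { { px, py }, { u, v } };
    return r;
}

/* Three quads of 4 corners each (left cap, body, right cap) for one thick
   segment from a to b with half-thickness w. The caller is expected to
   expand each quad into two triangles (or use an index buffer). */
void build_capped_segment(Vec2 a, Vec2 b, float w, Vertex out[12]) {
    float dx = b.x - a.x, dy = b.y - a.y;
    float len = sqrtf(dx * dx + dy * dy);
    float ux = dx / len, uy = dy / len;   /* unit direction along the line  */
    float nx = -uy,      ny = ux;         /* unit normal (across the line)  */

    /* Left cap: extends w behind a; u runs 0 -> 0.5 (left half of the circle). */
    out[0]  = vert(a.x - ux*w + nx*w, a.y - uy*w + ny*w, 0.0f, 0.0f);
    out[1]  = vert(a.x - ux*w - nx*w, a.y - uy*w - ny*w, 0.0f, 1.0f);
    out[2]  = vert(a.x + nx*w,        a.y + ny*w,        0.5f, 0.0f);
    out[3]  = vert(a.x - nx*w,        a.y - ny*w,        0.5f, 1.0f);

    /* Body: rectangle from a to b; u is pinned to 0.5, the strip through
       the centre of the circular texture. */
    out[4]  = vert(a.x + nx*w, a.y + ny*w, 0.5f, 0.0f);
    out[5]  = vert(a.x - nx*w, a.y - ny*w, 0.5f, 1.0f);
    out[6]  = vert(b.x + nx*w, b.y + ny*w, 0.5f, 0.0f);
    out[7]  = vert(b.x - nx*w, b.y - ny*w, 0.5f, 1.0f);

    /* Right cap: extends w beyond b; u runs 0.5 -> 1 (right half of the circle). */
    out[8]  = vert(b.x + nx*w,        b.y + ny*w,        0.5f, 0.0f);
    out[9]  = vert(b.x - nx*w,        b.y - ny*w,        0.5f, 1.0f);
    out[10] = vert(b.x + ux*w + nx*w, b.y + uy*w + ny*w, 1.0f, 0.0f);
    out[11] = vert(b.x + ux*w - nx*w, b.y + uy*w - ny*w, 1.0f, 1.0f);
}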

XNA Layered Sprite problem

I have a game object that manages several sprite objects. Each of the sprites overlaps the others a bit, and drawing them looks just fine when they are at 100% opacity. If I set their opacity to, say, 50%, that is when it all goes to pot, because any overlapping area is no longer 50% opaque due to the multiple layers.
EDIT: Ooops! For some reason I thought that I couldn't upload images. Anyway....
http://postimage.org/image/2fhcmn6s/ --> Here it is. Guess I need more rep for proper inclusion.
From left to right:
1. Multiple sprites, 100% opacity. Great!
2. Both are 50%, but notice how the overlap region distinguishes them as two sprites.
3. This is the desired behavior. They are 50% opaque, but in terms of the composite image.
What is the best way to mitigate this problem? Is a render target a good idea? What if I have hundreds of these 'multi-sprites'?
Hope this makes sense. Thanks!
Method 1:
If you care about the individual opacity of each sprite, then render each sprite over the background into a render-target texture of the same size, using 50% (or whatever opacity you want that sprite to have against the background). Then draw this render target with 100% opacity.
In this way, all sprites will be blended against the background only, and other sprites will be ignored.
Method 2:
If you don't care about setting the individual opacity of each sprite, then you can just draw all sprites with 100% opacity to a rendertarget. Then draw that render target over your background at 50% opacity.
Performance concerns:
I mentioned two examples of drawing to rendertargets, each for a different effect.
Method 1:
You want to be able to specify a different opacity for each sprite.
If so, you need to render every sprite to a rendertarget and then draw that rendertarget texture to the final texture. Effectively, this is the same cost as drawing twice as many sprites as you need. In this case, that's 400 draw calls, which can be very expensive.
If you batch the calls though, and use a single large rendertarget for all of the sprites, you might get away with just 2 draw calls (depending on how big your sprites are, and the max size of a texture).
Method 2:
You don't need different opacity per each sprite.
In this case you can almost certainly get away with just 2 draw calls, regardless of sprite size.
Just batch all draw calls of the sprites (with 100% opacity) to draw to a rendertarget. That's one draw call.
Now draw that rendertarget on top of your background image with the desired opacity (e.g. 50% opacity), and all sprites will have this opacity.
This case is easier to implement.
The first thing your example images reminded me of is the "depth-buffer and translucent surfaces" problem.
In a 3D game you must sort your translucent surfaces from back to front and draw them only after you have rendered the rest of your scene - all with depth reading and writing turned on. If you don't do this you end up with your 3rd image, whereas what you normally want is your 2nd image, with the glass translucent over the top of whatever is behind it.
But you want the 3rd image - with some transparent surfaces obscuring other ones - so you could just deliberately cause this depth problem!
To do this you need to turn on depth reads and writes and set your depth function so that a second sprite drawn at the same depth as a previously drawn sprite does not render.
To achieve this in XNA 4.0 you need to pass, to SpriteBatch.Begin, a DepthStencilState with its DepthBufferFunction set to CompareFunction.Less (by default it is less-than-or-equal-to) and DepthBufferEnable and DepthBufferWriteEnable set to true.
There may be interactions with the sprite's layerDepth parameter (I cannot remember how it maps to depth by default).
You may also need to use BasicEffect as your shader for your sprite batch - specifically so you can set a projection matrix with appropriate near and far planes. This article explains how to do that. And you may also need to clear your depth buffer before hand.
Finally, you need to draw your sprites in the correct order - with the unobscured sprite first.
I am not entirely sure if this will work and if it will work reliably (perhaps you will get some kind of depth fighting issue, I am not sure). But I think it's worth a try, given that you can leave your rendering code essentially normal and just adjust your render state.
You should try the stuff in Andrew's answer first, but if that doesn't work, you could still render all of the sprites (assuming they all have the same opacity) onto a RenderTarget2D with 100% opacity, and then render that RenderTarget2D to the screen with 50%.
Something like this in XNA 4.0:
RenderTarget2D rt = new RenderTarget2D(graphicsDevice,
    graphicsDevice.PresentationParameters.BackBufferWidth,
    graphicsDevice.PresentationParameters.BackBufferHeight);

graphicsDevice.SetRenderTarget(rt);
// Draw sprites here at 100% opacity
graphicsDevice.SetRenderTarget(null);

// Then draw rt (which is also a Texture2D) at 50% opacity. For example:
spriteBatch.Begin();
spriteBatch.Draw(rt, Vector2.Zero, Color.White * 0.5f); // 50% white, premultiplied, as XNA 4.0 expects
spriteBatch.End();
