Texture in a Triangle - iOS

Is it possible to do texturing inside three vertices (a triangle)? I am drawing a triangle, and inside the fragment shader, within the triangular area, I want to read the previously rendered pixel using a sampler and then write a color: if a previous color is available I need to clear it, and if color is available I need to add color to the pixel.
Inside a quad we can do this by sampling the texture.

Related

Texture Brush (Drawing Application) Using Metal

I am trying to implement a Metal-backed drawing application where brushstrokes are drawn on an MTKView by stamping a textured square repeatedly along the finger's position.
I am drawing each square with alpha 0.2. Where the squares overlap, the color is added up. How can I draw the whole stroke at alpha 0.2?
I think you need to draw the brush squares to a separate texture, initially cleared to transparent, without blending. Then draw that whole texture to your view with blending.
If you draw the brush squares directly to the view, then they will accumulate. After you draw square 1, it's part of the image. Metal can no longer distinguish it from anything else that was already there. So, when you draw square 2 overlapping it, it will blend with what's already there, including square 1.
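A minimal sketch of that two-pass approach, assuming you already have a command buffer, a render pass descriptor for the drawable, and pipeline states for the brush quads and for a full-screen textured quad (the function and parameter names below are illustrative, not from the question):
import Metal

// Offscreen texture the brush squares are stamped into.
func makeStrokeTexture(device: MTLDevice, width: Int, height: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                        width: width,
                                                        height: height,
                                                        mipmapped: false)
    desc.usage = [.renderTarget, .shaderRead]
    return device.makeTexture(descriptor: desc)
}

func drawStroke(commandBuffer: MTLCommandBuffer,
                strokeTexture: MTLTexture,
                brushPipeline: MTLRenderPipelineState,     // blending disabled
                compositePipeline: MTLRenderPipelineState, // blending enabled
                drawableDescriptor: MTLRenderPassDescriptor) {
    // Pass 1: stamp the brush squares into the offscreen texture,
    // which starts out cleared to fully transparent.
    let strokePass = MTLRenderPassDescriptor()
    strokePass.colorAttachments[0].texture = strokeTexture
    strokePass.colorAttachments[0].loadAction = .clear
    strokePass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
    strokePass.colorAttachments[0].storeAction = .store
    if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: strokePass) {
        encoder.setRenderPipelineState(brushPipeline)
        // ... draw the textured squares along the finger positions ...
        encoder.endEncoding()
    }

    // Pass 2: composite the whole stroke texture onto the view once, with blending.
    if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: drawableDescriptor) {
        encoder.setRenderPipelineState(compositePipeline)
        encoder.setFragmentTexture(strokeTexture, index: 0)
        // ... draw a full-screen quad sampling strokeTexture ...
        encoder.endEncoding()
    }
}
In this sketch the brush pipeline would have colorAttachments[0].isBlendingEnabled set to false, while the composite pipeline enables blending (for example source-alpha / one-minus-source-alpha), so overlapping squares within one stroke never accumulate and the 0.2 alpha is applied once, when the stroke texture is drawn onto the view.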

Metal. Why does setting MTLCullMode to none turn off depth comparison?

I am rendering a simple box:
MDLMesh(boxWithExtent: ...)
In my draw loop when I turn off back-face culling:
renderCommandEncoder.setCullMode(.none)
All depth comparison is disabled and the sides of the box are drawn completely wrong, with back-facing quads in front of front-facing ones.
Huh?
My intent is to include back-facing surfaces in the depth comparison, not ignore them. This is important when I have, for example, a shape with semi-transparent textures that reveal the shape's internals, which have a different shading style. How do I force depth comparison?
UPDATE
So Warren's suggestion is an improvement but it is still not correct.
My depthStencilDescriptor:
let depthStencilDescriptor = MTLDepthStencilDescriptor()
depthStencilDescriptor.depthCompareFunction = .less
depthStencilDescriptor.isDepthWriteEnabled = true
depthStencilState = device.makeDepthStencilState(descriptor: depthStencilDescriptor)
Within my draw loop I set depth stencil state:
renderCommandEncoder.setDepthStencilState(depthStencilState)
The resultant rendering
Description: this is a box mesh. Each box face uses a shader that paints a disk texture. The texture is transparent outside the body of the disk. The shader paints a red/white spiral texture on front-facing quads and a blue/black spiral texture on back-facing quads. The box sits in front of a camera-aligned quad textured with a mobil image.
Notice how one of the textures paints over the rear back-facing quad with the background texture color. Notice also that the rear-most back-facing quad is not drawn at all.
Actually, it is not possible to achieve the effect I am after. I basically want to do a simple composite (Porter/Duff) here, but that is order dependent. Order cannot be guaranteed here, so I am basically hosed.
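For reference, the Porter/Duff "over" compositing operator referred to here, written for premultiplied alpha, is:
C_out = C_src + (1 - A_src) * C_dst
A_out = A_src + (1 - A_src) * A_dst
Each incoming fragment blends against whatever is already in the framebuffer, so the result depends on the order in which the transparent faces happen to be rasterized; without sorting them back to front the composite comes out wrong.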

Considering the Stencil in depth

I'm using an orthographic projection to draw my objects.
Each object's items are added to different buffers and drawn in several passes.
Let's say that each object has an outline square and a fill for the square (in a different color).
So I'm drawing all the fills first, and then the outlines.
I'm using the depth buffer to make sure that the outlines will not be drawn over all the fills, as shown in the picture.
Now I'm facing a problem: each object contains another drawing item on it (such as text or points) which can extend beyond its square. So I'm using the stencil buffer to clip this additional drawing to the square. However, when doing this the depth buffer is not taken into account.
This means that one object's text can be drawn over another object's square, as shown below.
Is there any way or trick to make this work?
You should be able to set the stencil buffer to a different value for each of the squares (provided there are <= 255 squares, as you won't be able to get more than an 8-bit stencil buffer). Configure the stencil operation to KEEP for pixels that fail the depth test, so that any stencil values written by quads that are further in front but were drawn earlier are retained.
This will allow clipping each text individually.
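A sketch of that configuration, assuming Metal (the question does not name the API; with OpenGL ES the equivalent configuration uses glStencilFunc and glStencilOp). All function names here are illustrative:
import Metal

// When drawing square N: write N into the stencil buffer, but only where the
// depth test passes; KEEP preserves IDs already written by squares in front.
func makeSquareStencilState(device: MTLDevice) -> MTLDepthStencilState? {
    let stencil = MTLStencilDescriptor()
    stencil.stencilCompareFunction = .always
    stencil.stencilFailureOperation = .keep
    stencil.depthFailureOperation = .keep        // depth fail: keep the existing square ID
    stencil.depthStencilPassOperation = .replace // depth pass: write this square's ID

    let desc = MTLDepthStencilDescriptor()
    desc.depthCompareFunction = .less
    desc.isDepthWriteEnabled = true
    desc.frontFaceStencil = stencil
    desc.backFaceStencil = stencil
    return device.makeDepthStencilState(descriptor: desc)
}

// When drawing square N's text: only touch pixels whose stencil value equals N,
// so the text is clipped to the visible part of its own square.
func makeTextStencilState(device: MTLDevice) -> MTLDepthStencilState? {
    let stencil = MTLStencilDescriptor()
    stencil.stencilCompareFunction = .equal
    stencil.stencilFailureOperation = .keep
    stencil.depthFailureOperation = .keep
    stencil.depthStencilPassOperation = .keep

    let desc = MTLDepthStencilDescriptor()
    desc.depthCompareFunction = .always   // the stencil, not the depth test, does the clipping
    desc.isDepthWriteEnabled = false
    desc.frontFaceStencil = stencil
    desc.backFaceStencil = stencil
    return device.makeDepthStencilState(descriptor: desc)
}

// Per square: call encoder.setStencilReferenceValue(UInt32(squareID)) before
// drawing the fill, and again before drawing that square's text.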
Another way is to use only the depth buffer and pass the pixel extents of the current quad into the text pixel shader, where you can discard any extra pixels. This requires fewer state changes.

Why are there dark edges / halos between transparent and opaque areas of textured surfaces (in OpenGL ES 2.0 fragment shaders)?

I'm using a PNG texture image to control the opacity of fragments in an OpenGL ES 2.0 shader (on iOS). The result I am after is light grey text on top of my scene's medium grey background (the fragment shader is applied to a triangle strip in the scene). The problem is that there are dark edges around my text -- they look like artifacts. I'm using PNG transparency for the alpha information -- but I'm open to other approaches. What is going on and how can I do this correctly?
First, look at this answer regarding PNG transparency and premultiplied alpha. Long story short, the pixels in the PNG image that have less than 100% opacity are being premultiplied, so they are in effect getting darker as they get more transparent. Hence the dark artifacts around the edges.
Even without PNG and premultiplied transparency, you may still run into the problem if you forget to set your fragment shader's color before applying transparency.
A solution to this problem (where you want text to be a light grey color, and everything in the texture map that's not text to be transparent) would be to create a texture map where your text is white and the background is black.
This texture map will control the alpha of your fragment. The RGB values of your fragment will be set to your light grey color.
For example:
// color used for text
lowp vec4 textColor = vec4(.82,.82,.82,1.0);
gl_FragColor = textColor;
// greyscale texture passed in as uniform
lowp vec4 alphaMap = texture2D(u_alpha_texture,v_texture);
// set the alpha using the texture
gl_FragColor.w = alphaMap.x;
In cases where your color texture varies, this approach would require two separate texture map images (one for the color, and one for the alpha). Clearly, this is less efficient than dealing with one PNG that has alpha transparency baked in. However, in my experience it is a good tradeoff (premultiplied pixels can be counter-intuitive to deal with, and the other approaches to loading PNG transparency without premultiplication introduce added complexity).
An upside to this approach is that you can vary the color of the text independently of the texture map image. For instance, if you wanted red text, you could change the textColor value to:
lowp vec4 textColor = vec4(1.0,0.0,0.0,1.0);
You could even vary the color over time, etc, all independently of the alpha. That's why I find this approach to be flexible.

OpenGL point sprites with depth testing - a blending issue?

I am rendering point sprites (using OpenGL ES 2.0 on iOS) as a user's drawing strokes. I am storing these points in vertex buffer objects such that I need to perform depth testing in order for the sprites to appear in the correct order when they're submitted for drawing.
I'm seeing an odd effect when rendering these drawing strokes, as shown by the following screenshot:
Note the background-coloured 'border' around the edge of the blue stroke where it is drawn over the green. The user drew the blue stroke after the green stroke, but when the VBOs are redrawn the blue stroke gets drawn first. When it comes to drawing the green stroke, depth testing kicks in, sees that it should be behind the blue stroke, and handles that with some success. It appears to me to be some kind of blending issue, or something to do with incorrectly calculating the colour in the fragment shader. The edges of all strokes should be transparent; however, it appears that the fragment shader combines them with the background texture when processing those fragments.
In my app I have created a depth renderbuffer, called glEnable(GL_DEPTH_TEST), and set glDepthFunc(GL_LEQUAL). I have experimented with glDepthMask() to no avail. Blending is set to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), and the point sprite colour uses premultiplied alpha values. The drawing routine is very simple:
Bind render-to-texture FBO.
Draw background texture.
Draw point sprites (from a number of VBOs).
Draw this FBO's texture to the main framebuffer.
Present the main framebuffer.
EDIT
Here is some code from the drawing routine.
Setup state prior to drawing:
glDisable(GL_DITHER);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
Drawing routine:
[drawingView setFramebuffer:drawingView.scratchFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:[self textureForBackgroundType:self.backgroundType]];
glUseProgram(programs[PROGRAM_POINT_SPRITE]);
// ...
// Draw all VBOs containing point sprite data
// ...
[drawingView setFramebuffer:drawingView.defaultFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:drawingView.scratchTexture];
[drawingView presentFramebuffer:drawingView.defaultFramebuffer];
Thanks for any help.
If you want to draw non-opaque geometry you have to z-sort it from back to front. This has been the only way to get proper blending for many years. These days there are some algorithms for order-independent transparency, like Dual Depth Peeling, but they are not applicable to iOS.
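A minimal sketch of that back-to-front ordering, assuming each stroke's point-sprite VBO can be tagged with a representative depth. The Stroke type and its properties are illustrative, not from the question:
import OpenGLES

// Illustrative record: one VBO per stroke plus the depth it was assigned when drawn.
struct Stroke {
    let vertexBuffer: GLuint   // handle returned by glGenBuffers for this stroke
    let pointCount: GLsizei
    let depth: Float           // larger = further from the camera in this sketch
}

// Draw the strokes back to front so blending composites them in the order
// the user drew them, instead of relying on the depth test to resolve overlap.
func drawSorted(strokes: [Stroke]) {
    for stroke in strokes.sorted(by: { $0.depth > $1.depth }) {
        glBindBuffer(GLenum(GL_ARRAY_BUFFER), stroke.vertexBuffer)
        // ... set up the point-sprite attribute pointers as in the existing code ...
        glDrawArrays(GLenum(GL_POINTS), 0, stroke.pointCount)
    }
}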

Resources