If you want to achieve a blending of textures with transparency (like PNG) that is similar to UIKit, how do you configure OpenGL ES 1.1 appropriately?
I found:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_ALPHA_TEST);
But there is also:
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);
You should use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. The other one does not respect the alpha channel at all in this case; it will simply sum the source and destination colours for you.
There is one thing you should note though. This works great for the colour channels, but your destination alpha channel will not be correct. In most cases you do not use it, but if you want to read the image back from the buffer with its alpha channel intact, you will need it; the same goes for rendering to an FBO and reusing the texture with transparency. In that case you should draw the alpha channel separately using glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE). This means doubling the draw calls (at least I can't think of a better solution without shaders). To draw to the colour channels only, set glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE); to draw to alpha only, use glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE).
To sum up:
glEnable(GL_BLEND);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
//draw the scene
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_ONE);
//draw the scene
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE); //restore the colour mask
glDisable(GL_ALPHA_TEST);
glDisable(GL_BLEND);
Related
I am trying to draw a texture in OpenGL ES 2.0 using GL_POINTS by applying a stencil buffer. The stencil buffer should come from a texture. I am rendering the results to another texture and then presenting the texture to screen. Here is my code for rendering to texture:
//Initialize buffers, initialize texture, bind frameBuffer
.....
glClearStencil(0);
glClear (GL_STENCIL_BUFFER_BIT);
glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE );
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glBindTexture(GL_TEXTURE_2D, stencil);
glUseProgram(program[PROGRAM_POINT].id);
glDrawArrays(GL_POINTS, 0, (int)vertexCount);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glStencilFunc(GL_NEVER, 0, 1);
glStencilOp(GL_REPLACE, GL_KEEP, GL_KEEP);
glBindTexture(GL_TEXTURE_2D, texture);
glUseProgram(program[PROGRAM_POINT].id);
glDrawArrays(GL_POINTS, 0, (int)vertexCount);
glDisable(GL_STENCIL_TEST);
....
//Render texture to screen
The result I am getting is just my texture being drawn without any masking applied from the stencil. I had a few questions regarding this issue:
Is it possible to use a stencil buffer with GL_POINTS?
Is it possible to use a stencil buffer when rendering to a texture?
Does the stencil texture have to have any special properties (solid colour, internal format...etc)?
Are there any apparent mistakes with my code?
This is the result I am looking for:
UPDATE:
My problem, as pointed out by the selected answer, was primarily that I did not attach the stencil to the stencil attachment of the FBO:
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT,
GL_RENDERBUFFER, stencilBufferId);
I did not know that it was required when rendering to a texture. Secondly I was not using the proper stencil test.
glStencilFunc(GL_EQUAL, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
Did the job.
Addressing your questions in order:
Is it possible to use a stencil buffer with GL_POINTS?
Yes. The stencil test is applied to all fragments, regardless of the primitive type rendered. The only case where you write to the framebuffer without the stencil test being applied is glClear().
Is it possible to use a stencil buffer when rendering to a texture?
Yes. However, when you render to a texture using an FBO, the stencil buffer of your default framebuffer will not be used. You have to create a stencil renderbuffer, and attach it to the stencil attachment of the FBO:
GLuint stencilBufferId = 0;
glGenRenderbuffers(1, &stencilBufferId);
glBindRenderbuffer(GL_RENDERBUFFER, stencilBufferId);
glRenderbufferStorage(GL_RENDERBUFFER, GL_STENCIL_INDEX8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT,
GL_RENDERBUFFER, stencilBufferId);
Does the stencil texture have to have any special properties (solid colour, internal format...etc)?
OpenGL ES 2.0 does not have stencil textures. You have to use a renderbuffer as the stencil attachment, as shown in the code fragment above. GL_STENCIL_INDEX8 is the only renderbuffer format that can be used as a stencil attachment. ES 3.0 supports depth/stencil textures.
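If you later move to ES 3.0, you can attach a combined depth/stencil texture instead of a renderbuffer. A rough sketch, assuming an ES 3.0 context (the variable names are illustrative):
GLuint depthStencilTex = 0;
glGenTextures(1, &depthStencilTex);
glBindTexture(GL_TEXTURE_2D, depthStencilTex);
//allocate a combined 24-bit depth / 8-bit stencil texture
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, width, height, 0,
             GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);
//attach it to the combined depth/stencil attachment point of the FBO
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                       GL_TEXTURE_2D, depthStencilTex, 0);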
Are there any apparent mistakes with my code?
Maybe. One thing that looks slightly odd is that you never really apply a stencil test in the code that is shown. While you do enable the stencil test, you only use GL_ALWAYS and GL_NEVER for the stencil function. As the names suggest, these functions either always or never pass the stencil test. So you don't let fragments pass/fail depending on the stencil value. I would have expected something like this before the second draw call:
glStencilFunc(GL_EQUAL, 1, 1);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
This would only render the fragments where the current stencil buffer value is 1, which corresponds to the fragments you rendered with the previous draw call.
In a paint app I am developing, I want my user to be able to draw with a transparent brush; for example, black paint over a white background should result in a grey colour. As more paint is applied, the resulting colour should get closer to black.
However, no matter how many times I draw over the same place, the resulting colour never turns black; in fact it stops changing after a few lines. Photoshop says that the alpha of the blob drawn on the left in OpenGL is at most 0.8, where I expect it to be 1.
My app works by drawing series of stamps as in Apple's GLPaint sample to form a line. The stamps are blended with the following function:
glBlendFuncSeparate(GL_ONE, GL_ONE_MINUS_SRC_ALPHA, GL_ONE_MINUS_DST_ALPHA, GL_ONE);
glBlendEquation(GL_FUNC_ADD);
My fragment shader:
uniform lowp sampler2D u_texture;
varying highp vec4 f_color;
void main(){
gl_FragColor = texture2D(u_texture, gl_PointCoord).aaaa*f_color*vec4(f_color.aaa, 1);
}
How should I configure the blending in order to get full colour when drawing repeatedly?
Update 07/11/2013
Perhaps I should also note that I first draw to a texture, and then draw the texture onscreen. The texture is generated using the following code:
glGenFramebuffers(1, &textureFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, textureFramebuffer);
glGenTextures(1, &drawingTexture);
glBindTexture(GL_TEXTURE_2D, drawingTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, pixelWidth, pixelHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
Update 02/12/2013
I tried modifying Apple's GLPaint program, and it turned out that this behaviour is observable only on iOS 7. As can be seen in the screenshots below, the colours on iOS 7 are a bit pale and don't blend nicely. The
GL_ONE, GL_ONE_MINUS_SRC_ALPHA
blend function works well on iOS 6. Can this behaviour be caused by iOS 7's implementation of CALayer or something else, and how do I solve it?
Update 10/07/2014
Apple recently updated their GLPaint sample for iOS7 and the issue is observable there, too. I made a separate thread based on their code: Unmodified iOS7 Apple GLPaint example blending issue
Just because your brush does "darken" the image doesn't mean that this is subtractive blending. This is in fact regular additive blending, where the black brush merely overdraws the picture. You want a (GL_ONE, GL_ONE_MINUS_SRC_ALPHA) blending function (non-separated). The brush is contained only in the alpha channel of the texture; there are no color channels in the texture, and the brush color is determined by glColor or an equivalent uniform.
Using the destination alpha value is not required in most cases.
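A minimal sketch of that setup, assuming the brush stamp lives in the texture's alpha channel and the brush colour comes from a uniform (the uniform and texture names here are made up for illustration):
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); //non-separated, premultiplied-style blending
//in the fragment shader, output the premultiplied brush colour, e.g.:
//gl_FragColor = vec4(u_brushColor.rgb, 1.0) * texture2D(u_texture, gl_PointCoord).a;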
I want to render a YUV image on an iOS device. I presume it can be achieved using OpenGL. (Actually I have to render multiple such images in succession.)
What I understand is that GLKit is an abstraction that iOS created, in which there is a GLKView that owns and handles the render buffer. I am currently using a GLKViewController, and frame updates are happening successfully at the desired fps. I confirm this using a glClear call.
Now the task is to render an image on the view.
There is a class GLKBaseEffect which provides basic shaders. I can't figure out what properties to set, so I just create it and call prepareToDraw before each render.
There is a class for handling textures, GLKTextureLoader, but it appears to me that it only works with Quartz images, i.e., yuv420 planar can't be loaded into the texture using this class.
// create the texture
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, [imageNSData bytes]);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);
I use this code for generating a texture and binding it, but I don't really know what I am trying to do here. And whatever it is, it doesn't bring up any image on screen, and I don't know what to do next.
I have not created any shaders, assuming baseEffect will have something.
And this https://developer.apple.com/library/ios/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html#//apple_ref/doc/uid/TP40008793-CH103-SW8 tells me that I'll have to use CAEAGLLayer to render images on screen.
Can I use GLKit to render images? If YES, is there any sample code or tutorial that doesn't use the GLKTextureLoader class (I couldn't find any)? If NO, is there a similar tutorial for rendering using CAEAGLLayer (I have not explored it yet)?
It sounds like you're really asking about a few different topics here:
How to draw with GLKView vs CAEAGLLayer
How GLKBaseEffect and GLKTextureLoader fit into OpenGL ES drawing in general
How to draw a texture once you have one
How to render YUV image data
I'll try to address each in turn...
GLKView is just fine for any OpenGL ES drawing you want to do -- it does everything that the older documentation you linked to does (setting up framebuffers, CAEAGLLayers, etc) for you so you don't have to write that code. Inside the GLKView drawing method (drawRect: or glkView:drawInRect:, depending on whether you're drawing in a subclass or delegate), you write the same OpenGL ES drawing code you would for CAEAGLLayer (or any other system).
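For instance, a minimal delegate-style drawing method might look like this (just a sketch; it assumes the GLKView's context has already been created and set):
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    //... issue the same OpenGL ES drawing calls you would use with a CAEAGLLayer-based setup
}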
You can use GLKView, GLKTextureLoader, and GLKBaseEffect independently of each other. If you want to write all your own drawing code and use your own shaders, you can draw in a GLKView without using GLKBaseEffect. (You can even mix and match GLKBaseEffect and your own stuff, like you see when you create a new Xcode project with the OpenGL Game template.) Likewise, GLKTextureLoader loads image data and spits out the name you'll need for binding it for drawing, and you can use that regardless of whether you're drawing it with GLKBaseEffect.
Once you get a texture, whether via GLKTextureLoader or reading/decoding the data yourself and providing it to glTexImage2D, there are three basic steps to drawing with it:
Bind the texture name with glBindTexture.
Draw some geometry to be textured (using glDrawArrays, glDrawElements, or similar)
Have a fragment shader that looks up texels and outputs colors.
If you just want to draw an image that fills your view, just draw a quad (two triangles). Here's the code I use to set up a vertex array object with one quad when I want to draw fullscreen:
typedef struct __attribute__((packed)) {
GLKVector3 position;
GLKVector2 texcoord;
} Vertex;
static Vertex vertexData[] = {
{{-1.0f, 1.0f, -1.0f}, {0.0f, 0.0f}},
{{-1.0f, -1.0f, -1.0f}, {0.0f, 1.0f}},
{{ 1.0f, 1.0f, -1.0f}, {1.0f, 0.0f}},
{{ 1.0f, -1.0f, -1.0f}, {1.0f, 1.0f}},
};
glGenVertexArraysOES(1, &_vertexArray);
glBindVertexArrayOES(_vertexArray);
glGenBuffers(1, &_vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexData), vertexData, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)offsetof(Vertex, position));
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (GLvoid *)offsetof(Vertex, texcoord));
glBindVertexArrayOES(0);
Then, to draw it:
glBindVertexArrayOES(_vertexArray);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
That's just the geometry-drawing part. Combine this with a GLKBaseEffect -- whose transform property is the default identity transform, and whose texture2d0 property is set up with the name of a texture you've loaded via GLKTextureLoader or other means -- and you'll get a view-filling billboard with your texture on it.
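A minimal sketch of that combination (assuming self.effect is a GLKBaseEffect and textureName holds the texture name you got from GLKTextureLoader or glGenTextures):
self.effect.texture2d0.enabled = GL_TRUE;
self.effect.texture2d0.name = textureName;
[self.effect prepareToDraw];
glBindVertexArrayOES(_vertexArray);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);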
Finally, the YUV part... for which I'll mostly punt. Where are you getting your YUV texture data? If it's from the device camera, you should look into CVOpenGLESTexture/CVOpenGLESTextureCache, as covered by this answer. Regardless, you should be able to handle YUV textures using the APPLE_rgb_422 extension, as covered by this answer. You can also look into this answer for some help writing fragment shaders to process YUV to RGB on the fly.
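For reference, here is a rough sketch (mine, not taken from those answers) of a fragment shader doing a BT.601 video-range YUV-to-RGB conversion, assuming the luma plane and the interleaved chroma plane were uploaded as two separate textures (GL_LUMINANCE and GL_LUMINANCE_ALPHA respectively); the uniform and varying names are made up for illustration:
uniform lowp sampler2D u_textureY;  //luma plane
uniform lowp sampler2D u_textureUV; //interleaved CbCr plane: U in .r, V in .a
varying highp vec2 v_texcoord;
void main(){
    mediump float y = texture2D(u_textureY, v_texcoord).r;
    mediump vec2 uv = texture2D(u_textureUV, v_texcoord).ra - vec2(0.5);
    y = 1.1643 * (y - 0.0625); //expand video-range luma
    gl_FragColor = vec4(y + 1.5958 * uv.y,                  //R
                        y - 0.39173 * uv.x - 0.8129 * uv.y, //G
                        y + 2.017 * uv.x,                   //B
                        1.0);
}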
I started learning OpenGL this weekend, and discovered quite a learning curve. Most things I've managed to grapple through, but now I'm stuck...
I have created an array of vertices. Each vertex (vertexT) consists of 3 vectors (position, normal and colour). Each vector (GLKVector3) is a triple of floats (i.e., x,y,z or r,g,b). Since GLKVector3 is defined to be applicable to colours, I am assuming that OpenGL is happy to work with colour values that do not specify a fourth float (i.e., alpha).
My function to setup my gl objects looks like this:
glBindVertexArrayOES(_vertexArrayObject);
glGenBuffers(1, &_vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexT) * _vertexCount, [_vertexData mutableBytes], GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, sizeof(vertexT), BUFFER_OFFSET(0));
glEnableVertexAttribArray(GLKVertexAttribNormal);
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, sizeof(vertexT), BUFFER_OFFSET(sizeof(GLKVector3)));
So far so good. I'm not using the color part of the interleaved array, and the whole object renders as white, using the following calls in my draw function:
glBindVertexArrayOES(_vertexArrayObject);
glDrawElements(GL_TRIANGLES, _triangleCount * 3, GL_UNSIGNED_SHORT, [_triangleData mutableBytes]);
So now I want to set up a per-vertex color for my model, so I added the following:
glEnableVertexAttribArray(GLKVertexAttribColor);
glVertexAttribPointer(GLKVertexAttribColor, 3, GL_FLOAT, GL_FALSE, sizeof(vertexT), BUFFER_OFFSET(sizeof(GLKVector3)*2));
But it is still white. I managed to find a question on SO that sounded like my problem, but the offered solution was to call glEnable with GL_COLOR_MATERIAL, and as far as I can tell this constant is not valid in OpenGL ES (according to the SDK page at Khronos).
I'm sure it is something simple. But I'm not seeing it. A little help?
Eventually found a way to enable color materials in GLKit.
This line does the trick:
self.effect.colorMaterialEnabled = GL_TRUE;
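A sketch of how this fits into the draw, using the same objects from the setup code above (assuming self.effect is the GLKBaseEffect used for rendering):
self.effect.colorMaterialEnabled = GL_TRUE;
[self.effect prepareToDraw];
glBindVertexArrayOES(_vertexArrayObject);
glDrawElements(GL_TRIANGLES, _triangleCount * 3, GL_UNSIGNED_SHORT, [_triangleData mutableBytes]);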
I'm looking for a way to mask my entire viewport using a texture. In my case I would like to use a checkered black-and-white pattern (or any 2 colors) and only show the parts of the scene where the pattern is black.
Would the best way to do this be with a clipping mask, a fragment shader, or alpha blending? I've seen this post on SO: How to create Stencil buffer with texture (Image) in OpenGL-ES 2.0, which seems similar to what I need, but I don't completely understand what to do with the discard keyword. Would it apply to my situation?
Let's assume you have a checkered texture of black and white squares. First, you'll want to set up the stencil test to draw the mask:
glEnable(GL_STENCIL_TEST);
glDisable(GL_DEPTH_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
glStencilFunc(GL_ALWAYS, 1, -1);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
Next, draw the checkered texture. The key step here (where discard comes in) is to set up your fragment shader to draw only the fragments where the checkerboard is black (or flip it for white if you prefer). The discard keyword skips rendering of any pixels that don't match your criteria.
//in the fragment shader
if(sampleColor.r > 0.5) { discard; }
After this rendering step you will have a stencil buffer with a checkerboard image where half of the buffer has a stencil value of 0 and the other half has a stencil value of 1.
Then render normally with stencil test enabled to pass when the stencil buffer is == 1.
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glStencilFunc(GL_EQUAL, 1, -1);
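After switching the test to GL_EQUAL, also restore the write masks that were disabled for the mask pass before drawing the scene. A minimal sketch (it assumes you want depth writes and depth testing back on):
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);
glEnable(GL_DEPTH_TEST);
//draw the scene; only fragments where the stencil value equals 1 will be written
glDisable(GL_STENCIL_TEST); //once the masked rendering is done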