In iOS OpenGL, how can one create a blend mode similar to kCGBlendModeDarken?

On a white background, drawing lines using kCGBlendModeDarken darkens the areas where different colors overlap, like this:
I am trying to recreate this using OpenGL in iOS, but I don't know how to set the glBlendFunc parameters to achieve it. The OpenGL documentation is hard to grasp for a beginner.
What would be the proper way to achieve this in OpenGL ES 1.x on iOS?

Assuming that kCGBlendModeDarken is just a regular darkening blend mode, the same effect can be achieved in OpenGL using these two commands:
glBlendFunc(GL_ONE, GL_ONE);
glBlendEquation(GL_MIN);
Depending on your version of OpenGL, GL_MIN might be GL_FUNC_MIN.

I am using glBlendEquationOES(GL_MIN_EXT);.
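For reference, a minimal sketch of the full setup on OpenGL ES 1.x, assuming the GL_EXT_blend_minmax extension is available; on iOS the blend equation is set through glBlendEquationOES:
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

static void setupDarkenBlend(void)
{
    glEnable(GL_BLEND);
    // Weight source and destination fully...
    glBlendFunc(GL_ONE, GL_ONE);
    // ...and keep the per-channel minimum, i.e. the darker color.
    glBlendEquationOES(GL_MIN_EXT);
}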

Another configuration that was suggested is standard source-over alpha blending; note that this gives ordinary transparency, not a darken effect:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);

Related

Problems with transparency when creating a texture from a mixture of textures with SDL2?

I am trying to create a texture from several textures, but it seems that I have a problem setting the transparency:
SDL_Texture *backgr = SDL_CreateTexture(render,
                                        SDL_PIXELFORMAT_RGBA8888,
                                        SDL_TEXTUREACCESS_TARGET,
                                        width,
                                        height);
// After this line, the texture is rendered black
SDL_SetAlphaMod(backgr, 0);
Any ideas?
You need to set the blend mode as well to enable alpha blending:
SDL_SetTextureBlendMode(backgr, SDL_BLENDMODE_BLEND);
According to the migration guide for SDL2, you shouldn't use SDL_SetAlphaMod with SDL2 but rather SDL_SetTextureAlphaMod (or SDL_SetSurfaceAlphaMod for surfaces). If you need an example of how to use it, check out the tutorial by Lazy Foo.
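Putting both calls together, a minimal sketch, assuming the render, width, and height variables from the question:
// Create the render-target texture as before.
SDL_Texture *backgr = SDL_CreateTexture(render,
                                        SDL_PIXELFORMAT_RGBA8888,
                                        SDL_TEXTUREACCESS_TARGET,
                                        width,
                                        height);
// Enable alpha blending on the texture first...
SDL_SetTextureBlendMode(backgr, SDL_BLENDMODE_BLEND);
// ...then set the whole-texture alpha modulation (0 = fully transparent, 255 = opaque).
SDL_SetTextureAlphaMod(backgr, 0);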

Changing the background with OpenGL and the GLPaint example

I am using Apple's provided OpenGL GLPaint example. I implemented brush width and color selection without big problems, but I can't find any way to change the background color or to use an image as the background. Any suggestions?
Try to define a clear color using glClearColor in your initGL method.
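For example, a minimal sketch (the RGBA values are placeholders; for an image background you would instead draw a full-screen textured quad before the brush strokes):
// Set a light gray background color (placeholder values).
glClearColor(0.9f, 0.9f, 0.9f, 1.0f);
// Clear the color buffer so the new background shows.
glClear(GL_COLOR_BUFFER_BIT);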

GLPaint based OpenGL ES Blending Issue

I'm working on an app based on Apple's GLPaint sample code. I've changed the clear color to transparent black and added an opacity slider. However, when I mix colors together at a low opacity setting, they don't mix the way I'm expecting: they mix the way light mixes, not the way paint mixes. Here is an example of what I mean:
The "Desired Result" was obtained by using glReadPixels to render each color separately and merge it with the previously rendered image (i.e., using Apple's default blending).
However, mixing each frame with the previous one is too time-consuming to do on the fly. How can I get OpenGL to blend the colors properly? I've been researching online for quite a while and have yet to find a solution that works for me. Please let me know if you need any other info to help!
From the looks of it, there is no easy solution with your current setup. For what you are trying to do, you need custom shaders, which is not possible using GLKit alone.
Luckily, you can mix GLKit and OpenGL ES.
My recommendation would be to (a sketch of the render-to-texture step follows the references below):
Stop using GLKit for everything except setting up your rendering surface with GLKView (which is tedious without GLKit).
Use an OpenGL program with custom shaders to draw to a texture backing an FBO.
Use a second program with custom shaders for post-processing (after drawing the above texture to a quad, which is then rendered to the screen).
A good starting point would be to load up the OpenGL template that comes with Xcode and start modifying it. Be warned: if you don't understand shaders, the code there will make little sense. It draws two cubes, one using GLKit and one without, using custom shaders.
References to start learning:
Intro to shaders
Rendering to a Texture
Shader Toy - This should help you experiment with your post processing frag shader.
GLEssentials example - This shows how to render to a texture using OpenGL (a bit outdated).
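As a rough illustration of the render-to-texture step above, a minimal OpenGL ES 2.0 sketch; width and height are assumed to match your drawing surface, and error and completeness checks are omitted:
#include <OpenGLES/ES2/gl.h>

// Create a texture to hold the painted strokes.
GLuint tex, fbo;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Back an FBO with that texture and draw the strokes into it.
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

// ... draw strokes here with the first shader program ...

// Switch back to the default framebuffer, then draw a full-screen
// quad sampling 'tex' with the post-processing shader program.
glBindFramebuffer(GL_FRAMEBUFFER, 0);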
Finally, if you are really serious about using OpenGL ES to its full potential, you really should invest the time to read through the OpenGL ES 2.0 Programming Guide. Even though it is six years old, it is still relevant and the only book I've found that explains all the concepts correctly.
Your "Current Result" is additive color, which is how OpenGL is supposed to work. To work like mixing paint would be substractive color. You don't have control over this with OpenGL ES 1.1, but you could write a custom fragment shader for OpenGL ES 2.0 that would do substractive color. If you are blending textures images from iOS, you need to know if the image data has been premultiplied by alpha or not, in order to do blending. OpenGL ES expects the non-premultiplied format.
You need to put this code in the function that is called on color change, and set the blend function each time:
CGFloat red, green, blue;
// set red, green, and blue to the desired color combination
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(red * kBrushOpacity,
          green * kBrushOpacity,
          blue * kBrushOpacity,
          kBrushOpacity);
To do more with glBlendFunc, use this link.
Please let me know whether it works or not. It works for me.

How to draw a string in a GLKView mixed with OpenGL

I want to be able to draw a string over the OpenGL window in my iOS app. What is the easiest way to do this? I know you can define a texture with letters and draw parts of that texture to create text, but that is quite a bit of work for what I am doing. I just want to draw a simple string in the upper left of the window.
Is it possible to mix OpenGL with 2D drawing commands? I'm using GLKView, so I suspect it involves adding some code to drawInRect.
I am using OpenGL ES 1.1 for this.
Found a solution for this: if you add a UILabel as a subview of the GLKView, you can draw text over the OpenGL content.
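For instance, a minimal Objective-C sketch; glkView is assumed to be your GLKView instance:
// Overlay a UILabel; UIKit composites it above the GL content.
UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(10, 10, 200, 24)];
label.text = @"Hello, OpenGL";
label.textColor = [UIColor whiteColor];
label.backgroundColor = [UIColor clearColor];
[glkView addSubview:label];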

Replicating UIView drawRect in OpenGL ES

My iOS application draws into a bitmap (same size as my view) using Core Graphics. I want to push updated regions of the bitmap to the screen. (I've used the standard UIView drawRect method but I have some good reasons to switch to OpenGL).
I just want to replicate the same behavior as UIView/CALayer drawRect but in an OpenGL view. Essentially I would like to update dirty rectangles on my OpenGL view. Nothing more.
So far I've been able to create an OpenGL ES 1.1 view and push my entire bitmap on screen using a single quad (texture on a vertex array) for each update of my bitmap. Of course, this is pretty inefficient since I only need to refresh the dirty rectangle, not the whole view.
What would be the most efficient way to do that in OpenGL ES? Should I use a lattice of quads and update the texture of the quads that intersect with my dirty rectangle? (If I were to use that method, should I use VBOs?) Is there a better way to do it?
FYI (just in case), I won't need rotation but will need to scale the entire OpenGL view.
UPDATE:
This method indeed works. However, there is a bug in iOS 5.x on Retina display devices that produces an artifact when using single buffering. The problem has been fixed in iOS 6. I don't yet have a workaround.
You could simply update a part of the texture using glTexSubImage2D and redraw your standard full-screen quad, but with the scissor rect set (glScissor) to the "dirty" part. GL will then not draw any fragments outside this rect.
For this to work, you must of course use single buffering.
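A minimal sketch of that idea; tex, the dirty* variables, dirtyPixels, and drawFullScreenQuad are hypothetical placeholders for your existing code:
// Upload only the changed pixels into the existing texture.
// dirtyPixels must hold the dirty rows tightly packed, since
// OpenGL ES 1.1 has no GL_UNPACK_ROW_LENGTH.
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0,
                dirtyX, dirtyY, dirtyW, dirtyH,
                GL_RGBA, GL_UNSIGNED_BYTE, dirtyPixels);

// Restrict rasterization to the dirty rect and redraw the quad;
// fragments outside the scissor box are discarded.
glEnable(GL_SCISSOR_TEST);
glScissor(dirtyX, dirtyY, dirtyW, dirtyH);
drawFullScreenQuad();
glDisable(GL_SCISSOR_TEST);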
