Single channel textures on iOS and OpenGL ES 3

These OpenGL ES formats are driving me nuts... I upgraded my project from ES 2 to ES 3, where you apparently have to declare the internal format with a sized type... According to https://www.khronos.org/opengles/sdk/docs/man3/docbook4/xhtml/glTexImage2D.xml these combinations are perfectly valid:
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, width, height, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
...
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F, width, height, 0, GL_RED, GL_HALF_FLOAT, NULL);
But they give me GL_INVALID_OPERATION. Single-channel textures in ES are poorly documented by Khronos/Apple, and the community barely uses them. If there is another soul out there who has attempted to use them and succeeded, please let me know. I wish I could just use Metal.

Meh, running on a chip older than the A7 turned out to be the explanation for the errors. There is no OpenGL ES 3.0 support on those.
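For anyone hitting the same wall: the usual pattern on iOS is to request an ES 3 context first and fall back to ES 2 when the hardware (anything older than the A7) can't provide one. A minimal sketch, assuming a plain EAGLContext setup; the variable name is mine:
#import <OpenGLES/EAGL.h>

// Try ES 3 first; pre-A7 devices will return nil here.
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
if (context == nil) {
    // Fall back to ES 2, where single-channel data goes through GL_LUMINANCE/GL_ALPHA
    // (or the EXT_texture_rg extension, if available) instead of sized GL_R8 / GL_R16F.
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
}
[EAGLContext setCurrentContext:context];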

Related

Why is glTexImage2D returning GL_INVALID_OPERATION on iOS?

I'm making the following call on iOS using OpenGL ES 3:
glTexImage2D(GL_TEXTURE_2D,     // target
             0,                 // level
             GL_RGBA,           // internalformat
             1024,              // width
             692,               // height
             0,                 // border
             GL_RGBA,           // format
             GL_UNSIGNED_BYTE,  // type
             NULL);             // data
However, it is returning GL_INVALID_OPERATION. There are a slew of reasons that GL_INVALID_OPERATION might be returned. However, I can't spot any that are relevant to my situation.
The weird thing is if I just ignore the error, things seem to work anyway. However, I'd like to understand what's going on here because I don't want it to bite me later.
Can anyone explain why I'm getting an error here and how to avoid it?
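One thing worth ruling out first (just a debugging sketch, not specific to this code): glGetError reports errors accumulated since the last time it was called, so the GL_INVALID_OPERATION may actually have been raised by an earlier call. Draining the error queue immediately before glTexImage2D confirms whether this call is really the culprit:
// Clear any errors left over from earlier GL calls.
while (glGetError() != GL_NO_ERROR) { }

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 692, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

GLenum err = glGetError();
if (err != GL_NO_ERROR) {
    // 0x0502 is GL_INVALID_OPERATION; anything else points elsewhere.
    NSLog(@"glTexImage2D really did fail: 0x%x", err);
}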

Performance issue, drawing app on iOS with OpenGL ES 2.0

I'm currently developing a drawing app on iOS with OpenGL ES 2.0 (I'm just starting to use it). I would like to reproduce textured brushes in my app. For that, I decided to use shaders (is that the best choice?). At this stage I have my textured brushes working, but unfortunately I also run into performance problems after a few seconds…
Here is an overview of my app process:
I receive about 140 points each second.
Each time, in the draw function, I iterate over all of my points (the points are contained in the Stroke class, which is contained in the layer) and redraw them.
Code:
for (int strokeId = 0; strokeId < layer->strokesList.size(); strokeId++) {
    Stroke* stroke = layer->strokesList.at(strokeId);
    […]
    glVertexAttribPointer(mainProgram.positionSlot, 2, GL_FLOAT, GL_FALSE, 0, stroke->vertices.Position);
    glVertexAttribPointer(mainProgram.colorSlot, 4, GL_FLOAT, GL_FALSE, 0, stroke->vertices.Color);
    glDrawArrays(GL_TRIANGLES, 0, (int)(stroke->nbVertices));
    […]
}
I am open to any suggestions to improve this drawing method. Thank you!
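Not an authoritative answer, but since the loop above re-submits every stroke from client-side arrays on every frame, one common approach is to keep the vertices in a VBO and upload only the points that arrived since the last frame with glBufferSubData. A rough sketch; the Vertex layout, MAX_STROKE_VERTICES, uploadedVertices, newVertices and newVertexCount are placeholders, not names from the question:
// Hypothetical interleaved vertex layout (adjust to the real Stroke data).
typedef struct { GLfloat Position[2]; GLfloat Color[4]; } Vertex;

// One-time setup: reserve a VBO large enough for a whole stroke.
GLuint strokeVBO;
glGenBuffers(1, &strokeVBO);
glBindBuffer(GL_ARRAY_BUFFER, strokeVBO);
glBufferData(GL_ARRAY_BUFFER, MAX_STROKE_VERTICES * sizeof(Vertex), NULL, GL_DYNAMIC_DRAW);

// Per frame: upload only the vertices added since the last frame...
glBufferSubData(GL_ARRAY_BUFFER, uploadedVertices * sizeof(Vertex),
                newVertexCount * sizeof(Vertex), newVertices);
uploadedVertices += newVertexCount;

// ...then draw everything already in the buffer (offsetof comes from <stddef.h>).
glEnableVertexAttribArray(mainProgram.positionSlot);
glEnableVertexAttribArray(mainProgram.colorSlot);
glVertexAttribPointer(mainProgram.positionSlot, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const void *)offsetof(Vertex, Position));
glVertexAttribPointer(mainProgram.colorSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const void *)offsetof(Vertex, Color));
glDrawArrays(GL_TRIANGLES, 0, uploadedVertices);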

glDrawArrays from iOS to OSX

I'm trying to get a game I made for iOS to work on OS X. So far I have been able to get everything working except for the drawing of some randomly generated hills using a GL-bound texture.
It works perfectly on iOS, but somehow this part is the only thing not visible when the app is run on OS X. I checked all coordinates and color values, so I'm pretty sure it has something to do with OpenGL.
glDisable(GL_TEXTURE_2D);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glBindTexture(GL_TEXTURE_2D, _textureSprite.texture.name);
glColor4f(_terrainColor.r,_terrainColor.g,_terrainColor.b, 1);
glVertexPointer(2, GL_FLOAT, 0, _hillVertices);
glTexCoordPointer(2, GL_FLOAT, 0, _hillTexCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, (GLsizei)_nHillVertices);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnable(GL_TEXTURE_2D);
You're disabling the texture coordinate (and color) arrays along with the texturing unit, yet you are still supplying a texture coordinate pointer.
Is this really what you intend to do?
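If the hills are meant to be textured, the more conventional ordering would be to enable texturing and the client-side arrays before setting the pointers and drawing, and disable them afterwards. Roughly, using the same variables as in the question:
glEnable(GL_TEXTURE_2D);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glBindTexture(GL_TEXTURE_2D, _textureSprite.texture.name);
glColor4f(_terrainColor.r, _terrainColor.g, _terrainColor.b, 1);
glVertexPointer(2, GL_FLOAT, 0, _hillVertices);
glTexCoordPointer(2, GL_FLOAT, 0, _hillTexCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, (GLsizei)_nHillVertices);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glDisable(GL_TEXTURE_2D);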
Apparently it was being drawn after all, only as a 1/2-pixel line. Somehow there is some scaling on the vertices in effect; I will have to check my code.

OpenGL ES examples that don't use EAGLContext

I'd like to better understand the creation, allocation, and binding of OpenGL ES framebuffers, renderbuffers, etc. under iOS. I understand that the EAGLContext and EAGLSharegroup classes normally manage the allocation and binding of such objects. However, the Apple docs suggest that it is possible to do GL offscreen rendering without using the EAGLContext class, and I'm interested in how. Does anyone have any pointers to code examples?
I would also be interested in examples showing how to accomplish offscreen rendering with EAGLContext.
The only way to render content using OpenGL ES on iOS, offscreen or onscreen, is to do so through an EAGLContext. From the OpenGL ES Programming Guide:
Before your application can call any OpenGL ES functions, it must
initialize an EAGLContext object and set it as the current context.
I think the following lines might be what are causing some confusion:
The EAGLContext class also provides methods your application uses to
integrate OpenGL ES content with Core Animation. Without these
methods, your application would be limited to working with offscreen
images.
What that means is that if you want to render content to the screen, you use some extra methods only provided by the EAGLContext class, such as -renderbufferStorage:fromDrawable:. You still need an EAGLContext to manage OpenGL ES commands even if you're going to draw offscreen, but these particular methods which are specific to EAGLContext are needed to draw onscreen.
To your second question, how you set up your offscreen rendering will depend on the configuration of that offscreen render (texture-backed FBO, depth buffer, etc.). For example, the following code will set up a simple FBO that has no depth buffer and renders to an already-configured texture, outputTexture:
glActiveTexture(GL_TEXTURE1);
glGenFramebuffers(1, &filterFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, filterFramebuffer);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)currentFBOSize.width, (int)currentFBOSize.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, outputTexture, 0);
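One extra sanity check that isn't part of the snippet above: after attaching the texture it's worth verifying that the framebuffer is actually complete before rendering into it:
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    // Incomplete attachment, unsupported format, etc. -- nothing rendered into
    // this FBO is defined until the problem is fixed.
    NSLog(@"Offscreen framebuffer incomplete: 0x%x", status);
}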
For code examples, you could look at how I do this within the open source GPUImage framework (which just does simple image rendering) or my open source Molecules application (which does more complex offscreen rendering using depth buffers).

OpenGL ES 2.0, drawing using multiple vertex buffers

I can't find much info on whether drawing from multiple vertex buffers is supported in OpenGL ES 2.0 (i.e. using one vertex buffer for position data and another for normals, colors, etc.). This page http://developer.apple.com/library/ios/#documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/TechniquesforWorkingwithVertexData/TechniquesforWorkingwithVertexData.html and Listing 9.4 in particular imply you should be able to, but I can't get it to work in my program. Code for the offending draw call:
glBindBuffer(GL_ARRAY_BUFFER, mPositionBuffer->openglID);
glVertexAttribPointer(0, 4, GL_FLOAT, 0, 16, NULL);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, mTexCoordBuffer->openglID);
glVertexAttribPointer(1, 2, GL_FLOAT, 0, 76, NULL);
glEnableVertexAttribArray(1);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, mIndexBuffer->openglID);
glDrawElements(GL_TRIANGLES, 10788, GL_UNSIGNED_SHORT, NULL);
This draw call will stall or crash with EXC_BAD_ACCESS on the simulator, and gives very weird behavior on the device (OpenGL draws random triangles or presents previously rendered frames). No OpenGL call ever returns an error, and I've inspected the vertex buffers extensively and am confident they have the correct sizes and data.
Has anyone successfully rendered using multiple vertex buffers and can share their experience on why this might not be working? Any info on where to start debugging stalled/failed draw calls that don't return any error code would be greatly appreciated.
Access violations generally mean that you are trying to draw more triangles than you have allocated in a buffer. The way you've set up the buffers is perfectly fine and should work; I would check whether your parameters are set properly:
http://www.opengl.org/sdk/docs/man/xhtml/glVertexAttribPointer.xml
http://www.opengl.org/sdk/docs/man/xhtml/glDrawElements.xml
I think your issue is either that you've switched the offset and stride in your glVertexAttribPointer calls, or that you've miscounted the number of indices you're drawing.
Yes, you can use multiple vertex buffer objects (VBOs) for a single draw. The OpenGL ES 2.0 spec says so in section 2.9.1.
Do you really have all those hard-coded constants in your code? Where did that 76 come from?
If you want help debugging, you need to post the code that initializes your buffers (the code that calls glGenBuffers and glBufferData). You should also post the stack trace of EXC_BAD_ACCESS.
It might also be easier to debug if you drew something simpler, like one triangle, instead of 3596 triangles.
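For reference, a minimal sketch of the two-VBO pattern with tightly packed data, reusing the buffer names from the question (attribute locations 0 and 1, four floats of position and two floats of texture coordinates per vertex; indexCount is a placeholder). With tightly packed attributes the stride can simply be 0:
// Positions: 4 floats per vertex, tightly packed, so stride 0 (or 16) works.
glBindBuffer(GL_ARRAY_BUFFER, mPositionBuffer->openglID);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, NULL);

// Texture coordinates: 2 floats per vertex, so stride 0 (or 8) -- not 76.
glBindBuffer(GL_ARRAY_BUFFER, mTexCoordBuffer->openglID);
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, NULL);

// The count passed to glDrawElements must not exceed the number of indices
// actually stored in mIndexBuffer.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, mIndexBuffer->openglID);
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, NULL);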
