OpenGL ES VBO usage - no geometry rendering - iOS

I'm trying to set up a VBO in my rendering code to get more FPS. It worked with separate VBOs for vertex positions, colors, and texture coordinates, but after switching to interleaved vertex data no geometry renders.
Here is my setup function:
const GLsizeiptr data_size = NUMBER_OF_CUBE_VERTICES * 9 *sizeof(float);
// allocate a new buffer
glGenBuffers(1, &cubeVBO);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBufferData(GL_ARRAY_BUFFER, data_size, data, GL_STATIC_DRAW);
float* ptr = (float*)data;
glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 0));
glEnableVertexAttribArray(ATTRIB_VERTEX);
glVertexAttribPointer(ATTRIB_COLOR, 4, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 3));
glEnableVertexAttribArray(ATTRIB_COLOR);
glVertexAttribPointer(ATTRIB_TEXCOORD0, 2, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 7));
glEnableVertexAttribArray(ATTRIB_TEXCOORD0);
glGenBuffers(1, &cubeIBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, cubeIBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, NUMBER_OF_CUBE_INDICES*sizeof(GLubyte), s_cubeIndices, GL_STATIC_DRAW);
The data array looks like this:
static float data[] =
{
// position // // color // //UV//
-1.0, +1.0, +1.0, 255,0,0,255, 0,0,
-1.0, -1.0, +1.0, 0,255,0,255, 0,0,
+1.0, +1.0, +1.0, 255,0,255,255, 0,0,
+1.0, -1.0, +1.0, 255,0,0,255, 0,0,
+1.0, +1.0, +1.0, 255,0,0,255, 0,0,
+1.0, -1.0, +1.0, 255,0,0,255, 0,0,
+1.0, +1.0, -1.0, 255,255,0,255, 0,0,
+1.0, -1.0, -1.0, 255,0,0,255, 0,0,
+1.0, +1.0, -1.0, 255,0,255,255, 0,0,
+1.0, -1.0, -1.0, 255,255,0,255, 0,0,
-1.0, +1.0, -1.0, 0,255,0,255, 0,0,
-1.0, -1.0, -1.0, 255,0,0,255, 0,0,
-1.0, +1.0, -1.0, 0,0,255,255, 0,0,
-1.0, -1.0, -1.0, 255,0,0,255, 0,0,
-1.0, +1.0, +1.0, 255,255,0,255, 0,0,
-1.0, -1.0, +1.0, 255,0,0,255, 0,0,
};
And this is my render code:
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, cubeIBO);
glDrawElements(GL_TRIANGLE_STRIP, NUMBER_OF_CUBE_INDICES, GL_UNSIGNED_BYTE, s_cubeIndices);
I also tried the glDrawArrays function without the index buffer, but the result was the same: no geometry rendered.
There is also GL error 1282 (GL_INVALID_OPERATION) in the output window while my program runs.
I'd appreciate any help with my problem, thanks.

When you are using a buffer object, the last parameter of glVertexAttribPointer should be the byte offset of the data within the buffer, not a pointer into client memory. For example, your ATTRIB_VERTEX array would start at offset 0, the ATTRIB_COLOR array at offset sizeof(float) * 3 (because the position takes three floats), etc.
When you are not using buffer objects but rather plain vertex arrays, you have to unbind the currently bound buffer object from the GL_ARRAY_BUFFER target by calling
glBindBuffer(GL_ARRAY_BUFFER, 0);

OK, I got it working using the glVertexPointer and glEnableClientState functions, but with glVertexAttribPointer and glEnableVertexAttribArray there is still no geometry rendered for some reason.
Now the code looks like this:
VBO init:
struct Vertex
{
GLfloat x, y, z;
GLubyte r, g, b, a;
};
......
const GLsizeiptr data_size = NUMBER_OF_CUBE_VERTICES *sizeof(struct Vertex);
glGenBuffers(1, &cubeVBO);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBufferData(GL_ARRAY_BUFFER, data_size, vertices, GL_STATIC_DRAW);
glVertexPointer(3, GL_FLOAT, sizeof(struct Vertex), (GLvoid*)((char*)NULL));
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(struct Vertex), (GLvoid*)offsetof(struct Vertex, r));
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
rendering:
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glDrawArrays(GL_TRIANGLE_STRIP, 0, NUMBER_OF_CUBE_VERTICES);
I have no idea why this code does not work when I switch to glVertexAttribPointer / glEnableVertexAttribArray. Any ideas? Maybe I need to move the pointer and enable calls from the initialization part to the render part?

Related

Trying to flip an OpenGL texture

I'm trying to invert the Y values of a texture on OpenGL ES 2.0, and have had no luck after several days of experimentation. Here's the code in my didRender block (it's a SceneKit scene).
let textureCoordinates: [GLfloat] = [
0.0, 0.0,
1.0, 0.0,
0.0, 1.0,
1.0, 1.0]
let flipVertical: [GLfloat] = [
0.0, 1.0,
1.0, 1.0,
0.0, 0.0,
1.0, 0.0]
glEnableVertexAttribArray(0)
glEnableVertexAttribArray(1)
glVertexAttribPointer(0, 2, GLenum(GL_FLOAT), 0, 0, flipVertical)
glVertexAttribPointer(1, 2, GLenum(GL_FLOAT), 0, 0, textureCoordinates)
glDrawArrays(GLenum(GL_TRIANGLE_STRIP), 0, 4)
glBindTexture(GLenum(GL_TEXTURE_2D), 0)
glFlush()
Is there anything that sticks out to you as wrong? My understanding is that I can flip the texture without having to rewrite to a new texture. Is that true? Thanks!
You don't need a separate vertex attribute to do the flip; just replace the textureCoordinates array with the values from flipVertical (and then delete all of the code related to flipVertical; you don't need it).

OpenGL ES 2.0 Texture Won't Display

So I am currently learning OpenGL ES 2.0 on iOS and working on a maze game. The maze is randomly generated (not a loaded model), and my struggle is texturing the walls and floor of the maze. My approach is to treat the maze as a series of cubes, and I have code that draws the individual faces of a cube separately (so I can create a path by simply leaving some faces out).
Using Capture GPU Frame, I have confirmed that the texture is indeed loading correctly, the data in the frame buffers is correct, and I'm not getting any errors. I can see my other lighting effects (so the face isn't completely black), but no texture appears.
Here is how I've defined my cube faces:
GLfloat rightCubeVertexData[] =
{
0.5f, -0.5f, -0.5f,
0.5f, -0.5f, 0.5f,
0.5f, 0.5f, -0.5f,
0.5f, 0.5f, -0.5f,
0.5f, -0.5f, 0.5f,
0.5f, 0.5f, 0.5f,
};
GLfloat rightCubeNormalData[] =
{
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
-1.0f, 0.0f, 0.0f,
};
GLfloat rightCubeTexCoords[] =
{
0.0, 0.0,
1.0, 0.0,
0.0, 1.0,
0.0, 1.0,
1.0, 0.0,
1.0, 1.0,
};
The other faces are defined essentially the same way, except each is in a single array; splitting up the positions, normals, and tex coords was just something I tried. I'm trying to get one face to texture first, and then I'll expand to the rest.
Here is how I load the data into the buffers:
glGenVertexArraysOES(1, &_rightVertexArray);
glBindVertexArrayOES(_rightVertexArray);
glGenBuffers(3, _rightVertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _rightVertexBuffer[0]);
glBufferData(GL_ARRAY_BUFFER, sizeof(rightCubeVertexData), rightCubeVertexData, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 12, BUFFER_OFFSET(0));
glBindBuffer(GL_ARRAY_BUFFER, _rightVertexBuffer[1]);
glBufferData(GL_ARRAY_BUFFER, sizeof(rightCubeNormalData), rightCubeNormalData, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribNormal);
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 12, BUFFER_OFFSET(0));
glBindBuffer(GL_ARRAY_BUFFER, _rightVertexBuffer[2]);
glBufferData(GL_ARRAY_BUFFER, sizeof(rightCubeTexCoords), rightCubeTexCoords, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 12, BUFFER_OFFSET(0));
Again, using three buffers was an experiment; the other faces are defined in one buffer each, with offsets.
Here is how I load textures:
crateTexture = [self setupTexture:@"crate.jpg"];
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, crateTexture);
glUniform1i(uniforms[UNIFORM_TEXTURE], 0);
// Load in and set up texture image (adapted from Ray Wenderlich)
- (GLuint)setupTexture:(NSString *)fileName
{
CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
if (!spriteImage) {
NSLog(@"Failed to load image %@", fileName);
exit(1);
}
size_t width = CGImageGetWidth(spriteImage);
size_t height = CGImageGetHeight(spriteImage);
GLubyte *spriteData = (GLubyte *) calloc(width*height*4, sizeof(GLubyte));
CGContextRef spriteContext = CGBitmapContextCreate(spriteData, width, height, 8, width*4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);
CGContextDrawImage(spriteContext, CGRectMake(0, 0, width, height), spriteImage);
CGContextRelease(spriteContext);
GLuint texName;
glGenTextures(1, &texName);
glBindTexture(GL_TEXTURE_2D, texName);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
free(spriteData);
return texName;
}
Then, at the appropriate time, I simply call glDrawArrays to draw the face. I am completely stumped on this, and it is probably a very silly error, but any help anybody could provide would be much appreciated.
P.S. Here is my fragment shader
varying vec3 eyeNormal;
varying vec4 eyePos;
varying vec2 texCoordOut;
uniform sampler2D texture;
uniform vec3 flashlightPosition;
uniform vec3 diffuseLightPosition;
uniform vec4 diffuseComponent;
uniform float shininess;
uniform vec4 specularComponent;
uniform vec4 ambientComponent;
void main()
{
vec4 ambient = ambientComponent;
vec3 N = normalize(eyeNormal);
float nDotVP = max(0.0, dot(N, normalize(diffuseLightPosition)));
vec4 diffuse = diffuseComponent * nDotVP;
vec3 E = normalize(-eyePos.xyz);
vec3 L = normalize(flashlightPosition - eyePos.xyz);
vec3 H = normalize(L+E);
float Ks = pow(max(dot(N, H), 0.0), shininess);
vec4 specular = Ks*specularComponent;
if( dot(L, N) < 0.0 ) {
specular = vec4(0.0, 0.0, 0.0, 1.0);
}
gl_FragColor = (ambient + diffuse + specular) * texture2D(texture, texCoordOut);
//gl_FragColor = ambient + diffuse + specular;
gl_FragColor.a = 1.0;
}
And yes, all the uniform names are correct and correspond to something in the main code.
EDIT: Here is the vertex shader
precision mediump float;
attribute vec4 position;
attribute vec3 normal;
attribute vec2 texCoordIn;
varying vec3 eyeNormal;
varying vec4 eyePos;
varying vec2 texCoordOut;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat3 normalMatrix;
void main()
{
eyeNormal = (normalMatrix * normal);
eyePos = modelViewMatrix * position;
texCoordOut = texCoordIn;
gl_Position = modelViewProjectionMatrix * position;
}
To sum up the procedure worked out in the comments...
Much can go wrong when dealing with textures, and it is good to know how to pinpoint where the issue lies.
What to be careful with on the texture itself:
Check that you set parameters such as
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
Check that the uniform is set as glUniform1i(uniformName, 0), where the last parameter corresponds to the active texture unit, not the texture ID.
Other checks include verifying that the uniform name is correct and that the texture is bound. If possible, also check in the debugger that the texture is properly loaded.
Beyond that, there is a chance your texture coordinates are messed up, and this seems to be a very common issue. To debug it, it is best to replace the color sampled from the texture in your fragment shader with the texture coordinate itself, e.g. replace texture2D(texture, texCoordOut) with vec4(texCoordOut.x, texCoordOut.y, .0, 1.0). Since the texture coordinates should be in the range [0,1], you should see nice gradients between red and green in your scene. If you do not see them, your texture coordinates are messed up: if everything is black, your coordinates are all zero; if most of it is yellow, your coordinates are most likely too large.
In your case the visualized texture coordinates were all black, which means you were always sampling the first texel of the texture, and thus getting a constant color in your scene. What to check at this point:
Are the coordinates you push to the GPU correct
Is the pointer set correctly as glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 12, BUFFER_OFFSET(0)) (check all the parameters)
Is the attribute enabled glEnableVertexAttribArray(GLKVertexAttribTexCoord0)
Is the attribute location bound in the shader program being compiled.
In your case you had forgotten to bind the texture coordinate attribute, which is quite easy to miss.
From the information given it was impossible to spot this mistake directly, but note the procedure above for pinpointing where the issue actually lies. It might be handy in the future as well.
It turns out I had forgotten to bind the attribute location when compiling the shader. I needed to add the line
glBindAttribLocation(_program, GLKVertexAttribTexCoord0, "texCoordIn");
to the load shaders method.

OpenGL ES triangles drawing mistake on iOS

I am trying to draw multiple triangles using OpenGL ES on iOS. I create a vertex array of float values with the following structure
{x, y, z, r, g, b, a}
for each vertex. The final array for one triangle is:
{x1, y1, z1, r1, g1, b1, a1, x2, y2, z2, r2, g2, b2, a2, x3, y3, z3,
r3, g3, b3, a3}
Here is my update method:
-(void)update {
float aspect = fabsf(self.view.bounds.size.width / self.view.bounds.size.height);
GLKMatrix4 projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0f), aspect, 1.0, 100.0);
self.effect.transform.projectionMatrix = projectionMatrix;
}
and render:
-(void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
[self drawShapes]; // here I fill vertices array
glClearColor(0.65f, 0.65f, 0.8f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
int numItems = 3 * trianglesCount;
glBindVertexArrayOES(vao);
[self.effect prepareToDraw];
glUseProgram(shaderProgram);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(float) * itemSize, convertedVerts, GL_DYNAMIC_DRAW);
glVertexAttribPointer(vertexPositionAttribute, 3, GL_FLOAT, false, stride, 0);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(vertexColorAttribute, 4, GL_FLOAT, false, stride, (GLvoid*)(3 * sizeof(float)));
glEnableVertexAttribArray(GLKVertexAttribColor);
glDrawArrays(GL_TRIANGLES, 0, numItems);
}
Context setup. Here I bind my vertex array and generate vertex buffer:
-(void)setupContext
{
self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if(!self.context) {
NSLog(@"Failed to create OpenGL ES Context");
}
GLKView *view = (GLKView *)self.view;
view.context = self.context;
view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
[EAGLContext setCurrentContext:self.context];
self.effect = [[GLKBaseEffect alloc] init];
glEnable(GL_DEPTH_TEST);
glDisable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glGenVertexArraysOES(1, &vao);
glBindVertexArrayOES(vao);
glGenBuffers(1, &vbo);
}
Fragment and vertex shaders are pretty simple:
//fragment
varying lowp vec4 vColor;
void main(void) {
gl_FragColor = vColor;
}
//vertex
attribute vec3 aVertexPosition;
attribute vec4 aVertexColor;
varying lowp vec4 vColor;
void main(void) {
gl_Position = vec4(aVertexPosition, 1.0);
vColor = aVertexColor;
}
Result: triangles aren't shown.
Where is the mistake? I guess the problem is with the projection matrix. Here is a GitHub link to the Xcode project.
I downloaded your code and tried it out. I see the purplish screen and no triangles, so I can reproduce the issue. I see two things that could be causing it:
1) You'll need to pass glBufferData the total number of bytes you're sending it, like this: glBufferData(GL_ARRAY_BUFFER, sizeof(float) * itemSize * numItems, convertedVerts, GL_DYNAMIC_DRAW);. Any information about how to chunk the data stays in glVertexAttribPointer.
2) That doesn't seem to be the only thing, since I still can't get triangles to show up. I've never used GLKit before (I just have a little experience with OpenGL on the desktop). That said, if I replace GLKVertexAttribPosition and GLKVertexAttribColor with 0 and 1 respectively, and apply the glBufferData fix from 1), I see artifacts flashing on the simulator screen when I move the mouse. So there has to be something fishy with those enum values and glVertexAttribPointer.
Edit - clarification for 2):
After changing the glBufferData line as described in 1), I also modified the attribute setup lines so they looked like this:
glVertexAttribPointer(vertexPositionAttribute, 3, GL_FLOAT, false, stride, 0);
glEnableVertexAttribArray(0);
glVertexAttribPointer(vertexColorAttribute, 4, GL_FLOAT, false, stride, (GLvoid*)(3 * sizeof(float)));
glEnableVertexAttribArray(1);
After both of those changes I can see red triangles flickering on the screen. That's a step closer, since I couldn't see anything before, but I haven't been able to figure it out any further than that :(

glDrawArray is not working

I want to make some changes to an image using OpenGL.
After loading the image, I prepare the texture and run the following code, but the image doesn't change to a triangle.
What am I doing wrong?
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
static const Vertex3D vertices[] = {
{-1.0, 1.0, -0.0},
{ 1.0, 1.0, -0.0},
{ 0.0, -1.0, -0.0},
};
static const Vector3D normals[] = {
{0.0, 0.0, 1.0},
{0.0, 0.0, 1.0},
{0.0, 0.0, 1.0},
};
static const GLfloat texCoords[] = {
0.0, 1.0,
1.0, 0.0,
0.0, 0.0,
};
glLoadIdentity();
glTranslatef(0.0, 0.0, -3.0);
glBindTexture(GL_TEXTURE_2D, texture[0]);
glVertexPointer(3, GL_FLOAT, 3, vertices);
glNormalPointer(GL_FLOAT, 0, normals);
glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
//initiate the drawing process, we want a triangle, start at index 0 and draw 3 vertices
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
I am surprised by your glVertexPointer(3, GL_FLOAT, 3, vertices); the second 3 is the stride (the spacing in bytes between consecutive vertices). I think it should be 0 instead of 3.
glVertexPointer(3, GL_FLOAT, 0, vertices);
Actually, do you see a triangle or nothing at all?
Good luck!
Pierre
OK, I tried your code, so I can say:
1) You should use glVertexPointer(3, GL_FLOAT, 0, vertices); with a stride of 0 instead of 3. It clearly does not work with 3 (not even worth checking): there is no gap between your values.
2) It may also come from the initialization of your view (a common problem): how do you set up the projection and modelview matrices? For instance, to see the triangle, I had to add
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0, 1.0, 1.0, 10.0); // field of view=45°, zNear..zFar = 1 to 10
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0, 0, -3.0);
i.e. don't forget to tell which matrix you wish to set before doing it, and setup the pojection properly: zNear <= min(your vertices.z), zFar >= max(your vertices.z) (Oooo it is over now, with OpenGL 4, no more implicit matrices)
I wish you find the bug.
Cheers

glColorPointer iOS Open GL ES not working?

I am drawing a simple GL_LINE_LOOP on a black background. No matter what I do with glColorPointer and the colors[] array, I can't make the lines any color other than white. What am I doing wrong?
I'm relatively new to OpenGL on iPhone and haven't found an answer on Google or here for my problem, so I really appreciate any answers.
//glPushMatrix();
glDisable(GL_TEXTURE_2D);
static const GLubyte colors[] = {
255, 0, 255, 255,
255, 0, 255, 255,
255, 0, 255, 255
};
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState (GL_COLOR_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, colors);
glLineWidth(5.0);
GLfloat vertices[] = { -1.0, -1.0, -1.0, 1.0, 1.0, 1.0, 1.0, -1.0, 1.0 };
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_LINE_LOOP, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glEnable(GL_TEXTURE_2D);
glPopMatrix();
Try disabling texturing...
glDisable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,0);