I'm trying to invert the Y values of a texture in OpenGL ES 2.0, and have had no luck after several days of experimentation. Here's the code in my didRender block (it's a SceneKit scene).
let textureCoordinates: [GLfloat] = [
0.0, 0.0,
1.0, 0.0,
0.0, 1.0,
1.0, 1.0]
let flipVertical: [GLfloat] = [
0.0, 1.0,
1.0, 1.0,
0.0, 0.0,
1.0, 0.0]
glEnableVertexAttribArray(0)
glEnableVertexAttribArray(1)
glVertexAttribPointer(0, 2, GLenum(GL_FLOAT), 0, 0, flipVertical)
glVertexAttribPointer(1, 2, GLenum(GL_FLOAT), 0, 0, textureCoordinates)
glDrawArrays(GLenum(GL_TRIANGLE_STRIP), 0, 4)
glBindTexture(GLenum(GL_TEXTURE_2D), 0)
glFlush()
Is there anything that sticks out to you as wrong? My understanding is that I can flip the texture without having to rewrite to a new texture. Is that true? Thanks!
You don't need a separate vertex attribute to do the flip; just replace the textureCoordinates array with the values from flipVertical (and then delete all of the code related to flipVertical, since you no longer need it).
I'm trying to set up a VBO in my rendering code to get a higher frame rate. It worked with separate VBOs for vertex position, color, and texture coords, but after switching to interleaved vertex data, no geometry renders.
Here is my setup func:
const GLsizeiptr data_size = NUMBER_OF_CUBE_VERTICES * 9 * sizeof(float);
// allocate a new buffer
glGenBuffers(1, &cubeVBO);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBufferData(GL_ARRAY_BUFFER, data_size, data, GL_STATIC_DRAW);
float* ptr = (float*)data;
glVertexAttribPointer(ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 0));
glEnableVertexAttribArray(ATTRIB_VERTEX);
glVertexAttribPointer(ATTRIB_COLOR, 4, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 3));
glEnableVertexAttribArray(ATTRIB_COLOR);
glVertexAttribPointer(ATTRIB_TEXCOORD0, 2, GL_FLOAT, GL_FALSE, sizeof(struct Vertex), (ptr + 7));
glEnableVertexAttribArray(ATTRIB_TEXCOORD0);
glGenBuffers(1, &cubeIBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, cubeIBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, NUMBER_OF_CUBE_INDICES*sizeof(GLubyte), s_cubeIndices, GL_STATIC_DRAW);
The data array looks like this:
static float data[] =
{
// position // // color // //UV//
-1.0, +1.0, +1.0, 255,0,0,255, 0,0,
-1.0, -1.0, +1.0, 0,255,0,255, 0,0,
+1.0, +1.0, +1.0, 255,0,255,255, 0,0,
+1.0, -1.0, +1.0, 255,0,0,255, 0,0,
+1.0, +1.0, +1.0, 255,0,0,255, 0,0,
+1.0, -1.0, +1.0, 255,0,0,255, 0,0,
+1.0, +1.0, -1.0, 255,255,0,255, 0,0,
+1.0, -1.0, -1.0, 255,0,0,255, 0,0,
+1.0, +1.0, -1.0, 255,0,255,255, 0,0,
+1.0, -1.0, -1.0, 255,255,0,255, 0,0,
-1.0, +1.0, -1.0, 0,255,0,255, 0,0,
-1.0, -1.0, -1.0, 255,0,0,255, 0,0,
-1.0, +1.0, -1.0, 0,0,255,255, 0,0,
-1.0, -1.0, -1.0, 255,0,0,255, 0,0,
-1.0, +1.0, +1.0, 255,255,0,255, 0,0,
-1.0, -1.0, +1.0, 255,0,0,255, 0,0,
};
And this is my render code:
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, cubeIBO);
glDrawElements(GL_TRIANGLE_STRIP, NUMBER_OF_CUBE_INDICES, GL_UNSIGNED_BYTE, s_cubeIndices);
I also tried using glDrawArrays without the index buffer, but the result was the same: no geometry rendered.
There is also GL error 1282 (GL_INVALID_OPERATION) in the output window while my program runs.
I'd appreciate any help on my problem, thanks.
When you are using a buffer object, the last parameter of glVertexAttribPointer should be the byte offset into the buffer, not a CPU pointer. For example, your ATTRIB_VERTEX array would start at offset 0, the ATTRIB_COLOR array at offset sizeof(float) * 3 (because the position takes three floats), etc.
When you are not using buffer objects but rather client-side vertex arrays, you have to unbind whatever buffer object is currently bound to the GL_ARRAY_BUFFER target by calling
glBindBuffer(GL_ARRAY_BUFFER, 0);
OK, I got it working using the glVertexPointer and glEnableClientState functions. But with glVertexAttribPointer and glEnableVertexAttribArray there is still no geometry rendering for some reason.
Now the code looks like this:
VBO init:
struct Vertex
{
GLfloat x, y, z;
GLubyte r, g, b, a;
};
......
const GLsizeiptr data_size = NUMBER_OF_CUBE_VERTICES *sizeof(struct Vertex);
glGenBuffers(1, &cubeVBO);
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glBufferData(GL_ARRAY_BUFFER, data_size, vertices, GL_STATIC_DRAW);
glVertexPointer(3, GL_FLOAT, sizeof(struct Vertex), (GLvoid*)((char*)NULL));
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(struct Vertex), (GLvoid*)offsetof(struct Vertex, r));
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
rendering:
glBindBuffer(GL_ARRAY_BUFFER, cubeVBO);
glDrawArrays(GL_TRIANGLE_STRIP, 0, NUMBER_OF_CUBE_VERTICES);
I have no idea why this code doesn't work if I switch to glVertexAttribPointer / glEnableVertexAttribArray. Any ideas? Maybe I need to move the pointer setup and enable calls from the initialization code into the render code?
I'm using GLKit in an OpenGL project. Everything is based on GLKView and GLKBaseEffect (no custom shaders). In my project I have several views that contain GLKViews for showing 3D objects, and occasionally several of those views can be "open" at once (i.e. are in the modal view stack).
Until now everything was working great, but in a new view I was creating I needed a textured rectangle to simulate a measuring tape in the 3D world of my app. For some unknown reason, in that view only, the texture isn't loaded correctly into the OpenGL context: GLKTextureLoader loads it successfully, but the rectangle draws black, and inspecting the OpenGL frame in the debugger, I can see that an empty texture is bound (there's a reference to a texture, but it's all zeroed out or null).
The shape I'm drawing is defined below (it was originally a triangle strip, but I switched to triangles to make sure that wasn't the issue):
static const GLfloat initTape[] = {
-TAPE_WIDTH / 2.0f, 0, 0,
TAPE_WIDTH / 2.0f, 0, 0,
-TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
TAPE_WIDTH / 2.0f, 0, 0,
TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
-TAPE_WIDTH / 2.0f, TAPE_INIT_LENGTH, 0,
};
static const GLfloat initTapeTex[] = {
0, 0,
1, 0,
0, 1.0,
1, 0,
1, 1,
0, 1,
};
I set up the effect like this:
effect.transform.modelviewMatrix = modelview;
effect.light0.enabled = GL_FALSE;
// Projection setup
GLfloat ratio = self.view.bounds.size.width/self.view.bounds.size.height;
GLKMatrix4 projection = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(self.fov), ratio, 0.1f, 1000.0f);
effect.transform.projectionMatrix = projection;
// Set the color of the wireframe.
if (tapeTex == nil) {
NSError* error;
tapeTex = [GLKTextureLoader textureWithContentsOfFile:[[[NSBundle mainBundle] URLForResource:@"ruler_texture" withExtension:@"png"] path] options:nil error:&error];
}
effect.texture2d0.enabled = GL_TRUE;
effect.texture2d0.target = GLKTextureTarget2D;
effect.texture2d0.envMode = GLKTextureEnvModeReplace;
effect.texture2d0.name = tapeTex.name;
And the rendering loop is:
[effect prepareToDraw];
glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribPosition, COORDS, GL_FLOAT, GL_FALSE, 0, tapeVerts);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 0, tapeTexCoord);
glDrawArrays(GL_TRIANGLES, 0, TAPE_VERTS);
glDisableVertexAttribArray(GLKVertexAttribPosition);
glDisableVertexAttribArray(GLKVertexAttribTexCoord0);
I've also tested the texture itself in another view with other objects and it works fine, so it's not the texture file fault.
Any help would be greatly appreciated, as I'm stuck on this issue for over 3 days.
Update: Also, there are no glErrors during the rendering loop.
After many, many days I finally found my mistake: when using multiple OpenGL contexts, it's important to create the GLKTextureLoader with a sharegroup, or else textures aren't necessarily loaded into the right context.
Instead of using the class method textureWithContentsOfFile:, every context needs its own GLKTextureLoader initialized with that context's sharegroup, and each view should use only its own loader. (Textures actually can be shared between different contexts, but I didn't need that feature of sharegroups.)
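As a sketch of that fix in Objective-C (assuming a view controller with a `context` property holding its EAGLContext and a `tapeTex` property for the result; the instance-method variant of GLKTextureLoader loads asynchronously, so the texture arrives in a completion handler):

```objc
// One loader per context, created from that context's sharegroup,
// so the texture lands in the context this view actually renders with.
GLKTextureLoader *loader =
    [[GLKTextureLoader alloc] initWithSharegroup:self.context.sharegroup];

NSString *path = [[[NSBundle mainBundle] URLForResource:@"ruler_texture"
                                          withExtension:@"png"] path];

[loader textureWithContentsOfFile:path
                          options:nil
                            queue:NULL  // NULL = completion handler runs on the main queue
                completionHandler:^(GLKTextureInfo *tex, NSError *error) {
    self.tapeTex = tex;
}];
```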
This easy tutorial should help: http://games.ianterrell.com/how-to-texturize-objects-with-glkit/
For testing purposes I am trying to set the position of every vertex to zero. But if I try to change more than two dimensions (and it doesn't matter which), the shader crashes silently. Can anybody clue me in on what is going on here? My code:
static const float vertices[12] = {
-0.5,-0.5, 0.0,
0.5,-0.5, 0.0,
-0.5, 0.5, 0.0,
0.5, 0.5, 0.0,
};
glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*)vertices);
glEnableVertexAttribArray(vertexHandle);
glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE, (const GLfloat*)&modelViewProjection.data[0]);
glDrawArrays(GL_POINTS, 0, 4);
And my shader:
attribute vec4 vertexPosition;
uniform mat4 modelViewProjectionMatrix;
void main()
{
vec4 temp = vertexPosition;
temp.x = 0.0;
temp.y = 0.0;
temp.z = 0.0; // Can set any 2 dimensions (e.g. x and y or y and z)
// to zero, but not all three or the shader crashes.
gl_Position = modelViewProjectionMatrix * vec4(temp.xyz, 1.0);
}
Maybe it's because you declare vertexPosition as a vec4, but you are only passing 3 floats per vertex in your C code? I think the part about your temp vector is a red herring.
I am drawing a simple GL_LINE_LOOP on a black background. No matter what I do with glColorPointer and the colors[] array, I can't make the lines any color other than white. What am I doing wrong?
I'm relatively new to OpenGL on the iPhone and haven't found an answer on Google or here for my problem, so I really appreciate any answers.
//glPushMatrix();
glDisable(GL_TEXTURE_2D);
static const GLubyte colors[] = {
255, 0, 255, 255,
255, 0, 255, 255,
255, 0, 255, 255
};
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState (GL_COLOR_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, colors);
glLineWidth(5.0);
GLfloat vertices[] = { -1.0, -1.0, -1.0, 1.0, 1.0, 1.0, 1.0, -1.0, 1.0 };
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawArrays(GL_LINE_LOOP, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glEnable(GL_TEXTURE_2D);
glPopMatrix();
Try disabling texturing...
glDisable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,0);