Failing to color a line using a VBO - iOS

I am trying and failing to draw a colored line using GLKit / OpenGL ES 2. I can draw the line, but it comes out black.
My current approach is to use a VBO containing both position and color data.
Set-up:
typedef struct {
    float Position[2];
    float Color[4];
} LinePoint;

LinePoint line[] =
{
    {{0.0, 0.0}, {1.0, 0.0, 0.0, 1}},
    {{2.0, 3.0}, {1.0, 0.0, 0.0, 1}}
};

glGenBuffers(1, &_lineBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _lineBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(LinePoint) * 2, line, GL_STATIC_DRAW);
In my drawInRect method:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    [EAGLContext setCurrentContext:self.context];
    [self.effect prepareToDraw];

    glClearColor(0.5, 0.5, 0.5, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);

    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_2D);

    glBindBuffer(GL_ARRAY_BUFFER, _lineBuffer);

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, sizeof(LinePoint), (const GLvoid *) offsetof(LinePoint, Position));
    glEnableVertexAttribArray(GLKVertexAttribColor);
    glVertexAttribPointer(GLKVertexAttribColor, 4, GL_FLOAT, GL_FALSE, sizeof(LinePoint), (const GLvoid *) offsetof(LinePoint, Color));

    // Set the line width
    glLineWidth(50.0);

    // Render the line
    glDrawArrays(GL_LINE_STRIP, 0, 2);

    glEnable(GL_BLEND);
    glEnable(GL_TEXTURE_2D);
}

Related

Smooth changing of line thickness using OpenGL ES

I'm drawing a few lines using OpenGL ES and I need to change their thickness smoothly from 1 pixel to 3 pixels, but glLineWidth doesn't allow setting a line thickness between 1.0 and 2.0.
Is it possible?
Here is my code:
- (void)setupGL
{
    [EAGLContext setCurrentContext:self.context];
    self.effect = [[GLKBaseEffect alloc] init];

    glEnable(GL_DEPTH_TEST);

    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(thinLines), thinLines, GL_STATIC_DRAW);

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0));

    glBindVertexArrayOES(0);
}
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBindVertexArrayOES(_vertexArray);

    self.effect.constantColor = GLKVector4Make(lineR, lineG, lineB, 1.0f);
    [self.effect prepareToDraw];

    glLineWidth(1 + scaleQ);
    glDrawArrays(GL_LINES, 0, thinLinesCount*2);
}
OpenGL ES (including 3.0) does not support antialiased lines. From the documentation for glLineWidth:

The actual width is determined by rounding the supplied width to the nearest integer.

So, unfortunately, you can't "smoothly" change the line thickness.
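The answer stops there, but a common workaround (not part of the original answer) is to skip glLineWidth entirely and draw each line as a thin quad, i.e. two triangles whose width you control exactly. Below is a minimal sketch in plain C under that assumption; the Vec2 type and buildLineQuad helper are made up for illustration, and the quad is assumed to be fed to GLKVertexAttribPosition like the line vertices above.

#include <math.h>   // for sqrtf

typedef struct { float x, y; } Vec2;   // hypothetical helper type

// Build a quad (as a triangle strip) around a 2D segment of arbitrary width.
static void buildLineQuad(Vec2 p0, Vec2 p1, float width, Vec2 outQuad[4])
{
    float dx = p1.x - p0.x, dy = p1.y - p0.y;
    float len = sqrtf(dx * dx + dy * dy);
    if (len == 0.0f) return;                 // degenerate segment, nothing to build

    // Unit normal of the segment, scaled to half the desired width
    float nx = -dy / len * (width * 0.5f);
    float ny =  dx / len * (width * 0.5f);

    outQuad[0] = (Vec2){ p0.x + nx, p0.y + ny };
    outQuad[1] = (Vec2){ p0.x - nx, p0.y - ny };
    outQuad[2] = (Vec2){ p1.x + nx, p1.y + ny };
    outQuad[3] = (Vec2){ p1.x - nx, p1.y - ny };
}

// Upload outQuad as GLKVertexAttribPosition (2 floats per vertex) and draw with:
//   glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

Because the width is just geometry, it can take any value (1.3, 2.7, ...) and can be animated smoothly per frame.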

Getting OpenGL stencil buffer clipping working with GLKit

I'm trying to reproduce, in a GLKViewController, the effect from this stencil buffer tutorial, in which a stencil is used for clipping.
In my drawInRect: method I have:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    [self.effect prepareToDraw];

    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *) offsetof(Vertex, Position));
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *) offsetof(Vertex, TexCoord));

    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);

    glClearColor(0.5, 0.5, 0.5, 1.0);
    glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);
    glClear(GL_DEPTH_BUFFER_BIT);

    glEnable(GL_STENCIL_TEST);

    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glStencilFunc(GL_NEVER, 1, 0xFF);
    glStencilOp(GL_REPLACE, GL_KEEP, GL_KEEP); // draw 1s on test fail (always)

    // draw stencil pattern
    glStencilMask(0xFF);
    glClear(GL_STENCIL_BUFFER_BIT); // needs mask=0xFF
    glDrawElements(GL_TRIANGLES, 1*6, GL_UNSIGNED_SHORT, 0);

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
    glStencilMask(0x00);

    // draw only where the stencil's value is 1
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glDrawElements(GL_TRIANGLES, nLiveSquares*6, GL_UNSIGNED_SHORT, 0);

    glDisable(GL_STENCIL_TEST);
}
The vertex buffer contains the data for a set of squares, which hold textures.
The effect I'm trying to get is for the first square to define the stencil shape, so that when I draw the scene, anything lying outside of that square is clipped. Hence, I call glDrawElements twice: first to draw the clipping square, then to draw all the squares.
I also have another method for the OpenGL set-up, called once at the start of the program. As suggested by an answer to this Stack Overflow question, it includes:
GLuint depthStencilRenderbuffer;
glGenRenderbuffers(1, &depthStencilRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencilRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthStencilRenderbuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depthStencilRenderbuffer);
Running this results in nothing except a screen containing the grey clear color. It appears that everything has been stenciled out (or is not drawn to the screen).
Solution:
i) I changed the initialization to:
GLuint depthStencilRenderbuffer;
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthStencilRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES,
                         GL_DEPTH24_STENCIL8_OES,
                         self.view.bounds.size.width,
                         self.view.bounds.size.height);
I'm not sure what the difference is, but this seems to work.
ii) In the set up I use:
GLKView *view = (GLKView *)self.view;
view.context = self.context;
view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
view.drawableStencilFormat = GLKViewDrawableStencilFormat8;
Thanks to @rickster for his suggestion.

glDrawElements bad access in Xcode when used outside of GLKViewController

I'm pretty new to OpenGL ES, but all I'm trying to do is draw indexed vertices using glDrawElements in a Character class. I've gotten this to work before inside of my GLKViewController class, but when I tried creating a Character class which would perform its own rendering, I got nothing but BAD_ACCESS. Here is my Character class:
#import "Character.h"
#interface Character()
{
GLuint _vertexBuffer;
GLuint _indexBuffer;
GLuint _vertexArray;
}
#property(nonatomic, weak) GLKBaseEffect *effect;
#end
typedef struct {
float Position[3];
float Color[4];
} Vertex;
const Vertex Vertices[] = {
{{1, -1, 0}, {1, 0, 0, 1}},
{{1, 1, 0}, {0, 1, 0, 1}},
{{-1, 1, 0}, {0, 0, 1, 1}},
{{-1, -1, 0}, {0, 0, 0, 1}}
};
const GLushort Indices[] = {
0, 1, 2,
2, 3, 0
};
#implementation Character
- (id)initWithEffect:(GLKBaseEffect *)effect
{
if (self = [super init])
{
self.effect = effect;
[self setupGL];
}
return self;
}
- (void)setupGL
{
glGenVertexArraysOES(1, &_vertexArray);
glBindVertexArrayOES(_vertexArray);
glGenBuffers(1, &_vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
glGenBuffers(1, &_indexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _indexBuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);
glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *) offsetof(Vertex, Position));
glEnableVertexAttribArray(GLKVertexAttribColor);
glVertexAttribPointer(GLKVertexAttribColor, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *) offsetof(Vertex, Color));
glBindVertexArrayOES(0);
}
- (void)teardownGL
{
glDeleteBuffers(1, &_vertexBuffer);
glDeleteBuffers(1, &_indexBuffer);
}
- (void)render
{
self.effect.transform.modelviewMatrix = GLKMatrix4Translate(GLKMatrix4Identity, self.position.x, self.position.y, self.position.z);
self.effect.transform.modelviewMatrix = GLKMatrix4Rotate(self.effect.transform.modelviewMatrix, self.rotation, 0.0f, 0.0f, 1.0f);
[self.effect prepareToDraw];
glBindVertexArrayOES(_vertexArray);
glDrawElements(GL_TRIANGLES, sizeof(Indices) / sizeof(Indices[0]), GL_UNSIGNED_SHORT, 0);
}
#end
Then in my ViewController:
- (void)viewDidLoad
{
    [super viewDidLoad];

    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

    GLKView *view = (GLKView *)self.view;
    view.context = self.context;

    character = [[Character alloc] initWithEffect:self.effect];
    character.position = GLKVector3Make(self.view.bounds.size.width / 2, self.view.bounds.size.height / 2, 0.0f);

    [self setupGL];
}
rendered using:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    [character render];
}
I know it isn't something as simple as a miscalculated byte size, because I've been at this for a couple of days now.
Hard to say for sure on a quick read, but it looks like the problem is that you don't have a current GL context when you're setting up your VAO in -[Character setupGL].
Creating an EAGLContext object doesn't make it the current context. If the rest of the code in your view controller looks like that from the Xcode "OpenGL Game" template, ViewController doesn't set the current context until its own setupGL method, which you call only after creating your Character instance, whose initializer attempts to set up OpenGL ES resources.
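A minimal sketch of that fix, based on the viewDidLoad from the question: make the context current before anything creates GL resources. It assumes self.effect is created here rather than in setupGL; alternatively, simply create the Character after calling [self setupGL].

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    GLKView *view = (GLKView *)self.view;
    view.context = self.context;

    // Make the context current *before* any object creates GL resources
    [EAGLContext setCurrentContext:self.context];

    self.effect = [[GLKBaseEffect alloc] init];   // assumed here for this sketch
    character = [[Character alloc] initWithEffect:self.effect];
    character.position = GLKVector3Make(self.view.bounds.size.width / 2, self.view.bounds.size.height / 2, 0.0f);

    [self setupGL];
}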

GLKit: [drawing pixels] initializing GLfloat array

I am drawing a pixel using GLKit. I can successfully draw the pixel at coordinates (10, 10) if I have:
glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Prepare the effect for rendering
[self.effect prepareToDraw];

GLfloat points[] =
{
    10.0f, 10.0f,
};

glClearColor(1.0f, 1.0f, 0.0f, 1.0f);

GLuint bufferObjectNameArray;
glGenBuffers(1, &bufferObjectNameArray);
glBindBuffer(GL_ARRAY_BUFFER, bufferObjectNameArray);
glBufferData(
    GL_ARRAY_BUFFER,
    sizeof(points),
    points,
    GL_STATIC_DRAW);

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(
    GLKVertexAttribPosition,
    2,
    GL_FLOAT,
    GL_FALSE,
    2*4,
    NULL);

glDrawArrays(GL_POINTS, 0, 1);
But I want to decide at runtime how many pixels to draw and exactly where, so I tried the following. However, it draws the pixel at (10, 0), so something is wrong here:
glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Prepare the effect for rendering
[self.effect prepareToDraw];

GLfloat *points = (GLfloat *)malloc(sizeof(GLfloat) * 2);
for (int i = 0; i < 2; i++) {
    points[i] = 10.0f;
}

glClearColor(1.0f, 1.0f, 0.0f, 1.0f);

GLuint bufferObjectNameArray;
glGenBuffers(1, &bufferObjectNameArray);
glBindBuffer(GL_ARRAY_BUFFER, bufferObjectNameArray);
glBufferData(
    GL_ARRAY_BUFFER,
    sizeof(points),
    points,
    GL_STATIC_DRAW);

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(
    GLKVertexAttribPosition,
    2,
    GL_FLOAT,
    GL_FALSE,
    2*4,
    NULL);

glDrawArrays(GL_POINTS, 0, 1);
Kindly help me out.
Edit:
Problem
Actually, the problem is that I can't figure out the difference between:
GLfloat points[] =
{
    10.0f, 10.0f,
};

AND

GLfloat *points = (GLfloat *)malloc(sizeof(GLfloat) * 2);
for (int i = 0; i < 2; i++) {
    points[i] = 10.0f;
}
My bet is that the problem is the sizeof(points) in the glBufferData call: in the second case it returns the size of the pointer, which is one machine word (e.g. 4 bytes), not the size of the data. You should pass the real size of your array, the same value you calculate in the malloc call.
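A minimal sketch of the fix under that assumption: compute the byte count explicitly and pass it to glBufferData instead of applying sizeof to the pointer. Here pointCount is a made-up variable standing in for however many points you decide to draw at runtime.

size_t pointCount = 1;                               // decided at runtime
size_t byteCount = sizeof(GLfloat) * 2 * pointCount; // 2 floats per point
GLfloat *points = (GLfloat *)malloc(byteCount);
points[0] = 10.0f;
points[1] = 10.0f;

glBufferData(GL_ARRAY_BUFFER, byteCount, points, GL_STATIC_DRAW);
glDrawArrays(GL_POINTS, 0, (GLsizei)pointCount);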

Use of VAO around VBO in OpenGL ES iPhone App Causes EXC_BAD_ACCESS on Call to glDrawElements

I'm trying to take my code to the next level. Following some best practices from Apple, I'm trying to implement vertex array objects (VAOs) around my vertex buffer objects (VBOs). I set up my VBOs and VAOs like this:
- (void)setupVBOs {
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindVertexArrayOES(0);
    {
        glGenVertexArraysOES(1, &directArrayObject);
        glBindVertexArrayOES(directArrayObject);

        // GLuint texCoordBuffer;
        glGenBuffers(1, &texCoordBuffer);
        glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
        glBufferData(GL_ARRAY_BUFFER, sizeof(DirectVertices), DirectVertices, GL_STATIC_DRAW);

        glVertexAttribPointer(directPositionSlot, 2, GL_FLOAT, GL_FALSE, sizeof(DirectVertex), (GLvoid*)offsetof(DirectVertex, position));
        glEnableVertexAttribArray(directPositionSlot);
        glVertexAttribPointer(texCoordSlot, 2, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(DirectVertex), (GLvoid*)offsetof(DirectVertex, texCoord));
        glEnableVertexAttribArray(texCoordSlot);

        glGenVertexArraysOES(1, &arrayObject);
        glBindVertexArrayOES(arrayObject);

        // GLuint vertexBuffer;
        glGenBuffers(1, &vertexBuffer);
        glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
        glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);

        glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
        glEnableVertexAttribArray(positionSlot);
        glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(Vertex), (GLvoid*)offsetof(Vertex, Color));
        glEnableVertexAttribArray(colorSlot);

        // GLuint indexBuffer;
        glGenBuffers(1, &indexBuffer);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);
    }
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindVertexArrayOES(0);
}
which I took from http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=287977 and then use it like this:
- (void)render:(CADisplayLink*)displayLink {
    glClearColor(0, 104.0/255.0, 55.0/255.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glViewport(0, 0, backingWidth, backingHeight);

    [directProgram use];
    glBindVertexArrayOES(directArrayObject);

    glDisable(GL_DEPTH_TEST);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, videoFrameTexture);

    // Update uniform values
    glUniform1i(videoFrameUniform, 0);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    [program use];
    glBindVertexArrayOES(arrayObject);

    glDisable(GL_TEXTURE_2D);
    glEnable(GL_DEPTH_TEST);

    CC3GLMatrix *projection = [CC3GLMatrix matrix];
    float h = 4.0f * self.frame.size.height / self.frame.size.width;
    [projection populateFromFrustumLeft:-2 andRight:2 andBottom:-h/2 andTop:h/2 andNear:4 andFar:10];
    glUniformMatrix4fv(projectionUniform, 1, 0, projection.glMatrix);

    CC3GLMatrix *modelView = [CC3GLMatrix matrix];
    [modelView populateFromTranslation:CC3VectorMake(sin(CACurrentMediaTime()), 0, -7)];
    currentRotation += displayLink.duration * 90;
    [modelView rotateBy:CC3VectorMake(currentRotation, currentRotation, 0)];
    glUniformMatrix4fv(modelViewUniform, 1, 0, modelView.glMatrix);

    glDrawElements(GL_TRIANGLES, sizeof(Indices)/sizeof(Indices[0]), GL_UNSIGNED_BYTE, 0);

    glBindVertexArrayOES(0);

    BOOL success = [context presentRenderbuffer:GL_RENDERBUFFER];
    if (!success)
        NSLog(@"present failed");
}
The call to glDrawArrays works and fills my texture; however, the call to glDrawElements fails with an EXC_BAD_ACCESS. My shader programs (I use two) are wrapped in a GLProgram object that I took from http://iphonedevelopment.blogspot.com/2010/11/opengl-es-20-for-ios-chapter-4.html
Remove glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0); from the end of your setup function.
Per the specification for OES_vertex_array_object, a vertex array object encapsulates all state except the array buffer binding [1], so there is no element buffer bound at the time that you're drawing, whereas you presumably wanted indexBuffer to be bound. By leaving indexBuffer bound at the time that you bind away from your vertex array object, you ensure that it'll be rebound when you return to that vertex array object.
[1] If you're wondering why the array buffer binding isn't tracked in vertex array objects, this is presumably because the currently-bound array buffer isn't used directly when reading vertex data from arrays; rather, each vertex attribute has its own buffer binding, which is filled out by its respective gl*Pointer function by looking at the array buffer binding when the function is called.
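Following that advice, the end of -setupVBOs could look like the sketch below: either drop the element-array unbind entirely, or unbind the VAO first so the unbind no longer touches the VAO's recorded state.

    // end of -setupVBOs
    glBindVertexArrayOES(0);                  // bind away from the VAO first
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0); // now only affects default state
}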
