I have a GLKViewController that has an implementation as follows...
#import "game-gl.h"
....
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    renderFrameLine();
}
And in game-gl.c (I started with s = 0.5):
void renderFrameLine() {
    printf("Inside Render Line\n");
    // f and s are file-scope floats defined elsewhere in game-gl.c
    if (f > 1.0f) {
        f = 0.0f;
    }
    else {
        f = f + 0.01f;
    }
    glClearColor(f, 0.65f, f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    if (s > 2.0f) {
        s = -2.0f;
    }
    else {
        s = s + 0.10f;
    }
    GLfloat vVertices[] = {  s,  s, 0.0f,
                             s, -s, 0.0f,
                            -s,  s, 0.0f };
    // colors is declared but never handed to GL; it also holds 4 colors for a 3-vertex triangle
    GLfloat colors[] = {
        1.0f, 0.0f, 0.0f, 1.0f,
        1.0f, 0.0f, 0.0f, 1.0f,
        1.0f, 0.0f, 0.0f, 1.0f,
        1.0f, 0.0f, 0.0f, 1.0f };
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, vVertices);
    glEnableVertexAttribArray(0);
    glDrawArrays(GL_TRIANGLES, 0, 4); // note: only 3 vertices are defined above
}
When I run the app, the background changes color, but the triangle never shows up. Any idea what I am doing wrong?
The problem appears to be that the shaders are not compiling...
shader = glCreateShader(type);
if (shader == 0) {
#ifdef __APPLE__
    printf("Shader Failed %d\n", type);
#endif
    return 0;
}
This prints out that the shader failed (type = GL_VERTEX_SHADER).
I commented this check out and I still get the same response; everything else seems to compile OK...
UPDATE
I also see the following warning. I don't think it is related, since the compile check passes, but...
/Users/me/Development/test3/test3/game-gl.c:97:46: Passing 'GLbyte [148]' to parameter of type 'const char *' converts between pointers to integer types with different sign
But as I said this check seems to pass...
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (!compiled) {
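As an aside, that signedness warning typically means the GLSL source is stored in a GLbyte array (as in the OpenGL ES 2.0 Programming Guide samples) and passed straight to glShaderSource. A hedged sketch of the usual fix; vShaderStr and its contents are assumptions, since the post doesn't show the declaration:

// GLSL source stored in a GLbyte array (hypothetical, matching 'GLbyte [148]').
GLbyte vShaderStr[] =
    "attribute vec4 vPosition;   \n"
    "void main()                 \n"
    "{                           \n"
    "    gl_Position = vPosition;\n"
    "}                           \n";

// Cast to const char * so the pointer signedness matches; behavior is unchanged.
const char *src = (const char *)vShaderStr;
glShaderSource(shader, 1, &src, NULL);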
UPDATE AGAIN
OK, now we are getting somewhere...
programObject = glCreateProgram();
if (programObject == 0) {
    printf("Invalid program returned \n");
}
This prints out that the program is invalid?!
And Again
So I changed the code to try to grab the program in the .m file, but it is still 0.
if (_program == 0) {
    NSLog(@"Still can not create program");
}
else {
    resizeViewport(screenWidth, screenHeight);
    InitializeOpenGL(0, 0, _program);
}
Where is your shader code? Are you creating a fragment shader? Creating a program? Linking? If that's not compiling, then that's your first problem. You can get useful error messages from GL (via glGetShaderInfoLog) that will help you there.
Besides that, you should enable the attribute pointer before assigning the data.
Also, make sure you're not culling clockwise faces (use CCW instead)... if you have face culling enabled, that is.
For the GLKViewController, you apparently have to set the context before creating the program. Adding this line...
[EAGLContext setCurrentContext:self.context];
before initializing the GL made it work.
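For reference, a minimal sketch of the usual GLKViewController setup order, with the context made current before any GL objects are created; loadShaders here is a hypothetical helper standing in for the poster's shader/program setup:

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    GLKView *view = (GLKView *)self.view;
    view.context = self.context;

    // Make the context current FIRST: glCreateShader and glCreateProgram
    // return 0 when no context is current.
    [EAGLContext setCurrentContext:self.context];

    // Only now is it safe to compile shaders and link the program.
    [self loadShaders];
}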
I'm trying to draw a 1-pixel-wide line on iOS using OpenGL, and I've run into a problem.
Here is how the drawn line (zoomed) looks in the simulator. [Screenshots omitted: iPhone 4S, iPhone 6 Plus.]
As you can see, the line is more than 1 pixel wide, and it is also smoothed.
I think the problem is not in OpenGL but in the screen scale factor.
I want the line to be one pixel wide on all types of screens.
This is how I draw the line:
- (void)drawView {
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glViewport(0, 0, backingWidth, backingHeight);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glDisable(GL_LINE_SMOOTH);
    glDisable(GL_BLEND);
    glOrthof(-1.0f, 1.0f, -1.5f, 1.5f, -1.0f, 1.0f);
    glMatrixMode(GL_MODELVIEW);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    GLfloat vertices[4];
    vertices[0] = -0.5;
    vertices[1] = 0;
    vertices[2] = 0.5;
    vertices[3] = 0;
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glEnableClientState(GL_VERTEX_ARRAY);
    glColor4f(1.0f, 1.0f, 0.0f, 1.0f);
    glLineWidth(1.0f); // was glLineWidthx(1.0f): glLineWidthx takes a GLfixed, so passing a float is wrong
    glDrawArrays(GL_LINE_STRIP, 0, 2);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
Have a look at http://oleb.net/blog/2014/11/iphone-6-plus-screen/
It shows that the iPhone 6 Plus renders at a scale factor and then downsamples, and says you can disable the scaling by setting the contentScaleFactor of your GLKView to the value of UIScreen.nativeScale.
Hopefully that will fix your problem.
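A minimal sketch of that suggestion; glView is an assumed name standing in for whatever view owns the GL layer (the poster's code uses an EAGLView, but contentScaleFactor exists on every UIView, GLKView included):

// Match the screen's native scale so one GL pixel maps to one hardware
// pixel; the iPhone 6 Plus otherwise renders at 3x and downsamples.
// Set this before the framebuffer is (re)created.
UIScreen *screen = [UIScreen mainScreen];
glView.contentScaleFactor = [screen respondsToSelector:@selector(nativeScale)]
                          ? screen.nativeScale
                          : screen.scale;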
I'm having an issue with an OpenGL ES 2.0-based view that first draws to a texture and then renders that texture to the screen.
I instantiate this view as a subview of a main UIView with other buttons. For some reason the texture doesn't render immediately when drawn, but only after one of the buttons is pressed.
Has anyone experienced anything like this? Thank you in advance.
Here is the drawing and rendering code:
// Draws a line onscreen based on where the user touches
- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    static GLfloat *vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebuffer(GL_FRAMEBUFFER, fbTemp);

    // Convert locations from points to pixels
    CGFloat scale = self.contentScaleFactor;
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;

    // Allocate the vertex array buffer
    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // Load data into the vertex buffer object
    glBindBuffer(GL_ARRAY_BUFFER, vboId);
    glBufferData(GL_ARRAY_BUFFER, vertexCount * 2 * sizeof(GLfloat), vertexBuffer, GL_DYNAMIC_DRAW);
    glEnableVertexAttribArray(pointShader.positionLoc);
    glVertexAttribPointer(pointShader.positionLoc, 2, GL_FLOAT, GL_FALSE, 0, 0);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, brushTexture.id);

    // Draw
    glUseProgram(pointShader.programObject);

    // The brush texture is bound to texture unit 0
    glUniform1i(pointShader.textureLoc, 0);

    // Viewing matrices
    GLKMatrix4 projectionMatrix = GLKMatrix4MakeOrtho(0, backingWidth, 0, backingHeight, -1, 1);
    GLKMatrix4 modelViewMatrix = GLKMatrix4Identity; // this sample uses a constant identity modelView matrix
    GLKMatrix4 MVPMatrix = GLKMatrix4Multiply(projectionMatrix, modelViewMatrix);
    glUniformMatrix4fv(pointShader.MVPLoc, 1, GL_FALSE, MVPMatrix.m);

    // Point size
    glUniform1f(pointShader.pointSizeLoc, brushTexture.width / kBrushScale);

    // Brush color
    glUniform4fv(pointShader.vertexColorLoc, 1, brushColor);

    glDrawArrays(GL_POINTS, 0, vertexCount);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, 0);

    [self drawResult];
}
And the drawResult method that does the rendering:
// Draw the current result into the presented renderbuffer.
- (void)drawResult
{
    NSLog(@"Calling Draw Result");

    // Bind the master framebuffer.
    glBindFramebuffer(GL_FRAMEBUFFER, fbMaster);

    // Clear the buffer.
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT);

    // Draw the background board.
    [self drawTexture:masterTexture];

    // Draw the temporary (drawing) board.
    [self drawTexture:tempTexture];

    // Bind the renderbuffer and present it to the view's context.
    glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}
And the drawTexture method that binds and draws the texture:
// Draw a texture on a full-screen quad
- (void)drawTexture:(GLuint)textureID
{
    // Positions and texture coordinates for a full-screen quad.
    const GLfloat vVerticesStraight[] = {
        -1.0f,  1.0f, 0.0f, // Position 0
         0.0f,  1.0f,       // TexCoord 0
        -1.0f, -1.0f, 0.0f, // Position 1
         0.0f,  0.0f,       // TexCoord 1
         1.0f, -1.0f, 0.0f, // Position 2
         1.0f,  0.0f,       // TexCoord 2
         1.0f,  1.0f, 0.0f, // Position 3
         1.0f,  1.0f        // TexCoord 3
    };
    // Index order for glDrawElements.
    const GLushort indices[] = { 0, 1, 2, 0, 2, 3 };

    // Use the program object
    glUseProgram(drawTexture.programObject);

    // Load the vertex positions
    glVertexAttribPointer(drawTexture.a_positionLoc, 3, GL_FLOAT,
                          GL_FALSE, 5 * sizeof(GLfloat), vVerticesStraight);
    // Load the texture coordinates
    glVertexAttribPointer(drawTexture.a_texCoordLoc, 2, GL_FLOAT,
                          GL_FALSE, 5 * sizeof(GLfloat), &vVerticesStraight[3]);

    // Enable the vertex attribute arrays.
    glEnableVertexAttribArray(drawTexture.a_positionLoc);
    glEnableVertexAttribArray(drawTexture.a_texCoordLoc);

    // Bind the texture.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glUniform1i(drawTexture.s_textureLoc, 0);

    // Draw the texture.
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, indices);

    // Unbind the texture.
    glBindTexture(GL_TEXTURE_2D, 0);
}
There is no background threading. What's very strange is that the first drawing seems to render to the texture but not to the screen; after pressing another button on the view, however, the texture does render to the screen.
I am trying to draw a line with OpenGL ES 2.0 and GLKit. When I run the following code under the OpenGL ES Analyzer, I get the following errors:
"Use of Non-Existent Program"
glDrawArrays(GL_LINE_STRIP,0,4)
"GL Error: Invalid Operation"
GL_INVALID_OPERATION <- glVertexPointer(2,GL_FLOAT,0,NULL)
GL_INVALID_OPERATION <- glEnableClientState(GL_VERTEX_ARRAY)
Here's my code:
#import "GLDrawingView.h"
const float data[] = {0.0f, 1.0f, 0.0f, 0.0f, 1.0f, -0.0f, 0.0f, 1.0f};
#interface GLDrawingView () {
GLuint lineVBO;
}
#end
#implementation GLDrawingView
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
[EAGLContext setCurrentContext:self.context];
glGenBuffers(1, &lineVBO);
glBindBuffer(GL_ARRAY_BUFFER, lineVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW);
}
return self;
}
- (void)drawRect:(CGRect)rect
{
glVertexPointer(2, GL_FLOAT, 0, NULL);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_LINE_STRIP, 0, sizeof(data) / sizeof(float) / 2);
}
#end
When you draw something in OpenGL ES 2.0, you must use a shader program (glUseProgram) for rendering; you cannot render without shaders in GLES2. The fixed-function calls glVertexPointer and glEnableClientState belong to the ES 1.1 API and are invalid in an ES 2.0 context, which is exactly why they raise GL_INVALID_OPERATION.
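Since the view is GLKit-based, the least-effort way to get a valid program is GLKBaseEffect, which compiles and binds a shader pair internally. A minimal sketch of drawRect: under that approach; the effect ivar is an assumption, not part of the original code:

- (void)drawRect:(CGRect)rect
{
    // effect is a hypothetical GLKBaseEffect ivar, created once alongside
    // the VBO in initWithFrame:  effect = [[GLKBaseEffect alloc] init];
    // prepareToDraw binds GLKit's internal shader program, supplying the
    // glUseProgram the analyzer reports as missing.
    [effect prepareToDraw];

    glBindBuffer(GL_ARRAY_BUFFER, lineVBO);
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 0, NULL);
    glDrawArrays(GL_LINE_STRIP, 0, sizeof(data) / sizeof(float) / 2);
}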
I am drawing a pixel using GLKit. I can successfully draw the pixel at coordinates (10, 10) if I have:
glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Prepare the effect for rendering
[self.effect prepareToDraw];

GLfloat points[] =
{
    10.0f, 10.0f,
};

glClearColor(1.0f, 1.0f, 0.0f, 1.0f);

GLuint bufferObjectNameArray;
glGenBuffers(1, &bufferObjectNameArray);
glBindBuffer(GL_ARRAY_BUFFER, bufferObjectNameArray);
glBufferData(GL_ARRAY_BUFFER,
             sizeof(points),   // points is a real array here, so sizeof gives its full byte size
             points,
             GL_STATIC_DRAW);

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition,
                      2,
                      GL_FLOAT,
                      GL_FALSE,
                      2 * 4,   // stride: two 4-byte floats per vertex
                      NULL);
glDrawArrays(GL_POINTS, 0, 1);
But I want to decide at runtime how many pixels to draw and exactly where, so I tried the following; however, it draws the pixel at (10, 0). Something's wrong here:
glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Prepare the effect for rendering
[self.effect prepareToDraw];

GLfloat *points = (GLfloat *)malloc(sizeof(GLfloat) * 2);
for (int i = 0; i < 2; i++) {
    points[i] = 10.0f;
}

glClearColor(1.0f, 1.0f, 0.0f, 1.0f);

GLuint bufferObjectNameArray;
glGenBuffers(1, &bufferObjectNameArray);
glBindBuffer(GL_ARRAY_BUFFER, bufferObjectNameArray);
glBufferData(GL_ARRAY_BUFFER,
             sizeof(points),   // points is now a pointer, so sizeof gives the pointer's size
             points,
             GL_STATIC_DRAW);

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition,
                      2,
                      GL_FLOAT,
                      GL_FALSE,
                      2 * 4,
                      NULL);
glDrawArrays(GL_POINTS, 0, 1);
Kindly help me out.
Edit:
Problem
Actually, the problem is that I can't figure out the difference between:
GLfloat points[] =
{
    10.0f, 10.0f,
};

AND

GLfloat *points = (GLfloat *)malloc(sizeof(GLfloat) * 2);
for (int i = 0; i < 2; i++) {
    points[i] = 10.0f;
}
My bet is that the problem is the sizeof(points) in the glBufferData call: in the second case it returns the size of the pointer itself, which is one word (4 bytes on a 32-bit system), not the size of the array. You should pass the real size of your data, the same value you calculate in the malloc call.
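A minimal sketch of that fix, carrying the byte count explicitly; pointCount and byteSize are illustrative names, not from the original post:

// Decide the vertex count at runtime and keep the byte size next to the data.
const int pointCount = 1;                                 // one 2D point
const size_t byteSize = pointCount * 2 * sizeof(GLfloat);

GLfloat *points = (GLfloat *)malloc(byteSize);
points[0] = 10.0f;   // x
points[1] = 10.0f;   // y

// Pass the computed byte size; sizeof(points) would only be the pointer's size.
glBufferData(GL_ARRAY_BUFFER, byteSize, points, GL_STATIC_DRAW);
free(points);        // safe once glBufferData has copied the data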
I'm having issues with Texture2D and I'd like to understand how to use it better.
I've taken the CrashLander Texture2D class from here and a default OpenGL project in Xcode 4, forcing it to load OpenGL ES 1.1.
First, a conceptual question: the size in the Texture2D init method is clearly an OpenGL size, but what relation does the fontSize parameter have to the OpenGL world?
Second, debugging. The result I get from the code below is a black (or whatever colour I set with glColor) square where the text should be.
Here are the changes I've made in my code:
- (void)awakeFromNib
{
    EAGLContext *aContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    if (!aContext) {
        // the fallback also requests ES 1.1, since the project was forced to ES 1.1
        aContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
    }

    self.labelAtTheTop = [[[Texture2D alloc] initWithString:@"Some Text"
                                                 dimensions:CGSizeMake(1, 1)
                                                  alignment:UITextAlignmentLeft
                                                   fontName:@"Helvetica"
                                                   fontSize:14.0f] autorelease];

    if (!aContext)
        NSLog(@"Failed to create ES context");
    else if (![EAGLContext setCurrentContext:aContext])
        NSLog(@"Failed to set ES context current");

    self.context = aContext;
    [aContext release];

    [(EAGLView *)self.view setContext:context];
    [(EAGLView *)self.view setFramebuffer];

    animating = FALSE;
    animationFrameInterval = 1;
    self.displayLink = nil;
}
- (void)drawFrame
{
    [(EAGLView *)self.view setFramebuffer];

    // Replace the implementation of this method to do your own custom drawing.
    glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glColor4f(0.0f, 0.0f, 0.0f, 1.0f);
    glPushMatrix();
    glLoadIdentity();
    [self.labelAtTheTop drawAtPoint:CGPointMake(0, 0)];
    glPopMatrix();

    glDisable(GL_COLOR_MATERIAL);

    // Disable modes so they don't interfere with other parts of the program
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisable(GL_TEXTURE_2D);
    glDisable(GL_BLEND);

    [(EAGLView *)self.view presentFramebuffer];
}
CrashLander is a really old code base, so I would suggest avoiding it. There is a perfectly good 2D engine for the iPhone called Cocos2D: http://www.cocos2d-iphone.org/. About the code: try commenting out glDisable(GL_COLOR_MATERIAL); also, glColor4f(0.0f, 0.0f, 0.0f, 1.0f) is black, and in ES 1.1 the current color modulates the texture, so try commenting that out too. I think fontSize is the size of the font in screen points.
[EDIT]
If you want to learn something about OpenGL ES, here is a good intro tutorial:
http://iphonedevelopment.blogspot.com/2009/05/opengl-es-from-ground-up-table-of.html
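To make the glColor suggestion concrete: ES 1.1's default texture environment (GL_MODULATE) multiplies each texel by the current color, so drawing the label with black set via glColor4f yields a black square. A minimal sketch of the relevant change inside drawFrame:

// Use white so GL_MODULATE leaves the texture's own colors intact;
// glColor4f(0, 0, 0, 1) multiplies every texel to black.
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
glPushMatrix();
glLoadIdentity();
[self.labelAtTheTop drawAtPoint:CGPointMake(0, 0)];
glPopMatrix();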