I'm trying to find out how to clip a texture that is already attached.
For example:
var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, canvas);
After this, I draw my elements. Then I want to come back, clip the texture, and draw one more time before I clear the context.
How would I clip a texture that is already bound?
I tried to draw over the texture with black using texSubImage2D, but the performance is terrible.
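One idea I'm considering is the scissor test, which restricts all subsequent draw calls to a rectangle without modifying the texture data at all. A rough sketch of that approach (the helper name and parameters are my own, not from any library):

```javascript
// Sketch: run drawFn with rasterization clipped to the rectangle (x, y, w, h),
// given in window pixels with the origin at the bottom-left.
// The scissor test clips drawing only; the bound texture is untouched.
function drawClipped(gl, x, y, w, h, drawFn) {
  gl.enable(gl.SCISSOR_TEST);
  gl.scissor(x, y, w, h);
  drawFn(); // e.g. gl.drawArrays(...) with the texture already bound
  gl.disable(gl.SCISSOR_TEST);
}
```

This avoids any texSubImage2D upload per frame, which is where the performance cost was.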
I'm currently trying to divide the OpenGL ES 2.0 drawing process into two halves: the first half, where I render an object of interest (i.e. a cube or triangle) to a framebuffer that has a texture attached to it, and the second half, where I apply that texture onto the face of a shape drawn in another framebuffer (i.e. another cube or triangle).
I cleared the framebuffer bound to the texture with a green color, and have been able to get that color to appear on a triangle that I've drawn in another framebuffer, one that has the main renderbuffer attached and that I call [context presentRenderbuffer: renderbuffer] on. However, no matter what I do, I'm not able to additionally draw another shape into that texture after clearing it to the green background, and render that onto the shape I've drawn.
For some visual reference, currently I'm drawing a square to the screen in my main framebuffer, and then applying a texture that is supposed to have a green background plus a triangle in the middle, but all that I get is this green screen.
It has everything that I currently want, except there is no triangle that is also in the middle. Essentially, this should look like a big green square with a black triangle in the middle of it, where the green and the triangle all came from the texture (the square would have originally been black).
My texture drawing method and main framebuffer drawing methods are included below (without the setup code):
- (BOOL) loadModelToTexture: (GLuint*) tex {
    GLuint fb;
    GLenum status;
    glGenFramebuffers(1, &fb);
    // Set up the FBO with one texture attachment
    glBindFramebuffer(GL_FRAMEBUFFER, fb);
    glGenTextures(1, tex);
    glBindTexture(GL_TEXTURE_2D, *tex);
    NSLog(@"Error1: %x", glGetError());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 128, 128, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, *tex, 0);
    status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        // Handle error here
        NSLog(@"Loading model to texture failed");
        return FALSE;
    }
    glClearColor(0.0f, 1.0f, 0.0f, 1.0f); // Set color's clear-value to green
    glClearDepthf(1.0f);                  // Set depth's clear-value to farthest
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glViewport(0, 0, self.frame.size.width*self.contentsScale, self.frame.size.height*self.contentsScale);
    NSLog(@"Error2: %x", glGetError());
    // Update attribute values.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, vertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    NSLog(@"Error3: %x", glGetError());
    glDrawArrays(GL_TRIANGLES, 0, 3);
    return TRUE;
}
- (void) draw {
    [EAGLContext setCurrentContext:context];
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glClearColor(0.0f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glViewport(0, 0, self.frame.size.width*self.contentsScale, self.frame.size.height*self.contentsScale);
    // Use shader program.
    glUseProgram(program);
    // Update attribute values.
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, vertices);
    // for some reason this is unneeded, but I'm not sure why
    glVertexAttribPointer(ATTRIB_TEXTURE_COORD, 2, GL_FLOAT, 0, 0, texCoords);
    glEnableVertexAttribArray(ATTRIB_TEXTURE_COORD);
    glBindTexture(GL_TEXTURE_2D, textureName0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDrawArrays(GL_TRIANGLES, 3, 3);
    [context presentRenderbuffer:renderbuffer];
}
What other steps do I need to take to get it to draw correctly to the texture/apply correctly to the main framebuffer drawing?
Never mind, it turns out that I was drawing my triangle to the texture, but the triangle just automatically defaulted to the same color as the background texture, so it's a completely different issue.
I'm trying to add multisampling to my app, but it seems I've made a mistake somewhere, and I can't find what I did wrong.
This is how I setup my frame buffers and render buffers
- (void)setupBuffers {
    glGenFramebuffers(1, &_framebuffer);
    glGenRenderbuffers(1, &_renderbuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderbuffer);
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer]; // I already set the current context to _context, and _eaglLayer is just self.layer
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _renderbuffer);
    if (YES) { // note: if I set this to NO, my app displays properly (I don't even have to remove the code in my render method)
        glGenFramebuffers(1, &_msaa_framebuffer);
        glGenRenderbuffers(1, &_msaa_renderbuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, _msaa_framebuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, _msaa_renderbuffer);
        glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 2, GL_RGBA8_OES, [AppDelegate screenWidth], [AppDelegate screenHeight]); // yes, this is the proper width and height; I tested it
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _msaa_renderbuffer);
    }
    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failed to make complete framebuffer object %x", status);
        exit(1);
    }
}
After viewDidLoad is called on my ViewController I call the method setupDisplayLink on my UIView subclass.
- (void)setupDisplayLink {
    CADisplayLink* displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(render:)];
    //displayLink.frameInterval = [[NSNumber numberWithInt:1] integerValue];
    [displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
}
This calls my render method which is pretty simple:
- (void)render:(CADisplayLink*)displayLink {
    glBindRenderbuffer(GL_RENDERBUFFER, _msaa_framebuffer);
    glViewport(0, 0, [AppDelegate screenWidth], [AppDelegate screenHeight]);
    glClearColor(188.0f / 255.0f, 226.0f / 255.0f, 232.0f / 255.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    [[OpenGLViewController instance].menu draw:displayLink]; // drawing happens here
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, _framebuffer);
    glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, _msaa_framebuffer);
    glResolveMultisampleFramebufferAPPLE();
    glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 1, (GLenum[1]) { GL_COLOR_ATTACHMENT0 });
    glBindRenderbuffer(GL_RENDERBUFFER, _renderbuffer);
    NSLog(@"%@", @"HI");
    [_context presentRenderbuffer:GL_RENDERBUFFER];
}
It's not hanging at all (the app keeps printing "HI" in the console because I told it to in the render method). For some reason only the first frame is drawn when I add the extra frame buffer and render buffer for multisampling and I can't figure out why. It just freezes on that frame. Why will my app only draw the first frame with MSAA?
This is not surprising to say the least. The only time you have _msaa_framebuffer bound as the DRAW buffer is immediately after you initialize your FBOs.
The first time you call render (...), the following line will draw into your _msaa_framebuffer:
[[OpenGLViewController instance].menu draw:displayLink]; // drawing happens here
However, later on in render (...) you set the draw buffer to _framebuffer and you never change it from that point on.
To fix your problem, all you have to do is remember to bind _msaa_framebuffer as your draw buffer at the beginning of your render (...) function:
- (void)render:(CADisplayLink*)displayLink {
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, _msaa_framebuffer);
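To illustrate the per-frame order (rebind, draw, resolve, present), here is the same idea as a WebGL 2-style sketch, where blitFramebuffer plays the role of glResolveMultisampleFramebufferAPPLE; all the names here are placeholders of mine, not from the original code:

```javascript
// Per-frame MSAA flow: draw into the multisampled FBO,
// then resolve (average) its samples into the on-screen framebuffer.
function renderFrame(gl, msaaFbo, displayFbo, width, height, drawScene) {
  // Rebind the MSAA framebuffer every frame; this is the step that was missing.
  gl.bindFramebuffer(gl.FRAMEBUFFER, msaaFbo);
  drawScene();
  // Resolve step: blit from the multisampled FBO into the display FBO.
  gl.bindFramebuffer(gl.READ_FRAMEBUFFER, msaaFbo);
  gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, displayFbo);
  gl.blitFramebuffer(0, 0, width, height, 0, 0, width, height,
                     gl.COLOR_BUFFER_BIT, gl.NEAREST);
}
```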
I am new to iOS and OpenGL programming, and I am currently writing a simple program using OpenGL ES 2.0 and GLKit for practice. Right now I can successfully load a PNG file and display it on the screen.
I used GLKViewController in my program, and did some initialization in viewDidLoad. Here's the code in my glkView:drawInRect method:
glClearColor(115.0/255.0, 171.0/255.0, 245.0/255.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
self.effect.texture2d0.name = self.textureInfo.name;
self.effect.texture2d0.enabled = YES;
[self.effect prepareToDraw];
glEnableVertexAttribArray(GLKVertexAttribPosition);
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
long offset = (long)&_quad;
glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, sizeof(ImageVertex), (void*)(offset + offsetof(ImageVertex, geometryVertex)));
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, sizeof(ImageVertex), (void*)(offset + offsetof(ImageVertex, textureVertex)));
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
The above code works pretty well. Now I want to set the opacity of the PNG image. This may sound simple, but I have no idea how I can change the opacity...
I suspect your fragment shader is giving a constant 1.0 value for the alpha channel to gl_FragColor. That value should vary to produce blending. Please see the answers to this question:
How can I get Alpha blending transparency working in OpenGL ES 2.0?
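As a sketch of what that answer suggests, in WebGL/GLSL terms (the uniform name u_alpha and the helper are inventions of mine; adapt them to your own shader): scale the texel's alpha by a uniform, and the GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blend func you already set will fade the image.

```javascript
// Fragment shader with an opacity uniform. Multiplying the texel's alpha
// by u_alpha makes the existing alpha blending produce partial transparency.
const fragmentShaderSource = `
precision mediump float;
uniform sampler2D u_texture;
uniform float u_alpha;   // 0.0 = fully transparent, 1.0 = fully opaque
varying vec2 v_texCoord;
void main() {
  vec4 color = texture2D(u_texture, v_texCoord);
  gl_FragColor = vec4(color.rgb, color.a * u_alpha);
}`;

// Clamp the opacity before uploading it with gl.uniform1f, since values
// outside [0, 1] are meaningless for blending.
function clampOpacity(a) {
  return Math.min(1.0, Math.max(0.0, a));
}
```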
I am working on a simple iOS game and I am having a problem with UINavigationController and an EAGLView.
The situation is as follows: I use one EAGLView in conjunction with multiple controllers.
Whenever I push the MainViewController (which does all the custom openGL drawing), I end up using more memory (around 5MB per push!).
The problem seems to be within [eaglView_ setFramebuffer] - or at least that's where almost all allocations seem to happen (I've checked the live bytes via Instruments - around 70% of memory is allocated in this function).
EAGLView::setFramebuffer:
- (void)setFramebuffer {
    if (context) {
        [EAGLContext setCurrentContext:context];
        if (!defaultFramebuffer)
            [self createFramebuffer];
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, msaaFramebuffer);
        glViewport(0, 0, framebufferWidth, framebufferHeight);
    }
}
and EAGLView::createFramebuffer:
- (void)createFramebuffer
{
    if (context && !defaultFramebuffer) {
        [EAGLContext setCurrentContext:context];
        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);
        // MSAA stuff
        glGenFramebuffers(1, &msaaFramebuffer);
        glGenRenderbuffers(1, &msaaRenderBuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, msaaFramebuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, msaaRenderBuffer);
        glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, framebufferWidth, framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaRenderBuffer);
        glGenRenderbuffers(1, &msaaDepthBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, msaaDepthBuffer);
        glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, framebufferWidth, framebufferHeight);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepthBuffer);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}
As you can see - nothing special there.
I switch my ViewControllers like this: (in the AppDelegate)
- (void) swichToViewController: (UIViewController*) newViewController overlapCurrentView: (bool) overlap {
    // get the viewController on top of the stack - popViewController doesn't do anything if it's the rootViewController.
    if(!overlap) {
        // clear everything that was on the eaglView before
        [eaglView_ clearFramebuffer];
        [eaglView_ applyMSAA];
        [eaglView_ presentFramebuffer];
    }
    UIViewController* oldViewController = [navController_ topViewController];
    // see if the view to be switched to is already the top controller:
    if(oldViewController == newViewController)
        return;
    // if the view is already on the stack, just remove all views on top of it:
    if([[navController_ viewControllers] containsObject:newViewController]) {
        [oldViewController setView:nil];
        [newViewController setView:eaglView_];
        [navController_ popToViewController:newViewController animated:!overlap];
        return;
    }
    // else push the new controller
    [navController_ popViewControllerAnimated:NO];
    [oldViewController setView:nil];
    [newViewController setView:eaglView_];
    [navController_ pushViewController:newViewController animated:!overlap];
}
Finally, I render my sprites like this: (In my MainViewController.mm):
- (void)drawFrame
{
    // When I delete this line, I just get a white screen, even if I have called setFramebuffer earlier(?!)
    [(EAGLView *)self.view setFramebuffer];
    glClearColor(1, 1, 1, 1);
    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    GLfloat screenWidth = [UIScreen mainScreen].bounds.size.width;
    GLfloat screenHeight = [UIScreen mainScreen].bounds.size.height;
    glOrthof(0, screenWidth, 0, screenHeight, -1.0, 1.0);
    glViewport(0, 0, screenWidth, screenHeight);
    [gameManager_ playGame];
    // MSAA stuff
    [(EAGLView *)self.view applyMSAA];
    [(EAGLView *)self.view presentFramebuffer];
    [(EAGLView *)self.view clearFramebuffer];
}
Something that might be worth mentioning is that I don't allocate the views every time I push them, I keep references to them until the game exits.
[gameManager_ playGame] draws the sprites to the screen - but I've used this method in another project without any memory problems.
Any help would be really appreciated as I've been stuck on this for 2 days :/
Edit:
I've been able to narrow the problem down to a call to gldLoadFramebuffer. This is called whenever I try to draw something on the screen using an OpenGL function.
It seems to consume more memory when the context changes... But how could I avoid that?
I think I found the problem.
For anyone interested: The MSAA-Buffers weren't correctly deleted on switching the views. That caused the performance to drop significantly after a few pushes, and was also responsible for the increase in memory usage.
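For reference, the cleanup boils down to deleting the MSAA attachments and the FBO when a view is switched away, so their storage is actually released. Sketched in WebGL-style JavaScript (the function and field names are mine; in the Objective-C code this corresponds to the equivalent glDeleteRenderbuffers / glDeleteFramebuffers calls):

```javascript
// Delete the multisampled color and depth renderbuffers and their FBO.
// Skipping this on every view switch is what leaked memory here.
function deleteMsaaResources(gl, res) {
  gl.deleteRenderbuffer(res.msaaColorRenderbuffer);
  gl.deleteRenderbuffer(res.msaaDepthRenderbuffer);
  gl.deleteFramebuffer(res.msaaFramebuffer);
}
```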
I'm trying to draw some OpenGL graphics over the video from the camera.
I've modified Apple's GLCameraRipple sample with code that draws a couple of textured triangles. This code works well in another OpenGL project of mine (one without GLKit).
Unfortunately, here it only works like this: when my app starts I see the screen filled with the clear color, with my textured triangles on it (but no video), and after a moment the screen turns black and I don't see anything.
Could you explain what the problem is?
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.5, 0.0, 0.0, 0.3);
    glClear(GL_COLOR_BUFFER_BIT);
    glUseProgram(_program);
    glUniform1i(uniforms[UNIFORM_Y], 0);
    glUniform1i(uniforms[UNIFORM_UV], 1);
    if (_ripple)
    {
        glDrawElements(GL_TRIANGLE_STRIP, [_ripple getIndexCount], GL_UNSIGNED_SHORT, 0);
    }
    [self drawAnimations];
}
- (void) drawAnimations {
    // Use shader program.
    glUseProgram(_texturingProgram);
    GLfloat modelviewProj[16];
    [self MakeMatrix:modelviewProj
             OriginX:100.0
             OriginY:100.0
               Width:200.0
              Height:200.0
            Rotation:0.0];
    // update uniform values
    glUniformMatrix4fv(texturing_uniforms[TEXTURING_UNIFORM_MODEL_VIEW_PROJECTION_MATRIX], 1, GL_FALSE, modelviewProj);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, _animationTexture);
    glUniform1i(texturing_uniforms[TEXTURING_UNIFORM_TEXTURE], 0);
    glVertexAttribPointer(TEXTURING_ATTRIB_VERTEX, 3, GL_FLOAT, GL_FALSE, sizeof(vertexDataTextured), &plain[0].vertex);
    glEnableVertexAttribArray(TEXTURING_ATTRIB_VERTEX);
    glVertexAttribPointer(TEXTURING_ATTRIB_TEX_COORDS, 2, GL_FLOAT, GL_FALSE, sizeof(vertexDataTextured), &plain[0].texCoord);
    glEnableVertexAttribArray(TEXTURING_ATTRIB_TEX_COORDS);
    glDrawArrays(GL_TRIANGLES, 0, 6);
    if (![self validateProgram:_texturingProgram]) {
        NSLog(@"Failed to validate program: (%d)", _texturingProgram);
    }
}
You could place a transparent UIView containing your OpenGL drawing layer over another UIView containing the camera preview image layer.