Binding multiple buffers in OpenGL ES - iOS

Is it possible to bind multiple framebuffers and renderbuffers in OpenGL ES? I'm rendering into an offscreen framebuffer/renderbuffer and would prefer to just use my existing render code.
Here's what I'm currently doing:
// create/bind framebuffer and renderbuffer (for screen display)
// render all content
// create/bind framebuffer2 and renderbuffer2 (for off-screen rendering)
// render all content again (would like to skip this)
Here's what I'd like to do:
// create/bind framebuffer and renderbuffer (for screen display)
// create/bind framebuffer2 and renderbuffer2 (for off-screen rendering)
// render all content (only once)
Is this possible?

You cannot render into multiple framebuffers at once. You might be able to use MRTs to render into multiple render targets (textures/renderbuffers) that belong to the same FBO by outputting multiple colors from the fragment shader, but not into multiple FBOs, like an offscreen FBO and the default framebuffer. But if I'm informed correctly, ES doesn't support MRTs at the moment anyway.
But in your case you still don't need to render the scene twice. If you need the image in an offscreen buffer anyway, just use a texture instead of a renderbuffer to hold the offscreen data (it shouldn't make a difference). This way you can render the scene once into the offscreen buffer (the texture) and then get it onto the screen by drawing a simple textured quad into the screen framebuffer with a pass-through fragment shader.
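For illustration, a minimal sketch of that approach in ES 2.0 could look like this (renderScene(), drawFullscreenQuad(), screenFramebuffer, width and height are placeholders for whatever your existing code provides):

    GLuint offscreenTex, offscreenFBO;

    // Color texture that will receive the rendered scene.
    glGenTextures(1, &offscreenTex);
    glBindTexture(GL_TEXTURE_2D, offscreenTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    // Use non-mipmapped filtering; the default min filter expects mipmaps.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // FBO with the texture as its color attachment.
    glGenFramebuffers(1, &offscreenFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, offscreenTex, 0);

    // Render the scene ONCE, into the texture.
    renderScene();

    // Display it: bind the on-screen framebuffer (on iOS that's your
    // CAEAGLLayer-backed FBO, not 0) and draw a textured quad.
    glBindFramebuffer(GL_FRAMEBUFFER, screenFramebuffer);
    glBindTexture(GL_TEXTURE_2D, offscreenTex);
    drawFullscreenQuad();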
Though in OpenGL ES it may make a difference whether you use a renderbuffer or a texture to hold the offscreen data, as ES doesn't have glGetTexImage. So if you need to copy the offscreen data to the CPU, you won't get around glReadPixels and therefore need a renderbuffer. But even in this case you don't need to render the scene twice. You just have to introduce another FBO with a texture attached: render the scene once into that texture, then render the texture into both the offscreen FBO and the screen framebuffer. That might still be faster than drawing the whole scene twice, though only profiling can tell you for sure.
But if you need to copy the data to the CPU for processing, you can also just copy it directly from the screen framebuffer and don't need an offscreen FBO at all. And if you need the offscreen data only for GPU-based processing, then a texture is better than a renderbuffer anyway. So it might be useful to ask whether you actually need an additional offscreen buffer at all if it only contains the same data as the screen framebuffer. That might render the whole problem obsolete.

Related

Is my WebGL FBO color attachment being cleared?

I'm trying to render something to a texture using a library called regl. I manage to render an effect using two render targets, and I see the result in one.
Capturing the frame right after I've finished rendering to the target looks like this, and it represents a screen blit (a full-screen quad with this texture). This is how I would like it to work.
Once I pass this to some other regl commands, in some future frame this texture attachment seems to get nuked. It is the same object I'm trying to render with, with the same resource, but the data is gone. I have tried detaching the texture from the FBO, but it doesn't seem to help. What could I be looking for that would make this texture behave like this?
This ended up being a problem with Regl and WebViz. I was calling React.useState to store the resource that regl uses for the texture. For some reason this setter seems to have been invoked again, which "reset" the texture to an empty 1x1.

OpenGL ES: how to copy an existing FBO's color attachment (renderbuffer) to another FBO's color attachment (texture2D)

The platform is iPhone, OpenGL ES 2.0.
The framework already creates a main FBO with a renderbuffer as its color attachment.
And I have my own FBO with a texture2D as its color attachment.
I want to copy the main FBO's content into my FBO.
I tried the common glCopyTexImage2D approach, but it's too slow on my device (iPad 1).
So I wonder if a faster solution is out there.
If the main FBO used a texture2D as its color attachment, I know I could just draw a full-screen quad with that texture into my FBO, but how do I draw its renderbuffer into my FBO? I googled for quite a while but found no specific answer.
Renderbuffers are almost useless on most embedded systems. All you can do with them is read from them with glReadPixels(), which is too slow.
You should use a texture attachment, as you said, and then render with that texture. This article will help:
http://processors.wiki.ti.com/index.php/Render_to_Texture_with_OpenGL_ES
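If you control (or can hook into) the framework's FBO setup, here is a hedged sketch of swapping its renderbuffer color attachment for a texture and then "copying" with a quad draw (mainColorTex, myFBO and drawFullscreenQuad() are placeholder names, not anything the framework provides):

    // While the main FBO is bound, attach a texture instead of the
    // renderbuffer as its color attachment.
    GLuint mainColorTex;
    glGenTextures(1, &mainColorTex);
    glBindTexture(GL_TEXTURE_2D, mainColorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, mainColorTex, 0);

    // After the main scene has been rendered, the "copy" is just a
    // full-screen quad draw with a pass-through shader.
    glBindFramebuffer(GL_FRAMEBUFFER, myFBO);
    glBindTexture(GL_TEXTURE_2D, mainColorTex);
    drawFullscreenQuad();   // no glCopyTexImage2D needed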

What's the easiest way to apply a glow/blur/wind effect to my scene?

I want to apply a glow/blur/wind effect to my models (or to my complete scene). How should I do this? What's the easiest way?
You would get a better answer if you provided more concrete details of what you want to implement.
For a full-screen pass:
Render the scene as normal to an off-screen texture.
Bind the texture containing the rendered scene as an input texture for the next pass.
Render a full-screen quad (two triangles) with a simple vertex shader.
Inside the fragment shader, do your blur/glow/whatever effect by sampling the texture in interesting ways (see the shader sketch below).
Note that if you have any HUD elements, you'll want to render them after the full-screen effect.
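As one illustration of the fragment-shader step, here is what a naive 3x3 box blur might look like as an ES 2.0 GLSL shader pair, embedded as C string literals; u_scene is the scene texture from the first pass, and u_texelSize would be set to (1.0/width, 1.0/height). All names here are mine, not from any particular library:

    // Vertex shader: quad positions in [-1,1] double as texture coords.
    static const char *blurVertSrc =
        "attribute vec2 a_position;                               \n"
        "varying vec2 v_uv;                                       \n"
        "void main() {                                            \n"
        "    v_uv = a_position * 0.5 + 0.5;                       \n"
        "    gl_Position = vec4(a_position, 0.0, 1.0);            \n"
        "}                                                        \n";

    // Fragment shader: average a 3x3 neighbourhood of the scene texture.
    static const char *blurFragSrc =
        "precision mediump float;                                 \n"
        "uniform sampler2D u_scene;                               \n"
        "uniform vec2 u_texelSize;                                \n"
        "varying vec2 v_uv;                                       \n"
        "void main() {                                            \n"
        "    vec4 sum = vec4(0.0);                                \n"
        "    for (int x = -1; x <= 1; x++)                        \n"
        "        for (int y = -1; y <= 1; y++)                    \n"
        "            sum += texture2D(u_scene, v_uv +             \n"
        "                vec2(float(x), float(y)) * u_texelSize); \n"
        "    gl_FragColor = sum / 9.0;                            \n"
        "}                                                        \n";

A glow is typically the same idea: blur a brightness-filtered copy of the scene and add it back on top of the original.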

Scaling the contents of OpenGL ES framebuffer

Currently I'm scaling down the contents of my OpenGL ES 1.1 framebuffer like this:
save current framebuffer and renderbuffer references
bind framebuffer2 and smallerRenderbuffer
re-render all contents
now smallerRenderbuffer contains the "scaled-down" contents of framebuffer
do stuff with contents of smallerRenderbuffer
re-bind framebuffer and renderbuffer
What's an alternative way to do this? Perhaps I could just copy and scale the contents of the original framebuffer and renderbuffer into framebuffer2 and smallerRenderbuffer, and thereby avoid the re-render step. I've been looking at glScalef, but I'm not sure where to go from here.
Note: this is all done in OpenGL ES 1.1 on iOS.
You could do an initial render to texture, then render from that to both the framebuffer you want to be visible and to the small version. Whichever way you look at it, what you're trying to do is use already-rendered data as the source for another render, so rendering to a texture is the most natural thing to do.
You're probably already familiar with the semantics of a render to texture if you're doing work on the miniature version, but for completeness: you'd create a framebuffer object, use glFramebufferTexture2DOES to attach a named texture to a suitable attachment point, then bind either the framebuffer or the texture (ensuring the other isn't simultaneously bound if you want defined behaviour) as appropriate.
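For completeness, a hedged ES 1.1 sketch using the OES_framebuffer_object entry points; renderScene(), drawTexturedQuad(), screenFramebuffer, framebuffer2 and the sizes are placeholders for your existing objects, and on ES 1.1 the texture dimensions may need to be powers of two:

    GLuint sceneTex, sceneFBO;

    // Texture that will hold the rendered scene.
    glGenTextures(1, &sceneTex);
    glBindTexture(GL_TEXTURE_2D, sceneTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // FBO with that texture as its color attachment.
    glGenFramebuffersOES(1, &sceneFBO);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, sceneFBO);
    glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                              GL_TEXTURE_2D, sceneTex, 0);

    // Render the scene once, into the texture.
    renderScene();

    // Draw it full-size into the visible framebuffer...
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, screenFramebuffer);
    glViewport(0, 0, width, height);
    drawTexturedQuad();

    // ...and again into the small FBO; the smaller viewport does
    // the scaling down for you.
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer2);
    glViewport(0, 0, smallWidth, smallHeight);
    drawTexturedQuad();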

How can I access the raw pixel data of an OpenGL ES 2 off-screen renderbuffer?

I can render to the screen, but I would like to access the raw pixels produced by the shader. The only way I know is glReadPixels off the screen, but I would like to get at them before they are drawn to the screen, in order to save frames to disk.
Specifically, I want to use shaders to process images that are never displayed: 1) grab an image from disk, 2) render it, 3) output it back to disk.
Have you tried rendering to an offscreen texture, as described here?
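A minimal sketch of that flow, assuming offscreenFBO is a complete framebuffer object with a color attachment of the same width/height, and runProcessingPass() stands in for your image-processing draw call:

    // Render into the offscreen FBO; nothing is ever presented on screen.
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);
    runProcessingPass();

    // Read the raw RGBA pixels back to the CPU, bottom row first.
    // (malloc needs <stdlib.h>.)
    GLubyte *pixels = (GLubyte *)malloc((size_t)width * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    // ...write pixels to disk, then free(pixels).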
