Presently I am trying to read the pixel data from the framebuffer in order to capture the screen on iOS. glReadPixels works fine when I use the following code to set up the framebuffer:
//buffers
// Create a depth buffer that has the same size as the color buffer.
glGenRenderbuffersOES(1, &m_depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT24_OES, width, height);
// Create the framebuffer object.
glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
GL_RENDERBUFFER_OES, m_colorRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES,
GL_RENDERBUFFER_OES, m_depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
but when I use a depth buffer and color buffer set up for multisampling, glReadPixels does not capture any pixel data the way the earlier code does. For multisampling I use the following code:
glGenFramebuffersOES(1, &sampleFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
//GLuint sampleColorRenderbuffer;
glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
//glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, width, height);
//glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
GLuint sampleDepthRenderbuffer;
glGenRenderbuffersOES(1, &sampleDepthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT24_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
//glBindRenderbufferOES(GL_RENDERBUFFER_OES,sampleColorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
I use the following code to read the pixel data:
CGRect screenBounds = [[UIScreen mainScreen] bounds];
int backingWidth = screenBounds.size.width;
int backingHeight = screenBounds.size.height;
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
NSInteger myDataLength = backingWidth * backingHeight * 4;
buffer = (GLuint *) malloc(myDataLength);
glReadPixels(0, 0, backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
Any idea how to capture the correct pixel data with the multisampling technique, or am I doing something wrong? Please point me in the right direction.
Thanks
When using multisampled FBOs you cannot just read the sample buffer (as it doesn't contain simple pixels). You first need to resolve the sample buffers into a single buffer.
You do this by creating another non-multisampled FBO (let's call it resultFramebuffer) with the necessary renderbuffer storage you want to read, and then calling:
glBindFramebuffer(GL_READ_FRAMEBUFFER, sampleFramebuffer);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resultFramebuffer);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT, GL_NEAREST);
And then you read from the result buffer (of course the actual constant and function names may contain OES or APPLE). If you don't need the final depth values, the result buffer doesn't need a depth renderbuffer.
EDIT: As you wrote in your comment (and as my own searching confirms), on iOS there is a dedicated function, glResolveMultisampleFramebufferAPPLE, that you have to use instead of glBlitFramebuffer. The rest stays the same.
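For reference, a minimal sketch of that resolve-then-read sequence on iOS, reusing the framebuffer, sampleFramebuffer, m_colorRenderbuffer, width and height names from the question (an outline under those assumptions, not a verified drop-in):
// Resolve the multisampled samples into the plain framebuffer.
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, framebuffer);
glResolveMultisampleFramebufferAPPLE();
// Read from the resolved, non-multisampled framebuffer.
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, framebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);
GLubyte *pixels = (GLubyte *)malloc(width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// ... use pixels, then free(pixels);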
In an OpenGL ES app I'm working on, when I use glReadPixels to read back pixels I get an empty buffer, and I don't know what's wrong in my code. Thanks for any help.
- (void)setTextureImage:(UIImage *)image {
self.textureID = [self createTextureWithImage:image];
CAEAGLLayer *layer = [[CAEAGLLayer alloc] init];
layer.frame = CGRectMake(0, 0, self.frame.size.width, self.frame.size.height);
layer.contentsScale = [[UIScreen mainScreen] scale];
layer.opaque = NO;
[self.layer addSublayer:layer];
[self bindRenderLayer:layer];
}
- (void)bindRenderLayer:(CALayer <EAGLDrawable> *)layer {
glGenRenderbuffers(1, &renderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
[self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer];
glGenFramebuffers(1, &frameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER,
GL_COLOR_ATTACHMENT0,
GL_RENDERBUFFER,
renderBuffer);
}
- (GLint)drawableWidth {
GLint backingWidth;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
return backingWidth;
}
- (GLint)drawableHeight {
GLint backingHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
return backingHeight;
}
The sample code above is just the part that displays the texture, and it works fine. renderBuffer and frameBuffer are properties of my class.
The pixel-reading code is below; the buffer is empty after glReadPixels. Is there anything I missed in the setup?
glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
// I've tried one or both of these bind calls, but neither worked
//glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
NSInteger dataLength = self.drawableWidth * self.drawableHeight * 4;
GLubyte *buffer = (GLubyte *)malloc(dataLength * sizeof(GLubyte));
glReadPixels(0,
0,
self.drawableWidth,
self.drawableHeight,
GL_RGBA,
GL_UNSIGNED_BYTE,
buffer);
I found a solution. When you set CAEAGLLayer's drawableProperties like this:
layer.drawableProperties = @{
kEAGLDrawablePropertyRetainedBacking: @(YES),
kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
};
Setting kEAGLDrawablePropertyRetainedBacking to YES makes it so you can still read the buffer after rendering has finished.
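For completeness, a sketch of where that fits in the setup from the question, reusing its layer, renderBuffer and context names; the properties need to be in place before renderbufferStorage:fromDrawable: is called, since that call is what applies them:
CAEAGLLayer *layer = [[CAEAGLLayer alloc] init];
layer.frame = CGRectMake(0, 0, self.frame.size.width, self.frame.size.height);
layer.contentsScale = [[UIScreen mainScreen] scale];
layer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking: @(YES), // keep contents readable after presenting
    kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
};
glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
[self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer];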
I have the same problem as the one above; however, even with those tips I still cannot get the data from glReadPixels.
I've pasted my source code below; it is almost the same as the previous one. I do bind GL_READ_FRAMEBUFFER_APPLE before taking the snapshot, but the data comes back null.
Create My Frame Buffer
- (void)createFrameBuffer {
glGenRenderbuffers(1, &colorRenderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderBuffer);
[_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer];
GLint backingWidth;
GLint backingHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
glGenFramebuffers(1, &defaultFrameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, defaultFrameBuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,GL_RENDERBUFFER, colorRenderBuffer);
glGenFramebuffers(1, &sampleFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, sampleFramebuffer);
glGenRenderbuffers(1, &sampleColorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, sampleColorRenderbuffer);
glGenRenderbuffers(1, &sampleDepthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, sampleDepthRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, sampleDepthRenderbuffer);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE){
NSAssert(NO, @"buffer is not complete");
NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
}
}
Render Function
- (void)render {
FC_PERFORMANCE_START();
if ([EAGLContext currentContext] !=_context) {
[EAGLContext setCurrentContext:_context];
}
[self destoryFrameBuffer];
[self createFrameBuffer];
[self.director setup:self.frame.size contentScale:self.contentScaleFactor backgroudColor:self.backgroundColor];
[self.director mainloop];
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, defaultFrameBuffer);
glResolveMultisampleFramebufferAPPLE();
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderBuffer);
const GLenum discards[] = {GL_COLOR_ATTACHMENT0,GL_DEPTH_ATTACHMENT};
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE,2,discards);
glBindFramebuffer(GL_FRAMEBUFFER, sampleFramebuffer);
[_context presentRenderbuffer:GL_RENDERBUFFER];
FC_PERFORMANCE_END("FCViewOpenGL Render");
}
Screen Shot Function
- (nullable UIImage*)snapShot{
__block UIImage *ret = nil;
dispatch_sync(dispatch_get_main_queue(), ^{
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, defaultFrameBuffer);
GLint backingWidth;
GLint backingHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
NSInteger x = 0, y = 0, width2 = backingWidth, height2 = backingHeight;
NSInteger dataLength = width2 * height2 * 4;
GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));
CHECK_GL_ERROR_DEBUG();
glReadPixels((GLint)x, (GLint)y, (GLsizei)width2, (GLsizei)height2, GL_RGBA, GL_UNSIGNED_BYTE, data);
GLenum attachments[] = {GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT };
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 2, attachments);
CHECK_GL_ERROR_DEBUG();
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
CGImageRef iref = CGImageCreate(width2, height2, 8, 32, width2 * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
ref, NULL, true, kCGRenderingIntentDefault);
NSInteger widthInPoints, heightInPoints;
if (NULL != UIGraphicsBeginImageContextWithOptions) {
CGFloat scale = self.contentScaleFactor;
widthInPoints = width2 / scale;
heightInPoints = height2 / scale;
UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
}
else {
widthInPoints = width2;
heightInPoints = height2;
UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
}
CGContextRef cgcontext = UIGraphicsGetCurrentContext();
CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
free(data);
CFRelease(ref);
CFRelease(colorspace);
CGImageRelease(iref);
ret = image;
});
return ret;
}
Can anyone please give me some help?
See OpenGL ES 3.2 Specification; 16.1.2 ReadPixels; page 406
An INVALID_OPERATION error is generated if the value of READ_FRAMEBUFFER_BINDING (see section 9) is non-zero, the read framebuffer is framebuffer complete, and the effective value of SAMPLE_BUFFERS for the read framebuffer is one.
....
If the read framebuffer is multisampled (its effective value of SAMPLE_BUFFERS is one) ...
See also OpenGL-Refpages; OpenGL ES 3.0; glReadPixels:
GL_INVALID_OPERATION is generated if GL_READ_FRAMEBUFFER_BINDING is non-zero, the read framebuffer is complete, and the value of GL_SAMPLE_BUFFERS for the read framebuffer is greater than zero.
This means that you can't use glReadPixels on a multisampled framebuffer; you will get a GL_INVALID_OPERATION error. Call glGetError after glReadPixels to retrieve the error information.
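As a quick sanity check (a sketch only; width, height and buffer stand in for your own variables):
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
GLenum err = glGetError();
if (err == GL_INVALID_OPERATION) {
    // reading directly from a multisampled framebuffer is rejected
    NSLog(@"glReadPixels failed: 0x%x", err);
}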
To solve the issue you have to go the way over a conventional framebuffer. Do the following steps:
Create a 2nd framebuffer object, which is conventional (not multisampled).
Bind the multisample framebuffer for reading (GL_READ_FRAMEBUFFER_APPLE).
Bind the conventional framebuffer for drawing (GL_DRAW_FRAMEBUFFER_APPLE).
Use glBlitFramebuffer to copy the pixels from the multisampled (read) framebuffer to the conventional (draw) framebuffer.
After copying, bind the conventional framebuffer for reading (GL_READ_FRAMEBUFFER_APPLE).
Finally, use glReadPixels to read from the conventional framebuffer (see the sketch below).
As an alternative, of course you can read the pixels directly from the default framebuffer.
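A sketch of those steps with the ES 3.0 names (on an ES 1.1/2.0 context the GL_READ_FRAMEBUFFER_APPLE/GL_DRAW_FRAMEBUFFER_APPLE targets plus glResolveMultisampleFramebufferAPPLE replace the blit); sampleFramebuffer, resolveFramebuffer, width and height are placeholders for your own objects and sizes:
glBindFramebuffer(GL_READ_FRAMEBUFFER, sampleFramebuffer);    // multisampled source
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFramebuffer);   // conventional destination
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);           // resolve the samples
glBindFramebuffer(GL_READ_FRAMEBUFFER, resolveFramebuffer);   // read the resolved pixels
GLubyte *pixels = (GLubyte *)malloc((size_t)width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);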
I'm programming on iOS using OpenGL ES and I'm trying to implement MSAA. I have succeeded in doing this. My screen comes out being anti-aliased. However, every frame the following string is logged to my output: "Failed to make complete framebuffer object 8cd6". 0x8CD6 means GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT. So, while it is working, I'm not sure what to make of this. I have read my code a hundred times and I see no mistakes with it.
The error is logged before the first time Render() is called but after Setup() has finished. I don't know exactly where, because what comes in-between those two function calls is code written by Apple.
void Setup() {
// ...
glGenFramebuffers(1, &resolveFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, resolveFramebuffer);
glGenRenderbuffers(1, &resolveColorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, resolveColorRenderbuffer);
[[self context] renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer];
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, resolveColorRenderbuffer);
int backingWidth, backingHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
std::cerr << "Failed to make complete resolve framebuffer." << std::endl;
glGenFramebuffers(1, &sampleFramebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, sampleFramebuffer);
GLuint sampleColorRenderbuffer;
glGenRenderbuffers(1, &sampleColorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, sampleColorRenderbuffer);
GLuint sampleDepthRenderbuffer;
glGenRenderbuffers(1, &sampleDepthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, sampleDepthRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, sampleDepthRenderbuffer);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
std::cerr << "Failed to make complete sample framebuffer." << std::endl;
glBindFramebuffer(GL_FRAMEBUFFER, resolveFramebuffer);
}
void Render() {
glBindFramebuffer(GL_FRAMEBUFFER, sampleFramebuffer);
glViewport(0, 0, renderer->DeviceWidth(), renderer->DeviceHeight());
glClearColor(0xfa/255.f, 0xe6/255.f, 0xd4/255.f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindVertexArrayOES(vertexArray);
glUseProgram(program);
GLuint mvpm = glGetUniformLocation(program, "modelViewProjectionMatrix");
GLuint nm = glGetUniformLocation(program, "normalMatrix");
glUniformMatrix4fv(mvpm, 1, GL_FALSE, modelViewProjectionMatrix.begin());
glUniformMatrix4fv(nm, 1, GL_FALSE, normalMatrix.begin());
glDrawArrays(GL_TRIANGLES, 0, verts.size()/2);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, resolveFramebuffer);
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glResolveMultisampleFramebufferAPPLE();
const GLenum discards[] = {GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT};
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 2, discards);
glBindRenderbuffer(GL_RENDERBUFFER, resolveColorRenderbuffer);
[[self context] presentRenderbuffer:GL_RENDERBUFFER];
}
I'm trying to get pixels from a framebuffer with multisampling, but it returns only zeros. I do call glResolveMultisampleFramebufferAPPLE as suggested here and here, but I cannot figure out what the problem is in my case.
First of all, I create a non-multisampled framebuffer with a color attachment:
GLuint framebuffer, colorRenderbuffer;
glGenFramebuffersOES(1, &framebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glGenRenderbuffersOES(1, &colorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, w, h);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);
Then I create a multisampled framebuffer with color and depth attachments:
GLuint sampleFramebuffer, sampleColorRenderbuffer, sampleDepthRenderbuffer;
glGenFramebuffersOES(1, &sampleFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, w, h);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glGenRenderbuffersOES(1, &sampleDepthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT16_OES, w, h);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
Then I clear the framebuffers:
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glClear(GL_COLOR_BUFFER_BIT);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
glViewport(0, 0, w, h);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
Then I do my drawing (this is Cocos3D drawing code):
[cc3Layer visit];
Then I resolve the buffers:
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, framebuffer);
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glResolveMultisampleFramebufferAPPLE();
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, framebuffer);
And then I get all zeros:
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
I skipped the two GL checks that verify the framebuffers were created successfully, since they both pass. Where is the error in my code?
You should bind the non-multisampled color renderbuffer BEFORE calling glReadPixels, like this:
glResolveMultisampleFramebufferAPPLE();
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, framebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
The problem was not in OpenGL but in the Cocos3D rendering (-visit did not set up some properties needed for drawing, but -drawScene did). Here is the working code:
+(UIImage*) takeScreenshotFromScreenRect:(CGRect)rect withResultSize:(CGSize)outSize
{
CCDirector *director = [CCDirector sharedDirector];
director.nextDeltaTimeZero = YES;
rect.origin.x *= CC_CONTENT_SCALE_FACTOR();
rect.origin.y *= CC_CONTENT_SCALE_FACTOR();
rect.size.width *= CC_CONTENT_SCALE_FACTOR();
rect.size.height *= CC_CONTENT_SCALE_FACTOR();
int w = rect.size.width;
int h = rect.size.height;
int winW = director.winSizeInPixels.width;
int winH = director.winSizeInPixels.height;
GLuint bufferLength = w * h * 4;
GLubyte* buffer = (GLubyte*)malloc(bufferLength);
[director pause];
static GLuint framebuffer = 0, colorRenderbuffer;
static GLuint sampleFramebuffer, sampleColorRenderbuffer, sampleDepthRenderbuffer;
if (framebuffer == 0)
{
glGenFramebuffersOES(1, &framebuffer);
glGenRenderbuffersOES(1, &colorRenderbuffer);
glGenFramebuffersOES(1, &sampleFramebuffer);
glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
glGenRenderbuffersOES(1, &sampleDepthRenderbuffer);
}
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, winW, winH);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, director.openGLView.pixelSamples, GL_RGBA8_OES, winW, winH);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, director.openGLView.pixelSamples, GL_DEPTH_COMPONENT16_OES, winW, winH);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
glClear(GL_COLOR_BUFFER_BIT);
glClearColor(150.0/255, 190.0/255, 255.0/255, 1);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glClearColor(150.0/255, 190.0/255, 255.0/255, 1);
[director drawScene];
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, framebuffer);
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glResolveMultisampleFramebufferAPPLE();
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, framebuffer);
glReadPixels(rect.origin.x, rect.origin.y, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
GLenum attachments[] = {GL_COLOR_ATTACHMENT0_OES, GL_DEPTH_ATTACHMENT_OES};
glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 2, attachments);
// restoring render buffers from cocos
ES1Renderer *renderer = [[CCDirector sharedDirector].openGLView valueForKey:@"renderer_"];
glBindFramebuffer(GL_FRAMEBUFFER_OES, renderer.msaaFrameBuffer);
glBindRenderbuffer(GL_RENDERBUFFER_OES, renderer.msaaColorBuffer);
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
[director resume];
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = 4 * w;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
CGImageRef iref = CGImageCreate(w, h, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
uint32_t* pixels = (uint32_t*)malloc(bufferLength);
CGContextRef context = CGBitmapContextCreate(pixels, outSize.width, outSize.height, 8, outSize.width * 4, CGImageGetColorSpace(iref), kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGContextTranslateCTM(context, 0, outSize.height);
CGContextScaleCTM(context, 1.0f, -1.0f);
switch (director.deviceOrientation)
{
case CCDeviceOrientationPortrait:
break;
case CCDeviceOrientationPortraitUpsideDown:
CGContextRotateCTM(context, CC_DEGREES_TO_RADIANS(180));
CGContextTranslateCTM(context, -outSize.width, -outSize.height);
break;
case CCDeviceOrientationLandscapeLeft:
CGContextRotateCTM(context, CC_DEGREES_TO_RADIANS(-90));
CGContextTranslateCTM(context, -outSize.height, 0);
break;
case CCDeviceOrientationLandscapeRight:
CGContextRotateCTM(context, CC_DEGREES_TO_RADIANS(90));
CGContextTranslateCTM(context, outSize.width * 0.5f, -outSize.height);
break;
}
CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, outSize.width, outSize.height), iref);
CGImageRef imageFromContext = CGBitmapContextCreateImage(context);
UIImage *outputImage = [UIImage imageWithCGImage:imageFromContext];
CGDataProviderRelease(provider);
CGImageRelease(iref);
CGContextRelease(context);
free(buffer);
free(pixels);
return outputImage;
}
I have got a GLKView where I try to draw a couple of cubes; I create textures from a view and map them onto the cubes. However, when I start the app on a retina device, the textures are correctly sized but they look terrible. I have tried setting the contentScaleFactor of the GLKView to the scale of the main screen, to no avail. I have also tried multiplying the buffer's dimensions by the scale, which resulted in textures that looked crisp but were only 1/4 of the original size.
Without further ado, here is what I have done (without the above-mentioned multiplication):
GLKView
- (void)setupGL {
UIScreen *mainScreen = [UIScreen mainScreen];
const CGFloat scale = mainScreen.scale;
self.contentScaleFactor = scale;
self.layer.contentsScale = scale;
glGenFramebuffers(1, &defaultFrameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, defaultFrameBuffer);
glGenRenderbuffers(1, &depthBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, self.bounds.size.width, self.bounds.size.height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);
glGenRenderbuffers(1, &colorBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, self.bounds.size.width, self.bounds.size.height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorBuffer);
glEnable(GL_DEPTH_TEST);
}
Here I load the textures
// make space for an RGBA image of the view
GLubyte *pixelBuffer = (GLubyte *)malloc(
4 *
cV.bounds.size.width *
cV.bounds.size.height);
// create a suitable CoreGraphics context
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context =
CGBitmapContextCreate(pixelBuffer,
cV.bounds.size.width, cV.bounds.size.height,
8, 4*cV.bounds.size.width,
colourSpace,
kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);
// draw the view to the buffer
[cV.layer renderInContext:context];
// upload to OpenGL
glTexImage2D(GL_TEXTURE_2D, 0,
GL_RGBA,
cV.bounds.size.width, cV.bounds.size.height, 0,
GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
The answer to this question can be found here
How to create a CGBitmapContext which works for Retina display and not wasting space for regular display?
What I basically did is multiply the texture and buffer dimensions by the screen's scale factor; because that alone only yielded a texture 1/4 of the size, I also had to scale the context by the scale factor:
CGContextScaleCTM(context, scaleFactor, scaleFactor);
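Put together, the retina-aware version of the buffer and context setup from the question might look roughly like this (a sketch; cV and the texture upload come from the question, and scaleFactor is assumed to be the main screen's scale):
const CGFloat scaleFactor = [UIScreen mainScreen].scale;
const size_t pixelWidth  = (size_t)(cV.bounds.size.width  * scaleFactor);
const size_t pixelHeight = (size_t)(cV.bounds.size.height * scaleFactor);
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * pixelWidth * pixelHeight);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixelBuffer,
                                             pixelWidth, pixelHeight,
                                             8, 4 * pixelWidth, colourSpace,
                                             kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);
// Scale the CTM so the layer renders at full pixel resolution rather than 1/4 of it.
CGContextScaleCTM(context, scaleFactor, scaleFactor);
[cV.layer renderInContext:context];
// Upload at the scaled pixel dimensions.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             (GLsizei)pixelWidth, (GLsizei)pixelHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);
CGContextRelease(context);
free(pixelBuffer);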