OpenGL Screen Recording iOS

I'm having trouble with screen recording. Right now I'm using drawViewHierarchyInRect:afterScreenUpdates: and feeding the pixel buffer to an AVAssetWriterInputPixelBufferAdaptor. This works fine, but only on an iPhone 5s/5; on iPad and iPhone 4s this method performs far too poorly, 10-15 fps. I need at least 25-30.
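For reference, a minimal hedged sketch of that snapshot pipeline (my reconstruction, not the original code; `adaptor`, `view`, and `presentationTime` are assumed to exist as in the snippet further down):
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                         CVPixelBufferGetWidth(buffer),
                                         CVPixelBufferGetHeight(buffer),
                                         8,
                                         CVPixelBufferGetBytesPerRow(buffer),
                                         colorSpace,
                                         kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);
CGColorSpaceRelease(colorSpace);
// Flip so UIKit's top-left origin maps onto CG's bottom-left origin
CGContextTranslateCTM(ctx, 0, CVPixelBufferGetHeight(buffer));
CGContextScaleCTM(ctx, 1.0, -1.0);
UIGraphicsPushContext(ctx);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIGraphicsPopContext();
CGContextRelease(ctx);
CVPixelBufferUnlockBaseAddress(buffer, 0);
[adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];
CVPixelBufferRelease(buffer);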
My current method is the best I've found so far. I've also tried glReadPixels and renderInContext (which doesn't work with a live camera feed).
So I've been searching around on Stack Overflow and have tried most of the alternatives I found. The last one, OpenGL ES 2d rendering into image, is the one below, but I can't get it to work and I don't know if it's worth the time.
if ([[CCDirector sharedDirector] isPaused] || !writerInput || !writerInput.readyForMoreMediaData || !VIDEO_WRITER_IS_READY) {
    return;
}

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[[CCDirector sharedDirector] openGLView] context], NULL, &rawDataTextureCache);
if (err) {
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

// Empty IOSurface properties dictionary so the pixel buffer is IOSurface-backed
CFDictionaryRef empty;
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault,
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)esize.width,
                    (int)esize.height,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             rawDataTextureCache,
                                             renderTarget,
                                             NULL,       // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA,    // opengl format
                                             (int)esize.width,
                                             (int)esize.height,
                                             GL_BGRA,    // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);
CFRelease(attrs);
CFRelease(empty);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

CVPixelBufferLockBaseAddress(renderTarget, 0);
CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
CMTime presentationTime = CMTimeMake(elapsedTime * 30, 30);
if (![adaptor appendPixelBuffer:renderTarget withPresentationTime:presentationTime]) {
    NSLog(@"Adaptor FAIL");
}
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
CVPixelBufferRelease(renderTarget);
Above is the relevant code. I've been feeding a pixel buffer to my adaptor and it had been working fine up until now.
The adaptor is just failing and logging "Adaptor FAIL"; I don't get any error.
I don't know if I'm completely off base trying to do this with the EAGLContext of a cocos2d app.
Thanks in advance.
* UPDATE *
I changed,
CMTime presentationTime = CMTimeMake(elapsedTime * 30, 30);
to,
CMTime presentationTime = CMTimeMake(elapsedTime * 120, 120);
I believe a timescale of 30 was not enough, since the app was running faster than 30 FPS: because of the higher frame rate I was probably appending multiple frames with the same presentation time, which made the adaptor fail. The adaptor has stopped failing now, but the screen still freezes. I know where the buttons are, though, and I managed to stop the recording and play the video. It works, but the video flashes a black screen every other frame.
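A small hedged sketch of one way to keep the timestamps strictly increasing (my addition, not from the original post; `lastPresentationTime` is an assumed CMTime ivar):
// Hedged sketch: never append two buffers with the same (or an earlier)
// presentation time. `lastPresentationTime` is an assumed CMTime ivar,
// initialized to kCMTimeInvalid before recording starts.
CMTime presentationTime = CMTimeMake((int64_t)(elapsedTime * 120), 120);
if (CMTIME_IS_VALID(lastPresentationTime) &&
    CMTimeCompare(presentationTime, lastPresentationTime) <= 0) {
    return; // duplicate timestamp; skip this frame instead of failing the append
}
lastPresentationTime = presentationTime;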

Related

Get upright frames from AVPlayerItemVideoOutput when video has rotated exif orientation

I'm using an AVPlayerItemVideoOutput to get video frames from an AVPlayer and upload them to a GL texture for display. The issue is that AVPlayerItemVideoOutput seems to ignore the video's EXIF rotation data, so the CVPixelBufferRef it returns isn't upright.
Options:
1. I could edit my GL code to counter-rotate the texture when displaying it, but I'd rather get the frames upright in the first place so I don't have to transform the texture coordinates.
2. Some magic to get AVPlayerItemVideoOutput to give me the frames upright in the first place. The solution must be hardware accelerated.
Code:
//
// Setup
//
AVPlayer *player = [AVPlayer playerWithURL:fileURL];
AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:@{
    (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferOpenGLCompatibilityKey: @YES,
}];
[player.currentItem addOutput:output];

//
// Getting the frame data into a GL texture
//
CMTime currentTime = player.currentTime;
if ([output hasNewPixelBufferForItemTime:currentTime]) {
    CVPixelBufferRef frame = [output copyPixelBufferForItemTime:currentTime itemTimeForDisplay:NULL];
    CVPixelBufferLockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);
    GLsizei height = (GLsizei)CVPixelBufferGetHeight(frame);
    GLsizei bpr = (GLsizei)CVPixelBufferGetBytesPerRow(frame);
    void *data = CVPixelBufferGetBaseAddress(frame);
    glBindTexture(GL_TEXTURE_2D, gltexture);
    // bpr/4 accounts for row padding: bytes-per-row may exceed width * 4
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bpr / 4, height, 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8, data);
    CVPixelBufferUnlockBaseAddress(frame, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRelease(frame);
}
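One hedged starting point for option 1 (my sketch, not from the original question): the rotation lives in the source track's preferredTransform, which can be read once and baked into the texture transform when drawing:
// Hedged sketch: derive the rotation from the video track itself.
// AVPlayerItemVideoOutput hands back buffers in the track's stored
// orientation, so preferredTransform tells you how much to counter-rotate.
AVAssetTrack *videoTrack = [[player.currentItem.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGAffineTransform preferred = videoTrack.preferredTransform;
CGFloat angle = atan2(preferred.b, preferred.a); // rotation in radians
// ... feed `angle` into your texture-matrix uniform before drawing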

How to restore a GL_RENDERBUFFER?

I am working on storing and restoring my OpenGL ES based application's state.
I have a function that saves the GL_RENDERBUFFER by dumping its data with the following code:
glBindFramebuffer(GL_FRAMEBUFFER, fboTextureBufferData.framebuffer);
glBindRenderbuffer(GL_RENDERBUFFER, fboTextureBufferData.colorbuffer);

GLint x = 0, y = 0, width = backingWidth, height = backingHeight;
NSInteger dataLength = width * height * 4;
GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

// Read pixel data from the framebuffer
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
I don't see a glWritePixels function. What is the best way to repopulate the GL_RENDERBUFFER with the GLubyte data populated above? An example would be greatly appreciated.
EDIT 3:
Here is how I am attempting to configure the texture render buffer, and the function used to draw it. As noted in the code, if I specify GL_COLOR_ATTACHMENT1 for the glFramebufferTexture2D parameter, the stored pixel data is restored but I can't get any updates to draw. If I use GL_COLOR_ATTACHMENT0 instead, I get drawing updates but no restored pixel data.
I have tried various combinations (for instance, also using GL_COLOR_ATTACHMENT1 for the glFramebufferRenderbuffer parameter), but then I get an invalid framebuffer error when attempting to render. It seems I am so close, but I can't figure out how to get restoring and rendering working together.
- (bool)configureRenderTextureBuffer {
    [EAGLContext setCurrentContext:context];

    glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];

    glGenFramebuffers(1, &fboTextureBufferData.framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, fboTextureBufferData.framebuffer);

    glGenRenderbuffers(1, &fboTextureBufferData.colorbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, fboTextureBufferData.colorbuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, backingWidth, backingHeight);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, fboTextureBufferData.colorbuffer);

    // Generate texture name, stored in .textureID
    glGenTextures(1, &fboTextureBufferData.textureID);
    glBindTexture(GL_TEXTURE_2D, fboTextureBufferData.textureID);

    ////////////////// Read existing texture data //////////////////
    NSString *dataPath = [TDTDeviceUtilitesLegacy documentDirectory]; // returns the app's Documents directory
    NSData *data = [NSData dataWithContentsOfFile:[NSString stringWithFormat:@"%@/buffer.data", dataPath]];
    GLubyte *pixelData = (GLubyte *)[data bytes];

    // If I use GL_COLOR_ATTACHMENT1 here, my existing pixel data is restored
    // but no drawing occurs. If I use GL_COLOR_ATTACHMENT0, the data isn't
    // restored but drawing updates work.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, fboTextureBufferData.textureID, 0);

    // Populate with existing data
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, backingWidth, backingHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, &pixelData[0]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"failed to make complete render texture framebuffer object %x", status);
        return false;
    }
    return true;
}
Here is the code for rendering. The viewFramebuffer is attached to GL_COLOR_ATTACHMENT0 and is used so the texture frame buffer can be zoomed and positioned inside the view.
- (void)renderTextureBuffer {
    // Bind the texture frame buffer; if I don't use this, I can't get it to draw
    //glBindFramebuffer(GL_FRAMEBUFFER, fboTextureBufferData.framebuffer);

    // If I use this instead of binding the framebuffer above, I get no drawing and a black background
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, fboTextureBufferData.textureID, 0);

    renderParticlesToTextureBuffer();

    // Bind the view frame buffer.
    glBindFramebuffer(GL_FRAMEBUFFER, viewFramebuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
    drawFboTexture();

    [context presentRenderbuffer:GL_RENDERBUFFER];
}
If you need to write data directly to a render target, using a renderbuffer is not a good option. In this case, it's much better to use a texture instead.
Using a texture as a FBO attachment works very similarly to using a renderbuffer. Where you currently use glRenderbufferStorage() to allocate a renderbuffer of the needed dimensions, you create a texture instead, and allocate its storage with:
GLuint texId = 0;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
Then you attach it to the framebuffer with:
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texId, 0);
Now, if you later want to fill your render target with data, you can simply call glTexImage2D() again, or even better glTexSubImage2D() if the size is unchanged, to do that.
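For example (a minimal hedged sketch, reusing the `data` buffer filled by the glReadPixels code above and the `texId` texture from the allocation snippet):
// Hedged sketch: push previously saved pixels back into the texture
// that backs the FBO. `texId`, `width`, `height`, `data` as above.
glBindTexture(GL_TEXTURE_2D, texId);
glTexSubImage2D(GL_TEXTURE_2D, 0,  // level
                0, 0,              // x/y offset
                width, height,
                GL_RGBA, GL_UNSIGNED_BYTE, data);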

Fast way to create openGL texture from JPEG-2000?

I need to load large-ish (5-megapixel) JPEG images and create OpenGL textures from them. They are non-power-of-two and cannot be pre-processed for this application. Loading is extremely slow, about one second per image on an iPad Air 2. I need to load a dozen or two such images and create a GL texture for each, as quickly as I can.
Profiling shows the bottleneck to be CGContextDrawImage. Previous answers suggest this is a common problem.
This previous answer seems most relevant and (unfortunately) does not leave me hopeful. I haven't tried libjpeg (suggested in another answer) yet; I'm trying to keep third-party code out for several reasons.
But that answer was from 2014 and things change. Does anybody know of a faster way to create textures from JPEGs? Either by changing the arguments to CGContextDrawImage (as in this answer; I've tried the suggested changes with no noticeable speed change) or by using a different approach entirely?
The current texture creation block (called asynchronously):
UIImage *image = [UIImage imageWithData:jpegImageData];
if (image) {
    GLuint textureID;
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    GLsizei width = (GLsizei)CGImageGetWidth(image.CGImage);
    GLsizei height = (GLsizei)CGImageGetHeight(image.CGImage);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    void *imageData = malloc(height * width * 4);
    CGContextRef imgcontext = CGBitmapContextCreate(imageData, width, height, 8, 4 * width, colorSpace, kCGImageAlphaNoneSkipLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    // This is the profiled bottleneck: the JPEG is decoded here
    CGContextDrawImage(imgcontext, CGRectMake(0, 0, width, height), image.CGImage);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    CGContextRelease(imgcontext);
    free(imageData);

    // ... store the textureID for use by the caller
    // ...
}
(edited to add)
I tried GLKTextureLoader. I kept getting a nil return value, with an NSError of domain GLKTextureLoaderErrorDomain and code 12.
I've realized that the JPEGs I need to load are JPEG 2000, and that may be the problem. I've played with the GLKTextureLoader approach; I can get it to work with non-J2K JPEGs, but not the J2K ones I need to load. (FWIW, the files I need to load are packed inside larger files, so I extract a data subrange from within the file, as such:
NSData *jpegImageData = [data subdataWithRange:NSMakeRange(offset, dataLength)];
GLKTextureInfo *jpegTexture;
NSError *theError;
jpegTexture = [GLKTextureLoader textureWithContentsOfData:jpegImageData options:nil error:&theError];
but, as mentioned, jpegTexture comes back as nil with the aforementioned error. This works on small JPEGs, even using the subdataWithRange approach.)
Likewise,
UIImage *image = [UIImage imageWithData:jpegImageData];
jpegTexture = [GLKTextureLoader textureWithCGImage:image.CGImage options:nil error:&theError];
returns nil with the same "code 12" error.
This iOS Developer page (Table 1-1) suggests that JPEG-2000 is supported on OS X only, but when I try the
CFArrayRef mySourceTypes = CGImageSourceCopyTypeIdentifiers();
CFShow(mySourceTypes);
approach for showing supported formats, JPEG-2000 is among them (running on my iOS device):
33 : <CFString 0x19d721bf8 [0x1a1da0150]>{contents = "public.jpeg-
Any suggestions for using the faster GLKTextureLoader methods on JPEG-2000?
Did you try the GLKit Framework method?
GLKTextureInfo *spriteTexture;
NSError *theError;
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Sprite" ofType:@"jpg"]; // 1
spriteTexture = [GLKTextureLoader textureWithContentsOfFile:filePath options:nil error:&theError]; // 2
glBindTexture(spriteTexture.target, spriteTexture.name); // 3
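This won't by itself fix the JPEG-2000 "code 12" failures, but if main-thread decode time is the concern, GLKTextureLoader also has an asynchronous instance API. A hedged sketch (my addition; `context` is assumed to be the app's EAGLContext, `jpegImageData` as above):
// Hedged sketch: decode and upload off the main thread, sharing
// textures with the main context via its sharegroup.
GLKTextureLoader *loader = [[GLKTextureLoader alloc] initWithSharegroup:context.sharegroup];
[loader textureWithContentsOfData:jpegImageData
                          options:nil
                            queue:NULL // NULL = completion handler runs on the main queue
                completionHandler:^(GLKTextureInfo *tex, NSError *error) {
    if (tex) {
        // store tex.name for the caller
    } else {
        NSLog(@"texture load failed: %@", error);
    }
}];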

OpenGL format texture

I am currently trying to display a video on screen using OpenGL ES 2 on iOS.
I will sum up a bit what I am doing to playback and display the video on screen :
First I have a .mov file recorded using a GPUImageMovieWriter object. When the recording is complete, I play the video back using AVPlayer. To be able to retrieve frames from the video, I set up an AVPlayerItemVideoOutput:
NSDictionary *test = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:test];
I then call copyPixelBufferForItemTime: on the AVPlayerItemVideoOutput and receive the CVImageBufferRef corresponding to the frame of the initial video at a specific time.
Finally, here is the function I created to create an OpenGL texture from the buffer :
- (void)setupTextureFromBuffer:(CVImageBufferRef)imageBuffer {
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    int bufferHeight = (int)CVPixelBufferGetHeight(imageBuffer);
    int bufferWidth = (int)CVPixelBufferGetWidth(imageBuffer);
    CVPixelBufferGetPixelFormatType(imageBuffer);

    glBindTexture(GL_TEXTURE_2D, m_videoTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(imageBuffer));

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
By doing this (and also using some unrelated algorithms to do some augmented reality things) I get a very strange result, as if the video had been cut into slices (I can't show you because I don't have enough reputation to do so).
It looks like the data is not being interpreted correctly by OpenGL (wrong format? wrong type?).
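(A hedged aside, my addition rather than part of the original post: a sliced or skewed image is a classic symptom of row padding, i.e. CVPixelBufferGetBytesPerRow() being larger than bufferWidth * 4. The bpr/4 trick from the AVPlayerItemVideoOutput snippet earlier on this page would account for it:)
// Hedged sketch: upload using the buffer's actual row stride.
// If bytesPerRow > width * 4, each scanline carries padding that skews the image.
GLsizei bpr = (GLsizei)CVPixelBufferGetBytesPerRow(imageBuffer);
glBindTexture(GL_TEXTURE_2D, m_videoTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bpr / 4, bufferHeight, 0,
             GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(imageBuffer));
// GL_BGRA matches kCVPixelFormatType_32BGRA (via the
// APPLE_texture_format_BGRA8888 extension, which iOS exposes).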
I checked whether it could be a corrupted buffer error by using this function :
- (void)saveImage:(CVPixelBufferRef)pixBuffer
{
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext
                             createCGImage:ciImage
                             fromRect:CGRectMake(0, 0,
                                                 CVPixelBufferGetWidth(pixBuffer),
                                                 CVPixelBufferGetHeight(pixBuffer))];
    UIImage *uiImage = [UIImage imageWithCGImage:videoImage];
    UIImageWriteToSavedPhotosAlbum(uiImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
-> The saved image appears properly in the photo album.
It may come from the .mov file, but what can I do to check whether something is wrong with this file?
Thanks a lot for your help; I've been stuck on this problem for hours/days!
You need to use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, then transfer the planes to separate chroma and luma OpenGL ES textures. There is an example at https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Listings/AVBasicVideoOutput_APLEAGLView_m.html
I tried several RGB-based options but could not make them work.
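A condensed hedged sketch of that two-plane upload (my summary; see the linked Apple sample for the canonical version, and `videoTextureCache` is an assumed CVOpenGLESTextureCache):
// Hedged sketch: plane 0 is luma (Y), plane 1 is chroma (CbCr);
// each plane becomes its own texture, combined to RGB in the fragment shader.
CVOpenGLESTextureRef lumaTexture = NULL, chromaTexture = NULL;
size_t w = CVPixelBufferGetWidth(pixelBuffer);
size_t h = CVPixelBufferGetHeight(pixelBuffer);

glActiveTexture(GL_TEXTURE0);
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D,
    GL_LUMINANCE, (GLsizei)w, (GLsizei)h,
    GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &lumaTexture);
glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture), CVOpenGLESTextureGetName(lumaTexture));

glActiveTexture(GL_TEXTURE1);
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D,
    GL_LUMINANCE_ALPHA, (GLsizei)(w / 2), (GLsizei)(h / 2),
    GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &chromaTexture);
glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture), CVOpenGLESTextureGetName(chromaTexture));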

Minimal example for creating FBO using OpenGL ES 2.0 on iOS

I am having trouble creating an FBO and then using it as a texture in my iOS app. I've uploaded a stripped-down version of my app to GitHub, which shows the problem. I get just a blank screen with a weird color that I never set. Checking the status shows that the FBO was created successfully.
https://github.com/glman74/simpleFBO
I have already looked at related Stack Overflow questions, and specifically the link below from datenwolf, which shows how to set up an FBO using GLUT but does not use shaders. I am still not sure what I am doing wrong.
https://github.com/datenwolf/codesamples/blob/master/samples/OpenGL/minimalfbo/
I am also appending the relevant parts of the code here. In the renderFBO method I am just doing a clear, so I would have expected this solid-colored (green) texture to be used when rendering the polygon in the main render.
FBO setup:
// initialize FBO
- (void)setupFBO
{
    fbo_width = 512;
    fbo_height = 512;

    glGenFramebuffers(1, &fboHandle);
    glGenTextures(1, &fboTex);
    glGenRenderbuffers(1, &depthBuffer);

    glBindFramebuffer(GL_FRAMEBUFFER, fboHandle);

    glBindTexture(GL_TEXTURE_2D, fboTex);
    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGBA,
                 fbo_width, fbo_height,
                 0,
                 GL_RGBA,
                 GL_UNSIGNED_BYTE,
                 NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glFramebufferTexture2D(GL_DRAW_FRAMEBUFFER_APPLE, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, fboTex, 0);

    glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24_OES, fbo_width, fbo_height);
    glFramebufferRenderbuffer(GL_DRAW_FRAMEBUFFER_APPLE, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);

    // FBO status check
    GLenum status;
    status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    switch (status) {
        case GL_FRAMEBUFFER_COMPLETE:
            NSLog(@"fbo complete");
            break;
        case GL_FRAMEBUFFER_UNSUPPORTED:
            NSLog(@"fbo unsupported");
            break;
        default:
            /* programming error; will fail on all hardware */
            NSLog(@"Framebuffer Error");
            break;
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
FBO render:
// render FBO
- (void)renderFBO
{
    glBindTexture(GL_TEXTURE_2D, 0);
    glEnable(GL_TEXTURE_2D);
    glBindFramebuffer(GL_FRAMEBUFFER, fboHandle);
    glViewport(0, 0, fbo_width, fbo_height);
    glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}
Main Render:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    // render FBO tex
    [self renderFBO];

    glViewport(0, 0, self.view.bounds.size.width, self.view.bounds.size.height);

    // render main
    glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glEnable(GL_TEXTURE_2D);
    glActiveTexture(GL_TEXTURE0);
    glUniform1i(uSamplerLoc, 0);

    // Render
    glUseProgram(_program);
    glUniformMatrix4fv(uniforms[UNIFORM_MODELVIEWPROJECTION_MATRIX], 1, 0, _modelViewProjectionMatrix.m);
    glUniformMatrix3fv(uniforms[UNIFORM_NORMAL_MATRIX], 1, 0, _normalMatrix.m);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisable(GL_TEXTURE_2D);
}
Any ideas on what I might be doing wrong?
Edit #1:
I have made some small changes to my code on GitHub: I tried using GL_FRAMEBUFFER instead of GL_DRAW_FRAMEBUFFER_APPLE above, and added a bind-texture call that was missing. It still doesn't work.
Edit #2:
I found a solution. I needed to make this call in the main render:
// reset to main framebuffer
[((GLKView *) self.view) bindDrawable];
Found a solution, and I have updated the code at the link below in case it is useful for someone.
https://github.com/glman74/simpleFBO/
The main issue was that I needed to make this call in the main render:
// reset to main framebuffer
[((GLKView *) self.view) bindDrawable];
This is because I was using GLKView. The situation is different from the CAEAGLLayer examples.
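For illustration, a hedged sketch of how the draw method looks with that fix applied (my reconstruction from the description above, not the repo's exact code):
// Hedged reconstruction: render into the FBO first, then rebind the
// GLKView's own drawable before rendering the main scene.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    [self renderFBO]; // leaves fboTex filled via fboHandle

    // reset to main framebuffer; GLKView manages its own FBO,
    // so glBindFramebuffer(GL_FRAMEBUFFER, 0) is not enough here
    [((GLKView *)self.view) bindDrawable];

    glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... bind fboTex and draw the textured quad as before
}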
