iOS GLKit post processing?

I have a game running using GLKit, and I would like to add some post-processing effects using a shader after each frame has rendered.
Is it possible to do this under GLKit?

This is possible.
You will need to create your own offscreen framebuffer object with an associated texture, and render your scene into it. Then call -[GLKView bindDrawable] to point further rendering at the GLKView's framebuffer. You can then perform more rendering, including reading from the texture that you just rendered to.
The framebuffer API is all standard OpenGL ES calls, which you can read about in any OpenGL ES 2.0 book. Apple also has some iOS-specific documentation at http://developer.apple.com/library/ios/ipad/#documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/
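A minimal sketch of that setup (width, height, and glkView are placeholders here; error checking is omitted):

#import <OpenGLES/ES2/gl.h>

// Create an offscreen framebuffer whose color attachment is a texture.
GLuint sceneFBO, sceneTexture;
glGenTextures(1, &sceneTexture);
glBindTexture(GL_TEXTURE_2D, sceneTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glGenFramebuffers(1, &sceneFBO);
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, sceneTexture, 0);

// Each frame: draw the scene offscreen, then post-process into the GLKView.
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
// ... render the scene as usual ...
[glkView bindDrawable];               // back to the GLKView's framebuffer
glBindTexture(GL_TEXTURE_2D, sceneTexture);
// ... draw a fullscreen quad with the post-processing shader ...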

Related

OpenGL ES deprecated in iOS 12 and SKShader

I am very new to the concept and use of shaders in SpriteKit.
I found this tutorial on how to render a Mandelbrot fractal with a custom shader file - Fractal.fsh - attached to a Color Sprite's Custom Shader property.
https://www.weheartswift.com/fractals-Xcode-6/
It works fine and I thought to myself that learning about OpenGL ES and custom shaders in SpriteKit would be a fun exercise.
According to Apple though, OpenGL ES is deprecated as of iOS 12.
https://developer.apple.com/library/archive/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/BestPracticesforShaders/BestPracticesforShaders.html
My question is this:
Does this mean that custom shaders for use in SpriteKit should be written in Metal as of now?
I have tried to figure out how to rewrite the Fractal.fsh shader code, referred to in the first link, in Metal, but I have not yet been able to find any resources on how to convert existing custom SKShaders from OpenGL ES to Metal. However, I am NOT looking for someone to rewrite that code to use Metal, only a pointer in the right direction.
UPDATE:
Based on the answer from @Knight0fDragon I will try to clarify my question:
The documentation on the SKShader class states that:
"An SKShader object holds a custom OpenGL ES fragment shader."
https://developer.apple.com/documentation/spritekit/skshader
So if a SKShader object holds a custom OpenGL ES fragment shader, what will it hold after the support for OpenGL ES is deprecated?
How would one go about creating a custom fragment shader to use in SpriteKit if one cannot use OpenGL ES as of iOS 12?
At first I thought that the *.fsh file containing the GLSL code could simply be replaced with a *.metal file containing equivalent Metal code, but that assessment was clearly too naive (I tried, and I couldn't assign the *.metal file to the Color Sprite's Custom Shader property).
From the documentation on "Executing Shaders in Metal and OpenGL":
On devices that support it, the GLSL code you provide to SKShader is automatically converted to Metal shading language and run on a Metal renderer.
So, from my understanding, SpriteKit will use Metal as a backend where it is available and convert your shaders for you when compiling them. I did not find an option to directly write the shaders in Metal.
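For example, attaching the tutorial's Fractal.fsh to a sprite is still done by handing SKShader the GLSL file (a sketch; the sprite's color and size here are arbitrary):

#import <SpriteKit/SpriteKit.h>

// SpriteKit compiles this GLSL itself, converting it to Metal where supported.
SKShader *fractalShader = [SKShader shaderWithFileNamed:@"Fractal.fsh"];
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithColor:[UIColor blackColor]
                                                    size:CGSizeMake(512, 512)];
sprite.shader = fractalShader;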
According to Apple (see near bottom of the page at this link)...
Apps built using OpenGL ES will continue to run in iOS 12, but OpenGL ES is deprecated in iOS 12. Games and graphics-intensive apps that previously used OpenGL ES should now adopt Metal [emphasis added].
If you are starting from scratch, I suggest you write shaders in Metal.

Don't clipping planes work in Kivy OpenGL?

I'm trying to do render-to-texture to create reflection and refraction textures for a water shader... but glClipPlane is not defined in Kivy's OpenGL. Here are some screenshots:
Test with PyOpenGL
Test with Kivy OpenGL
From Clipping-planes in OpenGL ES 2.0, it looks like glClipPlane was never part of OpenGL ES 2.0, which Kivy nominally targets. If you do want to use it, you probably can, but it isn't part of Kivy's exposed API (these low-level OpenGL calls are usually considered internal to Kivy).

GLKView and double buffering

Reading through the Apple docs for GLKView, it looks like GLES rendering is single buffered when using GLKView. That is, GLKView creates one standard FBO, as well as an MSAA FBO if requested. Is that it? No double buffering when using GLKView?
Now, if this is true and GLKView is not double buffered, can I do the default FBO setup manually using CAEAGLLayer? In that case, I could set up as many FBOs as I want and swap between them when blitting to the screen. Does that make sense?
Is the
[context presentRenderbuffer:GL_RENDERBUFFER];
call even asynchronous?
Is double buffering for GL encouraged on mobile platforms (iOS in this case) from a performance point of view?
The questions above may seem trivial, but I can't find any answers in the official docs.
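For reference, the manual setup I have in mind is roughly this (a sketch, inside a UIView subclass whose layerClass is CAEAGLLayer; context is an EAGLContext and error checking is omitted):

CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;

GLuint framebuffer, colorRenderbuffer;
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glGenRenderbuffers(1, &colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
// Allocate the renderbuffer's storage from the layer itself.
[context renderbufferStorage:GL_RENDERBUFFER fromDrawable:eaglLayer];
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRenderbuffer);
// ... render ...
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER];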

OpenGL Texture Cache source... can be renderbuffer?

Using OpenGL and CVOpenGLESTextureCacheCreateTextureFromImage:
In the docs it says that the target can be GL_TEXTURE_2D or GL_RENDERBUFFER: what does that mean? Can a renderbuffer bound to the framebuffer at GL_COLOR_ATTACHMENT0 be used to get an image?
Your question is slightly confusing.
The docs there are saying that you can push video frames into either a GL_TEXTURE_2D or a GL_RENDERBUFFER.
In the first case, you can use a sampler2D in your fragment shader to look up colors from the video and put video on polygons.
In the second case, yes, you can bind the renderbuffer to your FBO there and get images using glReadPixels().
However, I wouldn't necessarily call either of these a "source"; they are targets. Unless you mean as a source for later reading?
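A minimal sketch of that second case (fbo, renderbuffer, width, and height are assumed to already exist):

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, renderbuffer);
// ... render into the renderbuffer ...
GLubyte *pixels = malloc(width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
// ... use the pixels, then free(pixels) ...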

glReadPixels alternative

I want to capture screen frames on iOS into an AVAssetWriter (or even just a UIImage). The traditional method of using glReadPixels works just fine, but it is very slow. I understand that since iOS 5.0 I can use a different, faster method.
I followed lots of posts, like OpenGL ES 2d rendering into image, which mention the use of CVOpenGLESTextureCacheCreate, but I can't get it to work.
Right now, right before every call to presentRenderbuffer:, I'm following Apple's glReadPixels sample (http://developer.apple.com/library/ios/#qa/qa1704/_index.html), which works. When I instead follow that post, which basically replaces the call to glReadPixels by creating a cached texture, binding it to a render target (a pixel buffer), and then reading from the pixel buffer, it seems to "steal" the images from the screen, so nothing is rendered.
Can anyone shed some light on how to do this? Also, please mention whether this works only for OpenGL ES 2.0; I am looking for a fast alternative that will also work on previous versions.
A code sample would be excellent.
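For reference, the approach I am trying to follow boils down to roughly this (a sketch; _context, width, and height are placeholders, an offscreen FBO is assumed to be bound, and this path needs an ES 2.0 context):

#import <CoreVideo/CoreVideo.h>

CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &textureCache);

// A pixel buffer that will back the render target.
NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attrs, &pixelBuffer);

// Wrap the pixel buffer in a GL texture and attach it to the bound FBO.
CVOpenGLESTextureRef texture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, (GLsizei)width, (GLsizei)height,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(texture), 0);

// ... render the frame into this FBO ...

// Read the result straight from the pixel buffer instead of glReadPixels.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
void *frameData = CVPixelBufferGetBaseAddress(pixelBuffer); // hand to AVAssetWriter
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);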
