I've read that one of the downsides of SpriteKit is that you're unable to develop custom shaders if you use it.
However, I read a post here on SO that suggests otherwise:
How to apply full-screen SKEffectNode for post-processing in SpriteKit
Can you develop your own shaders if you decide to use SpriteKit?
Thanks
It is not supported in iOS 7, but iOS 8 will support custom shaders. For more information, view the pre-release documentation of SKShader.
An SKShader object holds a custom OpenGL ES fragment shader. Shader objects are used to customize the drawing behavior of many different kinds of nodes in Sprite Kit.
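As an illustration of what that looks like, here is a minimal sketch of attaching a custom fragment shader via SKShader (modern Swift syntax; the pulsing-tint effect and the sprite itself are just placeholder assumptions, not anything from the documentation):

```swift
import SpriteKit

// GLSL fragment shader source. SpriteKit supplies u_texture, v_tex_coord,
// and u_time automatically; this shader pulses the sprite's color over time.
let source = """
void main() {
    vec4 color = texture2D(u_texture, v_tex_coord);
    float pulse = 0.5 + 0.5 * sin(u_time);
    gl_FragColor = vec4(color.rgb * pulse, color.a);
}
"""

let shader = SKShader(source: source)
let sprite = SKSpriteNode(color: .white, size: CGSize(width: 64, height: 64))
sprite.shader = shader   // the shader now runs for every fragment of this sprite
```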
Sprite Kit does not provide an interface for using custom OpenGL shaders. The SKEffectNode class lets you use Core Image filters to post-process parts of a Sprite Kit scene, though. Core Image provides a number of built-in filters that might do some of what you're after, and on OS X you can create custom filter kernels using a language similar to GLSL.
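As a sketch of that Core Image route (the filter choice and radius are just illustrative; any built-in filter that accepts an input image works the same way):

```swift
import SpriteKit
import CoreImage

// Wrap part of the scene in an SKEffectNode and attach a built-in
// Core Image filter; every child of the effect node gets post-processed.
let effectNode = SKEffectNode()
let blur = CIFilter(name: "CIGaussianBlur")
blur?.setValue(8.0, forKey: kCIInputRadiusKey)
effectNode.filter = blur
effectNode.shouldEnableEffects = true

let sprite = SKSpriteNode(color: .orange, size: CGSize(width: 100, height: 100))
effectNode.addChild(sprite)   // this sprite is now rendered through the blur
```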
Related
I wonder, which format for Normal Maps is the correct one to use within SceneKit content, for iOS? As referenced here: DirectX vs. OpenGL normal maps.
OpenGL or DirectX? Or does it not matter?
I had to figure it out by testing the OpenGL and DirectX normal map types side by side on planes, which gave me the following results:
This means that if you have the choice between an OpenGL and a DirectX normal map, you are better off choosing OpenGL.
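If you only have the DirectX variant, note that the two conventions differ solely in the direction of the green (Y) channel, so you can convert a DirectX-style map by inverting G for every pixel. A minimal sketch, assuming tightly packed RGBA8 pixel data (the function name is mine):

```swift
import Foundation

// Convert a DirectX-style normal map to OpenGL-style by inverting the
// green channel (the Y axis points the opposite way between the two).
// `pixels` is assumed to be tightly packed RGBA8 data.
func convertDirectXToOpenGL(_ pixels: [UInt8]) -> [UInt8] {
    var out = pixels
    // In RGBA layout, green is byte offset 1 within each 4-byte pixel.
    for i in stride(from: 1, to: out.count, by: 4) {
        out[i] = 255 - out[i]
    }
    return out
}
```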
Is it possible to combine SpriteKit with Metal? And if it is, how could one combine Metal particles and SKNodes in a physics world so that they collide with each other? What's the usual approach for this kind of requirement?
Thanks
They are two totally different technologies. Sprite Kit is a framework that abstracts all of the rendering work for you and provides a built-in physics engine. Metal, on the other hand, is purely a low-level GPU-accelerated graphics API that gives you complete control over the rendering process. It is similar to OpenGL ES but with much less overhead.
Sprite Kit will use Metal (on eligible devices) to render your scene. You don't need to do a single thing because Sprite Kit handles all rendering behind-the-scenes.
You don't combine them; they are two totally different frameworks. If you are looking to add physics to Metal, you will either need to write your own physics engine or use an existing engine like Box2D (which I believe Sprite Kit uses internally).
This appears to be possible now using SKRenderer which allows you to mix SpriteKit and Metal (by the looks of it adding SpriteKit to Metal and vice versa).
It's iOS 11+, macOS 10.13+ and tvOS 11+.
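A rough sketch of that mixing via SKRenderer (the command buffer and pass descriptor are assumed to come from your own Metal pipeline, and the viewport size is a placeholder):

```swift
import SpriteKit
import Metal
import QuartzCore

// Render a SpriteKit scene as an overlay inside an existing Metal pass.
// Call this from your draw loop, passing in your own command buffer and
// render pass descriptor.
func drawOverlay(device: MTLDevice,
                 commandBuffer: MTLCommandBuffer,
                 passDescriptor: MTLRenderPassDescriptor,
                 scene: SKScene) {
    let renderer = SKRenderer(device: device)
    renderer.scene = scene
    renderer.update(atTime: CACurrentMediaTime())   // advance actions/physics
    renderer.render(withViewport: CGRect(x: 0, y: 0, width: 1024, height: 768),
                    commandBuffer: commandBuffer,
                    renderPassDescriptor: passDescriptor)
}
```

In practice you would create the SKRenderer once, not per frame; it's created inline here only to keep the sketch self-contained.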
I am trying to create an app that simulates a woodblock print effect similar to the app Moku Hanga. I have tried many combinations of the built-in Core Image and GPUImage filters but have not had success. I don't have any experience with OpenGL and GLSL, but I understand that it is possible to write custom Core Image kernels in iOS 8 and custom fragment shaders in GPUImage. I am learning more about the iOS graphics pipeline and OpenGL ES shaders, but I'll still have to understand more about image manipulation to mimic this effect.
Does anyone have recommendations on how I could simulate the Moku Hanga effect using one of these frameworks or approaches (filter composition or custom shader)?
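Not a recipe for the exact Moku Hanga look, but as a sketch of the custom Core Image kernel route mentioned above: a posterize step that flattens colors into a few ink-like bands is one typical ingredient of a woodblock effect. The kernel name and band count here are my own placeholders:

```swift
import CoreImage

// A tiny custom Core Image color kernel (iOS 8+) that posterizes colors
// into a few flat bands. The source is written in the Core Image kernel
// language, which is GLSL-like.
let kernelSource = """
kernel vec4 posterize(__sample s) {
    float levels = 4.0;
    vec3 banded = floor(s.rgb * levels) / levels;
    return vec4(banded, s.a);
}
"""
let posterizeKernel = CIColorKernel(source: kernelSource)
```

You would then apply it with `posterizeKernel?.apply(extent:arguments:)`, and chain it with an edge-darkening pass for the carved-outline look.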
In the documentation of SCNView it is stated that:
SceneKit supports OpenGL ES 3.0, but some features are disabled when rendering in an OpenGL ES 3.0 context
I could not find anywhere which features were disabled. I wanted to use my own shader with SceneKit (assigning a SCNProgram to my material) and I tried to use a 3D texture. But I got the following error:
SceneKit: error, C3DBaseTypeFromString: unknown type name 'sampler3D'
So I'm guessing that 3D textures are part of the disabled features but I could not find a confirmation anywhere. Do I have to give up on SceneKit and do all my rendering with OpenGL manually just to use 3D textures?
Bonus question: Why Apple would support only a subset of OpenGL ES 3.0 in SceneKit since iOS has full support?
Some features of SceneKit don't work in an ES3 context. You should still be able to use all ES3 features in your OpenGL code.
This looks like an error in SceneKit detecting the uniform declaration for use with its higher-level APIs... so you won't be able to, say, bind an SCNMaterialProperty to that uniform with setValue:forKey:. However, you should still be able to use the shader program -- you'll have to bind it with glBindTexture/glActiveTexture instead (inside a block you set up with handleBindingOfSymbol:usingBlock:).
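A sketch of that manual binding (it assumes you have already created a 3D texture with glTexImage3D; `myTextureID` and the uniform name `volumeTexture` are placeholders):

```swift
import SceneKit
import OpenGLES

let myTextureID: GLuint = 0   // placeholder: a texture created via glTexImage3D

let material = SCNMaterial()
material.program = SCNProgram()   // your vertex/fragment shaders go on this

// SceneKit can't parse `sampler3D`, so bind the texture unit ourselves
// each time the material is about to be rendered.
material.handleBinding(ofSymbol: "volumeTexture") { _, location, _, _ in
    glActiveTexture(GLenum(GL_TEXTURE0))
    glBindTexture(GLenum(GL_TEXTURE_3D), myTextureID)
    glUniform1i(GLint(location), 0)   // sampler uses texture unit 0
}
```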
We want to make a 3D game for the Apple iPad, and we are thinking about
a way to import 3D models from Blender into Xcode.
Is there a way to do that?
We want to use OpenGL ES 2.0.
Xcode isn't a game engine or a 3D SDK, and you can't 'import' Blender files into Xcode directly. What you can do is take your 3D assets and use them within your apps, either directly through OpenGL (rather low level) or using a 3D engine such as Unity (easier).
Either way, there are a number of questions already on Stack Overflow that you might find useful:
Opengl ES how to import a 3D model and map textures to it on runtime
Iphone + OpenGL ES + Blender Model: Rotation by Touch
Choosing 3D Engine for iOS in C++
...I highly recommend you take a look at what options are out there, decide on the best way to implement your 3D game (be it raw OpenGL or an engine), and go from there.
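If you do go the raw-OpenGL route, a common pipeline is to export from Blender to a simple text format such as Wavefront .obj and parse it into vertex arrays for your GL buffers. A deliberately minimal sketch (faces, normals, and UVs are ignored; the function name is mine):

```swift
import Foundation

// Minimal Wavefront .obj vertex parser: collects "v x y z" lines into a
// flat float array suitable for a GL vertex buffer.
func parseOBJVertices(_ obj: String) -> [Float] {
    var vertices: [Float] = []
    for line in obj.split(separator: "\n") {
        let parts = line.split(separator: " ")
        // Only plain vertex lines: "v <x> <y> <z>".
        guard parts.first == "v", parts.count >= 4 else { continue }
        for p in parts[1...3] {
            if let value = Float(p) { vertices.append(value) }
        }
    }
    return vertices
}
```

A real loader also has to read the `f` (face) lines and de-index them, since OpenGL ES expects per-vertex attribute arrays; libraries and engines handle this for you.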