What is the difference between files named "name.shader", "name.vsh", and "name.fsh" in SceneKit? When I use some shaders in my project, my model looks like a purple mask. What should I do?
There are three kinds of shader files you will typically encounter when working with SceneKit.
As in every OpenGL app, there are vertex shaders and fragment shaders. Vertex shaders often have the .vert or .vsh extension and fragment shaders often have the .frag or .fsh extension. These shaders are used with the SCNProgram class.
In addition, SceneKit exposes the concept of shader modifiers, which often have the .shader extension. Shader modifiers can affect either a vertex or fragment shader and are used with the SCNShadable protocol.
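For example, a fragment shader modifier can be attached to a material like this in Swift (a minimal sketch; the tint value and variable names here are just illustrative):

    import SceneKit
    import UIKit

    // Minimal sketch: a fragment shader modifier that tints the final color.
    // The modifier body uses SceneKit's shader modifier syntax, not a full program.
    let fragmentModifier = """
    _output.color.rgb = mix(_output.color.rgb, vec3(1.0, 0.0, 0.0), 0.5);
    """

    let material = SCNMaterial()
    material.diffuse.contents = UIColor.white
    // .fragment is SCNShaderModifierEntryPoint.fragment; .geometry, .surface
    // and .lightingModel are the other entry points.
    material.shaderModifiers = [.fragment: fragmentModifier]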
These file extensions are just conventions and could really be anything you want.
As for the purple look: SceneKit typically renders a material as solid purple/magenta when its shader fails to compile or link, so check the Xcode console for shader compilation errors when that happens.
Related
I’m a beginner to MSL, I’ve written a pair of vertex and fragment shaders. The fragment shader takes a material argument to implement phong lighting. I then learned about instancing and modified my vertex function with the instance_id tag, but it seems MSL has no analogous tag for my material argument in the fragment shader. I could put material in my vertex uniform, but I’d like instances to be able to have different materials. I could save the instance_id in my rasterizer struct and pass a material buffer to the fragment shader, but that ruins (my understanding of) the point of instancing. Is there a way to instance a fragment function, or a good-practice alternative?
In OpenGL ES or Metal, I render something to the screen using vertex/fragment shaders. But immediately after the fragment shader is done, I need to pass a few vertices and draw polylines connecting them. How do I do this? In other words, is it possible to chain the output of shaders with basic OpenGL commands that draw polygons? I could in principle draw the lines by implementing additional logic in the fragment shader, but that would involve a lot of calculation and if-then-else, which I think is not a very clean approach.
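In Metal, for example, this doesn't have to happen in the fragment shader at all: you can encode a second draw call with a line primitive into the same render pass. A rough Swift sketch (the pipeline states, buffers and counts here are assumed to exist already, created elsewhere):

    import Metal

    // Sketch: `encoder` is an active MTLRenderCommandEncoder for the current frame;
    // the pipeline states and vertex buffers are created during setup.
    func encodeScene(encoder: MTLRenderCommandEncoder,
                     meshPipeline: MTLRenderPipelineState,
                     linePipeline: MTLRenderPipelineState,
                     meshVertices: MTLBuffer, meshVertexCount: Int,
                     lineVertices: MTLBuffer, lineVertexCount: Int) {
        // First draw: the normal geometry with its own vertex/fragment shaders.
        encoder.setRenderPipelineState(meshPipeline)
        encoder.setVertexBuffer(meshVertices, offset: 0, index: 0)
        encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: meshVertexCount)

        // Second draw: a polyline connecting a few vertices, drawn on top.
        // A separate, very simple vertex/fragment pair is enough for this pass.
        encoder.setRenderPipelineState(linePipeline)
        encoder.setVertexBuffer(lineVertices, offset: 0, index: 0)
        encoder.drawPrimitives(type: .lineStrip, vertexStart: 0, vertexCount: lineVertexCount)
    }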
In the documentation of SCNView it is stated that:
SceneKit supports OpenGL ES 3.0, but some features are disabled when rendering in an OpenGL ES 3.0 context
I could not find anywhere which features were disabled. I wanted to use my own shader with SceneKit (assigning a SCNProgram to my material) and I tried to use a 3D texture. But I got the following error:
SceneKit: error, C3DBaseTypeFromString: unknown type name 'sampler3D'
So I'm guessing that 3D textures are part of the disabled features but I could not find a confirmation anywhere. Do I have to give up on SceneKit and do all my rendering with OpenGL manually just to use 3D textures?
Bonus question: why would Apple support only a subset of OpenGL ES 3.0 in SceneKit when iOS has full support for it?
Some features of SceneKit don't work in an ES3 context. You should still be able to use all ES3 features in your OpenGL code.
This looks like an error in SceneKit detecting the uniform declaration for use with its higher-level APIs... so you won't be able to, say, bind an SCNMaterialProperty to that uniform with setValue:forKey:. However, you should still be able to use the shader program -- you'll just have to bind the texture yourself with glActiveTexture/glBindTexture (inside a block you set up with handleBindingOfSymbol:usingBlock:).
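A rough Swift sketch of that manual binding (the symbol name "volumeSampler", the texture unit, and the texture ID are placeholders; the SCNProgram is assumed to already carry your vertex/fragment sources and be assigned to the material):

    import SceneKit
    import OpenGLES

    // Assumed setup: an SCNProgram with your own shaders, plus a 3D texture
    // that you created yourself (e.g. with glTexImage3D).
    let program = SCNProgram()
    let volumeTextureID: GLuint = 1   // placeholder texture name

    program.handleBinding(ofSymbol: "volumeSampler") { _, location, _, _ in
        // Bind the 3D texture to texture unit 2 and point the sampler3D uniform at it.
        glActiveTexture(GLenum(GL_TEXTURE2))
        glBindTexture(GLenum(GL_TEXTURE_3D), volumeTextureID)
        glUniform1i(GLint(location), 2)
    }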
I was wondering if it is worth it to use shaders to draw a 2D texture in XNA. I am asking because with OpenGL it is much faster if you use GLSL.
Everything on a modern GPU is drawn using a shader.
For the old immediate-mode rendering (i.e. glBegin/glVertex), that gets converted to something approximating buffers and shaders somewhere in the driver. This is why using GLSL is "faster": because it's closer to the metal, you're not going through a conversion layer.
For a modern API, like XNA, everything is already built around "buffers and shaders".
In XNA, SpriteBatch provides its own shader. The source code for the shader is available here. The shader itself is very simple: The vertex shader is a single matrix multiplication to transform the vertices to the correct raster locations. The pixel shader simply samples from your sprite's texture.
You can't really do much to make SpriteBatch's shader faster - there's almost nothing to it. There are some things you can do to make the buffering behaviour faster in specific circumstances (for example: if your sprites don't change between frames) - but this is kind of advanced. If you're experiencing performance issues with SpriteBatch, be sure you're using it properly in the first place. For what it does, SpriteBatch is already extremely well optimised.
For more info on optimisation, see this answer.
If you want to pass a custom shader into SpriteBatch (eg: for a special effect) use this overload of Begin and pass in an appropriate Effect.
Is it possible to write GLSL ES fragment shaders under iOS that generate multiple RGB outputs and have (for example) one sent to the screen and one sent to a texture?
Under normal (desktop) GLSL I guess this would be done by writing to gl_FragData[i] from the fragment shader. However, it seems that GLSL ES 2.0 only supports a single color output.
OpenGL ES 2.0 does not support FBOs with multiple render targets, and therefore GLSL ES does not support them either.
NVIDIA has an extension for it, but obviously that only works on NVIDIA's hardware.