Applying color to an OpenGL ES 2.0 point sprite texture in a fragment shader (iOS)

I am creating a particle emitter with a texture that is a white circle with alpha. I am unable to color the sprites using the color passed to the fragment shader.
I tried the following:
gl_FragColor = texture2D(Sampler, gl_PointCoord) * colorVarying;
This seems to be doing some kind of additive coloring.
What I am attempting is porting this:
http://maniacdev.com/2009/07/source-code-particle-based-explosions-in-iphone-opengl-es/
from ES 1.1 to ES 2.0

With your code, consider the following example:
texture2D returns (1, 0, 0, 1): red, fully opaque
colorVarying is (0, 1, 0, 0.5): green, half transparent
Then gl_FragColor would be (0, 0, 0, 0.5): black, half transparent.
Generally you can use mix to interpolate values, but if I understood your problem correctly, it's even easier.
Basically, you only want the alpha channel from your texture applied to another color, right? Then you could do this:
gl_FragColor = vec4(colorVarying.rgb, texture2D(Sampler, gl_PointCoord).a);
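Put into a complete ES 2.0 fragment shader, a minimal sketch might look like this (it assumes the Sampler uniform and colorVarying varying named in your snippets; if your particles should also fade out over time, multiply the texture alpha by colorVarying.a as noted in the comment):
// Minimal sketch of the point-sprite fragment shader (GLSL ES 2.0).
// Assumes the texture uniform is called Sampler and the per-particle
// color arrives in the varying colorVarying, as in the snippets above.
precision mediump float;
uniform sampler2D Sampler;
varying vec4 colorVarying;
void main()
{
    // Take RGB from the particle color and use the texture only as an alpha mask.
    float mask = texture2D(Sampler, gl_PointCoord).a;
    gl_FragColor = vec4(colorVarying.rgb, mask);
    // If particles should also fade out, use: colorVarying.a * mask
}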

Related

Shadows in Metal, Swift

I’m creating 2d animation using Metal and LiquidFun. I want to simulate petrol. I want my animation to be yellow with gray shadows, similar to this:
Here is my current animation, it's totally yellow without any gray shadows, so it doesn't look realistic:
My fragment shader is very simple for now; I only return a yellow color from it:
fragment half4 fragment_shader(VertexOut in [[stage_in]],
                               float2 pointCoord [[point_coord]]) {
    float4 out_color = float4(0.7, 0.5, 0.1, 0.07);
    return half4(out_color);
}
I’ve checked various tutorials about adding shadows to an MTKView, but they all suggest things that don’t work for me. The first thing that doesn’t work is creating individual vertices and setting a color for each of them: in my code I don’t have fixed vertices, I have a particle system whose positions I pass to the vertex buffer:
particleCount = Int(LiquidFun.particleCount(forSystem: particleSystem))
let positions = LiquidFun.particlePositions(forSystem: particleSystem)
let bufferSize = MemoryLayout<Float>.size * particleCount * 2
vertexBuffer = device.makeBuffer(bytes: positions!, length: bufferSize, options: [])
Another thing I’ve tried is setting ambient, diffuse and specular colors, but it also didn’t work because my animation is 2D, not 3D.
My code inside the fragment shader was close to this:
if (in.position.y < 1500.0) {
    out_color = float4(0.7, 0.5, 0.1, 0.07);
} else if (in.position.y > 1500.0) {
    out_color = float4(0.6, 0.5, 0.1, 0.07);
}
But it also didn’t work as expected: the color transitions were not smooth, so it didn’t look like shadows. On top of that, my animation grows over time, so tying colors to fixed positions was not a good idea.
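(For comparison, a smoother position-based tint would interpolate between two colors with mix instead of branching. This is just a sketch of the idea; the 0.0–3000.0 range below is an arbitrary placeholder and would have to match the scene's coordinates.)
// Sketch: smooth position-based tint instead of a hard threshold.
// The y range (0.0 ... 3000.0) is a placeholder, not a value from my scene.
float t = saturate(in.position.y / 3000.0);
float4 bright = float4(0.7, 0.5, 0.1, 0.07);
float4 dark = float4(0.6, 0.5, 0.1, 0.07);
float4 out_color = mix(dark, bright, t);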
Could you please suggest something? I feel like I’m missing something very important.
Any help is appreciated!

SceneKit shader modifier is not modifying the geometry's position

I am trying to apply a simple shader modifier to move the position of the cube. The regular diffuse color of the cube is a light blue, but with this modifier it does turn red, so I know the shader modifier is working (somewhat). However, the cube remains in the center of the screen at position (0, 0, 0), so its position is not being modified by the shader modifier. Any ideas?
Here is the code:
let modifier = """
_surface.diffuse = float4(1,0,0,1);
_surface.position = float3(10.0,0.0,0.0);
"""
cube.geometry?.shaderModifiers = [SCNShaderModifierEntryPoint.surface : modifier]
The magenta tint is SceneKit's way to indicate that a shader has failed to compile.
_surface.position = float3(10.0,0.0,0.0);
Looking at SCNShadable.h we see that SCNShaderSurface's position is a float4, not a float3.
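Assuming that type mismatch is the only compile error, the corrected modifier would look something like this (the 1.0 in the w component is a guess, and I have not verified that this alone makes the cube move):
let modifier = """
_surface.diffuse = float4(1.0, 0.0, 0.0, 1.0);
_surface.position = float4(10.0, 0.0, 0.0, 1.0);
"""
cube.geometry?.shaderModifiers = [SCNShaderModifierEntryPoint.surface: modifier]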

How can I get normal shading on a SKSpriteNode with a custom shader?

I've been doing some work in SpriteKit, and I can't seem to get custom shaders and the pseudo 3D lighting effects from a normal texture to work at the same time.
I have a pair of PNG textures, representing a shape with its basic coloring, and a normal map of the same image. If I create an SKSpriteNode, using those textures and add a light to the scene, I see the bumpiness and beveled edges I expect.
cactus = SKSpriteNode(imageNamed: "Saguaro.png")
cactus.normalTexture = SKTexture(imageNamed: "Saguaro_n")
cactus.position = sceneCenter
cactus.lightingBitMask = 1
light = SKLightNode()
light.position = CGPoint.zero
light.lightColor = UIColor.white
light.isEnabled = true
light.categoryBitMask = 1
light.ambientColor = UIColor.white
light.falloff = 0.3
If, however, I add a custom shader, I get only the flat colors of the texture image. (Code below.)
// Assign a shader to the SpriteNode
// Shader loaded from a file with code below
cactus.shader = myShader
// Shader code
void main(void) {
    gl_FragColor = SKDefaultShading();
}
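(For reference, the shader itself is created from a bundled fragment shader file, roughly like this; the file name is just illustrative:)
let myShader = SKShader(fileNamed: "CactusShader.fsh")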
Is there something I can do in the shader to use the built-in lighting effects from the normal map? I'm not too familiar with writing custom fragment shaders, so perhaps there's something obvious I'm not doing.

Fragment shader output interferes with conditional statement

Context: I'm doing all of the following using OpenGL ES 2 on iOS 11.
While implementing different blend modes to blend two textures together, I came across a weird issue that I managed to reduce to the following:
I'm trying to blend the following two textures together, using only the fragment shader and not the OpenGL blend functions or equations. GL_BLEND is disabled.
Bottom - dst:
Top - src:
(The bottom image is the same as the top image but rotated and blended onto an opaque white background using "normal" (as in Photoshop 'normal') blending)
In order to do the blending I use the
#extension GL_EXT_shader_framebuffer_fetch
extension, so that in my fragment shader I can write:
void main()
{
    highp vec4 dstColor = gl_LastFragData[0];
    highp vec4 srcColor = texture2D(textureUnit, textureCoordinateInterpolated);
    gl_FragColor = blend(srcColor, dstColor);
}
The blend function doesn't perform any blending itself; it only chooses the correct blend function to apply, based on a uniform blendMode integer. In this case the first texture is drawn with an already-tested normal blending function, and then the second texture is drawn on top of it with the blendTest function shown below.
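(For context, blend is just a dispatcher along these lines; blendNormal is a stand-in name for the normal blend function mentioned above, and the blendMode values are arbitrary:)
uniform int blendMode;

highp vec4 blend(highp vec4 srcColor, highp vec4 dstColor) {
    // Pick the blend function based on the blendMode uniform; only two cases shown.
    if (blendMode == 0) {
        return blendNormal(srcColor, dstColor);
    }
    return blendTest(srcColor, dstColor);
}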
Now here's where the problem comes in:
highp vec4 blendTest(highp vec4 srcColor, highp vec4 dstColor) {
    highp float threshold = 0.7; // arbitrary
    highp float g = 0.0;
    if (dstColor.r > threshold && srcColor.r > threshold) {
        g = 1.0;
    }
    //return vec4(0.6, g, 0.0, 1.0); // no yellow lines (Case 1)
    return vec4(0.8, g, 0.0, 1.0); // shows yellow lines (Case 2)
}
This is the output I would expect (made in Photoshop):
So red everywhere and green/yellow in the areas where both textures contain an amount of red that is larger than the arbitrary threshold.
However, the results I get are for some reason dependent on the output value I choose for the red component (0.6 or 0.8), and neither of these outputs matches the expected one.
Here's what I see (The grey border is just the background):
Case 1:
Case 2:
So to summarize: if I return a red value that is larger than the threshold, e.g.
return vec4(0.8, g, 0.0, 1.0);
I see vertical yellow lines, whereas if the red component is less than the threshold there is no yellow/green in the result whatsoever.
Question:
Why does the output of my fragment shader determine whether or not the conditional statement is executed and even then, why do I end up with green vertical lines instead of green boxes (which indicates that the dstColor is not being read properly)?
Does it have to do with the extension that I'm using?
I also want to point out that the textures are both being loaded and bound properly: I can see them just fine if I simply return the individual texture colors without blending, and even with a normal blending function that I've implemented, everything works as expected.
I found out what the problem was (and I realize that it's not something anyone could have known from just reading the question):
There is an additional fully transparent texture being drawn between the two textures you can see above, which I had forgotten about.
Instead of accounting for that and simply returning the dstColor when the srcColor alpha is 0, the transparent texture's color information (which is (0.0, 0.0, 0.0, 0.0)) was being used in the blend, thereby altering the framebuffer content.
Both the transparent texture and the final texture were drawn with the blendTest function, so the output of the first function call was then being read in when blending the final texture.
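In other words, a guard like the following in the blend function would have avoided the issue (a sketch of the fix described above, based on the blendTest function from the question):
highp vec4 blendTest(highp vec4 srcColor, highp vec4 dstColor) {
    // A fully transparent source fragment should leave the framebuffer untouched.
    if (srcColor.a == 0.0) {
        return dstColor;
    }
    highp float threshold = 0.7; // arbitrary
    highp float g = 0.0;
    if (dstColor.r > threshold && srcColor.r > threshold) {
        g = 1.0;
    }
    return vec4(0.8, g, 0.0, 1.0);
}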

How to use SCNNode filters with ARKit?

I am trying to implement a custom CIFilter to use with an SCNNode in my ARSCNView. Unfortunately it just creates a gray rectangle where the node should be on the screen. I have also tried built-in CIFilters to double-check my code, to no avail.
On some other SO post I read that CIFilter only works when OpenGL is selected as the renderingAPI for the SCNView, because Core Image doesn't play well with Metal, and as far as I can tell it is impossible to get ARSCNView to run with OpenGL. That post is from 2016, so I am wondering if anything has changed.
What I am trying to do is outline/highlight the object on the screen to give the user feedback about object selection. I have achieved something usable by adding a shader modifier, but it gives limited control over shading, and I really don't want to take over all of the shading myself.
Below is my CIKernel for outlining, which works very well in Quartz Composer.
Any help and information is highly appreciated.
kernel vec4 outline(sampler src) {
    vec2 texturePos = destCoord();
    float alpha = 4.0f * sample(src, texturePos).a;
    float thickness = 5.0f;
    alpha -= sample(src, texturePos + vec2(thickness, 0.0f)).a;
    alpha -= sample(src, texturePos + vec2(-thickness, 0.0f)).a;
    alpha -= sample(src, texturePos + vec2(0.0f, thickness)).a;
    alpha -= sample(src, texturePos + vec2(0.0f, -thickness)).a;
    if (alpha > 0.9f) {
        vec4 resultCol = vec4(1.0f, 1.0f, 1.0f, alpha);
        return resultCol;
    } else {
        vec4 resultCol = sample(src, texturePos);
        return resultCol;
    }
}
I also faced a similar problem. In my case the cause was the following settings; the CIFilter worked once I removed them.
I haven't analyzed the details, but I hope this helps!
sceneView.antialiasingMode = .multisampling4X
sceneView.contentScaleFactor = 1.3
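For reference, a minimal sketch of the intended setup once those two lines are removed (OutlineFilter here is a hypothetical CIFilter subclass wrapping the outline kernel above, and node is the SCNNode being highlighted):
// Sketch only: attach the custom filter to the node and leave the view's
// antialiasingMode / contentScaleFactor at their defaults.
// "OutlineFilter" is a hypothetical CIFilter subclass built around the kernel above.
let outlineFilter = OutlineFilter()
outlineFilter.name = "outline"
node.filters = [outlineFilter]

// Not set (per the answer above):
// sceneView.antialiasingMode = .multisampling4X
// sceneView.contentScaleFactor = 1.3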
