How to get a normal map to work in a pixel shader using DirectX - directx

I have the tangents, the bitangents, etc., and I think I have the math correct up until the final steps in the pixel shader.
I have been looking at tutorials but can't seem to get it to work in my own programme.
My normal map code looks like this so far:
float3 normalMap = nTex.Sample(mySampler, input.UV).rgb;
normalMap = (normalMap * 2) - 1.0f; // remap [0,1] -> [-1,1]
// Gram-Schmidt: re-orthogonalize the interpolated tangent against the normal
input.tangents = normalize(input.tangents - dot(input.tangents, input.normal) * input.normal);
float3 biTang = cross(input.normal, input.tangents);
// rows are T, B, N, so mul(vector, matrix) maps tangent space to world space
float3x3 texSpace = float3x3(input.tangents, biTang, input.normal);
//float3 normalWW = float3(normalMap.r, normalMap.g, normalMap.b);
input.normal = normalize(mul(normalMap, texSpace));
float3 plDir = float3(pointLightDir.rgb);
//float final = saturate(dot(-plDir, input.normal));
//float4 finalW = saturate(float4(combined * final));
output.Color = color * combined;
output.Color += saturate((dot(-plDir, input.normal) * combined) * color);
output.Color.a = 1;
I haven't named things in a good way yet.
"combined" is my phong shading, I have diffuse, specular and ambient added together. "color" is just the sampled diffuse texture.
The result is no different than if I only had the first texture. if I only use the normal map it's displayed so that works.
I'm in directx in c++.
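For comparison, a common way to structure the final lighting is to compute a single N·L factor from the perturbed normal and apply it only to the diffuse term, rather than adding a second lit term on top of combined. A minimal sketch only, where lightColor and ambient are hypothetical placeholders that are not part of the code above:
float3 N = normalize(mul(normalMap, texSpace));   // perturbed world-space normal
float NdotL = saturate(dot(N, -plDir));           // one diffuse factor from the mapped normal
// lightColor and ambient are placeholder values for this sketch
float3 lit = color.rgb * (ambient + lightColor * NdotL);
output.Color = float4(saturate(lit), 1.0f);
If combined already contains a diffuse term computed from the interpolated vertex normal, adding a second diffuse term on top of it can drown out the normal-mapped detail, which would match the "no different than the first texture" symptom.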

Related

Directx 9 Normal Mapping Pixelshader

I have a question about normal mapping in a DirectX 9 shader.
Currently my terrain shader output for normal map + diffuse color only results in this image.
Which looks good to me.
If I use an empty normal map image like this one,
my shader output for normal, diffuse, and color map looks like this.
But if I use one that includes a ColorMap, I get a really strange result.
Does anyone have an idea what could cause this issue?
Here are some snippets.
float4 PS_TERRAIN(VSTERRAIN_OUTPUT In) : COLOR0
{
    float4 fDiffuseColor;
    float lightIntensity;

    // Remap the sampled normal from [0,1] to [-1,1]
    float3 bumpMap = 2.0f * tex2D( Samp_Bump, In.Tex.xy ).xyz - 1.0f;
    // Transform the tangent-space normal into world space via the TBN basis
    float3 bumpNormal = (bumpMap.x * In.Tangent) + (bumpMap.y * In.Bitangent) + (bumpMap.z * In.Normal);
    bumpNormal = normalize(bumpNormal);

    // Directional light test (hardcoded)
    float3 lightDirection = float3(0.0f, -0.5f, -0.2f);
    float3 lightDir = -lightDirection;

    // Bump
    lightIntensity = saturate(dot(bumpNormal, lightDir));

    // We are using a lightmap to do our alpha calculation for the given pixel
    float4 LightMaptest = tex2D( Samp_Lightmap, In.Tex.zw ) * 2.0f;
    fDiffuseColor.a = LightMaptest.a;
    if( !bAlpha )
        fDiffuseColor.a = 1.0;

    // Sample the pixel color from the texture using the sampler at this texture coordinate location
    float4 textureColor = tex2D( Samp_Diffuse, In.Tex.xy );

    // Combine the lightmap value into the texture color
    textureColor = saturate(textureColor * LightMaptest);
    textureColor.a = LightMaptest.a;

    fDiffuseColor.rgb = saturate(lightIntensity * I_d).rgb;
    fDiffuseColor = fDiffuseColor * textureColor; // If I enable this line it goes crazy
    return fDiffuseColor;
}
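As a point of comparison, one conventional way to assemble the final color is to build each factor exactly once and multiply them at the end. This is a sketch only, reusing lightIntensity, I_d, and bAlpha from the snippet above:
float4 albedo = tex2D(Samp_Diffuse, In.Tex.xy);            // base color
float4 lightmap = tex2D(Samp_Lightmap, In.Tex.zw) * 2.0f;  // baked lighting / alpha
float3 lit = saturate(lightIntensity * I_d.rgb);           // dynamic bump lighting

float4 result;
result.rgb = saturate(albedo.rgb * lightmap.rgb * lit);    // each factor applied once
result.a = bAlpha ? lightmap.a : 1.0f;
return result;
Outputting each intermediate alone (albedo, lightmap, lit) and comparing against the combined result can help isolate which factor blows up when the color map is enabled.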

Normal mapping on a large sphere is not entirely correct

So I've been working on a DirectX 11/HLSL rendering engine with the goal of creating a realistic planet that you can view both from the surface and at a planetary level. The planet is a normalized cube, which is procedurally generated using noise, and as you move closer to the surface, a binary triangle tree splits until the desired detail level is reached. I got vertex normal calculations working correctly, and I recently started implementing normal mapping for my terrain textures; I have something that seems to work for the most part. However, when the sun is pointing almost perpendicular to the ground (90 degrees), the terrain is far too brightly lit.
From the opposite angle (270 degrees), I get a result that seems closer, but it may be just as far off.
The debug lines being rendered are the normal, tangent, and bitangents (which all appear to be correct and fit the topology of the terrain).
Here is my shader code:
Vertex shader:
PSIn mainvs(VSIn input)
{
    PSIn output;
    // Pass the pixel's world position (not the screen-space position) for lighting calculations
    output.WorldPos = mul(float4(input.Position, 1.f), Instances[input.InstanceID].WorldMatrix);
    output.Position = mul(output.WorldPos, CameraViewProjectionMatrix);
    output.TexCoord = input.TexCoord;
    output.CameraPos = CameraPosition;
    output.Normal = normalize(mul(input.Normal, (float3x3)Instances[input.InstanceID].WorldMatrix));
    float3 Tangent = normalize(mul(input.Tangent, (float3x3)Instances[input.InstanceID].WorldMatrix));
    float3 Bitangent = normalize(cross(output.Normal, Tangent));
    // Rows are T, B, N; the transpose lets the pixel shader use mul(TBN, vector)
    output.TBN = transpose(float3x3(Tangent, Bitangent, output.Normal));
    return output;
}
Pixel shader (Texcoord scalar is for smaller textures closer to planet surface):
float3 FetchNormalVector(float2 TexCoord)
{
    // Sample and remap from [0,1] to [-1,1]
    float3 Color = NormalTex.Sample(Samp, TexCoord * TexcoordScalar).xyz;
    Color *= 2.f;
    return normalize(float3(Color.x - 1.f, Color.y - 1.f, Color.z - 1.f));
}

float3 LightVector = -SunDirection;
float3 TexNormal = FetchNormalVector(input.TexCoord);
float3 WorldNormal = normalize(mul(input.TBN, TexNormal));
float nDotL = max(0.0, dot(WorldNormal, LightVector));
float4 SampleColor = float4(1.f, 1.f, 1.f, 1.f);
SampleColor *= nDotL;
return float4(SampleColor.xyz, 1.f);
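A note on the mul convention used here, for anyone else reading: because the TBN matrix is built row-wise from T, B, and N and then transposed, mul(input.TBN, TexNormal) treats TexNormal as a column vector, which gives the same result as the untransposed row-vector form. Both variants map tangent space to world space:
float3x3 TBN = float3x3(Tangent, Bitangent, Normal);  // rows are T, B, N
float3 a = mul(TexNormal, TBN);                       // row vector * matrix
float3 b = mul(transpose(TBN), TexNormal);            // matrix * column vector, same result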
Thanks in advance, and let me know if you have any insight as to what could be the issue here.
Edit 1: I tried it with a fixed blue value instead of sampling from the normal texture, which gives me the correct and same results as if I had not applied mapping (as expected). Still don't have a lead on what would be causing this issue.
Edit 2: I just noticed the strangest thing. At 0, 0, +Z, there are hard seams that only appear with normal mapping enabled.
It's a little hard to see, but it looks almost as if there are multiple tangents associated with the same vertex (since I'm not using indexing yet), because the debug lines appear to split at the seams.
Here is the code I'm using to generate the tangents (bitangents are calculated in the vertex shader using cross(Normal, Tangent)):
// Positions and UVs of the triangle's three vertices
v3& p0 = Chunk.Vertices[0].Position;
v3& p1 = Chunk.Vertices[1].Position;
v3& p2 = Chunk.Vertices[2].Position;
v2& uv0 = Chunk.Vertices[0].UV;
v2& uv1 = Chunk.Vertices[1].UV;
v2& uv2 = Chunk.Vertices[2].UV;
// Edge deltas in position and UV space
v3 deltaPos1 = p1 - p0;
v3 deltaPos2 = p2 - p0;
v2 deltaUV1 = uv1 - uv0;
v2 deltaUV2 = uv2 - uv0;
// Solve for the tangent using the reciprocal of the UV-delta determinant
f32 r = 1.f / (deltaUV1.x * deltaUV2.y - deltaUV1.y * deltaUV2.x);
v3 Tangent = (deltaPos1 * deltaUV2.y - deltaPos2 * deltaUV1.y) * r;
// Gram-Schmidt: make each vertex tangent orthogonal to its own normal
Chunk.Vertices[0].Tangent = Normalize(Tangent - (Chunk.Vertices[0].Normal * DotProduct(Chunk.Vertices[0].Normal, Tangent)));
Chunk.Vertices[1].Tangent = Normalize(Tangent - (Chunk.Vertices[1].Normal * DotProduct(Chunk.Vertices[1].Normal, Tangent)));
Chunk.Vertices[2].Tangent = Normalize(Tangent - (Chunk.Vertices[2].Normal * DotProduct(Chunk.Vertices[2].Normal, Tangent)));
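One guess about the seams: when UVs are mirrored across a face boundary, the tangent computed from the UV deltas flips handedness, and cross(Normal, Tangent) then points the wrong way on one side of the seam. A common remedy is to store a per-vertex handedness sign and apply it when building the bitangent. A minimal HLSL sketch, assuming a hypothetical float4 tangent whose w component carries that sign (not present in the code above; World stands in for Instances[input.InstanceID].WorldMatrix):
// Hypothetical float4 input.Tangent with the handedness sign stored in .w
float3 N = normalize(mul(input.Normal, (float3x3)World));
float3 T = normalize(mul(input.Tangent.xyz, (float3x3)World));
float3 B = cross(N, T) * input.Tangent.w;  // flip the bitangent on mirrored UVs
The sign itself is typically computed on the CPU from the sign of the UV-delta determinant, the same expression whose reciprocal is r above.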
Also for reference, this is the main article I was looking at while implementing all of this: link
Edit 3:
Here is an image of the planet from a distance with normal mapping enabled:
And one from the same angle without:

How to understand Exponential Shadow Mapping in HLSL with Directional Light?

I've tried to understand how ESM works. I have regular shadow mapping in place (occluded/not occluded) in a deferred rendering pipeline and am trying to use ESM instead.
I've tried to adapt this from Cansin:
http://homepage.lnu.se/staff/tblma/Deferred Rendering in XNA 4.pdf
But as he does not use directional lights, I may have a misunderstanding. This is basically my approach to adapting it to directional lights:
Create ShadowMap:
float4 PS(VSO input) : COLOR0
{
    // Store an exponential of the depth instead of the raw depth
    float depth = input.Position2D.z / input.Position2D.w;
    return exp(depth);
}
I am using an orthographic projection matrix (same near/far clip as the actual camera), as I do with regular shadow mapping. (Position2D is screen space; because it's a directional light, Z is always the distance between surface and light, or am I wrong?)
Get Shadow Factor - basically like regular Shadow Mapping, I transform into Light/Screenspace, getting the depth from the ShadowMap
float4 Position = 1;
Position.xy = input.ScreenPosition.xy;
Position.z = Depth; // depth saved in the G-buffer
// Reconstruct the world position, then project it into light space
Position = mul(Position, InverseViewProjection);
Position /= Position.w;
float4 LightScreenPos = mul(Position, LightViewProjection);
LightScreenPos /= LightScreenPos.w;
// Map from NDC [-1,1] to texture UV [0,1] (Y flipped)
float2 LUV = 0.5f * (float2(LightScreenPos.x, -LightScreenPos.y) + 1.0f);
// ESM factor: stored exp(occluder depth) times exp(-c * receiver depth)
float shadowDepth = tex2D(sampler_shadow, LUV).r;
float shadow = shadowDepth * exp(-10 * LightScreenPos.z);
Is my thinking fundamentally wrong?
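One concrete thing to check against the ESM formula shadow = saturate(exp(c * z_occluder) * exp(-c * z_receiver)): the shadow map pass above stores exp(depth), i.e. c = 1, while the lookup multiplies by exp(-10 * LightScreenPos.z), i.e. c = 10. The constant has to match between the two passes. A minimal sketch with a shared constant (ESM_C is a placeholder name):
static const float ESM_C = 10.0f; // must be identical in both passes

// Shadow map pass: store exp(c * depth)
float4 PS(VSO input) : COLOR0
{
    float depth = input.Position2D.z / input.Position2D.w;
    return exp(ESM_C * depth);
}

// Lookup pass: exp(c * occluder) * exp(-c * receiver), clamped to [0,1]
float occluder = tex2D(sampler_shadow, LUV).r;
float shadow = saturate(occluder * exp(-ESM_C * LightScreenPos.z));
Note that with c = 10 the stored values grow far beyond [0,1] (up to roughly e^10), so the shadow map needs a floating-point render target.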

Calculating view space normal instead of world space

I hope you can help with this. I am currently using a deferred renderer and am trying to implement SSAO, but that needs view-space normals and I currently have world-space normals. I am trying to work out how to convert them but keep getting stuck.
This is the main part of my vertex shaders. instanceTransform is either the World matrix or the transpose of the instance matrix, as the same code is used for instanced models as well.
VertexShaderOutput VertexShaderFunctionCommon(VertexShaderInput input, float4x4 instanceTransform)
{
    VertexShaderOutput output = (VertexShaderOutput)0;
    float4x4 worldViewProjection = mul(instanceTransform, ViewProjection);
    output.Position = mul(float4(input.Position.xyz, 1), worldViewProjection);
    output.Depth = output.Position.zw;

    // Calculate the tangent-space-to-world-space matrix using the world-space
    // tangent, binormal, and normal as basis vectors
    output.TangentToWorld[0] = normalize(mul(input.Tangent, instanceTransform));
    output.TangentToWorld[1] = normalize(mul(input.Binormal, instanceTransform));
    output.TangentToWorld[2] = normalize(mul(input.Normal, instanceTransform));

    return output;
}
The normal calculation in the pixel shader is:
// read the normal from the normal map
float3 normalFromMap = tex2D(normalSampler, input.TexCoord);
//tranform to [-1,1]
normalFromMap = 2.0f * normalFromMap - 1.0f;
//transform into world space
normalFromMap = mul(normalFromMap, input.TangentToWorld);
//normalize the result
normalFromMap = normalize(normalFromMap);
//output the normal, in [0,1] space
output.Normal.rgb = NormalEncode(normalFromMap);
Can you help please?
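Since a View matrix is already needed to build ViewProjection, one straightforward option is to rotate the decoded world-space normal into view space just before encoding it. A sketch, assuming a View matrix constant is bound to this effect and that it contains no scaling, so its 3x3 rotation part is sufficient:
// Assumed: 'View' (world -> view) is available as an effect parameter
float3 viewNormal = normalize(mul(normalFromMap, (float3x3)View));
output.Normal.rgb = NormalEncode(viewNormal);
Alternatively, the three TangentToWorld rows can be multiplied by (float3x3)View in the vertex shader, yielding a tangent-to-view matrix so the pixel shader needs no extra transform.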

Applying a DirectX shader to a rotated texture in XNA

I had a dynamic light shader for which the shaded sprite was fine in my own test program, but it started resembling an eclipse once I imported it into my friend's physics-based game. I narrowed it down by simplifying the gradient to be purely based on the X value within the shape and making the outside of the circle in the sprite red, but as you can see, the rotation continues to cause problems (I can't post images, so here are links to the album).
Circle at different rotations(not in order, but labelled by radian values): http://imgur.com/a/Preth
Everything I researched about matrix math says I am using the correct formula for rotation, but I figure maybe I'm doing something wrong. Here is my .fx shader code:
float rotationrads; /* assumed rotation is in radians */
sampler TextureSampler : register(s0);

float4 staticlight(float2 Tex : TEXCOORD0) : COLOR0
{
    float4 Color = tex2D(TextureSampler, Tex);
    float2 NewTex;
    /* Get the new X and Y values by applying the UV formula with the rotation */
    NewTex.x = (Tex.x * cos(rotationrads)) - (Tex.y * sin(rotationrads));
    NewTex.y = (Tex.y * sin(rotationrads)) + (Tex.y * cos(rotationrads));
    if (Color.a > 0.0)
    {
        Color.r = (Color.r * NewTex.x);
        Color.g = (Color.g * NewTex.x);
        Color.b = (Color.b * NewTex.x);
    }
    else
    {
        Color.r = 100;
        Color.g = 0;
        Color.b = 0;
        Color.a = 100;
    }
    return Color;
}

technique StaticLightOnly
{
    pass Pass1
    {
        PixelShader = compile ps_2_0 staticlight();
    }
}
If anyone has experience with sprite-based rotation in 2D shaders, I'd appreciate any help with this! Thanks in advance!
Because rotations are performed about the origin, you have to move the rotation center (0.5, 0.5) to the origin, execute the rotation and then undo the translation.
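A sketch of that fix inside staticlight (note also that the NewTex.y line above uses Tex.y in its sine term, where the standard 2D rotation x*sin + y*cos uses Tex.x):
float2 centered = Tex - float2(0.5f, 0.5f);  // move the rotation center to the origin
float s = sin(rotationrads);
float c = cos(rotationrads);
float2 NewTex;
NewTex.x = centered.x * c - centered.y * s;  // standard 2D rotation
NewTex.y = centered.x * s + centered.y * c;
NewTex += float2(0.5f, 0.5f);                // undo the translation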
