I have some code in a .fx file used with FX Composer, and I came across a line like this:
float4x4 WorldITXf : WorldInverseTranspose < string UIWidget="None"; >
What is the meaning of the part contained between < and >?
< string UIWidget="None"; > is an annotation. Annotations are not used by the HLSL compiler or by Direct3D at all; they are metadata read by tools like FX Composer to determine how to provide interactive controls for the shader's parameters.
It's an aspect of the legacy Direct3D 9-era Effects system, intended to make it easier for Digital Content Creation (DCC) tools like FX Composer, RenderMonkey, Autodesk 3ds Max, etc. to provide a more art-driven UI for controlling shader behavior.
More modern approaches to art-driven shader authoring have moved away from having a programmer mark up a shader with 'tweaks', and are more often built around node graphs, such as Visual Studio's Shader Designer, although VS's solution here is not nearly as robust as, say, the visual material editor in Unreal Engine.
See MSDN
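To illustrate what such annotations typically look like in practice (the parameter names and values here are made up for the example, not taken from your file), a tool-aware .fx parameter might carry several of them:

```hlsl
// Annotations sit between < and > after the semantic. The HLSL
// compiler ignores them; tools like FX Composer read them to build UI.
float3 LightColor : Diffuse
<
    string UIName   = "Light Color";   // label shown in the tool
    string UIWidget = "Color";         // ask the tool for a color picker
> = {1.0f, 1.0f, 1.0f};

float Glossiness : SpecularPower
<
    string UIWidget = "Slider";        // show as a slider
    float  UIMin    = 1.0f;
    float  UIMax    = 128.0f;
    float  UIStep   = 1.0f;
> = 32.0f;
```

UIWidget="None", as in your snippet, simply tells the tool not to expose the parameter in the UI at all, which makes sense for something like an inverse-transpose world matrix that the host application supplies automatically.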
I am familiar with C and C++.
However, I am quite a newbie with WebGL and GLSL.
This might be a very basic or silly question... I'd appreciate any hints.
I found this source code, which produces great CG graphics.
I want to compile this myself, but honestly I am not sure what it is, or which compiler I can use to build it and make it work.
#define R rotate2D
for(float i,e,g,s;i++<1e2;g+=e*.2)
{
vec3 n,p=vec3((FC.xy-.5*r)/r.y*g,g-4.);
p.zy*=R(.4);
p.z+=t;
e=p.y;
for(s=1.;s<6e3;s+=s)p.xz*=R(s),n.xz*=R(s),n+=cos(p*s),e+=dot(sin(p.xz*s),r/r/s/3.);
n.y=1.;
n=normalize(n);
e-=snoise2D(p.xz*40.)*n.y*n.y*.4;
o-=exp(-e*9.-5.);
}
o++;
So at first, I started to learn WebGL, and I finished learning the basics.
However, I still can't make this code work with WebGL.
Next, I started to learn GLSL in Unity, but that didn't hit the spot either.
What should I learn in order to compile this?
This looks like a piece of a GLSL fragment shader, and you would compile it with gl.compileShader().
But it's incomplete. First of all, to be compilable it needs a main function that sets gl_FragColor. I also see references to snoise2D and rotate2D, which should be functions but are missing, and at least a few undefined variables like FC and r. Without those pieces, this won't compile.
If you can fill in those pieces, you should be able to plug it into something like Shadertoy without having to build a whole WebGL application from scratch. But if this shader is meant to interact with specific geometry, that might not work at all.
It's hard to advise further without more information on what this is, how it's intended to be used, and what all the single-letter variable names are supposed to mean.
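To give a sense of what "filling in those pieces" means, a wrapper along these lines would be needed. Everything here is a guess based on common shader-golf conventions (FC as the fragment coordinate, r as the resolution, t as time); the rotate2D and snoise2D bodies are placeholders, not the originals:

```glsl
precision highp float;

uniform vec2  r;   // viewport resolution, presumably
uniform float t;   // elapsed time, presumably
#define FC gl_FragCoord   // FC likely means the fragment coordinate

// A standard 2D rotation matrix; likely what rotate2D refers to.
mat2 rotate2D(float a) { return mat2(cos(a), -sin(a), sin(a), cos(a)); }

// snoise2D should be a 2D simplex-noise function; a real
// implementation (e.g. from the webgl-noise library) must go here.
float snoise2D(vec2 p) { return 0.0; } // placeholder only

void main() {
    vec4 o = vec4(0.0);
    // ... the quoted loop goes here, accumulating into o ...
    gl_FragColor = o;
}
```

With declarations like these supplied, the snippet becomes an ordinary fragment shader that gl.compileShader() (or Shadertoy, with its own uniform names) can accept.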
I have previously worked with glTF 1.0 and am now trying to update my application to render the glTF 2.0 sample models provided by Khronos. I understand that shaders (GLSL) and techniques are no longer part of the core properties in glTF 2.0.
So my questions are:
Is shader information now separated from the .gltf file? I know there is the KHR_technique_webgl extension, which consists of technique and shader properties (exactly like how glTF 1.0 represented shaders). Are we supposed to use that if our materials aren't PBR?
How do rendering engines now grab shader information from a plain .gltf (without the extension)? Do we do it the old-school way, i.e. load our own shaders and manually map the model's attributes to shader attributes?
The KHR_technique_webgl extension will eventually be finished and will provide a way to include custom shaders with your glTF 2.0 model. But as of this writing, the extension is not fully defined, so tools cannot implement it yet.
The more general case (and recommended if it suits your needs) would be to use PBR or Blinn-Phong materials. These are declared abstractly in glTF, so that rendering engines can build their own shaders for these material types, and will generally integrate better with engines' own lighting and/or shadows.
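For reference, a glTF 2.0 PBR material is just declarative JSON; no shader code is shipped, and the engine decides how to shade it. A minimal sketch (names and factor values are illustrative):

```json
{
  "materials": [
    {
      "name": "RedMetal",
      "pbrMetallicRoughness": {
        "baseColorFactor": [0.8, 0.1, 0.1, 1.0],
        "metallicFactor": 1.0,
        "roughnessFactor": 0.3
      }
    }
  ]
}
```

A loader reads these factors (and optional textures) and plugs them into its own lighting shader, which is exactly why such materials integrate better with an engine's lighting and shadows than baked-in GLSL did.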
In 11.1 and later, Microsoft removed a lot of the helpers for loading textures (FromFile, FromStream, etc.).
I'm trying to port my code over to 11.2, and everything works fine except for this line:
var texture = Texture2D.FromFile<Texture2D>(device, @"C:\Texture.png");
Now, all I could find was guidance telling me to use WIC instead, but I can't seem to find anything that provides a Texture2D nearly as versatile (all the samples I found require passing in the pixel format, among other things).
I'm looking for a solution that lets me load files (without knowing their format or layout beforehand) and get a Texture2D out of it, just like FromFile allowed. Is there anything like that, or even close? I'm assuming there has to be "something", as they wouldn't have deprecated such a feature unless it had become superfluous.
Any help is most appreciated.
Edit: I'd like to avoid using SharpDX.Toolkit; I'm looking for a raw DirectX/WIC solution, as I don't want to add a dependency on the Toolkit. I'm perfectly fine with adding any .NET Framework 4.0 or 4.5 assembly as a dependency, however.
There is no easy solution apart from writing the WIC interop yourself. Even if you don't want to use the Toolkit, its source code is available, and the class WicHelper.cs, which is responsible for decoding images, is fairly easy to adapt.
If you want a very simple solution that doesn't handle all WIC cases (format mappings...etc.), you can have a look at TextureLoader.cs.
Provided you have both a Texture2D and a WPF BitmapImage lying around, something like this will help:
// assumes the texture was created with Usage.Dynamic and CpuAccessFlags.Write
var map = device.ImmediateContext.MapSubresource(texture, 0, MapMode.WriteDiscard, MapFlags.None);
bitmap.CopyPixels(Int32Rect.Empty, map.DataPointer, bitmap.PixelWidth * bitmap.PixelHeight * 4, bitmap.PixelWidth * 4);
device.ImmediateContext.UnmapSubresource(texture, 0);
The function you mention probably did something fairly similar under the hood.
If you're not sure the bitmap is in a Bgra format, you have to convert it first.
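For completeness, here's a rough sketch of the pure WIC route with SharpDX, in the spirit of the TextureLoader.cs sample mentioned above. Error handling and mipmaps are omitted, and this assumes converting everything to 32-bit RGBA is acceptable for your use case:

```csharp
// Decode any WIC-supported file (PNG, JPG, BMP, ...), normalize its
// pixel format, and create an immutable Texture2D from the pixels.
// This is a sketch, not production code.
using (var factory = new SharpDX.WIC.ImagingFactory())
using (var decoder = new SharpDX.WIC.BitmapDecoder(factory, path, SharpDX.WIC.DecodeOptions.CacheOnDemand))
using (var frame = decoder.GetFrame(0))
using (var converter = new SharpDX.WIC.FormatConverter(factory))
{
    // Whatever the source format was, convert it to 32-bit RGBA.
    converter.Initialize(frame, SharpDX.WIC.PixelFormat.Format32bppPRGBA);

    int stride = converter.Size.Width * 4;
    using (var buffer = new SharpDX.DataStream(converter.Size.Height * stride, true, true))
    {
        converter.CopyPixels(stride, buffer);
        var texture = new Texture2D(device, new Texture2DDescription
        {
            Width = converter.Size.Width,
            Height = converter.Size.Height,
            ArraySize = 1,
            MipLevels = 1,
            Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
            SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
            Usage = ResourceUsage.Immutable,
            BindFlags = BindFlags.ShaderResource
        }, new SharpDX.DataRectangle(buffer.DataPointer, stride));
    }
}
```

The FormatConverter step is what buys you the "without knowing the format beforehand" behavior that FromFile used to provide.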
What is the Direct2D command analogous to OpenGL's SwapBuffers? I am using this in a VCL environment such as Delphi or C++Builder. Thanks.
d3ddev->Present(NULL, NULL, NULL, NULL);
There are a couple of ways you can do the equivalent in Direct2D. The simplest way is to create an ID2D1HwndRenderTarget. See http://msdn.microsoft.com/en-us/library/windows/desktop/dd371275(v=vs.85).aspx for details. You'll be interested in the D2D1_HWND_RENDER_TARGET_PROPERTIES parameter. This has a D2D1_PRESENT_OPTIONS field, which can be set to different values depending on the behavior you want. See http://msdn.microsoft.com/en-us/library/windows/desktop/dd368144(v=vs.85).aspx for details. With this in place, the rough equivalent of SwapBuffers is ID2D1RenderTarget::EndDraw.
The other option is using Direct3D interop. In this case you create a DXGI surface render target. (I'd post a link to the docs, but I don't have enough StackOverflow reputation to post more than two hyperlinks. Google "ID2D1Factory::CreateDxgiSurfaceRenderTarget" for the docs). This allows you to use Direct2D to issue 2D rendering commands to the surface, but then present using Direct3D/DXGI. This is more complicated but gives you more flexibility.
UPDATE 2: It now appears this is more of a modelling issue than a programming one. Whoops.
I'm new to XNA development, and despite my C# experience, I've been stuck at one spot for going on two days now.
The situation: I've created a model in 3D Studio Max 2010 which uses two materials, both are of type DirectX Shader. The model exports to FBX without error and Visual Studio compiles it properly. When I ran the Draw() method initially, it threw an exception on the 'BasicEffect' portion of one of my loops, demonstrating (at least to me) that it was loading the .fx file correctly, which must be embedded in the FBX file or something.
The problem:
When using the following code
foreach (ModelMesh mesh in map.Meshes)
{
    foreach (Effect effect in mesh.Effects)
    {
        effect.CurrentTechnique = effect.Techniques["DefaultTechnique"];
        effect.Begin();
        effect.Parameters["World"].SetValue(Matrix.CreateTranslation(Vector3.Zero));
        effect.Parameters["View"].SetValue(ActiveCamera.ViewMatrix);
        effect.Parameters["Projection"].SetValue(ActiveCamera.ProjectionMatrix);
        effect.Parameters["WorldViewProj"].SetValue(Matrix.Identity * ActiveCamera.ProjectionMatrix);
        effect.Parameters["WorldView"].SetValue(Matrix.Identity * ActiveCamera.ViewMatrix);
        foreach (EffectPass ep in effect.CurrentTechnique.Passes)
        {
            ep.Begin();
            // something goes here?
            ep.End();
        }
        effect.End();
    }
    mesh.Draw();
}
The only thing that happens is that a white box appears covering the bottom half of the screen, regardless of camera position or angle. I got the names of the effect parameters from the default.fx file specified in Max (it's at [program files]\autodesk\3ds Max 2010\maps\fx).
I get the feeling I'm setting one or all of these incorrectly. I've tried to look up tutorials and follow their code, but none of it seems to work for my model.
Any help or ideas?
UPDATE:
By making these changes:
effect.Parameters["WorldViewProj"].SetValue(Matrix.CreateTranslation(Vector3.Zero) * ActiveCamera.ViewMatrix * Conductor.ActiveCamera.ProjectionMatrix);
effect.Parameters["WorldView"].SetValue(Matrix.CreateTranslation(Vector3.Zero) * ActiveCamera.ViewMatrix);
The model was able to draw. However, everything is completely white :(
Unfortunately, especially without seeing your shader and/or knowing what error it is that you're getting, it's going to be pretty difficult to figure out what's wrong here. There are a number of things that could be going wrong.
My suggestion is to start with a simpler test. Make a box, apply a very simple shader ... and make that render. Then, add some parameter that (for example) multiplies the red component of the pixel shader by the amount passed in. And make that render successfully.
By simplifying the problem set, you'll figure out the nuances of the shaders that Max exports and how to set their properties. At some point, you'll realize what you're doing wrong and will be able to apply that to your more complex shader.
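As a starting point for that simpler test, a minimal .fx along these lines should be enough to get a colored box on screen (the parameter and technique names here are illustrative; DefaultTechnique matches what your draw loop already looks up):

```hlsl
// Minimal effect: transform vertices, output a tintable color.
float4x4 WorldViewProj;
float RedScale = 1.0f;   // the "one parameter" to experiment with

struct VS_OUT { float4 Pos : POSITION; };

VS_OUT VS(float4 pos : POSITION)
{
    VS_OUT o;
    o.Pos = mul(pos, WorldViewProj);
    return o;
}

float4 PS() : COLOR
{
    // Scale only the red channel so the parameter's effect is obvious.
    return float4(1.0f * RedScale, 0.5f, 0.5f, 1.0f);
}

technique DefaultTechnique
{
    pass P0
    {
        VertexShader = compile vs_2_0 VS();
        PixelShader  = compile ps_2_0 PS();
    }
}
```

Once that box renders and responds to RedScale, you know your matrix setup and parameter plumbing are sound, and any remaining whiteness is coming from the exported material itself (missing textures or lighting inputs are a common culprit).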
I'm quite interested in hearing how this goes ... make sure you comment on this once you've fixed it so I see the outcome. Good luck! :-)