I used glDepthRange(1.0, 0.0) in a Mac OS X program to give myself a right-handed coordinate system. Apparently I don't have that option with iOS using OpenGL ES 2.0. Is there a quick fix so that higher z-values show up in front, or do I have to rework all of my math?
You can try glDepthFunc. The default value is GL_LESS; if you use GL_GREATER, pixels with higher z values will be rendered:
glDepthFunc(GL_GREATER);
Alternatively, you can add this line to your vertex shader:
gl_Position.z = -gl_Position.z;
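One caveat with the GL_GREATER route: the depth buffer's clear value has to be reversed too, or nothing will pass the test. A minimal sketch, assuming an otherwise standard GLES2 render loop (Swift; this setup code is mine, not from the answer above):

import OpenGLES

glClearDepthf(0.0)                     // default is 1.0, which would fail every GL_GREATER test
glClear(GLbitfield(GL_DEPTH_BUFFER_BIT))
glEnable(GLenum(GL_DEPTH_TEST))
glDepthFunc(GLenum(GL_GREATER))        // higher z values now win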
Is this the official iOS coordinate system, or is that the case only when working with Core Graphics?
(Positive X is to the right and positive Y is down.)
This is the UIKit coordinate space. Core Graphics (also Core Text) puts the origin in the lower left by default. On iOS it is common for the coordinate space to be flipped for Core Graphics so that it matches UIKit.
Yes, it uses modified coordinates
https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/dq_overview/dq_overview.html#//apple_ref/doc/uid/TP30001066-CH202-TPXREF101
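If you want the lower-left origin back inside UIKit drawing code, the usual trick is to flip the context yourself. An illustrative sketch (the view class here is hypothetical):

import UIKit

class FlippedDrawingView: UIView {  // hypothetical example view
    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        // UIKit hands you a context with a top-left origin and +Y down;
        // translate and scale to restore Quartz's lower-left origin.
        ctx.translateBy(x: 0, y: rect.height)
        ctx.scaleBy(x: 1, y: -1)
        // Drawing from here on uses a lower-left origin with +Y up.
    }
}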
I am working on a Swift Playground with a 3D model view (SceneKit) and have recently been introduced to shaders (.shader files) for iOS. I've made shaders for Unity before, and I was hoping for some help replicating a Unity shader from Shader Graph in the form of a .shader file for a shark. Thanks!
Here's the Unity Shader Graph I'm trying to replicate: [screenshot of the graph]
This is the original shader code that I tried for the sinusoidal movement. It didn't work well: instead, the model rippled in all directions. I tried changing the axis, but after that nothing in the shader moved at all.
_geometry.position.xz += _geometry.position.xz *
    sin(30.0 * _geometry.position.y - 3.0 * u_time) * 0.05 *
    (u_time < 3.0 ? u_time / 3.0 : 1.0);
This shader was saved as a .shader file and added to my mesh.
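For what it's worth, here is a sketch of how a single-axis sway could be wired up as a SceneKit geometry shader modifier from Swift. The axis, frequency, and amplitude are guesses that depend on how the model is oriented, and sharkNode is a placeholder for your model's node; this is not the original Unity graph:

import SceneKit

// Guessed constants: sway along x, driven by position along the body (z),
// with the amplitude ramping in over the first 3 seconds via u_time.
let swayModifier = """
float ramp = (u_time < 3.0) ? (u_time / 3.0) : 1.0;
_geometry.position.x += sin(3.0 * _geometry.position.z - 3.0 * u_time) * 0.05 * ramp;
"""
sharkNode.geometry?.firstMaterial?.shaderModifiers = [.geometry: swayModifier]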
I'm using SceneKit on iOS and I have a geometry I want to render as a wireframe. So basically I want to draw only the lines, so no textures.
I figured out that I could use the shaderModifiers property of the used SCNMaterial to accomplish this. Example of a shader modifier:
material.shaderModifiers = [
    SCNShaderModifierEntryPointFragment: "_output.color.rgb = vec3(1.0) - _output.color.rgb;"
]
This example apparently simply inverts the output colors. I know nothing about this 'GLSL' language I have to use for the shader fragment.
Can anybody tell me what code I should use as the shader fragment to draw only near the edges, so the geometry looks like a wireframe?
Or maybe there is a whole other approach to render a geometry as a wireframe. I would love to hear it.
Try setting the material's fillMode to .lines (iOS 11+ and macOS 10.13+):
sphereNode.geometry?.firstMaterial?.fillMode = .lines
Now it is possible (at least in Cocoa) with:
gameView.debugOptions.insert(SCNDebugOptions.showWireframe)
or you can toggle it interactively by enabling the statistics overlay with:
gameView.showsStatistics = true
(gameView is an instance of SCNView)
This is not (quite) an answer, because this is a question without an easy answer.
Doing wireframe rendering entirely in shader code is a lot more difficult than it seems like it should be, especially on mobile where you don't have a geometry shader. The problem is that the vertex shader (and subsequently the fragment shader) just doesn't have the information needed to know where polygon edges are.
I know nothing about this 'GLSL' language I have to use for the shader fragment.
If you really want to tackle this problem, you'll need to learn some more about GLSL (the OpenGL Shading Language). There are loads of books and tutorials out there for that.
Once you've got some GLSL under your belt, take a look at some of the questions (like this one pulled from the Related sidebar) and other stuff people have written about the problem. (Note that when you're looking for mobile-specific limitations, OpenGL ES has the same limitations as WebGL on the desktop.)
With SceneKit, you have the additional wrinkle that you probably don't have a barycentric-coordinates vertex attribute (aka SCNGeometrySource) for the geometry you're working with, and you probably don't want to do the hard work of generating one. In OS X, you can use an SCNProgram with a geometryShader to add barycentric coordinates before the vertex/fragment shaders run — but then you have to do your own shading (i.e. you can't piggyback on the SceneKit shading like you can with shader modifiers). And that isn't available in iOS — the hardware there doesn't do geometry shaders. You might be able to fake it using texture coordinates if those happen to be lined up right in your geometry.
It might be easier to just draw the object using lines — try making a new SCNGeometry from the sources and elements of your original (solid) geometry, but when recreating the SCNGeometryElement, use SCNPrimitiveTypeLine.
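A sketch of that line-primitives idea (assuming triangle elements with 32-bit indices; a robust version would also handle 16-bit ones by checking element.bytesPerIndex):

import SceneKit

// Rebuild a geometry so each triangle is drawn as its three edges.
func wireframeGeometry(from geometry: SCNGeometry) -> SCNGeometry {
    var lineElements: [SCNGeometryElement] = []
    for element in geometry.elements where element.primitiveType == .triangles {
        let indices = element.data.withUnsafeBytes {
            Array($0.bindMemory(to: UInt32.self))  // assumes 32-bit indices
        }
        var lines: [UInt32] = []
        for i in stride(from: 0, to: indices.count, by: 3) {
            let (a, b, c) = (indices[i], indices[i + 1], indices[i + 2])
            lines += [a, b, b, c, c, a]  // the triangle's three edges
        }
        lineElements.append(SCNGeometryElement(indices: lines, primitiveType: .line))
    }
    return SCNGeometry(sources: geometry.sources, elements: lineElements)
}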
On iOS 4, GL_OES_standard_derivatives is only supported on the device (from what I see when I print out the extensions). Is there a way to:
Detect in the fragment shader whether the extension is supported or not
If it's not supported, does anyone have code for dFdx and dFdy? I can't seem to find anything on Google.
TIA!
I had the same issue antialiasing SDF (signed distance field) fonts. You can calculate an approximate dFdx/dFdy by transforming two 2D vectors using the current transform matrix:
vec2 p1(0, 0);
vec2 p2(1, 1);
p1 = TransformUsingCurrentMatrix(p1);
p2 = TransformUsingCurrentMatrix(p2);

float magic = 35.0; // you'll need to play with this - it's linked to screen size I think :P
float dFdx = (p2.x - p1.x) / magic;
float dFdy = (p2.y - p1.y) / magic;
Then send dFdx/dFdy to your shader as uniforms, and simply multiply by your parameter to get the same functionality, i.e.:

dFdx(myval) becomes dFdx * myval
dFdy(myval) becomes dFdy * myval
fwidth(myval) becomes abs(dFdx * myval) + abs(dFdy * myval)
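On the detection half of the question: you can also check from the host side before choosing a shader variant. A small Swift sketch (assumes a current GLES context; inside the shader itself, GLSL ES predefines a macro named after each supported extension, so #ifdef GL_OES_standard_derivatives works there too):

import OpenGLES

// Returns true if the driver advertises the derivatives extension.
func supportsStandardDerivatives() -> Bool {
    guard let ext = glGetString(GLenum(GL_EXTENSIONS)) else { return false }
    return String(cString: ext).contains("GL_OES_standard_derivatives")
}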
In a Direct3D application filled with Sprite objects from D3DX, I would like to be able to globally adjust the brightness and contrast. The contrast is important, if at all possible.
I've seen this question here about OpenGL: Tweak the brightness/gamma of the whole scene in OpenGL
But that doesn't give me what I need in a DirectX context. I know this is something I could probably do with a pixel shader too, but that seems like shooting a fly with a bazooka, and I worry about backwards compatibility with older GPUs that would have to run any shaders in software. It seems like this should be possible; I remember even much older games, like the original Half-Life, having settings like this well before the days of shaders.
EDIT: Also, please note that this is not a fullscreen application, so this needs to affect just the one Direct3D device; it can't be a global setting for the monitor overall.
A lot of games increase brightness by, literally, drawing a full-screen poly over the scene with additive blending, i.e.:
SRCBLEND = ONE
DESTBLEND = ONE
and then drawing a texture with a colour of (1, 1, 1, 1) will increase the brightness(ish) of every pixel by 1 (the result clamps at white, so in practice you would draw smaller values).
To adjust contrast, you need to do something similar but MULTIPLY by a constant factor instead. That requires blend settings as follows:
SRCBLEND = DESTCOLOR
DESTBLEND = ZERO
This way, if you blend with a value of (2, 2, 2, 2) then every pixel is scaled up, changing the contrast. (Note that ordinary 8-bit texture sources clamp at 1, so scaling up in practice uses a trick like SRCBLEND = DESTCOLOR, DESTBLEND = SRCCOLOR, which computes 2 * src * dest; scaling down with values below 1 works directly.)
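In standard blend-equation terms (out = src * SRCBLEND + dest * DESTBLEND), the two passes compute:

brightness: c_out = c_src * 1 + c_dst * 1 = c_src + c_dst
contrast:   c_out = c_src * c_dst + c_dst * 0 = c_src * c_dst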
Gamma is a far more complicated beast, and I'm not aware of a way to fake it like the above.
Neither of these solutions is entirely accurate, but they will give you a result that looks "slightly" correct. Doing either properly is much more complicated, but by using these methods you'll see effects that look INCREDIBLY similar to various games you've played, and that's because this is EXACTLY what those games are doing ;)
For a game I made in XNA, I used a pixel shader. Here is the code. [disclaimer] I'm not entirely sure the logic is right, and some settings can have weird effects! [/disclaimer]
float offsetBrightness = 0.0f; //can be set from my C# code, [-1, 1]
float offsetContrast = 0.0f; //can be set from my C# code [-1, 1]
sampler2D screen : register(s0); //can be set from my C# code
float4 PSBrightnessContrast(float2 inCoord : TEXCOORD0) : COLOR0
{
    return (tex2D(screen, inCoord.xy) + offsetBrightness) * (1.0 + offsetContrast);
}

technique BrightnessContrast
{
    pass Pass1 { PixelShader = compile ps_2_0 PSBrightnessContrast(); }
}
You didn't specify which version of DirectX you're using, so I'll assume 9. Probably the best you can do is use the gamma-ramp functionality (IDirect3DDevice9::SetGammaRamp). Pushing the ramp up and down increases or decreases brightness; increasing or decreasing its gradient alters contrast.
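As a rough model (my framing, not the answer's): treat each ramp entry as a linear map of the input level i in [0, 1],

r(i) = clamp(g * i + b, 0, 1)

where raising the offset b brightens everything and the slope g sets the contrast.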
Another approach is to edit all your textures. It's basically the same idea as the gamma ramp, except you apply the ramp manually to the pixel data in each of your image files when generating the textures.
Just use a shader. Anything that doesn't at least support SM 2.0 is ancient at this point.
The answers in this forum may be of help, as I have never tried this.
http://social.msdn.microsoft.com/Forums/en-US/windowsdirectshowdevelopment/thread/95df6912-1e44-471d-877f-5546be0eabf2