How can I make my WebGL Coordinate System "Top Left" Oriented?

For computational efficiency, I use a fragment shader to implement a simple 2D metaballs algorithm. The data of the circles to render is top-left oriented.
I have everything working, except that the origin of WebGL's coordinate system (bottom-left) is giving me a hard time: the rendered output is mirrored along the horizontal axis.
Following https://webglfundamentals.org/webgl/lessons/webgl-2d-rotation.html (and others), I tried to rotate things using a vertex shader, unfortunately without success.
What is the simplest way of reorienting WebGL's coordinate system?
I'd appreciate any hints and pointers, thanks! :)
Please find a working (not working ;) ) example here:
https://codesandbox.io/s/gracious-fermat-znbsw?file=/src/index.js

Since you are using gl_FragCoord in your fragment shader, you can't fix this from the vertex shader, because gl_FragCoord gives you canvas coordinates, but upside down. You could easily invert the y coordinate in JavaScript when you pass the circle data through to WebGL:
gl.uniform3fv(gl.getUniformLocation(program, `u_circles[${i}]`), [
  circles[i].x,
  canvas.height - circles[i].y - 1,
  circles[i].r
]);
If you want to do it in the shader and keep using gl_FragCoord, then you should pass the height of the canvas to the shader as a uniform and convert y there, with something like

vec2 screenSpace = vec2(gl_FragCoord.x, canvasHeight - gl_FragCoord.y - 1.0);

The -1.0 is there because pixel coordinates start at 0.
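Putting the two pieces together, here is a minimal fragment-shader sketch of that approach; u_canvasHeight is an assumed uniform name, set from JavaScript with gl.uniform1f(gl.getUniformLocation(program, "u_canvasHeight"), canvas.height):

precision mediump float;

uniform float u_canvasHeight; // canvas height in pixels (assumed name)

void main() {
  // Convert gl_FragCoord (bottom-left origin) to a top-left origin.
  vec2 screenSpace = vec2(gl_FragCoord.x, u_canvasHeight - gl_FragCoord.y - 1.0);

  // Visualize the flipped y as a gradient: dark at the top, bright at the bottom.
  gl_FragColor = vec4(vec3(screenSpace.y / u_canvasHeight), 1.0);
}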

Related

WebGL - Get fragment coordinates within a shape in triangle mode? GL_FragCoord doesn't work

I'm trying to create a WebGL shader that can both output solid rectangles as well as hollow rectangles (with a fixed border width) within the same draw call, and so far, the best way I've thought of how to do it is as follows:
In the vertex shader, send in a uniform value: uniform float borderWidth
and then inside the fragment shader, I need a coordinate space where x = [0, 1] and y = [0, 1], with x=0 when we are at the leftmost and y=0 when we are at the topmost of the shape's borders, or something like that. Once I have that, drawing the lines is straightforward and I can figure it out from there; I can use something like:
1a - A smoothstep from x=0 to x=borderWidth for the left vertical line and from x=1-borderWidth to x=1 for the right one
1b - Something similar for the horizontal lines and the y coordinate
The Problem
The problem I'm facing is that I can't create that coordinate space. I tried using gl_FragCoord, but I think it's undefined for shapes rendered in TRIANGLES mode. So I'm kinda lost. Anyone have any suggestions?
gl_FragCoord is never undefined. It is the position of the fragment in the output buffer (like your screen): if you're rendering to the center of a Full HD screen, gl_FragCoord is vec4(960.0, 540.0, depth, 1.0/w). However, this data is of no use for what you're trying to do.
What you describe sounds like you need barycentric coordinates. You have to define them as additional attributes next to your vertex positions, then pass them through to the fragment shader as varyings so they're interpolated across the triangle. If you render non-indexed geometry and use WebGL 2, you can instead derive the barycentrics using gl_VertexID % 3.
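A minimal sketch of the WebGL 2 route, for non-indexed triangles; a_position, u_borderWidth, and the in/out names are illustrative. Note that barycentrics outline every triangle edge, so on a rectangle built from two triangles the shared diagonal shows up as well; for a clean rectangle border you would instead pass per-vertex [0, 1] coordinates as an attribute in the same way. The vertex shader:

#version 300 es
in vec4 a_position;
out vec3 v_barycentric;

void main() {
  int i = gl_VertexID % 3; // 0, 1, 2 within each triangle
  // Vertex 0 -> (1,0,0), vertex 1 -> (0,1,0), vertex 2 -> (0,0,1).
  v_barycentric = vec3(i == 0 ? 1.0 : 0.0, i == 1 ? 1.0 : 0.0, i == 2 ? 1.0 : 0.0);
  gl_Position = a_position;
}

And the fragment shader:

#version 300 es
precision mediump float;
in vec3 v_barycentric;
uniform float u_borderWidth; // border thickness in barycentric units (assumed)
out vec4 outColor;

void main() {
  // A fragment is near a triangle edge when any barycentric component is near 0.
  float edgeDistance = min(min(v_barycentric.x, v_barycentric.y), v_barycentric.z);
  float border = 1.0 - smoothstep(0.0, u_borderWidth, edgeDistance);
  outColor = vec4(vec3(border), 1.0);
}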

Modifying Individual Pixels with SKShader

I am attempting to write a fragment shader for the app that I am working on. I pass my uniform into the shader which works but it works on the entire object. I want to be able to modify the object pixel by pixel. So my code now is....
let shader = SKShader( fileNamed: "Shader.fsh" );
shader.addUniform( SKUniform( name: "value", float: 1.0 ) );
m_image.shader = shader;
Here the uniform "value" will be the same for all pixels. But, for example, let's say I want to change "value" to "0.0" after a certain number of pixels are drawn. So, for example....
shader.addUniform( SKUniform( name: "value", float: 1.0 ) );
// 100 pixels are drawn
shader.addUniform( SKUniform( name: "value", float: 0.0 ) );
Is this even possible with SKShader? Would this have to be done in the shader source?
One idea I was thinking of was using an array uniform but it doesn't appear that SKShader allows this.
Thanks for any help in advance.
In general, the word uniform means unchanging: something that's the same in all cases or situations. Such is the way of shader uniforms: even though the shader code runs independently (and in parallel) for each pixel in a rendered image, the value of a uniform variable input to the shader is the same across all pixels.
While you could, in theory, pass an array of values into the shader representing the colors for every pixel, that's essentially the same as passing an image (or just setting a texture image on the sprite)... at that point you're using a shader for nothing.
Instead, you typically want your GLSL(ish*) code to, if it's doing anything based on pixel location, find out the coordinates it's writing to and calculate a result based on that. In a shader for SKShader, you get the current coordinates from the vec2 v_tex_coord shader variable, normalized to [0, 1] across the sprite.
(This looks like a decent tutorial (with links to others) for getting started on SpriteKit shaders. If you follow other tutorials or shader code libraries for help doing cool stuff with pixel shaders, you'll find ideas and algorithms you can reuse, but the ways they find the current output pixel will be different. In a shader for SpriteKit, you can usually safely replace gl_FragCoord with v_tex_coord.)
* SKShader doesn't use actual GLSL per se; it uses a subset of GLSL that is automatically translated to appropriate GPU code for the device/renderer in use.
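For illustration, a hypothetical Shader.fsh along those lines; v_tex_coord and u_texture are supplied by SpriteKit, and the value uniform added via addUniform is available by name without being declared:

void main() {
  vec4 color = texture2D(u_texture, v_tex_coord);
  // Vary behavior by position instead of changing the uniform mid-draw:
  // v_tex_coord.x runs from 0.0 (left edge) to 1.0 (right edge) of the sprite.
  float strength = v_tex_coord.x < 0.5 ? value : 0.0;
  gl_FragColor = vec4(color.rgb * (1.0 - strength), color.a);
}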

Projective texture mapping in WebGL

I wrote two simple WebGL demos which use a 512x512 image as a texture, but the result is not what I want. I know the solution is to use projective texture mapping (or is there another solution?), but I have no idea how to implement it in my simple demos. Can anyone help?
The results are as follows (both of them are incorrect):
Codes of demos are here: https://github.com/jiazheng/WebGL-Learning/tree/master/texture
note: Both the model and texture could not be modified in my case.
In order to get perspective-correct texture mapping, you must actually be using perspective. That is, instead of narrowing the top of your polygon along the x axis, move it backwards along the z axis, and apply a standard perspective projection matrix.
I'm a little hazy on the details myself, but my understanding is that the way the perspective matrix maps the z coordinate into the w coordinate is the key to getting the GPU to interpolate along the surface “correctly”.
If you have already-perspective-warped 2D geometry, then you will have to implement some method of restoring it to 3D data, computing appropriate z values. There is no way in WebGL to get a perspective quadrilateral, because the primitives are triangles and there is not enough information in three points to define the texture mapping you're looking for unambiguously — your code must use the four points to work out the corresponding depths. Unfortunately, I don't have enough grasp of the math to advise you on the details.
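As a sketch of that first approach, assuming a u_projection matrix uniform built on the JavaScript side and illustrative attribute names, the vertex shader takes genuinely 3D positions instead of pre-narrowed 2D ones:

// Keep the quad rectangular in x/y and push its far edge back in z;
// the perspective projection matrix does the narrowing.
attribute vec3 a_position;
attribute vec2 a_texCoord;
uniform mat4 u_projection;
varying vec2 v_texCoord;

void main() {
  v_texCoord = a_texCoord;
  // The z -> w mapping of the projection is what makes the GPU
  // interpolate v_texCoord perspective-correctly across the surface.
  gl_Position = u_projection * vec4(a_position, 1.0);
}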
You must specify vec4 texture coordinates, not vec2. The 4th field in each vec4 is the homogeneous w that, when divided into x and y, produces your desired coordinate. This in turn should allow the perspective-correction division in hardware to give you a non-affine mapping within the triangle, provided your numbers are correct. Now, if you use a projection matrix to transform a vec4 with w=1 in your vertex shader, you should get the correct vec4 numbers, ready for perspective correction, going into setup and rasterization for your fragment shader. If this is unclear, then you need to seek out tutorials on projective texture transformation and homogeneous coordinates in projection.
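On the sampling side, a minimal GLSL ES 1.00 fragment-shader sketch of this; v_texCoord is an assumed varying carrying the vec4 coordinate from the vertex shader:

precision mediump float;

uniform sampler2D u_texture;
varying vec4 v_texCoord; // last component holds the homogeneous divisor

void main() {
  // texture2DProj divides the coordinate by its last component before
  // sampling, i.e. texture2D(u_texture, v_texCoord.xy / v_texCoord.w).
  gl_FragColor = texture2DProj(u_texture, v_texCoord);
}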

HLSL correct pixel position in deferred shading

In OpenGL, I am using the following in my pixel shaders to get the correct pixel position, which is used to sample diffuse, normal, position gbuffer textures:
ivec2 texcoord = ivec2(textureSize(unifDiffuseTexture, 0) * (gl_FragCoord.xy / UnifAmbientPass.mScreenSize));
So far, this is what I do in HLSL:
float2 texcoord = input.mPosition.xy / gScreenSize;
Most notably, in GLSL I am using textureSize() to get an accurate pixel position. I am wondering, is there an HLSL equivalent to textureSize()?
In HLSL, you have GetDimensions.
But it may be costlier than reading the size from a constant buffer, even if it looks easier to use at first for quick tests.
You also have an alternative using SV_Position and Load: just use the xy as a uint2, and you remove the need for a user interpolator carrying a texture coordinate to index the screen.
Here is the full documentation of a TextureObject.
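A short pixel-shader sketch of both options; the texture name and register are illustrative:

Texture2D gDiffuseTexture : register(t0);

float4 PS(float4 position : SV_Position) : SV_Target
{
    // GetDimensions is the HLSL counterpart of GLSL's textureSize().
    uint width, height;
    gDiffuseTexture.GetDimensions(width, height);
    float2 texcoord = position.xy / float2(width, height); // normalized [0, 1], for Sample()

    // Alternative: SV_Position.xy is already in pixels, so Load can index
    // the texture directly (mip 0) with no sampler and no user interpolator.
    return gDiffuseTexture.Load(int3(uint2(position.xy), 0));
}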

Will WebGL ever render points as circles?

On desktop OpenGL, points will sometimes be rendered as circles (if you have set gl_PointSize in the vertex shader). I am tinkering with WebGL and it seems to consistently render points as squares (when gl_PointSize is set). Is there a way to get them to render as circles?
Yes, there is a solution. You can do that using point sprites: send a texture to the shader and use alpha blending to cut off the unnecessary part of the sprite.
Normally (in desktop OpenGL) you may see points rendered as circles when you have MSAA and the POINT_SMOOTH feature enabled.
Below are links with all the information you need :)
OpenGL ES 2.0 Equivalent for ES 1.0 Circles Using GL_POINT_SMOOTH?
http://klazuka.tumblr.com/post/249698151/point-sprites-and-opengl-es-2-0
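An alternative that needs no texture at all is to discard fragments outside the inscribed circle using gl_PointCoord, which WebGL provides in the fragment shader when rendering gl.POINTS; a minimal sketch:

precision mediump float;

void main() {
  // gl_PointCoord runs from (0,0) to (1,1) across the point sprite.
  vec2 fromCenter = gl_PointCoord - vec2(0.5);

  // Keep only the inscribed circle.
  if (length(fromCenter) > 0.5) {
    discard;
  }
  gl_FragColor = vec4(1.0); // solid white circle
}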
