SceneKit Directional Light causing flickering - ios

I'm trying to add a directional light in SceneKit to cast shadows, but it is causing weird artefacts on objects.
The orange block below has a material with default settings and the diffuse set to orange.
The directional light is pointing downwards, and the scale is increased; otherwise it has default settings. (Making the scale smaller still gives the same issue.)
When I pan the camera around, the texture is covered in flickering lines and dots; it looks terrible.
This isn't visible on the simulator, only the device. What is going on and how can I fix it?

Thanks to Toyos I now know that self-shadowing is what's causing the lines. The docs for shadowBias say setting this value should correct it, but for me it made no difference.
In the end I fixed it by rotating the directional light by 2 degrees. It was originally at -90, pointing straight down. Changing this to -88 has completely removed all the artefacts.
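For reference, a minimal sketch of that workaround, assuming the directional light sits on a node called lightNode created in code (the name and setup are illustrative, not from the question):
let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light?.type = .directional
lightNode.light?.castsShadow = true
// Pointing straight down (-90°) produced the artefacts; a 2° tilt removed them.
lightNode.eulerAngles.x = -88 * .pi / 180
scene.rootNode.addChildNode(lightNode)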

Configure the zNear/zFar range of your light to make it as small as possible (but without clipping your world). The smaller the z-range, the more precision you get.
You can also play with the shadowBias to limit the self-shadowing artefacts.
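A rough Swift sketch of those suggestions (the specific values are illustrative, not from the answer):
let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.orthographicScale = 10   // the directional light's "scale", as in the question
// Keep the zNear/zFar range as tight around your geometry as possible;
// a smaller range means more depth precision in the shadow map.
light.zNear = 1
light.zFar = 20
// Nudge upwards to reduce self-shadowing ("shadow acne").
light.shadowBias = 4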

Related

SCNParticleSystem partially hidden or occluded in wrong manner

I have an issue with particle systems, which in rare cases are occluded in the wrong manner. The particle system you see in the screenshots is a sphere (with an invisible material, material transparency = 0.0) that emits particles from its surface, at about 250 particles per second - no magic - and the particle system works as it should 99% of the time.
You can also see a floor (an SCNPlane) with a very large size, about 100m x 100m. The occlusion happens when the camera is flying by and the viewing angle changes a little, because the camera moves smoothly. Depending on the camera angle, it can happen - as you see in the second image - that the particle system is partially occluded in the wrong way, as if it were behind the horizon - but it is not - it hovers 2m above the floor and has a radius of 1m.
Has anyone run into a similar issue? Is there something that can be done to make this render correctly in all cases (from all viewing angles)?
Sometimes the particle system even disappears completely, e.g. when the camera looks down on it directly from 20m above.
(The scene uses physically based rendering in SceneKit - the background is a simple skybox.)
You asked if anyone has run into a similar issue?
I can answer yes!
Depending on the point of view (camera position), and the object on which the SCNParticleSystem is attached, I'm getting weird occlusions of the emitted particles.
I have no SCNPlane, but I have a large SCNSphere around the scene showing a 360 video. If I remove the sphere, the bug doesn't occur anymore.
It might be a regression in iOS 14.x and macOS 11.2, as the same application running under iOS 13.6.1 doesn't show the problem!
In case it helps somebody: I had a similar problem and spent a while trying different particle-system settings.
One solution was to increase the rendering order of the node that contains the particles, but then the particles disappear if you change the camera orientation.
By chance, I discovered that the bug happens when I add a specific node to the scene. The one difference I found was that this node had a material with its transparency mode set to "Dual layer", which I had used to make a transparent texture.
Changing the mode to "Default" fixed it.
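A hedged sketch of both workarounds in code (particlesNode and offendingMaterial are placeholder names, not from the answers):
// 1. Draw the particle node later than the large floor/sphere geometry.
particlesNode.renderingOrder = 100
// 2. Avoid the "Dual layer" transparency mode on the offending material.
offendingMaterial.transparencyMode = .default   // instead of .dualLayer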

Weird gaps appearing between SKSpriteNodes - swift

Note: I have tried this answer: Gap between SKSpriteNodes in SpriteKit collision detection
I am getting gaps between my SKSpriteNodes after 5 minutes of letting my game run. Here is my code to make the node:
let tileNode = SKSpriteNode(imageNamed: "world1Tile\(tileNumber).png")
tileNode.position.x = x
tileNode.position.y = y
tileNode.size.width = 128
tileNode.size.height = 128
tileNode.zPosition = 10
tileNode.physicsBody = SKPhysicsBody(rectangleOf: tileNode.size)
tileNode.physicsBody!.isDynamic = false
tileNode.physicsBody!.restitution = 0
tileNode.physicsBody!.allowsRotation = false
tileNode.name = "Tile"
row.append(tileNode)
When I remove the physics body, it runs fine. Here are some images to show you what I mean:
This image has a physics body and was taken immediately after running the app.
This image was taken 5 minutes after running the app.
Why is this happening? I assume it has something to do with the physics body, because my app looks exactly like the first picture, even an hour after running the app if there is no physics body. What physics body property should I change to stop this from happening? Any help would be appreciated.
I had a similar issue not too long ago, where gaps were appearing between tiled nodes (although I didn't use physics). Based on this answer, I found that if you want perfect alignment between nodes, it is best to ensure that the nodes' positions, as well as their widths and heights, are whole numbers.
I would suggest rounding off the x and y values of tileNode's position and seeing if that makes any difference.
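Applied to the question's code, that suggestion might look like this (assuming x and y are CGFloats):
// Snap the tile's position to whole points before adding it.
tileNode.position = CGPoint(x: x.rounded(), y: y.rounded())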
I'm guessing there is no gap. You probably have showsPhysics set to true in your GameViewController, and the physics debug outline appears as a gap to me.
Compare positions with and without the physics body to verify.
I had a similar problem where gaps between sprites started appearing after around 5 minutes of scrolling at constant speed (a game with infinite scroll). I did not use physics, and I even had all positions, widths, and heights rounded to integer values. I was scrolling the camera and adding new sprites one right after another, and everything worked fine, except that after around 5 minutes of that infinite scrolling gaps began to appear, just as in your case. After spending some time looking for a solution, it turned out that the problem was that the positions of my objects were becoming large - in my case, an X position of around 150000 in the scene - and that is when the gaps started to appear. I also noticed that the problem occurred only on devices that had to scale the scene: I was using aspect fill with the default scene size for the iPhone 6 resolution, and the gaps only appeared on the iPhone 5, not on the iPhone 6. I was able to fix the issue by subtracting a constant value from the X position of all objects (including the camera position) from time to time, so that nothing moved relative to the camera and everything looked the same, but the absolute positions were kept low. I guess that large position values like 150000 combined with scene scaling cause some floating point rounding issue in SpriteKit, which is why the gaps become visible.
So based on my experience, if you see similar gaps, I recommend using integer values for all positions, widths, and heights, and additionally keeping the position values of all objects low.
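A rough sketch of that re-centering trick (the node setup and threshold are assumptions, not from the answer):
// Once the camera's x position grows large, shift every node (camera included,
// if it is a child of the scene) back by the same amount: relative layout is
// unchanged, but absolute coordinates stay small.
func recenterIfNeeded(scene: SKScene, camera: SKCameraNode, threshold: CGFloat = 10_000) {
    guard camera.position.x > threshold else { return }
    let shift = camera.position.x.rounded()
    for node in scene.children {
        node.position.x -= shift
    }
}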
For future reference in case someone is still searching for this, here are my experiences:
If tiles have PhysicsBodies, they are prone to creating gaps. A solution for me was making a blank SKNode as a child of the tile and assigning the PhysicsBody to that (see the sketch after this answer).
If possible, make sure bit masks are set in a way that tiles can't collide with each other.
As stated in a previous answer, make sure all measurements are integers and rounded in a way that doesn't leave a one unit gap between them.
A related problem is SpriteKit's PhysicsBody drifting. There are some threads about this (e.g. https://forums.developer.apple.com/thread/27057 ), and it seems to be a bug. In my case, the problem was a combination of PhysicsBodies causing random small gaps and the drifting making some of them larger. The steps above removed the small gaps. Unfortunately, the only workaround for the drifting problem in my case was to generate PhysicsBodies only for nodes within a certain distance of the player and to destroy them once they are left behind.
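A minimal sketch of the blank-child-node idea from the first point, reusing tileNode from the question:
let physicsHolder = SKNode()
physicsHolder.physicsBody = SKPhysicsBody(rectangleOf: tileNode.size)
physicsHolder.physicsBody?.isDynamic = false
physicsHolder.physicsBody?.restitution = 0
tileNode.physicsBody = nil      // the visible tile no longer carries the body itself
tileNode.addChild(physicsHolder)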
For future reference for anyone who needs it: I found a different answer specific to my problem. As JohnV said, I needed to round off values when setting the position, but while running the code I found that I also need to do this when running SKActions.

How to avoid dynamics deformation using UIDynamicAnimator

I have a simple tetris-like app where I am dropping square colored labels from the top of the view and they pile up once they collide with the bottom border of the window.
I am using the iOS dynamics framework to simulate the gravity and the collision.
I have 2 questions:
How can I completely eliminate the bouncing effect when the blocks collide with the bottom border? I have tried setting the elasticity of the collision behaviour to 0 (the documentation says this implies no bouncing at all), but the blocks still bounce a bit.
So I guess it is at least a bug in the documentation. Can anybody confirm this before suggesting other workarounds?
Another technique I have tried is to set a very high resistance when an object starts colliding and to reset it to a low resistance when the collision ends. The problem with this approach is that the behaviour is generic for all the blocks, so the other falling blocks would be affected (and slowed down) every time a collision happens at the bottom.
The second question is how I can stop the animation engine from squeezing the blocks as they pile up (simulating the real-world effect of gravity on not-completely-rigid bodies). In my app I can clearly see that the blocks are not aligned, because they get squeezed by the weight of the blocks above.
How can I avoid this behaviour? I have tried setting the density to 1 and the elasticity to 0, without luck.
I have also noticed that some blocks at the bottom have a y coordinate of 481, which means they have been pushed out of their parent view. How is this possible, given that the bottom is treated as a collision boundary?
To eliminate the bouncing effect, add a UIDynamicItemBehavior to your animator and set its elasticity property to 0. You can also set its resistance property to 1, or to the maximum, CGFLOAT_MAX.
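A short sketch of that answer (assuming animator is your UIDynamicAnimator and blocks is the array of falling labels):
let itemBehavior = UIDynamicItemBehavior(items: blocks)
itemBehavior.elasticity = 0   // no bounce when the blocks land
itemBehavior.resistance = 1   // or raise towards CGFloat.greatestFiniteMagnitude (CGFLOAT_MAX)
animator.addBehavior(itemBehavior)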

WebGL z-buffer artifacts?

We are working on a Three.js based WebGL project and have trouble understanding how transparency is handled in WebGL. The image shows a double-sided surface drawn with alpha = 0.7, which behaves correctly on its right side. However, closer to the middle strange artifacts appear, and on the left side the transparency does not seem to work at all.
http://emilaxelsson.se/sandbox/vis1/alpha.png
The problem can also be seen here:
http://emilaxelsson.se/sandbox/vis1/
Has anyone seen anything similar before? What could the reason be?
Your problem is that transparent objects need to be sorted and rendered in back-to-front order (if you change the opacity of your mesh from 0.7 (transparent) to 1.0 (opaque), you can see that the z-buffer works just fine).
See:
http://www.opengl.org/wiki/Transparency_Sorting
http://www.opengl.org/archives/resources/faq/technical/transparency.htm (15.050)
In your case it might be less trivial to solve, since I assume that you only have one mesh.
Edit: Just to summarize the discussion below: it is possible to achieve correct rendering of such a double-sided transparent mesh. To do this, you need to create 6 versions of the mesh, corresponding to the 6 sides of a cube. Each version needs to be sorted in back-to-front order based on its side of the cube (front, back, left, right, top, bottom).
When rendering choose the correct mesh (based on the camera viewing direction) and render that single mesh.
The easy solution for your case (based on the picture you attached), without resorting to expensive sorting and multiple meshes, is to disable the depth test and enable face culling. That produces acceptable results as long as you do not have any opaque objects in front of the mesh.

OpenGL ES: Drawing small objects

To best illustrate the issue I'm having, I created a short screen grab. Watch it here: http://cl.ly/1o3p3x2e2J1a1d3d2N1Q
Basically, the stars, as they're animated across the screen from right to left, are dimming and brightening on their own. I don't intend for this to happen. When you zoom in, the issue disappears.
My hunch is that this has to do with the size of the objects being drawn and the pixel boundaries. Is this correct? What is the best way to go about fixing this issue?
Thanks!
---Edit---
Here's how I'm loading the texture: http://pastebin.com/RDc8x7Te
And, here's how I'm setting up OpenGL ES: http://pastebin.com/SpvAqPqA
You use nearest and linear for texture scaling, which are both not very accurate. You might want to use linear for both, or build mipmaps. Also, if you use an orthographic view, try aligning your geometry to pixels.
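A rough Swift sketch of the mipmapped-filtering suggestion against OpenGL ES (textureId is assumed to be the GLuint name of the star texture, not taken from the linked pastebin):
import OpenGLES

// After the texture image has been uploaded: generate mipmaps and use
// trilinear minification so small sprites don't shimmer as they move.
glBindTexture(GLenum(GL_TEXTURE_2D), textureId)
glGenerateMipmap(GLenum(GL_TEXTURE_2D))
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR_MIPMAP_LINEAR)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)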
