SCNParticleSystem partially hidden or occluded in the wrong manner - iOS

I have an issue with particle systems that can, in rare cases, be occluded in a wrong manner. The particle system you see in the screenshots is a sphere (with an invisible material, material transparency = 0.0) that emits particles from its surface, at about 250 particles per second - no magic - and the particle system works as it should 99% of the time.
You also see a floor (an SCNPlane) that is very large, around 100m x 100m. The occlusion happens when the camera flies by and the viewing angle changes slightly, because the camera moves smoothly. Depending on the camera angle, the particle system can be partially occluded in a wrong manner - as you see in the second image - as if it were behind the horizon. But it is not: it hovers 2m above the floor and has a radius of 1m.
Has anyone run into a similar issue? Is there something that can be done to make this render correctly in all cases (from all viewing angles)?
Sometimes the particle system even disappears completely, e.g. when the camera looks straight down at the particle system from 20m above.
(The scene uses physically based rendering in SceneKit; the background is a simple skybox.)
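
For reference, a minimal sketch of the setup described above, assuming an existing scene; the node names and values are illustrative, taken from the description rather than from the original project:

import SceneKit

// invisible 1 m sphere hovering 2 m above the floor, emitting from its surface
let emitterSphere = SCNNode(geometry: SCNSphere(radius: 1.0))
emitterSphere.geometry?.firstMaterial?.transparency = 0.0
emitterSphere.position = SCNVector3(0, 2, 0)

let particles = SCNParticleSystem()
particles.birthRate = 250 // ~250 particles per second
particles.emitterShape = emitterSphere.geometry // emit from the sphere's surface
emitterSphere.addParticleSystem(particles)

// very large floor plane, laid flat
let floorNode = SCNNode(geometry: SCNPlane(width: 100, height: 100))
floorNode.eulerAngles.x = -.pi / 2
scene.rootNode.addChildNode(floorNode)
scene.rootNode.addChildNode(emitterSphere)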

You asked if anyone has run into a similar issue?
I can answer: yes!
Depending on the point of view (camera position) and the object the SCNParticleSystem is attached to, I get weird occlusions of the emitted particles.
I have no SCNPlane, but I do have a large SCNSphere around the scene showing a 360° video. If I remove the sphere, the bug no longer occurs.
It might be a regression in iOS 14.x and macOS 11.2, as the same application running under iOS 13.6.1 doesn't show the problem!
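
For what it's worth, a hedged sketch of that surrounding sphere, assuming an existing scene and a videoURL for the 360° footage (both hypothetical names):

import SceneKit
import AVFoundation

// large sphere wrapped around the scene, its inside faces showing a 360° video
let videoPlayer = AVPlayer(url: videoURL)
let skySphere = SCNNode(geometry: SCNSphere(radius: 50))
skySphere.geometry?.firstMaterial?.diffuse.contents = videoPlayer
skySphere.geometry?.firstMaterial?.isDoubleSided = true // render the inside of the sphere
scene.rootNode.addChildNode(skySphere)
videoPlayer.play()

Removing skySphere from the scene makes the particle occlusion disappear.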

In case somebody needs it: I had a similar problem and spent a while trying different particle system settings.
One partial solution was to increase the "Rendering order" of the node that contains the particles, but the particles still disappeared when the camera orientation changed.
By chance, I discovered that the bug occurs when I add one specific node to the scene. The one difference I found was that this node had a material with the transparency mode set to "Dual layer", which I had used to make a transparent texture.
I changed the mode to "Default" and that fixed it.
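
The same two changes expressed in code, as a sketch (particleNode and offendingNode are placeholder names for the node carrying the particle system and the node with the problematic material):

import SceneKit

// draw the particle node later so it composites correctly over other geometry
particleNode.renderingOrder = 100

// switch the material away from the "Dual layer" transparency mode
offendingNode.geometry?.firstMaterial?.transparencyMode = .default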
(Xcode screenshot of the material's transparency settings.)

Related

Unity ARKit automatically positions terrain on startup

I just started learning ARKit with Unity. I've downloaded the SDK from the Asset Store, imported it, opened the demo scene, and added a terrain under HitCubeParent as a child:
http://shrani.si/f/40/UP/1q7QqoFl/1/capture.jpg
I've added the Unity AR Hit Test Example script to the terrain and linked HitCubeParent to it:
http://shrani.si/f/6/133/3w5sasQA/1/capture1.jpg
When I build the game on an iPhone, ARKit works, but one thing that bothers me is that the terrain is positioned automatically when the scene starts (even though I don't tap on the screen). This causes bad positioning, like the terrain floating in the air or similar issues. I would like to modify the kit so that when the scene starts, only the generated blue plane is visible. The user should then adjust the position of the plane to a table or similar flat surface and tap on the screen to position the terrain on that plane.
Like this:
https://www.youtube.com/watch?v=OCzuNnejwy4
Are there any good tutorials on this? I've searched a lot but couldn't find anything useful.
Disable the Terrain and enable it after the first successful ARHitTestResult. See line 68 in UnityARHitTestExample.cs:
if (HitTestWithResultType (point, resultType))
{
    // at this point the hit test has already positioned m_HitTransform,
    // so this is where you can enable the terrain you disabled in Awake
    return;
}
This is admittedly confusing, since this HitTest method actually positions m_HitTransform and is not merely a test.
In this if block you can enable your terrain, after having disabled it in the Awake method.

SceneKit Directional Light causing flickering

I'm trying to add a directional light in SceneKit to cast shadows, but it is causing weird artefacts on objects.
The orange block below has a material with default settings and its diffuse set to orange.
The directional light points downwards, and its scale is increased; otherwise it has default settings. (Making the scale smaller still produces the same issue.)
When I pan the camera around, the texture is covered in flickering lines and dots; it looks terrible.
This isn't visible in the simulator, only on the device. What is going on, and how can I fix it?
Thanks to Toyos I now know that self-shadowing is what's causing the lines. The docs for shadowBias say setting this value should correct it, but for me it made no difference.
In the end I fixed it by rotating the directional light by 2 degrees. It was originally at -90, pointing straight down. Changing this to -88 completely removed all the artefacts.
Configure the zNear/zFar range of your light to make it as small as possible (without clipping your world). The smaller the z-range, the more precision you get.
You can also play with the shadowBias to limit the self-shadowing artefacts.
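
Both suggestions as a hedged Swift sketch (the light node setup and the zNear/zFar values are illustrative; tune them to your scene):

import SceneKit

let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light?.type = .directional
lightNode.light?.castsShadow = true

// tilt slightly away from straight down (-90°) to avoid the flickering
lightNode.eulerAngles.x = -88 * .pi / 180

// keep the shadow depth range as tight as the scene allows for extra precision
lightNode.light?.zNear = 1
lightNode.light?.zFar = 30

// and/or tune the shadow bias to limit self-shadowing
lightNode.light?.shadowBias = 1

scene.rootNode.addChildNode(lightNode)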

Cross-viewport occlusion culling

I'm currently working on an XNA project where I need to create a picture-in-picture overlay to display a 3D scene from multiple angles. Currently I'm trying to use two viewports to do this. The main one fills the whole screen and works as desired. The second is placed in one of the corners of the first (overlapping that corner) and is less than a fifth of its size. Apart from the size and placement of the viewports, the only real difference between the two is the placement of their cameras within the scene.
As long as the second viewport is drawn second and there are no objects close to the first viewport's camera in the overlapping corner, this actually works great. However, if there is an object close to the camera and in that corner of the first viewport, the second viewport's objects seem to be occlusion-culled by the first viewport's object, even though the occluding object itself is not drawn in the second viewport's space.
My question is: how would one prevent this "cross-viewport culling" from happening? I've searched all over, and the closest threads I could find suggest drawing the second viewport to a RenderTarget2D and using a SpriteBatch to display the resulting texture. Doing so does fix the occlusion issue, but it wreaks havoc on the z-ordering, CCW culling, and my water effects, none of which I have ever had issues with using the default render target.
This issue, at least in my case, seems to have been resolved by simply clearing the depth buffer between drawing the two viewports. I did this by adding the following line between the function calls that draw the individual viewports:
// discard the first viewport's depth values before drawing the second
GraphicsDevice.Clear(ClearOptions.DepthBuffer, Color.Black, 1, 0);
I ran into this solution while trying to use the stencil buffer to prevent the main viewport from drawing under the second, but nothing I tried before this discovery had a noticeable effect. After I removed all the stencil code, it looks like I'm getting the desired effect. I can't fully explain why this works without messing up the first viewport; presumably, since the first viewport has already been drawn by that point, clearing the depth buffer simply stops the second viewport's geometry from being depth-tested against the first's.

Why do gaps between tiles appear in an orthogonal tilemap cocos2d game when running on iPhone?

I'm trying to make a tilemap-based game using cocos2d 2.1 and Tiled 0.9.1. The game runs perfectly on the simulator, but I have gaps (artifact lines) between the tiles when running on the device.
Please see the screenshot.
The diff is the difference (made in Photoshop) between the original tile (taken straight from the PNG of the tileset) and the tile as rendered by cocos2d. As you can see, in the simulator they are 100% identical. However, on the device it seems that cocos2d shrinks the tile texture vertically by just a little bit. The 1-pixel stripe is actually the texture above the troublesome tile in the tileset.
Any idea what caused this and how to fix it?
While using this answer: in my case, enabling CC_FIX_ARTIFACTS_BY_STRECHING_TEXEL was not enough.
I also added the following code to the AppDelegate::applicationDidFinishLaunching() function and rounded the values passed to setPosition(x, y) to the nearest int:
// pixel-exact 2D projection, so sprites land on whole-pixel boundaries
Director::getInstance()->setProjection(Director::Projection::_2D);
I'm using cocos2d-x 3.4.
I'm not certain why this happens only on devices, but you should read about the CC_FIX_ARTIFACTS_BY_STRECHING_TEXEL parameter in ccConfig.h. It is in itself a bad kludge, but it gives you a hint as to where to look.
Basically, you should make certain that all your positions fall on an exact pixel boundary: on non-retina devices cast them to int, and on retina devices round to the nearest exact multiple of 0.5. The best way to ensure this is to make all your texture widths and heights even numbers; the onus is on the artist for anything that will not move. If you move things and the final position is calculated (for example in a ccTouches move/end handler), make certain you do the rounding there. Beware of batch nodes: the node itself and all its children should be on pixel boundaries.

OpenGL ES: Drawing small objects

To best illustrate the issue I'm having, I created a short screen grab. Watch it here: http://cl.ly/1o3p3x2e2J1a1d3d2N1Q
Basically, as the stars are animated across the screen from right to left, they dim and brighten on their own. This is not intentional. When you zoom in, the issue disappears.
My hunch is that this has to do with the size of the objects being drawn and the pixel boundaries. Is this correct? What is the best way to go about fixing this issue?
Thanks!
---Edit---
Here's how I'm loading the texture: http://pastebin.com/RDc8x7Te
And, here's how I'm setting up OpenGL ES: http://pastebin.com/SpvAqPqA
You use nearest and linear filtering for scaling textures, neither of which is very accurate. You might want to use linear filtering for both minification and magnification, or build mipmaps. Also, in case you use an orthographic view, try aligning your geometry to pixels.
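
A sketch of the mipmap-based variant in Swift, assuming the star texture is already bound (as in the pastebin loading code); the OpenGL ES calls are standard, only the surrounding setup is assumed:

import OpenGLES

// smooth magnification, and mipmapped minification so small/distant stars
// are averaged over texels instead of flickering between them
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR_MIPMAP_LINEAR)
glGenerateMipmap(GLenum(GL_TEXTURE_2D))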
