The goal is to simulate lighting similar to these images:
http://i.stack.imgur.com/4Kh0S.jpg
http://i.stack.imgur.com/LMePj.jpg
http://i.stack.imgur.com/mGfva.jpg
There is little documentation on SceneKit lighting and on how the different light types interact with each other (e.g., what happens if you add a spot light to a scene that already has an ambient light), so through painful trial and error, we have gotten this far:
As shown in the Scene Graph, there is an ambient light and a spot light. (The omni light and the directional light are hidden.) The shadows and lighting are pretty good inside the spot's cone, but everything beyond the cone of light is black.
Question 1: How do you make the area outside the spot's cone not black? An ambient light was explicitly added to the scene (not the default one), so shouldn't it brighten the areas outside the cone?
Question 2: Ideally, the whole scene would be lit as if it were inside the cone, while preserving the shadows. Is this possible? Moving the spot light to a high Y value (e.g., 1000) lights up the whole scene, but the cool shadows vanish.
Question 3: In the screenshot below, enabling the omni light washes out the spot's cone. Is this expected behavior? How can you combine the lights so they don't wash each other out?
Screenshot 2 (enabling omni light washes out spot lighting):
You can add an additional light source of ambient type with low intensity to the scene.
Here is a Swift 4 example:
let light = SCNLight()
light.type = .ambient
light.intensity = 200 // illustrative low value; the default is 1000
let node = SCNNode()
node.light = light
self.scene.rootNode.addChildNode(node)
Related
I'm trying to animate the trajectory of a particle on a spherical surface in manim. I made the sphere semi-transparent so it's easy to tell when the particle is on the front side of the sphere and when it is on the back side. However, after I plot the trajectory, even the parts that are supposed to be behind the sphere appear in the same color as the ones in front of it.
As you can see, this issue is not present in the coordinate axes: the parts of the axes that are inside the sphere have a different color, because the semi-transparent surface is between them and the camera.
I'm using the following code:
S = Sphere(center=(0, 0, 0), radius=1.09, resolution=(15, 15)).set_opacity(0.4)
S.set_color(GRAY)
self.add(axes, S)
for i in range(100):
    self.play(Create(Traj[i]))
where Traj is an array of line segments that make up the trajectory.
Even if I set the sphere's opacity to 1, I can still see the whole trajectory, even though most of it should be behind the sphere. How do I make the sphere cover the back part of the trajectory?
I have an SCNLight with type SCNLightTypeDirectional. When the scene is rendered, the model casts shadows on itself, which is not what I expected. How can I exclude the model's shadows on itself?
Or, how can I smooth the shadow edges? They look very unnatural now.
Here is the scenario:
Well, I found a simple way to achieve this, but it loses some material detail.
Change the material's lighting model to SCNLightingModelConstant and exclude the model from the lighting calculation of your SCNLight.
1. Set the lighting model
SCNLightingModelConstant only considers ambient light for shading, so we need an ambient light to keep the model visible.
model.geometry.materials.firstObject.lightingModelName = SCNLightingModelConstant;
2. Set the category bit masks of the model and the lights
model.categoryBitMask = 1;
directionalLight.categoryBitMask = ~1UL;
If the bitwise AND of the two categoryBitMask values is zero, the node is excluded from that light's illumination, so there are no self-shadows anymore. The shadows the model casts onto other nodes will still remain in the scene.
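For reference, here is the same three-step setup as a minimal Swift sketch. The modelNode, directionalLight, and scene names are placeholders for your own objects, and the ambient light is only there because the constant lighting model needs it:
// 1. Shade the model with the constant lighting model.
modelNode.geometry?.firstMaterial?.lightingModel = .constant
// 2. Mask the model out of the directional light's illumination.
modelNode.categoryBitMask = 1
directionalLight.categoryBitMask = ~1 // every bit except the model's
// 3. Keep an ambient light so the constant-shaded model stays visible.
let ambient = SCNLight()
ambient.type = .ambient
let ambientNode = SCNNode()
ambientNode.light = ambient
scene.rootNode.addChildNode(ambientNode)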
Would love help understanding directional lights and scene shadows in Scene Kit.
The class reference on SCNLight says zFar represents the maximum distance between the light and a visible surface for casting shadows. It further suggests this value only applies to spot lights.
However, in the Xcode Scene Editor, under the Attributes Inspector, there is a field for Far Clipping. Changing this value affects shadows projected by a directional light as illustrated by the screenshots below.
The scenes below were produced by dragging a directional light into the scene, changing the X Euler Angle to -60, and ticking the "Casts Shadows" box. The floor texture is taken from the WWDC Fox demo.
Is Far Clipping the same as zFar? If not, what's the difference?
Since directional lights ignore the position property, why does changing the Far Clipping value affect the shadows produced by a directional light?
The goal is to light the whole scene and project shadows onto nodes, as if the sun were out at 3 PM on a cloudless afternoon. Is it possible to use a directional light to achieve this? So far, a directional light can light the whole scene, but it doesn't offer the same control over shadows as a spotlight.
Screenshot #1: Far Clipping value is 10.
Screenshot #2: Far Clipping value is 30.
Despite what Apple's documentation says, the position of a directional light is very important when it casts shadows. zNear and zFar are distances from the directional light position.
To remove the artifact, you will need to increase zFar or move the directional light closer to the ground. The artifact is caused by the shadowed geometry being farther away from the directional light than zFar.
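Here is a rough Swift sketch of that advice, assuming iOS; the node names, angle, and distances are illustrative rather than taken from the original scene:
import SceneKit

let sunLight = SCNLight()
sunLight.type = .directional
sunLight.castsShadow = true
sunLight.zNear = 1
sunLight.zFar = 100 // must reach past the farthest surface that should receive shadows

let sunNode = SCNNode()
sunNode.light = sunLight
sunNode.eulerAngles.x = -Float.pi / 3 // roughly the -60 degree X Euler angle from the question
sunNode.position = SCNVector3(0, 20, 0) // position matters: zNear/zFar are measured from here
scene.rootNode.addChildNode(sunNode)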
I'm trying to detect an orange ball regardless of the lighting conditions. In my algorithm I convert the RGB image to HSV (whose hue should be largely independent of brightness), but when the conditions are not optimal I cannot find the ball.
Update: here are two images with different lighting conditions.
If I find the ball in the first image, I cannot find it in the second image, and vice versa.
Update: this is the result using HoughCircles:
circles = cv2.HoughCircles(img,cv2.cv.CV_HOUGH_GRADIENT,1,100,param1=75,param2=16,minRadius=100,maxRadius=1000)
However, I need to know the color of the ball. Is there a method to find the color of a circle found with HoughCircles?
One approach to your problem could be to look for circles using the Hough transform. This approach is based on the observation that there is a clear border between the ball and the background.
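As a rough Python sketch of that idea, combined with one way to read back the color of each detected circle (the file name, blur, and Hough parameters are illustrative, not tuned for these images):
import cv2
import numpy as np

img = cv2.imread("ball.jpg")
gray = cv2.medianBlur(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), 5)

# Detect circles (cv2.HOUGH_GRADIENT is the OpenCV 3+ spelling of cv2.cv.CV_HOUGH_GRADIENT).
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, 1, 100,
                           param1=75, param2=16, minRadius=100, maxRadius=1000)

if circles is not None:
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    for x, y, r in np.round(circles[0]).astype(int):
        # Average the HSV pixels inside the circle to estimate its colour.
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.circle(mask, (int(x), int(y)), int(r), 255, -1)
        mean_h, mean_s, mean_v, _ = cv2.mean(hsv, mask=mask)
        print("circle at (%d, %d), r=%d, mean HSV = (%.0f, %.0f, %.0f)"
              % (x, y, r, mean_h, mean_s, mean_v))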
I am writing a simple hex engine for an action RPG in XNA 3.1. I want to light the ground near the hero and the torches just as they were lit in Diablo II. I thought the best way to do so was to calculate the field of view, hide any tiles (and their contents) that the player can't see, and draw a special "light" texture on top of every light source: a texture that is black with a white, blurred circle in its center.
I wanted to multiply this texture with the background (as in the "multiply" blend mode), but unfortunately I don't see an option for doing that in SpriteBatch. Could someone point me in the right direction?
Or perhaps there is another, better way to achieve a lighting model like the one in Diablo II?
If you were to multiply your light texture with the scene, you would darken the area, not brighten it.
You could try rendering with additive blending; this won't quite look right, but it is easy and may be acceptable. You will have to draw your light with a fairly low alpha so the light texture doesn't oversaturate that part of the image.
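For example, a minimal sketch of that additive pass (the texture, area, and alpha value are illustrative):
spriteBatch.Begin(SpriteBlendMode.Additive);
// A low alpha keeps the additive light from blowing out the scene underneath.
spriteBatch.Draw(lightFadeOffTexture, light.Area, new Color(1f, 1f, 1f, 0.15f));
spriteBatch.End();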
Another, more complicated way of doing lighting is to draw all of your light textures (for all the lights in the scene) additively onto a second render target, and then multiply this texture with your scene. This gives much more realistic lighting, but has a larger performance overhead.
Initialisation:
RenderTarget2D lightBuffer = new RenderTarget2D(graphicsDevice, screenWidth, screenHeight, 1, SurfaceFormat.Color);
Color ambientLight = new Color(0.3f, 0.3f, 0.3f, 1.0f);
Draw:
// set the render target and clear it to the ambient lighting
graphicsDevice.SetRenderTarget(0, lightBuffer);
graphicsDevice.Clear(ambientLight);
// additively draw all of the lights onto this texture. The lights can be coloured etc.
spriteBatch.Begin(SpriteBlendMode.Additive);
foreach (var light in lights)
spriteBatch.Draw(lightFadeOffTexture, light.Area, light.Color);
spriteBatch.End();
// change render target back to the back buffer, so we are back to drawing onto the screen
graphicsDevice.SetRenderTarget(0, null);
// draw the old, non-lit, scene
DrawScene();
// multiply the light buffer texture with the scene
spriteBatch.Begin(SpriteBlendMode.Additive, SpriteSortMode.Immediate, SaveStateMode.None);
graphicsDevice.RenderState.SourceBlend = Blend.Zero;
graphicsDevice.RenderState.DestinationBlend = Blend.SourceColor;
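// With SourceBlend = Zero and DestinationBlend = SourceColor, the output is
// backBuffer * lightBuffer, i.e. the scene multiplied by the light buffer.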
spriteBatch.Draw(lightBuffer.GetTexture(), new Rectangle(0, 0, screenWidth, screenHeight), Color.White);
spriteBatch.End();
As far as I know, there is no way to do this without using your own custom shaders.
A custom shader for this would work like so:
Render your scene to a texture
Render your lights to another texture
As a post-process on a full-screen quad, sample the two textures; the result is Scene Texture * Light Texture.
This will output a lit scene, but it won't do any shadows. If you want shadows, I'd suggest following this excellent sample from Catalin Zima.
Perhaps using the same technique as in the BloomEffect component could be an idea.
Basically, the effect grabs the rendered scene, calculates a bloom image from the brightest areas in the scene, then blurs and combines the two. The result is highlighting areas depending on color.
The same approach could be used here. It will be simpler, since you won't have to calculate the bloom image based on the background, only on the position of the character.
You could even reuse this further to provide highlighting for other light sources as well, such as torches, magic effects and whatnot.