I'm currently working on an ARKit project where I would like to darken the actual camera feed so the objects in my 3D scene stand out more.
Two solutions I've found so far:
A) Manually apply a CIFilter to the camera frames and set the filtered images as the background of the SceneKit scene, as answered in this SO post.
The problem here is that the frame rate tanks significantly.
B) Set a background color like so:
sceneView.scene.background.contents = UIColor(white: 0.0, alpha: 0.2)
Sadly, the background is rendered as opaque regardless of the color's alpha, so no matter what alpha I set I can't see any of the camera feed.
Can anyone think of a different trick to darken the camera feed?
Your option B doesn't work for two reasons:
the scene view is opaque, so there's nothing behind it for a partially transparent background color to blend with.
sceneView.scene.background is what actually displays the camera image, so if you set it to a color you're not displaying the camera image at all anymore.
Some other options (mostly untested) you might look into:
As referenced from the answer you linked, use SCNTechnique to set up multipass rendering. On the first pass, render the whole scene with the excludeCategoryMask (and your scene contents) set up to render nothing but the background, using a shader that dims (or blurs or whatever) all pixels. On the second pass, render only the node(s) you want to appear without that shader (use a simple pass-through shader).
Keep your option B, but make the scene view non-opaque. Set the view's backgroundColor (not the scene's background) to a solid color, and set the transparency of the scene background to fade out the camera feed against that color.
Use geometry to create a "physical" backdrop for your scene — e.g. an SCNPlane of large size, placed as a child of the camera at some fixed distance that's much farther than any other scene content. Give the plane a dark, partially transparent material.
Use multiple views — the ARSCNView, made non-opaque and with a clear background, and another (not necessarily a SceneKit view) that just shows the camera feed. Mess with the other view (or drop in other fun things like UIVisualEffectView) to obscure the camera feed.
File a bug with Apple about sceneView.background not getting the full set of shading customization options that nodes and materials get (filters, shader modifiers, full shader programs, etc.), without which customizing the background is much more difficult than customizing other aspects of the scene.
I achieved this effect by creating an SCNNode with an SCNSphere geometry and keeping it attached to the camera using ARSCNView.pointOfView.
override func viewDidLoad() {
    super.viewDidLoad()
    let sphereFogNode = makeSphereNode()
    // pointOfView is the camera node that ARSCNView manages for us.
    arView.pointOfView!.addChildNode(sphereFogNode)
    view.addSubview(arView)
}

private func makeSphereGeom() -> SCNSphere {
    let sphere = SCNSphere(radius: 5)
    let material = SCNMaterial()
    // Semi-transparent white washes the feed out like fog;
    // a dark color (e.g. UIColor(white: 0.0, alpha: 0.7)) darkens it instead.
    material.diffuse.contents = UIColor(white: 1.0, alpha: 0.7)
    // Double-sided so the inside of the sphere, which the camera sees, is rendered.
    material.isDoubleSided = true
    sphere.materials = [material]
    return sphere
}

private func makeSphereNode() -> SCNNode {
    SCNNode(geometry: makeSphereGeom())
}
Clipping Outside Sphere
This darkens the camera feed along with anything outside the bounds of the sphere. Hit testing (ARFrame.hitTest) does not respect the sphere boundary, so you can still receive results from outside the sphere.
Anything outside the sphere is seen through the sphere's semi-transparent material, so content that is itself fully opaque appears semi-transparent when it lies outside the sphere.
In my test, the white part is the portion of a plane that lies inside the sphere and the grey part is the portion outside it; the plane itself is solid white and non-transparent. I tried using the SCNScene.fog* properties to clip SceneKit content outside the sphere, but fog doesn't occlude rendered content, it only affects its appearance. SCNCamera.zFar doesn't work either, because it clips based on Z-distance rather than the straight-line distance between the camera and the target.
Just make your sphere big enough and everything will look fine.
For those who want to implement rickster's option "Use geometry to create a 'physical' backdrop for your scene", here is my code (the plane and its transparency are set up in .scnassets):
// The plane named "background" is defined in the .scnassets scene.
guard let backgroundPlane = sceneView.scene.rootNode.childNode(withName: "background", recursively: false) else { return }
// Re-parent the plane to the camera so it always fills the view, 2 m in front of the point of view.
backgroundPlane.removeFromParentNode()
backgroundPlane.position = SCNVector3Make(0, 0, -2)
sceneView.pointOfView?.addChildNode(backgroundPlane)
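If you prefer to build the backdrop entirely in code rather than in .scnassets, a minimal sketch could look like this (the plane size, distance, and transparency values are assumptions to tune for your scene):
// Build the dimming backdrop programmatically instead of loading it from .scnassets.
let planeGeometry = SCNPlane(width: 20, height: 20)
let planeMaterial = SCNMaterial()
planeMaterial.diffuse.contents = UIColor.black
planeMaterial.transparency = 0.4          // 0 = invisible, 1 = solid black; adjust the dimming strength
planeMaterial.writesToDepthBuffer = false // so the plane never hides scene content via the depth test
planeGeometry.materials = [planeMaterial]
let backdropNode = SCNNode(geometry: planeGeometry)
backdropNode.position = SCNVector3Make(0, 0, -2) // 2 m in front of the camera
sceneView.pointOfView?.addChildNode(backdropNode)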
So, to be clear on my goals, since I don't have any code to share: let's say I have an SCNNode positioned between the camera and another SCNNode. The first node is an SCNBox but has no texture, so the second node can be seen behind it. I want to give the first node a transparent material, but have it also occlude all nodes behind it, as though it were opaque. In a regular scene, this would mean you could see the scene background color (black, perhaps) through it, but I'm planning on doing this in ARKit, where it makes more sense, because you'd simply see the real world behind it.
You can use a material with a clear color:
extension SCNMaterial {
    convenience init(color: UIColor) {
        self.init()
        diffuse.contents = color
    }
    convenience init(image: UIImage) {
        self.init()
        diffuse.contents = image
    }
}
let clearMaterial = SCNMaterial(color: .clear)
boxNode.materials = [clearMaterial]
I've tested my idea from the comments and it seems to work, though not perfectly; I'll expand on that point later.
To support the rendering process, SceneKit uses a depth buffer and renders a point only if it would be in front of what is already stored in that buffer. So we have to tell SceneKit to render your see-through cube first and all the other nodes after it: leave your cube node's renderingOrder property at 0 (the default value), then set all the other nodes' renderingOrder to a higher value, e.g. 1, 10, etc. Normally, for transparent objects, you don't want to write to the depth buffer so that objects behind them remain visible, but that's not what we want here, so leave your cube material's writesToDepthBuffer property set to true (the default value). The last thing to do is to make your cube transparent. You can use the default material and then add
cube.geometry?.firstMaterial?.transparency = 0.00000001
As I've said before, this method is not perfect and feels more like a workaround... but it works. The reason we don't set the transparency to exactly 0 is that, if we do, it's as if the cube weren't there at all: fully transparent pixels are not written to the depth buffer.
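Putting those pieces together, a minimal sketch of the setup might look like this (sceneView, the geometry sizes, and the positions are assumptions for illustration):
// Hedged sketch of the renderingOrder + near-zero transparency trick.
// "occluderNode" is the invisible box, "hiddenNode" is the content it should hide.
let occluderNode = SCNNode(geometry: SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0))
occluderNode.position = SCNVector3Make(0, 0, -0.3)                 // between the camera and the hidden content
occluderNode.renderingOrder = 0                                    // default: drawn first
occluderNode.geometry?.firstMaterial?.writesToDepthBuffer = true   // default: still fills the depth buffer
occluderNode.geometry?.firstMaterial?.transparency = 0.00000001    // visually invisible, but still rendered

let hiddenNode = SCNNode(geometry: SCNSphere(radius: 0.05))
hiddenNode.position = SCNVector3Make(0, 0, -0.5)                   // farther from the camera
hiddenNode.renderingOrder = 10                                     // drawn after, so the depth test can reject it

sceneView.scene.rootNode.addChildNode(occluderNode)
sceneView.scene.rootNode.addChildNode(hiddenNode)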
I am writing an iOS app for chess programming using SpriteKit in Swift. While adding the texture of a chess piece to an existing coloured SKSpriteNode, the SKView's background colour overwrites the SKSpriteNode's colour.
Every chessboard square is an SKSpriteNode coloured green or white. The expected behaviour is for the square to retain its colour (green or white) even after a texture (a chess piece) is added on top of it, i.e. the texture's background colour must dynamically match the chessboard square's current colour (green or white).
The SKView's background colour is set to brown:
backgroundColor = SKColor.brownColor()
The SKSpriteNodes for the chessboard squares (green and white) are created inside a nested loop:
boardSquare.square = SKSpriteNode(color: currentColor, size: boardSquare.squareSize)
Code to add the optional chess piece texture; getPieceTexture returns an optional SKTexture? for the initial square positions:
boardSquare.square.texture = getPieceTexture(boardSquare)
I could not find a way to get past the issue I mentioned at the beginning. I have also attached a screenshot of the chess board which shows the chessboard squares, the chess pieces and the SKView's background colour. Can somebody please help me find a solution to this issue?
Kindly excuse me if there are any errors in this post. This is my first post.
Thanks,
ArtBajji
Partial screenshot of the ChessBoard
I created a new SKSpriteNode with the piece texture and placed the new node at the same position as the chess board square node. The background colour of the chess board square did not change to the SKView's background colour but stayed the square's own colour. Hope this helps others facing similar issues. Thanks.
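A minimal sketch of that workaround, reusing the question's boardSquare and getPieceTexture (these names and the zPosition offset are assumptions about the surrounding code):
// Draw the piece as a separate node on top of the square instead of
// assigning the texture to the square itself.
if let pieceTexture = getPieceTexture(boardSquare) {
    let pieceNode = SKSpriteNode(texture: pieceTexture, size: boardSquare.squareSize)
    pieceNode.position = boardSquare.square.position        // same position as the square
    pieceNode.zPosition = boardSquare.square.zPosition + 1  // draw above the square
    boardSquare.square.parent?.addChild(pieceNode)          // add as a sibling of the square
}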
I am currently trying to set up a rotating ball in SceneKit. I have created the ball and applied a texture to it.
ballMaterial.diffuse.contents = UIImage(named: ballTexture)
ballMaterial.doubleSided = true
ballGeometry.materials = [ballMaterial]
The current ballTexture is a semi-transparent texture as I am hoping to see the back face roll around.
However I get some strange culling where only half of the back facing polygons are shown even though the doubleSided property is set to true.
Any help would be appreciated, thanks.
This happens because the effects of transparency are draw-order dependent. SceneKit doesn't know to draw the back-facing polygons of the sphere before the front-facing ones. (In fact, it can't really do that without reorganizing the vertex buffers on the GPU for every frame, which would be a huge drag on render performance.)
The vertex layout for an SCNSphere has it set up like the lat/long grid on a globe: the triangles render in order along the meridians from 0° to 360°, so depending on how the sphere is oriented with respect to the camera, some of the faces on the far side of the sphere will render before the nearer ones.
To fix this, you need to force the rendering order — either directly, or through the depth buffer. Here's one way to do that, using a separate material for the inside surface to illustrate the difference.
// add two balls, one a child of the other
let node = SCNNode(geometry: SCNSphere(radius: 1))
let node2 = SCNNode(geometry: SCNSphere(radius: 1))
scene.rootNode.addChildNode(node)
node.addChildNode(node2)
// cull back-facing polygons on the first ball
// so we only see the outside
let mat1 = node.geometry!.firstMaterial!
mat1.cullMode = .Back
mat1.transparent.contents = bwCheckers
// my "bwCheckers" uses black for transparent, white for opaque
mat1.transparencyMode = .RGBZero
// cull front-facing polygons on the second ball
// so we only see the inside
let mat2 = node2.geometry!.firstMaterial!
mat2.cullMode = .Front
mat2.diffuse.contents = rgCheckers
// sphere normals face outward, so to make the inside respond
// to lighting, we need to invert them
let shader = "_geometry.normal *= -1.0;"
mat2.shaderModifiers = [SCNShaderModifierEntryPointGeometry: shader]
(The shader modifier bit at the end isn't required — it just makes the inside material get diffuse shading. You could just as well use a material property that doesn't involve normals or lighting, like emission, depending on the look you want.)
You can also do this using a single node with a double-sided material by disabling writesToDepthBuffer, but that could also lead to undesirable interactions with the rest of your scene content — you might also need to mess with renderingOrder in that case.
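For that single-node variant, a rough sketch (in current Swift syntax; ballTexture is the question's asset name, and the renderingOrder caveat above still applies):
// One sphere with a double-sided material and no depth writes,
// so back-facing polygons aren't rejected by the depth test.
let sphere = SCNSphere(radius: 1)
let material = sphere.firstMaterial!
material.diffuse.contents = UIImage(named: ballTexture)
material.isDoubleSided = true
material.writesToDepthBuffer = false
let ballNode = SCNNode(geometry: sphere)
scene.rootNode.addChildNode(ballNode)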
macOS 10.13 and iOS 11 added SCNTransparencyMode.dualLayer, which, as far as I can tell, doesn't even require setting isDoubleSided to true (the documentation doesn't provide any information at all). So a simple solution that's working for me is:
ballMaterial.diffuse.contents = UIImage(named: ballTexture)
ballMaterial.transparencyMode = .dualLayer
ballGeometry.materials = [ballMaterial]
Background
I want to move an SCNNode and a UIView containing an image synchronously. The UIView with the UIImageView is positioned on the node, so that it looks like it is the texture of the SCNNode (a cube).
Code
let move:Float = 15
let moveCube: SCNAction = SCNAction.moveTo(SCNVector3Make(cube.position.x-move, cube.position.y, cube.position.z), duration: 1)
What I tried / How I do it right now
I animate the UIView using:
var move: Float = 15
var projMove = scnView.projectPoint(SCNVector3(x: move, y: 0, z: 0)) // converts the 3D coordinate into 2D screen space
UIView.animateWithDuration(1, delay: 0, options: .CurveEaseOut, animations: {
    self.myView.center.x = CGFloat(-projMove.x)
}, completion: { finished in
})
This works: the cube moves to the left, and the UIView moves as well.
But I don't think this code is really the best solution.
Question(s)
Is there a better way to let the cube move left, including the UIView?
Again, I want to move both at the same time, ideally with one code segment.
Can I set one surface's texture (e.g. the front's) to the image instead?
I want to set the image as only one side's texture.
Could I even set the overall texture to an image and then put the image above it using its alpha channel?
Adding to #2, I would like to set the cube's texture to a color and project the image above that color (it has alpha layers, so the color should still be visible).
Thanks in advance :)
Is there a better way to let the cube move left, including the UIView?
Yes. You can unproject the coordinates of the view and use this as a reference for the movement of the cube. See Unproject Point
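For example, a hedged sketch of that approach (current Swift syntax; it reuses the question's scnView, myView, and cube, and reuses the cube's current projected depth so the unprojection stays on its plane):
// Convert the view's 2D screen position into a 3D scene position and move the cube there.
let currentDepth = scnView.projectPoint(cube.position).z
let screenPoint = SCNVector3(x: Float(myView.center.x), y: Float(myView.center.y), z: currentDepth)
let scenePoint = scnView.unprojectPoint(screenPoint)
cube.runAction(SCNAction.move(to: scenePoint, duration: 1))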
Can I possibly set one surface's (the front e.g.) texture to the image instead?
Yes, simply set the diffuse channel of the material of the cube to your UIImage.
Could I even set the overall texture to an image and then put the image above it using it's alpha channel?
Maybe; I am not quite sure what you are talking about, would you mind expanding on that? If I understand it correctly, SpriteKit would be your best bet.
Here are updated answers for the comments:
I do use the unprojected points before projecting them in the SCNAction. And I meant more like moving both at once instead of a separate animation for each.
I don't think there is. You could animate a third, separate property and have its setter update both the view and the node. You can also use blocks, but in the end you cannot link the two directly.
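A minimal sketch of that idea, with a made-up sharedX property whose setter drives both objects (the coordinate mapping is an assumption; adapt it to your setup):
// One property that moves the node and the view together whenever it changes.
var sharedX: Float = 0 {
    didSet {
        cube.position.x = sharedX
        let projected = scnView.projectPoint(cube.position) // the 3D position in 2D screen space
        myView.center.x = CGFloat(projected.x)
    }
}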
Well, I want to set the image only to one side of the cube.
You can simply provide an array of 6 materials, one being your image and the other 5 a second filler material. You'll need to play with the order to find where the image needs to be in the array.
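For example, a rough sketch (the image name is a placeholder, and SCNBox materials are applied in the order front, right, back, left, top, bottom):
// One image material plus five plain materials on the cube's SCNBox geometry.
let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "faceImage") // placeholder asset name
let plainMaterial = SCNMaterial()
plainMaterial.diffuse.contents = UIColor.lightGray
// Index 0 is the front face here; reorder the array to put the image on another face.
cube.geometry?.materials = [imageMaterial, plainMaterial, plainMaterial,
                            plainMaterial, plainMaterial, plainMaterial]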
That relates to #2, I want to set the texture to a color and then set one side's texture to the image I want to use.
There are two ways for this. You can use a shader that will add your image on top of a solid color, or you can make a second cube that is slightly smaller (less than 1%), and make that cube the background color you want. Then, use a transparency image on the larger one.
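A hedged sketch of the second (nested cube) approach; the sizes, color, and image name are assumptions:
// An inner solid-color cube shows through the transparent parts of the image on the outer cube.
let outerBox = SCNBox(width: 1.0, height: 1.0, length: 1.0, chamferRadius: 0)
outerBox.firstMaterial?.diffuse.contents = UIImage(named: "overlayWithAlpha") // image with an alpha channel
let innerBox = SCNBox(width: 0.995, height: 0.995, length: 0.995, chamferRadius: 0) // slightly smaller
innerBox.firstMaterial?.diffuse.contents = UIColor.red // the background color you want to show through
let outerNode = SCNNode(geometry: outerBox)
let innerNode = SCNNode(geometry: innerBox) // same center, so it sits just behind every face
outerNode.addChildNode(innerNode)
scene.rootNode.addChildNode(outerNode)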
I'm using cocos2d to create a simple drawing app. I create a 32-bit texture in code in the shape of the brush I need (a circle) with a simple hardness gradient (alpha = 1 in the middle, falling to alpha = 0 near the edge). The texture is obviously square, so alpha is 0 outside the circle. The user touches the screen to draw, and the texture is repeated according to a separation constant.
I tried without alpha blending enabled and I get horrible results.
With (ccBlendFunc){GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA} I get boundaries appearing between each instance of the texture. This effect does not happen if I manually create each point by tapping instead of dragging.
And with (ccBlendFunc){GL_SRC_ALPHA, GL_ONE} I get good results, but at the edges the colors are added together: blue and green make cyan, red and green make yellow, and red and blue make purple.
How can I simply create a blend mode that works like Photoshop's standard (normal) blending?