Blend a material onto another material in SceneKit using PBR (iOS)

I have already added a material to my SCNNode's geometry, and now I want to blend another texture onto it with the 'multiply' blend mode.
I have tried a lot but am unable to find a way to do this. Blending the texture via the multiply material property works with other lighting models, but not with PBR:
material.lightingModel = .physicallyBased
let image = UIImage(named: "1.PNG")
material.multiply.contents = image
material.multiply.contentsTransform = SCNMatrix4MakeScale(10, 10, 1)
material.multiply.wrapT = .repeat
material.multiply.wrapS = .repeat
material.multiply.intensity = 1.0
Any help on this?
Thanks
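One possible workaround (an untested sketch, not a confirmed fix): since the multiply property is ignored by the physically based lighting model, you can multiply a second texture into the diffuse color yourself with a surface shader modifier. The `multiplyTexture` uniform name and the `base.png` asset are assumptions; the `* 10.0` tiling factor mirrors the contentsTransform scale from the question.

```swift
import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = UIImage(named: "base.png") // assumed base texture

// Bind the second texture to a custom sampler uniform via key-value coding.
let multiplyProperty = SCNMaterialProperty(contents: UIImage(named: "1.PNG") as Any)
multiplyProperty.wrapS = .repeat
multiplyProperty.wrapT = .repeat
material.setValue(multiplyProperty, forKey: "multiplyTexture")

// Multiply the sampled color into the surface diffuse before lighting runs,
// so the effect survives the PBR lighting model.
let surfaceModifier = """
uniform sampler2D multiplyTexture;
#pragma body
_surface.diffuse.rgb *= texture2D(multiplyTexture, _surface.diffuseTexcoord * 10.0).rgb;
"""
material.shaderModifiers = [.surface: surfaceModifier]
```

The key names must match: SceneKit binds the SCNMaterialProperty set for "multiplyTexture" to the sampler uniform of the same name declared in the modifier.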

Related

How to apply an image or color texture to a MDLMesh box or plane?

I am working to learn some more about Metal and have been following some great tutorials (including Apple's built-in Metal-based renderer for use with ARKit, available when creating a new project as File -> New -> AR with the Metal content technology). One of the things I would like to do, however, is create a simple box or plane in my Metal renderer and texture it with either an image from my local bundle or even a simple color.
I am creating my MDLMesh within my renderer's loadAsset() method, and I can confirm this works to display the mesh tethered to a found ARAnchor, but the material isn't being applied properly.
// Use ModelIO to create a mesh as our object
let mesh = MDLMesh(planeWithExtent: vector3(0.33, 0.33, 0.075), segments: vector2(1, 1), geometryType: .triangles, allocator: metalAllocator)
The plane is, by default, seemingly rendered in a solid green color. I'd like to change the plane to be red. I've tried:
let scatteringFunction = MDLScatteringFunction()
let material = MDLMaterial(name: "material-test", scatteringFunction: scatteringFunction)
material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, color: UIColor.red.cgColor))
for submesh in mesh.submeshes as! [MDLSubmesh] {
    submesh.material = material
}
This seems to have no effect on the color of the plane. Additionally, my end goal is to have a transparent PNG image rendered as the texture of the plane, though similar attempts, like the following, also do not seem to do anything:
guard let textureURL = Bundle.main.url(forResource: "building", withExtension: "png") else { return }
let scatteringFunction = MDLScatteringFunction()
let material = MDLMaterial(name: "material-test", scatteringFunction: scatteringFunction)
material.setProperty(.init(name: "baseColor", semantic: .baseColor, url: textureURL))
for submesh in mesh.submeshes as! [MDLSubmesh] {
    submesh.material = material
}
Any advice on where I'm going wrong would be greatly appreciated!
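For what it's worth, a hedged sketch of the renderer side (the function name and structure here are illustrative, not the actual template code): Apple's Metal template only draws what its own shaders consume, so setting an MDLMaterial has no visible effect unless the renderer reads the property back out and passes it to the fragment shader itself.

```swift
import ModelIO
import CoreGraphics

// Pull the baseColor back out of each submesh's material; the renderer must
// then feed this value (or texture URL) to its fragment shader itself -
// the template won't do it automatically.
func baseColors(of mesh: MDLMesh) -> [CGColor] {
    var colors: [CGColor] = []
    guard let submeshes = mesh.submeshes else { return colors }
    for case let submesh as MDLSubmesh in submeshes {
        guard let property = submesh.material?.property(with: .baseColor) else { continue }
        if let cgColor = property.color {
            // A solid color: convert its components into a fragment uniform.
            colors.append(cgColor)
        } else if let url = property.urlValue {
            // A texture: load it (e.g. with MTKTextureLoader) and bind it
            // to a texture slot the fragment shader actually samples.
            _ = url
        }
    }
    return colors
}
```

The point is that MDLMaterial is just data; nothing happens until your render loop reads `property(with: .baseColor)` and wires the result into the pipeline.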

Turn on occlusion with lines for a mesh in SceneKit

I would like to display a mesh with ARKit and SceneKit on iOS, not RealityKit.
I am able to add a material to the mesh like so, which displays the mesh just fine:
let edgesMaterial = SCNMaterial()
edgesMaterial.fillMode = .lines
edgesMaterial.lightingModel = .constant
edgesMaterial.transparency = 1.0
edgesMaterial.diffuse.contents = UIColor.red
scnGeom.materials = [edgesMaterial]
However, this draws overlapping wireframe lines for surfaces that should be hidden. With RealityKit the fix is rather simple, by just adding:
arView.environment.sceneUnderstanding.options.insert(.occlusion)
How can I get the same effect with a SceneKit scene? Basically I want an occlusion material with lines. Any pointers?
UPDATE:
I can get something close, but not quite the same visualization, by using this:
sceneView.debugOptions.insert([.showWireframe])
let occMaterial = SCNMaterial()
occMaterial.colorBufferWriteMask = SCNColorMask(rawValue: 0)
scnGeom.materials = [occMaterial]
The lines appear white, and any object I place in the scene now renders its wireframe automatically as well.
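A possible alternative (an untested sketch; `scnGeom` stands in for the mesh geometry from the question, and `parentNode` for wherever the mesh is attached): render the mesh twice with two nodes. The first writes only depth via an occlusion material, the second draws the wireframe, so lines behind surfaces fail the depth test and disappear, without touching the debug options.

```swift
import SceneKit
import UIKit

let scnGeom = SCNSphere(radius: 0.1) // stand-in for the mesh geometry
let parentNode = SCNNode()

// Pass 1: occlusion material - writes depth but no color.
let occlusionMaterial = SCNMaterial()
occlusionMaterial.colorBufferWriteMask = []
occlusionMaterial.writesToDepthBuffer = true
scnGeom.materials = [occlusionMaterial]
let occlusionNode = SCNNode(geometry: scnGeom)
occlusionNode.renderingOrder = -1 // make sure depth is laid down first

// Pass 2: wireframe on a copy of the geometry (materials live on the
// geometry, so the second pass needs its own copy).
let edgesGeom = scnGeom.copy() as! SCNGeometry
let edgesMaterial = SCNMaterial()
edgesMaterial.fillMode = .lines
edgesMaterial.lightingModel = .constant
edgesMaterial.diffuse.contents = UIColor.red
edgesGeom.materials = [edgesMaterial]
let edgesNode = SCNNode(geometry: edgesGeom)

// Both nodes share the same transform under the parent.
parentNode.addChildNode(occlusionNode)
parentNode.addChildNode(edgesNode)
```

This is the same idea as RealityKit's `.occlusion` scene-understanding option, done by hand: depth from the solid pass hides the wireframe lines of back-facing geometry.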

Normal mapping in SceneKit

I am trying to add a normal map to a 3D model in Swift using SCNMaterial properties. The diffuse property works, but no other property, including the normal property, has any visible effect on screen. When I debug to check whether the node's material contains the normal property, it shows the property exists with the image I added.
I have also checked whether the normal image I am using is correct in the SceneKit editor, where it works fine.
I have added the code that I am using.
let node = SCNNode()
node.geometry = SCNSphere(radius: 0.1)
node.geometry!.firstMaterial!.diffuse.contents = UIColor.lightGray
node.geometry!.firstMaterial!.normal.contents = UIImage(named: "normal")
node.position = SCNVector3(0,0,0)
sceneView.scene.rootNode.addChildNode(node)
This is the output I am getting
I am expecting something like this
I got the solution: since I had not enabled default lighting, there was no light in the scene. I added this to the code:
sceneView.autoenablesDefaultLighting = true
Given the screenshot, it seems there is no lighting in the scene, or the material does not respond to lighting, since the sphere is not shaded. For a normal map to work, lighting has to be taken into account, because the effect depends on the light direction. Have you tried creating an entirely new SCNMaterial and playing with its properties? (e.g. https://developer.apple.com/documentation/scenekit/scnmaterial/lightingmodel looks interesting)
I would try setting
node.geometry!.firstMaterial!.lightingModel = .physicallyBased
Try this.
let scene = SCNScene()
let sphere = SCNSphere(radius: 0.1)
let sphereMaterial = SCNMaterial()
sphereMaterial.diffuse.contents = UIColor.lightGray
sphereMaterial.normal.contents = UIImage(named: "normal.png")
let sphereNode = SCNNode()
sphereNode.geometry = sphere
sphereNode.geometry?.materials = [sphereMaterial]
sphereNode.position = SCNVector3(0.5,0.1,-1)
scene.rootNode.addChildNode(sphereNode)
sceneView.scene = scene

Can I make a shadow visible through a transparent object with SceneKit and ARKit?

I made a transparent object with SceneKit and linked it with ARKit.
I created a shadow with a lighting setup, but the shadow is not visible through the transparent object.
I made a plane, placed the object on it, and pointed the light at the transparent object.
The shadow appears behind the object but cannot be seen through it.
Here's the code that creates the shadow:
let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.shadowRadius = 200
light.shadowColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.3)
light.shadowMode = .deferred
let constraint = SCNLookAtConstraint(target: model)
lightNode = SCNNode()
lightNode!.light = light
lightNode!.position = SCNVector3(model.position.x + 10, model.position.y + 30, model.position.z+30)
lightNode!.eulerAngles = SCNVector3(45.0, 0, 0)
lightNode!.constraints = [constraint]
sceneView.scene.rootNode.addChildNode(lightNode!)
And the code below creates a floor under the bottle:
let floor = SCNFloor()
floor.reflectivity = 0
let material = SCNMaterial()
material.diffuse.contents = UIColor.white
material.colorBufferWriteMask = SCNColorMask(rawValue:0)
floor.materials = [material]
self.floorNode = SCNNode(geometry: floor)
self.floorNode!.position = SCNVector3(x, y, z)
self.sceneView.scene.rootNode.addChildNode(self.floorNode!)
I think this can be solved with a simple property, but I can't figure out which one.
How can I solve the problem?
A known issue with deferred shading is that it doesn't work with transparency, so you may have to remove that line and use the default forward shading again. That said, the "simple property" you are looking for is the .renderingOrder property on SCNNode. Set it to 99, for example. Normally the rendering order doesn't matter, because the z-buffer is used to determine which pixels are in front of others. For the shadow to show up through the transparent part of the object, you need to make sure the object is rendered last.
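Put together, the two changes might look like this (a sketch; `model` stands in for the transparent bottle node from the question):

```swift
import SceneKit

let light = SCNLight()
light.type = .directional
light.castsShadow = true
// Deferred shadows don't combine with transparency - use forward shading.
light.shadowMode = .forward

let model = SCNNode() // stand-in for the transparent bottle node
// Render the transparent object after everything else, so the shadow
// already in the frame shows through its transparent pixels.
model.renderingOrder = 99
```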
On a different note, assuming you used some of the material settings I posted on your other question, try setting the shininess value to something like 0.4.
Note that this will still create a shadow as if the object were not transparent at all, so it won't create a darker shadow for the label and cap. For additional realism you could opt to fake the shadow entirely, as in using a texture for the shadow and dropping that on a plane which you rotate and skew as needed. For even more realism, you could fake the caustics that way too.
You may also want to add a reflection map to the reflective property of the material. This is almost the same as a texture map but in grayscale, where the label and cap are dark gray (not very reflective) and the glass portion is a lighter gray (else it will look like the label is on the inside of the glass). Last tip: use a Shell modifier (that's what it's called in 3ds Max, anyway) to give the glass model some thickness.
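The fake-shadow idea above can be sketched like this (all asset names and dimensions are hypothetical):

```swift
import SceneKit
import UIKit

// A plane carrying a pre-rendered shadow texture, laid flat under the bottle.
let shadowPlane = SCNPlane(width: 0.3, height: 0.3)
let shadowMaterial = SCNMaterial()
shadowMaterial.diffuse.contents = UIImage(named: "fakeShadow.png") // assumed asset
shadowMaterial.lightingModel = .constant // scene lights shouldn't affect the fake shadow
shadowPlane.materials = [shadowMaterial]

let shadowNode = SCNNode(geometry: shadowPlane)
shadowNode.eulerAngles.x = -.pi / 2              // rotate the plane to lie on the floor
shadowNode.position = SCNVector3(0, 0.001, 0)    // just above the floor to avoid z-fighting
```

Skewing and darkening the texture to match the light direction is then an art task rather than a rendering one.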

How do you play a video with alpha channel using AVFoundation?

I have an AR application which uses SceneKit and imports a video onto the scene using AVPlayer, adding it as an SKVideoNode in a SpriteKit scene that is used as a material's contents.
The video is visible as it is supposed to be, but the transparency in the video is not preserved.
Code as follows:
let spriteKitScene = SKScene(size: CGSize(width: self.sceneView.frame.width, height: self.sceneView.frame.height))
spriteKitScene.scaleMode = .aspectFit
guard let fileURL = Bundle.main.url(forResource: "Triple_Tap_1", withExtension: "mp4") else {
    return
}
let videoPlayer = AVPlayer(url: fileURL)
videoPlayer.actionAtItemEnd = .none
let videoSpriteKitNode = SKVideoNode(avPlayer: videoPlayer)
videoSpriteKitNode.position = CGPoint(x: spriteKitScene.size.width / 2.0, y: spriteKitScene.size.height / 2.0)
videoSpriteKitNode.size = spriteKitScene.size
videoSpriteKitNode.yScale = -1.0
videoSpriteKitNode.play()
spriteKitScene.backgroundColor = .clear
spriteKitScene.addChild(videoSpriteKitNode)
let background = SCNPlane(width: CGFloat(2), height: CGFloat(2))
background.firstMaterial?.diffuse.contents = spriteKitScene
let backgroundNode = SCNNode(geometry: background)
backgroundNode.position = position
backgroundNode.constraints = [SCNBillboardConstraint()]
backgroundNode.rotation.z = 0
self.sceneView.scene.rootNode.addChildNode(backgroundNode)
// Create a transform with a translation of 0.2 meters in front of the camera.
var translation = matrix_identity_float4x4
translation.columns.3.z = -0.2
let transform = simd_mul((self.session.currentFrame?.camera.transform)!, translation)
// Add a new anchor to the session.
let anchor = ARAnchor(transform: transform)
self.sceneView.session.add(anchor: anchor)
What would be the best way to implement transparency for the Triple_Tap_1 video in this case?
I have gone through some Stack Overflow questions on this topic, and the only solution I found was a KittyBoom repository created back in 2013, using Objective-C.
I'm hoping the community can reveal a better solution to this problem. The GPUImage library is not something I could get to work.
I've come up with two ways of making this possible. Both utilize surface shader modifiers. Detailed information on shader modifiers can be found in the Apple Developer Documentation.
Here's an example project I've created.
1. Masking
You would need to create another video that represents a transparency mask. In that video, black = fully opaque, white = fully transparent (or any other way you would like to represent transparency; you would just need to tweak the surface shader).
Create an SKScene with this video, just like you do in the code you provided, and put it into material.transparent.contents (the same material you put the diffuse video contents into):
let spriteKitOpaqueScene = SKScene(...)
let spriteKitMaskScene = SKScene(...)
... // creating SKVideoNodes and AVPlayers for each video etc
let material = SCNMaterial()
material.diffuse.contents = spriteKitOpaqueScene
material.transparent.contents = spriteKitMaskScene
let background = SCNPlane(...)
background.materials = [material]
Add a surface shader modifier to the material. It is going to "convert" the black color from the mask video (well, actually the red color, since we only need one color component) into alpha:
let surfaceShader = "_surface.transparent.a = 1 - _surface.transparent.r;"
material.shaderModifiers = [ .surface: surfaceShader ]
That's it! Now the white color on the masking video is going to be transparent on the plane.
However, you would have to take extra care to synchronize these two videos, since the AVPlayers will probably get out of sync. Sadly, I didn't have time to address that in my example project (yet; I will get back to it when I have time). Look into this question for a possible solution.
Pros:
No artifacts (if synchronized)
Precise
Cons:
Requires two videos instead of one
Requires synchronization of the AVPlayers
2. Chroma keying
You would need a video that has a vibrant background color representing the parts that should be transparent. Usually green or magenta is used.
Create an SKScene for this video like you normally would and put it into material.diffuse.contents.
Add a chroma key surface shader modifier, which will cut out the color of your choice and make those areas transparent. I've borrowed this shader from GPUImage, and I don't really know how it actually works, but it seems to be explained in this answer.
let surfaceShader =
"""
uniform vec3 c_colorToReplace = vec3(0, 1, 0);
uniform float c_thresholdSensitivity = 0.05;
uniform float c_smoothing = 0.0;
#pragma transparent
#pragma body
vec3 textureColor = _surface.diffuse.rgb;
float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
float maskCb = 0.5647 * (c_colorToReplace.b - maskY);
float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
float Cr = 0.7132 * (textureColor.r - Y);
float Cb = 0.5647 * (textureColor.b - Y);
float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));
float a = blendValue;
_surface.transparent.a = a;
"""
material.shaderModifiers = [ .surface: surfaceShader ]
To set the uniforms, use the setValue(_:forKey:) method on the material:
let vector = SCNVector3(x: 0, y: 1, z: 0) // represents the float RGB components
material.setValue(vector, forKey: "c_colorToReplace")
material.setValue(0.3 as Float, forKey: "c_smoothing")
material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")
The as Float part is important; otherwise Swift will cast the value as a Double and the shader will not be able to use it.
But to get precise masking from this, you would really have to tinker with the c_smoothing and c_thresholdSensitivity uniforms. In my example project I ended up with a little green rim around the shape, but maybe I just didn't use the right values.
Pros:
Only one video required
Simple setup
Cons:
Possible artifacts (a green rim around the border)
