I have used shader modifiers on a plane, but it's not working. Can anyone suggest how to solve it?
let myShaderFragment = "#pragma transparent\n" + "_output.color.a = 0.0;"
let myShaderSurface = "#pragma transparent\n" + "_surface.diffuse.a = 0.0;"
material.shaderModifiers = [SCNShaderModifierEntryPoint.fragment: myShaderFragment,
                            SCNShaderModifierEntryPoint.surface: myShaderSurface]
The "SceneKit: What's New" session from WWDC 2017 explains how to do that.
For the plane, use a material with constant as its lightingModel; it's the cheapest one.
Give this material writesToDepthBuffer = true and colorBufferWriteMask = [] (an empty option set). That way the plane will write to the depth buffer, but won't draw anything on screen.
Set the light's shadowMode to deferred so that shadows are not applied while rendering the objects themselves, but as a final post-process.
There's now a dedicated lighting model (SCNLightingModelShadowOnly) to render only shadows.
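Putting those pieces together, a minimal sketch (assuming a scene that already has a `floorNode` to receive the shadows; names are placeholders):

```swift
import SceneKit

// An invisible material that only receives shadows.
let shadowPlaneMaterial = SCNMaterial()
shadowPlaneMaterial.lightingModel = .constant      // cheapest lighting model
shadowPlaneMaterial.writesToDepthBuffer = true     // still occludes correctly
shadowPlaneMaterial.colorBufferWriteMask = []      // but draws no color at all
floorNode.geometry?.materials = [shadowPlaneMaterial]

// Shadows must be rendered as a deferred post-process for this to work.
let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.shadowMode = .deferred
light.shadowColor = UIColor(white: 0, alpha: 0.5)

// Alternatively (iOS 11+), use the dedicated shadow-only lighting model:
// shadowPlaneMaterial.lightingModel = .shadowOnly
```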
I'm using Metal to draw some lines. My drawing canvas has a texture attached through an MTLRenderPassDescriptor, and blending is enabled in the MTLRenderPipelineDescriptor with alphaBlendOperation = .max.
renderPassDescriptor = MTLRenderPassDescriptor()
let attachment = renderPassDescriptor?.colorAttachments[0]
attachment?.texture = self.texture
attachment?.loadAction = .load
attachment?.storeAction = .store

let rpd = MTLRenderPipelineDescriptor()
rpd.colorAttachments[0].pixelFormat = .rgba8Unorm
let colorAttachment = rpd.colorAttachments[0]
colorAttachment.isBlendingEnabled = true
colorAttachment.rgbBlendOperation = .max
colorAttachment.alphaBlendOperation = .max
I can change the brush properties (size, opacity, hardness/blur). The first two brushes work really well, as in the image below.
However, I get one strange behavior when I use the blurred brush with faded sides: where lines connect, the faded areas don't blend as expected, and a small empty line appears at the connection. The image below illustrates the issue; compare the single line and single point with the connections and you can see it very clearly.
Blending should pick either the alpha already in the destination texture or the brush alpha, but when I tap the second and third points it produces an empty line instead of choosing one of them, as if alpha were forced to zero in those areas.
This is my faded brush. You can see there is a gradient of color, but I don't know whether there is a problem with it.
Please share any ideas you have for solving this.
For whatever reason I am having issues with alpha blending in Metal. I am drawing to an MTKView, and for every pipeline that I create I do the following:
descriptor.colorAttachments[0].blendingEnabled = YES;
descriptor.colorAttachments[0].rgbBlendOperation = MTLBlendOperationAdd;
descriptor.colorAttachments[0].alphaBlendOperation = MTLBlendOperationAdd;
descriptor.colorAttachments[0].sourceRGBBlendFactor = MTLBlendFactorSourceAlpha;
descriptor.colorAttachments[0].sourceAlphaBlendFactor = MTLBlendFactorSourceAlpha;
descriptor.colorAttachments[0].destinationRGBBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
descriptor.colorAttachments[0].destinationAlphaBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
However, for whatever reason that is not causing alpha blending to happen. You can even check in the frame debugger: vertices with an alpha of 0 are drawn black rather than transparent.
One thought I had is that some geometry ends up on the exact same z plane, so if alpha blending doesn't work on the same z plane that might cause an issue. But I don't think that is a thing.
Why is alpha blending not working?
I am hoping they blend as if they were transparent glass, like this.
Alpha blending is an order-dependent transparency technique. This means that the (semi-)transparent objects cannot be rendered in any arbitrary order as is the case for (more expensive) order-independent transparency techniques.
1. Make sure your transparent 2D objects (e.g., circle, rectangle, etc.) have different depth values. (This way you can define the draw ordering yourself. Otherwise the draw ordering depends on the implementation of the sorting algorithm and the initial ordering before sorting.)
2. Sort these 2D objects based on their depth value from back to front.
3. Draw the 2D objects from back to front (painter's algorithm) using alpha blending. (Of course, your 2D objects need an alpha value < 1 to actually see some blending.)
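Those steps can be sketched like this (a hedged example; `Sprite` and its `depth` convention are assumed placeholders for your own scene types, and the actual draw call depends on your buffers):

```swift
import Metal

// A transparent 2D object with an explicit depth value.
struct Sprite {
    var depth: Float   // larger = farther from the camera (assumed convention)
    var alpha: Float   // must be < 1 to actually see blending
}

func drawTransparent(_ sprites: [Sprite], using encoder: MTLRenderCommandEncoder) {
    // Painter's algorithm: sort back to front, then draw farthest first.
    for sprite in sprites.sorted(by: { $0.depth > $1.depth }) {
        // Encode the draw call for `sprite` here, e.g.:
        // encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 6)
        _ = sprite
    }
}
```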
Your blend state for alpha blending is correct:
// The blend formula is defined as:
// (source.rgb * sourceRGBBlendFactor ) rgbBlendOperation (destination.rgb * destinationRGBBlendFactor )
// (source.a * sourceAlphaBlendFactor) alphaBlendOperation (destination.a * destinationAlphaBlendFactor)
// <=>
// (source.rgba * source.a) + (destination.rgba * (1-source.a))
descriptor.colorAttachments[0].blendingEnabled = YES;
descriptor.colorAttachments[0].rgbBlendOperation = MTLBlendOperationAdd;
descriptor.colorAttachments[0].alphaBlendOperation = MTLBlendOperationAdd;
descriptor.colorAttachments[0].sourceRGBBlendFactor = MTLBlendFactorSourceAlpha;
descriptor.colorAttachments[0].sourceAlphaBlendFactor = MTLBlendFactorSourceAlpha;
descriptor.colorAttachments[0].destinationRGBBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
descriptor.colorAttachments[0].destinationAlphaBlendFactor = MTLBlendFactorOneMinusSourceAlpha;
Important: please note that this question is about VECTOR map. Not height map.
I'm trying to implement vector displacement in SceneKit, as described in Apple's presentation:
https://www.youtube.com/watch?v=uli814Qugm8&app=desktop
Apple presentation on Scenekit vector displacement
My code is:
material?.diffuse.contents = UIImage(named: "\(materialFilePrefix)-albedo.jpg")
material?.displacement.contents = UIImage(named: "\(materialFilePrefix)-displacement.exr")
material?.displacement.textureComponents = .all
My Xcode project:
But I don't get the displacement... Anything wrong with the code?
From checking out SCNMaterial's header and the presentation, it looks like you need to enable tessellation on your node's geometry for displacement to work. That would look like this:
let tessellator = SCNGeometryTessellator()
tessellator.edgeTessellationFactor = 2   // may not need this line (default is 1), or may need to set it higher to get a smooth result
tessellator.insideTessellationFactor = 2 // ditto
sphereNode.geometry?.tessellator = tessellator
I want to make a glass effect in SceneKit.
I searched Google, but there's no complete answer, so I'm looking for a SceneKit warrior who can solve my problem clearly.
Here is an image of what I'm trying to make.
It should look real: the glass effect, reflection, and shadow are the main points here.
I already have the .obj and .dae files.
Is there anyone who can help?
Create an SCNMaterial, configure the following properties, and assign it to the bottle geometry of an SCNNode:
.lightingModel = .blinn
.transparent.contents = // an image/texture whose alpha channel defines
                        // the area of partial transparency (the glass)
                        // and the opaque part (the label)
.transparencyMode = .dualLayer
.fresnelExponent = 1.5
.isDoubleSided = true
.specular.contents = UIColor(white: 0.6, alpha: 1.0)
.diffuse.contents = // texture image including the label (the rest can be gray)
.shininess = // somewhere between 25 and 100
.reflective.contents = // glass won't look good unless it has something
                       // to reflect, so configure this as well:
                       // at least a gray color with value 0.7,
                       // but preferably an image
Depending on what else is in your scene, the background, and the lighting used, you will probably have to tune the values above to get the desired result. If you want a bottle without the label, set the .transparency scalar property (to a value below 1.0) instead of using the .transparent material property.
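A minimal sketch assembling the properties above (the texture names and `bottleNode` are placeholders for your own assets and scene):

```swift
import SceneKit

let glass = SCNMaterial()
glass.lightingModel = .blinn
glass.transparencyMode = .dualLayer
glass.fresnelExponent = 1.5
glass.isDoubleSided = true
glass.specular.contents = UIColor(white: 0.6, alpha: 1.0)
glass.shininess = 50                                        // somewhere between 25 and 100
glass.diffuse.contents = UIImage(named: "bottle-diffuse")   // includes the label
glass.transparent.contents = UIImage(named: "bottle-alpha") // alpha mask: glass vs. label
glass.reflective.contents = UIImage(named: "environment")   // or UIColor(white: 0.7, alpha: 1.0)

bottleNode.geometry?.firstMaterial = glass
```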
My ultimate goal is to have an SCNNode representing an image floating in space. This is more or less easily accomplished with the code I have below, but the problem is that the back side of the image isn't rendered and is thus transparent from the back. I want to display a different image on the back so that there is something to see from both sides. The isDoubleSided property doesn't work here because it simply mimics what's on the front. Any ideas? I looked into creating my own geometry from sources and elements, but it seemed very complex for what should be really simple.
My current code:
private func createNode() -> SCNNode {
    let scaleFactor = image.size.width / 0.2
    let width = image.size.width / scaleFactor
    let height = image.size.height / scaleFactor
    let geometry = SCNPlane(width: width, height: height)
    let material = SCNMaterial()
    material.diffuse.contents = image
    geometry.materials = [material]
    return SCNNode(geometry: geometry)
}
Thanks!
Since you want different images, you need to use different materials. SceneKit allows specifying a material per geometry element, and SCNPlane has only one element; that's why isDoubleSided just mirrors the image on the back side. You have two options here:
Create two SCNPlane geometries, orient them back to back, and assign a different image to each one's geometry.firstMaterial.diffuse.contents.
Create a custom SCNGeometry from an SCNGeometrySource (the plane's 4 vertices) and two SCNGeometryElements (one per side: 2 triangles, 6 indices each), and assign an array of two materials (with different images) to the geometry.
The first option is easier, but looks more like a workaround.
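A sketch of the first option (the image parameters are placeholders; sizing is up to the caller):

```swift
import SceneKit

// Builds a node showing `front` on one side and `back` on the other
// by placing two planes back to back inside a container node.
func makeDoubleSidedNode(front: UIImage, back: UIImage,
                         width: CGFloat, height: CGFloat) -> SCNNode {
    let frontPlane = SCNPlane(width: width, height: height)
    frontPlane.firstMaterial?.diffuse.contents = front

    let backPlane = SCNPlane(width: width, height: height)
    backPlane.firstMaterial?.diffuse.contents = back

    let frontNode = SCNNode(geometry: frontPlane)
    let backNode = SCNNode(geometry: backPlane)
    // Rotate the back plane 180° around Y so it faces the opposite direction.
    backNode.eulerAngles.y = .pi

    let container = SCNNode()
    container.addChildNode(frontNode)
    container.addChildNode(backNode)
    return container
}
```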