SceneKit - vector/tangent displacement map - iOS

Important: please note that this question is about a VECTOR map, not a height map.
I'm trying to implement vector displacement in SceneKit, as described in Apple's presentation:
https://www.youtube.com/watch?v=uli814Qugm8&app=desktop
Apple presentation on SceneKit vector displacement
My code is:
material?.diffuse.contents = UIImage(named: "\(materialFilePrefix)-albedo.jpg")
material?.displacement.contents = UIImage(named: "\(materialFilePrefix)-displacement.exr")
material?.displacement.textureComponents = .all
My Xcode project: [screenshot omitted]
But I don't get any displacement. Is anything wrong with the code?

From checking out SCNMaterial’s header and the presentation, it looks like you might need to enable tessellation on your node’s geometry for displacement to work. That’d look like this:
let tessellator = SCNGeometryTessellator()
tessellator.edgeTessellationFactor = 2 // may not need this line (default is 1), or may need to set it higher to get a smooth result
tessellator.insideTessellationFactor = 2 // ditto
sphereNode.geometry?.tessellator = tessellator
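For completeness, a minimal sketch that puts the question's material setup and the tessellator together; sphereNode and materialFilePrefix are the placeholders from the question, and keep in mind that geometry tessellation requires the Metal renderer:
import SceneKit
import UIKit

let material = sphereNode.geometry?.firstMaterial
material?.diffuse.contents = UIImage(named: "\(materialFilePrefix)-albedo.jpg")
material?.displacement.contents = UIImage(named: "\(materialFilePrefix)-displacement.exr")
material?.displacement.textureComponents = .all   // treat RGB as a displacement vector

let tessellator = SCNGeometryTessellator()
tessellator.smoothingMode = .pnTriangles          // optional: smooths the subdivided surface
tessellator.edgeTessellationFactor = 4            // raise these until the displaced surface looks smooth
tessellator.insideTessellationFactor = 4
sphereNode.geometry?.tessellator = tessellator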

Related

Glass effect in SceneKit material

I want to make a glass effect in SceneKit.
I searched Google but there's no perfect answer.
So I'm looking for a SceneKit warrior who can solve my problem clearly.
Here's an image of what I'm trying to make.
It should look real.
The glass effect, reflection and shadow are the main points here.
I have the obj and dae files already.
So, is there anyone who can help me?
Create an SCNMaterial, configure the following properties, and assign it to the bottle geometry of an SCNNode:
.lightingModel = .blinn
.transparent.contents = // an image/texture whose alpha channel defines
// the area of partial transparency (the glass)
// and the opaque part (the label).
.transparencyMode = .dualLayer
.fresnelExponent = 1.5
.isDoubleSided = true
.specular.contents = UIColor(white: 0.6, alpha: 1.0)
.diffuse.contents = // texture image including the label (rest can be gray)
.shininess = // somewhere between 25 and 100
.reflective.contents = // glass won’t look good unless it has something
// to reflect, so also configure this as well.
// To at least a gray color with value 0.7
// but preferably an image.
Depending on what else is in your scene, the background, and the lighting used, you will probably have to tune the values above to get the desired results. If you want a bottle without the label, set the scalar .transparency property to a value below 1.0 instead of supplying an image to the .transparent property.
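For reference, here is a minimal sketch of that configuration in code; bottleNode and the texture file names are placeholders, not from the original question:
import SceneKit
import UIKit

let glass = SCNMaterial()
glass.lightingModel = .blinn
glass.transparent.contents = UIImage(named: "bottle-transparency.png") // alpha: glass vs. label
glass.transparencyMode = .dualLayer
glass.fresnelExponent = 1.5
glass.isDoubleSided = true
glass.specular.contents = UIColor(white: 0.6, alpha: 1.0)
glass.diffuse.contents = UIImage(named: "bottle-diffuse.png")          // includes the label
glass.shininess = 50                                                   // tune between 25 and 100
glass.reflective.contents = UIImage(named: "environment.jpg")          // something for the glass to reflect
bottleNode.geometry?.firstMaterial = glass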

Can we render shadow on a transparent Plane in SceneKit

I have used shader modifiers for the plane but it's not working. Can anyone suggest how to solve it?
let myShaderfragment = "#pragma transparent;\n" + "_output.color.a = 0.0;"
let myShaderSurface = "#pragma transparent;\n" + "_surface.diffuse.a = 0.0;"
material.shaderModifiers = [SCNShaderModifierEntryPoint.fragment : myShaderfragment, SCNShaderModifierEntryPoint.surface : myShaderSurface]
The SceneKit: What's New session from WWDC 2017 explains how to do that.
For the plane, use a material with constant as its lightingModel. It's the cheapest one.
This material will have writesToDepthBuffer set to true and colorBufferWriteMask set to [] (empty option set). That way the plane will write in the depth buffer, but won't draw anything on screen.
Set the light's shadowMode to deferred so that shadows are not applied when rendering the objects themselves, but as a final post-process.
There's now a dedicated lighting model (SCNLightingModelShadowOnly) that renders only shadows.
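Putting that together, a minimal sketch of the setup described above; floorNode and light are placeholder names:
import SceneKit

// Plane that receives shadows but draws nothing itself
let shadowMaterial = SCNMaterial()
shadowMaterial.lightingModel = .constant          // cheapest lighting model
shadowMaterial.writesToDepthBuffer = true
shadowMaterial.colorBufferWriteMask = []          // write depth, but no color
floorNode.geometry?.firstMaterial = shadowMaterial

// Shadows must be applied as a post-process for this to work
light.castsShadow = true
light.shadowMode = .deferred

// Alternatively, on iOS 11+, the dedicated lighting model mentioned above:
// shadowMaterial.lightingModel = .shadowOnly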

Box2D: How to use b2ChainShape for a tile based map with squares

I'm fighting with so-called ghost collisions on a simple tile-based map with a circle as the player character.
When I apply an impulse to the circle it first bounces correctly, but sooner or later it bounces at the wrong angle.
Looking around on the internet, I read about a known issue in Box2D (I use Swift on iOS with a Box2D port for Swift).
Using b2ChainShape does not help, but it looks like I misunderstood it: I also need to use the "prevVertex" and "nextVertex" properties to set up the ghost vertices.
But I'm confused. I have a simple map made up of boxes (simple squares), all placed next to each other to form a closed room. Inside it is my circle, to which I apply an impulse, and that's when I see the issue.
Now, WHERE do I place those ghost vertices for each square/box on the view in order to solve this issue? Do I place ANY vertex close to the first and last vertices of the chain shape, or does it need to be one of the vertices of the box adjacent to the current one? I don't understand. Box2D's manual does not explain where these ghost vertex coordinates come from.
Below you can see an image describing the problem.
Some code showing the physics parts for the walls and the circle:
First the wall part:
let bodyDef = b2BodyDef()
bodyDef.position = self.ptm_vec(node.position+self.offset)
let w = self.ptm(Constants.Config.wallsize)
let square = b2ChainShape()
var chains = [b2Vec2]()
chains.append(b2Vec2(-w/2,-w/2))
chains.append(b2Vec2(-w/2,w/2))
chains.append(b2Vec2(w/2,w/2))
chains.append(b2Vec2(w/2,-w/2))
square.createLoop(vertices: chains)
let fixtureDef = b2FixtureDef()
fixtureDef.shape = square
fixtureDef.filter.categoryBits = Constants.Config.PhysicsCategory.Wall
fixtureDef.filter.maskBits = Constants.Config.PhysicsCategory.Player
let wallBody = self.world.createBody(bodyDef)
wallBody.createFixture(fixtureDef)
The circle part:
let bodyDef = b2BodyDef()
bodyDef.type = b2BodyType.dynamicBody
bodyDef.position = self.ptm_vec(node.position+self.offset)
let circle = b2CircleShape()
circle.radius = self.ptm(Constants.Config.playersize)
let fixtureDef = b2FixtureDef()
fixtureDef.shape = circle
fixtureDef.density = 0.3
fixtureDef.friction = 0
fixtureDef.restitution = 1.0
fixtureDef.filter.categoryBits = Constants.Config.PhysicsCategory.Player
fixtureDef.filter.maskBits = Constants.Config.PhysicsCategory.Wall
let ballBody = self.world.createBody(bodyDef)
ballBody.linearDamping = 0
ballBody.angularDamping = 0
ballBody.createFixture(fixtureDef)
Not sure that I know of a simple solution in the case that each tile can potentially have different physics.
If your walls are all horizontal and/or vertical, you could write a class to take a row of boxes, create a single edge or rectangle body, and then on collision calculate which box (a simple a < x < b test) should interact with the colliding object, and apply the physics appropriately, manually calling the OnCollision method that you would otherwise specify as the callback for each individual box.
Alternatively, to avoid the trouble of manually testing intersection with different boxes, you could still merge all common straight edge boxes into one edge body for accurate reflections. However, you would still retain the bodies for the individual boxes. Extend the boxes so that they overlap the edge.
Now here's the trick: all box collision handlers return false, but they toggle flags on the colliding object (turning flags on OnCollision, and off OnSeparation). The OnCollision method for the edge body then processes the collision based on which flags are set.
Just make sure that the colliding body always passes through a box before it can touch an edge. That should be straightforward.
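If you go the flag-toggling route, a rough sketch of the bookkeeping might look like the following. The type and handler names are hypothetical; they would need to be wired up to whatever contact-listener callbacks your Box2D port exposes:
// Tracks which tile boxes the ball is currently overlapping.
final class TileContactTracker {
    private var touchedBoxes = Set<Int>()          // IDs of boxes the ball is inside

    // Call these from the begin/end-contact callbacks of the box fixtures;
    // the boxes themselves should not produce a collision response.
    func beginContact(boxID: Int) { touchedBoxes.insert(boxID) }
    func endContact(boxID: Int)   { touchedBoxes.remove(boxID) }

    // Call this from the merged edge body's contact callback: the edge handles
    // the actual bounce, and the flags tell you which tile the ball came through.
    func resolveEdgeContact() {
        guard let boxID = touchedBoxes.first else { return }
        // ...apply the per-tile game logic for boxID here...
    }
}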

SKEffectNode - CIFilter Blur Size Limit - Big Black Box

I am trying to blur multiple SKNode objects. I do this by having a parent SKEffectNode with its CIFilter set to "CIGaussianBlur". Like so:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    blurNode.shouldRasterize = YES;
    [blurNode setShouldEnableEffects:NO];
    [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
                                   keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
This works fine for a bunch of nodes currently onscreen. But when I space these nodes far away from each other (about 3000 pixels), the blurring no longer happens and I get a big black box. This happens regardless of whether the SKNodes I'm blurring are SKShapeNodes or SKSpriteNodes. Here's a sample project with this issue: Sample Project. (By the way, thanks to BobMoff for the initial version found here.)
Here's happy blur (when nodes are less than 3000 pixels away from each other):
Sad blur (when nodes are more than 3000 pixels away from each other):
UPDATE
This behavior occurs whenever an SKEffectNode is the parent. It doesn't matter if it's enabling effects, blurring, etc. If the parent node is an SKNode, it's fine. i.e. Even if the parent blur node is created like it is below, you will get the blackness:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    // blurNode.shouldRasterize = YES;
    // [blurNode setShouldEnableEffects:NO];
    // [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
    //                                keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
I had a similar problem, with a very wide, panning scene that I wanted to blur.
To get the blur effect to work, I removed any nodes that were sticking out too far past the edges of the scene:
// Property declarations, elsewhere in the class:
var blurNode: SKEffectNode
var mainScene: SKScene
var exParents: [SKNode : SKNode] = [:]
/**
* Remove outlying nodes from the scene and activate the SKEffectNode
*/
func blurScene() {
let FILTER_MARGIN: CGFloat = 100
let widthMax: CGFloat = mainScene.size.width + FILTER_MARGIN
let heightMax: CGFloat = mainScene.size.height + FILTER_MARGIN
// Recursively iterate through all blurNode's children
blurNode.enumerateChildNodesWithName(".//*", usingBlock: {
[unowned self]
node, stop in
if node.parent != nil && node.scene != nil { // Ignore nodes we already removed
if let sprite = node as? SKSpriteNode {
// Calculate sprite node position in scene coordinates
let sceneOrig = sprite.scene!.convertPoint(sprite.position, fromNode: sprite.parent!)
// Find left, right, bottom and top edges of sprite
let l = sceneOrig.x - sprite.size.width*sprite.anchorPoint.x
let r = l + sprite.size.width
let b = sceneOrig.y - sprite.size.height*sprite.anchorPoint.y
let t = b + sprite.size.height
if l < -FILTER_MARGIN || r > widthMax || b < -FILTER_MARGIN || t > heightMax {
self.exParents[sprite] = sprite.parent!
sprite.removeFromParent()
}
}
}
})
blurNode.shouldEnableEffects = true
}
/**
* Disable blur and reparent nodes we removed earlier
*/
func removeBlur() {
self.blurNode.shouldEnableEffects = false
for (kid, parent) in exParents {
parent.addChild(kid)
}
exParents = [:]
}
NOTES:
This does remove content from your effect node, so extremely wide nodes won't show up in the final result:
You can see the mountain highlighted in red stuck out too far and was removed from the resulting blur.
This code only considers SKSpriteNodes. Empty SKNodes don't seem to break the effect node, but if you're using other visible nodes like SKShapeNodes or SKLabelNodes, you'll have to modify this code to include them.
If you have ignoreSiblingOrder = false, this code might mess up your z-ordering since you can't guarantee what order the nodes are added back to the scene.
Stuff I tried that didn't work
Simply saying node.hidden = true instead of using removeFromParent() doesn't work. That would be WAY too easy ;)
Using an SKCropNode to crop out outlying content didn't work for me. I tried having the SKEffectNode parent the SKCropNode and the other way around, but the black square appeared no matter how small I made the cropped area. This might still be worth looking into if you're desperate for a cleaner solution.
As noted here, SKScenes are secretly SKEffectNodes and you can set their filter just like our blurNode above. SKScenes don't show a black screen when their content is too big. Unfortunately, they seem to just silently disable the filter instead. Again, I might have missed something, so you could explore this option further if you're trying to apply an effect across the entire scene.
Alternate Solutions
You can capture an image of the whole screen and apply a filter to that, as suggested here. I ended up going with an even simpler solution; I took a generic screenshot of the stuff I wanted to blur, then applied a very heavy blur so you can't see the precise details. I used that as the blurred background and you can hardly tell it's not the real thing ;) This also saves a healthy chunk of memory and avoids a small UI hiccup.
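As a rough sketch of that screenshot-and-blur idea (the function name and default radius are mine, not from the original answer), assuming you have access to the SKView:
import UIKit
import SpriteKit
import CoreImage

func blurredSnapshot(of skView: SKView, radius: Double = 15) -> UIImage? {
    // Capture the view's current contents as a UIImage.
    let renderer = UIGraphicsImageRenderer(bounds: skView.bounds)
    let snapshot = renderer.image { _ in
        skView.drawHierarchy(in: skView.bounds, afterScreenUpdates: true)
    }
    // Run it through a heavy Gaussian blur so fine detail is lost anyway.
    guard let input = CIImage(image: snapshot),
          let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)
    guard let output = filter.outputImage,
          let cgImage = CIContext().createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
You can then show the result in an SKSpriteNode (via SKTexture(image:)) or a UIImageView behind your menu.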
Musings
This is a pretty nasty bug, and I hope Apple comes up with a solution soon. You can click this cute picture of a camera to get a GPU trace and some insight on what's happening:
The device seems to be discarding the framebuffer for the effect node because it takes up too much memory. This is affirmed by the fact that when there's more memory pressure on the device, it's easier to get the 'black square' on smaller content in the SKEffectNode.
I used a method that worked for my game but it requires the blurred area to be static without movement.
On iOS 10 using Swift 3 I used SKSpriteNode, SKView, SKEffectNode, and CIFilter. I created a sprite from a texture returned by the SKView method texture(from:), passing the current scene as the parameter because it inherits from SKNode. So essentially I was taking a "screenshot" of the scene and creating a sprite from it. I then put it in an SKEffectNode with a blur filter (I set shouldRasterize to true for better performance, since I only needed to blur once). Finally I added the new sprite to the scene. From there you could add sprites to the scene and place them above the new blurred node.
let blurFilter = CIFilter(name: "CIGaussianBlur")!
let blurAmount = 15.0
blurFilter.setValue(blurAmount, forKey: kCIInputRadiusKey)
let blurEffect = SKEffectNode()
blurEffect.shouldRasterize = true
let screenshotNode = SKSpriteNode(texture: gameScene.view!.texture(from: gameScene))
blurEffect.addChild(screenshotNode)
blurEffect.filter = blurFilter
gameScene.addChild(blurEffect)
Possible workaround for the bug:
Use a camera and zoom WAY out so you can see most of your background, then take a screenshot-style rendering of that view. Crop it to your needs, blur it, and then rasterise the result.
Then scale this image back up, slice it up if need be, and place it accordingly.
SKEffectNode renders into a texture. In most iOS systems the maximum size for a texture is 2048x2048. If an SKEffectNode is trying to render content larger than that, it will just use a 2048x2048 texture and anything outside of it will just not appear in the texture. It won't give you any error or warning about this happening; it simply does it silently.
And no, there is no way to tell SKEffectNode to use a texture of a specific size and pan/clamp the content into it. It always uses a texture that covers all the child nodes, and if that texture would be too large, it silently falls back to the 2048x2048 texture.

Stretch an object

I have an image of a string of size 12*30. I want to create an animation that gives the feel of a string stretching. I did it by scaling the image, but the problem I'm facing is that collision does not happen with the scaled image; it occurs only in the 12*30 region, which is the size of the original image. I want the collision to happen throughout the length of the string. Is there a better way than scaling to do this? Thanks.
image_rect = display.newImage("string.png")
image_rect.x = frog_jump_SheetSet.x + 10
image_rect.y = frog_jump_SheetSet.y + 10
physics.addBody(image_rect)
image_rect.yScale = 0.1
localGroup:insert(image_rect)
image_rect.collision = onStretch
image_rect:addEventListener("collision", image_rect)
tr1 = tnt:newTransition(image_rect, {time = 50, yScale = string_length})
tr2 = tnt:newTransition(image_rect, {delay = 100, time = 50, yScale = 0.1})
The Corona physics engine does not support scaling bodies directly; the only thing you can do is add rectangles to the body, or remove them, as needed to fit the new shape.
In general, you should avoid scaling or rotating the image itself when using physics. Rotation should be done through the physics API (using torque); for scaling there is nothing you can do.
