Smooth Shading in SceneKit - iOS

I'm currently working on an iPadOS app that uses SceneKit to render some 3D models. Nothing too fancy, but I hit a bit of a snag when it comes to shading these models...
Basically, I just set up a SceneKit scene (using pretty much all the default settings; I don't even have a camera) and instantiate the 3D objects from some .obj files I have, like so:
let url = <URL of my .obj file>
let asset = MDLAsset(url: url)
let object = asset.object(at: 0)
let node = SCNNode(mdlObject: object)
let texture = UIImage(data: try Data(contentsOf: <URL of the texture>))
let material = SCNMaterial()
material.diffuse.contents = texture
node.geometry!.materials = [material]
self.sceneView.scene!.rootNode.addChildNode(node)
The textures are loaded manually because, unfortunately, that's how the client set up the files for me to use. The code works fine and I can see the mesh and its texture; however, it also looks like this
As you can see, the shading is not smooth at all, and I have no idea how to fix it.
The client has been bothering me to implement Phong shading, and according to Apple's documentation this is how you do it:
material.lightingModel = .phong
Unfortunately, that's still what it looks like with Phong enabled. I'm an absolute beginner when it comes to 3D rendering, so this might be laughably easy, but I swear I cannot figure out how to get smoother shading on this model.
I have tried looking left and right, and the only thing that has had any noticeable result was using subdivisionLevel to increase the number of faces in the geometry, but this does not scale well: the actual app needs to load a ton of these meshes, and it runs out of memory fast even with the subdivision level set to just 1.
Surely there must be a way to smooth that shading without refining the actual geometry?
Thanks in advance!

Smooth shading requires correct normals for your geometry. Have you tried using Model I/O to generate them?
https://developer.apple.com/documentation/modelio/mdlmesh/1391284-addnormals
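A sketch of that suggestion, slotted into the question's loading code (untested against the asker's files; `url` is assumed to point at the .obj file, as in the question):

```swift
import SceneKit
import ModelIO
import SceneKit.ModelIO

let asset = MDLAsset(url: url)
guard let mesh = asset.object(at: 0) as? MDLMesh else {
    fatalError("first object in asset is not a mesh")
}
// Regenerate smooth per-vertex normals. A creaseThreshold of 0.0 smooths
// everything; values closer to 1.0 preserve hard edges.
mesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.0)
let node = SCNNode(mdlObject: mesh)
```

With correct smooth normals in place, the default (or .phong) lighting model should interpolate shading across faces instead of showing them as flat facets.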

Related

Text rendering in Metal

I am trying to develop my own mini game engine in Apple Metal on a Mac, and I am stuck at the point where I want to render text on the GPU. I do not have much graphics programming experience, so I am not sure how to do it. I stumbled upon an article by Warren Moore about using signed distance fields, but I do not know how that works, and I am unable to understand it completely (for lack of graphics knowledge) well enough to implement it myself. The blog post has a code sample written in Objective-C, but unfortunately I do not know Objective-C. Is there a Swift version of it? Or can someone explain / give pointers on how to render text in Metal?
I have been down this road before. I think you might find SceneKit useful if you are after 3D text.
If you are OK with using SceneKit to drive your rendering: use SCNText with an SCNView.
If you have your own command buffer, and you can get away with blending your text on top of the rest of your graphics: you can still use SCNText, by using the render method of an SCNRenderer to encode a scene's render commands onto your command buffer.
If you want to avoid SceneKit's rendering process entirely, I would recommend this: create an SCNText inside an SCNTransaction, like so:
import SceneKit
SCNTransaction.begin()
let sceneText = SCNText(string: text, extrusionDepth: extrusionDepth)
SCNTransaction.commit()
let mdlMesh = MDLMesh(scnGeometry: sceneText, bufferAllocator: yourBufferAllocator)
let mesh = try MTKMesh(mesh: mdlMesh, device: MTLCreateSystemDefaultDevice()!)
This MTKMesh will have three vertex buffers; the first one (0) is a list of positions in packed_float3 format, the second (1) a list of normals in packed_float3 format, the third (2) a list of texture coordinates in packed_float2 format. Just make sure to reflect that in your vertex shader. It will have 1-5 submeshes with their own index buffers, corresponding I believe to front, back, front chamfer, back chamfer, and extrusion side.
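That layout could be mirrored in an MTLVertexDescriptor along these lines (a sketch; the buffer indices and packed strides are taken from the description above, so verify them against your actual MTKMesh):

```swift
import Metal

let descriptor = MTLVertexDescriptor()
// Buffer 0: positions as packed_float3 (tightly packed, 12-byte stride)
descriptor.attributes[0].format = .float3
descriptor.attributes[0].bufferIndex = 0
descriptor.attributes[0].offset = 0
// Buffer 1: normals as packed_float3
descriptor.attributes[1].format = .float3
descriptor.attributes[1].bufferIndex = 1
descriptor.attributes[1].offset = 0
// Buffer 2: texture coordinates as packed_float2 (8-byte stride)
descriptor.attributes[2].format = .float2
descriptor.attributes[2].bufferIndex = 2
descriptor.attributes[2].offset = 0
descriptor.layouts[0].stride = MemoryLayout<Float>.size * 3
descriptor.layouts[1].stride = MemoryLayout<Float>.size * 3
descriptor.layouts[2].stride = MemoryLayout<Float>.size * 2
```

Your vertex function's `[[stage_in]]` struct (or manual buffer indexing) then needs to match these attribute indices.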
Now, if you are after 2D text, you can either use the method above with an extrusionDepth close to zero, or you can harness Core Text directly to do the font metrics and render textured quads with a font-atlas texture, as the commenter suggested.
The ability to understand Objective-C is certainly useful as well, but you may not need it for this problem specifically. I tried to be brief on my explanations since I don't know what your exact goal is with this problem, but I can provide more detail on any of those methods upon request.

How to make use of cube texture type in XCassets

I'm trying to learn SceneKit development and try to add a skybox in the background.
To store the cube map textures I found that XCAssets has a type Cube Texture Set which seems to fit the bill perfectly.
However, I've not found any way to access the texture in code (the way you would call UIImage(named: "asset_name") for an image set). I've tried creating an SKTexture, MDLTexture, or MTLTexture from the asset, but without success. Does anyone know how I can use the cube texture set?
You can load the cube texture from the asset catalog easily using MetalKit.
import MetalKit
at the top of your file. These two lines do the business:
let textureLoader = MTKTextureLoader(device: scnView.device!)
scene.background.contents = try! textureLoader.newTexture(name: textureName,
                                                          scaleFactor: 1.0,
                                                          bundle: .main,
                                                          options: nil)
I tried this in a project created from the default SceneKit game template, placing these two lines in GameViewController.swift after setting the view's background color.
(I expect you can do it using the other technologies too, but this is how you can load a cube texture using Metal)
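For comparison, SceneKit also accepts a plain array of six images for a cube map, which sidesteps the asset catalog entirely (the filenames below are placeholders; the required order is +X, -X, +Y, -Y, +Z, -Z):

```swift
import SceneKit

let scene = SCNScene()
// Placeholder filenames; order is +X, -X, +Y, -Y, +Z, -Z
scene.background.contents = ["px.png", "nx.png", "py.png", "ny.png", "pz.png", "nz.png"]
```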

Back face culling in SceneKit

I am currently trying to set up a rotating ball in SceneKit. I have created the ball and applied a texture to it.
ballMaterial.diffuse.contents = UIImage(named: ballTexture)
ballMaterial.isDoubleSided = true
ballGeometry.materials = [ballMaterial]
The current ballTexture is a semi-transparent texture as I am hoping to see the back face roll around.
However, I get some strange culling where only half of the back-facing polygons are shown, even though the isDoubleSided property is set to true.
Any help would be appreciated, thanks.
This happens because the effects of transparency are draw-order dependent. SceneKit doesn't know to draw the back-facing polygons of the sphere before the front-facing ones. (In fact, it can't really do that without reorganizing the vertex buffers on the GPU for every frame, which would be a huge drag on render performance.)
The vertex layout for an SCNSphere has it set up like the lat/long grid on a globe: the triangles render in order along the meridians from 0° to 360°, so depending on how the sphere is oriented with respect to the camera, some of the faces on the far side of the sphere will render before the nearer ones.
To fix this, you need to force the rendering order — either directly, or through the depth buffer. Here's one way to do that, using a separate material for the inside surface to illustrate the difference.
// add two balls, one a child of the other
let node = SCNNode(geometry: SCNSphere(radius: 1))
let node2 = SCNNode(geometry: SCNSphere(radius: 1))
scene.rootNode.addChildNode(node)
node.addChildNode(node2)
// cull back-facing polygons on the first ball
// so we only see the outside
let mat1 = node.geometry!.firstMaterial!
mat1.cullMode = .back
mat1.transparent.contents = bwCheckers
// my "bwCheckers" uses black for transparent, white for opaque
mat1.transparencyMode = .rgbZero
// cull front-facing polygons on the second ball
// so we only see the inside
let mat2 = node2.geometry!.firstMaterial!
mat2.cullMode = .front
mat2.diffuse.contents = rgCheckers
// sphere normals face outward, so to make the inside respond
// to lighting, we need to invert them
let shader = "_geometry.normal *= -1.0;"
mat2.shaderModifiers = [.geometry: shader]
(The shader modifier bit at the end isn't required — it just makes the inside material get diffuse shading. You could just as well use a material property that doesn't involve normals or lighting, like emission, depending on the look you want.)
You can also do this using a single node with a double-sided material by disabling writesToDepthBuffer, but that could also lead to undesirable interactions with the rest of your scene content — you might also need to mess with renderingOrder in that case.
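A minimal sketch of that single-node alternative (hypothetical values; tune renderingOrder to your scene):

```swift
import SceneKit

let sphereNode = SCNNode(geometry: SCNSphere(radius: 1))
let mat = SCNMaterial()
mat.isDoubleSided = true          // render both front and back faces
mat.writesToDepthBuffer = false   // so back faces aren't occluded by front faces
sphereNode.geometry!.materials = [mat]
// Draw late so other content isn't hidden behind the non-depth-writing sphere
sphereNode.renderingOrder = 100
```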
macOS 10.13 and iOS 11 added SCNTransparencyMode.dualLayer, which, as far as I can tell, doesn't even require setting isDoubleSided to true (the documentation provides very little information). So a simple solution that works for me is:
ballMaterial.diffuse.contents = UIImage(named: ballTexture)
ballMaterial.transparencyMode = .dualLayer
ballGeometry.materials = [ballMaterial]

Fixing sharp-looking geometry objects with the widthSegmentCount property

Is it possible to assign widthSegmentCount (or the height or chamfer equivalents) to a custom geometry object created in Blender? My geometry looks rather sharp when imported into SceneKit, even though it looks great in Blender. The sharpness is depicted in the pictures.
The object is moving, so setting isJitteringEnabled to true doesn't help.
I tried using this code since my object is basically a box:
let box = boxNode.geometry as! SCNBox
box.widthSegmentCount = 150
box.heightSegmentCount = 150
box.chamferSegmentCount = 150
and I'm getting an error: Thread 1: signal SIGABRT
Is this the best SceneKit can do or do I need to export my object from Blender differently?
This has nothing to do with the quality of your mesh: you are just seeing individual pixels without smoothing (aliasing). (Incidentally, the SIGABRT comes from the forced cast: a geometry imported from Blender is a plain SCNGeometry, not an SCNBox, so as! SCNBox crashes.)
SCNView exposes the antialiasingMode property, which will help you get smoother edges (try .multisampling2X or .multisampling4X).
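For instance (a sketch; the SCNView created here stands in for whatever view your storyboard or template gives you):

```swift
import SceneKit

let scnView = SCNView(frame: .zero)   // stand-in for your existing view
// Smooth jagged edges by rendering 4 samples per pixel
scnView.antialiasingMode = .multisampling4X
```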

Swift SpriteKit: making a physics body from an image texture slows down my app too much

I'm trying to make an iOS app that includes some collision detection between two physics bodies. I want one of the physics bodies to be the shape of an image I am using, but when I try to do this using a texture it slows my app down tremendously and eventually causes it to freeze altogether. These are the two lines of code that are causing it:
let texture = SKTexture(imageNamed: "image.png")
physicsBody = SKPhysicsBody(texture: texture, size: size)
However, if I change these two lines to something like
physicsBody = SKPhysicsBody(rectangleOfSize: size)
then everything runs perfectly fine. Has anyone else had this problem and/or found a solution?
This may be due to the complex nature of your texture, but it's hard to tell without seeing it. As Whirlwind said, it probably shouldn't cause such a significant slowdown; however, it's difficult to resolve without further information.
A way to get around creating the SKPhysicsBody from a texture would be to use an online tool for building the body from a path. I use this tool personally. It may be a decent work around.
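If the shape is simple enough, you can also skip the tool and hand-build the body from a CGPath yourself; a convex polygon is far cheaper than a per-pixel texture body. A sketch with made-up coordinates (the path must be convex with counterclockwise winding):

```swift
import SpriteKit

// Rough triangular outline standing in for the image's silhouette
let path = CGMutablePath()
path.move(to: CGPoint(x: -20, y: -20))
path.addLine(to: CGPoint(x: 20, y: -20))
path.addLine(to: CGPoint(x: 0, y: 30))
path.closeSubpath()

let body = SKPhysicsBody(polygonFrom: path)
```

Then assign `body` to your sprite's physicsBody as before.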