How to draw a dynamic line or curve per frame in SceneKit - iOS

I have a Metal-backed SceneKit project where I'd like to draw a curve that is modified on the CPU every frame. This seems like it should be a simple task; one that I've done in OpenGL many times.
Something like:
array_of_points[0].x = 25
array_of_points[0].y = 35
upload_array_to_gpu()
render_frame()
It seems that modifying the SCNGeometrySource of an SCNNode is not possible. Recreating and setting the node.geometry works but the position lags / jitters compared to other nodes in the scene. Recreating the entire node each frame also lags.
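Concretely, the recreate-each-frame approach looks roughly like this (a simplified sketch; points is the CPU-updated [SCNVector3] array, curveNode is the node displaying the curve, and doing the rebuild in the renderer delegate keeps it on the render loop, which may or may not help with the jitter):
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    // Rebuild the geometry from the freshly computed points.
    let source = SCNGeometrySource(vertices: points)
    var indices: [Int32] = []
    for i in 0..<Int32(points.count - 1) {
        indices.append(contentsOf: [i, i + 1])   // consecutive points become line segments
    }
    let element = SCNGeometryElement(indices: indices, primitiveType: .line)
    curveNode.geometry = SCNGeometry(sources: [source], elements: [element])
}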
One possible avenue might be a geometry shader modifier, but I'd still have to somehow pass in my CPU-computed curve positions.
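From what I can tell, a geometry shader modifier can at least take per-frame scalar/vector uniforms set from the CPU via key-value coding, though not an arbitrary array of positions. A minimal sketch (the uniform name and the displacement are purely illustrative):
let modifier = """
uniform float u_amplitude;
#pragma body
_geometry.position.y += u_amplitude * sin(_geometry.position.x);
"""
geometry.shaderModifiers = [.geometry: modifier]
// Each frame, push the newly computed value (KVC binds it to the uniform above):
geometry.setValue(0.5, forKey: "u_amplitude")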
Another option is to use metal APIs directly. Though this approach seems like it will require lots more code, and I couldn't find too much info about mixing SceneKit with Metal.
I also tried setting the preferredRenderingAPI to OpenGL, but it doesn't change the renderingAPI.

Related

SKEmitterNode: modify properties like velocity and angle for individual particles?

The SKEmitterNode in SpriteKit lets you change particle properties, but it's not clear how you change properties for specific particles.
For instance, if we want particles to radiate in a circle shape, it seems we need to dictate the angle and speed for each particle -- not specify values for particles as a group.
Is this possible?
Put another way, is it possible to use the SKEmitterNode to create animations like the one from this video at the 0:22 mark: https://www.youtube.com/watch?v=wYy2G0lVTAM
You can get an effect like that with an SKEmitterNode whose particle image is a little star and settings like those in the screenshots (not included here).
You have no control over individual particles; you can only determine what they are like when they are born. (Think of it like a test-tube baby: you can specify which genes you want the baby to have, but after that, the baby will grow however it likes.)
At some point Apple may get SKActions working on the particles so that you can do this kind of thing, but I wouldn't hold my breath on it happening anytime soon; they seem to have little interest in the SpriteKit platform, just introducing new broken things to get people excited. (I am cringing at how buggy ARKit will be.)

Updating geometry in SpriteKit (or SceneKit)

We are porting a game to SpriteKit and I've run in to a bit of a problem. Some objects in our game have triangle-strip trails attached to them. The vertex buffers of the trails are continuously updated as the objects move in the world to create a seamless and flowing effect (there are constraints on how many vertices are in the buffer and how often we emit new vertices).
In the previous implementation we simply updated the affected vertices in the corresponding buffer whenever we emitted new vertices. In SceneKit it seems I am unable to update geometry sources unless I use geometrySourceWithBuffer:vertexFormat:semantic:vertexCount:dataOffset:dataStride:. To do this, however, it seems I need a Metal device, command queue, and encoder to be able to submit commands to update my buffer before it renders.
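If it helps clarify, the buffer-backed variant I'm picturing is roughly this (a sketch; the names are illustrative, it assumes a Metal-backed view and a shared-storage buffer that the CPU can write into directly; a private-storage buffer would indeed need a command queue and a blit encoder to update):
// Wrap a Metal buffer in an SCNGeometrySource so its contents can be rewritten each frame.
let device = scnView.device!          // the SCNView's Metal device (when rendering with Metal)
let vertexCount = 256
let vertexStride = MemoryLayout<SIMD3<Float>>.stride
let vertexBuffer = device.makeBuffer(length: vertexCount * vertexStride, options: .storageModeShared)!

let source = SCNGeometrySource(buffer: vertexBuffer,
                               vertexFormat: .float3,
                               semantic: .vertex,
                               vertexCount: vertexCount,
                               dataOffset: 0,
                               dataStride: vertexStride)

// Per frame: write new positions straight into the shared buffer from the CPU.
let verts = vertexBuffer.contents().bindMemory(to: SIMD3<Float>.self, capacity: vertexCount)
verts[0] = SIMD3<Float>(25, 35, 0)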
Is there any way I can do this with a normal SCNView, or do I have to do everything manually with a CAMetalLayer, creating my own metal device etc?
In essence, everything we need, bar this trail geometry, is available in SpriteKit, so I was hoping there was some way I could get hold of the Metal device and command queue used by the SKView and simply use those to upload my geometry. This would make things a lot simpler.
I've used a particle system to get a similar effect. The particle emitter is attached to the moving object, with particles set to fade out and eventually die. You might need to set the emitter's targetNode to your scene, depending on the effect you want.
// Trail emitter attached to the moving object; particles fade out and eventually die.
emitter = SKEmitterNode()
emitter.name = marbleNodeNames.trail.rawValue
emitter.particleTexture = SKTexture(imageNamed: "spark")
emitter.particleAlphaSpeed = -1.0   // fade alpha over time
emitter.particleLifetime = 2        // seconds each particle lives
emitter.particleScale = 0.2
// emitter.targetNode = scene       // optionally leave emitted particles behind in the scene
marbleSprite.addChild(emitter)

How to Render Many SpriteKit Nodes at Once?

I am using SpriteKit to render a large (20 x 20) dot grid that looks like this:
I'd like to highlight rows or columns based on user input. For example, I'd like to change rows 1-10 to a red color, or columns 5-15 to a blue color.
What is the most performant way to do this?
I've tried:
Naming each GridNode based on the column it's in (e.g. @"column-4"), then using enumerateChildNodesWithName: with the string as @"column-n", changing the color of each node (via SKShapeNode's setFillColor:) in the enumerate block.
Giving all the columns a parent node associated with that column. Then telling the parent node to change its alpha (thus changing the alpha of all its children).
Making arrays for the different columns, then looping through each node and changing its color or alpha.
I've tried making the GridDot class an SKEffectNode with shouldRasterize: set to YES. I've tried both an SKShapeNode and an SKSpriteNode as its child. I've also tried taking away the SKEffectNode parent and just rendering an SKSpriteNode.
Each of these options makes my whole app lag and makes my framerate drop to ~10 FPS. What is the correct way to change the color/alpha of many nodes (without dropping frames)?
At its heart, the issue is rendering this many nodes, yes?
When I faced similar performance problems while using SKShapeNode I came up with this solution:
1. Create an SKShapeNode with the required path and color.
2. Use SKView's textureFromNode:crop: method to convert the SKShapeNode to an SKTexture.
3. Repeat steps 1-2 to create all the required textures for a node.
4. Create an SKSpriteNode from a texture.
5. Use the created SKSpriteNode in your scene instead of the SKShapeNode.
6. Change the node's texture when needed using SKSpriteNode's texture property.
If you have a limited set of colors for your dots, I think this approach will work fine for your task.
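Roughly, in Swift (a sketch; skView and the dot shape are placeholders):
// Pre-render one texture per color from a template SKShapeNode.
let template = SKShapeNode(circleOfRadius: 6)
template.fillColor = .white
let whiteTexture = skView.texture(from: template)

template.fillColor = .red
let redTexture = skView.texture(from: template)

// Use cheap sprites in the scene and just swap textures to highlight.
let dot = SKSpriteNode(texture: whiteTexture)
// ...later, when the dot's row or column should highlight:
dot.texture = redTexture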
In contrast to @amobi's statement, 400 nodes is not a lot. For instance, I have a scene with ~400 nodes and a render time of 9.8ms and 9 draw calls.
If you have 400 draw calls though, you should try to reduce that number. To determine the amount of draw calls needed for each frame rendered, implement (some of) the following code. It is actually taken from my own SpriteKit app's ViewController class which contains the SpriteKit scene.
skView.showsFPS = YES;
skView.showsNodeCount = YES;
skView.showsDrawCount = YES;
Proposed solution
I recommend using SKView's ignoresSiblingOrder. This way, SKSpriteNodes with equal zPosition are drawn in one draw call, which (for as many nodes per draw call as you appear to have) is hugely more efficient. Set this in the -viewDidLoad method of the SKView's ViewController.
skView.ignoresSiblingOrder = YES;
I see no reason to burden the GPU with SKEffectNodes in this scenario. They are usually a great way to tank your frame rate.
Final thoughts
Basic performance issues mean you have either a CPU or a GPU bottleneck. It is difficult to guess which you're suffering from with the current information. You could launch the Profiler, but Xcode itself also provides valuable information when you are running your app on an attached device. FPS in the Simulator is not representative of device performance.

SceneKit - Adding a new SCNNode to the scene causes severe lag

I found out that adding SCNNodes (with SCNGeometry) to the scene causes a severe lag spike.
According to the Time Profiler, it has to generate the geometry (at least, that's what the functions/methods are called). It does that at the time the node is added to the scene, not when the node is created. Hence, creating a pool of SCNNodes will not work.
Is there a way to get rid of this lag? I'd like to be able to add nodes to the scene without any FPS drop.
The only idea I have so far is adding everything to the scene already and then hiding / un-hiding it, though this is not really a clean solution.
Here's a shot from Time Profiler:
It looks like you are adding a node with an SCNShape or SCNText attached to it, and these kinds of geometries are expensive to create (you have to discretize and triangulate the Bézier curve, and possibly compute an offset curve for the chamfer).
You can try preloading your objects with the following SCNSceneRenderer methods: -prepareObject:shouldAbortBlock: and -prepareObjects:withCompletionHandler:.
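For example (a small Swift sketch; scnView and expensiveNode are placeholder names):
// Let SceneKit prepare the node's resources up front, then add it only once that work is done.
scnView.prepare([expensiveNode]) { success in
    if success {
        scnView.scene?.rootNode.addChildNode(expensiveNode)
    }
}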

Transform to create barrel effect seen in Apple's Camera app

I'm trying to recreate the barrel effect that can be seen on the camera mode picker below:
(source: androidnova.org)
Do I have to use OpenGL in order to achieve this effect? What is the best approach?
I found a great library on GitHub that can be used to achieve this effect (https://github.com/Ciechan/BCMeshTransformView), but unfortunately it doesn't support animation and is therefore not usable.
I bet Apple used CAMeshTransform. It's just like BCMeshTransformView's mesh transform, except it is a private API and fully integrates with Core Animation. BCMeshTransformView was born when a developer discovered this.
The only easy option I see is:
Use CALayer.transform, which is a CATransform3D. You can use this to simulate the barrel effect you want by adjusting the z position and y rotation of each item on the barrel. Also add a semitransparent dark gradient (CAGradientLayer) to the wheel to simulate the effect of choices getting darker towards the edges. This will be simple to do, but won't look as smooth and realistic as an actual 3D barrel. Maybe it will look good enough to create a convincing illusion though? (To make the 3D transforms visible you need to add perspective, e.g. by setting the container layer's sublayerTransform.m34 to something like -1.0/500.0.)
http://www.thinkandbuild.it/introduction-to-3d-drawing-in-core-animation-part-1/
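A rough sketch of that idea in UIKit/Core Animation (all names and constants here are made-up placeholders, not Apple's implementation):
// Give the container perspective, pulled back so the front of the barrel sits at z = 0.
let container = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 60))
let radius: CGFloat = 150
var perspective = CATransform3DIdentity
perspective.m34 = -1.0 / 500.0
perspective = CATransform3DTranslate(perspective, 0, 0, -radius)
container.layer.sublayerTransform = perspective

// Place each item on a cylinder: rotate around the y axis, then push it out to the radius.
for (i, title) in ["MODE A", "MODE B", "MODE C"].enumerated() {
    let label = UILabel(frame: CGRect(x: 0, y: 0, width: 90, height: 30))
    label.text = title
    label.textAlignment = .center
    label.center = CGPoint(x: container.bounds.midX, y: container.bounds.midY)

    let angle = CGFloat(i - 1) * 0.35                 // radians between adjacent items
    var t = CATransform3DMakeRotation(angle, 0, 1, 0)
    t = CATransform3DTranslate(t, 0, 0, radius)
    label.layer.transform = t
    container.addSubview(label)
}
Scrolling would then just animate each item's angle, with the gradient overlay handling the darkening toward the edges.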
The hardest option is using a custom OpenGL view that makes a barrel shape and applies your contents on top of it as a texture. I would expect that you run into most of the complexities behind creating BCMeshTransformView, and have difficulty supporting animations just like BCMeshTransformView did.
You may still be able to use BCMeshTransformView though. BCMeshTransformView is slow at processing content animations such as color changes, but is very fast at processing geometry changes. So you could use it to do a barrel effect, as long as you define the barrel effect entirely in terms of mesh geometry changes (instead of as content changes like using a scroll view or adjusting subview positions). You would need to do gesture handling + scrolling yourself instead of using UIScrollView though, which is tricky and tedious to get right.
Considering the options, I would want to fudge it by using 3D transforms, then move to other options only if I can't create a convincing illusion using 3D transforms.
