Animating particles from an SKEmitterNode - iOS

The goal is to animate particles coming from an SKEmitterNode, but the following code doesn't work: the particles never change texture. They only show the first texture -- more specifically, the original image set in the Xcode Particle Editor -- for their entire lifetime.
The particle lifetime is longer than the frame duration, so that isn't the issue.
// Create animation textures
let animationAtlas = SKTextureAtlas(named: atlasFilename)
var animationFrames = [SKTexture]()
// Set number of animation frames
let numImages = animationAtlas.textureNames.count
// Load texture array
for i in 0..<numImages {
    let textureName = "\(texturePrefix)\(i)"
    animationFrames.append(animationAtlas.textureNamed(textureName))
}
// Create emitter node w/ animation on each particle
let emitterNode = SKEmitterNode(fileNamed: EmitterFilename)!
let animation = SKAction.animate(with: animationFrames, timePerFrame: 0.05)
emitterNode.particleAction = animation
// Define fade out sequence
let fadeOut = SKAction.fadeOut(withDuration: sparkleFadeOutDur)
let remove = SKAction.removeFromParent()
let sequence = SKAction.sequence([fadeOut, remove])
// Add emitter node to scene
addChild(emitterNode)
emitterNode.run(sequence)

While your code looks correct, and this could be a bug in the particle engine, a more likely explanation is that your animation runs out of frames before the particles become large or visible enough for you to see the animation play. If that's the case, once the animation finishes, each particle may revert to the original texture you're seeing.
To check if that's the issue, you may want to wrap your animate action into a repeatForever action:
let repeatAction = SKAction.repeatForever(animation)
and then change the particleAction assignment as follows:
emitterNode.particleAction = repeatAction
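Putting it together, a minimal sketch (EmitterFilename and animationFrames come from the question's code):
let emitterNode = SKEmitterNode(fileNamed: EmitterFilename)!
let animation = SKAction.animate(with: animationFrames, timePerFrame: 0.05)
// Repeat so the texture keeps cycling for the particle's whole lifetime,
// instead of reverting once the animation completes.
emitterNode.particleAction = SKAction.repeatForever(animation)
addChild(emitterNode)
If the particles now animate continuously, the original animation was simply completing before the particles were visible.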
It's worth noting that Apple's documentation seems contradictory. This page (https://developer.apple.com/reference/spritekit/skaction/1417828-animate) says: "This action can only be executed by an SKSpriteNode object."
Particles are clearly not accessible sprite nodes, as this page (https://developer.apple.com/reference/spritekit/skemitternode) states, but at the same time they should behave like sprites:
"For the purpose of using actions on particles, you can treat the particle as if it were a sprite. This means you can perform other interesting tricks, such as animating the particle’s textures."

Related

How to create a User Generated Animation in SceneKit with .dae (COLLADA) file from Blender

I'm trying to animate specific parts of an SCNScene object in SceneKit (in my case, I want to animate the fingers of a hand). I import the .dae (COLLADA) file easily from Blender, with the respective bones to articulate the model.
override func viewDidLoad() {
    super.viewDidLoad()
    let scene = SCNScene(named: "hand.dae")!
    sceneView.scene = scene
    sceneView.allowsCameraControl = true
    sceneView.autoenablesDefaultLighting = true
    sceneView.backgroundColor = UIColor.lightGrayColor()
}
My goal is to animate those bones on iOS with user generated values between 0 and 1. Imagine a UISlider where you scroll back and forth and see the specific finger move depending on the value of the slider.
[Screenshot of the intended finger animation]
I've tried animating the model by loading an animation file, like Apple's Fox example:
private var indexFingerAnimation: CAAnimation!
indexFingerAnimation = CAAnimation.animationWithSceneNamed("move_index_finger.dae")
indexFingerAnimation.removedOnCompletion = false
indexFingerAnimation.fadeInDuration = 0.3
indexFingerAnimation.fadeOutDuration = 0.3
indexFingerAnimation.repeatCount = Float.infinity
The problem is that this is a global animation, instead of one affecting just the index finger. Besides, it's always a pre-defined animation rather than one controlled by user input. Ultimately I want to mix animations (e.g. move the index finger and thumb at the same time to form gestures).
Is this possible? I'm struggling because I can't figure out how to manipulate specific parts of the mesh. I've started studying MetalKit, but it's not clear to me that that's the solution.
Any help would be really appreciated.
I have never tried two animations at the same time, but I can rotate an SCNNode in a .dae file with two or more animations. You must set the pivot point and group the nodes together.
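To get the slider-driven motion the question asks for, one approach is to skip baked animations entirely and rotate the bone's node directly. A minimal sketch in modern Swift, assuming the imported rig exposes a bone node named "index_01" (a hypothetical name; inspect your scene graph for the real ones):
var indexBone: SCNNode?

// After loading hand.dae, look the bone up once:
indexBone = sceneView.scene?.rootNode.childNode(withName: "index_01", recursively: true)

@IBAction func sliderChanged(_ sender: UISlider) {
    // Map the slider's 0...1 value onto a 0...90 degree bend of the finger bone.
    indexBone?.eulerAngles.x = sender.value * (.pi / 2)
}
Driving several bone nodes this way also lets you bend the index finger and thumb at the same time, which covers the gesture-mixing requirement.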

Need serious help on creating a comet with animations in SpriteKit

I'm currently working on a SpriteKit project and need to create a comet with a fading tail that animates across the screen. I'm having serious issues with SpriteKit in this regard.
Attempt 1. It:
Draws a CGPath and creates an SKShapeNode from the path
Creates a square SKShapeNode with gradient
Creates an SKCropNode and assigns its maskNode as line, and adds square as a child
Animates the square across the screen, while being clipped by the line/SKCropNode
func makeCometInPosition(from: CGPoint, to: CGPoint, color: UIColor, timeInterval: NSTimeInterval) {
    ... (...s are (definitely) irrelevant lines of code)
    let path = CGPathCreateMutable()
    ...
    let line = SKShapeNode(path: path)
    line.lineWidth = 1.0
    line.glowWidth = 1.0
    var squareFrame = line.frame
    ...
    let square = SKShapeNode(rect: squareFrame)
    // Custom SKTexture extension. I've tried adding a normal image and the leak happens either way. The extension is not the problem.
    square.fillTexture = SKTexture(color1: UIColor.clearColor(), color2: color, from: from, to: to, frame: line.frame)
    square.fillColor = color
    square.strokeColor = UIColor.clearColor()
    square.zPosition = 1.0
    let maskNode = SKCropNode()
    maskNode.zPosition = 1.0
    maskNode.maskNode = line
    maskNode.addChild(square)
    // self is an SKScene, background is an SKSpriteNode
    self.background?.addChild(maskNode)
    let lineSequence = SKAction.sequence([SKAction.waitForDuration(timeInterval), SKAction.removeFromParent()])
    let squareSequence = SKAction.sequence([SKAction.waitForDuration(1), SKAction.moveBy(CoreGraphics.CGVectorMake(deltaX * 2, deltaY * 2), duration: timeInterval), SKAction.removeFromParent()])
    square.runAction(SKAction.repeatActionForever(squareSequence))
    maskNode.runAction(lineSequence)
    line.runAction(lineSequence)
}
This works, as shown below.
The problem is that after 20-40 other nodes come on the screen, weird things happen. Some of the nodes on the screen disappear, some stay. Also, the fps and node count (toggled in the SKView and never changed)
self.showsFPS = true
self.showsNodeCount = true
disappear from the screen. This makes me assume it's a bug with SpriteKit. SKShapeNode has been known to cause issues.
Attempt 2. I tried changing square from an SKShapeNode to an SKSpriteNode (Adding and removing lines related to the two as necessary)
let tex = SKTexture(color1: UIColor.clearColor(), color2: color, from: from, to: to, frame: line.frame)
let square = SKSpriteNode(texture: tex)
The rest of the code is basically identical. This produces a similar effect with no performance/memory bugs. However, something odd happens with SKCropNode: the result has no antialiasing, and the line is thicker. I have tried changing the anti-aliasing, glow width, and line width; there is a minimum width that cannot be changed for some reason, and setting the glow width larger only exaggerates the artifact. According to other Stack Overflow questions, mask nodes are either 1 or 0 in alpha. This is confusing, since an SKShapeNode can have different line/glow widths.
Attempt 3. After some research, I discovered I might be able to use the clipping effect and preserve line width/glow using an SKEffectNode instead of SKCropNode.
//Not the exact code to what I tried, but very similar
let maskNode = SKEffectNode()
maskNode.filter = customLinearImageFilter
maskNode.addChild(line)
This produced literally the exact same effect as attempt 1: it created the same lines and animation, but also the same bugs with other nodes/fps/node count. So the bug seems to lie with SKEffectNode, not just SKShapeNode.
I do not know how to get around the bugs in attempts 1/3 or 2.
Does anybody know what I am doing wrong, whether there is a way around this, or a different solution altogether for my problem?
Edit: I considered emitters, but there could potentially be hundreds of comets/other nodes coming in within a few seconds, and I didn't think they would be feasible performance-wise. I have not used SpriteKit before this project, so correct me if I am wrong.
This looks like a problem for a custom shader attached to the comet path. If you are not familiar with it, SpriteKit's OpenGL Shading Language (GLSL) support lets you jump right into the GPU fragment shader, specifically to control the drawing behavior of the nodes a shader is attached to via SKShader.
Conveniently, SKShapeNode has a strokeShader property for hooking up an SKShader to draw the path. When connected to this property, the shader gets passed the length of the path and the point on the path currently being drawn, in addition to the color value at that point.*
controlFadePath.fsh
void main() {
    // v_color_mix, u_path_length, and v_path_distance are supplied by SpriteKit
    // for stroke shaders; u_start and u_end are the custom uniforms added below.
    vec4 inColor = v_color_mix;
    float length = u_path_length;
    float distance = v_path_distance;
    float start = u_start;
    float end = u_end;
    float mult;
    // smoothstep ramps alpha from 0.0 at u_end up to 1.0 at u_start...
    mult = smoothstep(end, start, distance / length);
    // ...and fragments past u_start along the path are discarded entirely.
    if (distance / length > start) { discard; }
    gl_FragColor = vec4(inColor.r, inColor.g, inColor.b, inColor.a) * mult;
}
To control the fade along the path, pass a start and an end point into the custom shader using two SKUniform objects named u_start and u_end. These get added to the custom shader during initialization of a custom SKShapeNode subclass, CometPathShape, and are animated via a custom action.
class CometPathShape: SKShapeNode {
    // custom shader for fading
    let pathShader: SKShader
    let fadeStartU = SKUniform(name: "u_start", float: 0.0)
    let fadeEndU = SKUniform(name: "u_end", float: 0.0)
    let fadeAction: SKAction

    override init() {
        pathShader = SKShader(fileNamed: "controlFadePath.fsh")
        let fadeDuration: NSTimeInterval = 1.52
        fadeAction = SKAction.customActionWithDuration(fadeDuration, actionBlock: {
            (node: SKNode, time: CGFloat) -> Void in
            // Slide the fade window (u_start...u_end) along the path over time
            let D = CGFloat(fadeDuration)
            let t = time / D
            var Ps: CGFloat = 0.0
            var Pe: CGFloat = 0.0
            Ps = 0.25 + (t * 1.55)
            Pe = (t * 1.5) - 0.25
            let comet: CometPathShape = node as! CometPathShape
            comet.fadeRange(Ps, to: Pe)
        })
        super.init()
        path = makeComet...(...) // custom method that creates the path for the comet shape
        strokeShader = pathShader
        pathShader.addUniform(fadeStartU)
        pathShader.addUniform(fadeEndU)
        hidden = true
        // set up the path shape, e.g. strokeColor, strokeWidth...
        ...
    }

    func fadeRange(from: CGFloat, to: CGFloat) {
        fadeStartU.floatValue = Float(from)
        fadeEndU.floatValue = Float(to)
    }

    func launch() {
        hidden = false
        runAction(fadeAction, completion: { () -> Void in self.hidden = true })
    }
    ...
...
The SKScene initializes the CometPathShape objects, caches and adds them to the scene. During update: the scene simply calls .launch() on the chosen CometPathShapes.
class GameScene: SKScene {
    ...
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        self.name = "theScene"
        ...
        // create a big bunch of paths with custom shaders
        print("making cache of path shape nodes")
        for i in 0...shapeCount {
            let shape = CometPathShape()
            let ext = String(i)
            shape.name = "comet_".stringByAppendingString(ext)
            comets.append(shape)
            shape.position.y = CGFloat(i * 3)
            print(shape.name)
            self.addChild(shape)
        }
    }

    override func update(currentTime: CFTimeInterval) {
        // pull from the cache and launch comets, skipping busy ones
        for _ in 1...launchCount {
            let shape = self.comets[Int(arc4random_uniform(UInt32(shapeCount)))]
            if shape.hasActions() { continue }
            shape.launch()
        }
    }
}
This cuts the number of SKNodes per comet from 3 to 1, simplifying your code and the runtime environment, and it opens the door to much more complex effects via the shader. The only drawback I can see is having to learn some GLSL.**
*Not always correctly in the device simulator: the simulator does not pass the distance and length values to the custom shader.
**That, and some idiosyncrasies in CGPath GLSL behavior: path construction affects the way the fade performs, and it looks like v_path_distance is not blended smoothly across curve segments. Still, with care in constructing the curve, this should work.

SpriteKit: How to blur the movement of a node like a shooting star

I want to blur the movement of fast-moving nodes, so that it looks like they fade away behind the movement. Is there an easy solution to achieve this?
It should somehow look like this movement:
Thanks in advance!
Update
I tried to call this every 0.0X seconds:
for child in self.allNodes {
    let node = SKSpriteNode(color: child.color, size: CGSizeMake(size, size))
    node.position = child.position
    self.addChild(node)
    node.runAction(SKAction.fadeOutWithDuration(0.2))
}
The problem is that this solution is too expensive. There should be a much easier way. :/
You can create a custom SpriteKit particle file with a fire effect, then tweak its settings in the SpriteKit Particle Emitter editor until it produces the effect you need, and create an SKEmitterNode from this file like this:
// Create path name to file with particle effect
let sparkEmitterPath = NSBundle.mainBundle().pathForResource("Spark", ofType: "sks")!
// Create SKEmitterNode from this file
let sparkEmiter = NSKeyedUnarchiver.unarchiveObjectWithFile(sparkEmitterPath) as! SKEmitterNode
// Configure it
sparkEmiter.position = point
sparkEmiter.name = "sparkEmitter"
sparkEmiter.zPosition = 0
sparkEmiter.targetNode = self
// Add to Scene
self.addChild(sparkEmiter)
In this example the particle file is named Spark.sks.
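On newer SDKs there is a shorter route to the same thing; a minimal sketch (assuming Spark.sks is in the main bundle and point is the spawn position):
if let sparkEmitter = SKEmitterNode(fileNamed: "Spark") {
    sparkEmitter.position = point
    // targetNode makes already-emitted particles stay behind in the scene,
    // which is what creates the fading trail as the emitter moves.
    sparkEmitter.targetNode = self
    addChild(sparkEmitter)
}
Either way, the targetNode assignment is the key to the shooting-star look: move the emitter and the particles it has already emitted are left behind, fading out according to the particle file's settings.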

SKEffectNode - CIFilter Blur Size Limit - Big Black Box

I am trying to blur multiple SKNode objects. I do this by having a parent SKEffectNode with a CIFilter set to @"CIGaussianBlur". Like so:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    blurNode.shouldRasterize = YES;
    [blurNode setShouldEnableEffects:NO];
    [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
                                   keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
This works fine for a bunch of nodes currently onscreen. But when I space these nodes far away from each other (about 3000 pixels), the blurring no longer happens and I get a big black box. This happens regardless of whether the SKNodes I'm blurring are SKShapeNodes or SKSpriteNodes. Here's a sample project with this issue: Sample Project. (By the way, thanks to BobMoff for the initial version found here.)
Here's happy blur (when nodes are less than 3000 pixels away from each other):
Sad blur (when nodes are more than 3000 pixels away from each other):
UPDATE
This behavior occurs whenever an SKEffectNode is the parent. It doesn't matter if it's enabling effects, blurring, etc. If the parent node is an SKNode, it's fine. i.e. Even if the parent blur node is created like it is below, you will get the blackness:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    // blurNode.shouldRasterize = YES;
    // [blurNode setShouldEnableEffects:NO];
    // [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
    //                                keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
I had a similar problem, with a very wide, panning scene that I wanted to blur.
To get the blur effect to work, I removed any nodes that were sticking out too far past the edges of the scene:
// Property declarations, elsewhere in the class:
var blurNode: SKEffectNode
var mainScene: SKScene
var exParents: [SKNode: SKNode] = [:]

/**
 * Remove outlying nodes from the scene and activate the SKEffectNode
 */
func blurScene() {
    let FILTER_MARGIN: CGFloat = 100
    let widthMax: CGFloat = mainScene.size.width + FILTER_MARGIN
    let heightMax: CGFloat = mainScene.size.height + FILTER_MARGIN

    // Recursively iterate through all blurNode's children
    blurNode.enumerateChildNodesWithName(".//*", usingBlock: {
        [unowned self]
        node, stop in
        if node.parent != nil && node.scene != nil { // Ignore nodes we already removed
            if let sprite = node as? SKSpriteNode {
                // Calculate sprite node position in scene coordinates
                let sceneOrig = sprite.scene!.convertPoint(sprite.position, fromNode: sprite.parent!)
                // Find left, right, bottom and top edges of sprite
                let l = sceneOrig.x - sprite.size.width * sprite.anchorPoint.x
                let r = l + sprite.size.width
                let b = sceneOrig.y - sprite.size.height * sprite.anchorPoint.y
                let t = b + sprite.size.height
                if l < -FILTER_MARGIN || r > widthMax || b < -FILTER_MARGIN || t > heightMax {
                    self.exParents[sprite] = sprite.parent!
                    sprite.removeFromParent()
                }
            }
        }
    })
    blurNode.shouldEnableEffects = true
}

/**
 * Disable blur and reparent nodes we removed earlier
 */
func removeBlur() {
    self.blurNode.shouldEnableEffects = false
    for (kid, parent) in exParents {
        parent.addChild(kid)
    }
    exParents = [:]
}
NOTES:
This does remove content from your effect node, so extremely wide nodes won't show up in the final result:
You can see the mountain highlighted in red stuck out too far and was removed from the resulting blur.
This code only considers SKSpriteNodes. Empty SKNodes don't seem to break the effect node, but if you're using other visible nodes like SKShapeNodes or SKLabelNodes, you'll have to modify this code to include them.
If you have ignoreSiblingOrder = false, this code might mess up your z-ordering since you can't guarantee what order the nodes are added back to the scene.
Stuff I tried that didn't work
Simply saying node.hidden = true instead of using removeFromParent() doesn't work. That would be WAY too easy ;)
Using an SKCropNode to crop out outlying content didn't work for me. I tried having the SKEffectNode parent the SKCropNode and the other way around, but the black square appeared no matter how small I made the cropped area. This might still be worth looking into if you're desperate for a cleaner solution.
As noted here, SKScenes are secretly SKEffectNodes and you can set their filter just like our blurNode above. SKScenes don't show a black screen when their content is too big. Unfortunately, they seem to just silently disable the filter instead. Again, I might have missed something, so you could explore this option further if you're trying to apply an effect across the entire scene.
Alternate Solutions
You can capture an image of the whole screen and apply a filter to that, as suggested here. I ended up going with an even simpler solution; I took a generic screenshot of the stuff I wanted to blur, then applied a very heavy blur so you can't see the precise details. I used that as the blurred background and you can hardly tell it's not the real thing ;) This also saves a healthy chunk of memory and avoids a small UI hiccup.
Musings
This is a pretty nasty bug, and I hope Apple comes up with a solution soon. You can use Xcode's GPU frame capture (the camera icon in the debug bar) to get a GPU trace and some insight into what's happening:
The device seems to be discarding the framebuffer for the effect node because it takes up too much memory. This is supported by the fact that when there's more memory pressure on the device, it's easier to get the 'black square' on smaller content in the SKEffectNode.
I used a method that worked for my game, but it requires the blurred area to be static, with no movement.
On iOS 10 with Swift 3, I used SKSpriteNode, SKView, SKEffectNode, and CIFilter. I created a sprite from a texture returned by the SKView method texture(from:), passing the current scene as the parameter, since it inherits from SKNode. So essentially I was taking a "screenshot" of the scene and creating a sprite from it. I then put it in an SKEffectNode with a blur filter (set shouldRasterize to true for better performance, as I only needed to blur once). Finally, I added the new sprite to the scene. From there you could add sprites to the scene and place them above the new blurred node.
let blurFilter = CIFilter(name: "CIGaussianBlur")!
let blurAmount = 15.0
blurFilter.setValue(blurAmount, forKey: kCIInputRadiusKey)
let blurEffect = SKEffectNode()
blurEffect.shouldRasterize = true
let screenshotNode = SKSpriteNode(texture: gameScene.view!.texture(from: gameScene))
blurEffect.addChild(screenshotNode)
blurEffect.filter = blurFilter
gameScene.addChild(blurEffect)
Possible workaround for the bug:
Use a camera, zoom way out so you can see most of your background, and take a screenshot-style rendering of this image. Crop it to your needs, then blur it and rasterize the result.
Then scale this image back up, slice it up if need be, and place it accordingly.
SKEffectNode renders into a texture. In most iOS systems the maximum size for a texture is 2048x2048. If an SKEffectNode is trying to render content larger than that, it will just use a 2048x2048 texture and anything outside of it will just not appear in the texture. It won't give you any error or warning about this happening; it simply does it silently.
And no, there is no way to tell SKEffectNode to use a texture of a specific size, and pan&clamp the content into it. It always uses a texture that will cover all the child nodes, and if the texture would be too large, it just silently uses that 2048x2048 texture.
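Given that behavior, one defensive check is to measure the effect node's accumulated content before enabling effects; a minimal sketch (the 2048-point limit is an assumption and varies by GPU):
// Enable the effect only when the content fits within one renderable texture.
let maxTextureSize: CGFloat = 2048
let contentRect = blurNode.calculateAccumulatedFrame()
blurNode.shouldEnableEffects =
    contentRect.width <= maxTextureSize && contentRect.height <= maxTextureSize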

SceneKit get texture coordinate after touch with Swift

I want to manipulate 2D textures in a 3D SceneKit scene.
Therefore I used this code to get local coordinates:
@IBAction func tap(sender: UITapGestureRecognizer) {
    var arr: NSArray = my3dView.hitTest(sender.locationInView(my3dView), options: NSDictionary(dictionary: [SCNHitTestFirstFoundOnlyKey: true]))
    var res: SCNHitTestResult = arr.firstObject as SCNHitTestResult
    var vect: SCNVector3 = res.localCoordinates
}
I read the texture out from my scene with:
var mat:SCNNode = myscene.rootNode.childNodes[0] as SCNNode
var child:SCNNode = mat.childNodeWithName("ID12", recursively: false)
var geo:SCNMaterial = child.geometry.firstMaterial
var channel = geo.diffuse.mappingChannel
var textureimg:UIImage = geo.diffuse.contents as UIImage
Now I want to draw at the touch point on the texture.
How can I do that? How can I transform my coordinates from the touch to the texture image?
Sounds like you have two problems. (Without even having used regular expressions. :))
First, you need to get the texture coordinates of the tapped point -- that is, the point in 2D texture space on the surface of the object. You've almost got that right already. SCNHitTestResult provides those with the textureCoordinatesWithMappingChannel method. (You're using localCoordinates, which gets you a point in the 3D space owned by the node in the hit-test result.) And you already seem to have found the business about mapping channels, so you know what to pass to that method.
Problem #2 is how to draw.
You're doing the right thing to get the material's contents as a UIImage. Once you've got that, you could look into drawing with UIGraphics and CGContext functions: create an image context with UIGraphicsBeginImageContext, draw the existing image into it, then draw whatever new content you want to add at the tapped point. After that, you can get the image you were drawing with UIGraphicsGetImageFromCurrentImageContext and set it as the new diffuse.contents of your material. However, that's probably not the best way: you're schlepping a bunch of image data around on the CPU, and the code is a bit unwieldy, too.
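For illustration, a minimal sketch of that CPU-side approach in modern Swift, assuming texcoord is the tapped point in 0...1 texture space and textureimg/geo are the image and material from the question:
UIGraphicsBeginImageContextWithOptions(textureimg.size, false, 0)
textureimg.draw(at: .zero)
// Draw a small dot at the tapped texture coordinate
let p = CGPoint(x: texcoord.x * textureimg.size.width,
                y: texcoord.y * textureimg.size.height)
UIColor.green.setFill()
UIBezierPath(ovalIn: CGRect(x: p.x - 5, y: p.y - 5, width: 10, height: 10)).fill()
geo.diffuse.contents = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Note that depending on the geometry's UV mapping you may need to flip the y coordinate.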
A better approach might be to take advantage of the integration between SceneKit and SpriteKit. This way, all your 2D drawing is happening in the same GPU context as the 3D drawing -- and the code's a bit simpler.
You can set your material's diffuse.contents to a SpriteKit scene. (To use the UIImage you currently have for that texture, just stick it on an SKSpriteNode that fills the scene.) Once you have the texture coordinates, you can add a sprite to the scene at that point.
var nodeToDrawOn: SCNNode!
var skScene: SKScene!

func mySetup() { // or viewDidLoad, or wherever you do setup
    // whatever else you're doing for setup, plus:

    // 1. remember which node we want to draw on
    nodeToDrawOn = myScene.rootNode.childNodeWithName("ID12", recursively: true)

    // 2. set up that node's texture as a SpriteKit scene
    let currentImage = nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents as UIImage
    skScene = SKScene(size: currentImage.size)
    nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents = skScene

    // 3. put the currentImage into a background sprite for the skScene
    let background = SKSpriteNode(texture: SKTexture(image: currentImage))
    background.position = CGPoint(x: skScene.frame.midX, y: skScene.frame.midY)
    skScene.addChild(background)
}

@IBAction func tap(sender: UITapGestureRecognizer) {
    let results = my3dView.hitTest(sender.locationInView(my3dView), options: [SCNHitTestFirstFoundOnlyKey: true]) as [SCNHitTestResult]
    if let result = results.first {
        if result.node === nodeToDrawOn {
            // 1. get the texture coordinates
            let channel = nodeToDrawOn.geometry!.firstMaterial!.diffuse.mappingChannel
            let texcoord = result.textureCoordinatesWithMappingChannel(channel)

            // 2. place a sprite there
            let sprite = SKSpriteNode(color: SKColor.greenColor(), size: CGSize(width: 10, height: 10))
            // scale coords: texcoords go 0.0-1.0, skScene space is in points
            sprite.position.x = texcoord.x * skScene.size.width
            sprite.position.y = texcoord.y * skScene.size.height
            skScene.addChild(sprite)
        }
    }
}
For more details on the SpriteKit approach (in Objective-C), see the SceneKit State of the Union Demo from WWDC14. That demo shows a SpriteKit scene used as the texture map for a torus, with spheres of paint getting thrown at it: whenever a sphere collides with the torus, it gets an SCNHitTestResult and uses its texture coordinates to create a paint splatter in the SpriteKit scene.
Finally, some Swift style comments on your code (unrelated to the question and answer):
Use let instead of var wherever you don't need to reassign a value, and the optimizer will make your code go faster.
Explicit type annotations (res: SCNHitTestResult) are rarely necessary.
Swift dictionaries are bridged to NSDictionary, so you can pass them directly to an API that takes NSDictionary.
Casting to a Swift typed array (hitTest(...) as [SCNHitTestResult]) saves you from having to cast the contents.
