Need serious help creating a comet with animations in SpriteKit - iOS

I'm currently working on a SpriteKit project and need to create a comet with a fading tail that animates across the screen. I am having serious issues with SpriteKit in this regard.
Attempt 1. It:
Draws a CGPath and creates an SKShapeNode from the path
Creates a square SKShapeNode with gradient
Creates an SKCropNode and assigns its maskNode as line, and adds square as a child
Animates the square across the screen, while being clipped by the line/SKCropNode
func makeCometInPosition(from: CGPoint, to: CGPoint, color: UIColor, timeInterval: NSTimeInterval) {
    ... (...s are (definitely) irrelevant lines of code)
    let path = CGPathCreateMutable()
    ...
    let line = SKShapeNode(path: path)
    line.lineWidth = 1.0
    line.glowWidth = 1.0

    var squareFrame = line.frame
    ...
    let square = SKShapeNode(rect: squareFrame)
    // Custom SKTexture extension. I've tried adding a normal image and the leak happens either way. The extension is not the problem.
    square.fillTexture = SKTexture(color1: UIColor.clearColor(), color2: color, from: from, to: to, frame: line.frame)
    square.fillColor = color
    square.strokeColor = UIColor.clearColor()
    square.zPosition = 1.0

    let maskNode = SKCropNode()
    maskNode.zPosition = 1.0
    maskNode.maskNode = line
    maskNode.addChild(square)

    // self is an SKScene, background is an SKSpriteNode
    self.background?.addChild(maskNode)

    let lineSequence = SKAction.sequence([SKAction.waitForDuration(timeInterval), SKAction.removeFromParent()])
    let squareSequence = SKAction.sequence([SKAction.waitForDuration(1), SKAction.moveBy(CoreGraphics.CGVectorMake(deltaX * 2, deltaY * 2), duration: timeInterval), SKAction.removeFromParent()])

    square.runAction(SKAction.repeatActionForever(squareSequence))
    maskNode.runAction(lineSequence)
    line.runAction(lineSequence)
}
This works, as shown below.
The problem is that after 20-40 other nodes come on the screen, weird things happen: some of the nodes on the screen disappear, some stay. Also, the FPS and node count displays (toggled on the SKView and never changed)
self.showsFPS = true
self.showsNodeCount = true
disappear from the screen. This makes me assume it's a bug with SpriteKit. SKShapeNode has been known to cause issues.
Attempt 2. I tried changing square from an SKShapeNode to an SKSpriteNode (adding and removing lines related to the two as necessary):
let tex = SKTexture(color1: UIColor.clearColor(), color2: color, from: from, to: to, frame: line.frame)
let square = SKSpriteNode(texture: tex)
The rest of the code is basically identical. This produces a similar effect with no performance/memory bugs. However, something odd happens with SKCropNode, and it looks like this:
It has no antialiasing, and the line is thicker. I have tried changing the anti-aliasing, glow width, and line width. There is a minimum width that cannot change for some reason, and setting the glow width larger does this. According to other Stack Overflow questions, mask nodes are either 1 or 0 in alpha. This is confusing, since the SKShapeNode itself can have different line/glow widths.
Attempt 3. After some research, I discovered I might be able to get the clipping effect while preserving line width/glow by using an SKEffectNode instead of an SKCropNode.
// Not the exact code I tried, but very similar
let maskNode = SKEffectNode()
maskNode.filter = customLinearImageFilter
maskNode.addChild(line)
This produced the (literally) exact same effect as attempt 1. It created the same lines and animation, but the same bugs with other nodes/fps/nodeCount occurred. So it seems to be a bug with SKEffectNode, and not SKShapeNode.
I do not know how to bypass the bugs with attempt 1/3 or 2.
Does anybody know if there is something I am doing wrong, if there is a way around this, or a different solution altogether for my problem?
Edit: I considered emitters, but there could potentially be hundreds of comets/other nodes coming in within a few seconds, and I didn't think they would be feasible performance-wise. I have not used SpriteKit before this project, so correct me if I am wrong.

This looks like a problem for a custom shader attached to the comet path. If you are not familiar with the OpenGL Shading Language (GLSL) in SpriteKit, it lets you jump right into the GPU fragment shader, specifically to control the drawing behavior of the nodes it is attached to via SKShader.
Conveniently, SKShapeNode has a strokeShader property for hooking up an SKShader to draw the path. When connected to this property, the shader gets passed the length of the path and the point on the path currently being drawn, in addition to the color value at that point.*
controlFadePath.fsh
void main() {
    // uniforms and varyings
    vec4 inColor = v_color_mix;
    float length = u_path_length;
    float distance = v_path_distance;
    float start = u_start;
    float end = u_end;
    float mult;
    mult = smoothstep(end, start, distance / length);
    if (distance / length > start) { discard; }
    gl_FragColor = vec4(inColor.r, inColor.g, inColor.b, inColor.a) * mult;
}
To control the fade along the path, pass a start and end point into the custom shader using two SKUniform objects named u_start and u_end. These are added to the custom shader during initialization of a custom SKShapeNode subclass, CometPathShape, and animated via a custom action.
class CometPathShape: SKShapeNode {
    // custom shader for fading
    let pathShader: SKShader
    let fadeStartU = SKUniform(name: "u_start", float: 0.0)
    let fadeEndU = SKUniform(name: "u_end", float: 0.0)
    let fadeAction: SKAction

    override init() {
        pathShader = SKShader(fileNamed: "controlFadePath.fsh")
        let fadeDuration: NSTimeInterval = 1.52
        fadeAction = SKAction.customActionWithDuration(fadeDuration, actionBlock: {
            (node: SKNode, time: CGFloat) -> Void in
            let D = CGFloat(fadeDuration)
            let t = time / D
            var Ps: CGFloat = 0.0
            var Pe: CGFloat = 0.0
            Ps = 0.25 + (t * 1.55)
            Pe = (t * 1.5) - 0.25
            let comet: CometPathShape = node as! CometPathShape
            comet.fadeRange(Ps, to: Pe)
        })
        super.init()
        path = makeComet...(...) // custom method that creates the path for the comet shape
        strokeShader = pathShader
        pathShader.addUniform(fadeStartU)
        pathShader.addUniform(fadeEndU)
        hidden = true
        // set up the path shape, e.g. strokeColor, strokeWidth...
        ...
    }

    func fadeRange(from: CGFloat, to: CGFloat) {
        fadeStartU.floatValue = Float(from)
        fadeEndU.floatValue = Float(to)
    }

    func launch() {
        hidden = false
        runAction(fadeAction, completion: { () -> Void in self.hidden = true })
    }
    ...
}
The SKScene initializes the CometPathShape objects, caches them, and adds them to the scene. During update: the scene simply calls launch() on the chosen CometPathShapes.
class GameScene: SKScene {
    ...
    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        self.name = "theScene"
        ...
        // create a big bunch of paths with custom shaders
        print("making cache of path shape nodes")
        for i in 0...shapeCount {
            let shape = CometPathShape()
            let ext = String(i)
            shape.name = "comet_".stringByAppendingString(ext)
            comets.append(shape)
            shape.position.y = CGFloat(i * 3)
            print(shape.name)
            self.addChild(shape)
        }
    }

    override func update(currentTime: CFTimeInterval) {
        // pull from the cache and launch comets, skipping busy ones
        for _ in 1...launchCount {
            let shape = self.comets[Int(arc4random_uniform(UInt32(shapeCount)))]
            if shape.hasActions() { continue }
            shape.launch()
        }
    }
}
This cuts the number of SKNodes per comet from 3 to 1, simplifying your code and the runtime environment, and it opens the door for much more complex effects via the shader. The only drawback I can see is having to learn some GLSL.**
*Not always correctly in the device simulator: the Simulator does not pass the distance and length values to the custom shader.
**That, and some idiosyncrasies in CGPath/GLSL behavior: path construction affects the way the fade performs. It looks like v_path_distance is not blended smoothly across curve segments. Still, with care in constructing the curve, this should work.

Related

SKEffectNode to an SKTexture?

SKEffectNode has a shouldRasterize "switch" that bakes it into a bitmap, and doesn't update it until the underlying nodes impacted by the effect are changed.
However, I can't find a way to create an SKTexture from this rasterised "image".
Is it possible to get an SKTexture from an SKEffectNode?
I think you could try code like this (it's just an example):
if let effect = SKEffectNode.init(fileNamed: "myeffect") {
    effect.shouldRasterize = true
    self.addChild(effect)
    ...
    let texture = SKView().texture(from: self)
}
Update:
After your answer, I hope I have understood better what you want to achieve.
This is my point of view: if you want to make a shadow of a texture, you could simply create an SKSpriteNode with that texture:
let shadow = SKSpriteNode.init(texture: <yourTexture>)
shadow.blendMode = SKBlendMode.alpha
shadow.colorBlendFactor = 1
shadow.color = SKColor.black
shadow.alpha = 0.25
What I want to say is that you could proceed step by step:
get your texture
elaborate your texture (add filters, make some other effect...)
get the shadow
Working this way produces a series of useful methods you could use in your project to build other kinds of elements, as in the sketch below.
Maybe, by separating the tasks, you don't need to use texture(from:) at all.
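For illustration, a minimal sketch of that step-by-step split. The method names and the CISepiaTone filter are placeholders of my own, not anything from the original answer:
import SpriteKit

// step 1: get your texture
func baseTexture(named name: String) -> SKTexture {
    return SKTexture(imageNamed: name)
}

// step 2: elaborate the texture (any CIFilter works here; sepia is just an example)
func elaborate(_ texture: SKTexture) -> SKTexture {
    guard let filter = CIFilter(name: "CISepiaTone") else { return texture }
    return texture.applying(filter) // no SKView or texture(from:) needed for this step
}

// step 3: derive the shadow sprite, exactly as in the snippet above
func shadow(from texture: SKTexture) -> SKSpriteNode {
    let shadow = SKSpriteNode(texture: texture)
    shadow.blendMode = SKBlendMode.alpha
    shadow.colorBlendFactor = 1
    shadow.color = SKColor.black
    shadow.alpha = 0.25
    return shadow
}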
I've figured this out, in a way that solves my problems, using a Factory.
Read more on how to make a factory, from BenMobile's patient and clear articulation, here: Factory creation and use for making Sprites and Shapes
There's an issue with blurring an SKTexture or SKSpriteNode: the blur/glow extends beyond the edges of the sprite, so it runs out of space. To solve this, in the code below you'll see I've created a "framer" object. This is simply an empty SKSpriteNode that's double the size of the texture to be blurred, and the texture to be blurred is added to it as a child.
It works, regardless of how hacky this is ;)
Inside a static factory class file:
import SpriteKit

class Factory {
    private static let view: SKView = SKView() // the magic. This is the rendering space

    static func makeShadow(from source: SKTexture, rgb: SKColor, a: CGFloat) -> SKSpriteNode {
        let shadowNode = SKSpriteNode(texture: source)
        shadowNode.colorBlendFactor = 0.5 // near 1 makes the following line more effective
        shadowNode.color = SKColor.gray // makes for a darker shadow. White for a "glow" shadow

        let textureSize = source.size()
        let doubleTextureSize = CGSize(width: textureSize.width * 2, height: textureSize.height * 2)
        let framer = SKSpriteNode(color: UIColor.clear, size: doubleTextureSize)
        framer.addChild(shadowNode)

        let blurAmount = 10
        let filter = CIFilter(name: "CIGaussianBlur")
        filter?.setValue(blurAmount, forKey: kCIInputRadiusKey)

        let fxNode = SKEffectNode()
        fxNode.filter = filter
        fxNode.blendMode = .alpha
        fxNode.addChild(framer)
        fxNode.shouldRasterize = true

        let tex = view.texture(from: fxNode) // 'view' refers to the magic first line
        let shadow = SKSpriteNode(texture: tex) // WHOOPEE!!! TEXTURE!!!
        shadow.colorBlendFactor = 0.5
        shadow.color = rgb
        shadow.alpha = a
        shadow.zPosition = -1
        return shadow
    }
}
Inside anywhere you can access the Sprite you want to make a shadow or glow texture for:
shadowSprite = Factory.makeShadow(from: button, rgb: myColor, a: 0.33)
shadowSprite.position = CGPoint(x: self.frame.midX, y: self.frame.midY - 5)
addChild(shadowSprite)
button is a texture of the button to be given a shadow. a: is an alpha setting (a transparency level from 0.0 to 1.0, where 1.0 is fully opaque); the lower this is, the lighter the shadow will be.
The positioning drops the shadow slightly below the button, so it looks like light is coming from the top, casting the shadow down and onto the background.

Custom Particle System for iOS

I want to create a particle system on iOS using sprite kit where I define the colour of each individual particle. As far as I can tell this isn't possible with the existing SKEmitterNode.
It seems that the best I can do is specify general behaviour. Is there any way I can specify the starting colour and position of each particle?
This can give you a basic idea of what I meant in my comments. But keep in mind that it is untested, and I am not sure how it will behave if frame rate drops occur.
This example creates 5 particles per second, adding them sequentially (in a counterclockwise direction) along the perimeter of a given circle. Each particle will have a different predefined color. You can play with the Settings struct properties to change the particle spawning speed or to increase or decrease the number of particles to emit.
Pretty much everything is commented, so I guess you will be fine:
Swift 2
import SpriteKit

struct Settings {
    static var numberOfParticles = 30
    static var particleBirthRate: CGFloat = 5 // 5 particles per second; 0.2 means one particle every 5 seconds, etc.
}

class GameScene: SKScene {

    var positions = [CGPoint]()
    var colors = [SKColor]()
    var emitterNode: SKEmitterNode?
    var currentPosition = 0

    override func didMoveToView(view: SKView) {
        backgroundColor = .blackColor()
        emitterNode = SKEmitterNode(fileNamed: "rain.sks")
        if let emitter = emitterNode {
            emitter.position = CGPoint(x: CGRectGetMidX(frame), y: CGRectGetMidY(frame))
            emitter.particleBirthRate = Settings.particleBirthRate
            addChild(emitter)

            let radius = 50.0
            let center = CGPointZero

            for var i = 0; i <= Settings.numberOfParticles; i++ {
                // Randomize color
                colors.append(SKColor(red: 0.78, green: CGFloat(i * 8) / 255.0, blue: 0.38, alpha: 1))
                // Create some points on the perimeter of a given circle (radius = 50)
                let angle = Double(i) * 2.0 * M_PI / Double(Settings.numberOfParticles)
                let x = radius * cos(angle)
                let y = radius * sin(angle)
                let currentParticlePosition = CGPointMake(CGFloat(x) + center.x, CGFloat(y) + center.y)
                positions.append(currentParticlePosition)

                if i == 1 {
                    /*
                    Set the start position for the first particle.
                    particlePosition is the starting position for each particle in the emitter's coordinate space. Defaults to (0.0, 0.0).
                    */
                    emitter.particlePosition = positions[0]
                    emitter.particleColor = colors[0]
                    self.currentPosition++
                }
            }

            // Added just for debugging purposes to show positions for every particle.
            for particlePosition in positions {
                let sprite = SKSpriteNode(color: SKColor.orangeColor(), size: CGSize(width: 1, height: 1))
                sprite.position = convertPoint(particlePosition, fromNode: emitter)
                sprite.zPosition = 2
                addChild(sprite)
            }

            let block = SKAction.runBlock({
                // Prevent strong reference cycles.
                [unowned self] in
                if self.currentPosition < self.positions.count {
                    // Set the color for the next particle
                    emitter.particleColor = self.colors[self.currentPosition]
                    // Set the position for the next particle. Keep in mind that particlePosition is a point in the emitter's coordinate space.
                    emitter.particlePosition = self.positions[self.currentPosition++]
                } else {
                    // Stop the action
                    self.removeActionForKey("emitting")
                    emitter.particleBirthRate = 0
                }
            })

            // particleBirthRate is the rate at which new particles are generated, in particles per second. Defaults to 0.0.
            let rate = NSTimeInterval(CGFloat(1.0) / Settings.particleBirthRate)
            let sequence = SKAction.sequence([SKAction.waitForDuration(rate), block])
            let repeatAction = SKAction.repeatActionForever(sequence)
            runAction(repeatAction, withKey: "emitting")
        }
    }
}
Swift 3.1
import SpriteKit

struct Settings {
    static var numberOfParticles = 30
    static var particleBirthRate: CGFloat = 5 // 5 particles per second; 0.2 means one particle every 5 seconds, etc.
}

class GameScene: SKScene {

    var positions = [CGPoint]()
    var colors = [SKColor]()
    var emitterNode: SKEmitterNode?
    var currentPosition = 0

    override func didMove(to view: SKView) {
        backgroundColor = SKColor.black
        emitterNode = SKEmitterNode(fileNamed: "rain.sks")
        if let emitter = emitterNode {
            emitter.position = CGPoint(x: frame.midX, y: frame.midY)
            emitter.particleBirthRate = Settings.particleBirthRate
            addChild(emitter)

            let radius = 50.0
            let center = CGPoint.zero

            for i in 0...Settings.numberOfParticles {
                // Randomize color
                colors.append(SKColor(red: 0.78, green: CGFloat(i * 8) / 255.0, blue: 0.38, alpha: 1))
                // Create some points on the perimeter of a given circle (radius = 50)
                let angle = Double(i) * 2.0 * Double.pi / Double(Settings.numberOfParticles)
                let x = radius * cos(angle)
                let y = radius * sin(angle)
                let currentParticlePosition = CGPoint(x: CGFloat(x) + center.x, y: CGFloat(y) + center.y)
                positions.append(currentParticlePosition)

                if i == 1 {
                    /*
                    Set the start position for the first particle.
                    particlePosition is the starting position for each particle in the emitter's coordinate space. Defaults to (0.0, 0.0).
                    */
                    emitter.particlePosition = positions[0]
                    emitter.particleColor = colors[0]
                    self.currentPosition += 1
                }
            }

            // Added just for debugging purposes to show positions for every particle.
            for particlePosition in positions {
                let sprite = SKSpriteNode(color: SKColor.orange, size: CGSize(width: 1, height: 1))
                sprite.position = convert(particlePosition, from: emitter)
                sprite.zPosition = 2
                addChild(sprite)
            }

            let block = SKAction.run({
                // Prevent strong reference cycles.
                [unowned self] in
                if self.currentPosition < self.positions.count {
                    // Set the color for the next particle
                    emitter.particleColor = self.colors[self.currentPosition]
                    // Set the position for the next particle. Keep in mind that particlePosition is a point in the emitter's coordinate space.
                    emitter.particlePosition = self.positions[self.currentPosition]
                    self.currentPosition += 1
                } else {
                    // Stop the action
                    self.removeAction(forKey: "emitting")
                    emitter.particleBirthRate = 0
                }
            })

            // particleBirthRate is the rate at which new particles are generated, in particles per second. Defaults to 0.0.
            let rate = TimeInterval(CGFloat(1.0) / Settings.particleBirthRate)
            let sequence = SKAction.sequence([SKAction.wait(forDuration: rate), block])
            let repeatAction = SKAction.repeatForever(sequence)
            run(repeatAction, withKey: "emitting")
        }
    }
}
Orange dots are added just for debugging purposes and you can remove that part if you like.
Personally, I would say that you are overthinking this, but I might be wrong because there is no clear description of what you are trying to make and how you want to use it. Keep in mind that SpriteKit can render a bunch of sprites in a single draw call in a very performant way. The same goes for SKEmitterNode if used sparingly. Also, don't underestimate SKEmitterNode... It is actually very configurable.
Here is the setup in the Particle Emitter Editor:
Anyway, here is the final result:
Note that the node count comes from the orange SKSpriteNodes used for debugging. If you remove them, you will see that there is only one node added to the scene (the emitter node).
What you want is completely possible, probably even in real time. Unfortunately, doing such a thing the way you describe, with a particle for each pixel, would be best done with a pixel shader. I don't know of a clean method that would allow you to draw on top of the scene with a pixel shader; otherwise, all you would need is a shader that takes the pixels and moves them out from the center. I personally wouldn't try to do this unless I built the game with my own custom engine in place of SpriteKit.
That being said, I'm not sure a pixel-by-pixel diffusion is the best thing in most cases, especially if you have cartoony art. Many popular games will actually make sprites for fragments of the object they expect to shatter. So if it's an airplane, you might have a sprite for the wings, perhaps even with wires hanging out of it. Then, when it is time to shatter the plane, remove it from the scene and replace the area with the pieces arranged in the same shape as the plane... sorta like a puzzle. This will likely take some tweaking. Then you can add SKPhysicsBody objects to all of these pieces and have a force push them out in all directions, as in the sketch below. This doesn't mean that each pixel gets a node; I would suggest creatively breaking the object into under 10 pieces.
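A minimal sketch of that idea (the texture names, piece count, and impulse values are placeholders of my own, not from the answer):
// Replace the intact plane with pre-authored fragment sprites and blow them apart.
// Assumes textures named "plane_piece0"..."plane_piece5" were made in advance.
func shatter(plane: SKSpriteNode, in scene: SKScene) {
    let pieceCount = 6
    for i in 0..<pieceCount {
        let piece = SKSpriteNode(imageNamed: "plane_piece\(i)")
        piece.position = plane.position
        piece.physicsBody = SKPhysicsBody(rectangleOf: piece.size)
        // push each piece outward in a different direction
        let angle = CGFloat(i) / CGFloat(pieceCount) * 2 * CGFloat.pi
        piece.physicsBody?.applyImpulse(CGVector(dx: cos(angle) * 20, dy: sin(angle) * 20))
        scene.addChild(piece)
        // clean the debris up after a moment
        piece.run(SKAction.sequence([
            SKAction.wait(forDuration: 1.5),
            SKAction.fadeOut(withDuration: 0.5),
            SKAction.removeFromParent()
        ]))
    }
    plane.removeFromParent()
}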
And as whirlwind said, you could always get things looking "like" they actually disintegrated by using an emitter node. Just make the spawn area bigger and try to emulate the color as much as possible. To make the ship disappear, you could do a fade, perhaps? Or maybe an explosion sprite over it? Often with real-time special effects and physics, or with VFX, it is more about making it look like reality than actually simulating reality. Sometimes you have to use trickery to get things to look good and run in real time.
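A rough sketch of that emitter trick, assuming a pre-made "spark.sks" emitter file (the file name and numbers are placeholders):
// Spawn particles across the ship's whole area, then fade the ship out.
func disintegrate(ship: SKSpriteNode, in scene: SKScene) {
    if let emitter = SKEmitterNode(fileNamed: "spark.sks") {
        emitter.position = ship.position
        // widen the spawn area so particles come from the whole ship, not a single point
        emitter.particlePositionRange = CGVector(dx: ship.size.width, dy: ship.size.height)
        emitter.particleColor = ship.color // emulate the ship's color
        scene.addChild(emitter)
    }
    ship.run(SKAction.sequence([
        SKAction.fadeOut(withDuration: 0.3),
        SKAction.removeFromParent()
    ]))
}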
If you want to see how this might look, I would recommend looking at games like Jetpack Joyride.
Good luck!

Applying a custom SKShader to SKScene that pixelates the whole rendered scene in iOS 8 SpriteKit with Swift

I'm trying to create a full-screen pixelation effect on SKScene. I've learned that there should be two options to do this:
Using a custom SKShader using GLES 2.0.
Using Core Image filters.
I've tried to add a custom SKShader that modifies the whole screen by pixelating it. I'm not sure it's possible, but the documentation for SKScene (which is a subclass of SKEffectNode) suggests it:
An SKEffectNode object renders its children into a buffer and
optionally applies a Core Image filter to this rendered output.
It's possible to assign an SKShader to the SKScene, as in GameScene: SKScene:
override func didMoveToView(view: SKView) {
    let shader = SKShader(fileNamed: "pixelation.fsh")
    self.shader = shader
    self.shouldEnableEffects = true
}
... but it seems that the rendered buffer is not passed as u_texture to the shader:
void main()
{
    vec2 coord = v_tex_coord;
    coord.x = floor(coord.x * 10.0) / 10.0;
    coord.y = floor(coord.y * 10.0) / 10.0;
    vec4 texture = texture2D(u_texture, coord);
    gl_FragColor = texture;
}
... so the previous shader doesn't work.
If I assign that shader to a texture-based SKSpriteNode, it works.
So is it possible to modify the whole frame buffer (and for example pixelate it) as a post-processing measure after all the nodes have been rendered?
Edit: I found a way to do the pixelation using Core Image filters on OS X (How do you add a CIPixellate Core Image Filter to a Sprite Kit scene?), but copying that implementation doesn't yield any results on iOS. According to the documentation, CIPixellate should be "Available in OS X v10.4 and later and in iOS 6.0 and later."
I managed to make it work using the Core Image filter CIPixellate. I used it as a filter on an SKEffectNode to produce the pixelation effect. A couple of things to note:
SKScene is a subclass of SKEffectNode, but applying the filter to the SKScene doesn't work. It'll mess up the background and doesn't do any pixellation.
You need to create an SKEffectNode and add the nodes to be pixelated under it.
Here's the solution based on the code generated when you choose a Game type project with Swift:
import SpriteKit

class GameScene: SKScene {
    var effectNode: SKEffectNode = SKEffectNode()

    override func didMoveToView(view: SKView) {
        let filter = CIFilter(name: "CIPixellate")
        filter.setDefaults()
        filter.setValue(5.0, forKey: "inputScale")
        self.effectNode.filter = filter
        self.effectNode.shouldEnableEffects = true
        self.addChild(effectNode)
    }

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
        for touch: AnyObject in touches {
            let location = touch.locationInNode(self)
            let sprite = SKSpriteNode(imageNamed: "Spaceship")
            sprite.xScale = 0.5
            sprite.yScale = 0.5
            sprite.position = location
            let action = SKAction.rotateByAngle(CGFloat(M_PI), duration: 1)
            sprite.runAction(SKAction.repeatActionForever(action))
            self.effectNode.addChild(sprite)
        }
    }

    override func update(currentTime: CFTimeInterval) {
        /* Called before each frame is rendered */
    }
}
In order to get your shader running on the SKScene, you need to set shouldEnableEffects to true on the scene (the same goes for SKEffectNode).
While technically that "works" (the shader is applied), there's a bug in the rendering of the scene afterwards: it gets slightly resized.
So using Core Image filters is, so far, the best way to go.
I actually had to do the exact same thing for a recent project, as a way to transition between levels, and ended up with a workaround. Basically, I took a screenshot of the screen in code; then, when I loaded the next level, I called up a previously saved screenshot of how that level should look when loaded. I added the previous level's screenshot as an SKSpriteNode and ran the shader repeatedly until it was incredibly pixelated; then I did the same with the screenshot for the new level and swapped the two, un-pixelating the second screenshot. The result: as soon as the level was beaten, everything pixelated itself, then un-pixelated itself to reveal the new level.
UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, false, 0)
self.view!.drawViewHierarchyInRect(view!.bounds, afterScreenUpdates: true)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

protoImage = SKSpriteNode(texture: SKTexture(CGImage: image.CGImage!))
protoImage.size = CGSizeMake(self.frame.size.width, self.frame.size.height)
protoImage.position = CGPointMake(self.frame.size.width / 2, self.frame.size.height / 2)
protoImage.zPosition = 9000
Second Scene
let shader: SKShader = SKShader(fileNamed: "RWTGradient2.fsh")
let ratioX: Float = divisor / Float(protoImage.frame.size.width)
let ratioY: Float = divisor / Float(protoImage.frame.size.height)
shader.uniforms = [
    SKUniform(name: "ratioX", float: ratioX),
    SKUniform(name: "ratioY", float: ratioY),
]
protoImage.shader = shader
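The answer doesn't show RWTGradient2.fsh. Purely as a guess at its shape, here is a pixelation fragment shader that consumes those two uniforms, built from Swift source so the uniforms travel with it (my reconstruction, not the author's file):
// Hypothetical stand-in for RWTGradient2.fsh: snap texture coordinates to a
// grid whose cell size is ratioX x ratioY (both in 0.0-1.0 texture space).
let pixelateSource =
    "void main() {" +
    "    vec2 coord = v_tex_coord;" +
    "    coord.x = floor(coord.x / ratioX) * ratioX;" +
    "    coord.y = floor(coord.y / ratioY) * ratioY;" +
    "    gl_FragColor = texture2D(u_texture, coord);" +
    "}"
let pixelateShader = SKShader(source: pixelateSource, uniforms: [
    SKUniform(name: "ratioX", float: ratioX),
    SKUniform(name: "ratioY", float: ratioY),
])
protoImage.shader = pixelateShader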

Fading a shadow together with the SKSpriteNode that casts it

Here's my setup, using Sprite Kit. First, I create a simple sprite node within an SKScene, like so:
let block = SKSpriteNode(color: UIColor.redColor(), size: CGSizeMake(90, 160))
block.zPosition = 2
block.shadowCastBitMask = 1
addChild(block)
Then add a light node to the scene:
let light = SKLightNode()
light.categoryBitMask = 1
light.falloff = 1
addChild(light)
Sure enough, the block now casts a nice little shadow:
Now I fade the block by manipulating its alpha value, for example by running an action:
let fadeOut = SKAction.fadeAlphaTo(0.0, duration: 5.0)
block.runAction(fadeOut)
Here's the awkward situation: while the block becomes more and more translucent, the shadow stays exactly the same. This is how it looks just a moment before the end of the action:
And once the alpha drops to 0.0 entirely, the shadow suddenly disappears, from one frame to the next.
It would be much nicer, however, to have the shadow slowly become weaker and weaker, as the object casting it becomes more and more transparent.
Question:
Is an effect like this possible with Sprite Kit? If so, how would you go about it?
This is a little tricky because the shadow cast by an SKLightNode isn't affected by the node's alpha property. What you need to do is fade out the alpha channel of the shadowColor property of the SKLightNode at the same time you're fading out your block.
The basic steps are:
Store the light's shadowColor and that color's alpha channel for reference.
Create an SKAction.customActionWithDuration which:
Re-calculates the value for the alpha channel based on the original and how much time has passed so far in the action.
Sets the light's shadowColor to its original color, but with the new alpha channel.
Run the block's fade action and the shadow's fade action in parallel.
Example:
let fadeDuration = 5.0 // We're going to use this a lot

// Grab the light's original shadowColor so we can use it later
let shadowColor = light.shadowColor
// Also grab its alpha channel so we don't have to do it each time
let shadowAlpha = CGColorGetAlpha(shadowColor.CGColor)

let fadeShadow = SKAction.customActionWithDuration(fadeDuration) {
    // The first parameter here is the node this is running on.
    // Ideally you'd use that to get the light, but I'm taking
    // a shortcut and accessing it directly.
    (_, time) -> Void in
    // This is the original alpha channel of the shadow, adjusted
    // for how much time has passed while running the action so far.
    // It will go from shadowAlpha to 0.0 over fadeDuration.
    let alpha = shadowAlpha - (shadowAlpha * time / CGFloat(fadeDuration))
    // Set the light's shadowColor to the original color, but replace
    // its alpha channel with our newly calculated one
    light.shadowColor = shadowColor.colorWithAlphaComponent(alpha)
}

// Make the action to fade the block too; easy!
let fadeBlock = SKAction.fadeAlphaTo(0.0, duration: fadeDuration)

// Run the fadeBlock action and fadeShadow action in parallel
block.runAction(SKAction.group([fadeBlock, fadeShadow]))
The following is one way to ensure that the shadow and the block fade in and out together. To use this approach, you will need to declare light and block as properties of the class.
override func didEvaluateActions() {
    light.shadowColor = light.shadowColor.colorWithAlphaComponent(block.alpha / 2.0)
}
EDIT: Here's how to implement the above.
class GameScene: SKScene {
    let light = SKLightNode()
    let block = SKSpriteNode(color: UIColor.redColor(), size: CGSizeMake(90, 160))

    override func didMoveToView(view: SKView) {
        /* Setup your scene here */
        block.zPosition = 2
        block.shadowCastBitMask = 1
        block.position = CGPointMake(100, 100)
        addChild(block)

        light.categoryBitMask = 1
        light.falloff = 1
        addChild(light)

        let fadeOut = SKAction.fadeAlphaTo(0.0, duration: 5.0)
        let fadeIn = SKAction.fadeAlphaTo(1.0, duration: 5.0)
        block.runAction(SKAction.sequence([fadeOut, fadeIn, fadeOut]))
    }

    override func didEvaluateActions() {
        light.shadowColor = light.shadowColor.colorWithAlphaComponent(block.alpha / 2.0)
    }
}

SceneKit get texture coordinate after touch with Swift

I want to manipulate 2D textures in a 3D SceneKit scene.
Therefore I used this code to get local coordinates:
@IBAction func tap(sender: UITapGestureRecognizer) {
    var arr: NSArray = my3dView.hitTest(sender.locationInView(my3dView), options: NSDictionary(dictionary: [SCNHitTestFirstFoundOnlyKey: true]))
    var res: SCNHitTestResult = arr.firstObject as SCNHitTestResult
    var vect: SCNVector3 = res.localCoordinates
}
I have read the texture out from my scene with:
var mat:SCNNode = myscene.rootNode.childNodes[0] as SCNNode
var child:SCNNode = mat.childNodeWithName("ID12", recursively: false)
var geo:SCNMaterial = child.geometry.firstMaterial
var channel = geo.diffuse.mappingChannel
var textureimg:UIImage = geo.diffuse.contents as UIImage
and now I want to draw at the touch point on the texture...
How can I do that? How can I transform my coordinates from the touch to the texture image?
Sounds like you have two problems. (Without even having used regular expressions. :))
First, you need to get the texture coordinates of the tapped point -- that is, the point in 2D texture space on the surface of the object. You've almost got that right already. SCNHitTestResult provides those with the textureCoordinatesWithMappingChannel method. (You're using localCoordinates, which gets you a point in the 3D space owned by the node in the hit-test result.) And you already seem to have found the business about mapping channels, so you know what to pass to that method.
Problem #2 is how to draw.
You're doing the right thing to get the material's contents as a UIImage. Once you've got that, you could look into drawing with UIGraphics and CGContext functions: create an image context with UIGraphicsBeginImageContext, draw the existing image into it, then draw whatever new content you want to add at the tapped point. After that, you can get the image you were drawing with UIGraphicsGetImageFromCurrentImageContext and set it as the new diffuse.contents of your material. However, that's probably not the best way: you're schlepping a bunch of image data around on the CPU, and the code is a bit unwieldy, too. A rough sketch of that CPU-side approach follows.
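For illustration only, a sketch of that CGContext route. It assumes material is your SCNMaterial and texcoord is the tapped texture coordinate from the hit test (both placeholders), and it ignores any flipping between texture space and UIKit's coordinate space:
// Draw a dot onto a copy of the material's image, then swap the image back in.
let image = material.diffuse.contents as UIImage
UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
image.drawInRect(CGRect(origin: CGPointZero, size: image.size))
// convert texture coordinates (0.0-1.0) into pixel coordinates
let point = CGPoint(x: texcoord.x * image.size.width,
                    y: texcoord.y * image.size.height)
let context = UIGraphicsGetCurrentContext()
CGContextSetFillColorWithColor(context, UIColor.greenColor().CGColor)
CGContextFillEllipseInRect(context, CGRect(x: point.x - 5, y: point.y - 5, width: 10, height: 10))
material.diffuse.contents = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()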
A better approach might be to take advantage of the integration between SceneKit and SpriteKit. This way, all your 2D drawing is happening in the same GPU context as the 3D drawing -- and the code's a bit simpler.
You can set your material's diffuse.contents to a SpriteKit scene. (To use the UIImage you currently have for that texture, just stick it on an SKSpriteNode that fills the scene.) Once you have the texture coordinates, you can add a sprite to the scene at that point.
var nodeToDrawOn: SCNNode!
var skScene: SKScene!

func mySetup() { // or viewDidLoad, or wherever you do setup
    // whatever else you're doing for setup, plus:

    // 1. remember which node we want to draw on
    nodeToDrawOn = myScene.rootNode.childNodeWithName("ID12", recursively: true)

    // 2. set up that node's texture as a SpriteKit scene
    let currentImage = nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents as UIImage
    skScene = SKScene(size: currentImage.size)
    nodeToDrawOn.geometry!.firstMaterial!.diffuse.contents = skScene

    // 3. put the currentImage into a background sprite for the skScene
    let background = SKSpriteNode(texture: SKTexture(image: currentImage))
    background.position = CGPoint(x: skScene.frame.midX, y: skScene.frame.midY)
    skScene.addChild(background)
}
@IBAction func tap(sender: UITapGestureRecognizer) {
    let results = my3dView.hitTest(sender.locationInView(my3dView), options: [SCNHitTestFirstFoundOnlyKey: true]) as [SCNHitTestResult]
    if let result = results.first {
        if result.node === nodeToDrawOn {
            // 1. get the texture coordinates
            let channel = nodeToDrawOn.geometry!.firstMaterial!.diffuse.mappingChannel
            let texcoord = result.textureCoordinatesWithMappingChannel(channel)

            // 2. place a sprite there
            let sprite = SKSpriteNode(color: SKColor.greenColor(), size: CGSize(width: 10, height: 10))
            // scale coords: texcoords go 0.0-1.0, skScene space is in pixels
            sprite.position.x = texcoord.x * skScene.size.width
            sprite.position.y = texcoord.y * skScene.size.height
            skScene.addChild(sprite)
        }
    }
}
For more details on the SpriteKit approach (in Objective-C), see the SceneKit State of the Union Demo from WWDC14. That shows a SpriteKit scene used as the texture map for a torus, with spheres of paint getting thrown at it; whenever a sphere collides with the torus, it gets an SCNHitTestResult and uses its texcoords to create a paint splatter in the SpriteKit scene.
Finally, some Swift style comments on your code (unrelated to the question and answer):
Use let instead of var wherever you don't need to reassign a value, and the optimizer will make your code go faster.
Explicit type annotations (res: SCNHitTestResult) are rarely necessary.
Swift dictionaries are bridged to NSDictionary, so you can pass them directly to an API that takes NSDictionary.
Casting to a Swift typed array (hitTest(...) as [SCNHitTestResult]) saves you from having to cast the contents.
