Background for different screen sizes - iOS

I am quite new to SpriteKit development and I am trying to develop my first game.
I have implemented some edges (to avoid the ball going out of the screen) using the visual Scene editor (GameScene.sks) and I had to specify the size of the Scene (640x960).
Now, in code, I would like to change the background according to the device width/height (because I cannot limit myself to 640x960).
So, I need to use:
scene.scaleMode = SKSceneScaleMode.Fill
in order to stretch the edges of the scene for different devices.
BUT
I would like the backgrounds to stay at scale 1 (not stretched).
Is that possible at all?
This is the code I am using to set up a different background for each device:
if skView.bounds.width == 768.0 {
    let backgroundTexture = SKTexture(imageNamed: "bg_768.jpg")
    let background = SKSpriteNode(texture: backgroundTexture)
    background.size.width = skView.bounds.width
    background.size.height = skView.bounds.height
    background.position = CGPointMake(CGRectGetMidX(skView.frame), CGRectGetMidY(skView.frame))
    scene.addChild(background)
} else {
    // set other backgrounds
}
But if the scene gets stretched, they get stretched too!
Is there a way to avoid this?
Thank you!

Related

Setting lighting in ARKit framework

Ok, I'm new to SceneKit and ARKit here, and I just want any models I add to my scene to have a certain, bright lighting. I have tried all the different configurations of the automatic lighting-update settings on ARSCNView; however, the only thing that creates a discernible difference is autoenablesDefaultLighting:
func setup() {
    antialiasingMode = .multisampling4X
    //autoenablesDefaultLighting = true
    preferredFramesPerSecond = 60
    contentScaleFactor = 1.3
    if let camera = pointOfView?.camera {
        camera.wantsHDR = true
        camera.wantsExposureAdaptation = true
        camera.exposureOffset = -1
        camera.minimumExposure = -1
        camera.maximumExposure = 3
    }
}
Regardless of the lighting estimated from the camera (which I know ARKit is able to do), I just want to use one fixed lighting setting all the time. I want my scene contents to be lit like this:
Is this possible? What would I set sceneView.scene.lightingEnvironment equal to in order to achieve this effect?
According to the docs, you should be able to create an SCNNode at a position and then add an SCNLight to it:
https://developer.apple.com/documentation/scenekit/scnnode
https://developer.apple.com/documentation/scenekit/scnlight
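A minimal sketch of that idea in Swift (the light type, intensity, and position below are illustrative assumptions, not values from the question):

import SceneKit

// Hypothetical helper: attach a fixed omni light (plus a little ambient fill) to the
// scene's root node so models are lit the same way regardless of the camera feed.
func addFixedLighting(to scene: SCNScene) {
    let lightNode = SCNNode()
    lightNode.light = SCNLight()
    lightNode.light?.type = .omni
    lightNode.light?.intensity = 1500          // assumed value; the default is 1000
    lightNode.position = SCNVector3(0, 2, 2)   // above and slightly in front of the origin
    scene.rootNode.addChildNode(lightNode)

    let ambientNode = SCNNode()
    ambientNode.light = SCNLight()
    ambientNode.light?.type = .ambient
    ambientNode.light?.intensity = 500         // assumed value; keeps shadows from going black
    scene.rootNode.addChildNode(ambientNode)
}

// e.g. addFixedLighting(to: sceneView.scene)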

Swift SpriteKit infinite vertical background image

I have 3 images:
topBg.png
midBg.png
botBg.png
I want to set topBg.png at the top of the scene with height = 200
midBg.png should scale or repeat infinitely in the vertical direction
botBg.png should be at the bottom with height = 200
I have the following code:
override func didMove(to view: SKView) {
    self.bgTopSpriteNode = self.childNode(withName: "//bgTopNode") as? SKSpriteNode
    self.bgMiddleSpriteNode = self.childNode(withName: "//bgMiddleNode") as? SKSpriteNode
    self.bgBottomSpriteNode = self.childNode(withName: "//bgBottomNode") as? SKSpriteNode

    if let bgTopSpriteNode = self.bgTopSpriteNode,
       let bgMiddleSpriteNode = self.bgMiddleSpriteNode,
       let bgBottomSpriteNode = self.bgBottomSpriteNode {
        bgTopSpriteNode.size.width = self.frame.width
        bgTopSpriteNode.size.height = 200
        bgTopSpriteNode.position.x = 0
        bgMiddleSpriteNode.size.width = self.frame.width
        bgMiddleSpriteNode.size.height = self.frame.height - 400
        bgMiddleSpriteNode.position.x = 0
        bgBottomSpriteNode.size.width = self.frame.width
        bgBottomSpriteNode.size.height = 200
        bgBottomSpriteNode.position.x = 0
    }
}
But how do I set the Y position of the images? The coordinates begin at the center of the screen, not at the top left, and I don't know how to convert them.
There are a couple of different ways to achieve what you're looking to do.
First, you can compute the y positions of the top and the bottom of the screen simply as size.height / 2 and -size.height / 2 if you have the anchorPoint of your scene at (0.5, 0.5). (Don't use frame - use size. That way, you take into account the scaleMode of the scene.)
It sounds like you are frustrated that the origin of the scene is in the center. If you'd like to move it to the corner, you can easily do so by setting the scene's anchorPoint property, say, to (0.0, 0.0) for the lower left corner. Then, your y-values are 0 and size.height. If you are using the .sks editor, this is exposed in the interface - you can just set it there. Otherwise, you can set it programmatically.
Finally, you can set the scaleMode of your scene to something like .aspectFill, set the size of the scene directly (say, to 1024x768 for an iPad), and just place the images wherever they need to go. This approach works particularly well with .sks files, if you are using them; when you load up a scene, you can set the size of the scene based on the aspect ratio of the view it's in to accommodate different aspect ratios. For instance, you could adopt a 320x480 "reference size" for your iPhone scenes. Whenever you load up the scene, you could set the size of the scene to be 320 points wide and however many points tall to match the aspect ratio of the device. Then, all your graphics would be produced at 320pt wide, and you could slide them up or down proportionally across the scene's size for layout. This is a little more complicated, but it's a lot easier than trying to deal with separate layout considerations for multiple devices.
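A minimal sketch of the first approach, assuming the scene's anchorPoint is (0.5, 0.5) and the node names from your .sks file (the helper name is just for illustration; call it from didMove(to:) after looking up the three nodes):

func layoutBackgrounds(top: SKSpriteNode, middle: SKSpriteNode, bottom: SKSpriteNode) {
    let halfHeight = size.height / 2            // use size, not frame, so scaleMode is respected

    top.size = CGSize(width: size.width, height: 200)
    top.position = CGPoint(x: 0, y: halfHeight - 100)      // center sits 100pt below the top edge

    bottom.size = CGSize(width: size.width, height: 200)
    bottom.position = CGPoint(x: 0, y: -halfHeight + 100)  // center sits 100pt above the bottom edge

    middle.size = CGSize(width: size.width, height: size.height - 400)
    middle.position = CGPoint(x: 0, y: 0)                  // fills the band in between
}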
I should also point out a couple of things.
You can use the anchorPoint property of a sprite to dictate where the sprite's coordinates are measured from. This is handy for cases where you want images to be flush up against something. For instance, if you want an image flush against the left side of the screen, set its position to be exactly the left side of the screen, and then set its anchorPoint.x to 0.0; this will put the left edge of the sprite against the left edge of the screen. This also works for scenes, as you encountered - moving the anchorPoint of the scene moves everything in the scene relative to its size.
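For example, a small sketch of the flush-left case (again assuming the scene's anchorPoint is (0.5, 0.5); the image name is hypothetical):

let sprite = SKSpriteNode(imageNamed: "someImage")
sprite.anchorPoint = CGPoint(x: 0.0, y: 0.5)          // measure position from the sprite's left edge
sprite.position = CGPoint(x: -size.width / 2, y: 0)   // left edge of the screen, vertically centered
addChild(sprite)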
You don't need three images for what you're describing. You can use a single sprite and just set its centerRect property to tell it to use the top and bottom of an image and stretch the center part vertically. You have to do a little math to set the right xScale and yScale (not width and height, IIRC), but then you can draw all of that with one sprite instead of three. This would be really handy in your case, because you could just leave the sprite at (0,0), set its scale to match the size of the entire scene, and set the centerRect property - you wouldn't have to do any positioning math at all.
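A hedged sketch of that centerRect approach, assuming a single combined image (hypothetically "bg_full") whose top and bottom 200-point bands should stay unstretched while the middle stretches, added from inside the scene:

let texture = SKTexture(imageNamed: "bg_full")        // hypothetical combined image
let bg = SKSpriteNode(texture: texture)

let texSize = texture.size()
let cap: CGFloat = 200                                // fixed top/bottom bands, in texture points

// centerRect is in unit texture coordinates: keep the full width,
// and let only the band between the two caps stretch.
bg.centerRect = CGRect(x: 0,
                       y: cap / texSize.height,
                       width: 1,
                       height: (texSize.height - 2 * cap) / texSize.height)

// Scale (don't resize) the sprite so it covers the whole scene.
bg.xScale = size.width / texSize.width
bg.yScale = size.height / texSize.height
bg.position = .zero                                   // scene center with anchorPoint (0.5, 0.5)
addChild(bg)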

How to resize an SKEmitterNode?

I just wanted to do something that seems really simple to me: make an emitter fill the entire background of a view... then I would want the view to be able to aspectFill, scale and so forth, with the emitter still looking right whatever I did...
I'm not looking to just scale the emitter. I want the objects to remain the right size, but I want the area of the emitter to change... think of it like the "canvas size" (without resizing) option in Photoshop.
I would use this effect, for example, to add snow or rain to an entire scene, or make a snow globe sprite...
Thing is, maybe I'm just not looking in the right place, but it seems that any size properties on an SKEmitterNode are read-only... what am I doing wrong?
Here's some code, where the resulting emitter is just a small rectangle in the middle of the view.
override func viewDidLoad() {
    super.viewDidLoad()

    if let scene = GameScene(fileNamed: "GameScene") {
        // Configure the view.
        let skView = self.view as! SKView
        skView.showsFPS = true
        skView.showsNodeCount = true

        if let emitter = SKEmitterNode(fileNamed: "Bokeh") {
            emitter.position = view.center
            scene.addChild(emitter)
        }

        /* Sprite Kit applies additional optimizations to improve rendering performance */
        skView.ignoresSiblingOrder = true

        /* Set the scale mode to scale to fit the window */
        scene.scaleMode = .AspectFill
        skView.presentScene(scene)
    }
}
I need to clarify a little more. For snow, I would want the particles to spawn before they enter the scene, fall through the scene, and only die after they have left it, rather than appearing at random points throughout the scene and then continuing to fall down... I would want the snow to take up the entire bounds of the scene.
What you are looking for is called particlePositionRange:
The range of allowed random values for a particle’s position.
You can change it like this:
emitterNode.particlePositionRange = CGVector(dx: dx, dy: dy)
dx should be the width of your emitter area and dy its height. That might be the size of the scene (note that the default scene size is 1024x768 unless you specify otherwise).
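A rough sketch for the snow case from the question, assuming a scene reference like the one in the question's viewDidLoad and a hypothetical "Snow" .sks emitter file: spawn particles along a line just above the visible area, spread across the full scene width, and let them live long enough to fall past the bottom.

if let snow = SKEmitterNode(fileNamed: "Snow") {
    // Assuming the scene's default anchorPoint of (0, 0):
    snow.position = CGPoint(x: scene.size.width / 2, y: scene.size.height + 10)
    snow.particlePositionRange = CGVector(dx: scene.size.width, dy: 0)
    snow.particleLifetime = 10        // seconds; enough time to cross the whole scene
    scene.addChild(snow)
}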
You can do the same thing in the Particle Editor by changing the values in the Position Range section:

Getting actual view size in Swift / iOS

Even after reading several posts on frame vs. view I still cannot get this working. I open Xcode (7.3) and create a new game project for iOS. In the default scene file, right after addChild, I add the following code:
print(self.view!.bounds.width)
print(self.view!.bounds.height)
print(self.frame.size.width)
print(self.frame.size.height)
print(UIScreen.mainScreen().bounds.size.width)
print(UIScreen.mainScreen().bounds.size.height)
I get following results when I run it for iPhone 6s:
667.0
375.0
1024.0
768.0
667.0
375.0
I can guess that the first two numbers are the 2x Retina pixel size. I am trying to understand why the frame size reports 1024x768?
Then I add following code to resize a simple background image to fill the screen:
self.anchorPoint = CGPointMake(0.5,0.5)
let theTexture = SKTexture(imageNamed: "intro_screen_phone")
let theSizeFromBounds = CGSizeMake(self.view!.bounds.width, self.view!.bounds.height)
let theImage = SKSpriteNode(texture: theTexture, color: SKColor.clearColor(), size: theSizeFromBounds)
I get an image smaller than the screen size, and it is displayed even smaller if I choose landscape mode.
I tried multiplying the bounds width/height by two, hoping to get the actual screen size, but then the image gets too big. I also tried the frame size, which makes the image slightly bigger than the screen.
The main reason for my confusion, besides my lack of knowledge, is that I've seen this exact example working perfectly in a lesson. Am I missing something obvious?
The frame is reported as 1024x768 because it is the scene's own size, defined in points (the default scene size in the GameScene.sks template), and is independent of the view's size.
If you want your scene to be the same size as your screen, then in your GameViewController, just before:
skView.presentScene(scene)
use this:
scene.size = self.view.frame.size
which will make the scene the exact size of the screen.
Then you could easily make an image fill the scene like so:
func addBackground() {
    let bgTexture = SKTexture(imageNamed: "NAME")
    let bgSprite = SKSpriteNode(texture: bgTexture, color: SKColor.clearColor(), size: self.size)
    bgSprite.anchorPoint = CGPoint(x: 0, y: 0)
    bgSprite.position = self.frame.origin
    self.addChild(bgSprite)
}
Also, you may want to read up on the difference between a view's bounds and its frame.
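A quick illustration of that difference, with made-up numbers:

import UIKit

// frame is the view's rectangle in its superview's coordinate space;
// bounds is the view's own coordinate space (origin normally zero).
let v = UIView(frame: CGRect(x: 20, y: 40, width: 100, height: 50))
print(v.frame)   // (20.0, 40.0, 100.0, 50.0)
print(v.bounds)  // (0.0, 0.0, 100.0, 50.0)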

SKEffectNode - CIFilter Blur Size Limit - Big Black Box

I am trying to blur multiple SKNode objects. I do this by having a parent SKEffectNode with a CIFilter set to @"CIGaussianBlur". Like so:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    blurNode.shouldRasterize = YES;
    [blurNode setShouldEnableEffects:NO];
    [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
                                   keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
This works fine for a bunch of nodes currently onscreen. But when I space these nodes far away from each other (about 3000 pixels), the blurring no longer happens and I get a big black box. This happens regardless of whether the SKNodes I'm blurring are SKShapeNodes or SKSpriteNodes. Here's a sample project with this issue: Sample Project. (By the way, thanks to BobMoff for the initial version found here):
Here's happy blur (when nodes are less than 3000 pixels away from each other):
Sad blur (when nodes are more than 3000 pixels away from each other):
UPDATE
This behavior occurs whenever an SKEffectNode is the parent. It doesn't matter if it's enabling effects, blurring, etc. If the parent node is an SKNode, it's fine. i.e. Even if the parent blur node is created like it is below, you will get the blackness:
- (SKEffectNode *)createBlurNode
{
    SKEffectNode *blurNode = [[SKEffectNode alloc] init];
    // blurNode.shouldRasterize = YES;
    // [blurNode setShouldEnableEffects:NO];
    // [blurNode setFilter:[CIFilter filterWithName:@"CIGaussianBlur"
    //                                keysAndValues:@"inputRadius", @10.0f, nil]];
    return blurNode;
}
I had a similar problem, with a very wide, panning scene that I wanted to blur.
To get the blur effect to work, I removed any nodes that were sticking out too far past the edges of the scene:
// Property declarations, elsewhere in the class:
var blurNode: SKEffectNode
var mainScene: SKScene
var exParents: [SKNode : SKNode] = [:]

/**
 * Remove outlying nodes from the scene and activate the SKEffectNode
 */
func blurScene() {
    let FILTER_MARGIN: CGFloat = 100
    let widthMax: CGFloat = mainScene.size.width + FILTER_MARGIN
    let heightMax: CGFloat = mainScene.size.height + FILTER_MARGIN

    // Recursively iterate through all blurNode's children
    blurNode.enumerateChildNodesWithName(".//*", usingBlock: {
        [unowned self]
        node, stop in

        if node.parent != nil && node.scene != nil { // Ignore nodes we already removed
            if let sprite = node as? SKSpriteNode {
                // Calculate sprite node position in scene coordinates
                let sceneOrig = sprite.scene!.convertPoint(sprite.position, fromNode: sprite.parent!)

                // Find left, right, bottom and top edges of sprite
                let l = sceneOrig.x - sprite.size.width * sprite.anchorPoint.x
                let r = l + sprite.size.width
                let b = sceneOrig.y - sprite.size.height * sprite.anchorPoint.y
                let t = b + sprite.size.height

                if l < -FILTER_MARGIN || r > widthMax || b < -FILTER_MARGIN || t > heightMax {
                    self.exParents[sprite] = sprite.parent!
                    sprite.removeFromParent()
                }
            }
        }
    })

    blurNode.shouldEnableEffects = true
}

/**
 * Disable blur and reparent nodes we removed earlier
 */
func removeBlur() {
    self.blurNode.shouldEnableEffects = false

    for (kid, parent) in exParents {
        parent.addChild(kid)
    }
    exParents = [:]
}
NOTES:
This does remove content from your effect node, so extremely wide nodes won't show up in the final result:
You can see the mountain highlighted in red stuck out too far and was removed from the resulting blur.
This code only considers SKSpriteNodes. Empty SKNodes don't seem to break the effect node, but if you're using other visible nodes like SKShapeNodes or SKLabelNodes, you'll have to modify this code to include them.
If you have ignoresSiblingOrder = false, this code might mess up your z-ordering, since you can't guarantee what order the nodes are added back to the scene in.
Stuff I tried that didn't work
Simply saying node.hidden = true instead of using removeFromParent() doesn't work. That would be WAY too easy ;)
Using an SKCropNode to crop out outlying content didn't work for me. I tried having the SKEffectNode parent the SKCropNode and the other way around, but the black square appeared no matter how small I made the cropped area. This might still be worth looking into if you're desperate for a cleaner solution.
As noted here, SKScenes are secretly SKEffectNodes and you can set their filter just like our blurNode above. SKScenes don't show a black screen when their content is too big. Unfortunately, they seem to just silently disable the filter instead. Again, I might have missed something, so you could explore this option further if you're trying to apply an effect across the entire scene.
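For reference, a hedged sketch of what that looks like, assuming `scene` is the SKScene you want to blur (since SKScene inherits from SKEffectNode, these properties are available directly on it):

let sceneBlur = CIFilter(name: "CIGaussianBlur")!
sceneBlur.setValue(8.0, forKey: kCIInputRadiusKey)

scene.filter = sceneBlur
scene.shouldRasterize = true        // cache the filtered result if the content underneath is static
scene.shouldEnableEffects = true    // toggle this off to remove the blur again
// Caveat from above: with very large content the scene may silently drop the effect.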
Alternate Solutions
You can capture an image of the whole screen and apply a filter to that, as suggested here. I ended up going with an even simpler solution; I took a generic screenshot of the stuff I wanted to blur, then applied a very heavy blur so you can't see the precise details. I used that as the blurred background and you can hardly tell it's not the real thing ;) This also saves a healthy chunk of memory and avoids a small UI hiccup.
Musings
This is a pretty nasty bug, and I hope Apple comes up with a solution soon. You can click this cute picture of a camera to get a GPU trace and some insight on what's happening:
The device seems to be discarding the framebuffer for the effect node because it takes up too much memory. This is affirmed by the fact that when there's more memory pressure on the device, it's easier to get the 'black square' on smaller content in the SKEffectNode.
I used a method that worked for my game but it requires the blurred area to be static without movement.
On iOS 10 using Swift 3 I used SKSpriteNode, SKView, SKEffectNode, CIFilter. I created a sprite from a texture returned from the SKView method "texture from node" and passed the current scene as the parameter because it inherits from SKNode. So essentially I was taking a "screenshot" of the scene and creating a sprite from it. I then put it in an SKEffectNode with a blur filter. (set "should rasterize" to true for better performance as I only needed to blur once). Finally I added the new sprite to the scene. From there you could add sprites to the scene and place them above the new blurred node.
// Create the blur filter
let blurFilter = CIFilter(name: "CIGaussianBlur")!
let blurAmount = 15.0
blurFilter.setValue(blurAmount, forKey: kCIInputRadiusKey)

// Wrap a "screenshot" sprite of the current scene in an effect node
let blurEffect = SKEffectNode()
blurEffect.shouldRasterize = true   // blur once, then cache the result
let screenshotNode = SKSpriteNode(texture: gameScene.view!.texture(from: gameScene))
blurEffect.addChild(screenshotNode)
blurEffect.filter = blurFilter
gameScene.addChild(blurEffect)
Possible workaround for the bug:
Use a camera, zoom WAY out, so you can see most everything of your background, take a screenshot style rendering of this image. Crop it to your needs, and then blur it. Then rasterise this.
Then scale this image back up, and slice it up if needs be, and place accordingly.
SKEffectNode renders into a texture. In most iOS systems the maximum size for a texture is 2048x2048. If an SKEffectNode is trying to render content larger than that, it will just use a 2048x2048 texture and anything outside of it will just not appear in the texture. It won't give you any error or warning about this happening; it simply does it silently.
And no, there is no way to tell SKEffectNode to use a texture of a specific size, and pan&clamp the content into it. It always uses a texture that will cover all the child nodes, and if the texture would be too large, it just silently uses that 2048x2048 texture.
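Based on that observation, one defensive sketch (not an official API, just a possible mitigation, assuming `effectNode` is your SKEffectNode): measure the node's accumulated frame and skip the effect when it would exceed the limit, rather than getting a silently clipped or black result.

let maxTextureSide: CGFloat = 2048               // observed limit described above
let contentFrame = effectNode.calculateAccumulatedFrame()

if contentFrame.width > maxTextureSide || contentFrame.height > maxTextureSide {
    effectNode.shouldEnableEffects = false       // fall back to unfiltered rendering
} else {
    effectNode.shouldEnableEffects = true
}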