How to use pixel art in an app? - ios

I have an iOS app that uses Sprite Kit, and I am ready to add my artwork. The artwork is pixel art and is inherently very small. I am trying to find the best way to display it such that:
1. All of the art is the same scale, meaning that one image pixel takes up exactly the same number of screen pixels as in every other image.
2. There is no blurring in an attempt to make the textures look smoother, which often happens when scaling images up.
I have tried solving the second one like so:
self = [super init];
if (self) {
    self.size = size;
    self.texture = [SKTexture textureWithImageNamed:@"ForestTree1.png"];
    self.texture.filteringMode = SKTextureFilteringNearest;
    [self.texture size];
}
return self;
The above code is in the initialization of the SKSpriteNode which will have the texture.
This is my original image (scaled up for easy reference):
The problem is that my result always looks like this:
(The bottom of the trunk being offset is not part of this question.) I am not using any motion blur or anything like it. I'm not sure why it isn't displaying correctly.
Edit 1:
I failed to mention above that the trees were constantly animating when the screenshots were taken. When they are still they look like this:
The image above shows two trees overlapping, one of them flipped because of a bug to be fixed later. My question now is: how can I prevent the image from blurring while the animation is occurring?
Edit 2:
I am adding multiple instances of the tree, each one loading the same texture. I know it has nothing to do with the animation, because when I changed the code to add just one tree and animate it, the pixels stayed perfectly crisp.

You need to use "nearest" filtering:
self.texture.filteringMode = SKTextureFilteringNearest;

The pixels in your image must correspond with pixels on the screen perfectly.
If your image is 100x100 and you display it over a whole screen that is 105x105, the renderer has to interpolate to figure out how to fill the extra pixels.
If you display it at a scaled resolution that is some multiple of 2 (which should work properly), I think you still have to tell the renderer not to interpolate pixels when it does the scaling.
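As a rough sketch of both points (assuming the tree image from the question and an arbitrary 4x scale factor), something like this keeps texture pixels mapped cleanly onto screen pixels:
SKTexture *treeTexture = [SKTexture textureWithImageNamed:@"ForestTree1"];
// Nearest-neighbour filtering: no smoothing when the texture is scaled up.
treeTexture.filteringMode = SKTextureFilteringNearest;

SKSpriteNode *tree = [SKSpriteNode spriteNodeWithTexture:treeTexture];
// Scale by a whole-number factor so every source pixel covers the same
// number of screen pixels; 4x here is just an example.
[tree setScale:4.0];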

I've solved the problem... but it's really a hack. I have an SKScene which is the parent node to all of the "trees" (SKSpriteNodes), and it adds multiple trees to itself. At first I thought this was the problem, because if I only added one tree, it would display the image correctly. The answer to this question led me to believe that I would need to programmatically create an SKTextureAtlas singleton (the texture lives in a texture atlas) and pass it to the tree class, which would pull its texture from it in an init method. I made a property in the SKScene to hold the texture atlas so that I could pass it to the tree class every time I made a new one. I tried loading the texture from the texture atlas (in the tree class) using the textureNamed: method. This still did not work. I switched back to loading the texture with SKTexture's textureWithImageNamed: method and it worked. Furthermore, I changed the code back so that the tree subclass is not sent the SKTextureAtlas singleton at all, and it still worked.
In the SKScene I get the texture atlas using:
[SKTextureAtlas atlasNamed:@"Textures"]; // Textures is the atlas name.
and set the return value to be the SKTextureAtlas property described above. I thought that maybe the atlas just had to be initialized at some point in the code, so I tried this:
SKTextureAtlas *myAtlas = [SKTextureAtlas atlasNamed:@"Textures"];
and the following alone on one line:
[SKTextureAtlas atlasNamed:@"Textures"];
but neither worked. Apparently I need the tree's parent class to have an SKTextureAtlas property holding the atlas that contains the tree's texture, even though the tree itself never references the atlas at all... Is this a glitch or something? It's working now, but it feels like a hack.
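For reference, a minimal sketch of the arrangement that ended up working, with hypothetical class names (ForestScene for the parent scene; the tree still loads its texture by image name):
#import <SpriteKit/SpriteKit.h>

// Hypothetical parent scene: it only keeps the atlas alive in a property.
@interface ForestScene : SKScene
@property (nonatomic, strong) SKTextureAtlas *textureAtlas;
@end

@implementation ForestScene
- (void)didMoveToView:(SKView *)view {
    // Holding on to the atlas here is the "hack" described above.
    self.textureAtlas = [SKTextureAtlas atlasNamed:@"Textures"];
}
@end

// In the tree class, the texture is still loaded by image name,
// with no reference to the atlas at all:
// self.texture = [SKTexture textureWithImageNamed:@"ForestTree1.png"];
// self.texture.filteringMode = SKTextureFilteringNearest;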

[self setScaleMode:SKSceneScaleModeAspectFill];
SKTexture *texture = [SKTexture textureWithImageNamed:@"image"];
[texture setFilteringMode:SKTextureFilteringNearest];
SKSpriteNode* imageNode = [SKSpriteNode spriteNodeWithTexture:texture];
[self addChild:imageNode];
Works perfectly for me. There's no blur with the animation.

Related

iOS: Can I use Texture Atlas already created with Adobe Animate?

I know that I can add sequences of individual images in Xcode, and let it create the Texture Atlas. This process is well-described in easily-searched venues. (Like this one.)
I am getting atlases from designers, created with Adobe Animate (v18.0) already combined into the full sheet, and moreover with the accompanying XML file describing the animation. (In which sub-images and display frames do not match 1:1, so it's hard to see how Xcode would figure that out.)
It's not clear to me from the SpriteKit docs whether/how to use these pre-defined Texture Atlases. Is this possible?
If you're getting pre-baked texture atlases, with externally generated descriptions of where the sprites should get their textures, you'll probably have to create your sprites from SKTextures built with SKTexture's init(rect:in:) initializer.
You'll need to read the sprite's extents out of the XML file, and then create a texture out of the atlas. Then you can create a new SKTexture object that represents a part of the larger texture to act as your sprite's texture.
This is untested pseudocode, but it shows the process:
let spriteRect = /* get the rect from the XML */
let atlas = SKTexture(imageNamed: "myTextureAtlas")
let spriteTexture = SKTexture(rect: spriteRect, in: atlas)
let sprite = SKSpriteNode(texture: spriteTexture)
Once you have this process in place, you can animate the sprites using the usual methods, like setting up SKActions with a list of textures out of the texture atlas.
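One detail worth knowing: the rect passed to init(rect:in:) (textureWithRect:inTexture: in Objective-C) is in the atlas's unit coordinate space, so the pixel frames from the Animate XML have to be normalized first, and likely y-flipped, since Animate usually measures from the top-left while SpriteKit texture rects use a bottom-left origin. A hedged Objective-C sketch, with a made-up frame standing in for values read from the XML:
SKTexture *atlas = [SKTexture textureWithImageNamed:@"myTextureAtlas"];
// atlas.size is in points; make sure the XML frame values use the same
// units before dividing (divide by the asset scale factor if they don't).
CGSize atlasSize = atlas.size;

// Hypothetical frame; in practice, read x, y, width, height from the XML.
CGRect framePx = CGRectMake(128.0, 64.0, 32.0, 48.0);

// Normalize to 0..1 and flip y if the XML uses a top-left origin.
CGRect unitRect = CGRectMake(framePx.origin.x / atlasSize.width,
                             1.0 - (framePx.origin.y + framePx.size.height) / atlasSize.height,
                             framePx.size.width / atlasSize.width,
                             framePx.size.height / atlasSize.height);

SKTexture *spriteTexture = [SKTexture textureWithRect:unitRect inTexture:atlas];
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithTexture:spriteTexture];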

How to Render Many SpriteKit Nodes at Once?

I am using SpriteKit to render a large (20 x 20) dot grid that looks like this:
I'd like to highlight rows or columns based on user input. For example, I'd like to change rows 1-10 to a red color, or columns 5-15 to a blue color.
What is the most performant way to do this?
I've tried:
1. Naming each GridNode based on the column it's in (e.g. @"column-4"), then using enumerateChildNodesWithName: with the string @"column-n" and changing the color of each node (via SKShapeNode's setFillColor:) in the enumerate block.
2. Giving all the columns a parent node associated with that column, then telling the parent node to change its alpha (thus changing the alpha of all its children).
3. Making arrays for the different columns, then looping through each node and changing its color or alpha.
I've tried making the GridDot class an SKEffectNode with shouldRasterize: set to YES. I've tried both an SKShapeNode and an SKSpriteNode as its child. I've also tried taking away the SKEffectNode parent and just rendering an SKSpriteNode.
Each of these options makes my whole app lag and makes my framerate drop to ~10 FPS. What is the correct way to change the color/alpha of many nodes (without dropping frames)?
At its heart, the issue is rendering this many nodes, yes?
When I faced similar performance problems while using SKShapeNode I came up with this solution:
1. Create an SKShapeNode with the required path and color.
2. Use SKView's textureFromNode:crop: method to convert the SKShapeNode into an SKTexture.
3. Repeat steps 1 and 2 to create all the textures the node will need.
4. Create an SKSpriteNode from a texture.
5. Use the created SKSpriteNode in your scene instead of the SKShapeNode.
6. Change the node's texture when needed using SKSpriteNode's texture property.
If you have a limited set of colors for your dots, I think this approach will fit your task fine.
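A minimal Objective-C sketch of that approach, assuming the dots are simple filled circles and that self.view is the SKView presenting the scene (the names here are illustrative, not from the question):
// Render one SKShapeNode per color into a texture, up front.
SKShapeNode *dotShape = [SKShapeNode shapeNodeWithCircleOfRadius:6.0];
dotShape.fillColor = [SKColor redColor];
dotShape.lineWidth = 0.0;
// textureFromNode:crop: is also available if you need to crop the render.
SKTexture *redDotTexture = [self.view textureFromNode:dotShape];

// Grid dots are plain sprites, which are much cheaper to draw in bulk.
SKSpriteNode *dot = [SKSpriteNode spriteNodeWithTexture:redDotTexture];

// Highlighting a dot later is just a texture swap, no shape re-rendering.
dot.texture = redDotTexture;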
In contrast to @amobi's statement, 400 nodes is not a lot. For instance, I have a scene with ~400 nodes, a render time of 9.8 ms, and 9 draw calls.
If you have 400 draw calls, though, you should try to reduce that number. To determine the number of draw calls needed for each rendered frame, enable (some of) the following debug overlays. This is taken from my own SpriteKit app's ViewController class, which contains the SpriteKit scene.
skView.showsFPS = YES;
skView.showsNodeCount = YES;
skView.showsDrawCount = YES;
Proposed solution
I recommend using SKView's ignoresSiblingOrder. That way, SKSpriteNodes with equal zPosition are drawn in a single draw call, which (for as many nodes and draw calls as you appear to have) is a big efficiency win. Set this in the -viewDidLoad method of the SKView's ViewController.
skView.ignoresSiblingOrder = YES;
I see no reason to burden the GPU with SKEffectNodes in this scenario. They are usually a great way to tank your frame rate.
Final thoughts
Basic performance issues mean you have either a CPU or a GPU bottleneck. It is difficult to guess which you're suffering from with the current information. You could launch the profiler, but Xcode itself also provides valuable information when you are running your app on an attached device. FPS in the Simulator is not representative of device performance.

Checking/removing SKScene sprite children easily/efficiently?

Working on an iOS game using SpriteKit. My background is made up of map tiles (essentially an infinite map, procedurally generated).
Our system is designed to manage "chunks" of the map, and we only load chunks near the player. Since SpriteKit requires we add SKSpriteNodes, we no longer have clean control over "unloading" sprites for chunks/tiles that are no longer near the player.
I realize SpriteKit won't actually render things off-screen, but it's going to kill performance if we can't remove sprites no longer needed, or check if a chunk/tile is already added.
Since SKNode doesn't override isEqual:, I only see two ways to do this:
1. Give each sprite a name containing its chunk/tile coordinate, and check this name each update.
2. Maintain a separate array of loaded tiles and check that instead.
Is there any easier way of checking/removing if a sprite has been added already? Maybe a partial string match on sprite name?
I'm not sure that using SpriteKit is the best solution (the Xcode simulator seems to drag at 30 fps; I have yet to test on a real device). We originally built this game in Java, rendering our own textures, so only what was loaded was fed to OpenGL manually.
- (void)renderToScene:(SKScene *)scene {
    for (Chunk *chunk in loadedChunks) {
        for (Tile *tile in [chunk getTiles]) {
            SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithTexture:tileTexture];
            sprite.name = @"Tile";
            sprite.position = CGPointMake(realX, realY);
            [scene addChild:sprite];
        }
    }
}
What will in fact kill your framerate is frequently creating and removing nodes - you should have a pool of sprites that you re-use rather than recreate.
Just update the sprite's texture, position and other attributes when reusing one of those you no longer need. A common use case is to have enough tiles to span the entire screen plus one row and one column, so that when an entire row or column has moved outside the screen you can reposition it at the other side with new textures according to the map data.
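A minimal sketch of such a pool, with hypothetical names (TilePool, dequeueSpriteWithTexture:, recycleSprite:), just to illustrate the reuse pattern:
#import <SpriteKit/SpriteKit.h>

// Hypothetical pool class: sprites are reused instead of being created
// and destroyed every time chunks load and unload.
@interface TilePool : NSObject
@property (nonatomic, strong) NSMutableArray<SKSpriteNode *> *freeSprites;
@end

@implementation TilePool

- (instancetype)init {
    if ((self = [super init])) {
        _freeSprites = [NSMutableArray array];
    }
    return self;
}

// Grab a sprite from the pool, or create one if the pool is empty.
- (SKSpriteNode *)dequeueSpriteWithTexture:(SKTexture *)texture {
    SKSpriteNode *sprite = [self.freeSprites lastObject];
    if (sprite) {
        [self.freeSprites removeLastObject];
        sprite.texture = texture;
        sprite.size = texture.size;
    } else {
        sprite = [SKSpriteNode spriteNodeWithTexture:texture];
    }
    return sprite;
}

// Return a sprite to the pool when its chunk/tile moves out of range.
- (void)recycleSprite:(SKSpriteNode *)sprite {
    [sprite removeFromParent];
    [self.freeSprites addObject:sprite];
}

@end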
If I'm understanding what you're asking correctly, you want to properly remove a node/sprite from the scene when it's no longer within view.
You should just be able to call the [self removeFromParent] method to remove it whenever it's outside the bounds of the screen. Remember you can call this method on any node as long as it is a child of another node.
For instance, if I had a character:
SKSpriteNode *character;
SKSpriteNode *sword;
//all the rest of the properties are up to you.
If my character picked up a sword in the game, and after a certain time period the sword were no longer accessible, I would do:
[character addChild:sword];
which would add it to the character.
[sword removeFromParent];
when I no longer need the sword to be a part of its parent.

SKSpriteNode stretching when manually animating textures

I have a fairly simple animation with 8 identically sized images. I'm not using the built-in animation methods because I want to manually control the speed of the animation on the fly. I'm using preloaded SKTextures and calling [object setTexture:texture]; inside the update: method.
The problem is that sometimes the texture gets really distorted/stretched. After a lot of debugging, I have narrowed it down to only happening when the node is stationary. In fact, if I move the node a pixel and move it back like this, the problem never occurs:
[self setTexture:texture];
CGPoint currentPosition = self.position;
self.position = CGPointMake(currentPosition.x + 1, currentPosition.y + 1);
self.position = currentPosition;
This feels extremely hacky to me. I think under the hood, it's triggering a redraw on the parent node. Has anyone else experienced this? I have two major questions. 1) What is the cause? and 2) How can I resolve this without resorting to a hack?
Here is a normal frame and a stretched version (I apologize for the quality, placeholder art...)
Edit: After a few comments, I realized that I forgot to mention that I scaled the size of the node smaller than the size of the texture. Even though the textures are the same size, applying a new texture to a node with a smaller size causes the bug.
It seems that upon setting the texture using setTexture:, the sprite node doesn't update its size until it is moved, resized, etc.
You can resolve this by manually setting the size after setting the texture.
[spriteNode setTexture:texture];
[spriteNode setSize:texture.size];

Sprite Kit - Low Resolution Texture on Retina when using textureFromNode

For my game I am trying to create custom textures from two other textures. This is to allow for a variety of colours, etc. in my sprites.
To do this, I'm creating a sprite by adding both textures together, then applying this to a new SKTexture by using
SKTexture *texture = [self.view textureFromNode:newSprite];
This works great on the whole and I get a nice custom texture. Except when trying my game on Retina devices, where the texture is the correct size on the screen, but clearly a lower resolution.
The textures are all there and properly named so I don't believe that that is an issue.
Has anyone encountered this, or know how I can create a proper @2x texture?
I finally (accidentally) figured out how to fix this. The node which you are creating a texture from has to be added to the scene. Otherwise you will get a non-retina size for your texture.
It's not ideal as it would be nice to create textures without having to add them onto the screen.
I've discovered another way of improving the fidelity of textures created from SKShapeNodes. It's not quite related to this question, but useful intel:
1. Create your shape at 2x its intended size and width.
2. Create all the fonts and other shapes at the same oversized ratio.
3. Make sure your positioning is relative to this overall size (e.g. don't use absolute sizes; use sizes relative to the container).
When you create the texture as a sprite it'll be huge - but then apply
sprite.scale = 0.5; // if you were using 2x
I've found this makes it look much higher resolution, no graininess, no fuzziness on fonts, sharp corners.
I also used tex.filteringMode = SKTextureFilteringNearest;
Thus: it doesn't have to be added to the scene and then removed.
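A hedged sketch of that oversizing trick (the shape, sizes, and the self.view SKView reference are all illustrative):
CGFloat oversample = 2.0;

// Build the shape (and any labels) at 2x the size you actually want on screen.
SKShapeNode *shape = [SKShapeNode shapeNodeWithRectOfSize:CGSizeMake(100.0 * oversample, 60.0 * oversample)
                                              cornerRadius:8.0 * oversample];
shape.fillColor = [SKColor blueColor];

// Render it to a texture; with this variant the node never has to be
// added to the scene and then removed.
SKTexture *texture = [self.view textureFromNode:shape];
texture.filteringMode = SKTextureFilteringNearest;

// Scale the resulting sprite back down to compensate for the 2x oversampling.
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithTexture:texture];
[sprite setScale:1.0 / oversample];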
