SpriteKit low performance - iOS

I have an app where one screen shows 800-1200 randomly positioned small dot images (see the .png attachment) across the entire screen. I tried UIImageView, layers, and UIViews, but the performance was always terrible, so I decided to use SpriteKit in this view controller to take advantage of the device's OpenGL rendering. The boost was really noticeable, but the performance is still not acceptable.
I'm fading a third of the dots out and back in, ten times per second.
Any ideas how to increase the performance? It's only a couple hundred 13px x 13px (retina) .PNGs :)
Here's the .png:

You, my friend, need to be using SKEmitterNode.
In Xcode, select File -> New -> File... and, in the Resources section, choose SpriteKit Particle File. Supply it with your image and set up the parameters to create the desired effect. Voila!
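As a sketch, the emitter can then be loaded in code - here in Swift, assuming a particle file named "Dots.sks" (a placeholder name, not from the question):

```swift
import SpriteKit

// Minimal sketch: load the particle file created in Xcode's editor and
// add it to the scene. "Dots.sks" is a placeholder for your own file.
class DotsScene: SKScene {
    override func didMove(to view: SKView) {
        if let emitter = SKEmitterNode(fileNamed: "Dots.sks") {
            emitter.position = CGPoint(x: frame.midX, y: frame.midY)
            // Spread particles over the whole screen; the fade in/out
            // comes from the alpha ramp configured in the particle editor.
            emitter.particlePositionRange = CGVector(dx: frame.width,
                                                    dy: frame.height)
            addChild(emitter)
        }
    }
}
```

The emitter batches all particles in one draw pass, which is where the speedup over hundreds of individual nodes comes from.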

How can I put this... it's just a lot of sprites; it may not get much faster no matter what, especially not with alpha blending (fading).
Perhaps you can simulate the effect using particles; they're faster to render than the same number of sprites, but you have less influence over individual particles.
Also, be sure to test performance on a real device - the iOS Simulator's software rendering is very slow.

Related

Sprite Animation file sizes in SpriteKit

I looked into inverse kinematics as a way of animating, but overall I think I'd rather proceed with sprite texture atlases to create the animation. The only thing is, I'm concerned about size.
I wanted to ask for some help with the overall, global solution:
I will have 100 monsters. Each has 25 frames of animation for an attack, an idle, and a spawning animation - 75 frames in total per monster.
I'd imagine I want 3x, 2x and 1x assets, so that means even more frames (75 x 3 images per monster) - unless I use PDF vectors, in which case it's just one size.
Is this approach just too much in terms of size? 25 frames of animation alone were 4MB on disk, but I'm not sure what happens in terms of compression when you load that into Xcode and a texture atlas.
Does anyone know if the approach I'm embarking on will take up a lot of space and potentially be a poor decision long term if I want even more monsters? Right now I only have a few monsters and other images, and I'm already at ~150MB when I check the app's storage on the phone, so it's hard to tell what would happen with many more monsters - but I suspect it would be prohibitively large, like 4GB+.
To me this sounds like the wrong approach, and yet everywhere I read, sprites and atlases are encouraged. What am I doing wrong? Too many frames of animation? Too many monsters?
Thanks!
So, you are correct that you will run into a problem. In general, the tutorials you find online simply ignore the issues of download size and memory use on device. When building a real game, you need to consider the total download size and the amount of memory available on the actual device when rendering multiple animations at the same time on screen.
There are three approaches: store everything as PNG, use an animation format that compresses better than PNG, or encode things as H.264. Each has issues. If you would like to look at my solution to the runtime memory-use issue, have a peek at the SpriteKitFireAnimation link at this question. If you roll your own approach with H.264, you can get lots of compression, but you will have issues with alpha channel support. The lazy thing to do is use PNGs: it will work and supports the alpha channel, but PNGs will bloat your app and runtime memory use is heavy.
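To put rough numbers on the question above (illustrative only - this assumes each frame decodes to 256x256 32-bit RGBA, which may not match your art):

```swift
import Foundation

// Back-of-envelope sizing for the PNG-only approach.
// Assumption (hypothetical): each frame decodes to 256x256 RGBA.
let bytesPerPixel = 4
let frameBytes = 256 * 256 * bytesPerPixel        // one decoded frame
let framesPerMonster = 75                         // 3 animations x 25 frames
let monsterBytes = frameBytes * framesPerMonster  // one monster, 1x, decoded

// 2x has 4x the pixels of 1x, 3x has 9x.
let allScales = monsterBytes * (1 + 4 + 9)        // one monster, all scales
let hundredMonsters = allScales * 100             // ~26 GB if all decoded
```

Even at 1x, one monster is nearly 19 MB once decoded, which is why only the animations currently on screen should ever be resident.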

One background image vs repeating pattern/images, for Corona SDK game?

What would be a better approach in Corona re static background for a 2D scrolling based game?
Let's say the game level "size" is the equivalent of two screens wide and two screens deep.
Q1 - Would one large background image be an OK approach? It would probably be easier, since you could prepare it in Photoshop. Or is there a significant performance advantage in Corona to taking a small image "pattern" and repeating it in Lua code to create the backdrop?
Q2 - If the one-large-background-image approach is OK, should I assume I might have to sacrifice the image's resolution, given its size (2 screens wide by 2 screens deep), on higher-resolution devices? That is, on an iPad 3, where your configuration would normally try to pick up the 3x image version (for other, smaller images, say play icons), for the background you might have to stay with the 1x or 2x size; otherwise it may hit the texture limit (I've read "Most devices have a maximum texture size of 2048x2048"). Is this correct / does this make sense?
I used both approaches on my games.
Advantages of tiled mode:
You can make huge backgrounds.
Can be made to use less memory (especially with smallish tiles and lots of repeating, like a real-world wallpaper).
Allows for some interesting effects (like parallax scrolling).
Problems of tiled mode:
Uses more CPU.
Might be buggy and hard to get behaving correctly (in one of my games, for example, gaps showed between tiles, but only on iPad Retina... it required some heavy math hackery to make it work).
It is hard to make complex, awesome backgrounds (the reason my point-and-click adventure games don't use tiled backgrounds).
Note that some devices have a limit on texture size in pixels; this may be the ultimate limit on how large a single-texture background can be.
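Some illustrative math (assuming a retina iPad screen of 2048x1536 and 32-bit RGBA textures) shows why the single-image approach collides with that limit:

```swift
import Foundation

// A 2-screens-wide x 2-screens-deep background on a retina iPad,
// decoded as 32-bit RGBA. Screen size here is an assumption.
let screenW = 2048, screenH = 1536
let bgW = screenW * 2, bgH = screenH * 2            // 4096 x 3072
let bgBytes = bgW * bgH * 4                          // 48 MB for one texture

let maxTexture = 2048                                // common GPU limit
let fitsInOneTexture = bgW <= maxTexture && bgH <= maxTexture

// A repeating 256x256 tile, by contrast:
let tileBytes = 256 * 256 * 4                        // 256 KB
```

So at full retina resolution the single image both exceeds the 2048x2048 limit and costs ~48 MB of texture memory, while a tile costs a fraction of that.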

Faster Alternatives to UIImageView in iOS

I have a list of PNG images that I want to show one after another as an animation. In most cases I use a UIImageView with animationImages and it works fine. But in a couple of cases my PNGs are 1280x768 (full-screen iPad) animations with 100+ frames. UIImageView is quite slow in the simulator (it takes too long to load the first time), and I believe it will be even slower on the device.
Is there an alternative that can show an image sequence smoothly? Maybe Core Animation? Is there a working example I can see?
Core Animation is for vector/keyframe-based animation, not image sequences. Loading over a hundred full-screen PNGs on an iPad is a really bad idea; you'll almost certainly get a memory warning, if not outright termination.
You should be using a video to display this kind of animation - performance will be considerably better. Is there any reason you couldn't use an H.264 video for it?
Make a video of your pictures. It is the simplest and probably most reasonable approach.
If you want really good performance and full control over your animation, you can convert the pictures to PVRTC4 format and draw them as billboards (textured sprites) with OpenGL. This can be a lot of work if you don't know how to do it.
Look at the second example at http://www.modejong.com/iPhone/. Extracts:
There is also the UIImageView.animationImages API, but it quickly sucks up all the system memory when using more than a couple of decent size images.
I wanted to show a full screen animation that lasts 2 seconds, at 15 FPS that is a total of 30 PNG images of size 480x320. This example implements an animation oriented view controller that simply waits to read the PNG image data for a frame until it is needed.
Instead of allocating many megabytes, this class runs in about half a meg of memory, with about 5-10% CPU utilization on a 2nd-gen iPhone. The example has also been updated to optionally play an audio file via AVAudioPlayer as the animation is displayed.
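The extract's figures are easy to sanity-check, assuming each frame decodes to 32-bit RGBA:

```swift
import Foundation

// Why UIImageView.animationImages blows up: it keeps every frame decoded.
// 30 full-screen 480x320 frames at 4 bytes per pixel:
let frameBytes = 480 * 320 * 4        // one decoded frame
let allFrames = frameBytes * 30       // ~17.6 MB held at once

// The lazy approach decodes one frame at a time:
let lazyPeak = frameBytes             // ~0.6 MB, matching "half a meg"
```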

iOS Animated Logo -- Low memory alternatives

I have a three-second PNG sequence (a logo animation) that I'd like to display right after my iOS app launches. Since this is the only animated sequence in the app, I'd prefer not to use Cocos2D.
But with UIImageView's animationImages, the app runs out of memory on iPod Touch devices.
Is there a more memory-conscious/efficient way to show this animation? Perhaps a sprite-sheet class that doesn't involve Cocos2D? Or something else?
If this is an animated splash screen or similar, note that the HIG frowns on such behavior (outside of fullscreen games, at least).
If you're undeterred by such arguments (or making a game), you might consider saving your animation as an MPEG-4 video and using MPMoviePlayerController to present it. With a good compressor, it should be possible to get the size and memory usage down quite a lot and still have a good quality logo animation.
I doubt you're going to find much improvement any other way - a sprite sheet, for example, is still doing the same kind of work as a sequence of PNGs. The problem is that in most animations a lot of the pixels are untouched from frame to frame; if you present it as a series of images, you waste a lot of time and space on temporally duplicated pixels. This is why we have video codecs.
You could try manually loading/unloading the PNG images as needed; I don't know what your frame-rate requirements are. Also consider a decent-quality JPG or an animated GIF. And you can always make the image smaller so it doesn't take up the whole screen. Just a few thoughts.

iOS: playing a frame-by-frame greyscale animation in a custom colour

I have a 32-frame greyscale animation of a diamond exploding into pieces (i.e. 32 PNG images @ 1024x1024).
My game uses 12 separate colours, so I need to perform the animation in any desired colour.
This, I believe, rules out any Apple frameworks; it also rules out a lot of public code for animating frame by frame in iOS.
What are my potential solution paths?
These are the best SO links I have found:
Faster iPhone PNG Animations
frame by frame animation
Is it possible using video as texture for GL in iOS?
That last one just shows it may be possible to load an image into a GL texture each frame (he is doing it from the camera, so if I have everything stored in memory, that should be even faster).
I can see these options (listed laziest first, most optimised last):
option A
Each frame (courtesy of CADisplayLink), load the relevant image from file into a texture, and display that texture.
I'm pretty sure this is stupid, so on to option B.
option B
Preload all images into memory.
Then, as above, but loading each frame from memory rather than from file.
I think this is going to be the ideal solution - can anyone give it the thumbs up or thumbs down?
option C
Preload all of my PNGs into a single GL texture of the maximum size, creating a texture atlas. Each frame, set the texture coordinates to that frame's rectangle in the atlas.
While this is potentially a perfect balance between coding efficiency and performance efficiency, the main problem is the loss of resolution: on older iOS devices the maximum texture size is 1024x1024. Cramming 32 frames into this (really the same as cramming 64) leaves 128x128 per frame; if the resulting animation is close to full screen on the iPad, that isn't going to hack it.
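The resolution math for option C can be sketched quickly (assuming the power-of-two grid the question implies):

```swift
import Foundation

// Fitting 32 square frames into one 1024x1024 atlas on older devices.
// With a power-of-two grid, 32 frames round up to an 8x8 = 64-cell grid,
// so each frame gets 1024/8 = 128 px per side.
let atlas = 1024
let frames = 32
var grid = 1
while grid * grid < frames { grid *= 2 }   // smallest power-of-two grid
let cell = atlas / grid                    // pixels per frame side
```

128 px per frame, stretched to near full screen on an iPad, is where the quality loss comes from.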
option D
Instead of loading into a single GL texture, load into a bunch of textures.
Moreover, we can squeeze four images into a single texture by using all four channels.
I baulk at the sheer amount of fiddly coding required here. My RSI starts to tingle just thinking about this approach.
I think I have answered my own question here, but if anyone has actually done this or can see the way through, please answer!
If something higher-performance than (B) is needed, it looks like the key is glTexSubImage2D: http://www.opengl.org/sdk/docs/man/xhtml/glTexSubImage2D.xml
Rather than pull one frame at a time across from memory, we could arrange, say, 16 512x512 8-bit greyscale frames contiguously in memory, send that across to GL as a single 1024x1024 32-bit RGBA texture, and then split it within GL using the above function.
This would mean we perform one [RAM->VRAM] transfer per 16 frames rather than one per frame.
Of course, for more modern devices we could get 64 instead of 16, since more recent iOS devices can handle 2048x2048 textures.
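A quick sanity check of the byte counts behind this idea:

```swift
import Foundation

// 16 frames of 512x512 8-bit greyscale occupy exactly the same number
// of bytes as one 1024x1024 32-bit RGBA texture, so the upload sizes
// line up for the glTexSubImage2D batching trick.
let greyFrames = 16 * 512 * 512 * 1    // 16 greyscale frames
let rgbaTexture = 1024 * 1024 * 4      // one RGBA texture, same bytes

// On devices supporting 2048x2048 textures, the same trick carries 64:
let bigGrey = 64 * 512 * 512 * 1
let bigRGBA = 2048 * 2048 * 4
```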
I will first try technique (B) and leave it at that if it works ( I don't want to over code ), and look at this if needed.
I still can't find any way to query how many GL textures the graphics chip can hold. I've been told that when you try to allocate memory for a texture, GL simply returns 0 once it has run out of memory. However, to implement this properly I would want to be sure I am not sailing close to the wind resource-wise; I don't want my animation to use up so much VRAM that the rest of my rendering fails.
You would be able to get this working just fine with the CoreGraphics APIs; there is no reason to deep-dive into OpenGL for a simple 2D problem like this. For the general approach to creating coloured frames from a greyscale frame, see colorizing-image-ignores-alpha-channel-why-and-how-to-fix. Basically, you need to use CGContextClipToMask() and then render a specific colour, so that what is left is the diamond coloured in with the specific colour you selected. You could do this at runtime, or do it offline and create one video for each of the colours you want to support. It would be easier on your CPU to do the operation N times and save the results into files, but modern iOS hardware is much faster than it used to be.
Beware of memory-usage issues when writing video-processing code; see video-and-memory-usage-on-ios-devices for a primer on the problem space. You could code it all up with texture atlases and complex OpenGL, but an approach that makes use of videos would be a lot easier to deal with, and you would not need to worry so much about resource usage - see my library linked in the memory post if you are interested in saving time on the implementation.
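A minimal Swift sketch of that colorizing step (the function name and parameters are placeholders, not from the linked answer):

```swift
import UIKit

// Clip to the greyscale frame used as a mask, then fill with the desired
// colour; what remains is the diamond tinted with that colour.
func colorize(frame: UIImage, tint: UIColor) -> UIImage? {
    let rect = CGRect(origin: .zero, size: frame.size)
    UIGraphicsBeginImageContextWithOptions(frame.size, false, frame.scale)
    defer { UIGraphicsEndImageContext() }
    guard let ctx = UIGraphicsGetCurrentContext(),
          let mask = frame.cgImage else { return nil }
    // Core Graphics coordinates are flipped relative to UIKit.
    ctx.translateBy(x: 0, y: frame.size.height)
    ctx.scaleBy(x: 1, y: -1)
    ctx.clip(to: rect, mask: mask)      // CGContextClipToMask
    ctx.setFillColor(tint.cgColor)
    ctx.fill(rect)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```

Called once per frame and colour offline, this produces the N pre-tinted sequences the answer suggests saving to files.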
