Save generated SKTexture to file - iOS

I've now filed a bug for the issue below. Does anyone have a good workaround?
I'm trying to save an SKTexture to file and load it back again, but I don't succeed. The following code snippet can be copied into GameScene.m in the Xcode starter project.
I use textureFromNode in generateTexture, and that seems to be the root cause of my problem. If I use a texture from a sprite instead, the code works and two spaceships are visible.
This code worked in iOS 8 but stopped working in Xcode 7 & iOS 9. I just want to verify that this is a bug before I file a bug report. My worry is that I'm doing something wrong with NSKeyedArchiver.
It happens both in the simulator and on device.
#import "GameScene.h"
#implementation GameScene
// Generates a texture
- (SKTexture *)generateTexture
{
SKScene *scene = [[SKScene alloc] initWithSize:CGSizeMake(100, 100)];
SKShapeNode *shapeNode = [SKShapeNode shapeNodeWithRectOfSize:CGSizeMake(50, 50)];
shapeNode.position = CGPointMake(50, 50);
shapeNode.strokeColor = SKColor.redColor;
shapeNode.lineWidth = 10;
[scene addChild:shapeNode];
SKTexture *texture = [self.view textureFromNode:scene];
//SKTexture *texture = [SKSpriteNode spriteNodeWithImageNamed:#"Spaceship"].texture; // This works!
return texture;
}
// Just generate a path
- (NSString *)fullDocumentsPath
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *yourFileName = [documentsDirectory stringByAppendingPathComponent:@"fileName"];
    return yourFileName;
}
- (void)didMoveToView:(SKView *)view
{
    self.scaleMode = SKSceneScaleModeResizeFill;

    // Verify that the generateTexture method indeed produces a valid texture.
    SKSpriteNode *s1 = [SKSpriteNode spriteNodeWithTexture:[self generateTexture]];
    s1.position = CGPointMake(100, 100);
    [self addChild:s1];

    // Start with saving the texture.
    NSString *fullName = [self fullDocumentsPath];
    NSError *error;
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    if ([fileMgr fileExistsAtPath:fullName])
    {
        [fileMgr removeItemAtPath:fullName error:&error];
        assert(error == nil);
    }
    NSDictionary *dict1 = [NSDictionary dictionaryWithObject:[self generateTexture] forKey:@"object"];
    BOOL ok = [NSKeyedArchiver archiveRootObject:dict1 toFile:fullName];
    assert(ok);

    // Read back the texture and place it in a sprite. This sprite is not shown. Why?
    NSData *data = [NSData dataWithContentsOfFile:fullName];
    NSDictionary *dict2 = [NSKeyedUnarchiver unarchiveObjectWithData:data];
    SKTexture *loadedTexture = [dict2 objectForKey:@"object"];
    SKSpriteNode *s2 = [SKSpriteNode spriteNodeWithTexture:loadedTexture];
    NSLog(@"t(%f, %f)", loadedTexture.size.width, loadedTexture.size.height); // Size of sprite & texture is zero. Why?
    s2.position = CGPointMake(200, 100);
    [self addChild:s2];
}
@end
Update for Yudong:
This might be a more relevant example, but imagine that the scene consists of 4 layers with lots of sprites. When gameplay is over I want to store a thumbnail image of the end scene of the match. The image will be used as a texture on a button. Pressing that button will start a replay movie of the match. There will be lots of buttons with images of old games, so I need to store each image to a file.
- (SKTexture *)generateTexture
{
    SKScene *scene = [[SKScene alloc] initWithSize:CGSizeMake(100, 100)];
    SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];
    ship.position = CGPointMake(50, 50);
    [scene addChild:ship];
    SKTexture *texture = [self.view textureFromNode:scene];
    NSLog(@"texture: %@", texture);
    return texture;
}
The solution/workaround:
Inspired by Russell's code I did the following. It works!

CGImageRef cgImg = texture.CGImage;
SKTexture *newText = [SKTexture textureWithCGImage:cgImg];
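
For context, here is a minimal sketch of how that round-trip could slot into the save/load code from the question (reusing generateTexture and fullDocumentsPath above; the dictionary key and file handling simply mirror the original snippet):

// Force the texture data into a CGImage-backed texture before archiving
SKTexture *texture = [self generateTexture];
SKTexture *archivableTexture = [SKTexture textureWithCGImage:texture.CGImage];

// Archive and unarchive exactly as in the original snippet
NSString *fullName = [self fullDocumentsPath];
BOOL ok = [NSKeyedArchiver archiveRootObject:@{@"object": archivableTexture} toFile:fullName];
assert(ok);

NSData *data = [NSData dataWithContentsOfFile:fullName];
NSDictionary *dict = [NSKeyedUnarchiver unarchiveObjectWithData:data];
SKTexture *loadedTexture = dict[@"object"]; // now reports a non-zero size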

I've done a lot of experimenting/hacking with SKTextures. My game utilizes SKTextures and is written in Swift. Specifically, I've had many problems with textureFromNode and textureFromNode:crop: and with creating SKPhysicsBodies from textures. These methods worked fine in iOS 8, but Apple completely broke them when they released iOS 9.0. In iOS 9.0, the textures were coming back as nil, and those nil textures broke the SKPhysicsBodies created from them.
I recently worked on serialization/deserialization of SKTextures.
Some key ideas/clues you might investigate are:
Run iOS 9.2. Apple staff mentioned that a lot of issues have been fixed: https://forums.developer.apple.com/thread/17463 I've found iOS 9.2 helps with SKTextures but didn't solve every issue, especially the serialization issues.
Try PrefersOpenGL (set it to "YES" as a Boolean custom property in your config). Here is a post about PrefersOpenGL in the Apple Dev Forums by Apple staff: https://forums.developer.apple.com/thread/19683 I've observed that iOS 9.x seems to use Metal by default rather than OpenGL. I've found PrefersOpenGL helps with SKTexture issues but still doesn't make my SKShaders work (written in GLSL).
When I tried to serialize/deserialize nodes with SKTextures on iOS 9.2, I got white boxes instead of visible textures. Inspired by the Apple SKTexture docs that say, "The texture data is loaded when:
The size method on the texture object is called.
Another method is called that requires the texture’s size, such as creating a new SKSpriteNode object that uses the texture object.
One of the preload methods is called (See Preloading the Texture Data.)
The texture data is prepared for rendering when:
A sprite or particle that uses the texture is part of a node tree that is being rendered."
... I've hacked a workaround that creates a secondary texture from the CGImage() call:
// iOS 9.2 workaround for white boxes on serialization
let img = texture!.CGImage()
let uimg = UIImage(CGImage: img)
let ntex = SKTexture(image: uimg)
let sprite = SKSpriteNode(texture: ntex, size: texture!.size())
So now my SKSpriteNodes created this way seem to serialize/deserialize fine. BTW, just invoking size() or creating an SKSpriteNode with the original texture does not seem to be enough to reify the texture into memory.
You didn't ask about textureFromNode:crop: but I'm adding observations anyway in case they help you: in iOS 8 this method worked (although the crop parameters were very tricky and seemed to require normalization with UIScreen.mainScreen().scale). In iOS 9.0, this method didn't work at all (it returned nil). In iOS 9.2 this method works again (it now returns a non-nil texture); however, subsequent creation of nodes from the texture does not need the size normalization. And furthermore, to make serialization/deserialization work, I found you ultimately have to do #3 above.
I hope this helps you. I imagine I've struggled more than most with SKTextures since my app is so dependent on them.

I tested your code in Xcode 7 and found that the texture returned by generateTexture was null. That's the reason why you can't load anything from the file; you haven't even saved anything.
Try using NSLog to log the description of your texture or sprite, e.g. add this line in generateTexture:

NSLog(@"texture: %@", texture);

What you will get in the console:

texture: <SKTexture> '(null)' (300 x 300)

And the same for s1 and dict1 in your code:

s1: <SKSpriteNode> name:'(null)' texture:[<SKTexture> '(null)' (300 x 300)] position:{100, 100} scale:{1.00, 1.00} size:{100, 100} anchor:{0.5, 0.5} rotation:0.00

dict1: {
    object = "<SKTexture> '(null)' (300 x 300)";
}
You may do these tests on both iOS 8 and iOS 9 and you will probably get different results.
I'm not sure why you add the SKShapeNode to a scene and then save the texture from the scene. One workaround is to set a texture on your SKShapeNode and return that, and your code should work fine.

shapeNode.fillTexture = [SKTexture textureWithImageNamed:@"Spaceship"];
SKTexture *texture = shapeNode.fillTexture;
return texture;
Update:
It's quite annoying that textureFromNode doesn't work as expected in iOS 9. I tried to solve it by trial and error but had no luck in the end. Thus, I asked whether you would consider making a snapshot of the whole screen and setting it as your thumbnail. Here's the progress I made today; I hope you get inspired by it.
I created a scene containing an SKLabelNode and an SKSpriteNode in didMoveToView. After I tapped anywhere on screen, snapshot was invoked and the down-scaled screenshot was saved in the Documents folder. I used the code here.
- (UIImage *)snapshot
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.5);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshotImage;
}
Since the thumbnail is saved as a UIImage, loading it back as the sprite's texture can be done easily, as in the sketch below. A sample project demonstrates the whole process, and it works on both iOS 8 and 9.
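For completeness, a minimal sketch of that last step (the file name thumbnail.png and the replayButton node are placeholders for illustration, not part of the sample project):

// Write the snapshot to the Documents folder as a PNG
UIImage *thumbnail = [self snapshot];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *thumbPath = [[paths firstObject] stringByAppendingPathComponent:@"thumbnail.png"];
[UIImagePNGRepresentation(thumbnail) writeToFile:thumbPath atomically:YES];

// Later, load it back and use it as the texture of a button sprite
UIImage *loaded = [UIImage imageWithContentsOfFile:thumbPath];
SKSpriteNode *replayButton = [SKSpriteNode spriteNodeWithTexture:[SKTexture textureWithImage:loaded]];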

Related

Generate an image of the contents of a SKScene which is not displayed

In my current SKScene, I use UIGraphicsGetImageFromCurrentImageContext to generate an image from the current graphics on screen.
However, I would like to generate an image from a scene that is not on screen, one I have created but have not displayed. Is this possible?
The reason for doing this is to create a custom image for users to share when they achieve a high score, which is similar to, but not the same as, my main scene.
Here is a method that captures the contents of a node as a PNG. Be aware that current SpriteKit seems to have a memory leak on the access of the CGImage property, so use this in DEBUG mode.
+ (NSData *)captureNodeAsPNG:(SKSpriteNode *)node skView:(SKView *)skView
{
    NSData *data = nil;
    @autoreleasepool {
        SKTexture *captureTexture = [skView textureFromNode:node];
        CGImageRef cgImageRef = captureTexture.CGImage;
        NSLog(@"capture texture from node as pixels : %d x %d", (int)CGImageGetWidth(cgImageRef), (int)CGImageGetHeight(cgImageRef));
        UIImage *capImg = [UIImage imageWithCGImage:cgImageRef];
        data = UIImagePNGRepresentation(capImg);
    }
    return data;
}
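
A hedged usage sketch (the helper class name SceneCapture, the node name "ship", and the output file name are placeholders, not part of the original answer):

// Hypothetical helper class and node; adjust to your project
SKSpriteNode *mySprite = (SKSpriteNode *)[self childNodeWithName:@"ship"];
NSData *png = [SceneCapture captureNodeAsPNG:mySprite skView:self.view];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths firstObject] stringByAppendingPathComponent:@"capture.png"];
[png writeToFile:path atomically:YES];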

SpriteBuilder file not found when creating CCTexture, but works when creating CCSprite?

In my game I initially create sprites using this code:
- (void)addSpritesForCookies:(NSSet *)cookies {
    for (BBQCookie *cookie in cookies) {
        NSString *directory = [NSString stringWithFormat:@"sprites/%@.png", [cookie spriteName]];
        CCSprite *sprite = [CCSprite spriteWithImageNamed:directory];
        sprite.position = [self pointForColumn:cookie.column row:cookie.row];
        [self.cookiesLayer addChild:sprite];
        cookie.sprite = sprite;
    }
}
This works perfectly fine, and the sprites display properly. All of the textures that I need are in a smart sprite sheet that I created in SpriteBuilder. Here is a screenshot of the structure in SpriteBuilder, since I don't have enough reputation to post images here: https://www.evernote.com/shard/s30/sh/e1ea553f-a26c-4ecd-866b-551b50f14bd7/cc338b5ea3e557777a9a3acdeb0cd5ac
Then later on I need to change the texture on some of the sprites. I'm using the following code:
CCActionCallBlock *changeSprite = [CCActionCallBlock actionWithBlock:^{
    NSString *directory = [NSString stringWithFormat:@"sprites/%@.png", [combo.cookieB spriteName]];
    CCTexture *texture = [CCTexture textureWithFile:directory];
    combo.cookieB.sprite.texture = texture;
}];
However, I just get these messages logged:
2015-01-17 12:22:09.294 BbqBlitz[65732:4875151] -[CCFileUtils fullPathForFilename:contentScale:] : cocos2d: Warning: File not found: sprites/Cupcake.png
2015-01-17 12:22:09.294 BbqBlitz[65732:4875151] cocos2d: Couldn't find file:sprites/Cupcake.png
What's weird is that I'm using exactly the same file path to initially create the sprites, which works fine. But now that I'm trying to create a texture it's not working, and the sprites are just turning into black squares in my game.
I've already read this similar question: SpriteBuilder image file not found but from what I can tell the directory I'm using is correct.
What am I missing here? Thanks so much for any help!
Try to load it without the folder name:

NSString *directory = [NSString stringWithFormat:@"%@.png", [combo.cookieB spriteName]];

It could be part of a sprite sheet after SpriteBuilder processing, I suppose.
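Another direction worth trying, assuming cocos2d v3.x as shipped with SpriteBuilder: since the image lives inside a packed sprite sheet rather than as a standalone file, swapping the sprite's frame instead of its raw texture may resolve the lookup through the sprite-frame cache. A sketch only, not tested against the OP's project:

CCActionCallBlock *changeSprite = [CCActionCallBlock actionWithBlock:^{
    // Look the image up as a sprite frame, which works for sheet-packed images
    NSString *frameName = [NSString stringWithFormat:@"sprites/%@.png", [combo.cookieB spriteName]];
    CCSpriteFrame *frame = [CCSpriteFrame frameWithImageNamed:frameName];
    combo.cookieB.sprite.spriteFrame = frame;
}];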

EZAudio iOS - Save Waveform to Image

I'm using the EZAudio library for iOS to handle the playback of an audio file and draw its waveform. I'd like to create a view with the entire waveform (an EZAudioPlotGL view, which is a subclass of UIView) and then save it as a PNG.
I'm having a couple problems with this:
The temporary audio plot I'm creating to save the snapshot image is drawing to the view, which I don't understand because I never add it as a subview.
The tempPlot is only drawing the top half of the waveform (not "mirrored" as I set it in the code)
The UIImage being saved from the tempPlot is only saving a short portion of the beginning of the waveform.
The problems can be seen in these images:
How the screen should look after (the original audio plot):
How the screen does look (showing the tempPlot I don't want to draw to the screen):
The saved image I get out that should be a copy of tempPlot:
The EZAudio library can be found here: https://github.com/syedhali/EZAudio
And my project can be found here, if you want to see the problem for yourself: https://www.dropbox.com/sh/8ilfaofvaa8aq3p/AADU5rOwqzCtEmJz-ePRXIDZa
I'm not very experienced with OpenGL graphics, so a lot of the work going on inside the EZAudioPlotGL class is a bit over my head.
Here's the relevant code:
ViewController.m:
@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Customizing the audio plot's look
    self.audioPlot.backgroundColor = [UIColor blueColor];
    self.audioPlot.color = [UIColor whiteColor];
    self.audioPlot.plotType = EZPlotTypeBuffer;
    self.audioPlot.shouldFill = YES;
    self.audioPlot.shouldMirror = YES;
    // Try opening the sample file
    [self openFileWithFilePathURL:[NSURL fileURLWithPath:kAudioFileDefault]];
}

- (void)openFileWithFilePathURL:(NSURL *)filePathURL {
    self.audioFile = [EZAudioFile audioFileWithURL:filePathURL];
    // Plot the whole waveform
    [self.audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
        [self.audioPlot updateBuffer:waveformData withBufferSize:length];
    }];
    // Save the whole waveform as an image
    [self.audioPlot fullWaveformImageForSender:self];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"waveformImage.png"];
    [UIImagePNGRepresentation(self.snapshotImage) writeToFile:filePath atomically:YES];
}

@end
My category on EZAudioPlotGL:

- (void)fullWaveformImageForSender:(ViewController *)sender {
    EZAudioPlotGL *tempPlot = [[EZAudioPlotGL alloc] initWithFrame:self.frame];
    [tempPlot setPlotType:EZPlotTypeBuffer];
    [tempPlot setShouldFill:YES];
    [tempPlot setShouldMirror:YES];
    [tempPlot setBackgroundColor:[UIColor redColor]];
    [tempPlot setColor:[UIColor greenColor]];
    // Plot the full waveform on tempPlot
    [sender.audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
        [tempPlot updateBuffer:waveformData withBufferSize:length];
        // tempPlot.glkVC is a getter for the private EZAudioPlotGLKViewController property in tempPlot (added by me in the EZAudioPlotGL class)
        sender.snapshotImage = [((GLKView *)tempPlot.glkVC.view) snapshot];
    }];
}
drawViewHierarchyInRect only works for capturing CoreGraphics-based view drawing. (CG drawing happens on the CPU and renders into a buffer in main memory, so CG, aka UIGraphics, can just slurp an image out of there.) It won't help you if your view draws its content using OpenGL. (OpenGL drawing happens on the GPU, so you need to use GL to read pixels back from the GPU to main memory before you can build an image out of them.)
It looks like your library does its drawing with an instance of GLKView. (Poking around in the source, EZAudioPlotGL uses EZAudioPlotGLKViewController, which creates its own GLKView.) That class, in turn, has a snapshot method that does all the heavy lifting to get pixels back from the GPU and put them in a UIImage.
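As a minimal illustration of that last point (reusing the glkVC accessor the question author added; treat this as a sketch rather than the library's official API):

// GLKView's snapshot property reads the GPU framebuffer back into a UIImage
GLKView *glView = (GLKView *)tempPlot.glkVC.view; // accessor added by the question author
UIImage *waveformImage = glView.snapshot;
NSData *pngData = UIImagePNGRepresentation(waveformImage);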

iOS: Loading texture from atlas not working

I have an atlas with a bunch of tiles and I am trying to load them into memory using SKTexture and SKTextureAtlas, but it is not working. I use the following code to load them:
NSString *atlasName = [NSString stringWithFormat:@"Tiles"];
SKTextureAtlas *tileAtlas = [SKTextureAtlas atlasNamed:atlasName];
NSInteger numberOfTiles = tileAtlas.textureNames.count;
backgroundTiles = [[NSMutableArray alloc] initWithCapacity:numberOfTiles];
for (int y = 0; y < 5; y++) {
    for (int x = 0; x < 9; x++) {
        int tileNumber = y*9 + x + 1;
        NSString *textureName = [NSString stringWithFormat:@"tile%d.png", tileNumber];
        SKSpriteNode *tileNode = [SKSpriteNode spriteNodeWithTexture:[tileAtlas textureNamed:textureName]];
        CGPoint position = CGPointMake((0.5 + x)*_tileSize - _levelWidth/2, (0.5 - y - 1)*_tileSize + _levelHeight/2);
        tileNode.position = position;
        tileNode.zPosition = -1.0f;
        tileNode.blendMode = SKBlendModeReplace;
        [(NSMutableArray *)backgroundTiles addObject:tileNode];
    }
}
Then I use this code to add them to my scene:
- (void)addBackgroundTiles
{
    for (SKNode *tileNode in [self backgroundTiles]) {
        [self addChild:tileNode];
    }
}
The problem is it doesn't load the correct texture for a tile, or doesn't find the texture at all.
What I end up with is this (ignore the blue circle): http://i.stack.imgur.com/g39BF.png
Here is my tile atlas: http://snk.to/f-ctp5yhpz
EDIT: I am using NameChanger (www.mrrsoftware.com/MRRSoftware/NameChanger.html) to rename all my tiles. Could it be that program that messes up my PNGs? As far as I can see they are in the correct order after I have renamed them.
Solution
Editing my answer to point out that the solution is in the comments below this answer.
It turned out that the issue was caused by Xcode not rebuilding the atlas after the image files were renamed outside of Xcode (presumably by the file renaming tool the OP mentioned).
By cleaning and rebuilding the project, all the texture atlases were built again, and the OP's code started working.
Original answer
Two things to double-check:
Is your .atlas added to your project as a folder or a group? It must be a folder (blue icon in Xcode, instead of yellow).
After adding Tiles.atlas to your project, you must also enable atlas generation in Xcode settings.
See here for a similar issue: How to create atlas for SpriteKit. I linked to Apple documentation on incorporating texture atlases into your projects, which has detailed step-by-step instructions on enabling atlas generation.
Why the double for loops?
Are you saving the backgroundTiles array as a property?
I've had this occur recently and the only fix that worked was:
[SKSpriteNode spriteNodeWithTexture:[SKTexture textureWithImageNamed:@"someTile.png"]];

textureWithImageNamed always gets the right one.
So try:
SKSpriteNode *tileNode = [SKSpriteNode spriteNodeWithTexture:[SKTexture textureWithImageNamed:textureName]];

Changing texture of game but the image is not replaced according to cocos HD definition

I am making a game in cocos2d. I am checking the level of the sprite and updating the texture accordingly.
At the start the image is placed in HD alright, but when the game starts and I start moving the sprite, the HD image is replaced with the normal one. I am checking and replacing the texture with this code:
int value = [self.weapon.weaponLevel intValue];
NSString *path = [[NSBundle mainBundle] pathForResource:@"name" ofType:@"png"];
UIImage *imageView = [[UIImage alloc] initWithContentsOfFile:path];
CCTexture2D *frame = [[CCTexture2D alloc] initWithImage:imageView];
[self.player setTexture:frame];
Can anyone please help me out here? Thanks and regards.
There's no need to go through NSBundle and UIImage when you can use CCTextureCache, which handles all the path resolution and HD/SD detection internally.
Plus, as the name implies, it caches the texture so that subsequent use of the same sprite file is faster.
Here is what you need:
CCTexture2D *frame = [[CCTextureCache sharedTextureCache] addImage:@"name.png"];
[self.player setTexture:frame];
