SpriteKit loses textures - iOS

I'm making a multilevel game based on SpriteKit.
Everything works well except one thing: when the user plays for a long time, changes many levels, etc., SpriteKit starts losing textures.
I mean there is no big red cross like when an image fails to load, just empty space, as if nothing were there.
Hours of debugging and googling have not produced any results.
How can I deal with this bug?

I think I might be having a related issue, except that the loss of textures occurs when I am rapidly running actions on an SKSpriteNode. In my code, I run an action each time I get a touch, and when the touches are rapid and the animations are firing quickly, the base texture of the SKSpriteNode seemingly disappears. No memory warnings, not a peep from the console; the SKSpriteNode's texture is magically set to nil.
I get the impression from your question that this isn't your exact cause, but you are having the same symptoms. Unfortunately I don't know what is causing it. What I've done to work around the issue is to check whether the texture on my SKSpriteNode has been set to nil immediately after I run an SKAction, and then re-assign it if needed.
So, an abridged version (in Swift) of what I'm doing looks like this:
func doAnimation() {
    _character.runAction(someSKAction, withKey: "animation")

    // Whoops! We lost our base texture again, so restore it from the atlas.
    if _character.texture == nil {
        let atlas = SKTextureAtlas(named: "someAtlasName")
        let texture = atlas.textureNamed("idleFrame")
        _character.texture = texture
    }
}
This is not really a solution so much as a workaround, but it might be adaptable to your situation until you (or someone else on SO) figures it out. I won't argue that it's not ugly.
BTW, you are not alone with the disappearing texture issue; I wrote a similar response to a similar question here.

Related

iOS SpriteKit strong collision issues

I have written a few simple SpriteKit games and have always had problems with strong-collision behaviour. Even now, with my latest game in development, the same issue appears quite randomly.
The game-play area is enclosed by an edgeLoopFromRect physics body.
There are some objects (with circular physics bodies) bouncing around.
But sometimes (this happens very rarely), when a strong collision (edge <- body1 <- body2) happens directly near the edgeLoopFromRect physics body, the colliding object (body1) jumps out of the edge loop. I am totally confused about how to debug this kind of behaviour.
EDIT: I have recorded a video of exactly what happens:
Physics engine collapses
For the demo I deliberately made the impulse strength (on the biggest blue ball) ten times stronger, so the effect can be seen sooner. Look at what happens after 10-15 seconds: the physics totally collapses. You can see the background going completely off and the balls flying off into the unknown.
Note: the only thing I do in this demo is apply an impulse to the blue ball.
You need to enable usesPreciseCollisionDetection. This forces SpriteKit to check collisions along the body's entire movement between frames (a swept check), which prevents fast bodies from tunnelling through the edge and popping out. In Objective-C the code looks like this:
self.body1.physicsBody.usesPreciseCollisionDetection = YES;
And in Swift:
body1.physicsBody?.usesPreciseCollisionDetection = true
This will have a performance impact, but it will prevent the glitch from happening. You will probably have to enable it on all of the circular physics bodies that you have moving around.
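For example, a minimal sketch in Swift, assuming (purely for illustration) that you keep the moving balls in an array called balls; adapt it to however you actually track them:
for ball in balls {
    // Enable swept collision checks for every fast-moving circular body, not just body1.
    ball.physicsBody?.usesPreciseCollisionDetection = true
}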

Swift SpriteKit making physicsbody from texture of an image slows down my app too much

I'm trying to make an iOS app that includes some collision detection between two physics bodies. I want one of the physics bodies to be the shape of an image I am using, but when I try to do this using a texture it slows my app down tremendously and eventually causes it to freeze altogether. These are the two lines of code that are causing it:
let texture = SKTexture(imageNamed: "image.png")
physicsBody = SKPhysicsBody(texture: texture, size: size)
However, if I change these two lines to something like
physicsBody = SKPhysicsBody(rectangleOfSize: size)
then everything runs perfectly fine. Has anyone else had this problem and/or found a solution?
This may be due to the complex nature of your texture, but it's hard to tell without seeing it. As Whirlwind said, it probably shouldn't cause such a significant slowdown; however, it's difficult to resolve without further information.
A way to get around creating the SKPhysicsBody from a texture would be to use an online tool to build the body from a path. I use this tool personally. It may be a decent workaround.
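To illustrate the idea, here is a minimal sketch (using current Swift API names, and assuming it runs in the same place as the snippets above; the triangular path is only a placeholder, in practice you would use the points your tool generates for your image):
let path = CGMutablePath()
path.move(to: CGPoint(x: -size.width / 2, y: -size.height / 2))
path.addLine(to: CGPoint(x: size.width / 2, y: -size.height / 2))
path.addLine(to: CGPoint(x: 0, y: size.height / 2))
path.closeSubpath()
// A polygon body is far cheaper to build and simulate than one traced from texture alpha.
physicsBody = SKPhysicsBody(polygonFrom: path)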

SKPhysicsWorld bodyWithTexture not working well with complex shapes

Am I the only one who's having issues with the new bodyWithTexture method of SKPhysicsBody?
I'm new to iOS development and maybe it's me, but I'm trying to create a game where I need to detect if a ball is inside a path.
I'm loading both from images dynamically (as the levels progress, the paths become more and more complex), and I'm assigning a physics body to both the ball (bodyWithCircle) and the dynamic path, which is a PNG of the path with everything else a transparent background. I'm using the new bodyWithTexture method (yes, I know it's supported only under iOS 8), and after assigning bit masks I've defined a contact between the ball and the path, and I am notified of begin/end contact.
SKSpriteNode *lvlPath = [SKSpriteNode spriteNodeWithImageNamed:currentLevel.imagePath];
lvlPath.position = CGPointMake(self.frame.size.width/2, self.frame.size.height/2);
lvlPath.physicsBody = [SKPhysicsBody bodyWithTexture:lvlPath.texture size:lvlPath.frame.size];
Now, for simple paths like a straight line it works great. Once it comes to complicated paths, the mechanism goes crazy, at least in my simulator (running iOS 8).
I've created another simple app just to check this issue, and saw that it goes crazy when the path is a complex shape. When the ball enters the path in one direction it seems to work (begin/end contact), but going in the reverse direction it suddenly acts weird: while still inside the path it reports that contact has ended, and then seemingly randomly flips between begin and end contact.
Help... Since the levels are loaded dynamically, this is a really useful feature for me, saving me from defining every level as a CGPathRef and creating a polygon for each level (and perhaps for each device).
thanks all,
Eyal
edit
example screenshot:
https://www.dropbox.com/s/e8v9g1kajtvakfq/screenshot%20bodywithtexture.jpg?dl=0
In this example the ball with the arrow is initialised using bodyWithCircle, and the C-shaped object is initialised using bodyWithTexture. I'm debug-printing "didBeginContact" and "didEndContact", and it freaks out at the top line there: you can see it reports "didEndContact" while the two objects are definitely in contact. If I jiggle it (I'm moving it with the cursor) it suddenly flips to "didBeginContact".
With simpler objects (like horizontal/vertical lines with rounded corners) it works perfectly.

Smooth Rotation using Bullet and Ogre3D

I've been struggling with an issue with orienting characters in a game I'm implementing using Ogre3D and Bullet physics.
What I have: A Direction Vector that the character is moving in, along with its current orientation.
What I need: To set the orientation of the character to face the way it is moving.
I have a snippet of code that sort of does what I want:
btTransform src = body->getCenterOfMassTransform();
btVector3 up = BtOgre::Convert::toBullet(Ogre::Vector3::UNIT_X);
btVector3 normDirection = mDirection.normalized();
btScalar angle = acos(up.dot(normDirection));
btVector3 axis = up.cross(normDirection);
src.setRotation(btQuaternion(axis, angle));
body->setCenterOfMassTransform(src);
Where 'body' is the rigidbody I'm trying to orient.
This snippet has a couple of problems however:
1) When changing direction, it tends to 'jitter', i.e. it rapidly faces one way, then the opposite way, for a second or so before settling into the orientation it is supposed to have.
2) Most times that the code is run I get an assertion error from Bullet's btQuaternion on
assert(d != btScalar(0.0));
Can anyone help?
Thanks!
I think you shouldn't use functions like acos for this, as they can cause inconsistencies in border cases such as the 180-degree vs 0-degree rotation mentioned above, and they can introduce significant numerical error.
The second thing is that, in general, you should avoid setting explicit positions and rotations in physics engines, and instead apply forces and torques to make your body move as you want. Your current approach may work perfectly now, but when you add another object and force your character to occupy the same space, your simulation will explode. At that stage it is very hard to fix, so it's better to do it right from the start :) .
I know that finding the correct force/torque can be tricky, but it's the best way to keep your simulation consistent.
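To make the first point concrete, here is the usual acos-free construction of the shortest-arc rotation between two unit vectors. It is sketched in Swift with simd (to match the other examples on this page) rather than Bullet's C++ API, but the same construction carries over directly to btQuaternion:
import simd

// Shortest-arc rotation from unit vector u to unit vector v, built from the dot and
// cross products with no acos call, and with the 180-degree case handled explicitly.
func shortestArc(from u: simd_float3, to v: simd_float3) -> simd_quatf {
    let d = simd_dot(u, v)
    if d < -0.999999 {
        // Opposite vectors: any axis perpendicular to u gives a valid 180-degree turn.
        var axis = simd_cross(simd_float3(1, 0, 0), u)
        if simd_length_squared(axis) < 1e-6 {
            axis = simd_cross(simd_float3(0, 1, 0), u)
        }
        return simd_quatf(angle: .pi, axis: simd_normalize(axis))
    }
    // q = (cross(u, v), 1 + dot(u, v)), then normalize: the half-angle falls out of the
    // normalization, so there is no angle extraction and no division by zero.
    let c = simd_cross(u, v)
    return simd_normalize(simd_quatf(ix: c.x, iy: c.y, iz: c.z, r: 1 + d))
}
In your snippet, the assert most likely fires when up and normDirection are parallel, because the cross product (the rotation axis) is then the zero vector; a construction like the one above never passes a zero axis to the quaternion.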

Why isn't Quartz double buffering my drawInContext()?

I am rendering a simple line drawing (a line with some text in the middle) in a CALayer subclass via drawInContext(). I update this layer as the user performs a gesture by calling setNeedsDisplay on it. The effect I am seeing is what I might expect if there were no double buffering going on: I see parts of the new rendering overlapping parts of the old rendering. When I stop updating (complete the gesture), the system "catches up" and I always see the correct final result, but during the updates I see inconsistent results. This effect is not subtle and sometimes it is extreme: if I keep updating fast enough I can keep stale parts of the drawing on the screen for seconds while the new parts are drawn ahead.
I don't understand this at all. If Quartz is doing buffering, then it seems it is either not blitting the result to the screen in its entirety or it is miscalculating the affected area.
Things I've tried:
1) I am disabling implicit animations and doing all of the drawing within a CATransaction
2) I am not making a mistake in my drawing... It's literally just two lines with some text in between... there is no way that I'm rendering the intermediate artifacts.
3) I have tried limiting the rate of updates by skipping most of them... but even at the lower rate I see artifacts until I stop updating and let the system catch up.
4) BTW, this happens identically in the simulator and on the device (iPad).
Is it necessary for me to draw into an offscreen buffer myself and copy it to the screen in its entirety? I thought that I had read that Quartz does this for me.
Update:
As usual, after hours of banging my head against the wall I found the (partial) answer 5 minutes after posting the question. I realized that I was using a CATiledLayer in order to get my layer re-rendered on zoom. If I switch it back to a regular CALayer the glitches go away. So I guess what I am seeing are artifacts of the separate tiles rendering at different times. Now I am trying to figure out how to deal with this...
So, it turns out that I had three problems:
1) CATiledLayer explicitly fades in new tile content with a default duration of 0.25 seconds. This was causing havoc with my drawing. I overrode it in my CATiledLayer subclass:
+ (CFTimeInterval)fadeDuration {
    NSLog(@"got fade duration");
    return 0;
}
2) I also had to increase the tile size (I set it to 1024x1024, though I don't know what size it is actually using).
3) I was making adjustments to my layer's frame periodically during the updates and that seemed to cause additional problems for the tiled layer. I am making changes to stop that.
With all of those changes the performance seems acceptable now.
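For reference, here is roughly what points 1 and 2 look like in code, sketched in Swift (the subclass name LineDrawingLayer is just a placeholder; the 1024x1024 value is the one mentioned above):
import QuartzCore

class LineDrawingLayer: CATiledLayer {
    // 1) Disable the default 0.25 s fade-in of newly rendered tiles.
    override class func fadeDuration() -> CFTimeInterval {
        return 0
    }
}

let layer = LineDrawingLayer()
// 2) Raise the tile size so a single tile can cover the whole drawing area.
layer.tileSize = CGSize(width: 1024, height: 1024)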
