Core Animation on retina - ios

I wrote some code for a Mac (Cocoa) app, where the animations are smooth and lovely. However, I copied it as-is into the iOS project and ran it on "The New iPad", the resolutionary thingy, and for some reason the layer animation is jagged: the layer suddenly moves a few pixels up, stops for a second, then suddenly pops up a few more pixels. It looks like a very low frame rate animation.
Interestingly, running the same code on the old iPad 1 gave smooth results! x( .. which made me realize it's a retina display problem.
I am doing a faint (slow) animation, not moving the layers around much. For example, moving layer.position.x 10 pixels over a period of 9.1 sec. This hints that the layer is not interpolating the sub-pixel (0.x) positions.
I tried increasing the speed (reducing the duration) by a factor of four, and it animates without problems. :/ But faint (slow) animations have problems..
Any Ideas?
If my question is vague, this might help:
Move a layer 10 pixels over a 10 sec duration. On the iPad 1, it looks great (60 FPS, presumably).
On the new iPad (retina), it looks as if it's running at 10 FPS (or something like that)!

To animate properly on Retina screens, an extra line of code was missing .. >.<
if ([subLayer respondsToSelector:@selector(setContentsScale:)]) {
    subLayer.contentsScale = [[UIScreen mainScreen] scale];
}
From this awesome guy: Retina display core graphics font quality
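If the layer has sublayers of its own, the same fix can be applied to the whole layer tree. A minimal sketch (the helper name is mine, not part of any API):

```objc
// Hypothetical helper: set the screen's scale on a layer and all of its
// sublayers so every one of them renders at retina resolution.
static void SetContentsScaleRecursively(CALayer *layer, CGFloat scale) {
    if ([layer respondsToSelector:@selector(setContentsScale:)]) {
        layer.contentsScale = scale;
    }
    for (CALayer *sub in layer.sublayers) {
        SetContentsScaleRecursively(sub, scale);
    }
}

// Usage:
// SetContentsScaleRecursively(self.view.layer, [[UIScreen mainScreen] scale]);
```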
(I don't think I should close the question, since this title is more appropriate, as it addresses all retina issues with core animation).

Related

SCNParticleSystem partially hidden or occluded in wrong manner

I have an issue with particle systems, which can, in rare cases, be occluded in the wrong way. The particle system you see in the screenshots is a sphere (with an invisible material, material transparency = 0.0) that emits particles from its surface, around 250 particles per second. No magic, and the particle system works as it should 99% of the time.
You also see a floor (an SCNPlane) that is very large, about 100m x 100m. The occlusion happens when the camera is flying by and the viewing angle changes a little, because the camera moves smoothly. Depending on the camera angle it can happen, as you see in the second image, that the particle system is partially occluded in the wrong way, as if it were behind the horizon. But it is not: it hovers 2m above the floor and has a radius of 1m.
Did anyone run into a similar issue? Is there something that could be done to make this render correctly in all cases (from all viewing angles)?
Sometimes the particle system even disappears completely, e.g. when the camera looks down at it directly from 20m above.
(The scene uses physically based rendering using SceneKit - the background is a simple skybox)
You asked if anyone ran into a similar issue?
I can answer yes!
Depending on the point of view (camera position), and the object on which the SCNParticleSystem is attached, I'm getting weird occlusions of the emitted particles.
I have no SCNPlane, but I have a large SCNSphere around the scene showing a 360 video. If I remove the sphere, the bug doesn't occur anymore.
It might be a regression in iOS 14.x and macOS 11.2, as the same application running under iOS 13.6.1 doesn't show the problem!
In case somebody needs it: I had a similar problem and spent a while trying different particle system settings.
One solution was to increase the "Rendering order" of the node that contains the particles, but the particles disappear if you change the camera orientation.
By chance, I discovered that the bug happens when I add a specific node to the scene. One difference I found: this node had a material with the transparency mode "Dual layer", which I had tried in order to make a transparent texture.
I changed the mode to "Default" and it helped.
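In code, the equivalent change on the offending material would be something like this (a sketch; it assumes the node with the dual-layer material is the one you identified):

```objc
// Switch the material from the dual-layer transparency mode back to the
// default single-pass mode, which avoided the bad occlusion of the particles.
SCNMaterial *material = node.geometry.firstMaterial;
material.transparencyMode = SCNTransparencyModeDefault;
```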

SpriteKit Animation - Keeping Sprites Fixed

I am animating some frames of a monster jumping and swinging a sword, and the frames are such that the width gets bigger or smaller as he swings the sword (the monster standing is 500 px wide, but his sword, fully extended to the left, adds another 200 px, so he varies from 500 to 700 or more in width).
I originally took each frame, which is on a transparent background, and used the Photoshop magic wand tool to select just the monster. I then saved the frames that way, and when I used them to animate, the monster warped and changed size (it looked bad).
The original frames had a large 1000 x 1000 transparent background surrounding him, and as a result he always stayed "bound", so he never warped.
My question is what is a good way to create frames of animation where the sprite inside might change size or width as he's moving so that there is no warping?
If I have to use a large border of transparent pixels, is that the recommended approach? I'm noticing that each monster's animation takes up about 3 - 5 MB. I plan on potentially having a lot of these characters ultimately, so I'm wondering if this is the best approach (using large 900 x 900 images all the time, plus more images for 2x and 1x). All of this seems like it could spiral out of control to 4 or 5 GB.
What are other people doing when making animations that require different poses and positions? Just padding the frames with transparent borders that are as small as possible?
Thanks!
You should probably change the approach to animation and use inverse kinematics instead. Take a look at this and Ray's tutorial.
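If you do stick with frame-by-frame animation, SpriteKit's texture animation action can also be told not to resize the sprite per frame, which avoids the warping as long as all frames share the same canvas size and registration point. A sketch (texture names are hypothetical):

```objc
// Frames exported on a common canvas so the monster stays registered.
NSArray<SKTexture *> *frames = @[ [SKTexture textureWithImageNamed:@"monster-1"],
                                  [SKTexture textureWithImageNamed:@"monster-2"],
                                  [SKTexture textureWithImageNamed:@"monster-3"] ];
SKSpriteNode *monster = [SKSpriteNode spriteNodeWithTexture:frames[0]];

// resize:NO keeps the sprite's size fixed across frames of differing
// texture dimensions; restore:YES returns to the first texture afterwards.
SKAction *swing = [SKAction animateWithTextures:frames
                                   timePerFrame:0.1
                                         resize:NO
                                        restore:YES];
[monster runAction:swing];
```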

all elements in the app are pixelated

The screenshot below was taken from the 3.5-inch simulator.
These are a bunch of UIButtons, and the border is created programmatically like:
btn.layer.cornerRadius = btn.frame.size.width / 2;
I don't know why, but now all fonts and UIButtons in the app are pixelated; mostly everything got pixelated.
I checked every setting in Xcode.
I tried cleaning the project, then the DerivedData folder.
I tried building the app on another machine.
I tried the app on a real device; same problem.
Nothing has worked yet.
An easy way to get pixellation on retina devices is rasterizing layers without setting the right rasterizationScale.
view.layer.shouldRasterize = YES;
view.layer.rasterizationScale = UIScreen.mainScreen.scale;
Without the second line, stuff will look fine for non-retina devices but it'll look awful on retina devices.
...hard to say if that's the problem you're seeing without more of the code, but it's a common enough bug that it merits posting.
Whether or not you should actually be rasterizing is a separate question...there are performance tradeoffs of which to be aware.
It could be that the resulting frame isn't aligned on an integer boundary, i.e. moving/changing the width is causing the frame to be something like (100.5, 50.0, 50.0, 50.0). When you draw on a half-pixel boundary, some of the drawing routines will blur things to make them appear in the correct place. I would print out the frame after the animation and check:
NSLog(@"%@", NSStringFromCGRect(yourButton.frame));
If you see any non-integer values, use one of the floor() functions to modify the resulting frame and snap it to a pixel boundary.
I had the same problem with my UILabel (when I changed its frame); applying floor() to the frame values fixed it.

Can I draw and animate vectors in iOS without resorting to bitmap images?

Can I do this?
My question arises from the need of a button that I'm animating when a user touches it.
This animation was made with a set of 30 PNG images (half a second of animation @ 60 FPS). This totals 60 images for regular and retina screens. It works quite well this way, but I'm not happy about it.
My goals are:
1 - Drastically reduce the size of my app (e.g. my background is a 400 KB PNG file, but with Quartz I can draw it with a dozen lines of code).
2 - Do it with a perfect, smooth animation, as light on the CPU/GPU as I can.
So, is there any way I can do this?
I have the images in pure vector form, and I can draw them with Quartz, but I can't animate them without redrawing everything for every frame. (Well, the animation is a "2-way street"; it's the coming back that would be problematic to redraw.)
Are there any APIs/Frameworks that would help me do this? How would I go about it?
Thank you!
Take a look at CAShapeLayer. Its path property is animatable. For an animation to look good, it's important that the from and to shapes have the same number of points. So depending on your shapes, this might or might not work.
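A minimal sketch of such a path morph. The two paths here are placeholders built the same way so their point counts match; in practice you would build your button's real shapes:

```objc
// Two paths constructed identically, so they have matching point counts.
UIBezierPath *fromPath = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0, 0, 50, 50)
                                                    cornerRadius:8];
UIBezierPath *toPath   = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0, 0, 100, 100)
                                                    cornerRadius:16];

CAShapeLayer *shape = [CAShapeLayer layer];
shape.path = fromPath.CGPath;
[self.view.layer addSublayer:shape];

// Animate the path, then set the model value so the final shape sticks.
CABasicAnimation *morph = [CABasicAnimation animationWithKeyPath:@"path"];
morph.fromValue = (__bridge id)fromPath.CGPath;
morph.toValue   = (__bridge id)toPath.CGPath;
morph.duration  = 0.5;
shape.path = toPath.CGPath;
[shape addAnimation:morph forKey:@"path"];
```

Since the shape is rendered by Core Animation each frame, there is no per-frame redraw on your side, which addresses both the size and the CPU goals.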

CCParticleSystem and iPhone "bug" or limitation?

I seemed to have run into a strange problem with the CCParticleSystem and the iPhone.
I have a laser that is being shot across the screen from left to right. I added a particle effect to give the laser more of a railgun look to it. I used the "emmas sharing" particle effect from Particle Designer.
Here is the code to send the laser across the screen:
-(void)fireLaserCannonAddon
{
    if( _ship == nil || _ship.dead ) return;

    CGSize winSize = [CCDirector sharedDirector].winSize;

    shipLaserCannon = [_laserCannonArray nextSprite];
    [shipLaserCannon stopAllActions];
    shipLaserCannon.position = ccpAdd(_ship.position, ccp(shipLaserCannon.contentSize.width / 2, -shipLaserCannon.contentSize.height));
    [shipLaserCannon revive];

    CCParticleSystemQuad *laserEffect = [_laserEffect nextParticleSystem];
    [laserEffect resetSystem];

    [shipLaserCannon runAction:[CCSequence actions:
        [CCMoveBy actionWithDuration:0.5 position:ccp(winSize.width, 0)],
        [CCCallFuncN actionWithTarget:self selector:@selector(invisNode:)],
        [CCCallFunc actionWithTarget:self selector:@selector(endLaserEffects)],
        nil]];
}
And the code to set the particle system effect to the laser's position:
-(void)updateLaserEffects:(ccTime)dt
{
    for( CCParticleSystemQuad *laserEffect in _laserEffect.array )
    {
        laserEffect.position = shipLaserCannon.position;
    }
}

-(void)endLaserEffects
{
    for( CCParticleSystemQuad *laserEffect in _laserEffect.array )
    {
        [laserEffect stopSystem];
    }
}
If you open up the "emmas sharing" effect in Particle Designer, the effect is the same as when you click and drag across the screen. This works perfectly on the iPad and iPad simulator; however, on my iPhone 3GS / iPhone simulator (SD and retina), the emitted particles seem to be "carried" along with the laser. It's not quite the same as setting the positionType to kCCPositionTypeGrouped (where the emitted particles stay in that circle shape), but rather a mix between kCCPositionTypeGrouped and kCCPositionTypeFree. The particles are emitting off the laser, but they are also dragged a bit behind it instead of staying where they were emitted, like on the Particle Designer simulator and the regular iPad. It looks as if the laser is creating its own layer with the particle effect on it, with that "layer" lagging behind it.
I thought that maybe the laser was moving too fast, but even when slowed down, it had the same effect.
This "bug" also creates another small problem. Since the effect is "carried" with the laser, when the laser moves off the screen and is removed, the remnants of the last emitted particles are visible at the bottom left of the screen. I'm sure this is because the emitted particles are still following the laser's position.x (which they shouldn't; only the emitter base is supposed to), and since the laser is gone, the position falls back to its default. However, I do not have this problem on the iPad / iPad simulator.
BTW, this isn't limited to the "emmas sharing" particle effect; it seems to do the same for all the other effects.
Has anyone else ever had similar issues with using CCParticleSystems on a moving object for the iPhone?
Any helpful input is greatly appreciated!
OK, so after some messing around, I found out what was causing all this.
I originally had the CCParticleSystem's scale set to 1.0 (original scale) for the iPad and 0.5 for the iPhone. I changed the iPhone scale to 1.0 and everything worked like it should.. just a lot bigger, but it worked. I really didn't want two different particle files for the same effect just because of screen size, so I figured I'd scale up to 2.0 for the iPad while leaving 1.0 on the iPhone. Lo and behold, now the iPad suffered the same weird-looking effect as the iPhone, but much more extreme.
Looks like I don't have much of a choice now but to keep two different files for the same effect. But I'm relieved I found out what was causing this, and I can save a few hairs from leaving prematurely.
I think scaling an effect is fine as long as it's not following an object dynamically, like in my case.
I don't know if this counts as a bug, since I'm sure it comes down to the math cocos2d uses, which scaling affects.
TL;DR: Scaling a particle effect up or down causes this weird behavior when the effect follows an object's position. Don't re-scale particle effects that follow an object's position dynamically. If the effect stays in one spot, it's fine.
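A sketch of loading a per-device particle file instead of scaling one system at runtime (the plist file names are hypothetical):

```objc
// Pick a particle file authored at the right size for the device, so
// the system's scale stays at 1.0 and particle positions track correctly.
NSString *plist = (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
    ? @"laserEffect-ipad.plist"
    : @"laserEffect-iphone.plist";
CCParticleSystemQuad *laserEffect =
    [CCParticleSystemQuad particleWithFile:plist];
laserEffect.positionType = kCCPositionTypeFree;
```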
