All elements in the app are pixelated - iOS

The screenshot below was taken from the 3.5-inch simulator.
These are a bunch of UIButtons, and their borders are created programmatically like:
btn.layer.cornerRadius = btn.frame.size.width / 2;
I don't know why, but now all fonts and UIButtons in the app are pixelated; almost everything is pixelated.
I checked every setting in Xcode.
I tried cleaning the project, then cleaning the DerivedData folder.
I tried building the app on another machine.
I tried the app on a real device. Same problem.
Nothing has worked so far.

An easy way to get pixelation on retina devices is rasterizing layers without setting the right rasterizationScale:
view.layer.shouldRasterize = YES;
view.layer.rasterizationScale = UIScreen.mainScreen.scale;
Without the second line, things will look fine on non-retina devices but awful on retina devices.
It's hard to say if that's the problem you're seeing without more of the code, but it's a common enough bug that it merits posting.
Whether you should actually be rasterizing is a separate question; there are performance tradeoffs to be aware of.
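For the rounded buttons in the question, a minimal sketch of the correct pairing, assuming the buttons are being rasterized somewhere (e.g. for shadow or compositing performance; btn follows the question's code):
// Rasterizing at the default scale of 1.0 renders at 1x and then upscales on retina.
btn.layer.cornerRadius = btn.frame.size.width / 2;
btn.layer.shouldRasterize = YES;
btn.layer.rasterizationScale = [UIScreen mainScreen].scale; // omit this line and you get exactly the pixelation shown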

It could be that the resulting frame isn't aligned on an even integer, i.e. moving or changing the width leaves the frame at something like (100.5, 50.0, 50.0, 50.0). When you draw on a half-pixel boundary, some of the drawing routines will blur things to try to make them appear in the correct place. I would print out the frame after the animation and check:
NSLog(@"%@", NSStringFromCGRect(yourButton.frame));
If you see any non-integer values, use one of the floor() functions to modify the resulting frame and snap it to a pixel boundary.
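A minimal sketch of the snapping, using standard Core Graphics and C math calls (yourButton stands in for the animated view):
CGRect frame = yourButton.frame;
frame.origin.x = floor(frame.origin.x);
frame.origin.y = floor(frame.origin.y);
yourButton.frame = frame;
// or let CGRectIntegral expand origin and size to whole-number values in one call:
yourButton.frame = CGRectIntegral(yourButton.frame);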
I had the same problem with my UILabel (when I changed its frame). Screenshots taken before and after applying floor() showed the blurriness disappear.

Related

Should we strictly use integers when laying out controls in iOS? Why?

I need to draw a line on the screen. The designer told me to set the height of the line to 2.5 pt. I'm wondering if it's acceptable to use a decimal value here.
I know it's better to use integers for the size or position of a UIView, but I can't tell why, and I haven't found any convincing documentation about it.
Could anyone explain it or point me to something?
The only problem I know of is that a UILabel with a decimal size will be blurry. Are there other problems, such as performance issues?
In your case it's just half a point, which in my opinion should be fine. I often find that in storyboards my frames get moved 0.5 point up or down when I have a lot of UI elements and the constraints just get confused.
Additionally, the easiest way for your frame to end up at "some 0.5 point" is a constraint that centres your view in its superview: the default x or y position can easily land on a half point. So it happens often and can be caused by Xcode itself, which is why I think your view will be just fine.
As stated previously in the comments, those points are CGFloats. We now have 1x, 2x, and 3x resolutions, so some calculation will always be needed for at least one class of device.
You may also look at this screen resolutions guide to see all the cases where upsampling and downsampling occur.
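If you do want to align to device pixels rather than points, a hedged sketch of the usual rounding (the variable names are illustrative):
CGFloat value = 2.5; // the designer's 2.5 pt line height
CGFloat scale = [UIScreen mainScreen].scale; // 1.0, 2.0, or 3.0
CGFloat snapped = round(value * scale) / scale;
// at 2x this is already pixel-aligned (5 pixels); at 3x it snaps 2.5 pt to ~2.67 pt (8 pixels)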

Why is object.y not positioning the image in Corona SDK?

displaycontent = display.newImageRect (rawdata[currentpath][3], screenW*1.1, ((screenW*1.1/1654)*rawdata[currentpath][6]))
displaycontent.anchorY = 0
displaycontent.y = screenH*0.78
My program loads an image from a database to be displayed on the phone's screen. Everything works correctly apart from positioning it with the y coordinate.
The only thing that changes its position is the anchor point: 0 puts the top of the image at the centre of the screen, and values from 0.1 to 1 all position it higher. Changing the y position via object.y has no effect regardless of what I set it to.
(The size settings probably look a bit odd in the first line, but that's because the images are different sizes and need to keep the correct proportions on different screen types.)
By the way, I am using a tab bar widget as the UI, in case that is relevant.
Any help would be appreciated.
Edit: I am aware that displaycontent is a bad name for a variable because of its similarity to things like display.contentCenterY; it will be renamed to prevent confusion when I look over the code in future.
I went through my code and tried disabling sections to find the culprit: a content mask was preventing me from setting the position of the loaded images within it.
I will look over my masking code and fix it (which should be straightforward now that I know where the problem started).
If anyone else has a similar problem (where an image or object won't position itself at the given coordinates), check your content mask, as that may be the issue!

C4: cropping an image, some weird things happening

I thought this would be rather straightforward, but it seems it's not.
Things I have noticed when trying to crop an image like this:
#import "C4Workspace.h"
#implementation C4WorkSpace{
C4Image *image;
C4Image *copiedImage;
}
-(void)setup {
image=[C4Image imageNamed:#"C4Sky.png"];
//image.width=200;
image.origin=CGPointMake(0, 20);
C4Log(#" image width %f", image.width);
//[self.canvas addImage:image];
copiedImage=[C4Image imageWithImage:image];
[copiedImage crop:CGRectMake(50, 0, 200, 200)];
copiedImage.origin=CGPointMake(0, 220);
[self.canvas addObjects:#[image, copiedImage]];
C4Log(#"copied image width %f", copiedImage.width);
}
#end
The origin of CGRectMake (the x and y coordinates) does not start from the upper-left corner but from the lower left, and the height then goes up instead of down.
The size of the cropped image is actually the same as the original image's. I suppose the image doesn't really get cropped, only masked?
Different scales: in the example above I'm not specifying any scale, yet the original and cropped images do NOT have the same scale. Why?
I'm wondering how this function can be useful at all. It would make more sense to go into the raw image data to crop part of an image, rather than having to guess which area has been cropped/masked so that I know where the image actually remains.
Or maybe I'm doing something wrong? (I couldn't find any example of cropping an image, so this is what I came up with.)
What you have found is a bug in the implementation of the crop: filter being run on your image.
1) The crop: method is actually implemented with Core Graphics, specifically by running a Core Image filter (CIFilter) on your original image. The placement of (0,0) in Core Graphics is the bottom-left corner of the image. This is why the origin is off.
2) Yes. I'm not sure whether this should be considered a bug or a feature; something for me to think about. It has to do with the way "filters" are designed.
3) Because of the bug in the way crop: is built, the filter doesn't account for the fact that the image scale should be 2.0, and re-renders at 1.0 (which it shouldn't do).
Finally, you've found a bug. I've filed it to be fixed here:
https://github.com/C4Framework/C4iOS/issues/110
The reason for much of the confusion, I believe, is that I built the filter methods for C4Image when I was originally working on a device/simulator that wasn't retina. I haven't had the opportunity to revisit how those are built, and there haven't been any questions about this issue before!
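Until the fix lands, a hedged workaround sketch: flip the crop rectangle into Core Graphics' bottom-left coordinate space before calling crop:. This assumes C4Image exposes a height property, as it does width in the code above:
CGRect cropRect = CGRectMake(50, 0, 200, 200); // the intended rect, top-left origin
CGRect flipped = CGRectMake(cropRect.origin.x,
                            copiedImage.height - cropRect.origin.y - cropRect.size.height,
                            cropRect.size.width,
                            cropRect.size.height);
[copiedImage crop:flipped];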

iOS: CAEmitterLayer and CAEmitterCell Flip Image on iOS 6

I was playing around with CAEmitterLayer and discovered something really weird.
I set up the CAEmitterLayer at the lower-left corner, positioned at 45 degrees (pointing towards the top-right corner), and tried to shoot some arrows toward the top-right corner.
Everything worked except the image that I set via the contents property of the cell.
Here is the original image on an iOS 7 device:
When run on iOS 6, it becomes like this:
Has anyone experienced this, and do you know why it happens? Keeping two sets of images and checking whether the device runs iOS 6 or iOS 7 is not a problem for me, but my curiosity urges me to find out why. Thanks in advance.
I am using Xcode 5.
This is normal behaviour for CAEmitterLayer. It uses a different coordinate system than the rest of iOS: as a technology derived from Mac OS, its origin (0,0) is at the bottom left, while in iOS the origin is at the top left. When the picture gets drawn, the image ends up flipped. CAEmitterLayer wasn't really designed for images like that; it's mostly made for particle systems that don't require a specific orientation.
The simplest solution is to flip the image yourself, so that when CAEmitterLayer flips it again it appears the way you want. This seems to have changed in iOS 7, so you would have to do a version check and apply the correct image.
You could also flip it in code if you wanted. Here is a short snippet that does it:
UIImage *flippedPicture = [UIImage imageWithCGImage:picture.CGImage scale:1.0 orientation:UIImageOrientationLeftMirrored];
Source: http://www.vigorouscoding.com/2013/02/particle-image-gets-mirrored-by-uikit-particle-system/
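Note that a CGImage carries no orientation metadata, so if you assign picture.CGImage directly to the emitter cell, the orientation-based flip may not take effect. Redrawing the image flipped is a more literal sketch (picture is the arrow image, and cell your CAEmitterCell; both are assumptions from the question):
UIGraphicsBeginImageContextWithOptions(picture.size, NO, picture.scale);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ctx, 0, picture.size.height); // flip the context vertically
CGContextScaleCTM(ctx, 1.0, -1.0);
[picture drawInRect:CGRectMake(0, 0, picture.size.width, picture.size.height)];
UIImage *flipped = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
cell.contents = (__bridge id)flipped.CGImage;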

CCParticleSystem and iPhone "bug" or limitation?

I seem to have run into a strange problem with CCParticleSystem on the iPhone.
I have a laser being shot across the screen from left to right. I added a particle effect to give the laser more of a railgun look, using the "emmas sharing" particle effect from Particle Designer.
Here is the code to send the laser across the screen:
-(void)fireLaserCannonAddon
{
    if (_ship == nil || _ship.dead) return;

    CGSize winSize = [CCDirector sharedDirector].winSize;

    shipLaserCannon = [_laserCannonArray nextSprite];
    [shipLaserCannon stopAllActions];
    shipLaserCannon.position = ccpAdd(_ship.position, ccp(shipLaserCannon.contentSize.width / 2, -shipLaserCannon.contentSize.height));
    [shipLaserCannon revive];

    CCParticleSystemQuad *laserEffect = [_laserEffect nextParticleSystem];
    [laserEffect resetSystem];

    [shipLaserCannon runAction:[CCSequence actions:
        [CCMoveBy actionWithDuration:0.5 position:ccp(winSize.width, 0)],
        [CCCallFuncN actionWithTarget:self selector:@selector(invisNode:)],
        [CCCallFunc actionWithTarget:self selector:@selector(endLaserEffects)],
        nil]];
}
And the code to set the particle system effect to the laser's position:
-(void)updateLaserEffects:(ccTime)dt
{
    for (CCParticleSystemQuad *laserEffect in _laserEffect.array) {
        laserEffect.position = shipLaserCannon.position;
    }
}

-(void)endLaserEffects
{
    for (CCParticleSystemQuad *laserEffect in _laserEffect.array) {
        [laserEffect stopSystem];
    }
}
If you open up the "emmas sharing" effect in Particle Designer, the effect is the same as when you click and drag across the screen. This works perfectly on the iPad and the iPad simulator. However, on my iPhone 3GS and the iPhone simulator (SD and retina), the emitted particles seem to be "carried" along with the laser. It's not identical to setting the positionType to kCCPositionTypeGrouped (where the emitted particles stay in that circle shape), but more of a mix between kCCPositionTypeGrouped and kCCPositionTypeFree: the particles emit off the laser but are also dragged a bit behind it, instead of staying where they were emitted like on the Particle Designer simulator and the iPad. It looks as if the laser is creating its own layer with the particle effect on it, and that "layer" lags behind it.
I thought the laser might be moving too fast, but even when slowed down it showed the same effect.
This "bug" also creates another small problem. Since the effect is "carried" with the laser, when the laser moves off screen and is removed, the remnants of the last emitted particles are visible at the bottom left of the screen. I'm sure this is because the emitted particles are still following the position.x of the laser (which they shouldn't be; only the base of the effect should), and since the laser is gone, it falls back to its default position. I do not have this problem on the iPad / iPad simulator.
By the way, this isn't limited to the "emmas sharing" particle effect; it seems to do the same with all the other effects.
Has anyone else ever had similar issues with using CCParticleSystems on a moving object on the iPhone?
Any helpful input is greatly appreciated!
OK, so after some messing around, I found out what was causing all this.
I originally had the CCParticleSystem scale set to 1.0 (original scale) for the iPad and 0.5 for the iPhone. I changed the scale for the iPhone to 1.0 and everything worked as it should, just a lot bigger. I really didn't want two different particle files for the same effect just because of screen size, so I figured I'd scale up to 2.0 for the iPad while leaving 1.0 on the iPhone. Lo and behold, the iPad now suffered the same weird-looking effect as the iPhone, but much more extreme.
Looks like I don't have much choice now but to keep two different files for the same effect, but I'm relieved I found the cause and can save a few hairs from leaving prematurely.
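A hedged sketch of that workaround: load a device-specific particle file instead of re-scaling one effect. The file names here are illustrative; particleWithFile: is the stock cocos2d loader:
NSString *file = (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
    ? @"laserEffect-hd.plist"   // authored at iPad size in Particle Designer
    : @"laserEffect.plist";     // authored at iPhone size
CCParticleSystemQuad *laserEffect = [CCParticleSystemQuad particleWithFile:file];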
I think scaling an effect is fine as long as it isn't following an object dynamically, as in my case.
I don't know whether this counts as a bug, since I'm sure it comes down to the math cocos2d uses, which scaling throws off.
TL;DR: Scaling a particle effect up or down causes this weird behaviour when the effect follows an object's position. Don't re-scale particle effects that dynamically follow an object's position; if the effect stays in one spot, it's fine.
