Moving a Rotated UIButton - ios

I have a problem. I'm working on making a game. As part of my game I need images to be rotated and then moved in the direction of the rotated angle inside a game loop (using an NSTimer). In essence I'm trying to create the effect of launching a projectile. The code works fine when moving along the axes at 0, 90, 180, 270, and 360 degrees, but at any other angle the image starts to glitch out. The object on the screen maintains its correct bounds and contents, but the actual displayed image disappears. Does anybody know what the problem is or some way I could work around it? If needed, I can make and post a video of my problem so you can see what I'm talking about.
Here is a sample of the code I'm using. The "background" variable is just a UIImageView:
angle = 60;
background.transform = CGAffineTransformRotate(background.transform, angle*M_PI/180); // converts degrees to radians and rotates the image
background.frame = CGRectMake(background.frame.origin.x + cos(angle*M_PI/180)*32; background.frame.origin.y - sin(angle*M_PI/180)*32, background.frame.size.width, background.frame.size.height); // moves the image in the direction of the angle

For starters, there is a semicolon after the x origin in your CGRect instead of a comma. Was that just a typo?
The UIView documentation for frame states:
Warning: If the transform property is not the identity transform, the value of this property is undefined and therefore should be ignored.
Changes to this property can be animated. However, if the transform property contains a non-identity transform, the value of the frame property is undefined and should not be modified. In that case, you can reposition the view using the center property and adjust the size using the bounds property instead.
So there you have it: you should not try to change the frame while a custom transform is set. Since you only want to adjust the position of the view anyway, modify your code to adjust the center instead of the frame origin.
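For the projectile movement in the question, a minimal sketch of that center-based approach (reusing background, angle, and the 32-point step from the question):
// Move the view along its rotation angle by adjusting center, which stays
// well-defined even when a non-identity transform is set.
CGPoint center = background.center;
center.x += cos(angle * M_PI / 180.0) * 32.0;
center.y -= sin(angle * M_PI / 180.0) * 32.0;
background.center = center;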
To change the size, you can use the bounds:
CGRect bounds = myView.bounds;
bounds.size.width = whatever;
bounds.size.height = whatever;
myView.bounds = bounds;

Related

Resizable UIView

I am building a custom UIView that you can rotate and resize. I can resize the UIView by dragging the corners of the UIView. I calculate how much I have dragged then change the frame of the UIView accordingly.
However, I am running into problems once I added a rotation gesture recognizer to the view. If I rotate or apply a transform to the view, I no longer know how to calculate drag distance and change the frame of the view. How could I calculate the width and height change between my new view and the original view when things are put at an added angle or if they have some other transform, like a translation transform?
I thought of possibilities to set the view's transform back to .identity, change the size of the view, then re-apply its transform, but I'm not sure how to actually go about implementing this.
After applying a transform you cannot use the frame.
You have two options:
1) Calculate everything using the center (and bounds) of your view.
2) As you suggested, reset to the identity transform, change the frame, then re-apply the transform.
For option 2 I have added an example that might be helpful to you; a rough sketch of option 1 follows it.
let transform = imageView.transform
imageView.transform = CGAffineTransform.identity // Remove the transform; the frame is now reliable
var rect: CGRect = imageView.frame
// Change rect here
imageView.frame = rect // Assign it
imageView.transform = transform // Re-apply the transform
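For option 1, a rough sketch (the view and dragDelta names are placeholders for your rotated view and your pan gesture's translation in the superview; they are not from the question): convert the drag vector into the view's own rotated coordinate space, then resize through bounds so the transform and center are left alone.
// Map the drag vector from the superview's axes into the view's rotated axes
// (assumes the transform has no translation component).
CGPoint deltaInView = CGPointApplyAffineTransform(dragDelta,
                                                  CGAffineTransformInvert(view.transform));
// Grow the view through bounds; the frame is never touched.
CGRect bounds = view.bounds;
bounds.size.width += deltaInView.x;
bounds.size.height += deltaInView.y;
view.bounds = bounds;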

images on iOS becoming narrow and squeezed when I apply rotation and transformation

I am using the following code to rotate and transform an image view:
myImageView.transform = CGAffineTransformMakeRotation(45); // rotation
CGRect frame = myImageView.frame;
frame.origin.x = x_position;
frame.origin.y = y_position;
myImageView.frame = frame; // transformation
tl;dr: The frame is bogus when you have a non-identity transform. Change the center instead.
From the documentation:
Warning: If the transform property is not the identity transform, the value of this property is undefined and therefore should be ignored.
When you set the transform on the first line and then read the frame afterwards, what you actually get back is undefined.
// Setting a non-identity transform (1)
myImageView.transform = CGAffineTransformMakeRotation(45);
CGRect frame = myImageView.frame; // At this point the frame is undefined (2)
// You are modifying something which is undefined from now on ...
Also, not only should you not read the frame because it is undefined, you should also not set it.
if the transform property contains a non-identity transform, the value of the frame property is undefined and should not be modified.
The solution to that problem comes in the next sentence of the documentation
In that case, you can reposition the view using the center property and adjust the size using the bounds property instead.
Since you are only changing the position, I would suggest that you don't touch the frame and instead read the center of the image view and set it to its new value. The center is not affected by the transform in the same way as the frame.
CGPoint center = myImageView.center;
center.x = x_center_position; // Note that the `center` is not the same as the frame origin.
center.y = y_center_position;
myImageView.center = center;
Try removing the autoresizing mask of the image view:
imageView.autoresizingMask = UIViewAutoresizingNone;
This may solve your problem.

How to Rotate CAShapeLayer containing UIBezierPath? [duplicate]

Possible Duplicate:
Rotate CGPath without changing its position
I searched and tested a variety of code for a couple of hours and I can't get this to work.
I am adding an arbitrary UIBezierPath at a random location to a CAShapeLayer which gets added to a view. I need to rotate the path so that I can handle device rotations. I can rotate the layer instead of the path. I just need the result to be rotated.
I already have methods to handle transforming the bezier path by scaling and translation. It works great, but now I need to simply rotate 90 degrees left or right.
Any recommendations on how to do this?
Basic code:
UIBezierPath *path = <create arbitrary path>
CAShapeLayer *layer = [CAShapeLayer layer];
[self addPathToLayer:layer
fromPath:path];
// I could get the center of the box but where is the box center for the view it is in?
// CGRect box = CGPathGetPathBoundingBox(path.CGPath);
// layer.anchorPoint = ? How to find the center of the box for the anchor point?
// Rotating here appears to rotate around 0,0 of the view
layer.transform = CATransform3DMakeRotation(DegreesToRadians(-90), 0.0, 0.0, 1.0);
I see the following post:
BezierPath Rotation in a UIView
I suppose I could rotate as-is and then translate the path back into place. I just need to figure out what the translation values would be.
I should also state that what I am seeing after I try to rotate is that the image moves off-screen somewhere. I tried rotating 25 degrees to see movement, and it pivots around the view's origin of 0,0, so that if I rotate 90 degrees the image is off-screen. I am running these tests WITHOUT rotating the device - just to see how rotation works.
UPDATE #1 - 12/4/2012: For some bizarre reason if I set the position to a value I found empirically it moves the rotated bezier path into the correct position after rotation:
layer.position = CGPointMake(280, 60);
These values are a guess from starting/stopping the app and making adjustments. I have no idea why I need to adjust the position on rotation. The anchor point should be in the center of the layer. However, I did find that both the frame and position of the CAShapeLayer are ZERO even though the path is set and sits in the correct position within the view. The 280, 60 position shifts the path into what would be the center of the path bounding box when a rotation of +90 is made. If I change the rotation value I need to adjust the position. I should not have to make this manual adjustment.
I think a last resort is to somehow convert the bezier path to an image and then add it. I found that if I set the layer content to an image, then rotate, it rotates about its center point with no positional adjustment needed. Not so with setting the path.
UPDATE #2 12/4/2012 - I tried setting the frame, and with some fiddling I got it to center as follows:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CGRect rect = CGRectMake(0, 0, box.origin.x + (3.5 * box.size.width), box.origin.y + (3.5 * box.size.height));
layer.frame = rect;
layer.transform = CATransform3DMakeRotation(DegreesToRadians(90), 0.0, 0.0, 1.0);
Why multiply by 3.5? I have no clue. I found that adding the box origin with about 3.5 times the size of the box shifts the rotated CAShapeLayer path to about where it should be.
There must be a better way to do this. This is a better solution than my previous post since the frame size does not depend on the rotation angle. I just don't know why the frame needs to be set to the value I am setting it to. I THOUGHT it should be
CGRectMake(0, 0, box.origin.x + (box.size.width / 2), box.origin.y + (box.size.height / 2));
However, it shifts the image to the left too much.
Another clue I found is that if I set the layer's frame to [self view].frame (the frame of the entire parent view, which is the screen of the iPhone), then rotate, the rotation point is the center of the screen, and the path/image orbits around this center point. This is why I tried shifting the frame to what the center of the path should be, so that it orbits around the box center.
UPDATE #3 12/4/2012 - I tried to render the layer as an image. However, it appears that just setting the path of a layer does not make it an "image" in the layer, since the resulting image is empty:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
layer.frame = box;
UIImage *image = [ImageHelper imageFromLayer:layer]; // ImageHelper library I created
CAShapeLayer *newLayer = [CAShapeLayer layer];
newLayer.frame = CGRectMake(box.origin.x, box.origin.y, image.size.width, image.size.height);
newLayer.contents = (id) image.CGImage;
It appears that rotating the layer with its path set is no different than simply rotating the bezier path itself. I will go back to rotating the bezier path and see if I can fiddle with the position elements or something. There's got to be a solution to this.
Goal: Rotate a UIBezierPath around its center point within the view it was originally created in.
UPDATE #4 12/4/2012 - I ran a series of tests measuring the values needed for translation in order to place a UIBezierPath in its previous center location.
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(-15));
[path applyTransform:rotate];
// CGAffineTransform translate = CGAffineTransformMakeTranslation(-110, 70); // -45
CGAffineTransform translate = CGAffineTransformMakeTranslation(-52, -58); // -15
[path applyTransform:translate];
However, the ratios of the x/y translations do not correspond, so I cannot extrapolate the required translation from the angle. It appears that CGAffineTransformMakeRotation uses some arbitrary anchor point for the rotation, which at the moment appears to be maybe (viewWidth / 2, 0). I am making this much harder than it needs to be. There's something I am missing to make a simple rotation so that the center point is maintained. I just need to "spin" the path 90 degrees left or right.
UPDATE #5 12/4/2012 - After running additional tests it appears that the anchor point for rotating a UIBezierPath is the origin from which all of the points were drawn. In this case the origin is 0,0 and all of the points are relative to that point. Therefore, if a rotation is applied, the rotation occurs around that origin, which is why the path shifts up-right on -90 and up-left on 90. I need to somehow set the anchor point for the rotation to the center so it "spins" around the center, rather than around the original origin point. 12 hours spent on this one issue.
After some detailed analysis and graphing the bounding box on paper, I found that my assertion about the 0,0 origin is correct.
A solution to this problem is to translate the path (the underlying matrix) so that the center of its bounding box sits at the origin, rotate, then translate the path back to its original location.
Here's how to rotate a UIBezierPath 90 degrees:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
// Move the center of the bounding box to the origin
CGAffineTransform translate = CGAffineTransformMakeTranslation(-1 * (box.origin.x + (box.size.width / 2)), -1 * (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
// Rotate around the origin
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(90));
[path applyTransform:rotate];
// Move the path back to its original location
translate = CGAffineTransformMakeTranslation((box.origin.x + (box.size.width / 2)), (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
Plug in -90 degrees to rotate in the other direction.
This formula can be used when rotating the device from portrait to landscape and vice/versa.
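Wrapped up as a small helper, a sketch of the same translate-rotate-translate idea (the function name is mine, not framework API):
static UIBezierPath *RotatePathAroundCenter(UIBezierPath *path, CGFloat degrees) {
    CGRect box = CGPathGetPathBoundingBox(path.CGPath);
    CGPoint center = CGPointMake(CGRectGetMidX(box), CGRectGetMidY(box));
    // Move the bounding-box center to the origin, rotate, then move back.
    [path applyTransform:CGAffineTransformMakeTranslation(-center.x, -center.y)];
    [path applyTransform:CGAffineTransformMakeRotation(degrees * M_PI / 180.0)];
    [path applyTransform:CGAffineTransformMakeTranslation(center.x, center.y)];
    return path;
}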
I still don't think this is the ideal solution but the result is what I need for now.
If anyone has a better solution for this please post.
UPDATE 12/7/2012 - I found what I think is the best solution, and it is very simple, as I thought it would be. Rather than using the rotate, translate, and scale methods on the bezier path, I instead extract the array of points as CGPoint objects and scale/translate them as needed based on the view size and the orientation. I then create a new bezier path and set the layer to this path.
The result is perfect scaling, translation, rotation.
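A minimal sketch of that point-extraction idea (the helper names and the plain scale transform are illustrative, not from the original post): CGPathApply visits every element of the path, so each point can be transformed and appended to a new UIBezierPath.
typedef struct {
    __unsafe_unretained UIBezierPath *newPath; // path being rebuilt
    CGAffineTransform transform;               // applied to every point
} PathRebuildInfo;

static void RebuildElement(void *info, const CGPathElement *element) {
    PathRebuildInfo *ctx = (PathRebuildInfo *)info;
    UIBezierPath *p = ctx->newPath;
    CGAffineTransform t = ctx->transform;
    switch (element->type) {
        case kCGPathElementMoveToPoint:
            [p moveToPoint:CGPointApplyAffineTransform(element->points[0], t)];
            break;
        case kCGPathElementAddLineToPoint:
            [p addLineToPoint:CGPointApplyAffineTransform(element->points[0], t)];
            break;
        case kCGPathElementAddQuadCurveToPoint: // points[0] = control, points[1] = destination
            [p addQuadCurveToPoint:CGPointApplyAffineTransform(element->points[1], t)
                      controlPoint:CGPointApplyAffineTransform(element->points[0], t)];
            break;
        case kCGPathElementAddCurveToPoint: // points[0..1] = controls, points[2] = destination
            [p addCurveToPoint:CGPointApplyAffineTransform(element->points[2], t)
                 controlPoint1:CGPointApplyAffineTransform(element->points[0], t)
                 controlPoint2:CGPointApplyAffineTransform(element->points[1], t)];
            break;
        case kCGPathElementCloseSubpath:
            [p closePath];
            break;
    }
}

static UIBezierPath *TransformedCopyOfPath(UIBezierPath *path, CGAffineTransform transform) {
    UIBezierPath *rebuilt = [UIBezierPath bezierPath];
    PathRebuildInfo ctx = { rebuilt, transform };
    CGPathApply(path.CGPath, &ctx, RebuildElement);
    return rebuilt;
}
The shape layer then just gets the rebuilt path, for example layer.path = TransformedCopyOfPath(path, CGAffineTransformMakeScale(sx, sy)).CGPath, where sx/sy come from the old and new view sizes.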

Detecting rotated width/height of UIView after orientation change

I would like to be able to inspect a UIView instance that may or may not have been rotated from its original orientation (by the user rotating the device for instance) and determine what the "true" width and height are. "true" here meaning, if a view on a portrait-oriented iPad was 768x1024 before rotation, after being turned sideways I would calculate that the new width was 1024 and the new height was 768.
It appears that if I apply the view's transform to its frame property like this:
CGRect rotated = CGRectApplyAffineTransform([myview frame], [myview transform]);
I get the desired result. Apple's documentation however states that UIView::frame is undefined if the transform for the view is not the identity transform, so maybe it's not a good idea to rely on this calculation?
view.bounds will return the rect you want.
view.frame will return a rect with the transform applied to the bounds, along with the position.
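If you do want the rotated bounding size without reading the undefined frame, a minimal sketch (reusing myview from the question) applies the transform to the bounds instead:
CGRect rotatedBounds = CGRectApplyAffineTransform([myview bounds], [myview transform]);
CGFloat rotatedWidth = rotatedBounds.size.width;   // 1024 in the portrait-iPad example after a 90-degree rotation
CGFloat rotatedHeight = rotatedBounds.size.height; // 768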
Well due to lack of input, I'm going to go with my solution of using:
CGRect rotated = CGRectApplyAffineTransform([myview frame], [myview transform]);
to get the properly oriented bounding rect. If somebody has another solution or can confirm this is safe, I will award the answer to them instead.

UIImageView gets blurry after layer transformation

I have a UIImageView add to a UIView as a subview. When I apply a transformation on the UIView's layer the UIImageView gets blurry. Why is that? How can this problem be resolved?
view.layer.position = newPosition;
I apply only this transformation.
Edit:
I've tested it and if I apply other transformations like these:
view.layer.transform = newTransform;
view.layer.zPosition = newZPosition;
then the blurriness doesn't appear; it only appears if I change the layer position.
Use
blurryDialog.frame = CGRectIntegral(blurryDialog.frame);
to set the frame coordinates to integer values.
I found the answer. When you set the position of a layer or a view, its origin can end up with fractional values, and this is what causes the blurriness.
So the solution is to also set the frame's origin after you set the position. You just have to round the frame's origin to integer values. The change won't be visible in the UI, but the image will no longer be blurry.
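Putting the two together, a minimal sketch using the names from the question (view and newPosition):
view.layer.position = newPosition;
// Snap the resulting frame to whole-point values so the image is not drawn
// on a fractional pixel boundary (safe here because no transform is set).
view.frame = CGRectIntegral(view.frame);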
