How to Rotate CAShapeLayer containing UIBezierPath? [duplicate] - ios

This question already has answers here:
Possible Duplicate: Rotate CGPath without changing its position
Closed 10 years ago.
I searched and tested a variety of code for a couple of hours and I can't get this to work.
I am adding an arbitrary UIBezierPath at a random location to a CAShapeLayer which gets added to a view. I need to rotate the path so that I can handle device rotations. I can rotate the layer instead of the path. I just need the result to be rotated.
I already have methods to handle transforming the bezier path by scaling and translation. It works great, but now I need to simply rotate 90 degrees left or right.
Any recommendations on how to do this?
Basic code:
UIBezierPath *path = <create arbitrary path>
CAShapeLayer *layer = [CAShapeLayer layer];
[self addPathToLayer:layer fromPath:path];
// I could get the center of the box but where is the box center for the view it is in?
// CGRect box = CGPathGetPathBoundingBox(path.CGPath);
// layer.anchorPoint = ? How to find the center of the box for the anchor point?
// Rotating here appears to rotate around 0,0 of the view
layer.transform = CATransform3DMakeRotation(DegreesToRadians(-90), 0.0, 0.0, 1.0);
I see the following post:
BezierPath Rotation in a UIView
I suppose I could rotate as-is and then translate the path back into place. I just need to figure out what the translation values would be.
I should also state that what I am seeing after I try to rotate is that the image moves off-screen somewhere. I tried rotating 25 degrees to see the movement, and it pivots around the view's origin of 0,0, so that if I rotate 90 degrees the image ends up off-screen. I am running these tests WITHOUT rotating the device - just to see how rotation works.
UPDATE #1 - 12/4/2012: For some bizarre reason, if I set the position to a value I found empirically, it moves the rotated bezier path into the correct position after rotation:
layer.position = CGPointMake(280, 60);
These values are a guess from starting/stopping the app and making adjustments. I have no idea why I need to adjust the position on rotation. The anchor point should be in the center of the layer. However, I did find that both the frame and position of a CAShapeLayer are all ZERO even though the path is set, and even though the path is in the correct position within the view. The 280, 60 position shifts the path into what would be the center of the path bounding box when a rotation of +90 is made. If I change the rotation value I need to adjust the position. I should not have to make this adjustment manually.
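For reference, the usual way to make a shape layer spin about the visual center of its path (a sketch of that standard approach on my part, not something from the original post) is to give the layer a frame equal to the path's bounding box and express the path in the layer's own coordinate space, so the default anchorPoint of (0.5, 0.5) lands in the middle of the drawn shape:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.frame = box; // place the layer where the path is
UIBezierPath *localPath = [path copy]; // shift a copy into the layer's own coordinate space
[localPath applyTransform:CGAffineTransformMakeTranslation(-box.origin.x, -box.origin.y)];
shapeLayer.path = localPath.CGPath;
[self.view.layer addSublayer:shapeLayer];
// DegreesToRadians is the same macro used in the snippets above
shapeLayer.transform = CATransform3DMakeRotation(DegreesToRadians(90), 0.0, 0.0, 1.0); // rotates in place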
I think a last resort is to somehow convert the bezier path to an image and then add it. I found that if I set the layer content to an image, then rotate, it rotates about its center point with no positional adjustment needed. Not so with setting the path.
UPDATE #2 12/4/2012 - I tried setting the frame, and with some fiddling I got it to center as follows:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CGRect rect = CGRectMake(0, 0, box.origin.x + (3.5 * box.size.width), box.origin.y + (3.5 * box.size.height));
layer.frame = rect;
layer.transform = CATransform3DMakeRotation(DegreesToRadians(90), 0.0, 0.0, 1.0);
Why multiply by 3.5? I have no clue. I found that adding the box origin with about 3.5 times the size of the box shifts the rotated CAShapeLayer path to about where it should be.
There must be a better way to do this. This is a better solution than my previous post since the frame size does not depend on the rotation angle. I just don't know why the frame needs to be set to the value I am setting it to. I THOUGHT it should be
CGRectMake(0, 0, box.origin.x + (box.size.width / 2), box.origin.y + (box.size.height / 2));
However, it shifts the image to the left too much.
Another clue I found is that if I set the layer's frame to [self view].frame (the frame of the entire parent view, which is the screen of the iPhone), then rotate, the rotation point is the center of the screen, and the path/image orbits around this center point. This is why I tried shifting the frame to what the center of the path should be, so that it orbits around the box center.
UPDATE #3 12/4/2012 - I tried to render the layer as an image. However, it appears that just setting the path of a layer does not make it an "image" in the layer, since the rendered result is empty:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
layer.frame = box;
UIImage *image = [ImageHelper imageFromLayer:layer]; // ImageHelper library I created
CAShapeLayer *newLayer = [CAShapeLayer layer];
newLayer.frame = CGRectMake(box.origin.x, box.origin.y, image.size.width, image.size.height);
newLayer.contents = (id) image.CGImage;
It appears that rotating the layer with its path set is no different than simply rotating the bezier path itself. I will go back to rotating the bezier path and see if I can fiddle with the position elements or something. There's got to be a solution to this.
Goal: Rotate a UIBezierPath around its center point within the view it was originally created in.
UPDATE #4 12/4/2012 - I ran a series of tests measuring the values needed for translation in order to place a UIBezierPath in its previous center location.
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(-15));
[path applyTransform:rotate];
// CGAffineTransform translate = CGAffineTransformMakeTranslation(-110, 70); // -45
CGAffineTransform translate = CGAffineTransformMakeTranslation(-52, -58); // -15
[path applyTransform:translate];
However, the ratios of x/y translations do not correspond, so I cannot extrapolate what translation is required based on the angle. It appears that 'CGAffineTransformMakeRotation' uses some arbitrary anchor point to make the rotation, which at the moment appears to be maybe (viewWidth / 2, 0). I am making this much harder than it needs to be. There's something I am missing to make a simple rotation so that the center point is maintained. I just need to "spin" the path 90 degrees left or right.
UPDATE #5 12/4/2012 - After running additional tests it appears that the anchor point for rotating a UIBezierPath is the origin from which all of the points were drawn. In this case the origin is 0,0 and all of the points are relative to that point. Therefore, if a rotation is applied, the rotation occurs around the origin, which is why the path shifts up-right on -90 and up-left on 90. I need to somehow set the anchor point for the rotation to the center so it "spins" around the center, rather than around the original origin point. 12 hours spent on this one issue.

After some detailed analysis and graphing the bounding box on paper, I found that my assertion about the 0,0 origin is correct.
A solution to this problem is to translate the path (the underlying matrix) so that the center of its bounding box sits at the origin, rotate, then translate the path back to its original location.
Here's how to rotate a UIBezierPath 90 degrees:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CGAffineTransform translate = CGAffineTransformMakeTranslation(-1 * (box.origin.x + (box.size.width / 2)), -1 * (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(90));
[path applyTransform:rotate];
translate = CGAffineTransformMakeTranslation((box.origin.x + (box.size.width / 2)), (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];
Plug in -90 degrees to rotate in the other direction.
This formula can be used when rotating the device from portrait to landscape and vice versa.
I still don't think this is the ideal solution but the result is what I need for now.
If anyone has a better solution for this please post.
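Wrapped up as a reusable helper (my restatement of the same three steps, with DegreesToRadians assumed to be the macro used above), it might look like this:
- (void)rotatePath:(UIBezierPath *)path aroundCenterByDegrees:(CGFloat)degrees {
    CGRect box = CGPathGetPathBoundingBox(path.CGPath);
    CGFloat centerX = CGRectGetMidX(box);
    CGFloat centerY = CGRectGetMidY(box);
    // Move the bounding-box center to the origin, spin, then move it back.
    [path applyTransform:CGAffineTransformMakeTranslation(-centerX, -centerY)];
    [path applyTransform:CGAffineTransformMakeRotation(DegreesToRadians(degrees))];
    [path applyTransform:CGAffineTransformMakeTranslation(centerX, centerY)];
}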
UPDATE 12/7/2012 - I found what I think is the best solution, and it is as simple as I thought it would be. Rather than using rotate, translate, and scale methods on the bezier path, I instead extract the array of points as CGPoint objects and scale/translate them as needed based on the view size as well as the orientation. I then create a new bezier path and set the layer to this path.
The result is perfect scaling, translation, rotation.
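The post doesn't include that final code, but a sketch of one way the "walk the points and rebuild" idea could be implemented (all names here are mine) is to visit each path element with CGPathApply and run every point, including curve control points, through a caller-supplied block:
typedef CGPoint (^PointMapper)(CGPoint point);

static void CopyElement(void *info, const CGPathElement *element) {
    void (^handler)(const CGPathElement *) = (__bridge void (^)(const CGPathElement *))info;
    handler(element);
}

static UIBezierPath *RebuiltPath(UIBezierPath *source, PointMapper map) {
    UIBezierPath *result = [UIBezierPath bezierPath];
    void (^handler)(const CGPathElement *) = ^(const CGPathElement *e) {
        switch (e->type) {
            case kCGPathElementMoveToPoint:
                [result moveToPoint:map(e->points[0])];
                break;
            case kCGPathElementAddLineToPoint:
                [result addLineToPoint:map(e->points[0])];
                break;
            case kCGPathElementAddQuadCurveToPoint:
                [result addQuadCurveToPoint:map(e->points[1]) controlPoint:map(e->points[0])];
                break;
            case kCGPathElementAddCurveToPoint:
                [result addCurveToPoint:map(e->points[2]) controlPoint1:map(e->points[0]) controlPoint2:map(e->points[1])];
                break;
            case kCGPathElementCloseSubpath:
                [result closePath];
                break;
        }
    };
    CGPathApply(source.CGPath, (__bridge void *)handler, CopyElement);
    return result;
}
The caller would pass a block that applies whatever scale and translation the new view size and orientation require, then assign the returned path to the shape layer.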

Related

iOS - Draw image with CGContext and transform

I am trying to draw an image on top of another image. I have the image's size, transform, and origin. My code below draws it at the correct size and rotation angle, but not at the correct point.
Code:
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, [[UIScreen mainScreen] scale]);
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect baseRect = CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height);
[backgroundImage drawInRect:baseRect];
CGRect newRect = CGRectMake(x, y, width, height);
CGContextTranslateCTM(context, x, y);
CGContextConcatCTM(context, watermarkImageView.transform);
CGContextTranslateCTM(context, -x, -y);
[watermarkImageView.image drawInRect:newRect];
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return result;
The watermark image should be placed like this:
But currently it looks like this:
What did I miss?
Thanks in advance
EDIT
The x,y is the edge of the bounding box
Your code doesn't show what watermarkImageView.transform is, and that is important, because when you concat transformations, the effects of previous transformations also affect all the following transformations.
E.g. a translation that moves the object 10 pixels along the x-axis will move the object 10 pixels to the right. However, if you first have a rotation that rotates the object by 45 degrees and then add a translation that moves 10 pixels along the x-axis, the object will not move 10 pixels to the right, it will move 10 pixels along a line that is 45 degrees rotated, which means it will move about 7 pixels up and 7 pixels to the right. That's because a rotation does not really rotate the object itself, it actually rotates the whole coordinate system which causes the object to be drawn rotated.
See this image:
Initially the translation coordinate system (red lines) matches the "real coordinate" system. But after the rotation by 45 degrees, the translation coordinate system has been rotated and now translating across the red lines moves the object diagonally.
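A quick numeric check of that 45-degree example (illustrative code on my part, not from the original answer):
// Rotate 45 degrees, then translate 10 points along x; the translation is
// carried out along the rotated x axis, so the origin lands at roughly (7.07, 7.07).
CGAffineTransform t = CGAffineTransformMakeRotation(45.0 * M_PI / 180.0);
t = CGAffineTransformTranslate(t, 10.0, 0.0);
CGPoint moved = CGPointApplyAffineTransform(CGPointZero, t); // ≈ (7.07, 7.07)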
Think about a sheet of paper and a stamp. The stamp always has the same position and the same orientation, you cannot move or rotate the stamp. But you can move and rotate the sheet of paper below the stamp! And that's what your transformations do. They transform the sheet before the stamp is pressed upon it.
For most people it is very hard to imagine the effects of transforming the whole space; it's much easier to think about transforming the object. The trick is: you must read your transformations in the opposite order than you wrote them. I guess what you want to do is actually:
CGContextTranslateCTM(context, x, y);
CGContextConcatCTM(context, watermarkImageView.transform);
CGContextTranslateCTM(context,
    -watermarkImageView.bounds.size.width * 0.5,
    -watermarkImageView.bounds.size.height * 0.5
);
Now read them in the opposite order (from bottom to top). First you center the watermark around (0,0) by moving it up half the height and left half the width. Now the center of your watermark is exactly at (0,0). Then you rotate it as desired. Finally you move it to the desired position. Of course you wrote all transformations the other way round but that's only because you are transforming the coordinate space, not the object.
Centering your watermark prior to rotation is important because rotation always rotates around (0,0) coordinates. If you'd just rotate, the rotation looks like this:
That's not what you want as it will not just rotate the object but also changes its position. If you center the image around (0,0) first, the rotation looks like this instead:
The answer to my question was that I had to translate the context to the centre of where I want to draw.
CGContextTranslateCTM(context, imageView.center.x, imageView.center.y);
Then rotate context.
CGFloat angle = [(NSNumber *)[imageView valueForKeyPath:@"layer.transform.rotation.z"] floatValue];
CGContextRotateCTM(context, angle);
Then draw
[imageView.image drawInRect:CGRectMake(-width * 0.5f, -height * 0.5f, width, height)];
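Putting the accepted fix together with the question's setup, a complete compositing pass might look like the sketch below. This is my own assembly, and it assumes the watermark view's coordinates map directly onto the background image's coordinates:
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, [[UIScreen mainScreen] scale]);
CGContextRef context = UIGraphicsGetCurrentContext();
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
CGFloat width = watermarkImageView.bounds.size.width;
CGFloat height = watermarkImageView.bounds.size.height;
CGFloat angle = [(NSNumber *)[watermarkImageView valueForKeyPath:@"layer.transform.rotation.z"] floatValue];
// Translate to where the watermark's center should be, rotate, then draw the
// image centered on the origin of the rotated coordinate system.
CGContextTranslateCTM(context, watermarkImageView.center.x, watermarkImageView.center.y);
CGContextRotateCTM(context, angle);
[watermarkImageView.image drawInRect:CGRectMake(-width * 0.5f, -height * 0.5f, width, height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();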

Rotating a view using another view's center as the anchor point

I have two image views. The first is the blueish arrow, and the second is the white circle, with a black dot drawn to represent the center of the circle.
I'm trying to rotate the arrow so its anchor point is the black dot in the picture, like this
Right now I'm setting the anchor point of the arrow's layer to a point calculated like this
CGFloat y = _userImageViewContainer.center.y - CGRectGetMinY(_directionArrowView.frame);
CGFloat x = _userImageViewContainer.center.x - CGRectGetMinX(_directionArrowView.frame);
CGFloat yOff = y / CGRectGetHeight(_directionArrowView.frame);
CGFloat xOff = x / CGRectGetWidth(_directionArrowView.frame);
_directionArrowView.center = _userImageViewContainer.center;
CGPoint anchor = CGPointMake(xOff, yOff);
NSLog(@"anchor: %@", NSStringFromCGPoint(anchor));
_directionArrowView.layer.anchorPoint = anchor;
Since the anchor point is set as a percentage of the view, i.e. the coords for the center are (.5, .5), I'm calculating the percentage of the height in the arrow's frame where the black dot falls. But my math, even after working it out by hand, keeps resulting in .5, which isn't right because the dot is farther than halfway down when the arrow is in its original position (vertical, with the point up).
I'm rotating based on the user's compass heading
CLHeading *heading = [notif object];
// update direction of arrow
CGFloat degrees = [self p_calculateAngleBetween:[PULAccount currentUser].location.coordinate
and:_user.location.coordinate];
_directionArrowView.transform = CGAffineTransformMakeRotation((degrees - heading.trueHeading) * M_PI / 180);
The rotation is correct, it's just the anchor point that's not working right. Any ideas of how to accomplish this?
I've always found the anchor point stuff flaky, especially with rotation. I'd try something like this.
CGPoint convertedCenter = [_directionArrowView convertPoint:_userImageViewContainer.center fromView:_userImageViewContainer ];
CGSize offset = CGSizeMake(_directionArrowView.center.x - convertedCenter.x, _directionArrowView.center.y - convertedCenter.y);
// I may have that backwards, try the one below if it offsets the rotation in the wrong direction..
// CGSize offset = CGSizeMake(convertedCenter.x -_directionArrowView.center.x , convertedCenter.y - _directionArrowView.center.y);
CGFloat rotation = 0; //get your angle (radians)
CGAffineTransform tr = CGAffineTransformMakeTranslation(-offset.width, -offset.height);
tr = CGAffineTransformConcat(tr, CGAffineTransformMakeRotation(rotation) );
tr = CGAffineTransformConcat(tr, CGAffineTransformMakeTranslation(offset.width, offset.height) );
[_directionArrowView setTransform:tr];
NB. the transform property on UIView is animatable, so you could put that last line in an animation block if desired.
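For example (an illustrative addition, not part of the original answer):
[UIView animateWithDuration:0.25 animations:^{
    // Animate the rotation built above.
    [_directionArrowView setTransform:tr];
}];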
A much easier solution might be to make the arrow image bigger and square, so that the black dot sits at the center of the image.
Compare the attached images and you'll see what I mean:
New image with black dot in center
Old image with shifted dot
Now you can simply use the standard anchor point of (0.5, 0.5) to rotate the edited image.

Moving a Rotated UIButton

I have a problem. I'm working on making a game. As part of my game I need images to be rotated and then moved in the direction of the rotated angle inside a game loop (using an NSTimer). In essence I'm trying to create the effect of launching a projectile. The code works fine when moving in perpendicular directions such as 0, 90, 180, 270, and 360 degrees, but at any other angle the image starts to glitch out. The object on the screen maintains its correct bounds and contents, but the actual displayed image disappears. Does anybody know what the problem is or some way I could get around it? If needed, I can make and post a video of my problem so you can see what I'm talking about.
Here is a sample of the code I'm using. The "background" variable is just a UIImageView:
angle = 60;
background.transform = CGAffineTransformRotate(object.transform, angle*M_PI/180); //converts degrees to radians and rotates the image
background.frame = CGRectMake( background.frame.origin.x + cos(angle*M_PI/180)*32; background.frame.origin.y - sin(angle*M_PI/180)*32, background.frame.size.width, background.frame.size.height); //moves the image in the direction of the angle
For starters, there is a semicolon after the x origin in your CGRect instead of a comma. Was that just a typo?
The UIView documentation for frame states:
Warning: If the transform property is not the identity transform, the value of this property is undefined and therefore should be ignored.
Changes to this property can be animated. However, if the transform property contains a non-identity transform, the value of the frame property is undefined and should not be modified. In that case, you can reposition the view using the center property and adjust the size using the bounds property instead.
So there you have it: you should not be trying to change the frame when setting a custom transform. You are only trying to adjust the position of the view anyway, so just modify your code to adjust center instead of the origin coordinates.
To change the size, you can use the bounds.
CGRect bounds = myView.bounds;
bounds.size.width = whatever;
bounds.size.height = whatever;
myView.bounds = bounds;
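Following that advice, the movement step from the question could touch only center (a sketch based on the question's variables; the 32-point step and the 60-degree angle come from the original snippet):
// Rotate via the transform, then move by adjusting center rather than frame,
// since frame is undefined while a non-identity transform is set.
CGFloat angle = 60.0 * M_PI / 180.0;
background.transform = CGAffineTransformMakeRotation(angle);
background.center = CGPointMake(background.center.x + cos(angle) * 32.0,
                                background.center.y - sin(angle) * 32.0);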

How can I mirror a UIBezierPath?

I have a UIBezierPath and I would like to get its mirror image. How can I accomplish this?
// Method for generating a path
UIBezierPath *myPath = [self generateAPathInBounds:boundingRect];
// ? now I would like to mirror myPath ?
// Method for generating a path
UIBezierPath *myPath = [self generateAPathInBounds:boundingRect];
// Create two transforms, one to mirror the x coordinates across the origin
// (a horizontal flip), and one to translate the resulting path back into the
// desired boundingRect
CGAffineTransform mirrorOverXOrigin = CGAffineTransformMakeScale(-1.0f, 1.0f);
CGAffineTransform translate = CGAffineTransformMakeTranslation(boundingRect.size.width, 0);
// Apply these transforms to the path
[myPath applyTransform:mirrorOverXOrigin];
[myPath applyTransform:translate];
It should be noted that you might need additional code depending on where in the overall view the path you are trying to mirror sits.
If you are trying to mirror a path that takes up the whole view (or is at least flush with the 0 coordinate of the axis you are mirroring over), then @wbarksdale's answer will work for you. However, if you are trying to mirror a path that's a small section of the overall view, somewhere in the middle of the view, then you need to do more work. In general, the algorithm is like this:
//this rect should be the bounds of your path within its superview's
//coordinate system
let rect = path.bounds
//first, you need to move the path all the way to the left,
//because otherwise, if you mirror it in its current position,
//the path will be thrown way off screen to the left
path.apply(CGAffineTransform(translationX: -rect.origin.x, y: 0))
//then you mirror it
path.apply(CGAffineTransform(scaleX: -1, y: 1))
//then, after it's mirrored, move it back to its original position
path.apply(CGAffineTransform(translationX: rect.origin.x + rect.width, y: 0))
In general, the algorithm should be:
1. Move the path either to the very left or the very top of the view, depending on whether you are flipping horizontally or vertically, using CGAffineTransform(translationX:y:)
2. Mirror the path using CGAffineTransform(scaleX:y:)
3. Move the path back to its original position using CGAffineTransform(translationX:y:)
In case your path is not centered at (0,0), this code works:
// Method for generating a path
UIBezierPath *myPath = [self generateAPathInBounds:boundingRect];
// Create two transforms, one to mirror the x coordinates across the origin
// (a horizontal flip), and a pair of translations so the flip happens about the
// vertical center line of boundingRect
CGAffineTransform mirrorOverXOrigin = CGAffineTransformMakeScale(-1.0f, 1.0f);
CGAffineTransform translate = CGAffineTransformMakeTranslation(-CGRectGetMidX(boundingRect), 0);
CGAffineTransform translateBack = CGAffineTransformMakeTranslation(CGRectGetMidX(boundingRect), 0);
// Apply these transforms to the path
[myPath applyTransform:translate];
[myPath applyTransform:mirrorOverXOrigin];
[myPath applyTransform:translateBack];
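The same mirroring can also be written as a single concatenated transform built around the path's own bounding box, which works wherever the path sits in the view (a sketch of the same idea, not from the original answers):
// Flip myPath horizontally about the vertical midline of its own bounding box.
CGRect pathBox = CGPathGetPathBoundingBox(myPath.CGPath);
CGFloat midX = CGRectGetMidX(pathBox);
CGAffineTransform flip = CGAffineTransformMakeTranslation(midX, 0); // applied last: move back
flip = CGAffineTransformScale(flip, -1.0f, 1.0f);                   // then mirror the x coordinates
flip = CGAffineTransformTranslate(flip, -midX, 0);                  // applied first: center on the origin
[myPath applyTransform:flip];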

How to compose Core Animation CATransform3D matrices to animate simultaneous translation and scaling

I want to simultaneously scale and translate a CALayer from one CGRect (a small one, from a button) to another (a bigger, centered one, for a view). Basically, the idea is that the user touches a button and, from the button, a CALayer reveals and translates and scales up to end up centered on the screen. Then the CALayer (through another button) shrinks back to the position and size of the button.
I'm animating this through CATransform3D matrices. But the CALayer is actually the backing layer for a UIView (because I also need responder functionality). And while applying my scale or translation transforms separately works fine, the concatenation of both (translation followed by scaling) offsets the layer's position so that it doesn't align with the button when it shrinks.
My guess is that this is because the CALayer anchor point is in its center by default. The transform applies translation first, moving the 'big' CALayer to align with the button at the upper left corner of their frames. Then, when scaling takes place, since the CALayer anchor point is in the center, all directions scale down towards it. At this point, my layer is the button's size (what I want), but the position is offset (because all points shrank towards the layer center).
Makes sense?
So I'm trying to figure out whether, instead of concatenating translation + scale, I need to:
1. translate
2. change the anchor point to upper-left
3. scale
Or whether I should be able to come up with some factor or constant to incorporate into the values of the translation matrix, so that it translates to a position offset by what the subsequent scaling will in turn offset, and then the final position would be right.
Any thoughts?
You should post your code. It is generally much easier for us to help you when we can look at your code.
Anyway, this works for me:
- (IBAction)showZoomView:(id)sender {
    [UIView animateWithDuration:.5 animations:^{
        self.zoomView.layer.transform = CATransform3DIdentity;
    }];
}
- (IBAction)hideZoomView:(id)sender {
    CGPoint buttonCenter = self.hideButton.center;
    CGPoint zoomViewCenter = self.zoomView.center;
    CATransform3D transform = CATransform3DIdentity;
    transform = CATransform3DTranslate(transform, buttonCenter.x - zoomViewCenter.x, buttonCenter.y - zoomViewCenter.y, 0);
    transform = CATransform3DScale(transform, .001, .001, 1);
    [UIView animateWithDuration:.5 animations:^{
        self.zoomView.layer.transform = transform;
    }];
}
In my test case, self.hideButton and self.zoomView have the same superview.
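For the initial state (an assumption on my part; the answer doesn't show its setup code), the zoom view could start out collapsed onto the button by applying the same shrink transform once up front, e.g. in viewDidLoad, so that showZoomView: then animates it back to the identity transform:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Start with zoomView shrunk down onto the button; showZoomView: undoes this.
    CGPoint buttonCenter = self.hideButton.center;
    CGPoint zoomViewCenter = self.zoomView.center;
    CATransform3D transform = CATransform3DIdentity;
    transform = CATransform3DTranslate(transform, buttonCenter.x - zoomViewCenter.x, buttonCenter.y - zoomViewCenter.y, 0);
    transform = CATransform3DScale(transform, .001, .001, 1);
    self.zoomView.layer.transform = transform;
}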
