Flip a scaled and rotated UIView in Objective-C - iOS

I am trying to flip a scaled and rotated UIView on the horizontal axis.
Here is the code being used -
CGFloat xScale = selectedFrame.transform.a;
CGFloat yScale = selectedFrame.transform.d;
selectedFrame.layer.transform = CATransform3DScale(CATransform3DMakeRotation(M_PI, 0, 1, 0),xScale, yScale,-1);
The output of this is that the view flips properly and the original scale factor is also maintained, but the rotation isn't.
Here are the images to explain the problem -
Original Image (The tiger view has to be flipped on the horizontal axis) -
Flipped image after the above code (see that the scale is maintained and the image is flipped properly, but the rotation angle isn't) -
Any help would be really appreciated!

To flip the image you are using a rotation of M_PI around the Y axis. To rotate the image, you need to apply another different rotation around the Z axis. These are two different transforms. You can combine them using CATransform3DConcat. Then you can scale the resulting transform.
CATransform3D transform = CATransform3DConcat(CATransform3DMakeRotation(zRotation, 0, 0, 1),
                                              CATransform3DMakeRotation(M_PI, 0, 1, 0));
[layer setTransform:CATransform3DScale(transform, xScale, yScale, 1)];
The problem with your original code is that you only apply the Y-axis rotation, so the view's original Z-axis rotation is discarded.
I'm sure there is a more elegant way to do this, but this works on my simulator.
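If the original rotation lives in the view's affine transform, the Z angle and the scale can be recovered from the matrix before building the combined transform. Here is a minimal sketch, assuming selectedFrame's transform contains only scale and rotation (no skew):
// Recover the scale and Z rotation already applied to the view.
CGAffineTransform t = selectedFrame.transform;
CGFloat xScale = sqrt(t.a * t.a + t.b * t.b);
CGFloat yScale = sqrt(t.c * t.c + t.d * t.d);
CGFloat zRotation = atan2(t.b, t.a);

// Keep the original Z rotation, add the Y-axis flip, then reapply the scale.
CATransform3D transform = CATransform3DConcat(CATransform3DMakeRotation(zRotation, 0, 0, 1),
                                              CATransform3DMakeRotation(M_PI, 0, 1, 0));
selectedFrame.layer.transform = CATransform3DScale(transform, xScale, yScale, 1);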

Related

Rotate view around y axis

How to rotate a UIView around its Y axis using CGAffineTransform (or other)?
I've tried for example with:
self.image.transform = CGAffineTransform(rotationAngle: 2*CGFloat.pi)
but it only rotates around the Z axis.
That is not something a CGAffineTransform can express; you want a CATransform3D.
But you cannot set that as a UIView's transform. You will have to set it on the view's layer (or on some CALayer).
https://developer.apple.com/documentation/quartzcore/calayer/1410836-transform
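For example, in Objective-C this might look like the following sketch (the view name is taken from the question; the m34 perspective entry is optional and only makes the Y rotation visually apparent):
// Rotate the backing layer 45 degrees around the Y axis.
CATransform3D transform = CATransform3DIdentity;
transform.m34 = -1.0 / 500.0;   // optional perspective
transform = CATransform3DRotate(transform, 45.0 * M_PI / 180.0, 0, 1, 0);
self.image.layer.transform = transform;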

iOS - Draw image with CGContext and transform

I am trying to draw an image on top of another image. I have the image's size, transform and origin. My code below draws with the correct size and transform angle, but not at the correct position.
Code:
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, [[UIScreen mainScreen] scale]);
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect baseRect = CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height);
[backgroundImage drawInRect:baseRect];
CGRect newRect = CGRectMake(x, y, width, height);
CGContextTranslateCTM(context, x, y);
CGContextConcatCTM(context, watermarkImageView.transform);
CGContextTranslateCTM(context, -x, -y);
[watermarkImageView.image drawInRect:newRect];
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return result;
The watermark image should be placed like this:
But currently it looks like this:
What did I miss?
Thanks in advance
EDIT
The x,y is the edge of the bounding box
Your code doesn't show what watermarkImageView.transform is, and that is important because when you concat transformations, the effects of previous transformations will also affect all the following transformations.
For example, a translation that moves the object 10 pixels along the x-axis will move the object 10 pixels to the right. However, if you first have a rotation that rotates the object by 45 degrees and then add a translation that moves 10 pixels along the x-axis, the object will not move 10 pixels to the right; it will move 10 pixels along a line that is rotated 45 degrees, which means it will move about 7 pixels up and 7 pixels to the right. That's because a rotation does not really rotate the object itself; it actually rotates the whole coordinate system, which causes the object to be drawn rotated.
See this image:
Initially the translation coordinate system (red lines) matches the "real coordinate" system. But after the rotation by 45 degrees, the translation coordinate system has been rotated and now translating across the red lines moves the object diagonally.
Think about a sheet of paper and a stamp. The stamp always has the same position and the same orientation, you cannot move or rotate the stamp. But you can move and rotate the sheet of paper below the stamp! And that's what your transformations do. They transform the sheet before the stamp is pressed upon it.
For most people it is very hard to imagine the effects of transforming the whole space, it's much easier for them to think about transforming the object. The trick is: You must read your transformations in the opposite order than you wrote them. I guess what you want to do is actually:
CGContextTranslateCTM(context, x, y);
CGContextConcatCTM(context, watermarkImageView.transform);
CGContextTranslateCTM(context,
                      -watermarkImageView.bounds.size.width * 0.5,
                      -watermarkImageView.bounds.size.height * 0.5);
Now read them in the opposite order (from bottom to top). First you center the watermark around (0,0) by moving it up half the height and left half the width. Now the center of your watermark is exactly at (0,0). Then you rotate it as desired. Finally you move it to the desired position. Of course you wrote all transformations the other way round but that's only because you are transforming the coordinate space, not the object.
Centering your watermark prior to rotation is important because rotation always rotates around (0,0) coordinates. If you'd just rotate, the rotation looks like this:
That's not what you want as it will not just rotate the object but also changes its position. If you center the image around (0,0) first, the rotation looks like this instead:
The answer to my question was that I had to translate the context to the centre of where I want to draw.
CGContextTranslateCTM(context, imageView.center.x, imageView.center.y);
Then rotate the context.
CGFloat angle = [(NSNumber *)[imageView valueForKeyPath:@"layer.transform.rotation.z"] floatValue];
CGContextRotateCTM(context, angle);
Then draw
[imageView.image drawInRect:CGRectMake(-width * 0.5f, -height * 0.5f, width, height)];
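Putting the pieces together, the whole routine might look like the sketch below. The method name is hypothetical, and it assumes the watermark's transform is a pure rotation and that imageView.center is expressed in the background image's coordinate space:
// Sketch: draw backgroundImage, then draw imageView's image rotated about its centre.
- (UIImage *)imageByCompositingWatermark:(UIImageView *)imageView onBackground:(UIImage *)backgroundImage {
    UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, [[UIScreen mainScreen] scale]);
    CGContextRef context = UIGraphicsGetCurrentContext();

    [backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];

    CGFloat width = imageView.bounds.size.width;
    CGFloat height = imageView.bounds.size.height;
    CGFloat angle = [(NSNumber *)[imageView valueForKeyPath:@"layer.transform.rotation.z"] floatValue];

    // Move the origin to where the watermark's centre should be, rotate the
    // coordinate system, then draw the image centred on (0, 0).
    CGContextTranslateCTM(context, imageView.center.x, imageView.center.y);
    CGContextRotateCTM(context, angle);
    [imageView.image drawInRect:CGRectMake(-width * 0.5f, -height * 0.5f, width, height)];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}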

Convert Video frame's aspect ratio to screen dimensions

My app uses the camera, and it processes every frame, looking for some elements inside the image (such as faces).
When I find something in the image, I want to draw on the screen showing the video, using drawRect:(CGRect)rect of a custom view.
I'm not drawing only rectangles, and I'm not using the GPU or GLKView.
When I run video frames at 1920*1080 I can draw at the exact location by dividing the screen width and height by the video frame size.
However, when I change to 480*360 video resolution, the elements are not drawn at the correct location; I believe this is due to aspect ratio differences.
Any idea what conversion I need to do here, perhaps a CGAffineTransform?
Yes, it is due to the aspect ratio. In one of my projects I faced a similar problem and did the following.
Find the scale factor
float xScale = destination.size.width / imageSize.width; // destination is the max image drawing area.
float yScale = destination.size.height / imageSize.height;
float scaleFactor = xScale < yScale ? xScale : yScale;
Find the drawing rectangle by
float destinationHeight = imageSize.height * scaleFactor;
float destinationWidth = imageSize.width * scaleFactor;
This will give you a rectangle that can hold the image at its aspect ratio; draw your image using this rectangle. You can also center this rectangle in the view by offsetting it by half of the unused space in each dimension.
Now convert your face rectangle from image coordinates to drawing-rectangle coordinates by multiplying by the scale factor.
Use the converted points to draw the rectangle on the destination rectangle.
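A compact sketch of the whole conversion (the function name is illustrative; it assumes the video frame is drawn aspect-fit and centred inside the view):
// Convert a face rect from video-frame coordinates to view coordinates,
// assuming the frame is drawn aspect-fit and centred in the view.
static CGRect ConvertFaceRect(CGRect faceRect, CGSize imageSize, CGSize viewSize) {
    CGFloat xScale = viewSize.width / imageSize.width;
    CGFloat yScale = viewSize.height / imageSize.height;
    CGFloat scale = MIN(xScale, yScale);

    // Size of the drawn frame and the offset that centres it in the view.
    CGSize drawnSize = CGSizeMake(imageSize.width * scale, imageSize.height * scale);
    CGPoint offset = CGPointMake((viewSize.width - drawnSize.width) * 0.5,
                                 (viewSize.height - drawnSize.height) * 0.5);

    return CGRectMake(faceRect.origin.x * scale + offset.x,
                      faceRect.origin.y * scale + offset.y,
                      faceRect.size.width * scale,
                      faceRect.size.height * scale);
}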

How to Rotate CAShapeLayer containing UIBezierPath? [duplicate]

Possible Duplicate:
Rotate CGPath without changing its position
I searched and tested a variety of code for a couple of hours and I can't get this to work.
I am adding an arbitrary UIBezierPath at a random location to a CAShapeLayer which gets added to a view. I need to rotate the path so that I can handle device rotations. I can rotate the layer instead of the path. I just need the result to be rotated.
I already have methods to handle transforming the bezier path by scaling and translation. It works great, but now I need to simply rotate 90 degrees left or right.
Any recommendations on how to do this?
Basic code:
UIBezierPath *path = <create arbitrary path>
CAShapeLayer *layer = [CAShapeLayer layer];
[self addPathToLayer:layer
fromPath:path];
// I could get the center of the box but where is the box center for the view it is in?
// CGRect box = CGPathGetPathBoundingBox(path.CGPath);
// layer.anchorPoint = ? How to find the center of the box for the anchor point?
// Rotating here appears to rotate around 0,0 of the view
layer.transform = CATransform3DMakeRotation(DegreesToRadians(-90), 0.0, 0.0, 1.0);
I see the following post:
BezierPath Rotation in a UIView
I suppose I could rotate as-is and then translate the path back into place. I just need to figure out what the translation values would be.
I should also state that what I am seeing after I try to rotate is that the image moves off-screen somewhere. I tried rotating 25 degrees to see the movement, and it pivots around the view's origin of 0,0, so that if I rotate 90 degrees the image is off-screen. I am running these tests WITHOUT rotating the device - just to see how rotation works.
UPDATE #1 - 12/4/2012: For some bizarre reason if I set the position to a value I found empirically it moves the rotated bezier path into the correct position after rotation:
layer.position = CGPointMake(280, 60);
These values are a guess from starting/stopping the app and making adjustments. I have no idea why I need to adjust the position on rotation. The anchor point should be in the center of the layer. However, I did find that both the frame and position of a CAShapeLayer are ZERO even though the path is set, and also that the path is in the correct position within the view. The 280, 60 position shifts the path into what would be the center of the path bounding box when a rotation of +90 is made. If I change the rotation value I need to adjust the position. I should not have to make this adjustment manually.
I think a last resort is to somehow convert the bezier path to an image and then add it. I found that if I set the layer content to an image, then rotate, it rotates about its center point with no positional adjustment needed. Not so with setting the path.
UPDATE #2 12/4/2012 - I tried setting the frame, and with some fiddling I got it to center as follows:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
CGRect rect = CGRectMake(0, 0, box.origin.x + (3.5 * box.size.width), box.origin.y + (3.5 * box.size.height));
layer.frame = rect;
layer.transform = CATransform3DMakeRotation(DegreesToRadians(90), 0.0, 0.0, 1.0);
Why multiply by 3.5? I have no clue. I found that adding about 3.5 times the size of the box to the box origin shifts the rotated CAShapeLayer path to about where it should be.
There must be a better way to do this. This is a better solution than my previous post since the frame size does not depend on the rotation angle. I just don't know why the frame needs to be set to the value I am setting it to. I THOUGHT it should be
CGRectMake(0, 0, box.origin.x + (box.size.width / 2), box.origin.y + (box.size.height / 2));
However, it shifts the image to the left too much.
Another clue I found is that if I set the frame to [self view].frame (the frame of the entire parent view, which is the screen of the iPhone), then rotate, the rotation point is the center of the screen, and the path/image orbits around this center point. This is why I tried shifting the frame to what the center of the path should be, so that it orbits around the box center.
UPDATE #3 12/4/2012 - I tried to render the layer as an image. However, it appears that just setting the path of a layer does not make it an "image" in the layer, since the rendered image is empty:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);
layer.frame = box;
UIImage *image = [ImageHelper imageFromLayer:layer]; // ImageHelper library I created
CAShapeLayer *newLayer = [CAShapeLayer layer];
newLayer.frame = CGRectMake(box.origin.x, box.origin.y, image.size.width, image.size.height);
newLayer.contents = (id) image.CGImage;
It appears that rotating the layer with its path set is no different than simply rotating the bezier path itself. I will go back to rotating the bezier path and see if I can fiddle with the position elements or something. There's got to be a solution to this.
Goal: Rotate a UIBezierPath around its center point within the view it was originally created in.
UPDATE #4 12/4/2012 - I ran a series of tests measuring the values needed for translation in order to place a UIBezierPath in its previous center location.
CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(-15));
[path applyTransform:rotate];
// CGAffineTransform translate = CGAffineTransformMakeTranslation(-110, 70); // -45
CGAffineTransform translate = CGAffineTransformMakeTranslation(-52, -58); // -15
[path applyTransform:translate];
However, the ratios of x/y translations do not correspond, so I cannot extrapolate what translation is required based on the angle. It appears that 'CGAffineTransformMakeRotation' uses some arbitrary anchor point to make the rotation, which at the moment appears to be maybe (viewWidth / 2, 0). I am making this much harder than it needs to be. There's something I am missing to make a simple rotation so that the center point is maintained. I just need to "spin" the path 90 degrees left or right.
UPDATE #5 12/4/2012 - After running additional tests it appears that the anchor point for rotating a UIBezierPath is the origin from which all of the points were drawn. In this case the origin is 0,0 and all of the points are relative to that point. Therefore, if a rotation is applied, the rotation occurs around the origin, which is why the path shifts up-right at -90 and up-left at 90. I need to somehow set the anchor point for the rotation to the center so it "spins" around the center, rather than around the original origin point. 12 hours spent on this one issue.
After some detailed analysis and graphing the bounding box on paper, I found that my assertion about the 0,0 origin is correct.
A solution to this problem is to translate the path (the underlying matrix) so that the center of the bounding box is at the origin, rotate, then translate the path back to its original location.
Here's how to rotate a UIBezierPath 90 degrees:
CGRect box = CGPathGetPathBoundingBox(path.CGPath);

CGAffineTransform translate = CGAffineTransformMakeTranslation(-1 * (box.origin.x + (box.size.width / 2)),
                                                               -1 * (box.origin.y + (box.size.height / 2)));
[path applyTransform:translate];

CGAffineTransform rotate = CGAffineTransformMakeRotation(DegreesToRadians(90));
[path applyTransform:rotate];

translate = CGAffineTransformMakeTranslation(box.origin.x + (box.size.width / 2),
                                             box.origin.y + (box.size.height / 2));
[path applyTransform:translate];
Plug in -90 degrees to rotate in the other direction.
This formula can be used when rotating the device from portrait to landscape and vice versa.
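For reuse, the same three steps can be wrapped in a small helper (a sketch; the function name is illustrative):
// Rotate a UIBezierPath in place around the centre of its bounding box.
static void RotatePathAroundCenter(UIBezierPath *path, CGFloat degrees) {
    CGRect box = CGPathGetPathBoundingBox(path.CGPath);
    CGFloat cx = CGRectGetMidX(box);
    CGFloat cy = CGRectGetMidY(box);

    [path applyTransform:CGAffineTransformMakeTranslation(-cx, -cy)];            // centre on (0, 0)
    [path applyTransform:CGAffineTransformMakeRotation(degrees * M_PI / 180.0)]; // spin
    [path applyTransform:CGAffineTransformMakeTranslation(cx, cy)];              // move back
}
After rotating, reassign the path to the layer with layer.path = path.CGPath.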
I still don't think this is the ideal solution but the result is what I need for now.
If anyone has a better solution for this please post.
UPDATE 12/7/2012 - I found what I think is the best solution, and it is as simple as I thought it would be. Rather than using rotate, translate, and scale methods on the bezier path, I instead extract the array of points as CGPoint objects and scale/translate them as needed based on the view size as well as the orientation. I then create a new bezier path and set the layer's path to it.
The result is perfect scaling, translation, rotation.

How to compose Core Animation CATransform3D matrices to animate simultaneous translation and scaling

I want to simultaneously scale and translate a CALayer from one CGRect (a small one, from a button) to another (a bigger, centered one, for a view). Basically, the idea is that the user touches a button and, from the button, a CALayer reveals, translates and scales up to end up centered on the screen. Then the CALayer (through another button) shrinks back to the position and size of the button.
I'm animating this through CATransform3D matrices. But the CALayer is actually the backing layer for a UIView (because I also need responder functionality). And while applying my scale or translation transforms separately works fine, the concatenation of both (translation, followed by scaling) offsets the layer's position so that it doesn't align with the button when it shrinks.
My guess is that this is because the CALayer anchor point is in its center by default. The transform applies translation first, moving the 'big' CALayer to align with the button at the upper left corner of their frames. Then, when scaling takes place, since the CALayer anchor point is in the center, all directions scale down towards it. At this point, my layer is the button's size (what I want), but the position is offset (because all points shrank towards the layer center).
Makes sense?
So I'm trying to figure out whether instead of concatenating translation + scale, I need to:
translate
change anchor point to upper-left.
scale.
Or whether I should be able to come up with some factor or constant to incorporate into the values of the translation matrix, so that it translates to a position offset by what the subsequent scaling will in turn offset, and then the final position would be right.
Any thoughts?
You should post your code. It is generally much easier for us to help you when we can look at your code.
Anyway, this works for me:
- (IBAction)showZoomView:(id)sender {
    [UIView animateWithDuration:.5 animations:^{
        self.zoomView.layer.transform = CATransform3DIdentity;
    }];
}

- (IBAction)hideZoomView:(id)sender {
    CGPoint buttonCenter = self.hideButton.center;
    CGPoint zoomViewCenter = self.zoomView.center;

    CATransform3D transform = CATransform3DIdentity;
    transform = CATransform3DTranslate(transform, buttonCenter.x - zoomViewCenter.x, buttonCenter.y - zoomViewCenter.y, 0);
    transform = CATransform3DScale(transform, .001, .001, 1);

    [UIView animateWithDuration:.5 animations:^{
        self.zoomView.layer.transform = transform;
    }];
}
In my test case, self.hideButton and self.zoomView have the same superview.
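For completeness, the zoom view presumably starts out collapsed onto the button so that showZoomView: can animate it back to the identity transform. A possible setup, as a sketch (viewDidLoad and the showButton outlet are assumptions, not from the original answer):
- (void)viewDidLoad {
    [super viewDidLoad];

    // Assumed: start with zoomView shrunk onto the button that reveals it,
    // so the first showZoomView: animation expands it to its natural frame.
    CGPoint buttonCenter = self.showButton.center;
    CGPoint zoomViewCenter = self.zoomView.center;

    CATransform3D transform = CATransform3DIdentity;
    transform = CATransform3DTranslate(transform, buttonCenter.x - zoomViewCenter.x, buttonCenter.y - zoomViewCenter.y, 0);
    transform = CATransform3DScale(transform, .001, .001, 1);
    self.zoomView.layer.transform = transform;
}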
