Resizing UIView using CGAffineTransform scale doesn't update frame - ios

I'm trying to resize a UIView (Parent) with a few subviews in it using CGAffineTransform scale.
I'm resizing the parent by dragging it from one corner using pan gesture.
The resizing works as expected but if I try to resize it again, it jumps back to the initial frame. It's like it never knew it was resized.
These are the steps I am doing so far:
1.- Just when the pan gesture begins, I get the initial frame and the touch location in the superview:
if gesture.state == .began {
    // We get all initial values from the first touch
    initialFrame = self.frame
    touchStart = gesture.location(in: superview)
}
2.- Then, for the handle I'm dragging (top right in this case), I set the anchor point, calculate the deltas (initial touch minus the distance traveled by the gesture), calculate the new frame and scales, and apply the transform.
case topRight:
    if gesture.state == .began {
        self.setAnchorPoint(anchorPoint: CGPoint(x: 0, y: 1))
    }
    let deltaX = -1 * (touchStart.x - gesture.location(in: superview).x)
    let deltaY = 1 * (touchStart.y - gesture.location(in: superview).y)
    let newWidth = initialFrame.width + deltaX
    let newHeight = initialFrame.height + deltaY
    let scaleX: CGFloat = newWidth / initialFrame.width
    let scaleY: CGFloat = newHeight / initialFrame.height
    self.transform = CGAffineTransform.identity.scaledBy(x: scaleX, y: scaleY)
3.- Finally, I reset the anchor point to the middle of the UIView.
if gesture.state == .ended {
    self.setAnchorPoint(anchorPoint: CGPoint(x: 0.5, y: 0.5))
}
I attached a gif where you can see the UIView being resized from the top-right handle. When I try to resize it again, it jumps back to the initial frame. (It may look like the video simply restarted, but that jump is the problem.)
What am I missing? Do I need to update something else?
Thank you all!

So the answer is actually quite simple: the following line of code was incorrect:
self.transform = CGAffineTransform.identity.scaledBy(x: scaleX, y: scaleY)
Here I am applying the scaling to the identity transform, which always represents the original size.
Just by changing this to:
self.transform = self.initialTransform.scaledBy(x: scaleX, y: scaleY)
Now I am applying the scaling to the current transform, which is saved when the gesture begins.
@objc func handleUserPan(gesture: UIPanGestureRecognizer) {
    if gesture.state == .began {
        // Save the current transform and the first touch location
        self.initialTransform = self.transform
        self.touchStart = gesture.location(in: self.superview)
    }
    switch gesture.view! {
    case self.topRight: // This is just the circle view used as a drag handle
        if gesture.state == .began {
            self.setAnchorPoint(anchorPoint: CGPoint(x: 1, y: 1))
        }
        let deltaX = 1 * (self.touchStart.x - gesture.location(in: superview).x)
        let deltaY = 1 * (self.touchStart.y - gesture.location(in: superview).y)
        let newWidth = self.initialFrame.width + deltaX
        let newHeight = self.initialFrame.height + deltaY
        let scaleX: CGFloat = newWidth / self.initialFrame.width
        let scaleY: CGFloat = newHeight / self.initialFrame.height
        // Scale relative to the transform saved when the gesture began
        self.transform = self.initialTransform.scaledBy(x: scaleX, y: scaleY)
    default: ()
    }
    gesture.setTranslation(CGPoint.zero, in: self)
    self.updateDragHandles()
    if gesture.state == .ended {
        self.setAnchorPoint(anchorPoint: CGPoint(x: 0.5, y: 0.5))
    }
}
Hopefully this is helpful for anyone who would like to scale a UIView the way Photoshop scales its layers. It even works if the subviews are rotated.
Keep in mind that once the view has a transform applied, you can no longer rely on setting its frame. That's why keeping all the views inside a container superview is more convenient: the container takes care of all the resizing.
The goal was to replicate the same behavior that Photoshop does when resizing layers.
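As a small illustration of the note above: once a transform is applied, read the view's on-screen size from its bounds and transform instead of its frame. A minimal sketch, assuming a hypothetical container view that holds the subviews and the drag handles (not code from the answer):

// Hypothetical `container` view, not from the answer above.
// With a transform applied, frame is unreliable; bounds and center still are.
// For a pure scale transform, applying it to the bounds size gives the visual size.
let visualSize = container.bounds.size.applying(container.transform)
print("on-screen size:", abs(visualSize.width), abs(visualSize.height))

// Subviews keep laying themselves out against container.bounds,
// so scaling the container scales everything inside it, handles included.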

Related

How to scale view from top right corner

I'm changing the anchor point using the methods from this topic:
Changing my CALayer's anchorPoint moves the view, but it's not working for me.
What I see
The view is misplaced.
What I expect
The grey rectangle should expand from anchor point (1.0, 0.0) without being misplaced.
My code:
private func setAnchorPoint(anchorPoint: CGPoint, view: UIView) {
    let oldOrigin = view.frame.origin
    view.layer.anchorPoint = anchorPoint
    let newOrigin = view.frame.origin
    let translation = CGPoint(x: newOrigin.x - oldOrigin.x, y: newOrigin.y - oldOrigin.y)
    view.center = CGPoint(x: view.center.x - translation.x, y: view.center.y - translation.y)
}
setAnchorPoint(anchorPoint: CGPoint(x: 1.0, y: 0.0), view: self.actionLayer)
What I've tried
Saving the frame in a variable before changing the anchor point and restoring it afterwards, and doing the same with the layer position and the view center. Neither works.

Why does CGAffineTransform(Rotate) work differently for UIBezierCurve and UIImageView?

I’m using a drawingView, which is simply a VC’s view, to draw a rectangle that rotates depending on a pan gesture. I apply 3 transforms to it: I translate the rectangle left by half its width to put its top center at (0, 0), rotate it by some multiple of 45º, and then translate it to the position where the gesture began. This works perfectly.
When the pan gesture ends I want to drop an image view of the same size exactly where the rectangle is. The code in the VC is the same as in the drawing view, except that in the drawing view the CGAffineTransform is applied to the path:
func addImageView(_ x: CGFloat, _ y: CGFloat, _ direction: Direction) {
    let myImageView = MyImageView(frame: CGRect(x: 0, y: 0, width: 116, height: 200))
    myImageView.backgroundColor = UIColor.systemBlue
    var viewTransform = CGAffineTransform.identity
    viewTransform = viewTransform.translatedBy(x: -58, y: 0)
    var angle: CGFloat?
    let pi = CGFloat.pi
    switch direction {
    case .up:
        angle = 0
    case .upRight:
        angle = pi / 4.0
    case .right:
        angle = pi / 2.0
    case .downRight:
        angle = 3 * pi / 4.0
    case .down:
        angle = pi
    case .downLeft:
        angle = 5 * pi / 4.0
    case .left:
        angle = 3 * pi / 2.0
    case .upLeft:
        angle = 7 * pi / 4.0
    default:
        angle = nil
    }
    if angle != nil {
        viewTransform = viewTransform.rotated(by: angle!)
    }
    viewTransform = viewTransform.translatedBy(x: x, y: y)
    myImageView.transform = viewTransform
    view.addSubview(myImageView)
    print("myImageView: \(myImageView)")
}
If there is no rotation, the imageView drops perfectly. If there is rotation, it vanishes off the screen in all but one case. In that one case it basically draws at the correct angle, but in a very wrong position.
I’m not sure what point the image view is rotating about, and I've had no luck tracking down the correct mathematical changes.
Edit: When doing just the initial translate and the rotation, everything is fine and the image view rotates around the center of its frame. When I then add a translate after the rotation, it changes the point around which the view rotates, making a bigger circle.
So the problem is moving the rotated view into place after the rotation is done.
Drawing code for drawing view
class DraggingView: UIView {
    var currVector: (origin: CGPoint, direction: Direction)?

    // Only override draw() if you perform custom drawing.
    // An empty implementation adversely affects performance during animation.
    override func draw(_ rect: CGRect) {
        guard currVector != nil else {
            return
        }
        let drawRect: CGRect = CGRect(x: 0, y: 0, width: 116, height: 200)
        let drawPath = UIBezierPath(roundedRect: drawRect, cornerRadius: 10.0)
        drawPath.apply(CGAffineTransform(translationX: -58, y: 0))
        var transform: CGAffineTransform?
        let pi = 3.1415927
        switch currVector!.direction {
        case .downLeft:
            transform = CGAffineTransform(rotationAngle: pi / 4.0)
        case .downRight:
            transform = CGAffineTransform(rotationAngle: -pi / 4.0)
        case .left:
            transform = CGAffineTransform(rotationAngle: pi / 2.0)
        case .right:
            transform = CGAffineTransform(rotationAngle: -pi / 2.0)
        case .up:
            transform = CGAffineTransform(rotationAngle: pi)
        case .upLeft:
            transform = CGAffineTransform(rotationAngle: 3 * pi / 4.0)
        case .upRight:
            transform = CGAffineTransform(rotationAngle: -3 * pi / 4.0)
        default:
            transform = nil
        }
        if transform != nil {
            drawPath.apply(transform!)
        }
        drawPath.apply(CGAffineTransform(translationX: currVector!.origin.x, y: currVector!.origin.y))
        UIColor.label.setStroke()
        drawPath.stroke()
    }

    public func setVector(origin: CGPoint, direction: Direction) {
        currVector = (origin, direction)
        self.setNeedsDisplay()
    }

    public func clearVector() {
        currVector = nil
    }
}
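Not an answer from the thread, but a sketch of the ordering difference this question runs into. Chained calls such as identity.translatedBy(...).rotated(by:).translatedBy(...) compose so that the last call in the chain is applied to points first, in the view's own not-yet-rotated coordinates, whereas successive UIBezierPath.apply(_:) calls transform already-transformed points, so each step runs in the drawing view's coordinates. Concatenating explicitly reproduces the path's ordering (keeping in mind that a UIView's transform is also applied about its layer's anchorPoint, the centre by default, rather than the frame origin):

// Hypothetical helper, not from the question: builds a single transform that
// performs the same three steps, in the same order, as the sequential
// path.apply(_:) calls in draw(_:). `angle` and `point` are assumed inputs.
func pathStyleTransform(angle: CGFloat, droppedAt point: CGPoint) -> CGAffineTransform {
    let centreTopEdge = CGAffineTransform(translationX: -58, y: 0)            // 1. top centre to (0, 0)
    let rotate        = CGAffineTransform(rotationAngle: angle)               // 2. rotate about (0, 0)
    let moveIntoPlace = CGAffineTransform(translationX: point.x, y: point.y)  // 3. move to the gesture point

    // concatenating(_:) applies the receiver first, so this runs 1 → 2 → 3,
    // the same order as path.apply(T1); path.apply(T2); path.apply(T3).
    return centreTopEdge.concatenating(rotate).concatenating(moveIntoPlace)
}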

Facing Issue with CGAffineTransform in NIB

I have 2 views (viewLeft, viewRight). I am rotating these views with angle 0 (initial setup), calculating an anchor point for each so that it looks like the views are rotating around a hinge.
I have the code below, which works fine if I place the views in a storyboard (view controller), but it does not work if I copy and paste the same views and code into a nib. In the nib, at angle 0 the views leave some space between them. It is weird behavior; I checked all the outlets and constraints and everything looks fine, but I'm unable to figure out why it is not working.
I want it to work in the nib.
Any help will be appreciated.
Github dummy project reference here
Outputs :
From Storyboard
From XIB
Code work
Set Anchor point
extension UIView {
    func setAnchorPoint(_ point: CGPoint) {
        var newPoint = CGPoint(x: bounds.size.width * point.x, y: bounds.size.height * point.y)
        var oldPoint = CGPoint(x: bounds.size.width * layer.anchorPoint.x, y: bounds.size.height * layer.anchorPoint.y)
        newPoint = newPoint.applying(transform)
        oldPoint = oldPoint.applying(transform)
        var position = layer.position
        position.x -= oldPoint.x
        position.x += newPoint.x
        position.y -= oldPoint.y
        position.y += newPoint.y
        layer.position = position
        layer.anchorPoint = point
    }
}
Rotation function
func rotateView(_ addAngel: CGFloat) {
    let rotateToAngel = angel + addAngel
    guard rotateToAngel <= 0.6 && rotateToAngel >= -0.6 else {
        return
    }
    angel = rotateToAngel
    let anchorPointLeft = CGPoint(x: (viewLeft.frame.width - viewLeft.frame.height / 2) / viewLeft.frame.width,
                                  y: (viewLeft.frame.height / 2) / viewLeft.frame.height)
    viewLeft.setAnchorPoint(anchorPointLeft)
    let anchorPointRight = CGPoint(x: (viewRight.frame.height / 2) / viewRight.frame.width,
                                   y: (viewRight.frame.height / 2) / viewRight.frame.height)
    viewRight.setAnchorPoint(anchorPointRight)
    self.viewLeft.transform = CGAffineTransform(rotationAngle: rotateToAngel)
    self.viewRight.transform = CGAffineTransform(rotationAngle: -rotateToAngel)
}
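For reference, a worked example of the anchor-point formula above (the 300×100 size is assumed, not taken from the question): the hinge anchor sits half the view's height in from the hinge edge, vertically centred.

// Assumed viewLeft size of 300×100 (not from the question).
// The formula only makes sense once the view has its final layout size.
let w: CGFloat = 300, h: CGFloat = 100
let anchorPointLeft = CGPoint(x: (w - h / 2) / w,  // (300 - 50) / 300 ≈ 0.833
                              y: (h / 2) / h)      // 0.5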

Rotate a view from its origin towards another view

I am trying to rotate a view towards another view's center point (remember: not around it, towards it).
Assume I have 2 views placed like this
Now I want to rotate the topview to point the bottom view like this
So this is what I did:
Changed the top view's anchor point to its origin, so that it can rotate and point its edge at the bottom view
Calculated the angle between the top view's origin point and the bottom view's center
And applied the calculated transform to the top view.
Below is the code I am using:
let rect = CGRect(x: 70, y: 200, width: 300, height: 100)
let rectView = UIView(frame: rect)
rectView.layer.borderColor = UIColor.green.cgColor
rectView.layer.borderWidth = 2

let endView = UIView(frame: CGRect(x: 250, y: 450, width: 70, height: 70))
endView.layer.borderColor = UIColor.green.cgColor
endView.layer.borderWidth = 2
let end = endView.center

self.view.addSubview(endView)
self.view.addSubview(rectView)

rectView.setAnchorPoint(CGPoint.zero)
let angle = rectView.bounds.origin.angle(to: end)
UIView.animate(withDuration: 3) {
    rectView.transform = rectView.transform.rotated(by: angle)
}
I am using this extension from Get angle from 2 positions to calculate the angle between 2 points
extension CGPoint {
    func angle(to comparisonPoint: CGPoint) -> CGFloat {
        let originX = comparisonPoint.x - self.x
        let originY = comparisonPoint.y - self.y
        let bearingRadians = atan2f(Float(originY), Float(originX))
        var bearingDegrees = CGFloat(bearingRadians).degrees
        while bearingDegrees < 0 {
            bearingDegrees += 360
        }
        return bearingDegrees
    }
}

extension CGFloat {
    var degrees: CGFloat {
        return self * CGFloat(180.0 / M_PI)
    }
}
extension UIView {
    func setAnchorPoint(_ point: CGPoint) {
        var newPoint = CGPoint(x: bounds.size.width * point.x, y: bounds.size.height * point.y)
        var oldPoint = CGPoint(x: bounds.size.width * layer.anchorPoint.x, y: bounds.size.height * layer.anchorPoint.y)
        newPoint = newPoint.applying(transform)
        oldPoint = oldPoint.applying(transform)
        var position = layer.position
        position.x -= oldPoint.x
        position.x += newPoint.x
        position.y -= oldPoint.y
        position.y += newPoint.y
        layer.position = position
        layer.anchorPoint = point
    }
}
But this isn't working as expected; the rotation is way off. Check the screen capture of the issue below.
I assume this issue is related to how the angle is calculated, but I couldn't figure out what's wrong.
Any help is much appreciated.
Angles need to be in radians
The rotation angle needs to be specified in radians:
Change:
rectView.transform = rectView.transform.rotated(by: angle)
to:
rectView.transform = rectView.transform.rotated(by: angle / 180.0 * .pi)
or change your angle(to:) method to return radians instead of degrees.
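For the second option, a sketch of what a radians-returning version of the question's extension could look like (not code from the original answer):

extension CGPoint {
    // Bearing from this point to comparisonPoint, in radians (0 ..< 2π).
    func angle(to comparisonPoint: CGPoint) -> CGFloat {
        let dx = comparisonPoint.x - self.x
        let dy = comparisonPoint.y - self.y
        var bearing = CGFloat(atan2f(Float(dy), Float(dx)))
        while bearing < 0 {
            bearing += 2 * .pi
        }
        return bearing
    }
}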
Use frame instead of bounds
Also, you need to use the frame of your rectView when computing the angle. A view's bounds is its internal coordinate space, so its origin is always (0, 0). You want the frame, which is the view's rectangle in its superview's coordinate system.
Change:
let angle = rectView.bounds.origin.angle(to: end)
to:
let angle = rectView.frame.origin.angle(to: end)
Note: Because your anchorPoint is the corner of rectView, this will point the top edge of rectView to the center of endView. One way to fix that would be to change your anchorPoint to the center of the left edge of rectView and then use that point to compute your angle.
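A sketch of that suggestion, reusing the question's setAnchorPoint(_:) and angle(to:) extensions (angle(to:) still returning degrees here, so it is converted as in the first fix):

rectView.setAnchorPoint(CGPoint(x: 0, y: 0.5))                       // middle of the left edge
let pivot = CGPoint(x: rectView.frame.minX, y: rectView.frame.midY)  // that point in the superview
let angle = pivot.angle(to: endView.center)                          // degrees
UIView.animate(withDuration: 3) {
    rectView.transform = rectView.transform.rotated(by: angle / 180.0 * .pi)
}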

UIView layer moves when rotating about non-centre anchor point

I have a UIView which covers another UIView. When the covering view is tapped, I want it to “fall”, that is, animate it around its x axis about the bottom centre point.
I’ve implemented this like so:
private func showCategories(animated: Bool)
{
    UIView.animateWithDuration(2,
        delay: 0,
        options: [.CurveEaseIn],
        animations: {
            self.setAnchorPoint(CGPoint(x: 0.5, y: 1), forView: self.titleView)
            var rotationAndPerspectiveTransform = CATransform3DIdentity
            rotationAndPerspectiveTransform.m34 = 1 / -500
            rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform, CGFloat(-179 * M_PI / 180), 1, 0, 0)
            self.titleView.layer.transform = rotationAndPerspectiveTransform
        }) { (success) -> Void in
            //
        }
}
I found a solution which said that changing the anchor point moves the view, so to fix it you need to move the view back, taking the anchor point and transform into account. It provided this code:
private func setAnchorPoint(anchorPoint: CGPoint, forView view: UIView)
{
    var newPoint = CGPoint(x: view.bounds.width * anchorPoint.x, y: view.bounds.height * anchorPoint.y)
    var oldPoint = CGPoint(x: view.bounds.width * view.layer.anchorPoint.x, y: view.bounds.height * view.layer.anchorPoint.y)
    newPoint = CGPointApplyAffineTransform(newPoint, view.transform)
    oldPoint = CGPointApplyAffineTransform(oldPoint, view.transform)
    var position = view.layer.position
    position.x -= oldPoint.x
    position.x += newPoint.x
    position.y -= oldPoint.y
    position.y += newPoint.y
    view.layer.position = position
    view.layer.anchorPoint = anchorPoint
}
However, this still doesn’t work for me. Am I using it wrong?
P.S. I realise the animated parameter isn’t being taken into account yet; it’s for debugging purposes.
Okay, so moving the setAnchorPoint(anchorPoint:forView:) call to before the animation block works.
My bad.
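For completeness, the working order sketched in current Swift syntax (the only change from the question's code is that the anchor point is set before the animation starts):

// Set the anchor point first, outside the animation block.
setAnchorPoint(anchorPoint: CGPoint(x: 0.5, y: 1), forView: titleView)

UIView.animate(withDuration: 2, delay: 0, options: [.curveEaseIn], animations: {
    var rotationAndPerspectiveTransform = CATransform3DIdentity
    rotationAndPerspectiveTransform.m34 = 1 / -500
    rotationAndPerspectiveTransform = CATransform3DRotate(rotationAndPerspectiveTransform,
                                                          CGFloat(-179 * Double.pi / 180), 1, 0, 0)
    self.titleView.layer.transform = rotationAndPerspectiveTransform
}, completion: nil)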
