Translating a UIView after rotating - ios

I'm trying to translate a UIView that has been rotated and/or scaled through the user's touches, and I translate it from user input as well:
- (void)handleObjectMove:(UIPanGestureRecognizer *)recognizer
{
static CGPoint lastPoint;
UIView *moveView = [recognizer view];
CGPoint newCoord = [recognizer locationInView:playArea];
// Check if this is the first touch
if( [recognizer state]==UIGestureRecognizerStateBegan )
{
// Store the initial touch so when we change positions we do not snap
lastPoint = newCoord;
}
// Create the frame offsets to use our finger position in the view.
float dX = newCoord.x;
float dY = newCoord.y;
dX-=lastPoint.x;
dY-=lastPoint.y;
// Figure out the translation based on how we are scaled
CGAffineTransform transform = [moveView transform];
CGFloat xScale = transform.a;
CGFloat yScale = transform.d;
dX/=xScale;
dY/=yScale;
lastPoint = newCoord;
[moveView setTransform:CGAffineTransformTranslate( transform, dX, dY )];
[recognizer setTranslation:CGPointZero inView:playArea];
}
But when I touch and move the view it gets translated in all different weird ways. Can I apply some sort of formula using the rotation values to translate properly?
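For what it's worth, the formula amounts to mapping the finger delta (measured in playArea's coordinate space) through the inverse of the transform's rotation/scale part before handing it to CGAffineTransformTranslate. A minimal sketch of that idea, assuming dX/dY are the deltas computed above:
// Sketch: map a superview-space delta into the view's own (rotated/scaled)
// coordinate space so CGAffineTransformTranslate moves the view by (dX, dY) on screen.
CGAffineTransform t = [moveView transform];
t.tx = 0;   // keep only the rotation/scale part of the transform
t.ty = 0;
CGPoint delta = CGPointApplyAffineTransform(CGPointMake(dX, dY),
                                            CGAffineTransformInvert(t));
[moveView setTransform:CGAffineTransformTranslate([moveView transform], delta.x, delta.y)];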

The best solution I've found that requires the least amount of math was to store the translation, rotation, and scaling values separately and rebuild the transform whenever they change. My solution was to subclass UIView with the following properties:
@property (nonatomic) CGPoint translation;
@property (nonatomic) CGFloat rotation;
@property (nonatomic) CGPoint scaling;
And the following functions:
- (void)rotationDelta:(CGFloat)delta
{
[self setRotation:[self rotation]+delta];
}
- (void)scalingDelta:(CGPoint)delta
{
[self setScaling:
(CGPoint){ [self scaling].x*delta.x, [self scaling].y*delta.y }];
}
- (void)translationDelta:(CGPoint)delta
{
[self setTranslation:
(CGPoint){ [self translation].x+delta.x, [self translation].y+delta.y }];
}
- (void)transformMe
{
// Start with the translation
CGAffineTransform transform = CGAffineTransformMakeTranslation( [self translation].x, [self translation].y );
// Apply scaling
transform = CGAffineTransformScale( transform, [self scaling].x, [self scaling].y );
// Apply rotation
transform = CGAffineTransformRotate( transform, [self rotation] );
[self setTransform:transform];
}
- (void)setScaling:(CGPoint)newScaling
{
scaling = newScaling;
[self transformMe];
}
- (void)setRotation:(CGFloat)newRotation
{
rotation = newRotation;
[self transformMe];
}
- (void)setTranslation:(CGPoint)newTranslation
{
translation = newTranslation;
[self transformMe];
}
And use the following in the handlers:
- (void)handleObjectPinch:(UIPinchGestureRecognizer *)recognizer
{
if( [recognizer state] == UIGestureRecognizerStateEnded
|| [recognizer state] == UIGestureRecognizerStateChanged )
{
// Get my stuff
if( !selectedView )
return;
SelectableImageView *view = selectedView;
CGFloat scaleDelta = [recognizer scale];
[view scalingDelta:(CGPoint){ scaleDelta, scaleDelta }];
[recognizer setScale:1.0];
}
}
- (void)handleObjectMove:(UIPanGestureRecognizer *)recognizer
{
static CGPoint lastPoint;
SelectableImageView *moveView = (SelectableImageView *)[recognizer view];
CGPoint newCoord = [recognizer locationInView:playArea];
// Check if this is the first touch
if( [recognizer state]==UIGestureRecognizerStateBegan )
{
// Store the initial touch so when we change positions we do not snap
lastPoint = newCoord;
}
// Create the frame offsets to use our finger position in the view.
float dX = newCoord.x;
float dY = newCoord.y;
dX-=lastPoint.x;
dY-=lastPoint.y;
lastPoint = newCoord;
[moveView translationDelta:(CGPoint){ dX, dY }];
[recognizer setTranslation:CGPointZero inView:playArea];
}
- (void)handleRotation:(UIRotationGestureRecognizer *)recognizer
{
if( [recognizer state] == UIGestureRecognizerStateEnded
|| [recognizer state] == UIGestureRecognizerStateChanged )
{
if( !selectedView )
return;
SelectableImageView *view = selectedView;
CGFloat rotation = [recognizer rotation];
[view rotationDelta:rotation];
[recognizer setRotation:0.0];
}
}

Try changing moveView.center instead of setting (x, y) directly or using CGAffineTransformTranslate.
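A minimal sketch of that suggestion, reusing the handler from the question and assuming playArea is the view's superview:
- (void)handleObjectMove:(UIPanGestureRecognizer *)recognizer
{
    UIView *moveView = [recognizer view];
    CGPoint translation = [recognizer translationInView:[moveView superview]];
    // Moving center is independent of any rotation/scale in the view's transform.
    [moveView setCenter:CGPointMake(moveView.center.x + translation.x,
                                    moveView.center.y + translation.y)];
    [recognizer setTranslation:CGPointZero inView:[moveView superview]];
}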

Here is the Swift 4/5 version for a transformable UIView
class TransformableImageView: UIView{
var translation:CGPoint = CGPoint(x:0,y:0)
var scale:CGPoint = CGPoint(x:1, y:1)
var rotation:CGFloat = 0
override init (frame : CGRect) {
super.init(frame: frame)
}
required init?(coder aDecoder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
func rotationDelta(delta:CGFloat) {
rotation = rotation + delta
}
func scaleDelta(delta:CGPoint){
scale = CGPoint(x: scale.x*delta.x, y: scale.y * delta.y)
}
func translationDelta(delta:CGPoint){
translation = CGPoint(x: translation.x+delta.x, y: translation.y + delta.y)
}
func applyTransform(){ // named to avoid clashing with UIView's transform property
self.transform = CGAffineTransform.identity.translatedBy(x: translation.x, y: translation.y).scaledBy(x: scale.x, y: scale.y ).rotated(by: rotation )
}
}

I'm leaving this here as I also encountered the same problem. Here is how to do it in Swift 2:
Add your top view as subview to your bottom view:
self.view.addSubview(topView)
Then add a PanGesture Recognizer to move on touch:
//Add PanGestureRecognizer to move
let panMoveGesture = UIPanGestureRecognizer(target: self, action: #selector(YourViewController.moveViewPanGesture(_:)))
topView.addGestureRecognizer(panMoveGesture)
And the function to move:
//Move function
func moveViewPanGesture(recognizer:UIPanGestureRecognizer)
{
if recognizer.state == .Changed {
var center = recognizer.view!.center
let translation = recognizer.translationInView(recognizer.view?.superview)
center = CGPoint(x:center.x + translation.x,
y:center.y + translation.y)
recognizer.view!.center = center
recognizer.setTranslation(CGPoint.zero, inView: recognizer.view)
}
}
Basically, you need to translate your view relative to the bottom view, which is its superview, not relative to itself, like this: recognizer.view?.superview
Or, if you also rotate the bottom view, you may add a view that will not have any transformation applied, add your bottom view to that non-transforming view (the very bottom view), and add your top view to the bottom view as a subview accordingly. Then you should translate your top view relative to the very bottom view.

Related

iOS, can UIPanGesture be used to do UIImageView corner drag for resizing?

I've searched for some time (days) for a solution, but none really do what I need. (iOS, Objective-C, BTW.)
I have a UIImageView that I resize with a UIPanGestureRecognizer. The typical pan works fine. It seems like I am so close.
But I want to resize the ImageView by dragging a corner of the image and only resizing dimensions relevant to the selected corner. It works great if I only do my "handleResize" UIPanGesture method. But if I pinch or rotate the image, the bounds or frame get messed up. I think I need some sort of CGAffineTransform but I have not been able to get it to work.
I need help to point me in the right direction. I've been working with CGAffineTransforms but I may be on the wrong track.
In my ViewController.h I have a float, touchRadius, set to 25:
float touchRadius = 25;
I have a UIPanGestureRecognizer in my ViewController.m:
- (IBAction)handleResize:(UIPanGestureRecognizer *)recognizer {
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:self.view];
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged) {
CGRect frame = recognizer.view.frame;
CGRect topLeft = CGRectMake(frame.origin.x,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomLeft = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect topRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
Boolean useNewFrame = YES;
CGRect newFrame = frame;
if (CGRectContainsPoint(topLeft, touch)) {
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(topRight, touch)) {
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(bottomLeft, touch)) {
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
newFrame.size.height += translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(bottomRight, touch)) {
newFrame.size.width += translation.x;
newFrame.size.height += translation.y;
recognizer.view.frame = newFrame;
} else {
useNewFrame = NO;
}
if (useNewFrame) {
// make sure it doesn't go too small to touch
if (newFrame.size.width < touchRadius)
newFrame.size.width = touchRadius;
if (newFrame.size.height < touchRadius)
newFrame.size.height = touchRadius;
recognizer.view.frame = newFrame;
// I THINK I NEED A TRANSFORM HERE
} else {
// use the fallback translate
[recognizer.view setTransform:CGAffineTransformTranslate(recognizer.view.transform, translation.x, translation.y)];
}
}
[recognizer setTranslation:CGPointZero inView:self.view];
}
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged)
{
// make sure it stays visible
float scale = recognizer.scale;
if (recognizer.view.frame.size.width * scale > touchRadius * 2 ||
recognizer.view.frame.size.height * scale > touchRadius * 2) {
[recognizer.view setTransform:CGAffineTransformScale(recognizer.view.transform, scale, scale)];
recognizer.scale = 1;
}
}
}
- (void)handleRotate:(UIRotationGestureRecognizer *)recognizer {
UIGestureRecognizerState state = [recognizer state];
if (state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged)
{
CGFloat rotation = [recognizer rotation];
[recognizer.view setTransform:CGAffineTransformRotate(recognizer.view.transform, rotation)];
}
[recognizer setRotation:0];
}
Do I need a transform after I modify the view frame? Or should I pursue another path?
I'm trying to do this natively without using libs from others so I understand it. This simplistic example will be part of a larger project.
After some research I figured out how to modify the transform for a corner/side dragging pan, to resize an image. Here are the key elements to make it work:
Save the CGAffineTransform in each gesture (pan, pinch, rotate) if the recognizer state = UIGestureRecognizerStateBegan
Apply new CGAffineTransforms on the saved initial transform in each gesture (pan, pinch, rotate)
Save the corner/side detected in the UIGestureRecognizerStateBegan state
Clear the corner/side detected in the UIGestureRecognizerStateEnded state
For pan gesture, adjust the translation x/y values based on corner/side detected
Make sure touch radius is large enough to be useful (24 was too small, 48 works well)
The transform worked like this:
// pan the image
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, tx, ty);
if (scaleIt) {
// the origin or size changed
recognizer.view.frame = newFrame;
}
The tx and ty values were the defaults returned from the recognizer if the pan was from the center of the image. But if the user touch was near a corner or side of the view frame, the tx/ty and the frame origin are adjusted to resize the view making it appear as if that corner or side were being dragged to resize the view.
For example:
CGRect newFrame = recognizer.view.frame;
if (currentDragType == DRAG_TOPLEFT) {
tx = -translation.x;
ty = -translation.y;
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_TOPRIGHT) {
tx = translation.x;
ty = -translation.y;
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
}
This makes the top left corner or top right corner move in or out according to how far the touch moved. Unlike a center pan, where the whole view moves along with the touch, the opposite corner (or side) remains fixed.
There were 2 problems I did not resolve:
if the image is rotated significantly, the corner/side detection (pan) does not work because I did not check for the touch in the rotated coordinate system (but the center pan still works fine); see the sketch after this list
once the image is rotated, pinch gestures work erratically and can resize an image to zero, making it invisible
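For the first issue, one possible fix (an assumption, not something the demo implements) is to hit-test in the view's own coordinate space, which already accounts for the current transform, instead of testing against the rotated frame in the superview:
// Sketch: locationInView: converts the touch through the view's transform,
// so the corner rects can be built from the untransformed bounds.
CGPoint touchInView = [recognizer locationInView:recognizer.view];
CGRect topLeft = CGRectMake(0, 0, touchRadius, touchRadius);
BOOL touchIsInTopLeft = CGRectContainsPoint(topLeft, touchInView);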
I created a simple demo and uploaded it to GitHub: https://github.com/ByteSlinger/ImageGestureDemo
Yes, I know that link could go away someday, so here's the code:
ViewController.h
//
// ViewController.h
// ImageGestureDemo
//
// Created by ByteSlinger on 6/21/18.
// Copyright © 2018 ByteSlinger. All rights reserved.
//
#import <UIKit/UIKit.h>
NSString *APP_TITLE = @"Image Gesture Demo";
NSString *INTRO_ALERT = @"\nDrag, Pinch and Rotate the Image!"
"\n\nYou can also Drag, Pinch and Rotate the background image."
"\n\nDouble tap an image to reset it";
float touchRadius = 48; // max distance from corners to touch point
typedef NS_ENUM(NSInteger, DragType) {
DRAG_OFF,
DRAG_ON,
DRAG_CENTER,
DRAG_TOP,
DRAG_BOTTOM,
DRAG_LEFT,
DRAG_RIGHT,
DRAG_TOPLEFT,
DRAG_TOPRIGHT,
DRAG_BOTTOMLEFT,
DRAG_BOTTOMRIGHT
};
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
//callback to process gesture events
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer;
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer;
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer;
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer;
@end
ViewController.m
//
// ViewController.m
// ImageGestureDemo
//
// This is a DEMO. It shows how to pan, pinch, rotate and drag/resize a UIImageView.
//
// There is a background image and a foreground image. Both images can be
// panned, pinched and rotated, but only the foreground image can be resized
// by dragging one of its corners or its sides.
//
// NOTE: Sure, much of this code could have been put into a subclass of UIView
// or UIImageView. But for simplicity and reference sake, all code and
// methods are in one place, this ViewController subclass. There is no
// error checking at all. App tested on an iPhone 6+ and an iPad gen3.
//
// Features:
// - allows an image to be resized with pan gesture by dragging corners and sides
// - background image can be modified (pan, pinch, rotate)
// - foreground image can be modified (pan, pinch, rotate, drag/resize)
// - all image manipulation done within gestures linked from storyboard
// - all finger touches on screen show with yellow circles
// - when dragging, the touch circles turn to red (so you know when gestures start)
// - double tap on foreground image resets it to original size and rotation
// - double tap on background resets it and also resets the foreground image
// - screen and image touch and size info displayed on screen
// - uses CGAffineTransform objects for image manipulation
// - uses UIGestureRecognizerStateBegan in gestures to save transforms (the secret sauce...)
//
// Known Issues:
// - when the image is rotated, determining if a touch is on a corner or side
// does not work for large rotations. Need to check touch points against
// non rotated view frame and adjust accordingly.
// - after rotations, pinch and resize can shrink image to invisibility despite
// code attempts to prevent it.
//
// Created by ByteSlinger on 6/21/18.
// Copyright © 2018 ByteSlinger. All rights reserved.
//
#import "ViewController.h"
@interface ViewController ()
@property (strong, nonatomic) IBOutlet UIImageView *backgroundImageView;
@property (strong, nonatomic) IBOutlet UIImageView *foregroundImageView;
@property (strong, nonatomic) IBOutlet UILabel *screenInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *touchInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *imageInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *backgroundInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *changeInfoLabel;
@property (strong, nonatomic) IBOutlet UITapGestureRecognizer *backgroundTapGesture;
@property (strong, nonatomic) IBOutlet UITapGestureRecognizer *foregroundTapGesture;
@end
@implementation ViewController
CGRect originalImageFrame;
CGRect originalBackgroundFrame;
CGAffineTransform originalImageTransform;
CGAffineTransform originalBackgroundTransform;
NSMutableArray* touchCircles = nil;
DragType currentDragType = DRAG_OFF;
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// set this to whatever your desired touch radius is
touchRadius = 48;
// In Storyboard this must be set to 1; then this seems to work ok
// when setting the double tap here
_foregroundTapGesture.numberOfTapsRequired = 2;
_backgroundTapGesture.numberOfTapsRequired = 2;
[self centerImageView:_foregroundImageView];
originalImageFrame = _foregroundImageView.frame;
originalBackgroundFrame = _backgroundImageView.frame;
originalImageTransform = _foregroundImageView.transform;
originalBackgroundTransform = _backgroundImageView.transform;
_backgroundImageView.contentMode = UIViewContentModeCenter;
_foregroundImageView.contentMode = UIViewContentModeScaleToFill; // allow stretch
[_backgroundImageView setUserInteractionEnabled:YES];
[_backgroundImageView setMultipleTouchEnabled:YES];
[_foregroundImageView setUserInteractionEnabled:YES];
[_foregroundImageView setMultipleTouchEnabled:YES];
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter]
addObserver:self selector:@selector(orientationChanged:)
name:UIDeviceOrientationDidChangeNotification
object:[UIDevice currentDevice]];
[_touchInfoLabel setText:nil];
[_changeInfoLabel setText:nil];
[_imageInfoLabel setText:nil];
[_backgroundInfoLabel setText:nil];
touchCircles = [[NSMutableArray alloc] init];
}
- (void)viewDidAppear:(BOOL)animated {
[self alert:APP_TITLE :INTRO_ALERT];
}
- (void) orientationChanged:(NSNotification *)note
{
UIDevice * device = note.object;
switch(device.orientation)
{
case UIDeviceOrientationPortrait:
/* start special animation */
break;
case UIDeviceOrientationPortraitUpsideDown:
/* start special animation */
break;
default:
break;
};
[_screenInfoLabel setText:[NSString stringWithFormat:@"Screen: %.0f/%.0f",
self.view.frame.size.width,self.view.frame.size.height]];
}
//
// Update the info labels from the passed objects
//
- (void) updateInfo:(UIView *)imageView touch:(CGPoint)touch change:(CGPoint)change {
NSString *label;
UILabel *infoLabel;
if (imageView == _foregroundImageView) {
label = #"Image: %0.f/%0.f, %0.f/%0.f";
infoLabel = _imageInfoLabel;
} else {
label = #"Background: %0.f/%0.f, %0.f/%0.f";
infoLabel = _backgroundInfoLabel;
}
[infoLabel setText:[NSString stringWithFormat:label,
imageView.layer.frame.origin.x,
imageView.layer.frame.origin.y,
imageView.layer.frame.size.width,
imageView.layer.frame.size.height]];
[_touchInfoLabel setText:[NSString stringWithFormat:@"Touch: %0.f/%.0f",
touch.x,touch.y]];
[_changeInfoLabel setText:[NSString stringWithFormat:@"Change: %0.f/%.0f",
change.x,change.y]];
}
//
// Center the passed image frame within its bounds
//
- (void)centerImageView:(UIImageView *)imageView {
CGSize boundsSize = self.view.bounds.size;
CGRect frameToCenter = imageView.frame;
// center horizontally
if (frameToCenter.size.width < boundsSize.width)
frameToCenter.origin.x = (boundsSize.width - frameToCenter.size.width) / 2;
else
frameToCenter.origin.x = 0;
// center vertically
if (frameToCenter.size.height < boundsSize.height)
frameToCenter.origin.y = (boundsSize.height - frameToCenter.size.height) / 2;
else
frameToCenter.origin.y = 0;
imageView.frame = frameToCenter;
}
//
// Remove all touch circles
//
- (void)removeTouchCircles {
[touchCircles makeObjectsPerformSelector: @selector(removeFromSuperview)];
[touchCircles removeAllObjects];
}
//
// Draw a circle around the passed point where the user has touched the screen
//
- (void)drawTouchCircle:(UIView *)view fromCenter:(CGPoint)point ofRadius:(float)radius {
CGRect frame = CGRectMake(point.x - view.frame.origin.x - radius,
point.y - view.frame.origin.y - radius,
radius * 2, radius * 2);
UIView *circle = [[UIView alloc] initWithFrame:frame];
circle.alpha = 0.5;
circle.layer.cornerRadius = radius;
circle.backgroundColor = currentDragType == DRAG_OFF ? [UIColor yellowColor] : [UIColor redColor];
[circle.layer setBorderWidth:1.0];
[circle.layer setBorderColor:[[UIColor blackColor]CGColor]];
[view addSubview:circle];
[touchCircles addObject:circle];
}
//
// Draw a touch circle for the passed user touch
//
- (void)handleTouchEvent:(UIView *) view
atPoint:(CGPoint) point
forState:(UIGestureRecognizerState) state
clear:(Boolean) clear {
//NSLog(#"handleTouchEvent");
if (clear) {
[self removeTouchCircles];
}
if (state == UIGestureRecognizerStateEnded) {
[self removeTouchCircles];
} else {
[self drawTouchCircle:self.view fromCenter:point ofRadius:touchRadius];
}
[_touchInfoLabel setText:[NSString stringWithFormat:@"Touch: %0.f/%.0f",
point.x,point.y]];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesBegan");
[self removeTouchCircles];
NSSet *allTouches = [event allTouches];
NSArray *allObjects = [allTouches allObjects];
for (int i = 0;i < [allObjects count];i++)
{
UITouch *touch = [allObjects objectAtIndex:i];
CGPoint location = [touch locationInView: self.view];
[self handleTouchEvent:touch.view atPoint:location forState:UIGestureRecognizerStateBegan clear:NO];
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesMoved");
[self removeTouchCircles];
NSSet *allTouches = [event allTouches];
NSArray *allObjects = [allTouches allObjects];
for (int i = 0;i < [allObjects count];i++)
{
UITouch *touch = [allObjects objectAtIndex:i];
CGPoint location = [touch locationInView: self.view];
[self handleTouchEvent:touch.view atPoint:location forState:UIGestureRecognizerStateChanged clear:NO];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesEnded");
[self removeTouchCircles];
}
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesCancelled");
[self removeTouchCircles];
}
//
// Double tap resets passed image. If background image, also reset foreground image.
//
- (IBAction)handleDoubleTap:(UITapGestureRecognizer *)recognizer {
CGPoint touch = [recognizer locationInView: self.view];
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged) {
[self handleTouchEvent:recognizer.view atPoint:touch forState:recognizer.state clear:NO];
} else {
[self removeTouchCircles];
}
[self alert:#"Reset" :#"The Image has been Reset!"];
CGRect frame = originalImageFrame;
CGAffineTransform transform = originalImageTransform;
if (recognizer.view == _backgroundImageView) {
_foregroundImageView.transform = transform;
_foregroundImageView.frame = frame;
[self updateInfo:_foregroundImageView touch:touch change:CGPointZero];
frame = originalBackgroundFrame;
transform = originalBackgroundTransform;
}
recognizer.view.transform = transform;
recognizer.view.frame = frame;
[self updateInfo:recognizer.view touch:touch change:CGPointZero];
}
- (void) setDragType:(CGRect)frame withTouch:(CGPoint)touch {
// the corners and sides of the current view frame
CGRect topLeft = CGRectMake(frame.origin.x,frame.origin.y,
touchRadius, touchRadius);
CGRect bottomLeft = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect topRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect leftSide = CGRectMake(frame.origin.x,frame.origin.y,
touchRadius, frame.size.height);
CGRect rightSide = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, frame.size.height);
CGRect topSide = CGRectMake(frame.origin.x,frame.origin.y,
frame.size.width, touchRadius);
CGRect bottomSide = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
frame.size.width, touchRadius);
if (CGRectContainsPoint(topLeft, touch)) {
currentDragType = DRAG_TOPLEFT;
} else if (CGRectContainsPoint(topRight, touch)) {
currentDragType = DRAG_TOPRIGHT;
} else if (CGRectContainsPoint(bottomLeft, touch)) {
currentDragType = DRAG_BOTTOMLEFT;
} else if (CGRectContainsPoint(bottomRight, touch)) {
currentDragType = DRAG_BOTTOMRIGHT;
} else if (CGRectContainsPoint(topSide, touch)) {
currentDragType = DRAG_TOP;
} else if (CGRectContainsPoint(bottomSide, touch)) {
currentDragType = DRAG_BOTTOM;
} else if (CGRectContainsPoint(leftSide, touch)) {
currentDragType = DRAG_LEFT;
} else if (CGRectContainsPoint(rightSide, touch)) {
currentDragType = DRAG_RIGHT;
} else if (CGRectContainsPoint(frame, touch)) {
currentDragType = DRAG_CENTER;
} else {
currentDragType = DRAG_OFF; // touch point is not in the view frame
}
}
//
// Return the unrotated size of the view
//
- (CGSize) getActualSize:(UIView *)view {
CGSize result;
//CGSize originalSize = view.frame.size;
CGAffineTransform originalTransform = view.transform;
float rotation = atan2f(view.transform.b, view.transform.a);
// reverse rotation of current transform
CGAffineTransform unrotated = CGAffineTransformRotate(view.transform, -rotation);
view.transform = unrotated;
// get the size of the "unrotated" view
result = view.frame.size;
// reset back to what it was
view.transform = originalTransform;
//NSLog(#"Size current = %0.f/%0.f, rotation = %0.2f, unrotated = %0.f/%0.f",
// originalSize.width,originalSize.height,
// rotation,
// result.width,result.height);
return result;
}
//
// Resize or Pan an image on the ViewController View
//
- (IBAction)handleResize:(UIPanGestureRecognizer *)recognizer {
static CGRect initialFrame;
static CGAffineTransform initialTransform;
static Boolean scaleIt = YES;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:recognizer.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialFrame = recognizer.view.frame;
initialTransform = recognizer.view.transform;
[self setDragType:recognizer.view.frame withTouch:touch];
scaleIt = YES;
}
if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
scaleIt = NO;
[self getActualSize:recognizer.view];
} else {
// our new view frame - start with the initial one
CGRect newFrame = initialFrame;
// adjust the translation point according to where the user touched the image
float tx = translation.x;
float ty = translation.y;
// resize by dragging a corner or a side
if (currentDragType == DRAG_TOPLEFT) {
tx = -translation.x;
ty = -translation.y;
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_TOPRIGHT) {
ty = -translation.y;
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_BOTTOMLEFT) {
tx = -translation.x;
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_BOTTOMRIGHT) {
// origin does not change
newFrame.size.width += translation.x;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_TOP) {
tx = 0;
newFrame.origin.y += translation.y;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_BOTTOM) {
tx = 0;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_LEFT) {
tx = -translation.x;
ty = 0;
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
} else if (currentDragType == DRAG_RIGHT) {
ty = 0;
newFrame.size.width += translation.x;
} else { //if (currentDragType == DRAG_CENTER) {
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
scaleIt = NO; // normal pan
}
// get the unrotated size of the view
CGSize actualSize = [self getActualSize:recognizer.view];
// make sure we can still touch the image
if (actualSize.width < touchRadius * 2) {
newFrame.size.width += touchRadius * 2;
tx = 0; // stop resizing
}
if (actualSize.height < touchRadius * 2) {
newFrame.size.height += touchRadius * 2;
ty = 0; // stop resizing
}
// pan the image
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, tx, ty);
if (scaleIt) {
// the origin or size changed
recognizer.view.frame = newFrame;
}
}
[self updateInfo:recognizer.view touch:touch change:translation];
}
//
// Pan an image on the ViewController View
//
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:recognizer.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialTransform = recognizer.view.transform;
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, translation.x, translation.y);
[self updateInfo:recognizer.view touch:touch change:translation];
}
//
// Pinch (resize) an image on the ViewController View
//
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
static CGSize initialSize;
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint change = CGPointMake(recognizer.view.transform.tx, recognizer.view.transform.ty);
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialSize = recognizer.view.frame.size;
initialTransform = recognizer.view.transform;
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
// make sure it stays visible
float scale = recognizer.scale;
float newWidth = initialSize.width * scale;
float newHeight = initialSize.height * scale;
// make sure we can still touch it
if (newWidth > touchRadius * 2 && newHeight > touchRadius * 2) {
// scale the image
recognizer.view.transform = CGAffineTransformScale(initialTransform, scale, scale);
}
[self updateInfo:recognizer.view touch:touch change:change];
}
//
// Rotate an image on the ViewController View
//
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer {
static CGFloat initialRotation;
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialTransform = recognizer.view.transform;
initialRotation = atan2f(recognizer.view.transform.b, recognizer.view.transform.a);
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
recognizer.view.transform = CGAffineTransformRotate(initialTransform, recognizer.rotation);
[self updateInfo:recognizer.view touch:touch change:CGPointMake(initialRotation, recognizer.rotation)];
}
//
// Prevent simultaneous gestures so my transforms don't get funky
// (may not be necessary ... )
//
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
return NO;
}
//
// Spew a message to the user
//
- (void)alert:(NSString *) title :(NSString *)message {
UIAlertController *alert = [UIAlertController
alertControllerWithTitle: title
message: message
preferredStyle:UIAlertControllerStyleAlert];
UIAlertAction *okButton = [UIAlertAction
actionWithTitle:#"Ok"
style:UIAlertActionStyleDefault
handler:^(UIAlertAction * action) {
}];
[alert addAction:okButton];
[self presentViewController:alert animated:YES completion:nil];
}
@end

UIPanGestureRecognizer to implement scale and rotate a UIView

There is a UIView A. I put an icon on view A and try to use a pan gesture to scale and rotate view A. The scale function works fine, but I can't make the rotation work. The code is as follows. Any help will be appreciated, thanks.
- (void)scaleAndRotateWatermark:(UIPanGestureRecognizer *)gesture
{
if (gesture.state == UIGestureRecognizerStateChanged
|| gesture.state == UIGestureRecognizerStateEnded)
{
UIView *child = gesture.view;
UIView *view = child.superview;
CGPoint translatedPoint = [gesture translationInView:view];
CGPoint originCenter = child.center;
CGPoint newCenter = CGPointMake(child.centerX+translatedPoint.x, child.centerY+translatedPoint.y);
float origX = originCenter.x;
float origY = originCenter.y;
float newX = newCenter.x;
float newY = newCenter.y;
float originDis = (origX*origX) + (origY*origY);
float newDis = (newX*newX) + (newY*newY);
float scale = newDis/originDis;
view.transform = CGAffineTransformScale(view.transform, scale, scale);
// calculate rotation here
// need your help
// end of rotation
[gesture setTranslation:CGPointZero inView:_videoPreviewLayer];
}
}
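One common way to derive the rotation from the same pan (a sketch under the assumption that, like the scale math above, the icon's position is measured from view A's origin) is to compare the angle of the touch point before and after the translation:
// Sketch: rotate by the change in angle of the icon's center around A's origin,
// reusing origX/origY/newX/newY from the code above.
CGFloat oldAngle = atan2f(origY, origX);
CGFloat newAngle = atan2f(newY, newX);
view.transform = CGAffineTransformRotate(view.transform, newAngle - oldAngle);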

How to Resize UIView with Subview

My goal is to resize UIView with a handle - its subview. I got my project working perfectly to accomplish that: the handle has a PanGestureRecognizer and its handler method resizes the view (parent) using the following method:
-(IBAction)handleResizeGesture:(UIPanGestureRecognizer *)recognizer {
CGPoint touchLocation = [recognizer locationInView:container.superview];
CGPoint center = container.center;
switch (recognizer.state) {
case UIGestureRecognizerStateBegan: {
deltaAngle = atan2f(touchLocation.y - center.y, touchLocation.x - center.x) -
CGAffineTransformGetAngle(container.transform);
initialBounds = container.bounds;
initialDistance = CGPointGetDistance(center, touchLocation);
break;
}
case UIGestureRecognizerStateChanged: {
CGFloat scale = CGPointGetDistance(center, touchLocation)/initialDistance;
CGFloat minimumScale = self.minimumSize/MIN(initialBounds.size.width,
initialBounds.size.height);
scale = MAX(scale, minimumScale);
CGRect scaledBounds = CGRectScale(initialBounds, scale, scale);
container.bounds = scaledBounds;
[container setNeedsDisplay];
break;
}
case UIGestureRecognizerStateEnded:
break;
default:
break;
}
}
Please note that I use center and bounds properties because frame is NOT reliable when transform is applied to my view.
However, my requirement is really to resize the view in ANY direction - not only proportionally as the code does. The problem is that I cannot find the correct method or approach for this handle to resize its superview's bounds (width or height) so that it always sticks to the corner while the finger drags it around.
Here is my project if it is easier to see what I mean.
Update: the solution suggested by an answer below works very well, but once a transform is applied (e.g. in viewDidLoad I have container.transform = CGAffineTransformMakeRotation(90);) it does not:
case UIGestureRecognizerStateBegan: {
initialBounds = container.bounds;
initialDistance = CGPointGetDistance(center, touchLocation);
initialDistanceX = CGPointGetDistanceX(center, touchLocation);
initialDistanceY = CGPointGetDistanceY(center, touchLocation);
break;
}
case UIGestureRecognizerStateChanged: {
CGFloat scaleX = fabs(center.x-touchLocation.x)/initialDistanceX;
CGFloat scaleY = fabs(center.y-touchLocation.y)/initialDistanceY;
CGFloat minimumScale = self.minimumSize/MIN(initialBounds.size.width, initialBounds.size.height);
scaleX = MAX(scaleX, minimumScale);
scaleY = MAX(scaleY, minimumScale);
CGRect scaledBounds = CGRectScale(initialBounds, scaleX, scaleY);
container.bounds = scaledBounds;
[container setNeedsDisplay];
break;
}
where
CG_INLINE CGFloat CGPointGetDistanceX(CGPoint point1, CGPoint point2) {
return (point2.x - point1.x);
}
CG_INLINE CGFloat CGPointGetDistanceY(CGPoint point1, CGPoint point2) {
return (point2.y - point1.y);
}
You are setting the same scale parameter in your call to CGRectScale(initialBounds, scale, scale); try this:
case UIGestureRecognizerStateChanged: {
CGFloat scaleX = fabs(center.x-touchLocation.x)/initialDistance;
CGFloat scaleY = fabs(center.y-touchLocation.y)/initialDistance;
CGFloat minimumScale = self.minimumSize/MIN(initialBounds.size.width, initialBounds.size.height);
scaleX = MAX(scaleX, minimumScale);
scaleY = MAX(scaleY, minimumScale);
CGRect scaledBounds = CGRectScale(initialBounds, scaleX, scaleY);
container.bounds = scaledBounds;
[container setNeedsDisplay];
break;
You may also consider storing initialDistanceX and initialDistanceY.
Use UIPinchGestureRecognizer & UIPanGestureRecognizer.
Try this code
//--Create and configure the pinch gesture
UIPinchGestureRecognizer *pinchGestureRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchGestureDetected:)];
[pinchGestureRecognizer setDelegate:self];
[container.superview addGestureRecognizer:pinchGestureRecognizer];
//--Create and configure the pan gesture
UIPanGestureRecognizer *panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGestureDetected:)];
[panGestureRecognizer setDelegate:self];
[container.superview addGestureRecognizer:panGestureRecognizer];
For UIPinchGestureRecognizer:
- (void)pinchGestureDetected:(UIPinchGestureRecognizer *)recognizer
{
UIGestureRecognizerState state = [recognizer state];
if (state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged)
{
CGFloat scale = [recognizer scale];
[recognizer.view setTransform:CGAffineTransformScale(recognizer.view.transform, scale, scale)];
[recognizer setScale:1.0];
panGesture = YES;
}
}
For UIPanGestureRecognizer :
- (void)panGestureDetected:(UIPanGestureRecognizer *)recognizer
{
UIGestureRecognizerState state = [recognizer state];
if (panGesture==YES)
{
if (state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged)
{
CGPoint translation = [recognizer translationInView:recognizer.view];
[recognizer.view setTransform:CGAffineTransformTranslate(recognizer.view.transform, translation.x, translation.y)];
[recognizer setTranslation:CGPointZero inView:recognizer.view];
}}
}
I'd not control it with a subview. It gets messy when you apply a transform to the outer view.
Simply add two views to the ViewController, and hook them up with some code.
I've altered the project, find test-001 at GitHub.
ViewController code can be empty for this.
The view you want to transform (TransformableView) needs a scale property and a method to apply it (if you want to add other transformations as well).
@interface TransformableView : UIView
@property (nonatomic) CGSize scale;
-(void)applyTransforms;
@end
@implementation TransformableView
-(void)applyTransforms
{
// Do whatever additional transform you want (e.g. concat additional rotations / mirroring to the transform).
self.transform = CGAffineTransformMakeScale(self.scale.width, self.scale.height);
}
@end
And the DraggableCorner can manage the rest (with some basic math).
@class TransformableView;
@interface DraggableCorner : UIView
@property (nonatomic, weak) IBOutlet TransformableView *targetView; // Hook up in IB.
-(IBAction)panGestureRecognized:(UIPanGestureRecognizer*) recognizer;
@end
CG_INLINE CGPoint offsetOfPoints(CGPoint point1, CGPoint point2)
{ return (CGPoint){point1.x - point2.x, point1.y -point2.y}; }
CG_INLINE CGPoint addPoints(CGPoint point1, CGPoint point2)
{ return (CGPoint){point1.x + point2.x, point1.y + point2.y}; }
@interface DraggableCorner ()
@property (nonatomic) CGPoint touchOffset;
@end
@implementation DraggableCorner
-(IBAction)panGestureRecognized:(UIPanGestureRecognizer*) recognizer
{
// Location in superview.
CGPoint touchLocation = [recognizer locationInView:self.superview];
// Began.
if (recognizer.state == UIGestureRecognizerStateBegan)
{
// Finger distance from handler.
self.touchOffset = offsetOfPoints(self.center, touchLocation);
}
// Moved.
if (recognizer.state == UIGestureRecognizerStateChanged)
{
// Drag.
self.center = addPoints(touchLocation, self.touchOffset);
// Desired size.
CGPoint distanceFromTargetCenter = offsetOfPoints(self.center, self.targetView.center);
CGSize desiredTargetSize = (CGSize){distanceFromTargetCenter.x * 2.0, distanceFromTargetCenter.y * 2.0};
// -----
// You can put limitations here, simply clamp `desiredTargetSize` to some value.
// -----
// Scale needed for desired size.
CGSize targetSize = self.targetView.bounds.size;
CGSize targetRatio = (CGSize){desiredTargetSize.width / targetSize.width, desiredTargetSize.height / targetSize.height};
// Apply.
self.targetView.scale = targetRatio;
[self.targetView applyTransforms];
}
}
Set the classes in IB, and hook up the IBAction and the IBOutlet of DraggableCorner.

How to restrict a moveable view by Pan gesture

I have a UIImageView which is moveable via a pan gesture.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.photoMask addGestureRecognizer:pan];
I would like to restrict the area this can be moved on screen. Rather than the user be able to drag the view right to the side of the screen, I want to restrict it by a margin of some sort. How can I do this?
Also, how is this then handled when rotated?
EDIT ---
#pragma mark - Gesture Recognizer
-(void)handlePan:(UIPanGestureRecognizer *)gesture {
NSLog(#"Pan Gesture");
gesture.view.center = [gesture locationInView:self.view];
}
This is my current method to handle the pan. What I need to do is continue to move the image view by its center point and also restrict its movement when it gets close to the edge of the screen, by 50 points for example.
One possible solution to this is, in your handlePan method, to check the location of the point on the screen and only commit the change if it is within the bounds you wish to restrict it to.
For ex.
-(void) handlePan:(UIGestureRecognizer*)panGes{
CGPoint point = [panGes locationInView:self.view];
//Only allow movement up to within 100 pixels of the right bound of the screen
if (point.x < [UIScreen mainScreen].bounds.size.width - 100) {
CGRect newframe = CGRectMake(point.x, point.y, theImageView.frame.size.width, theImageView.frame.size.height);
theImageView.frame = newframe;
}
}
I believe this would also correctly handle any screen rotation
EDIT
To move your image view by the center of its frame, the handlePan method could look something like this.
-(void)handlePan:(UIPanGestureRecognizer *)gesture {
CGPoint point = [gesture locationInView:self.view];
//Only allow movement up to within 50 pixels of the bounds of the screen
//Ex. (IPhone 5)
CGRect boundsRect = CGRectMake(50, 50, 220, 448);
if (CGRectContainsPoint(boundsRect, point)) {
imgView.center = point;
}
}
Check whether the point is within your desired bounds, and if so, set the center of your image view frame to that point.
I'm not sure if I'm being over-simplistic here but I think you can accomplish this by using an if clause.
-(void)handlePan:(UIPanGestureRecognizer*)gesture {
UIImageView *viewToDrag = gesture.view; // this is the view you want to move
CGPoint translation = [gesture translationInView:viewToDrag.superview]; // get the movement delta
CGRect movedFrame = CGRectOffset(viewToDrag.frame, translation.x, translation.y); // this is the new (moved) frame
// Now this is the critical part because I don't know if your "margin"
// is a CGRect or maybe some int values, the important thing here is
// to compare if the "movedFrame" values are in the allowed movement area
// Assuming that your margin is a CGRect you could do the following:
if (CGRectContainsRect(yourPermissibleMargin, movedFrame)) {
CGPoint newCenter = CGPointMake(CGRectGetMidX(movedFrame), CGRectGetMidY(movedFrame));
viewToDrag.center = newCenter; // Move your view
}
// -OR-
// If you have your margins as int values you could do the following:
if ( (movedFrame.origin.x + movedFrame.size.width) < 50) {
CGPoint newCenter = CGPointMake(CGRectGetMidX(movedFrame), CGRectGetMidY(movedFrame));
viewToDrag.center = newCenter; // Move your view
}
}
You'll probably have to adapt this to meet your specific needs.
Hope this helps!
Here is the answer in Swift 4 -
Restrict the view's movement to superview
@objc func handlePan(_ gestureRecognizer: UIPanGestureRecognizer)
{
// Allows smooth movement of stickers.
if gestureRecognizer.state == .began || gestureRecognizer.state == .changed
{
let point = gestureRecognizer.location(in: self.superview)
if let superview = self.superview
{
let restrictByPoint : CGFloat = 30.0
let superBounds = CGRect(x: superview.bounds.origin.x + restrictByPoint, y: superview.bounds.origin.y + restrictByPoint, width: superview.bounds.size.width - 2*restrictByPoint, height: superview.bounds.size.height - 2*restrictByPoint)
if (superBounds.contains(point))
{
let translation = gestureRecognizer.translation(in: self.superview)
gestureRecognizer.view!.center = CGPoint(x: gestureRecognizer.view!.center.x + translation.x, y: gestureRecognizer.view!.center.y + translation.y)
gestureRecognizer.setTranslation(CGPoint.zero, in: self.superview)
}
}
}
}
If you want more control over it, match restrictByPoint value to your movable view's frame.
- (void)dragAction:(UIPanGestureRecognizer *)gesture{
UILabel *label = (UILabel *)gesture.view;
CGPoint translation = [gesture translationInView:label];
if (CGRectContainsPoint(label.frame, [gesture locationInView:label] )) {
label.center = CGPointMake(label.center.x,
label.center.y);
[gesture setTranslation:CGPointZero inView:label];
}
else{
label.center = CGPointMake(label.center.x,
label.center.y + translation.y);
[gesture setTranslation:CGPointZero inView:label];
}
}

Detecting if a CGAffineTransformed view is out of bounds of a screen/UIView

I have several views that I can drag around, rotate, and scale. I want to make it so they can't be dragged, rotated, or scaled off the screen.
Dragging doesn't seem to be an issue, as I'm not using a transform for it; I generate the new position and check whether that new position would put the view off the screen.
When I rotate or scale I use a CGAffineTransform (CGAffineTransformRotate or CGAffineTransformScale), and I can't seem to get what the new frame would be without actually applying it to my view.
CGRect newElementBounds = CGRectApplyAffineTransform(element.bounds, CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]));
CGRect elementBoundsInSuperView = [element convertRect:newElementBounds toView:[element superview]];
elementBoundsInSuperView is not the rect that I would expect it to be; it's way off.
I've also tried to get the bounds in the superview first and then apply the transform to it, and that's not right either.
CGRect elementBoundsInSuperView = [element convertRect:element.bounds toView:[element superview]];
CGRect newElementBounds = CGRectApplyAffineTransform(newElementBounds, CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]));
the [gestureRecognizer view] should be the same as element.
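For reference, a sketch of how the would-be frame can be computed without touching the view, assuming the default layer anchorPoint of (0.5, 0.5): a view's frame is its bounds transformed about its center, so transform a zero-centered copy of the bounds and offset it by the center.
// Sketch: predict the frame a proposed transform would produce.
CGAffineTransform proposed = CGAffineTransformScale(element.transform,
                                                    gestureRecognizer.scale,
                                                    gestureRecognizer.scale);
CGRect centeredBounds = CGRectMake(-element.bounds.size.width / 2.0,
                                   -element.bounds.size.height / 2.0,
                                   element.bounds.size.width,
                                   element.bounds.size.height);
CGRect predictedFrame = CGRectOffset(CGRectApplyAffineTransform(centeredBounds, proposed),
                                     element.center.x, element.center.y);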
I came up with some gesture handlers that work so the view you are manipulating does not go off the area you specified. My view palette was defined by kscreenEditorSpace, 2048.
The pan gesture just calls the calcCenterFromXposition:yPosition:fromBoundsInSuperView: method to set its center; if the center falls out of bounds, it just adjusts and keeps the element in bounds.
//--------------------------------------------------------------------------------------------------------
// handlePanGesture
// Description: Called when the element gets a pan gesture.
// Move the element, keeping it within the editor bounds.
//
//--------------------------------------------------------------------------------------------------------
- (void)handlePanGesture:(UIPanGestureRecognizer *)gestureRecognizer {
UIView *element = [gestureRecognizer view];
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ) {
[[self superview] bringSubviewToFront:self];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
//Front and Center Mr Element!
// Find out where we are going
CGPoint translation = [gestureRecognizer translationInView:[element superview]];
CGRect elementBoundsInSuperView = [element convertRect:element.bounds toView:[element superview]];
CGFloat xPosition = CGRectGetMidX(elementBoundsInSuperView) + translation.x;
CGFloat yPosition = CGRectGetMidY(elementBoundsInSuperView) + translation.y;
CGPoint newCenter = [self calcCenterFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView:elementBoundsInSuperView];
//Re position ourselves
[element setCenter:newCenter];
//set the translation back to 0 point
[gestureRecognizer setTranslation:CGPointZero inView:[element superview]];
[self setNeedsDisplay];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateEnded ) {
}
}
The pinch and rotation handlers are pretty similar. Instead of calling the calc-center method directly, we call another method to help determine if we are in bounds.
//--------------------------------------------------------------------------------------------------------
// handlePinchGesture
// Description: Called when the element gets a pinch gesture.
// Scale the element only if it stays within the editor bounds.
//
//--------------------------------------------------------------------------------------------------------
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer {
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ) {
[[self superview] bringSubviewToFront:self];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
BOOL aSelectedElementOffscreen = FALSE;
if ([[gestureRecognizer view] pinchOffScreen:[gestureRecognizer scale]]) {
aSelectedElementOffscreen = TRUE;
}
if (!aSelectedElementOffscreen) {
[gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]);
//Update ourself
[self contentSizeChanged];
}
[gestureRecognizer setScale:1];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateEnded) {
if (![self becomeFirstResponder]) {
NSLog(#" %# - %# - couldn't become first responder", INTERFACENAME, NSStringFromSelector(_cmd) );
return;
}
}
}
Pinch Off Screen Method
//--------------------------------------------------------------------------------------------------------
// pinchOffScreen
// Description: Called to see if the Pinch Gesture will cause element to go off screen Gesture
//
//--------------------------------------------------------------------------------------------------------
- (BOOL)pinchOffScreen:(CGFloat)scale {
// Save Our Current Transform in case we go off Screen
CGAffineTransform elementOrigTransform = [self transform];
//Apply our Transform
self.transform = CGAffineTransformScale([self transform], scale, scale);
// Get our new Bounds in the SuperView
CGRect newElementBoundsInSuperView = [self convertRect:self.bounds toView:[self superview]];
//Find out where we are in the SuperView
CGFloat xPosition = CGRectGetMidX( newElementBoundsInSuperView);
CGFloat yPosition = CGRectGetMidY( newElementBoundsInSuperView);
//See if we are off the Screen
BOOL offScreen = [self calcOffEditorFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView: newElementBoundsInSuperView];
// We just wanted to Check. Revert to where we were
self.transform = elementOrigTransform;
return offScreen;
}
The rotation handler is similar to pinch; we have a helper method to see if we rotated off screen.
//--------------------------------------------------------------------------------------------------------
// handleRotationGesture
// Description: Called when we get a rotation gesture
// toggle the scroll/zoom lock
//
//--------------------------------------------------------------------------------------------------------
- (void) handleRotationGesture:(UIRotationGestureRecognizer *)gestureRecognizer{
UIView *element = [gestureRecognizer view];
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan ) {
[[self superview] bringSubviewToFront:self];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
BOOL aSelectedElementOffscreen = FALSE;
if ([element rotateOffScreen:[gestureRecognizer rotation]]) {
aSelectedElementOffscreen = TRUE;
}
if (!aSelectedElementOffscreen) {
[gestureRecognizer view].transform = CGAffineTransformRotate([element transform], [gestureRecognizer rotation]);
//Update ourself
[self contentSizeChanged];
}
[gestureRecognizer setRotation:0];
}
if ([gestureRecognizer state] == UIGestureRecognizerStateEnded) {
}
}
Rotate Off Screen method
//--------------------------------------------------------------------------------------------------------
// rotateOffScreen
// Description: Called to see if the Rotation Gesture will cause element to go off screen Gesture
//
//--------------------------------------------------------------------------------------------------------
- (BOOL)rotateOffScreen:(CGFloat)rotation {
// Save Our Current Transform in case we go off Screen
CGAffineTransform elementOrigTransform = [self transform];
//Apply our Transform
self.transform = CGAffineTransformRotate([self transform], rotation);
// Get our new Bounds in the SuperView
CGRect newElementBoundsInSuperView = [self convertRect:self.bounds toView:[self superview]];
//Find out where we are in the SuperView
CGFloat xPosition = CGRectGetMidX( newElementBoundsInSuperView);
CGFloat yPosition = CGRectGetMidY( newElementBoundsInSuperView);
//See if we are off the Screen
BOOL offScreen = [self calcOffEditorFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView: newElementBoundsInSuperView];
// We just wanted to Check. Revert to where we were
self.transform = elementOrigTransform;
return offScreen;
}
Calc Screen Positioning Helper Methods
#pragma mark -
#pragma mark === Calc Screen Positioning ===
#pragma mark
//--------------------------------------------------------------------------------------------------------
// calcCenterFromXposition: yPosition: fromBoundsInSuperView:
// Description: calculate the center point in the element's super view from x, y
//
//--------------------------------------------------------------------------------------------------------
-(CGPoint) calcCenterFromXposition: (CGFloat) xPosition yPosition:(CGFloat) yPosition fromBoundsInSuperView:(CGRect) elementBoundsInSuperView{
// Get the Height/width based on SuperView Bounds
CGFloat elementWidth = CGRectGetWidth(elementBoundsInSuperView);
CGFloat elementHeight = CGRectGetHeight(elementBoundsInSuperView);
//Determine our center.x from the new x
if (xPosition < elementWidth/2) {
xPosition = elementWidth/2;
} else if (xPosition + elementWidth/2 > kscreenEditorSpace) {
xPosition = kscreenEditorSpace - elementWidth/2;
}
//Determine our center.y from the new y
if (yPosition < elementHeight/2) {
yPosition = elementHeight/2;
} else if (yPosition + elementHeight/2 > kscreenEditorSpace) {
yPosition = kscreenEditorSpace - elementHeight/2;
}
return (CGPointMake(xPosition, yPosition));
}
//--------------------------------------------------------------------------------------------------------
// calcOffEditorFromXposition: yPosition: fromBoundsInSuperView:
// Description: Determine if moving the element to x, y will it be off the editor screen
//
//--------------------------------------------------------------------------------------------------------
-(BOOL) calcOffEditorFromXposition: (CGFloat) xPosition yPosition:(CGFloat) yPosition fromBoundsInSuperView:(CGRect) elementBoundsInSuperView{
BOOL offScreen = NO;
// Get the Height/width based on SuperView Bounds
CGFloat elementWidth = CGRectGetWidth(elementBoundsInSuperView);
CGFloat elementHeight = CGRectGetHeight(elementBoundsInSuperView);
// Off Screen on Left
if (xPosition < elementWidth/2) {
offScreen = YES;
}
//Off Screen Right
if (xPosition + elementWidth/2 > kscreenEditorSpace) {
offScreen = YES;
}
// Off Screen Top
if (yPosition < elementHeight/2) {
offScreen = YES;
}
//Off Screen Bottom
if (yPosition + elementHeight/2 > kscreenEditorSpace) {
offScreen = YES;
}
return (offScreen);
}
