Drag separator to resize UIViews - ios

What would be the best way of implementing an interface consisting of UIViews separated by a line, where the line can be dragged to resize the views?
In its simplest form, it could look like this:
----------------
|              |
|    View A    |
|              |
|--------------|  < line which can be moved up and down, resizing the views
|              |
|    View B    |
|              |
----------------
It could have many more views.
My first thought would be to make the line a draggable UIView using touch handling, resizing the views according to its position, but I'm sure there must be a more elegant solution.

First, define a gesture recognizer that detects whether the touch started on a border and, as the gesture changes, moves the matched borders:
#import <UIKit/UIGestureRecognizerSubclass.h>

- (void)viewDidLoad
{
    [super viewDidLoad];

    // I use a long press gesture recognizer so it's recognized immediately
    UILongPressGestureRecognizer *gesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    gesture.minimumPressDuration = 0.0;
    gesture.allowableMovement = CGFLOAT_MAX;
    gesture.delegate = self;
    [self.containerView addGestureRecognizer:gesture];
}
- (void)handlePan:(UILongPressGestureRecognizer *)gesture
{
    static NSArray *matches;
    static CGPoint firstLocation;

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        firstLocation = [gesture locationInView:gesture.view];
        matches = [BorderBeingDragged findBordersBeingDraggedForView:gesture.view fromLocation:firstLocation];
        if (!matches)
        {
            gesture.state = UIGestureRecognizerStateFailed;
            return;
        }
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint location = [gesture locationInView:gesture.view];
        CGPoint translation = CGPointMake(location.x - firstLocation.x, location.y - firstLocation.y);
        [BorderBeingDragged dragBorders:matches translation:translation];
    }
}
// if your subviews are scroll views, you might need to tell the gesture recognizer
// to allow simultaneous gestures

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Second, define a BorderBeingDragged class that handles the detection of borders and the changing of their frames:
typedef enum NSInteger {
    kBorderTypeNone   = 0,
    kBorderTypeLeft   = 1 << 0,
    kBorderTypeRight  = 1 << 1,
    kBorderTypeTop    = 1 << 2,
    kBorderTypeBottom = 1 << 3
} BorderType;

@interface BorderBeingDragged : NSObject

@property (nonatomic, weak) UIView *view;
@property (nonatomic) BorderType borderTypes;
@property (nonatomic) CGRect originalFrame;

@end

static CGFloat const kTolerance = 15.0;

@implementation BorderBeingDragged
+ (NSArray *)findBordersBeingDraggedForView:(UIView *)view fromLocation:(CGPoint)point
{
    NSMutableArray *matches = nil;

    for (UIView *subview in view.subviews)
    {
        BorderType types = kBorderTypeNone;
        CGRect frame = subview.frame;

        // test top and bottom borders
        if (point.x >= (frame.origin.x - kTolerance) &&
            point.x <= (frame.origin.x + frame.size.width + kTolerance))
        {
            if (point.y >= (frame.origin.y - kTolerance) && point.y <= (frame.origin.y + kTolerance))
                types |= kBorderTypeTop;
            else if (point.y >= (frame.origin.y + frame.size.height - kTolerance) && point.y <= (frame.origin.y + frame.size.height + kTolerance))
                types |= kBorderTypeBottom;
        }

        // test left and right borders
        if (point.y >= (frame.origin.y - kTolerance) &&
            point.y <= (frame.origin.y + frame.size.height + kTolerance))
        {
            if (point.x >= (frame.origin.x - kTolerance) && point.x <= (frame.origin.x + kTolerance))
                types |= kBorderTypeLeft;
            else if (point.x >= (frame.origin.x + frame.size.width - kTolerance) && point.x <= (frame.origin.x + frame.size.width + kTolerance))
                types |= kBorderTypeRight;
        }

        // if we found any borders, add it to our array of matches
        if (types != kBorderTypeNone)
        {
            if (!matches)
                matches = [NSMutableArray array];

            BorderBeingDragged *object = [[BorderBeingDragged alloc] init];
            object.borderTypes = types;
            object.view = subview;
            object.originalFrame = frame;
            [matches addObject:object];
        }
    }

    return matches;
}

+ (void)dragBorders:(NSArray *)matches translation:(CGPoint)translation
{
    for (BorderBeingDragged *object in matches)
    {
        CGRect newFrame = object.originalFrame;

        if (object.borderTypes & kBorderTypeLeft)
        {
            newFrame.origin.x += translation.x;
            newFrame.size.width -= translation.x;
        }
        else if (object.borderTypes & kBorderTypeRight)
        {
            newFrame.size.width += translation.x;
        }

        if (object.borderTypes & kBorderTypeTop)
        {
            newFrame.origin.y += translation.y;
            newFrame.size.height -= translation.y;
        }
        else if (object.borderTypes & kBorderTypeBottom)
        {
            newFrame.size.height += translation.y;
        }

        object.view.frame = newFrame;
    }
}

@end

You do essentially need to make the line-view draggable, but it doesn't need to be complicated:
1. Put viewA and viewB into a containerView.
2. Add a pan gesture recognizer to the containerView, configured for a single touch, and set its delegate to your controller.
3. Implement gestureRecognizerShouldBegin: from the UIGestureRecognizerDelegate protocol and only allow the gesture to begin if the touch is in the vicinity of the line-view.
4. In the gesture handler, get the touch position in the containerView and set the line-view position and the frames of viewA and viewB accordingly (see the sketch below).
That's pretty much it.
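Here is a minimal sketch of those steps (hypothetical names: containerView, separatorView, viewA, and viewB are assumed to be properties on the controller, which is also the pan recognizer's delegate; the 20pt and 44pt values are arbitrary):
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    // only begin if the touch is within 20pt of the separator line
    CGPoint location = [gestureRecognizer locationInView:self.containerView];
    return fabs(location.y - self.separatorView.center.y) <= 20.0;
}

- (void)handleSeparatorPan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state != UIGestureRecognizerStateChanged) return;

    CGFloat y = [recognizer locationInView:self.containerView].y;
    CGRect bounds = self.containerView.bounds;

    // keep the separator a sensible distance from the container edges
    y = MAX(44.0, MIN(y, CGRectGetHeight(bounds) - 44.0));

    self.separatorView.center = CGPointMake(self.separatorView.center.x, y);
    self.viewA.frame = CGRectMake(0, 0, CGRectGetWidth(bounds), y);
    self.viewB.frame = CGRectMake(0, y, CGRectGetWidth(bounds), CGRectGetHeight(bounds) - y);
}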

I suggest a different behavior:
1. Press and hold on the line for about 2 seconds.
2. An image view appears, which you can then drag to move the line (see the sketch below).
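A minimal sketch of that idea (hypothetical names: lineView is the separator line and dragHandleImageView is a hidden image view shown while dragging; both are assumed to be outlets on the controller):
- (void)viewDidLoad
{
    [super viewDidLoad];
    UILongPressGestureRecognizer *hold = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleHold:)];
    hold.minimumPressDuration = 2.0; // press and hold for 2 seconds
    [self.lineView addGestureRecognizer:hold];
}

- (void)handleHold:(UILongPressGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:self.lineView.superview];
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // show the drag handle over the line
        self.dragHandleImageView.center = self.lineView.center;
        self.dragHandleImageView.hidden = NO;
    } else if (gesture.state == UIGestureRecognizerStateChanged) {
        // drag the handle (and the line) vertically with the touch
        self.dragHandleImageView.center = CGPointMake(self.dragHandleImageView.center.x, location.y);
        self.lineView.center = CGPointMake(self.lineView.center.x, location.y);
    } else {
        // gesture ended or was cancelled: hide the handle again
        self.dragHandleImageView.hidden = YES;
    }
}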

The simplest way is to add a pan gesture recognizer to your views and resize them according to the pan, as in the sketch below.
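For example, a minimal sketch that resizes only the height of whatever view the recognizer is attached to (sibling views would be adjusted in the same handler):
- (void)handleResizePan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state != UIGestureRecognizerStateChanged) return;

    CGPoint translation = [recognizer translationInView:recognizer.view.superview];
    CGRect frame = recognizer.view.frame;
    frame.size.height += translation.y; // grow or shrink with the vertical pan
    recognizer.view.frame = frame;

    // reset so the next callback reports an incremental translation
    [recognizer setTranslation:CGPointZero inView:recognizer.view.superview];
}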

Related

iOS, can UIPanGesture be used to do UIImageView corner drag for resizing?

I've searched for some time (days) for a solution, but none really does what I need (iOS, Objective-C, by the way).
I have a UIImageView that I resize with a UIPanGestureRecognizer. The typical pan works fine. It seems like I am so close.
But I want to resize the ImageView by dragging a corner of the image and only resizing dimensions relevant to the selected corner. It works great if I only do my "handleResize" UIPanGesture method. But if I pinch or rotate the image, the bounds or frame get messed up. I think I need some sort of CGAffineTransform but I have not been able to get it to work.
I need help to point me in the right direction. I've been working with CGAffineTransforms but I may be on the wrong track.
In my ViewController.h I have a float, touchRadius, set to 25:
float touchRadius = 25;
I have a UIPanGestureRecognizer in my ViewController.m:
- (IBAction)handleResize:(UIPanGestureRecognizer *)recognizer {
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:self.view];
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged) {
CGRect frame = recognizer.view.frame;
CGRect topLeft = CGRectMake(frame.origin.x,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomLeft = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect topRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
Boolean useNewFrame = YES;
CGRect newFrame = frame;
if (CGRectContainsPoint(topLeft, touch)) {
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(topRight, touch)) {
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(bottomLeft, touch)) {
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
newFrame.size.height += translation.y;
recognizer.view.frame = newFrame;
} else if (CGRectContainsPoint(bottomRight, touch)) {
newFrame.size.width += translation.x;
newFrame.size.height += translation.y;
recognizer.view.frame = newFrame;
} else {
useNewFrame = NO;
}
if (useNewFrame) {
// make sure it doesn't go too small to touch
if (newFrame.size.width < touchRadius)
newFrame.size.width = touchRadius;
if (newFrame.size.height < touchRadius)
newFrame.size.height = touchRadius;
recognizer.view.frame = newFrame;
// I THINK I NEED A TRANSFORM HERE
} else {
// use the fallback translate
[recognizer.view setTransform:CGAffineTransformTranslate(recognizer.view.transform, translation.x, translation.y)];
}
}
[recognizer setTranslation:CGPointZero inView:self.view];
}
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged)
{
// make sure it stays visible
float scale = recognizer.scale;
if (recognizer.view.frame.size.width * scale > touchRadius * 2 ||
recognizer.view.frame.size.height * scale > touchRadius * 2) {
[recognizer.view setTransform:CGAffineTransformScale(recognizer.view.transform, scale, scale)];
recognizer.scale = 1;
}
}
}
- (void)handleRotate:(UIRotationGestureRecognizer *)recognizer {
UIGestureRecognizerState state = [recognizer state];
if (state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged)
{
CGFloat rotation = [recognizer rotation];
[recognizer.view setTransform:CGAffineTransformRotate(recognizer.view.transform, rotation)];
}
[recognizer setRotation:0];
}
Do I need a transform after I modify the view frame? Or should I pursue another path?
I'm trying to do this natively without using libs from others so I understand it. This simplistic example will be part of a larger project.
After some research I figured out how to modify the transform for a corner/side dragging pan, to resize an image. Here are the key elements to make it work:
- Save the CGAffineTransform in each gesture (pan, pinch, rotate) when the recognizer state is UIGestureRecognizerStateBegan
- Apply new CGAffineTransforms on top of the saved initial transform in each gesture (pan, pinch, rotate)
- Save the corner/side detected in the UIGestureRecognizerStateBegan state
- Clear the corner/side detected in the UIGestureRecognizerStateEnded state
- For the pan gesture, adjust the translation x/y values based on the corner/side detected
- Make sure the touch radius is large enough to be useful (24 was too small, 48 works well)
The transform worked like this:
// pan the image
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, tx, ty);
if (scaleIt) {
// the origin or size changed
recognizer.view.frame = newFrame;
}
The tx and ty values are the defaults returned from the recognizer when the pan starts from the center of the image. But if the user's touch is near a corner or side of the view frame, tx/ty and the frame origin are adjusted so that it appears as if that corner or side is being dragged to resize the view.
For example:
CGRect newFrame = recognizer.view.frame;
if (currentDragType == DRAG_TOPLEFT) {
tx = -translation.x;
ty = -translation.y;
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_TOPRIGHT) {
tx = translation.x;
ty = -translation.y;
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
}
This makes the top left corner or top right corner move in or out according to how far the touch moved. Unlike a center pan, where the whole view moves along with the touch, the opposite corner (or side) remains fixed.
There were 2 problems I did not resolve:
if the image is rotated significantly, the corner/side detection (pan) does not work because I did not check for the touch in the rotated coordinate system (but the center pan still works fine)
once the image is rotated, pinch gestures work erratically and can resize an image to zero, making it invisible
I created a simple demo and uploaded it to GitHub: https://github.com/ByteSlinger/ImageGestureDemo
Yes, I know that link could go away someday, so here's the code:
ViewController.h
//
// ViewController.h
// ImageGestureDemo
//
// Created by ByteSlinger on 6/21/18.
// Copyright © 2018 ByteSlinger. All rights reserved.
//
#import <UIKit/UIKit.h>
NSString *APP_TITLE = @"Image Gesture Demo";
NSString *INTRO_ALERT = @"\nDrag, Pinch and Rotate the Image!"
"\n\nYou can also Drag, Pinch and Rotate the background image."
"\n\nDouble tap an image to reset it";
float touchRadius = 48; // max distance from corners to touch point
typedef NS_ENUM(NSInteger, DragType) {
DRAG_OFF,
DRAG_ON,
DRAG_CENTER,
DRAG_TOP,
DRAG_BOTTOM,
DRAG_LEFT,
DRAG_RIGHT,
DRAG_TOPLEFT,
DRAG_TOPRIGHT,
DRAG_BOTTOMLEFT,
DRAG_BOTTOMRIGHT
};
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
//callback to process gesture events
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer;
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer;
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer;
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer;
@end
ViewController.m
//
// ViewController.m
// ImageGestureDemo
//
// This is a DEMO. It shows how to pan, pinch, rotate and drag/resize a UIImageView.
//
// There is a background image and a foreground image. Both images can be
// panned, pinched and rotated, but only the foreground image can be resized
// by dragging one of its corners or its sides.
//
// NOTE: Sure, much of this code could have been put into a subclass of UIView
// or UIImageView. But for simplicity and reference sake, all code and
// methods are in one place, this ViewController subclass. There is no
// error checking at all. App tested on an iPhone 6+ and an iPad gen3.
//
// Features:
// - allows an image to be resized with pan gesture by dragging corners and sides
// - background image can be modified (pan, pinch, rotate)
// - foreground image can be modified (pan, pinch, rotate, drag/resize)
// - all image manipulation done within gestures linked from storyboard
// - all finger touches on screen show with yellow circles
// - when dragging, the touch circles turn to red (so you know when gestures start)
// - double tap on foreground image resets it to original size and rotation
// - double tap on background resets it and also resets the foreground image
// - screen and image touch and size info displayed on screen
// - uses CGAffineTransform objects for image manipulation
// - uses UIGestureRecognizerStateBegan in gestures to save transforms (the secret sauce...)
//
// Known Issues:
// - when the image is rotated, determining if a touch is on a corner or side
// does not work for large rotations. Need to check touch points against
// non rotated view frame and adjust accordingly.
// - after rotations, pinch and resize can shrink image to invisibility despite
// code attempts to prevent it.
//
// Created by ByteSlinger on 6/21/18.
// Copyright © 2018 ByteSlinger. All rights reserved.
//
#import "ViewController.h"
@interface ViewController ()
@property (strong, nonatomic) IBOutlet UIImageView *backgroundImageView;
@property (strong, nonatomic) IBOutlet UIImageView *foregroundImageView;
@property (strong, nonatomic) IBOutlet UILabel *screenInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *touchInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *imageInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *backgroundInfoLabel;
@property (strong, nonatomic) IBOutlet UILabel *changeInfoLabel;
@property (strong, nonatomic) IBOutlet UITapGestureRecognizer *backgroundTapGesture;
@property (strong, nonatomic) IBOutlet UITapGestureRecognizer *foregroundTapGesture;
@end
@implementation ViewController
CGRect originalImageFrame;
CGRect originalBackgroundFrame;
CGAffineTransform originalImageTransform;
CGAffineTransform originalBackgroundTransform;
NSMutableArray* touchCircles = nil;
DragType currentDragType = DRAG_OFF;
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// set this to whatever your desired touch radius is
touchRadius = 48;
// In Storyboard this must have set to 1, then this seems to work ok
// when setting the double tap here
_foregroundTapGesture.numberOfTapsRequired = 2;
_backgroundTapGesture.numberOfTapsRequired = 2;
[self centerImageView:_foregroundImageView];
originalImageFrame = _foregroundImageView.frame;
originalBackgroundFrame = _backgroundImageView.frame;
originalImageTransform = _foregroundImageView.transform;
originalBackgroundTransform = _backgroundImageView.transform;
_backgroundImageView.contentMode = UIViewContentModeCenter;
_foregroundImageView.contentMode = UIViewContentModeScaleToFill; // allow stretch
[_backgroundImageView setUserInteractionEnabled:YES];
[_backgroundImageView setMultipleTouchEnabled:YES];
[_foregroundImageView setUserInteractionEnabled:YES];
[_foregroundImageView setMultipleTouchEnabled:YES];
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter]
addObserver:self selector:@selector(orientationChanged:)
name:UIDeviceOrientationDidChangeNotification
object:[UIDevice currentDevice]];
[_touchInfoLabel setText:nil];
[_changeInfoLabel setText:nil];
[_imageInfoLabel setText:nil];
[_backgroundInfoLabel setText:nil];
touchCircles = [[NSMutableArray alloc] init];
}
- (void)viewDidAppear:(BOOL)animated {
[self alert:APP_TITLE :INTRO_ALERT];
}
- (void) orientationChanged:(NSNotification *)note
{
UIDevice * device = note.object;
switch(device.orientation)
{
case UIDeviceOrientationPortrait:
/* start special animation */
break;
case UIDeviceOrientationPortraitUpsideDown:
/* start special animation */
break;
default:
break;
};
[_screenInfoLabel setText:[NSString stringWithFormat:@"Screen: %.0f/%.0f",
self.view.frame.size.width,self.view.frame.size.height]];
}
//
// Update the info labels from the passed objects
//
- (void) updateInfo:(UIView *)imageView touch:(CGPoint)touch change:(CGPoint)change {
NSString *label;
UILabel *infoLabel;
if (imageView == _foregroundImageView) {
label = #"Image: %0.f/%0.f, %0.f/%0.f";
infoLabel = _imageInfoLabel;
} else {
label = #"Background: %0.f/%0.f, %0.f/%0.f";
infoLabel = _backgroundInfoLabel;
}
[infoLabel setText:[NSString stringWithFormat:label,
imageView.layer.frame.origin.x,
imageView.layer.frame.origin.y,
imageView.layer.frame.size.width,
imageView.layer.frame.size.height]];
[_touchInfoLabel setText:[NSString stringWithFormat:@"Touch: %0.f/%.0f",
touch.x,touch.y]];
[_changeInfoLabel setText:[NSString stringWithFormat:@"Change: %0.f/%.0f",
change.x,change.y]];
}
//
// Center the passed image frame within its bounds
//
- (void)centerImageView:(UIImageView *)imageView {
CGSize boundsSize = self.view.bounds.size;
CGRect frameToCenter = imageView.frame;
// center horizontally
if (frameToCenter.size.width < boundsSize.width)
frameToCenter.origin.x = (boundsSize.width - frameToCenter.size.width) / 2;
else
frameToCenter.origin.x = 0;
// center vertically
if (frameToCenter.size.height < boundsSize.height)
frameToCenter.origin.y = (boundsSize.height - frameToCenter.size.height) / 2;
else
frameToCenter.origin.y = 0;
imageView.frame = frameToCenter;
}
//
// Remove all touch circles
//
- (void)removeTouchCircles {
[touchCircles makeObjectsPerformSelector: @selector(removeFromSuperview)];
[touchCircles removeAllObjects];
}
//
// Draw a circle around the passed point where the user has touched the screen
//
- (void)drawTouchCircle:(UIView *)view fromCenter:(CGPoint)point ofRadius:(float)radius {
CGRect frame = CGRectMake(point.x - view.frame.origin.x - radius,
point.y - view.frame.origin.y - radius,
radius * 2, radius * 2);
UIView *circle = [[UIView alloc] initWithFrame:frame];
circle.alpha = 0.5;
circle.layer.cornerRadius = radius;
circle.backgroundColor = currentDragType == DRAG_OFF ? [UIColor yellowColor] : [UIColor redColor];
[circle.layer setBorderWidth:1.0];
[circle.layer setBorderColor:[[UIColor blackColor]CGColor]];
[view addSubview:circle];
[touchCircles addObject:circle];
}
//
// Draw a touch circle for the passed user touch
//
- (void)handleTouchEvent:(UIView *) view
atPoint:(CGPoint) point
forState:(UIGestureRecognizerState) state
clear:(Boolean) clear {
//NSLog(#"handleTouchEvent");
if (clear) {
[self removeTouchCircles];
}
if (state == UIGestureRecognizerStateEnded) {
[self removeTouchCircles];
} else {
[self drawTouchCircle:self.view fromCenter:point ofRadius:touchRadius];
}
[_touchInfoLabel setText:[NSString stringWithFormat:@"Touch: %0.f/%.0f",
point.x,point.y]];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesBegan");
[self removeTouchCircles];
NSSet *allTouches = [event allTouches];
NSArray *allObjects = [allTouches allObjects];
for (int i = 0;i < [allObjects count];i++)
{
UITouch *touch = [allObjects objectAtIndex:i];
CGPoint location = [touch locationInView: self.view];
[self handleTouchEvent:touch.view atPoint:location forState:UIGestureRecognizerStateBegan clear:NO];
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesMoved");
[self removeTouchCircles];
NSSet *allTouches = [event allTouches];
NSArray *allObjects = [allTouches allObjects];
for (int i = 0;i < [allObjects count];i++)
{
UITouch *touch = [allObjects objectAtIndex:i];
CGPoint location = [touch locationInView: self.view];
[self handleTouchEvent:touch.view atPoint:location forState:UIGestureRecognizerStateChanged clear:NO];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesEnded");
[self removeTouchCircles];
}
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
//NSLog(#"touchesCancelled");
[self removeTouchCircles];
}
//
// Double tap resets passed image. If background image, also reset foreground image.
//
- (IBAction)handleDoubleTap:(UITapGestureRecognizer *)recognizer {
CGPoint touch = [recognizer locationInView: self.view];
if (recognizer.state == UIGestureRecognizerStateBegan ||
recognizer.state == UIGestureRecognizerStateChanged) {
[self handleTouchEvent:recognizer.view atPoint:touch forState:recognizer.state clear:NO];
} else {
[self removeTouchCircles];
}
[self alert:#"Reset" :#"The Image has been Reset!"];
CGRect frame = originalImageFrame;
CGAffineTransform transform = originalImageTransform;
if (recognizer.view == _backgroundImageView) {
_foregroundImageView.transform = transform;
_foregroundImageView.frame = frame;
[self updateInfo:_foregroundImageView touch:touch change:CGPointZero];
frame = originalBackgroundFrame;
transform = originalBackgroundTransform;
}
recognizer.view.transform = transform;
recognizer.view.frame = frame;
[self updateInfo:recognizer.view touch:touch change:CGPointZero];
}
- (void) setDragType:(CGRect)frame withTouch:(CGPoint)touch {
// the corners and sides of the current view frame
CGRect topLeft = CGRectMake(frame.origin.x,frame.origin.y,
touchRadius, touchRadius);
CGRect bottomLeft = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect topRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, touchRadius);
CGRect bottomRight = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y + frame.size.height - touchRadius,
touchRadius, touchRadius);
CGRect leftSide = CGRectMake(frame.origin.x,frame.origin.y,
touchRadius, frame.size.height);
CGRect rightSide = CGRectMake(frame.origin.x + frame.size.width - touchRadius,
frame.origin.y,
touchRadius, frame.size.height);
CGRect topSide = CGRectMake(frame.origin.x,frame.origin.y,
frame.size.width, touchRadius);
CGRect bottomSide = CGRectMake(frame.origin.x,
frame.origin.y + frame.size.height - touchRadius,
frame.size.width, touchRadius);
if (CGRectContainsPoint(topLeft, touch)) {
currentDragType = DRAG_TOPLEFT;
} else if (CGRectContainsPoint(topRight, touch)) {
currentDragType = DRAG_TOPRIGHT;
} else if (CGRectContainsPoint(bottomLeft, touch)) {
currentDragType = DRAG_BOTTOMLEFT;
} else if (CGRectContainsPoint(bottomRight, touch)) {
currentDragType = DRAG_BOTTOMRIGHT;
} else if (CGRectContainsPoint(topSide, touch)) {
currentDragType = DRAG_TOP;
} else if (CGRectContainsPoint(bottomSide, touch)) {
currentDragType = DRAG_BOTTOM;
} else if (CGRectContainsPoint(leftSide, touch)) {
currentDragType = DRAG_LEFT;
} else if (CGRectContainsPoint(rightSide, touch)) {
currentDragType = DRAG_RIGHT;
} else if (CGRectContainsPoint(frame, touch)) {
currentDragType = DRAG_CENTER;
} else {
currentDragType = DRAG_OFF; // touch point is not in the view frame
}
}
//
// Return the unrotated size of the view
//
- (CGSize) getActualSize:(UIView *)view {
CGSize result;
//CGSize originalSize = view.frame.size;
CGAffineTransform originalTransform = view.transform;
float rotation = atan2f(view.transform.b, view.transform.a);
// reverse rotation of current transform
CGAffineTransform unrotated = CGAffineTransformRotate(view.transform, -rotation);
view.transform = unrotated;
// get the size of the "unrotated" view
result = view.frame.size;
// reset back to what it was
view.transform = originalTransform;
//NSLog(#"Size current = %0.f/%0.f, rotation = %0.2f, unrotated = %0.f/%0.f",
// originalSize.width,originalSize.height,
// rotation,
// result.width,result.height);
return result;
}
//
// Resize or Pan an image on the ViewController View
//
- (IBAction)handleResize:(UIPanGestureRecognizer *)recognizer {
static CGRect initialFrame;
static CGAffineTransform initialTransform;
static Boolean scaleIt = YES;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:recognizer.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialFrame = recognizer.view.frame;
initialTransform = recognizer.view.transform;
[self setDragType:recognizer.view.frame withTouch:touch];
scaleIt = YES;
}
if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
scaleIt = NO;
[self getActualSize:recognizer.view];
} else {
// our new view frame - start with the initial one
CGRect newFrame = initialFrame;
// adjust the translation point according to where the user touched the image
float tx = translation.x;
float ty = translation.y;
// resize by dragging a corner or a side
if (currentDragType == DRAG_TOPLEFT) {
tx = -translation.x;
ty = -translation.y;
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
newFrame.size.width -= translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_TOPRIGHT) {
ty = -translation.y;
newFrame.origin.y += translation.y;
newFrame.size.width += translation.x;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_BOTTOMLEFT) {
tx = -translation.x;
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_BOTTOMRIGHT) {
// origin does not change
newFrame.size.width += translation.x;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_TOP) {
tx = 0;
newFrame.origin.y += translation.y;
newFrame.size.height -= translation.y;
} else if (currentDragType == DRAG_BOTTOM) {
tx = 0;
newFrame.size.height += translation.y;
} else if (currentDragType == DRAG_LEFT) {
tx = -translation.x;
ty = 0;
newFrame.origin.x += translation.x;
newFrame.size.width -= translation.x;
} else if (currentDragType == DRAG_RIGHT) {
ty = 0;
newFrame.size.width += translation.x;
} else { //if (currentDragType == DRAG_CENTER) {
newFrame.origin.x += translation.x;
newFrame.origin.y += translation.y;
scaleIt = NO; // normal pan
}
// get the unrotated size of the view
CGSize actualSize = [self getActualSize:recognizer.view];
// make sure we can still touch the image
if (actualSize.width < touchRadius * 2) {
newFrame.size.width += touchRadius * 2;
tx = 0; // stop resizing
}
if (actualSize.height < touchRadius * 2) {
newFrame.size.height += touchRadius * 2;
ty = 0; // stop resizing
}
// pan the image
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, tx, ty);
if (scaleIt) {
// the origin or size changed
recognizer.view.frame = newFrame;
}
}
[self updateInfo:recognizer.view touch:touch change:translation];
}
//
// Pan an image on the ViewController View
//
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint translation = [recognizer translationInView:recognizer.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialTransform = recognizer.view.transform;
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
recognizer.view.transform = CGAffineTransformTranslate(initialTransform, translation.x, translation.y);
[self updateInfo:recognizer.view touch:touch change:translation];
}
//
// Pinch (resize) an image on the ViewController View
//
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
static CGSize initialSize;
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
//get the translation amount in x,y
CGPoint change = CGPointMake(recognizer.view.transform.tx, recognizer.view.transform.ty);
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialSize = recognizer.view.frame.size;
initialTransform = recognizer.view.transform;
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
// make sure it stays visible
float scale = recognizer.scale;
float newWidth = initialSize.width * scale;
float newHeight = initialSize.height * scale;
// make sure we can still touch it
if (newWidth > touchRadius * 2 && newHeight > touchRadius * 2) {
// scale the image
recognizer.view.transform = CGAffineTransformScale(initialTransform, scale, scale);
}
[self updateInfo:recognizer.view touch:touch change:change];
}
//
// Rotate an image on the ViewController View
//
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer {
static CGFloat initialRotation;
static CGAffineTransform initialTransform;
// where the user has touched down
CGPoint touch = [recognizer locationInView: self.view];
if (recognizer.state == UIGestureRecognizerStateBegan)
{
initialTransform = recognizer.view.transform;
initialRotation = atan2f(recognizer.view.transform.b, recognizer.view.transform.a);
currentDragType = DRAG_ON;
} else if (recognizer.state == UIGestureRecognizerStateEnded) {
currentDragType = DRAG_OFF;
[self getActualSize:recognizer.view];
}
recognizer.view.transform = CGAffineTransformRotate(initialTransform, recognizer.rotation);
[self updateInfo:recognizer.view touch:touch change:CGPointMake(initialRotation, recognizer.rotation)];
}
//
// Prevent simultaneous gestures so my transforms don't get funky
// (may not be necessary ... )
//
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
return NO;
}
//
// Spew a message to the user
//
- (void)alert:(NSString *) title :(NSString *)message {
UIAlertController *alert = [UIAlertController
alertControllerWithTitle: title
message: message
preferredStyle:UIAlertControllerStyleAlert];
UIAlertAction *okButton = [UIAlertAction
actionWithTitle:#"Ok"
style:UIAlertActionStyleDefault
handler:^(UIAlertAction * action) {
}];
[alert addAction:okButton];
[self presentViewController:alert animated:YES completion:nil];
}
@end

Add boundaries to a UIPanGesture

I have a UILabel in a subview, and I have a panGesture and pinchGesture on the UILabel. As of right now I can move the UILabel across all views. I want this UILabel to stay within the area of the subview. How would I accomplish this?
- (void)handlePanGesture:(UIPanGestureRecognizer *)panGesture {
CGPoint translation = [panGesture translationInView:panGesture.view.superview];
if (UIGestureRecognizerStateBegan == panGesture.state ||UIGestureRecognizerStateChanged == panGesture.state) {
panGesture.view.center = CGPointMake(panGesture.view.center.x + translation.x,
panGesture.view.center.y + translation.y);
[panGesture setTranslation:CGPointZero inView:self.view];
}
}
In this line,
CGPoint translation = [panGesture translationInView:panGesture.view.superview];
It is using the superview, and I am trying to constrain the translation to my subview, but I can't seem to figure it out.
Here is my code to handle a draggable button and restrict it to the main view's boundary.
I hope this code will help you:
CGPoint translation = [recognizer translationInView:self.view];
CGRect recognizerFrame = recognizer.view.frame;
recognizerFrame.origin.x += translation.x;
recognizerFrame.origin.y += translation.y;
// Check if UIImageView is completely inside its superView
if (CGRectContainsRect(self.view.bounds, recognizerFrame)) {
recognizer.view.frame = recognizerFrame;
}
// Else check if UIImageView is vertically and/or horizontally outside of its
// superView. If yes, then set UImageView's frame accordingly.
// This is required so that when user pans rapidly then it provides smooth translation.
else {
// Check vertically
if (recognizerFrame.origin.y < self.view.bounds.origin.y) {
recognizerFrame.origin.y = 0;
}
else if (recognizerFrame.origin.y + recognizerFrame.size.height > self.view.bounds.size.height) {
recognizerFrame.origin.y = self.view.bounds.size.height - recognizerFrame.size.height;
}
// Check horizontally
if (recognizerFrame.origin.x < self.view.bounds.origin.x) {
recognizerFrame.origin.x = 0;
}
else if (recognizerFrame.origin.x + recognizerFrame.size.width > self.view.bounds.size.width) {
recognizerFrame.origin.x = self.view.bounds.size.width - recognizerFrame.size.width;
}
}
// Reset translation so that on next pan recognition
// we get correct translation value
[recognizer setTranslation:CGPointZero inView:self.view];

UIPanGestureRecognizer calculate frame origin to maintain adjacent subview position for image cropping

I have a cropView that holds a cropBoxView subview that also holds four subviews that are square and each have a UIPanGestureRecognizer on them to enable resizing of the crop area.
What I'm trying to do is change the frame size but maintain the adjacent square corner's position, which means I need to calculate a new origin. I'm able to successfully change the frame size, but I can't figure out how to calculate the new origin.
Currently if I pan the bottom right corner of the view it works the way I want (without needing to adjust the origin in the code below), because the adjacent corner is the top left corner so its origin doesn't need to change.
I'd appreciate any help offered.
Edit: See my answer below for a sample GIF of the result and code
CGPoint translation = [recognizer translationInView:self.cropView];
CGRect recognizerFrame = self.cropView.cropBoxView.frame;
// Todo: calculate new origin based on adjacent crop corner
CGFloat testX = recognizerFrame.size.width += translation.x;
CGFloat testY = recognizerFrame.size.height += translation.y;
recognizerFrame.origin.x = recognizerFrame.origin.x - (recognizerFrame.size.width - testX);
recognizerFrame.origin.y = recognizerFrame.origin.y - (recognizerFrame.size.height - testY);
recognizerFrame.size.width += translation.x;
recognizerFrame.size.height += translation.y;
[recognizer setTranslation:CGPointZero inView:self.cropView];
I figured this out on my own, and it seems to work quite nicely.
Result: (sample GIF of the crop box resizing not included here)
What I did was subclass UIPanGestureRecognizer and set an enum value, when the gesture's shouldReceiveTouch delegate method is called, to record which corner was touched in the cropBoxView. So now, instead of having a separate UIPanGestureRecognizer for each corner, I have just one for all four corners.
Code:
CropBoxCornerPanGestureRecognizer.h
#import <UIKit/UIGestureRecognizerSubclass.h>
#import <UIKit/UIPanGestureRecognizer.h>
typedef NS_ENUM(NSUInteger, corner)
{
TopLeftCorner = 1,
TopRightCorner,
BottomLeftCorner,
BottomRightCorner
};
@interface CropBoxCornerPanGestureRecognizer : UIPanGestureRecognizer
@property (nonatomic, assign) NSUInteger corner;
@end
CropBoxCornerPanGestureRecognizer.m
#import "CropBoxCornerPanGestureRecognizer.h"
@interface CropBoxCornerPanGestureRecognizer ()
@end
@implementation CropBoxCornerPanGestureRecognizer
@end
ViewController.h:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
@end
ViewController.m:
@interface ViewController ()
@property (nonatomic, strong) CropBoxCornerPanGestureRecognizer *cropBoxCornerPanRecognizer;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
self.cropBoxCornerPanRecognizer = [[CropBoxCornerPanGestureRecognizer alloc]init];
self.cropBoxCornerPanRecognizer.maximumNumberOfTouches = 1;
self.cropBoxCornerPanRecognizer.delaysTouchesBegan = NO;
self.cropBoxCornerPanRecognizer.delaysTouchesEnded = NO;
self.cropBoxCornerPanRecognizer.cancelsTouchesInView = NO;
[self.cropBoxCornerPanRecognizer addTarget:self action:@selector(panCropBoxCorner:)];
self.cropBoxCornerPanRecognizer.delegate = self;
[self.cropView.cropBoxView addGestureRecognizer:self.cropBoxCornerPanRecognizer];
}
- (void)panCropBoxCorner:(CropBoxCornerPanGestureRecognizer *)recognizer
{
if (recognizer.state == UIGestureRecognizerStateBegan || recognizer.state == UIGestureRecognizerStateChanged)
{
CGPoint translation = [recognizer translationInView:self.cropView];
CGRect recognizerFrame = self.cropView.cropBoxView.frame;
if (recognizer.corner == TopLeftCorner)
{
recognizerFrame.size.width -= translation.x;
recognizerFrame.size.height -= translation.y;
recognizerFrame.origin.x += translation.x;
recognizerFrame.origin.y += translation.y;
}
else if (recognizer.corner == TopRightCorner)
{
recognizerFrame.size.width += translation.x;
recognizerFrame.size.height -= translation.y;
recognizerFrame.origin.y += translation.y;
}
else if (recognizer.corner == BottomLeftCorner)
{
recognizerFrame.size.width -= translation.x;
recognizerFrame.size.height += translation.y;
recognizerFrame.origin.x += translation.x;
}
else if (recognizer.corner == BottomRightCorner)
{
recognizerFrame.size.width += translation.x;
recognizerFrame.size.height += translation.y;
}
CGFloat minFrameSize = 40.0;
CGFloat maxFrameWidth = self.cropView.frame.size.width;
CGFloat maxFrameHeight = self.cropView.frame.size.height;
if (recognizerFrame.size.width < minFrameSize)
{
recognizerFrame.size = CGSizeMake(minFrameSize, recognizerFrame.size.height);
}
if (recognizerFrame.size.height < minFrameSize)
{
recognizerFrame.size = CGSizeMake(recognizerFrame.size.width, minFrameSize);
}
if (recognizerFrame.size.width > maxFrameWidth)
{
recognizerFrame.size = CGSizeMake(maxFrameWidth, recognizerFrame.size.height);
}
if (recognizerFrame.size.height > maxFrameHeight)
{
recognizerFrame.size = CGSizeMake(recognizerFrame.size.width, maxFrameHeight);
}
[recognizer setTranslation:CGPointZero inView:self.cropView];
}
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
if (gestureRecognizer == self.cropBoxCornerPanRecognizer)
{
CropBoxCornerPanGestureRecognizer *recognizer = (CropBoxCornerPanGestureRecognizer *)gestureRecognizer;
if (CGRectContainsPoint(self.cropView.cropBoxView.topLeftCorner.frame, [touch locationInView:self.cropView.cropBoxView]))
{
recognizer.corner = TopLeftCorner;
return YES;
}
if (CGRectContainsPoint(self.cropView.cropBoxView.topRightCorner.frame, [touch locationInView:self.cropView.cropBoxView]))
{
recognizer.corner = TopRightCorner;
return YES;
}
if (CGRectContainsPoint(self.cropView.cropBoxView.bottomLeftCorner.frame, [touch locationInView:self.cropView.cropBoxView]))
{
recognizer.corner = BottomLeftCorner;
return YES;
}
if (CGRectContainsPoint(self.cropView.cropBoxView.bottomRightCorner.frame, [touch locationInView:self.cropView.cropBoxView]))
{
recognizer.corner = BottomRightCorner;
return YES;
}
return NO;
}
return YES;
}

Pinching/Panning a CCNode in Cocos2d 3.0

I want to zoom a CCNode in and out by pinching and panning the screen. The node has a background which is very large, but only a portion of it is shown on the screen. That node also contains other sprites.
What I have done so far is first register a UIPinchGestureRecognizer:
UIPinchGestureRecognizer * pinchRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinchFrom:)];
[[[CCDirector sharedDirector] view] addGestureRecognizer: pinchRecognizer];
-(void)handlePinchFrom:(UIPinchGestureRecognizer *) pinch
{
if(pinch.state == UIGestureRecognizerStateEnded) {
prevScale = 1;
}
else {
CGFloat dscale = [self scale] - prevScale + pinch.scale;
if(dscale > 0)
{
deltaScale = dscale;
}
CGAffineTransform transform = CGAffineTransformScale(pinch.view.transform, deltaScale, deltaScale);
[pinch.view setTransform: transform];
// [_contentNode setScale:deltaScale];
prevScale = pinch.scale;
}
}
The problem is that it scales the whole UIView, not the CCNode. I have also tried setting the scale of my _contentNode.
EDIT: I have also tried this:
- (void)handlePinchGesture:(UIPinchGestureRecognizer*)pinch
{
if (pinch.state == UIGestureRecognizerStateBegan || pinch.state == UIGestureRecognizerStateChanged) {
CGPoint midpoint = [pinch locationInView:[CCDirector sharedDirector].view];
CGSize winSize = [CCDirector sharedDirector].viewSize;
float x = midpoint.x/winSize.width;
float y = midpoint.y/winSize.height;
_contentNode.anchorPoint = CGPointMake(x, y);
float scale = [pinch scale];
_contentNode.scale *= scale;
pinch.scale = 1;
}
}
But it zooms from the bottom left of the screen.
I had the same problem. I use a CCScrollView that contains a CCNode larger than the device screen. I want to scroll and zoom it, but the node shouldn't scroll out of the screen or scale smaller than the screen. So I created my own subclass of CCScrollView, where I handle the pinch. It has some minor glitches, but on the whole it works fine.
When the pinch begins, I set the anchor point of my node to the pinch center in node space. Then I need to change the position of my node proportionally to the shift of the anchor point, so that moving the anchor point doesn't change the node's location on screen:
- (void)handlePinch:(UIPinchGestureRecognizer*)recognizer
{
if (recognizer.state == UIGestureRecognizerStateEnded) {
_previousScale = self.contentNode.scale;
}
else if (recognizer.state == UIGestureRecognizerStateBegan) {
float X = [recognizer locationInNode:self.contentNode].x / self.contentNode.contentSize.width;
float Y = [recognizer locationInNode:self.contentNode].y / self.contentNode.contentSize.height;
float positionX = self.contentNode.position.x + self.contentNode.boundingBox.size.width * (X - self.contentNode.anchorPoint.x);
float positionY = self.contentNode.position.y + self.contentNode.boundingBox.size.height * (Y - self.contentNode.anchorPoint.y);
self.contentNode.anchorPoint = ccp(X, Y);
self.contentNode.position = ccp(positionX, positionY);
}
else {
CGFloat scale = _previousScale * recognizer.scale;
if (scale >= maxScale) {
self.contentNode.scale = maxScale;
}
else if (scale <= [self minScale]) {
self.contentNode.scale = [self minScale];
}
else {
self.contentNode.scale = scale;
}
}
}
I also need to change the CCScrollView min and max scroll, so my node never scrolls out of view. The default anchor point is (0,1), so I need to shift the min and max scroll proportionally to the new anchor point:
- (float) maxScrollX
{
if (!self.contentNode) return 0;
float maxScroll = self.contentNode.boundingBox.size.width - self.contentSizeInPoints.width;
if (maxScroll < 0) maxScroll = 0;
return maxScroll - self.contentNode.boundingBox.size.width * self.contentNode.anchorPoint.x;
}
- (float) maxScrollY
{
if (!self.contentNode) return 0;
float maxScroll = self.contentNode.boundingBox.size.height - self.contentSizeInPoints.height;
if (maxScroll < 0) maxScroll = 0;
return maxScroll - self.contentNode.boundingBox.size.height * (1 - self.contentNode.anchorPoint.y);
}
- (float) minScrollX
{
float minScroll = [super minScrollX];
return minScroll - self.contentNode.boundingBox.size.width * self.contentNode.anchorPoint.x;
}
- (float) minScrollY
{
float minScroll = [super minScrollY];
return minScroll - self.contentNode.boundingBox.size.height * (1 - self.contentNode.anchorPoint.y);
}
UIGestureRecognizer doesn't have a locationInNode: method, so I added one via a category. It just returns the touch location in node space:
#import "UIGestureRecognizer+locationInNode.h"
@implementation UIGestureRecognizer (locationInNode)
- (CGPoint) locationInNode:(CCNode*) node
{
CCDirector* dir = [CCDirector sharedDirector];
CGPoint touchLocation = [self locationInView: [self view]];
touchLocation = [dir convertToGL: touchLocation];
return [node convertToNodeSpace:touchLocation];
}
- (CGPoint) locationInWorld
{
CCDirector* dir = [CCDirector sharedDirector];
CGPoint touchLocation = [self locationInView: [self view]];
return [dir convertToGL: touchLocation];
}
@end

UIScrollView Adjust Bounciness

Is it possible to adjust the amount of bounce in a UIScrollView?
For example, if dragging the scroll view 100pt past the end of the content moves the scroll view by 50pt, I'd like to be able to reduce the distance travelled by the same drag to 20pt.
I have experimented with a few things without success.
Adjusting the contentOffset in the scrollViewDidScroll: delegate method does not seem possible because you lose the original scrolling position when the value is set and the content offset approaches zero.
Adding a transform which translates the scroll view in reverse by some fraction of the content offset seems like a good idea, but it breaks all the touch tracking and causes a mess of problems. Additionally, in many other situations, moving the scroll view would break the layout of a view.
One thing you can do is to make your own scroll view with a UIView + UIPanGestureRecognizer.
(Kind of overkill, depending on how badly you want this custom effect.)
Here's a good explanation of Apple's rubber band algorithm.
The core of it is:
b = (1.0 - (1.0 / ((x * c / d) + 1.0))) * d
where:
x = distance from the edge
c = constant value, UIScrollView uses 0.55
d = dimension, either width or height
You can then play with the values of c and x to get the effect you desire.
Some code snippet to get you started:
- (IBAction)handlePan:(UIPanGestureRecognizer *)sender
{
static CGRect beginRect;
CGPoint translation = [sender translationInView:sender.view.superview];
CGPoint velocity = [sender velocityInView:sender.view.superview];
CGRect currentRect = CGRectOffset(beginRect, 0, translation.y);
if (sender.state == UIGestureRecognizerStateBegan) {
beginRect = sender.view.frame;
}
else if (sender.state == UIGestureRecognizerStateChanged) {
if (currentRect.origin.y > self.naturalHidePosition) {
CGFloat distanceFromEdge = currentRect.origin.y - self.naturalHidePosition;
CGFloat height = CGRectGetHeight(sender.view.frame);
CGFloat b = [self rubberBandDistance:distanceFromEdge dimension:height];
currentRect.origin.y = self.naturalHidePosition + b;
}
else if (currentRect.origin.y < self.naturalShowPosition) {
CGFloat distanceFromEdge = self.naturalShowPosition - currentRect.origin.y;
CGFloat height = CGRectGetHeight(sender.view.frame);
CGFloat b = [self rubberBandDistance:distanceFromEdge dimension:height];
currentRect.origin.y = self.naturalShowPosition - b;
}
sender.view.frame = currentRect;
}
else if (sender.state == UIGestureRecognizerStateEnded) {
if (velocity.y < 0) {
currentRect.origin.y = self.naturalShowPosition;
}
else if (velocity.y > 0) {
currentRect.origin.y = self.naturalHidePosition;
}
else if (currentRect.origin.y > (0.5 * (self.naturalShowPosition + self.naturalHidePosition))) {
currentRect.origin.y = self.naturalHidePosition;
}
else {
currentRect.origin.y = self.naturalShowPosition;
}
[UIView animateWithDuration:0.25 delay:0 options:UIViewAnimationOptionCurveEaseOut animations:^{
sender.view.frame = currentRect;
} completion:nil];
}
}
- (CGFloat)rubberBandDistance:(CGFloat)distanceFromEdge dimension:(CGFloat)d
{
CGFloat c = 0.55;
CGFloat b = (1.0 - (1.0 / ((distanceFromEdge * c / d) + 1.0))) * d;
return b;
}
