Using CGAffineTransformScale with UIAttachmentBehavior (UIDynamicAnimator) - ios

In a subclass of UIButton, I attach the UIButton to a UIAttachmentBehavior that lets a user drag the button around the screen with their finger.
In - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event I add the button to the UIAttachmentBehavior, then add the behavior to the UIDynamicAnimator. During - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event I update the anchor of the UIAttachmentBehavior to the touch point; this creates the desired drag effect.
Now I wish to use CGAffineTransformScale to increase the size of the button when the touch begins so the user can see the button under their finger. My issue is that the transform I apply with CGAffineTransformScale is immediately overwritten the moment I add the attachment behavior. The result is a quick flicker of the button scaling up before it returns to its original size.
I have tried [_animator removeAllBehaviors] before applying the CGAffineTransformScale, then adding the behaviors back. I have also tried [_animator updateItemUsingCurrentState:self] after applying the CGAffineTransformScale, just before adding the attachment behavior. Neither resolves the issue.
UPDATE 1: Thinking about HalR's answer below, I decided to try applying the scale transform with every touch, so I added the CGAffineTransformScale call to both touchesMoved: and touchesEnded:. I am using CGAffineTransformScale instead of CGAffineTransformMakeScale because it lets me preserve the slight rotation the attachment behavior adds. It got me a lot closer: the button now moves around the screen while scaled up. It isn't perfect though. There is a flicker even when the button isn't moving, and if you stop moving but keep the touch down, the button returns to its original size. Almost there... any suggestions?
Here is my updated code:
@interface DragButton : UIButton <UIDynamicAnimatorDelegate>
#import "DragButton.h"
#import <QuartzCore/QuartzCore.h>
@implementation DragButton
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesBegan:touches withEvent:event];
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchLocation = [touch locationInView:self.referenceView];
self.transform = CGAffineTransformMakeScale(1.5, 1.5);
_touchAttachmentBehavior = [[UIAttachmentBehavior alloc] initWithItem:self attachedToAnchor:touchLocation];
[_animator addBehavior:_touchAttachmentBehavior];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesMoved:touches withEvent:event];
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchLocation = [touch locationInView:self.referenceView];
self.transform = CGAffineTransformScale(self.transform, 1.5, 1.5);
_touchAttachmentBehavior.anchorPoint = touchLocation;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesEnded:touches withEvent:event];
self.transform = CGAffineTransformScale(self.transform, 1.5, 1.5);
[_animator removeBehavior:_touchAttachmentBehavior];
}

How does UIAttachmentBehavior affect a view?
It does so by modifying the view's transform.
In this Ray Wenderlich tutorial, Colin Eberhardt logs the transform as the view is affected by the behavior.
Even though the transform is never set or modified directly, it changes as the view is moved by the behavior.
So you can't set your transform and expect it to stay set, because it is continually being rewritten by the behavior.
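You can see this for yourself by logging the transform from the behavior's action block, which UIKit Dynamics runs on every animation step (a minimal sketch, reusing the _touchAttachmentBehavior ivar from the question):
__weak typeof(self) weakSelf = self;
_touchAttachmentBehavior.action = ^{
    // The rotation component changes even though we never set the transform
    // ourselves -- the dynamics engine writes it each step.
    NSLog(@"transform = %@", NSStringFromCGAffineTransform(weakSelf.transform));
};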
As a side note, if you are trying to set the transform to scale by 1.5, you should use this:
self.transform = CGAffineTransformMakeScale(1.5, 1.5);
otherwise every time your touchesBegan is called it will grow by another 50%.
In the documentation for UIDynamicAnimator's updateItemUsingCurrentState: method, it says:
"A dynamic animator automatically reads the initial state (position
and rotation) of each dynamic item you add to it, and then takes
responsibility for updating the item’s state. If you actively change
the state of a dynamic item after you’ve added it to a dynamic
animator, call this method to ask the animator to read and incorporate
the new state."
So apply your transform after adding the behavior, then call updateItemUsingCurrentState: on the animator:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesBegan:touches withEvent:event];
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchLocation = [touch locationInView:self.referenceView];
_touchAttachmentBehavior = [[UIAttachmentBehavior alloc] initWithItem:self attachedToAnchor:touchLocation];
[_animator addBehavior:_touchAttachmentBehavior];
self.transform = CGAffineTransformMakeScale(1.5, 1.5);
[_animator updateItemUsingCurrentState:self];
}

Related

Transfer touches from UIView to UIScrollView

I have a UIScrollview (With image) in UIView (Red color).
UIView - {0, 0, 320, 236}, UIScrollView - {0, 8, 300, 220}
In my UIView, I receive touches:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{}
I need to transfer touches from the UIView to the UIScrollView (e.g. if the user swipes from the right side of the UIView to the left, the UIScrollView should scroll left in sync with the user's touch). How can I do this?
The idea is to get a reference to your scroll view inside your UIView, calculate the touch's x delta, and adjust the contentOffset property of the scroll view accordingly.
So, in your UIView class:
@implementation MyView {
CGPoint _startPosition;
CGPoint _previousPosition;
}
Initialize the above variables in your
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
_startPosition = [touch locationInView:self];
_previousPosition = _startPosition;
}
Then in your touchesMoved:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *touch = [touches anyObject];
CGPoint currentPosition = [touch locationInView:self.scrollView];
if(!CGPointEqualToPoint(_previousPosition, _startPosition)){
CGFloat deltaX = _previousPosition.x-currentPosition.x;
[UIView animateWithDuration:0.1 animations:^{
// Shift the scroll view's content by the horizontal delta.
self.scrollView.contentOffset = CGPointMake(self.scrollView.contentOffset.x + deltaX,
                                            self.scrollView.contentOffset.y);
}];
}
_previousPosition = currentPosition;
}
The scrollView.contentOffset is adjusted in an animation block to make the scroll smooth.
Link to a working project: https://github.com/sliaquat/stack_overlfow_answer_28036976
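As an aside, if you don't need the fixed 0.1 s duration, UIScrollView can animate the change itself via setContentOffset:animated: (a minimal variant of the same adjustment):
CGPoint newOffset = CGPointMake(self.scrollView.contentOffset.x + deltaX,
                                self.scrollView.contentOffset.y);
// UIScrollView drives the animation for you.
[self.scrollView setContentOffset:newOffset animated:YES];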

Drag an image when touched on the object itself

I created a very simple app that contains a draggable UIImageView, but I faced a problem: the image can be dragged without touching it. I'm sure that the whole issue is with anyObject, but I have no idea how to replace it. I tried self.view.image but it failed. So here's my code:
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:touch.view];
lineView.center = location;
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
[self touchesBegan:touches withEvent:event];
}
any ideas?
Consider using a pan gesture recogniser. Add it only to the image view (and enable user interaction on that view). Now you will only get action method callbacks from the gesture when the view has actually been touched, your code becomes simpler, and you can use the gesture's location to move the view.
Also, in your current code, it's weird to call touchesBegan: from touchesMoved:. If you keep that code (because you don't want to use a gesture), you should refactor the code that moves the view out into a different method and have both touchesBegan: and touchesMoved: call that method.
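Here's a minimal sketch of the pan-gesture approach, assuming lineView is the draggable image view and self is the view controller:
// During setup (e.g. in viewDidLoad):
lineView.userInteractionEnabled = YES; // UIImageView disables user interaction by default
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[lineView addGestureRecognizer:pan];

// Called only for touches that actually began on lineView.
- (void)handlePan:(UIPanGestureRecognizer *)gesture {
    CGPoint translation = [gesture translationInView:self.view];
    gesture.view.center = CGPointMake(gesture.view.center.x + translation.x,
                                      gesture.view.center.y + translation.y);
    [gesture setTranslation:CGPointZero inView:self.view];
}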
Try the following
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:touch.view];
// pointInside:withEvent: expects a point in lineView's own coordinate space,
// so convert the touch location before testing whether the touch hit the image.
if ([lineView pointInside:[touch locationInView:lineView] withEvent:event])
lineView.center = location;
}

iOS >> Drag and Drop Issue >> UIView Returns to Original Position

I have 4 UIImageViews set in IB. I also have a UILabel describing the status (as described in code below).
I use the touchesBegan:, touchesMoved: and touchesEnded: methods to capture the object's movement as follows:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch* aTouch = [touches anyObject];
UIView* aView = [aTouch view];
if (aView != self.view) {
[self.view bringSubviewToFront:aView];
self.statusLabel.text = @"You're Dragging...";
}
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch* aTouch = [touches anyObject];
UIView* aView = [aTouch view];
if (aView != self.view) {
aView.center = [aTouch locationInView:self.view];
}
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
self.statusLabel.text = @"At Ease";
}
The problem is that after I move one of the UIImageViews (in the simulator), it "jumps" back to its original position as set in IB instead of remaining where I dropped it.
Why does this happen?
How can I "capture" the UIImageView's new position in touchesEnded: and avoid this "jump"?
P.S. I noticed that if I don't update the label in touchesEnded:, the UIImageView remains at its last position until I tap another UIImageView; what's going on here?
Your code seems to be correct. Just try it after unchecking Auto Layout.
(I later found the author has posted the answer himself, so sorry for this post.)
I unchecked the "Auto Layout" checkbox in IB and it worked fine.
That can't be the best practice... but it works.
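If you want to keep Auto Layout on instead, a common alternative (not from the original answer, just a sketch) is to write the dropped position back into the view's constraints in touchesEnded:, assuming you expose the relevant constraints as outlets (hypothetical names imageLeadingConstraint/imageTopConstraint pinning the image view to its superview's leading and top edges):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.statusLabel.text = @"At Ease";

    UIView *aView = [[touches anyObject] view];
    if (aView != self.view) {
        // Hypothetical outlets: moving the constraint constants to match the
        // dropped frame keeps the view in place through the next layout pass.
        self.imageLeadingConstraint.constant = CGRectGetMinX(aView.frame);
        self.imageTopConstraint.constant = CGRectGetMinY(aView.frame);
    }
}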

UIView touch move: get the anchor point on a UIView

I have a view which, when I tap it, moves left and right. I only get the center point of the view. How can I get the anchor point on the UIView, because wherever I tap on the view, it moves?
Here is my code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"touchesMoved called");
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchpoint=[touch locationInView:self.view];
if ([touch view]==secondView)
{
secondView.center=touchpoint;
[self animateFirstPoint:[touch view]];
}
}
If I understand you correctly, you're looking for the anchorPoint of the UIView's layer.
First import QuartzCore:
#import <QuartzCore/QuartzCore.h>
Now you can set the anchor point of your view's layer:
secondView.layer.anchorPoint = touchpoint;
Remember that you have to specify the anchor point as CGPointMake(x, y), where x and y are each between 0 and 1 (inclusive).
For example:
CGPoint selectionAnchorPoint = CGPointMake(0.15, 0.2);
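So to use a touch location as the anchor point, it first has to be normalized against the view's bounds (a minimal sketch; note that changing anchorPoint also shifts where the layer is drawn unless you compensate by adjusting the layer's position):
CGPoint pointInView = [touch locationInView:secondView];
// anchorPoint lives in the unit coordinate space of the layer's bounds (0...1).
secondView.layer.anchorPoint = CGPointMake(pointInView.x / secondView.bounds.size.width,
                                           pointInView.y / secondView.bounds.size.height);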

Move a UIView with my finger

I'm trying to move a UIView (let's call it viewA), which is located on top of another UIView (let's call it viewB). viewB has the size of the iPad screen, and viewA is much, much smaller.
I manage to move things, but the whole screen moves (viewA + viewB), not only viewA.
Here is my code for viewA class:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchLocation = [touch locationInView:self->view];
if (CGRectContainsPoint(self.window.frame, touchLocation)) {
dragging = YES;
oldY = touchLocation.y;
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchLocation = [touch locationInView:self->view];
if (dragging) {
CGRect frame = self.window.frame;
frame.origin.y = self.window.frame.origin.y + touchLocation.y - oldY;
self.window.frame = frame;
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
dragging = NO;
}
What I do not understand is that I implemented these methods in the viewA class. "self" should refer to viewA, right? So why does the whole page move when my finger moves on the screen?
You are moving the entire window, which contains all the views; you should only move the view you want, viewA in your case.
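For completeness, a minimal sketch of the same handlers moving only viewA (self, since the code lives in the viewA class), reusing the dragging and oldY ivars from the question and locating the touch in the superview:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    // Track the starting y offset in the superview's coordinate space.
    dragging = YES;
    oldY = [touch locationInView:self.superview].y;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!dragging) return;
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.superview];
    // Move only this view (viewA), not the window.
    self.center = CGPointMake(self.center.x, self.center.y + (touchLocation.y - oldY));
    oldY = touchLocation.y;
}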
