addSubview resets the anchorPoint - ios

I'm implementing panning and pinching using Apple's example methods so that zooming stays centered where the fingers pinch. The code moves the layer's anchor point to where the fingers touch. Everything works fine, but when I add a subview the anchor point resets and everything ends up out of place.
I tried changing the subviews' anchor points to match the superview's, but they come back to the default anchor point.
Code example for the pinching and anchor-point adjustment:
/**
 Scale and rotation transforms are applied relative to the layer's anchor point.
 This method moves a gesture recognizer's view's anchor point between the user's
 fingers.
 */
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];

        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width,
                                              locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}
/**
 Scale the piece by the current scale.
 Reset the gesture recognizer's scale to 1 after applying the transform so the
 next callback is a delta from the current scale.
 */
- (IBAction)scalePiece:(UIPinchGestureRecognizer *)gestureRecognizer
{
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform],
                                                                    [gestureRecognizer scale],
                                                                    [gestureRecognizer scale]);
        [gestureRecognizer setScale:1];
    }
}
I have a self.canvasView and a self.workingView. The gesture recognizers are attached to self.canvasView, and self.workingView is a subview of self.canvasView.
Everything works fine, but every time I add a subview to self.workingView the anchor point resets and everything is out of place.
How can I avoid this behaviour? I couldn't find anything in the documentation saying that addSubview: resets the anchorPoint, so I assume I'm doing something wrong here.
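For reference, the usual way to change a layer's anchorPoint without visibly shifting the view is a helper along these lines (a sketch; setAnchorPoint:forView: is not a UIKit method, just an illustrative helper that could be re-applied after addSubview: if needed):

- (void)setAnchorPoint:(CGPoint)anchorPoint forView:(UIView *)view
{
    // Where the new and old anchor points fall inside the view's bounds.
    CGPoint newPoint = CGPointMake(view.bounds.size.width * anchorPoint.x,
                                   view.bounds.size.height * anchorPoint.y);
    CGPoint oldPoint = CGPointMake(view.bounds.size.width * view.layer.anchorPoint.x,
                                   view.bounds.size.height * view.layer.anchorPoint.y);

    // Account for any transform already applied to the view.
    newPoint = CGPointApplyAffineTransform(newPoint, view.transform);
    oldPoint = CGPointApplyAffineTransform(oldPoint, view.transform);

    // Shift the layer's position by the same amount the anchor point moved,
    // so the view does not jump on screen.
    CGPoint position = view.layer.position;
    position.x = position.x - oldPoint.x + newPoint.x;
    position.y = position.y - oldPoint.y + newPoint.y;

    view.layer.position = position;
    view.layer.anchorPoint = anchorPoint;
}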

Related

Resize UIImageView with finger on iOS

I have a UIImageView (and a UITextView), and I am currently changing the height and width of both using plus and minus buttons. However, I want the behaviour most programs have: a box appears around the view and the user can drag a corner to resize it. Only one corner needs to be draggable; the opposite corner stays fixed. How do you do this?
Another way is with a gesture recognizer. In one project I resized an image like this:
// previousScale is an instance variable; MAX_SCALE and MIN_SCALE are constants.
- (void)resizeImage:(UIPinchGestureRecognizer *)recognizer
{
    if ([recognizer state] == UIGestureRecognizerStateBegan)
        previousScale = [recognizer scale];

    UIView *viewToResize = recognizer.view;

    if ([recognizer state] == UIGestureRecognizerStateChanged)
    {
        CGFloat currentScale = [[viewToResize.layer valueForKeyPath:@"transform.scale"] floatValue];
        CGFloat newScale = 1 - (previousScale - [recognizer scale]);
        newScale = MIN(newScale, MAX_SCALE / currentScale);
        newScale = MAX(newScale, MIN_SCALE / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([viewToResize transform], newScale, newScale);
        viewToResize.transform = transform;
        previousScale = [recognizer scale];
    }
}
I've never done this before, but I think you would have to override the view's touchesBegan: and touchesMoved: methods. In touchesBegan:, make sure you are touching the correct view (the image view or the UITextView) and set a boolean there, e.g. imTouchingOneOrTheOther. Then, in touchesMoved:, you can adjust the size of the frame accordingly. I would try adjusting the frame first with UIView block-based animations, and if that doesn't look right I would play around with Core Animation. Let me know how it works out.
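That idea sketched out, assuming the code lives in a view controller, resizableView is the view being resized, and isResizing is a BOOL instance variable (all of these names are illustrative, not from the original answer):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Only start resizing when the touch lands near the bottom-right corner.
    CGRect corner = CGRectMake(CGRectGetMaxX(resizableView.frame) - 30,
                               CGRectGetMaxY(resizableView.frame) - 30,
                               60, 60);
    isResizing = CGRectContainsPoint(corner, point);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!isResizing) return;

    CGPoint point = [[touches anyObject] locationInView:self.view];
    CGRect frame = resizableView.frame;
    // The origin (the opposite corner) stays fixed; only the size changes.
    frame.size.width  = MAX(40, point.x - frame.origin.x);
    frame.size.height = MAX(40, point.y - frame.origin.y);
    resizableView.frame = frame;
}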

Issue scaling layer at the center of a pinch gesture

I currently have a map (tilemap) within a layer that I would like to pan/zoom using the following code:
- (void)pinchGestureUpdated:(UIPinchGestureRecognizer *)recognizer {
    if ([recognizer state] == UIGestureRecognizerStateBegan) {
        _lastScale = [recognizer scale];
        CGPoint touchLocation1 = [recognizer locationOfTouch:0 inView:recognizer.view];
        CGPoint touchLocation2 = [recognizer locationOfTouch:1 inView:recognizer.view];
        CGPoint centerGL = [[CCDirector sharedDirector] convertToGL:ccpMidpoint(touchLocation1, touchLocation2)];
        _pinchCenter = [self convertToNodeSpace:centerGL];
    }
    else if ([recognizer state] == UIGestureRecognizerStateChanged) {
        // NSLog(@"%f", recognizer.scale);
        CGFloat newDeltaScale = 1 - (_lastScale - [recognizer scale]);
        newDeltaScale = MIN(newDeltaScale, kMaxScale / self.scale);
        newDeltaScale = MAX(newDeltaScale, kMinScale / self.scale);
        CGFloat newScale = self.scale * newDeltaScale;
        //self.scale = newScale;
        [self scale:newScale atCenter:_pinchCenter];
        _lastScale = [recognizer scale];
    }
}
- (void)scale:(CGFloat)newScale atCenter:(CGPoint)center {
    CGPoint oldCenterPoint = ccp(center.x * self.scale, center.y * self.scale);

    // Set the scale.
    self.scale = newScale;

    // Get the new center point.
    CGPoint newCenterPoint = ccp(center.x * self.scale, center.y * self.scale);

    // Then calculate the delta.
    CGPoint centerPointDelta = ccpSub(oldCenterPoint, newCenterPoint);

    // Now adjust your layer by the delta.
    self.position = ccpAdd(self.position, centerPointDelta);
}
My issue is that the zoom does not take effect at the center of the pinch, even though I try to adjust the position at the same time as I zoom via the scale:atCenter: method. Is there any reason this isn't happening properly? Also, how do I convert the center location of the pinch into the coordinate system of my scene/layer?
Everything was actually fine in my approach. The problem I was having, though, was that the layer's anchor point was different from the one defined on the map, which introduced an offset during scaling. I had to set both anchor points to ccp(0,0).
The conversion from the screen coordinates of the pinch gesture's center to the layer is correct and can be achieved with the following instructions when using UIKit gesture recognizers:
CGPoint centerGL = [[CCDirector sharedDirector] convertToGL: ccpMidpoint(touchLocation1, touchLocation2)];
_pinchCenter = [self convertToNodeSpace:centerGL];
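Concretely, the anchor-point fix described above amounts to something like this (a sketch; _tileMap stands for whatever node holds the map in your layer):

// Make the layer and the map node share the same anchor point so scaling
// does not introduce an offset between them.
self.anchorPoint = ccp(0, 0);
_tileMap.anchorPoint = ccp(0, 0);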
First of all, note that comparing the state with == is fine here; UIGestureRecognizerState is an ordinary enum, not a bit mask, so ([recognizer state] == UIGestureRecognizerStateBegan) is the correct check.
The center location of your pinch is going to be in screen coordinates, basically. To convert that into your own coordinate system, you need to figure out what the bounds on the device's screen are of the part of your scene/layer that is shown at the time the gesture starts. That will be something like 10,10 x 200,200, representing the pixel grid of the screen. Then you have to work out, in the coordinate system of your own app's scene/layer, what 10,10 maps to and what 200,200 maps to. From there you can derive a factor to apply to the screen coordinates of the pinch gesture's center, translating them into scene/layer coordinates.
What you're trying to do is tricky, because as you scale the scene/layer you're centering that scaling around a point that is not the center of the view. If you look through Apple's sample code for one of the map-related apps, you can probably find examples of methods that do this kind of pinch zooming.
I hope this helps.

What does locationInView: mean for a multi-finger gesture recognizer?

I'm using a UIRotationGestureRecognizer and have this code in its target method:
if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
    UIView *piece = gestureRecognizer.view;
    CGPoint locationInView = [gestureRecognizer locationInView:piece];
    CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];

    piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width,
                                          locationInView.y / piece.bounds.size.height);
    piece.center = locationInSuperview;
}
But I'm not quite sure what value locationInView: returns here, since there are supposed to be two fingers touching the screen.
The locationInView: method returns the center point of both touches. If you want to know the positions of the two individual touches, use locationOfTouch:inView:.
Reading the documentation is a good idea:
http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIGestureRecognizer_Class/Reference/Reference.html
It is the "centroid" of the multiple touches.
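If you need the individual finger positions rather than their centroid, locationOfTouch:inView: gives them to you. A small sketch inside the same handler:

UIView *superview = gestureRecognizer.view.superview;
if (gestureRecognizer.numberOfTouches >= 2) {
    // locationInView: returns the centroid of all touches the recognizer tracks.
    CGPoint centroid = [gestureRecognizer locationInView:superview];

    // locationOfTouch:inView: returns each individual finger position.
    CGPoint first  = [gestureRecognizer locationOfTouch:0 inView:superview];
    CGPoint second = [gestureRecognizer locationOfTouch:1 inView:superview];

    NSLog(@"centroid %@, fingers %@ / %@",
          NSStringFromCGPoint(centroid),
          NSStringFromCGPoint(first),
          NSStringFromCGPoint(second));
}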

Why does panning let the user move the zoomed view outside the superview?

I am working on a pinch-in / pinch-out feature for PDF pages. Pinching and panning (moving) work properly, but when the user keeps moving the zoomed view, it ends up outside the superview's bounds. Something like this:
How can I limit the pan so that the user cannot move the zoomed view/PDF outside the superview?
The relevant code I am using is:
// This method will handle the PINCH / ZOOM gesture
- (void)pinchZoom:(UIPinchGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = [gestureRecognizer scale];
    }

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        if (!zoomActive) {
            zoomActive = YES;
            panActive = YES;
            UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panMove:)];
            [panGesture setMaximumNumberOfTouches:2];
            [panGesture setDelegate:self];
            [self addGestureRecognizer:panGesture];
            [panGesture release];
        }

        CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];

        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 2.0;
        const CGFloat kMinScale = 1.0;

        CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]);
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
        [gestureRecognizer view].transform = transform;

        lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call
        [delegate leavesView:self zoomingCurrentView:[gestureRecognizer scale]];
    }
}
The method where I handle the pan/move:
// This method will handle the PAN / MOVE gesture
- (void)panMove:(UIPanGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gestureRecognizer translationInView:[[gestureRecognizer view] superview]];
        [[gestureRecognizer view] setCenter:CGPointMake([[gestureRecognizer view] center].x + translation.x,
                                                        [[gestureRecognizer view] center].y + translation.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[[gestureRecognizer view] superview]];
    }
}
Please suggest how to limit the pan/move so the view stays within its superview's bounds.
Try this code in your panMove: method. It's working fine in my case.
static CGPoint initialCenter;

if (recognizer.state == UIGestureRecognizerStateBegan)
{
    initialCenter = recognizer.view.center;
}

CGPoint translation = [recognizer translationInView:recognizer.view];
CGPoint newCenter = CGPointMake(initialCenter.x + translation.x,
                                initialCenter.y + translation.y);

CGRect newFrame = recognizer.view.frame;
CGRect superViewBounds = recognizer.view.superview.bounds;
CGPoint superViewOrigin = recognizer.view.superview.frame.origin;

if (newCenter.x - (newFrame.size.width / 2) >= (superViewBounds.size.width + superViewOrigin.x) - 200   /* right */
    || newCenter.x + (newFrame.size.width / 2) <= (superViewOrigin.x + 200)                              /* left */
    || newCenter.y - (newFrame.size.height / 2) >= (superViewBounds.size.height + superViewOrigin.y) - 200 /* bottom */
    || newCenter.y + (newFrame.size.height / 2) <= (superViewOrigin.y + 100))                            /* top */
{
    return;
} else {
    recognizer.view.center = newCenter;
}
You can add something like this to the code (this is a little rough, so you may have to work out some errors):
if (CGRectGetMinX([gestureRecognizer view].frame) < CGRectGetMinX(self.view.bounds) - panExpansion)
{
    // then don't move it
}
// ... repeat this for all sides: this is the left; do the same for bottom, right, and top.
Edit:
Ok, consider a box inside another box, so we have an inside box and an outer box. If we don't want the inside box to go outside the outer box, then all of these statements must be true:
The moved left side of the inside box is not outside the left side of the outer box.
The moved right side of the inside box is not outside the right side of the outer box.
The moved bottom side of the inside box is not outside the bottom side of the outer box.
The moved top side of the inside box is not outside the top side of the outer box.
In your case the PDF is the inside box and the iPad screen is the outer box. To stop the PDF from going outside the outer box, we check each of these statements; if one is false, we either do not move the PDF to its new location, or we move the PDF only as far as the edge of the screen.
The problem is that once pinch-and-zoom is used, the inside box may suddenly always be outside the outer box, so how do we fix that? We work out how many pixels were added to the inside box when it was zoomed (for the sake of this explanation, call this the expansion), and subtract that value when doing the checks. Like so (this is a dumbed-down if statement and will not work as real code):
If (outerBox.leftSide is less than innerBox.leftSide - panExpansion)
{
    // Then the innerBox is outside the outerBox
}
I hope this helps clarify things!
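A tighter variant of that idea clamps the proposed center instead of refusing to move, so the zoomed view can never leave the superview. A sketch based on the panMove: handler from the question; it assumes the superview's bounds origin is (0,0) and works on center, which stays valid while a scale transform is applied:

- (void)panMove:(UIPanGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state != UIGestureRecognizerStateBegan &&
        gestureRecognizer.state != UIGestureRecognizerStateChanged) {
        return;
    }

    UIView *view = gestureRecognizer.view;
    CGRect bounds = view.superview.bounds;   // assumed to start at (0,0)
    CGSize size = view.frame.size;           // bounding box of the (scaled) view
    CGFloat halfW = size.width / 2.0;
    CGFloat halfH = size.height / 2.0;

    CGPoint translation = [gestureRecognizer translationInView:view.superview];
    CGPoint center = CGPointMake(view.center.x + translation.x,
                                 view.center.y + translation.y);

    // Clamp the proposed center. If the zoomed view is larger than the superview,
    // keep the superview fully covered; otherwise keep the view fully inside it.
    if (size.width >= bounds.size.width) {
        center.x = MAX(MIN(center.x, halfW), bounds.size.width - halfW);
    } else {
        center.x = MIN(MAX(center.x, halfW), bounds.size.width - halfW);
    }
    if (size.height >= bounds.size.height) {
        center.y = MAX(MIN(center.y, halfH), bounds.size.height - halfH);
    } else {
        center.y = MIN(MAX(center.y, halfH), bounds.size.height - halfH);
    }

    view.center = center;
    [gestureRecognizer setTranslation:CGPointZero inView:view.superview];
}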

UIPinchGestureRecognizer position the pinched view between the two fingers

I successfully implemented pinch-and-zoom of a view. However, the view doesn't position itself where I want it to. For the stackoverflowers with an iPad, I would like my view to be centered the way the iPad Photos.app does it: when you pinch and zoom on an album, the photos present themselves in an expanding view. That view is approximately centered with the top right-hand corner on the first finger and the bottom left-hand corner on the other finger. I mixed it with a pan recognizer, but that way the user always has to pinch and then pan to adjust.
Here is a graphic explanation; I could post a video of my app if that's unclear (no secret, I'm trying to reproduce the iPad's Photos.app...).
So, for an initial position of the fingers, beginning to zoom:
This is the actual "zoomed" frame for now. The square is bigger, but the position is below the fingers.
Here is what I would like to have: same size, but a different origin.x and origin.y:
(sorry about my poor Photoshop skills ^^)
You can get the CGPoint of the midpoint between the two fingers with the following code in the handlePinchGesture: method:
CGPoint point = [sender locationInView:self];
My whole handlePinchGesture method is below.
/*
 instance variables:
 CGFloat lastScale;
 CGPoint lastPoint;
 */
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)sender {
    if ([sender numberOfTouches] < 2)
        return;

    if (sender.state == UIGestureRecognizerStateBegan) {
        lastScale = 1.0;
        lastPoint = [sender locationInView:self];
    }

    // Scale
    CGFloat scale = 1.0 - (lastScale - sender.scale);
    [self.layer setAffineTransform:
        CGAffineTransformScale([self.layer affineTransform], scale, scale)];
    lastScale = sender.scale;

    // Translate
    CGPoint point = [sender locationInView:self];
    [self.layer setAffineTransform:
        CGAffineTransformTranslate([self.layer affineTransform],
                                   point.x - lastPoint.x,
                                   point.y - lastPoint.y)];
    lastPoint = [sender locationInView:self];
}
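For completeness, this handler is written as if it lives in the view being pinched (it uses self and self.layer), so wiring it up would look roughly like this:

UIPinchGestureRecognizer *pinch =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handlePinchGesture:)];
[self addGestureRecognizer:pinch];
// [pinch release];   // only if the project is not using ARC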
Have a look at the Touches sample project. Specifically, these methods could help you:
// Scale and rotation transforms are applied relative to the layer's anchor point.
// This method moves a gesture recognizer's view's anchor point between the user's fingers.
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];

        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width,
                                              locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}

// Scale the piece by the current scale.
// Reset the gesture recognizer's scale to 1 after applying so the next callback is a delta from the current scale.
- (void)scalePiece:(UIPinchGestureRecognizer *)gestureRecognizer
{
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];

    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform],
                                                                    [gestureRecognizer scale], [gestureRecognizer scale]);
        [gestureRecognizer setScale:1];
    }
}
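If you combine the pinch with a pan or rotation recognizer, as in the question, you generally also want the recognizers to run simultaneously. A sketch of the UIGestureRecognizerDelegate method that enables this (remember to set each recognizer's delegate):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Only let recognizers attached to the same view work together.
    return gestureRecognizer.view == otherGestureRecognizer.view;
}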
