Resize UIImageView with finger in iOS

I have a UIImageView (and a UITextView) and I am changing the height and width of both using plus and minus buttons. However, I wanted to do what you can do in most programs, where a box appears around my views and the user can drag a corner to resize. Only one corner needs to be dragged; the opposite corner stays fixed. How do you do this?

Another way is with a gesture recognizer. In one project I resized an image like this:
- (void)resizeImage:(UIPinchGestureRecognizer *)recognizer
{
    // previousScale is an instance variable; MAX_SCALE and MIN_SCALE are constants defined elsewhere.
    if ([recognizer state] == UIGestureRecognizerStateBegan)
        previousScale = [recognizer scale];

    UIView *viewToResize = recognizer.view;

    if ([recognizer state] == UIGestureRecognizerStateChanged)
    {
        CGFloat currentScale = [[viewToResize.layer valueForKeyPath:@"transform.scale"] floatValue];
        CGFloat newScale = 1 - (previousScale - [recognizer scale]);
        newScale = MIN(newScale, MAX_SCALE / currentScale);
        newScale = MAX(newScale, MIN_SCALE / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([viewToResize transform], newScale, newScale);
        viewToResize.transform = transform;
        previousScale = [recognizer scale];
    }
}
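The recognizer above still needs to be attached to the view you want to resize. A minimal sketch, assuming the handler lives in your view controller and the view to scale is exposed as imageView (the property name is an assumption):

// In viewDidLoad (or wherever the view is configured):
UIPinchGestureRecognizer *pinch =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(resizeImage:)];
imageView.userInteractionEnabled = YES; // UIImageView ignores touches by default
[imageView addGestureRecognizer:pinch];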

I have never done this before, but I think you would have to override the touchesBegan and touchesMoved methods of the view. In touchesBegan, make sure you are touching the correct view (the UIImageView or the UITextView) and set a boolean such as imTouchingOneOrTheOther accordingly. Then, when touchesMoved fires, you can adjust the size of the frame. I would try adjusting the frame first with UIView block-based animations, and if that doesn't look right I would play around with Core Animation. Let me know how it works out.
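A rough sketch of that idea in a UIView subclass, not tested against the original setup; the 44-point corner hit area, the draggingCorner flag, and the minimum size are assumptions:

// In a UIView subclass that wraps the image view.
// draggingCorner is an instance variable: BOOL draggingCorner;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    // Only start resizing if the touch lands near the bottom-right corner.
    CGRect cornerArea = CGRectMake(self.bounds.size.width - 44,
                                   self.bounds.size.height - 44, 44, 44);
    draggingCorner = CGRectContainsPoint(cornerArea, point);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!draggingCorner) return;
    CGPoint point = [[touches anyObject] locationInView:self.superview];
    CGRect frame = self.frame;
    // Keep the origin (the opposite corner) fixed and stretch toward the finger.
    frame.size.width  = MAX(44, point.x - frame.origin.x);
    frame.size.height = MAX(44, point.y - frame.origin.y);
    self.frame = frame;
}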

Related

iOS: Zoom using Pinch gesture with inertia and overhead

I am fairly new to iOS and I am trying to figure out how to use a pinch gesture to zoom with inertia and overshoot (I do not know if "overshoot" is the right word in this context; in German it would be called "Überschwingen").
Basically, it should have a minimum and maximum scale (in my case 1.0 to 4.0) within which you can zoom. When the gesture finishes, it should take the given velocity and animate with an ease-out curve, also allowing the view to overshoot or undershoot the scale limits and then settle back to the min or max, as if under tension.
I got the gesture recognizer running for this and also managed to get it to respect my minimum and maximum scale (using examples from Stack Overflow). This is what I have so far:
- (void)handle_pinch:(UIPinchGestureRecognizer *)recognizer
{
    if ([recognizer state] == UIGestureRecognizerStateBegan) {
        previousScale = 1.0;
        lastPoint = [recognizer locationInView:[recognizer view]];
    }
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        CGFloat currentScale = [[[recognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];

        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 4.0;
        const CGFloat kMinScale = 1.0;

        CGFloat newScale = 1 - (previousScale - [recognizer scale]); // delta relative to the last callback
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        scale = newScale;
        CGAffineTransform transform = CGAffineTransformScale([[recognizer view] transform], newScale, newScale);
        [recognizer view].transform = transform;

        CGPoint point = [recognizer locationInView:[recognizer view]];
        CGAffineTransform transformTranslate = CGAffineTransformTranslate([[recognizer view] transform], point.x - lastPoint.x, point.y - lastPoint.y);
        [recognizer view].transform = transformTranslate;

        NSLog(@"Transformed");
    }
}
But I do not know how I can add the animation here. Thanks for any help!
You should be using a UIScrollView to achieve the pinch-to-zoom effect, as UIScrollView already has zooming and bounce animation built in. Just add your UIView inside a UIScrollView.
Here is a great tutorial on UIScrollView. It uses a UIImageView, but a UIView will behave in a similar way.
https://www.raywenderlich.com/122139/uiscrollview-tutorial
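A minimal sketch of that setup, assuming a view controller that adopts UIScrollViewDelegate and exposes the zoomable view as contentView (both names are assumptions):

// The view controller adopts UIScrollViewDelegate; contentView is the view to zoom.
- (void)viewDidLoad
{
    [super viewDidLoad];
    UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:self.view.bounds];
    scrollView.minimumZoomScale = 1.0;
    scrollView.maximumZoomScale = 4.0;
    scrollView.bouncesZoom = YES;   // gives the overshoot-and-settle-back feel at the scale limits
    scrollView.delegate = self;
    scrollView.contentSize = self.contentView.bounds.size;
    [scrollView addSubview:self.contentView];
    [self.view addSubview:scrollView];
}

// UIScrollViewDelegate: tell the scroll view which subview to scale when the user pinches.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return self.contentView;
}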

Translucent UINavigationBar flickers when transform is applied in iOS7

In an app I am working on, I'm performing a transition where I'm using a transform to scale a containerView that holds on to the view of a UINavigationController. This transform is tied to a pan gesture recognizer, so depending on the distance of the pan, the transform changes.
On iOS 7, the navigationBar is set to be translucent but dark. As I apply the transform, certain values cause the navigationBar to render oddly... see pictures.
The result is that I get this flickering effect where the navigationBar changes its degree of translucency as the pan occurs. Any idea why this might be?
Thanks!
EDIT:
Here is the code that does it:
- (void)handlePanRecognizer:(UIPanGestureRecognizer *)recognizer
{
    if ([recognizer isEqual:self.panGestureRecognizer])
    {
        if ([recognizer state] == UIGestureRecognizerStateBegan || [recognizer state] == UIGestureRecognizerStateChanged)
        {
            CGPoint translation = [recognizer translationInView:recognizer.view];
            CGFloat xMin = CGRectGetMidX([self centerContainerFrame]);
            CGFloat xValue = self.rightContainer.center.x + translation.x;
            [self.rightContainer setCenter:CGPointMake(MAX(xMin, xValue), self.rightContainer.center.y)];
            [recognizer setTranslation:CGPointZero inView:recognizer.view];

            CGFloat normalizedTranslation = self.rightContainer.frame.origin.x / self.rightContainer.frame.size.width;
            CGFloat relativeZoom = (1.0 - self.transformScaleFactor) * normalizedTranslation;
            CGFloat relativeAlpha = (1.0 - self.minFadeAlpha) * normalizedTranslation;
            CGAffineTransform transform = CGAffineTransformMakeScale(self.transformScaleFactor + relativeZoom, self.transformScaleFactor + relativeZoom);
            CGFloat newAlpha = relativeAlpha + self.minFadeAlpha;
            [self.centerContainer setTransform:transform];
            [self.centerContainer setAlpha:newAlpha];
        }
        ...etc

Issue scaling layer at the center of a pinch gesture

I currently have a map (tilemap) within a layer that I would like to pan/zoom using the following code:
- (void)pinchGestureUpdated:(UIPinchGestureRecognizer *)recognizer {
    if ([recognizer state] == UIGestureRecognizerStateBegan) {
        _lastScale = [recognizer scale];
        CGPoint touchLocation1 = [recognizer locationOfTouch:0 inView:recognizer.view];
        CGPoint touchLocation2 = [recognizer locationOfTouch:1 inView:recognizer.view];
        CGPoint centerGL = [[CCDirector sharedDirector] convertToGL:ccpMidpoint(touchLocation1, touchLocation2)];
        _pinchCenter = [self convertToNodeSpace:centerGL];
    }
    else if ([recognizer state] == UIGestureRecognizerStateChanged) {
        // NSLog(@"%f", recognizer.scale);
        // kMaxScale and kMinScale are constants defined elsewhere.
        CGFloat newDeltaScale = 1 - (_lastScale - [recognizer scale]);
        newDeltaScale = MIN(newDeltaScale, kMaxScale / self.scale);
        newDeltaScale = MAX(newDeltaScale, kMinScale / self.scale);
        CGFloat newScale = self.scale * newDeltaScale;
        //self.scale = newScale;
        [self scale:newScale atCenter:_pinchCenter];
        _lastScale = [recognizer scale];
    }
}

- (void)scale:(CGFloat)newScale atCenter:(CGPoint)center {
    CGPoint oldCenterPoint = ccp(center.x * self.scale, center.y * self.scale);
    // Set the scale.
    self.scale = newScale;
    // Get the new center point.
    CGPoint newCenterPoint = ccp(center.x * self.scale, center.y * self.scale);
    // Then calculate the delta.
    CGPoint centerPointDelta = ccpSub(oldCenterPoint, newCenterPoint);
    // Now adjust your layer by the delta.
    self.position = ccpAdd(self.position, centerPointDelta);
}
My issue is that the zoom does not take effect at the center of the pinch, even though I am trying to adjust the position at the same time as I zoom through the scale:atCenter: method. Is there any reason this might not be happening properly? Also, how do I convert the center location of the pinch into the coordinate system of my scene/layer?
Everything was actually fine in my approach. The problem I was having, though, was that the layer's anchor point was different from the one I had defined on the map, which introduced an offset during scaling. I had to set both anchor points to ccp(0,0).
The conversion from the screen coordinates of the pinch gesture's center to the layer is correct and can be achieved with the following instructions when using UIKit gesture recognizers:
CGPoint centerGL = [[CCDirector sharedDirector] convertToGL: ccpMidpoint(touchLocation1, touchLocation2)];
_pinchCenter = [self convertToNodeSpace:centerGL];
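A minimal sketch of the anchor-point fix mentioned above, assuming the tile map node is held in an ivar named _tileMap (a hypothetical name):

// Make the layer and the tile map scale around the same reference point;
// mismatched anchor points introduce an offset during scaling.
self.anchorPoint = ccp(0, 0);
_tileMap.anchorPoint = ccp(0, 0);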
First of all, you cannot do ([recognizer state] == UIGestureRecognizerStateBegan) because state is a bitfield! So you have to do:
([recognizer state] & UIGestureRecognizerStateBegan)
The center location of your pinch is going to be in coordinates on the screen basically. As far as how you convert that into your own coordinate system, you need to figure out what the bounds on the device's screen are of the part of your scene/layer that is shown at the time the gesture starts. That's going to be coordinates like 10,10 x 200,200 or something, representing the pixel grid of the screen. Then you will have to figure out, in the coordinate system of your own app's scene/layer, what 10,10 maps to, and what 200,200 maps to. From there, you can derive a factor to apply to the screen coordinates of the pinch gesture's center, that would translate the pinch gesture's screen coordinates into the scene/layer coordinates.
What you're trying to do is tricky because as you scale the scene/layer, you're centering that scaling around a point that's not in the center of the view. I'm sure if you look through some of Apple's sample code in one of the map-related apps, you can probably find some examples of methods that have this kind of pinch zooming.
I hope this helps.

Why does panning let the user move the zoomed view outside the superview?

I am working on a pinch-in and pinch-out (zoom) feature for PDF pages. My pinch-in and panning (moving) are working properly, but when the user keeps moving the zoomed view, it goes outside the superview's bounds. Something like this:
How can I limit the pan so that the user cannot move the zoomed view/PDF outside the superview?
The relevant code I am using is:
// This method will handle the PINCH / ZOOM gesture
- (void)pinchZoom:(UIPinchGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        // Reset the last scale, necessary if there are multiple objects with different scales
        lastScale = [gestureRecognizer scale];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        if (!zoomActive) {
            zoomActive = YES;
            panActive = YES;
            UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panMove:)];
            [panGesture setMaximumNumberOfTouches:2];
            [panGesture setDelegate:self];
            [self addGestureRecognizer:panGesture];
            [panGesture release];
        }
        CGFloat currentScale = [[[gestureRecognizer view].layer valueForKeyPath:@"transform.scale"] floatValue];

        // Constants to adjust the max/min values of zoom
        const CGFloat kMaxScale = 2.0;
        const CGFloat kMinScale = 1.0;

        CGFloat newScale = 1 - (lastScale - [gestureRecognizer scale]);
        newScale = MIN(newScale, kMaxScale / currentScale);
        newScale = MAX(newScale, kMinScale / currentScale);
        CGAffineTransform transform = CGAffineTransformScale([[gestureRecognizer view] transform], newScale, newScale);
        [gestureRecognizer view].transform = transform;

        lastScale = [gestureRecognizer scale]; // Store the previous scale factor for the next pinch gesture call

        [delegate leavesView:self zoomingCurrentView:[gestureRecognizer scale]];
    }
}
The method where I am handling the pan move:
// This method will handle the PAN / MOVE gesture
- (void)panMove:(UIPanGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        CGPoint translation = [gestureRecognizer translationInView:[[gestureRecognizer view] superview]];
        [[gestureRecognizer view] setCenter:CGPointMake([[gestureRecognizer view] center].x + translation.x, [[gestureRecognizer view] center].y + translation.y)];
        [gestureRecognizer setTranslation:CGPointZero inView:[[gestureRecognizer view] superview]];
    }
}
Please suggest how to handle the pan/move so that panning is limited to the superview's bounds.
Try this code in your panMove method. It's working fine in my case.
static CGPoint initialCenter;
if (recognizer.state == UIGestureRecognizerStateBegan)
{
    initialCenter = recognizer.view.center;
}
CGPoint translation = [recognizer translationInView:recognizer.view];
CGPoint newCenter = CGPointMake(initialCenter.x + translation.x,
                                initialCenter.y + translation.y);
CGRect newFrame = recognizer.view.frame;
CGRect superViewBounds = recognizer.view.superview.bounds;
CGPoint superViewOrigin = recognizer.view.superview.frame.origin;

if (newCenter.x - (newFrame.size.width / 2) >= (superViewBounds.size.width + superViewOrigin.x) - 200 /* right */
    || newCenter.x + (newFrame.size.width / 2) <= (superViewOrigin.x + 200) /* left */
    || newCenter.y - (newFrame.size.height / 2) >= (superViewBounds.size.height + superViewOrigin.y) - 200 /* bottom */
    || newCenter.y + (newFrame.size.height / 2) <= (superViewOrigin.y + 100)) /* top */
{
    return;
} else {
    recognizer.view.center = newCenter;
}
You can add something like this to the code (this is a little rough, so you may have to work out some errors):
if (CGRectGetMinX([gestureRecognizer view].frame) < CGRectGetMinX(self.view.bounds) - panExpansion)
{
    // then don't move it
}
// ... repeat this for all sides: this is the left side; do the same for bottom, right, and top.
Edit:
OK, consider a box inside another box, so we have an inside box and an outer box. If we don't want the inside box to go outside the outer box, then all of these statements must be true:
The moved left side of the inside box is not outside the left side of the outer box.
The moved right side of the inside box is not outside the right side of the outer box.
The moved bottom side of the inside box is not outside the bottom side of the outer box.
The moved top side of the inside box is not outside the top side of the outer box.
In your case the PDF is the inside box and the iPad is the outer box. In order to stop the PDF from going outside the box, we need to check that each of these statements is true, and if one is false we either do not move the PDF to its new location OR we move the PDF just up to the edge of the screen.
The problem is that once pinch-and-zoom is used, the inside box will suddenly always be outside the outer box, so how do we fix it? We get how many pixels were added to the inside box when it was zoomed (for the sake of this explanation, let's call this the expansion). So we take how much the box was expanded by and subtract that value. Like so (this is a dumbed-down if statement and will not work as code):
if (outerBox.leftSide is less than innerBox.leftSide - panExpansion)
{
    // Then the innerBox is outside the outerBox
}
I hope this helps clarify!
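A minimal sketch of the same idea (not the poster's exact code): instead of comparing each edge by hand, clamp the proposed move by checking that the zoomed view still covers the superview, assuming the panMove: handler from the question above:

// Inside panMove:, after computing the translation:
CGPoint translation = [gestureRecognizer translationInView:[[gestureRecognizer view] superview]];
CGRect proposedFrame = CGRectOffset([gestureRecognizer view].frame, translation.x, translation.y);

// Only apply the move if the zoomed view would still cover the whole superview,
// i.e. the superview's bounds stay inside the (larger) zoomed view's frame.
if (CGRectContainsRect(proposedFrame, [[gestureRecognizer view] superview].bounds)) {
    CGPoint center = [[gestureRecognizer view] center];
    [[gestureRecognizer view] setCenter:CGPointMake(center.x + translation.x,
                                                    center.y + translation.y)];
}
[gestureRecognizer setTranslation:CGPointZero inView:[[gestureRecognizer view] superview]];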

UIPinchGestureRecognizer position the pinched view between the two fingers

I successfully implemented pinch-to-zoom of a view. However, the view doesn't position itself where I want it to. For the stackoverflowers with an iPad, I would like my view to be centered like in the iPad Photos.app: when you pinch-zoom on an album, the photos present themselves in a view that expands. This view is approximately centered, with the top-right corner on the first finger and the bottom-left corner on the other finger. I mixed it with a pan recognizer, but that way the user always has to pinch and then pan to adjust.
Here is a graphic explanation; I could post a video of my app if that's unclear (no secret, I'm trying to reproduce the Photos.app of the iPad...)
So, for an initial position of the fingers, beginning to zoom:
This is the actual "zoomed" frame for now. The square is bigger, but the position is below the fingers
Here is what I would like to have: same size, but different origin.x and y:
(sorry about my poor photoshop skills ^^)
You can get the CGPoint of the midpoint between the two fingers via the following code in the handlePinchGesture method.
CGPoint point = [sender locationInView:self];
My whole handlePinchGesture method is below.
/*
 instance variables
 CGFloat lastScale;
 CGPoint lastPoint;
*/
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)sender {
    if ([sender numberOfTouches] < 2)
        return;

    if (sender.state == UIGestureRecognizerStateBegan) {
        lastScale = 1.0;
        lastPoint = [sender locationInView:self];
    }

    // Scale
    CGFloat scale = 1.0 - (lastScale - sender.scale);
    [self.layer setAffineTransform:
        CGAffineTransformScale([self.layer affineTransform],
                               scale,
                               scale)];
    lastScale = sender.scale;

    // Translate
    CGPoint point = [sender locationInView:self];
    [self.layer setAffineTransform:
        CGAffineTransformTranslate([self.layer affineTransform],
                                   point.x - lastPoint.x,
                                   point.y - lastPoint.y)];
    lastPoint = [sender locationInView:self];
}
Have a look at the Touches sample project. Specifically these methods could help you:
// scale and rotation transforms are applied relative to the layer's anchor point
// this method moves a gesture recognizer's view's anchor point between the user's fingers
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        UIView *piece = gestureRecognizer.view;
        CGPoint locationInView = [gestureRecognizer locationInView:piece];
        CGPoint locationInSuperview = [gestureRecognizer locationInView:piece.superview];
        piece.layer.anchorPoint = CGPointMake(locationInView.x / piece.bounds.size.width,
                                              locationInView.y / piece.bounds.size.height);
        piece.center = locationInSuperview;
    }
}

// scale the piece by the current scale
// reset the gesture recognizer's scale to 1 after applying so the next callback is a delta from the current scale
- (void)scalePiece:(UIPinchGestureRecognizer *)gestureRecognizer
{
    [self adjustAnchorPointForGestureRecognizer:gestureRecognizer];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        [gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]);
        [gestureRecognizer setScale:1];
    }
}
