Face detection and placing an image on it - iOS

I am trying to detect a face in a UIImageView and place an image on the mouth.
I have tried this method, but I can't convert from the Core Image coordinate system to the UIKit coordinate system. Here is my code:
Code updated, but it still isn't working; the view just rotates.
@interface ProcessImageViewController ()
@end
@implementation ProcessImageViewController
@synthesize receivedImageData;
@synthesize renderImageView;
@synthesize viewToRender;
@synthesize preview;
@synthesize pancontrol;
@synthesize pinchcontrol;
@synthesize rotatecontrol;
- (BOOL)prefersStatusBarHidden {
return YES;
}
- (void)viewDidLoad
{
[super viewDidLoad];
renderImageView.image = receivedImageData;
renderImageView.contentMode = UIViewContentModeScaleToFill;
}
-(void)tryAddCliparts
{
NSLog(#"button clicked");
[self performSelectorInBackground:#selector(markFaces:) withObject:renderImageView];
}
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
CGFloat firstX = recognizer.view.center.x;
CGFloat firstY = recognizer.view.center.y;
CGPoint translationPoint = [recognizer translationInView:self.view];
CGPoint translatedPoint = CGPointMake(firstX + translationPoint.x, firstY+ translationPoint.y);
CGFloat viewW = renderImageView.frame.size.width;
CGFloat viewH = renderImageView.frame.size.height;
if (translatedPoint.x<0 || translatedPoint.x>viewW)
translatedPoint.x = renderImageView.frame.origin.x;
if (translatedPoint.y<0|| translatedPoint.y>viewH)
translatedPoint.y = renderImageView.frame.origin.y;
recognizer.view.center = CGPointMake(translatedPoint.x, translatedPoint.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer {
recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
recognizer.scale = 1;
}
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer {
recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
recognizer.rotation = 0;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
return YES;
}
-(void)markFaces:(UIImageView *)facePicture
{
NSLog(#"face detection started");
// draw a ci image from view
CIImage *image = [CIImage imageWithCGImage:facePicture.image.CGImage];
// Create face detector with high accuracy
CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform,
0,-facePicture.bounds.size.height);
// Get features from the image
NSArray* features = [detector featuresInImage:image];
for(CIFaceFeature* faceFeature in features) {
// Transform CoreImage coordinates to UIKit
CGRect faceRect = CGRectApplyAffineTransform(faceFeature.bounds, transform);
UIImage *mustache = [UIImage imageNamed:@"mustacheok.png"];
UIImageView *mustacheview = [[UIImageView alloc] initWithImage:mustache];
mustacheview.contentMode = UIViewContentModeScaleAspectFill;
[mustacheview.layer setBorderColor:[[UIColor whiteColor] CGColor]];
[mustacheview.layer setBorderWidth:3];
[mustacheview addGestureRecognizer:pancontrol];
[mustacheview addGestureRecognizer:pinchcontrol];
[mustacheview addGestureRecognizer:rotatecontrol];
mustacheview.userInteractionEnabled=YES;
CGPoint mouthPos = CGPointApplyAffineTransform(faceFeature.mouthPosition, transform);
[mustacheview setFrame:CGRectMake(mouthPos.x, mouthPos.y, mustacheview.frame.size.width, mustacheview.frame.size.height)];
[viewToRender addSubview:mustacheview];
[viewToRender bringSubviewToFront:mustacheview];
}
}
@end

CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform,
0,-facePicture.bounds.size.height);
for (CIFaceFeature *faceFeature in features) {
// Transform CoreImage coordinates to UIKit
CGRect faceRect = CGRectApplyAffineTransform(faceFeature.bounds, transform);
if (faceFeature.hasMouthPosition) {
// Transform CoreImage coordinates to UIKit
CGPoint mouthPos = CGPointApplyAffineTransform(faceFeature.mouthPosition, transform);
}
}
The only thing I see wrong in your code is this:
[mustacheview setFrame:CGRectMake(mouthPos.x, mouthPos.y, mustacheview.frame.size.width, mustacheview.frame.size.height)];
You should use:
[mustacheview setCenter:mouthPos];
because the detector returns the mouth center point.

Core Image uses the same coordinate system as Core Graphics: a bottom-left origin, as opposed to UIKit's top-left origin.
So you basically have to flip along the Y-axis (multiply the y coordinate by -1 and offset by the height of the view):
CGAffineTransform flipVertical =
CGAffineTransformMake(1, 0, 0, -1, 0, self.bounds.size.height);
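Putting those two answers together, a minimal sketch of the whole routine might look like the following. The method name placeMustacheOnFacesIn: is made up for illustration; it assumes the UIImageView displays the image at its native pixel size (otherwise you also have to scale between image pixels and view points), and it must run on the main thread because it creates and adds UIKit views.
// Minimal sketch: convert the Core Image mouth position to UIKit coordinates
// and center a mustache view on it.
- (void)placeMustacheOnFacesIn:(UIImageView *)facePicture
{
    CIImage *ciImage = [CIImage imageWithCGImage:facePicture.image.CGImage];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                               context:nil
                                               options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    // Flip from Core Image's bottom-left origin to UIKit's top-left origin.
    CGAffineTransform flip = CGAffineTransformMake(1, 0, 0, -1,
                                                   0, facePicture.bounds.size.height);
    for (CIFaceFeature *face in [detector featuresInImage:ciImage]) {
        if (!face.hasMouthPosition) continue;
        CGPoint mouth = CGPointApplyAffineTransform(face.mouthPosition, flip);
        UIImageView *mustacheview = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"mustacheok.png"]];
        mustacheview.center = mouth; // center, not frame origin: the detector returns the mouth's center point
        [facePicture addSubview:mustacheview];
    }
}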

Related

How to prevent rotation of all UIImage overlays with MapKit

I am placing overlays with an MKOverlayRenderer subclass. I first show a UIImage in a view over the map, rotate it in that view, convert the points (define boundingMapRect, coordinate, etc.), and then place it onto the map (renderer/addOverlay) with the correct image inserted (CGContextDrawImage) and the correct rotation. Each time I create a new overlay (UIImage in view, rotate, convert points, addOverlay) it works just fine. But on each successive overlay added to the map, the code also rotates all of the other overlays already on the map. It seems as though my rotation code (below) is being applied to all of the overlays.
Should I be making an array that holds the boundingMapRect and coordinates for each overlay, and if so, should these live in the view controller?
Should I be naming each overlay (overlay.title? If so, how do I set it and read it back?)
I'd appreciate some guidance - thanks.
MapOverlayView.m
@interface MapOverlayView ()
@property (nonatomic, strong) UIImage *overlayImage;
@end
@implementation MapOverlayView
- (instancetype)initWithOverlay:(id<MKOverlay>)overlay overlayImage:
(UIImage *)overlayImage {
self = [super initWithOverlay:overlay];
if (self) {
_overlayImage = overlayImage;
}
return self;
}
- (void)drawMapRect:(MKMapRect)mapRect zoomScale:
(MKZoomScale)zoomScale inContext:(CGContextRef)context {
CGImageRef imageReference = self.overlayImage.CGImage;
MKMapRect theMapRect = self.overlay.boundingMapRect;
CGRect theRect = [self rectForMapRect:theMapRect];
IonQuestSingleton *tmp = [IonQuestSingleton sharedSingleton];
float new_angle = tmp.image_angle;
NSLog(#"%s%f","New angle is ",new_angle);
{
if (new_angle<90) {
CGContextRotateCTM(context,new_angle*M_PI/180);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextTranslateCTM(context, 0.0, -theRect.size.height);
CGContextDrawImage(context, theRect, imageReference);
NSLog(#"%s%f","New angle 90 is ",new_angle);//tick
}
else if (new_angle >90 && new_angle <=180)
{
CGContextRotateCTM(context,new_angle*M_PI/180);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextTranslateCTM(context, 0.0, 0.0);
CGContextDrawImage(context, theRect, imageReference);
NSLog(#"%s%f","New angle 180 is ",new_angle);//tick
}
else if (new_angle >180 && new_angle <=270)
{
CGContextRotateCTM(context,new_angle*M_PI/180);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextTranslateCTM(context, -theRect.size.width, 0.0);
CGContextDrawImage(context, theRect, imageReference);
NSLog(#"%s%f","New angle 270 is ",new_angle);//tick
}
else if (new_angle >270) {
CGContextRotateCTM(context,new_angle*M_PI/180);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextTranslateCTM(context, -theRect.size.width,
-theRect.size.height);
CGContextDrawImage(context, theRect, imageReference);//tick
NSLog(#"%s%f","New angle 360 is ",new_angle);
}
}
}
@end
MapOverlay.m
#import "MapOverlay.h"
#import "IonQuestSingleton.h"
@implementation MapOverlay
#define debug 1
@synthesize boundingMapRect;
@synthesize coordinate;
-(CLLocationCoordinate2D)coordinate {
//Image center point
IonQuestSingleton *tmp = [IonQuestSingleton sharedSingleton];
return CLLocationCoordinate2DMake(tmp.image_ctr_lat,
tmp.image_ctr_long);
}
- (instancetype)init
{
self = [super init];
if (self) {
IonQuestSingleton *tmp = [IonQuestSingleton sharedSingleton];
MKMapPoint upperLeft =
MKMapPointForCoordinate(CLLocationCoordinate2DMake(tmp.image_tl_lat,
tmp.image_tl_long));
MKMapPoint boundsLeft = MKMapPointForCoordinate(CLLocationCoordinate2DMake(tmp.image_tl_bounds_lat, tmp.image_tl_bounds_long));
boundingMapRect = MKMapRectMake(upperLeft.x, upperLeft.y,
fabs(upperLeft.x - boundsLeft.x), fabs(upperLeft.x - boundsLeft.x));
}
return self;
}
@end
ViewController.m
-(void) loadOverlay
{
MapOverlay *overlay = [[MapOverlay alloc] init];
[self.mapView addOverlay:overlay level:MKOverlayLevelAboveLabels];
}
- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:
(id<MKOverlay>)overlay {
NSString *mapNameFromMVCPos = [NSString
stringWithFormat:@"%@%s", self.mapNameFromMVC, "Pos"]; // not working
{
if ([overlay isKindOfClass:[MapOverlay class]]) {
UIImage *magicMountainImage = [UIImage
imageNamed:mapNameFromMVCPos];
MKOverlayRenderer *overlayView = [[MapOverlayView alloc]
initWithOverlay:overlay overlayImage:magicMountainImage];
return overlayView;
} etc [evaluate other overlaytypes]
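No answer is included here, but the symptom described above is consistent with every renderer reading the same image_angle from the shared IonQuestSingleton, so the most recently set angle ends up applied to every overlay that gets redrawn. A hedged sketch of one way around that (not from the original post; the rotationAngle property is hypothetical) is to capture the angle on each overlay when it is created and have the renderer read it from its own overlay:
// MapOverlay.h - hypothetical addition: each overlay remembers the angle it was created with
@property (nonatomic, assign) float rotationAngle;
// When creating the overlay in the view controller:
MapOverlay *overlay = [[MapOverlay alloc] init];
overlay.rotationAngle = [IonQuestSingleton sharedSingleton].image_angle;
[self.mapView addOverlay:overlay level:MKOverlayLevelAboveLabels];
// In -drawMapRect:zoomScale:inContext:, use the overlay's own angle instead of the singleton:
float new_angle = ((MapOverlay *)self.overlay).rotationAngle;
CGContextRotateCTM(context, new_angle * M_PI / 180);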

iOS: Resize and Rotate UIView Concurrently

Using a UIPanGestureRecognizer in my view controller, I'm trying to draw a view (ArrowView) at an angle based on the touch location. I'm trying to use CGAffineTransformRotate to rotate the view according to the angle between the first touch and the current touch, but this doesn't work unless the view has already been drawn out by at least 20 or so pixels. Also, while drawing, the view doesn't always line up under my finger. Is this the correct approach for this situation? If not, does anyone recommend a better way of accomplishing this? If so, what am I doing wrong?
ViewController.m
@implementation ViewController {
ArrowView *_selectedArrowView;
UIColor *_selectedColor;
CGFloat _selectedWeight;
CGPoint _startPoint;
}
- (void)viewDidLoad {
[super viewDidLoad];
_selectedColor = [UIColor yellowColor];
_selectedWeight = 3;
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panHandler:)];
[self.view addGestureRecognizer:pan];
}
- (void) panHandler: (UIPanGestureRecognizer *) sender {
if (sender.state == UIGestureRecognizerStateBegan) {
//Instantiate the arrow
CGPoint touchPoint = [sender locationInView:sender.view];
_startPoint = touchPoint;
_selectedArrowView = [[ArrowView alloc] initWithFrame:CGRectMake(touchPoint.x, touchPoint.y, 0, 25) withColor:_selectedColor withWeight:_selectedWeight];
_selectedArrowView.delegate = self;
[self.view addSubview:_selectedArrowView];
[self.view bringSubviewToFront:_selectedArrowView];
} else if (sender.state == UIGestureRecognizerStateChanged) {
//"Draw" the arrow based upon finger postion
CGPoint touchPoint = [sender locationInView:sender.view];
[_selectedArrowView drawArrow:_startPoint to:touchPoint];
}
}
@end
ArrowView.m
- (void) drawArrow: (CGPoint) startPoint to: (CGPoint) endPoint {
startPoint = [self convertPoint:startPoint fromView:self.superview];
endPoint = [self convertPoint:endPoint fromView:self.superview];
if (_initialAngle == -1000 /*Initially set to an arbitrary value so I know when the draw began*/) {
_initialAngle = atan2(startPoint.y - endPoint.y, startPoint.x - endPoint.x);
[self setPosition:0];
} else {
CGFloat ang = atan2(startPoint.y - endPoint.y, startPoint.x - endPoint.x);
ang -= _initialAngle;
self.transform = CGAffineTransformRotate(self.transform, ang);
CGFloat diff = (endPoint.x - self.bounds.size.width);
NSLog(#"\n\n diff: %f \n\n", diff);
self.bounds = CGRectMake(0, 0, self.bounds.size.width + diff, self.bounds.size.height);
_endPoint = CGPointMake(self.bounds.size.width, self.bounds.size.height);
[self setNeedsDisplay];
}
}
- (void) setPosition: (CGFloat) anchorPointX {
CGPoint layerLoc;
if (anchorPointX == 0) {
layerLoc = CGPointMake(self.layer.bounds.origin.x, self.layer.bounds.origin.y + (self.layer.bounds.size.height / 2));
} else {
layerLoc = CGPointMake(self.layer.bounds.origin.x + self.layer.bounds.size.width, self.layer.bounds.origin.y + (self.layer.bounds.size.height / 2));
}
CGPoint superLoc = [self convertPoint:layerLoc toView:self.superview];
self.layer.anchorPoint = CGPointMake(anchorPointX, 0.5);
self.layer.position = superLoc;
}
- (CGFloat) pToA: (CGPoint) touchPoint {
CGPoint start;
if (_dotButtonIndex == kDotButtonFirst) {
start = CGPointMake(CGRectGetMaxX(self.bounds), CGRectGetMaxY(self.bounds));
} else {
start = CGPointMake(CGRectGetMinX(self.bounds), CGRectGetMinY(self.bounds));
}
return atan2(start.y - touchPoint.y, start.x - touchPoint.x);
}
Link to project on GitHub: Project Link
Figured it out.
I had to give the view an initial width so the angle calculation would work:
_initialAngle = atan2(startPoint.y - endPoint.y, startPoint.x - (endPoint.x + self.frame.size.width));

Get scale and rotation values from a CGAffineTransform in Xcode

I'm developing an application in which the user can pinch and rotate an image. After pinching and rotating, the image must be printed, so I want to get the scale factor and rotation factor of my image in order to draw a new image and then print that. My problem is that I can't get the scale and rotation factors of my image. Can anybody show me how to get the scale factor and rotation factor from a CGAffineTransform, or how to save the image after scaling and rotating? Thank you so much, guys. Here is my code.
- (void)rotationImage:(UIRotationGestureRecognizer*)gesture {
[self.view bringSubviewToFront:gesture.view];
if ([gesture state] == UIGestureRecognizerStateEnded) {
lastRotation = 0;
return;
}
CGAffineTransform currentTransform = img_tempImage.transform;
CGFloat rotation = 0.0 - (lastRotation - gesture.rotation);
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransform, rotation);
img_tempImage.transform = newTransform;
myTransform = img_tempImage.transform;
lastRotation = gesture.rotation;
isTransform = YES;
}
- (void)pinchImage:(UIPinchGestureRecognizer*)pinchGestureRecognizer {
[self.view bringSubviewToFront:pinchGestureRecognizer.view];
pinchGestureRecognizer.view.transform = CGAffineTransformScale(pinchGestureRecognizer.view.transform, pinchGestureRecognizer.scale, pinchGestureRecognizer.scale);
pinchGestureRecognizer.scale = 1;
myTransform = img_tempImage.transform;
isTransform = YES;
}
- (void)panpan:(UIPanGestureRecognizer *)sender {
[self.view bringSubviewToFront:img_tempImage];
CGPoint translation = [sender translationInView:self.view];
CGPoint imageViewPosition = img_tempImage.center;
imageViewPosition.x += translation.x;
imageViewPosition.y += translation.y;
myPoint = imageViewPosition;
img_tempImage.center = imageViewPosition;
[sender setTranslation:CGPointZero inView:self.view];
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
return YES;
}
I found the answer to my question about how to get the scale and rotation values.
CGFloat angle = [(NSNumber *)[img_tempImage valueForKeyPath:@"layer.transform.rotation.z"] floatValue];
NSLog(@"by angle: %.2f", angle);
CGFloat xScale = img_tempImage.transform.a;
CGFloat yScale = img_tempImage.transform.d;
NSLog(@"xScale: %.2f", xScale);
NSLog(@"yScale: %.2f", yScale);

Rotating rectangle around circumference of a circle (iOS)?

I am trying to rotate a rectangle around a circle. So far, after putting together some code I found in various places (mainly here: https://stackoverflow.com/a/4657476/861181), I am able to rotate the rectangle around its own center axis.
How can I make it rotate around the circle?
Here is what I have:
OverlaySelectionView.h
#import <QuartzCore/QuartzCore.h>
@interface OverlaySelectionView : UIView {
@private
UIView* dragArea;
CGRect dragAreaBounds;
UIView* vectorArea;
UITouch *currentTouch;
CGPoint touchLocationpoint;
CGPoint PrevioustouchLocationpoint;
}
@property CGRect vectorBounds;
@end
OverlaySelectionView.m
#import "OverlaySelectionView.h"
@interface OverlaySelectionView ()
@property (nonatomic, retain) UIView* vectorArea;
@end
@implementation OverlaySelectionView
@synthesize vectorArea, vectorBounds;
@synthesize delegate;
- (void) initialize {
self.userInteractionEnabled = YES;
self.multipleTouchEnabled = NO;
self.backgroundColor = [UIColor clearColor];
self.opaque = NO;
self.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(rotateVector:)];
panRecognizer.maximumNumberOfTouches = 1;
[self addGestureRecognizer:panRecognizer];
}
- (id) initWithCoder: (NSCoder*) coder {
self = [super initWithCoder: coder];
if (self != nil) {
[self initialize];
}
return self;
}
- (id) initWithFrame: (CGRect) frame {
self = [super initWithFrame: frame];
if (self != nil) {
[self initialize];
}
return self;
}
- (void)drawRect:(CGRect)rect {
if (vectorBounds.origin.x){
UIView* area = [[UIView alloc] initWithFrame: vectorBounds];
area.backgroundColor = [UIColor grayColor];
area.opaque = YES;
area.userInteractionEnabled = NO;
vectorArea = area;
[self addSubview: vectorArea];
}
}
- (void)rotateVector: (UIPanGestureRecognizer *)panRecognizer{
if (touchLocationpoint.x){
PrevioustouchLocationpoint = touchLocationpoint;
}
if ([panRecognizer numberOfTouches] >= 1){
touchLocationpoint = [panRecognizer locationOfTouch:0 inView:self];
}
CGPoint origin;
origin.x=240;
origin.y=160;
CGPoint previousDifference = [self vectorFromPoint:origin toPoint:PrevioustouchLocationpoint];
CGAffineTransform newTransform =CGAffineTransformScale(vectorArea.transform, 1, 1);
CGFloat previousRotation = atan2(previousDifference.y, previousDifference.x);
CGPoint currentDifference = [self vectorFromPoint:origin toPoint:touchLocationpoint];
CGFloat currentRotation = atan2(currentDifference.y, currentDifference.x);
CGFloat newAngle = currentRotation- previousRotation;
newTransform = CGAffineTransformRotate(newTransform, newAngle);
[self animateView:vectorArea toPosition:newTransform];
}
-(CGPoint)vectorFromPoint:(CGPoint)firstPoint toPoint:(CGPoint)secondPoint
{
CGPoint result;
CGFloat x = secondPoint.x-firstPoint.x;
CGFloat y = secondPoint.y-firstPoint.y;
result = CGPointMake(x, y);
return result;
}
-(void)animateView:(UIView *)theView toPosition:(CGAffineTransform) newTransform
{
[UIView setAnimationsEnabled:YES];
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationCurve:UIViewAnimationCurveLinear];
[UIView setAnimationBeginsFromCurrentState:YES];
[UIView setAnimationDuration:0.0750];
vectorArea.transform = newTransform;
[UIView commitAnimations];
}
@end
Here is an attempt to clarify: I am creating the rectangle from coordinates on a map. The function below creates that rectangle in the main view; essentially it sits in the middle of the screen.
overlay is the view created with the code above.
- (void)mapView:(MKMapView *)mapView didUpdateUserLocation:(MKUserLocation *)userLocation
{
if (!circle){
circle = [MKCircle circleWithCenterCoordinate: userLocation.coordinate radius:100];
[mainMapView addOverlay:circle];
CGPoint centerPoint = [mapView convertCoordinate:userLocation.coordinate toPointToView:self.view];
CGPoint upPoint = CGPointMake(centerPoint.x, centerPoint.y - 100);
overlay = [[OverlaySelectionView alloc] initWithFrame: self.view.frame];
overlay.vectorBounds = CGRectMake(upPoint.x, upPoint.y, 30, 100);
[self.view addSubview: overlay];
}
}
Here is the sketch of what I am trying to achieve:
Introduction
A rotation is always done around (0,0).
What you already know:
To rotate around the center of the rectangle you translate the rect to the origin, rotate, and translate back.
Now for your question:
To rotate around the center point of a circle, simply move the rectangle so that the circle's center is at (0,0), then rotate, and move back.
Start by positioning the rectangle at 12 o'clock, with its center line at 12.
1) As explained, you always rotate around (0,0), so move the center of the circle to (0,0):
CGAffineTransform trans1 = CGAffineTransformMakeTranslation(-circ.x, -circ.y);
2) Rotate by the angle:
CGAffineTransform transRot = CGAffineTransformMakeRotation(angle); // or -angle, try it out
3) Move back:
CGAffineTransform transBack = CGAffineTransformMakeTranslation(circ.x, circ.y);
Concatenate these 3 matrices into one combined matrix and apply it to the rectangle:
CGAffineTransform tCombo = CGAffineTransformConcat(trans1, transRot);
tCombo = CGAffineTransformConcat(tCombo, transBack);
Apply:
rectangle.transform = tCombo;
You should probably also read the chapter about transformation matrices in the Quartz documentation.
This was written in a text editor only, so double-check the exact function names.
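As a compilable version of those steps, here is a sketch (not part of the original answer) with one extra detail made explicit: a UIView's transform is applied around the view's own center, so the circle's center has to be expressed relative to that center. The helper name rotationAroundPivot is made up.
// Sketch: build the absolute transform that rotates a view by `angle`
// around an arbitrary pivot point given in superview coordinates.
static CGAffineTransform rotationAroundPivot(UIView *view, CGPoint pivotInSuperview, CGFloat angle)
{
    // Offset from the view's center (the point its transform pivots on) to the desired pivot.
    CGFloat dx = pivotInSuperview.x - view.center.x;
    CGFloat dy = pivotInSuperview.y - view.center.y;
    CGAffineTransform trans1    = CGAffineTransformMakeTranslation(-dx, -dy); // 1) pivot to (0,0)
    CGAffineTransform transRot  = CGAffineTransformMakeRotation(angle);       // 2) rotate
    CGAffineTransform transBack = CGAffineTransformMakeTranslation(dx, dy);   // 3) move back
    return CGAffineTransformConcat(CGAffineTransformConcat(trans1, transRot), transBack);
}
// Usage inside rotateVector:, with the question's fixed circle center (cumulativeAngle is hypothetical):
// vectorArea.transform = rotationAroundPivot(vectorArea, CGPointMake(240, 160), cumulativeAngle);
Note that this sets an absolute transform, so pass the total accumulated angle rather than the per-gesture delta.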

iOS - Pinch/zoom from current scale

The following code correctly pinches/zooms the container view, but only after it jumps to a scale of 1.0. How can I modify it so that the container view scales from its current scale?
UIPinchGestureRecognizer *twoFingerPinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(twoFingerPinch:)];
[self.container addGestureRecognizer:twoFingerPinch];
- (void)twoFingerPinch:(UIPinchGestureRecognizer *)recognizer
{
_scale = recognizer.scale;
CGAffineTransform tr = CGAffineTransformScale(self.view.transform, _scale, _scale);
self.container.transform = tr;
}
In .h file, add:
CGFloat _lastScale;
In .m file,
- (id)init {
...
_lastScale = 1.0f;
...
}
- (void)twoFingerPinch:(UIPinchGestureRecognizer *)recognizer
{
if (recognizer.state == UIGestureRecognizerStateEnded) {
_lastScale = 1.0f;
return;
}
CGFloat scale = 1.0f - (_lastScale - recognizer.scale);
CGAffineTransform tr = CGAffineTransformScale(self.view.transform, scale, scale);
self.container.transform = tr;
_lastScale = recognizer.scale;
}
Here's how I do it:
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)recognizer {
static float initialDifference = 0.0;
static float oldScale = 1.0;
if (recognizer.state == UIGestureRecognizerStateBegan){
initialDifference = oldScale - recognizer.scale;
}
CGFloat scale = oldScale - (oldScale - recognizer.scale) + initialDifference;
myView.transform = CGAffineTransformScale(self.view.transform, scale, scale);
oldScale = scale;
}
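A common alternative (a sketch, not taken from either answer above) is to skip the bookkeeping entirely: apply the recognizer's scale to the container's current transform and reset recognizer.scale to 1 on every callback, so each callback only contributes the change since the previous one:
- (void)twoFingerPinch:(UIPinchGestureRecognizer *)recognizer
{
    // Scale relative to the container's existing transform, so pinching
    // continues from the current scale instead of jumping back to 1.0.
    self.container.transform = CGAffineTransformScale(self.container.transform,
                                                      recognizer.scale, recognizer.scale);
    recognizer.scale = 1.0; // consume the delta; the next callback starts from 1 again
}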
