Cropping a UIImage: Cropped Image Is Not Correct - iOS

I have a UIImageView on the screen. It displays an image with several colors. When I tap the image, it crops a small portion of the image and sets the cropped image on a new, small UIImageView. For some reason the cropped image is always wrong. Check out the screenshot below:
Here is the complete code for touchesBegan:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.imageView];
    UIImage *originalImage = self.imageView.image;
    NSLog(@"x = %f, y = %f", location.x, location.y);
    CGRect cropRegion = CGRectMake(location.x, location.y, 10, 10);
    CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
    UIImage *croppedImage = [UIImage imageWithCGImage:subImage scale:originalImage.scale orientation:originalImage.imageOrientation];
    [self.previewImageView setImage:croppedImage];
    CGImageRelease(subImage);
}

The most likely explanation is that the image is larger than the view, so it has been shrunk to fit. As a result, view coordinates do not match image coordinates, and the point returned by locationInView: cannot be passed directly to CGImageCreateWithImageInRect.
To solve this, determine the x and y scale factors from self.imageView.bounds.size and originalImage.size, then scale the location accordingly, as in the sketch below.
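A minimal sketch of that conversion, assuming the image view uses UIViewContentModeScaleToFill (aspect-fit and aspect-fill modes additionally need a single uniform scale plus an offset):
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.imageView];
    UIImage *originalImage = self.imageView.image;
    // Map view coordinates to image coordinates (both in points).
    CGFloat scaleX = originalImage.size.width / self.imageView.bounds.size.width;
    CGFloat scaleY = originalImage.size.height / self.imageView.bounds.size.height;
    CGPoint imagePoint = CGPointMake(location.x * scaleX, location.y * scaleY);
    // CGImageCreateWithImageInRect works in pixels, so multiply by the image's scale factor.
    CGFloat s = originalImage.scale;
    CGRect cropRegion = CGRectMake(imagePoint.x * s, imagePoint.y * s, 10 * s, 10 * s);
    CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
    UIImage *croppedImage = [UIImage imageWithCGImage:subImage scale:originalImage.scale orientation:originalImage.imageOrientation];
    [self.previewImageView setImage:croppedImage];
    CGImageRelease(subImage);
}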

Related

Drawing with finger on UIImageView that does not cover entire screen

I have managed to get drawing on top of a recently captured UIImage working, but it only works when the UIImageView covers the entire screen. It begins to lose accuracy the moment I start drawing away from the horizontal center. Any idea how I can make it work on the image view alone and have it remain accurate?
- (void)viewDidLoad
{
    [super viewDidLoad];
    float height = (self.view.frame.size.width * self.chosenImage.size.height) / self.chosenImage.size.width;
    self.imageView.frame = CGRectMake(0, self.navigationController.navigationBar.frame.size.height, self.view.frame.size.width, height);
    self.imageView.center = self.imageView.superview.center;
    self.imageView.image = self.chosenImage;
}
- (UIImage *)drawLineFromPoint:(CGPoint)from_Point toPoint:(CGPoint)to_Point image:(UIImage *)image
{
    CGSize sizeOf_Screen = self.imageView.frame.size;
    UIGraphicsBeginImageContext(sizeOf_Screen);
    CGContextRef current_Context = UIGraphicsGetCurrentContext();
    [image drawInRect:CGRectMake(0, 0, sizeOf_Screen.width, sizeOf_Screen.height)];
    CGContextSetLineCap(current_Context, kCGLineCapRound);
    CGContextSetLineWidth(current_Context, 1.0);
    CGContextSetRGBStrokeColor(current_Context, 1, 0, 0, 1);
    CGContextBeginPath(current_Context);
    CGContextMoveToPoint(current_Context, from_Point.x, from_Point.y);
    CGContextAddLineToPoint(current_Context, to_Point.x, to_Point.y);
    CGContextStrokePath(current_Context);
    UIImage *rect = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rect;
}
- (void)touchesBegan:(NSSet *)_touches withEvent:(UIEvent *)_event
{
    // Retrieve the touch point
    UITouch *_touch = [_touches anyObject];
    CGPoint current_Point = [_touch locationInView:self.imageView];
    // Record the touch points to use as input to our line smoothing algorithm
    self.drawnPoints = [NSMutableArray arrayWithObject:[NSValue valueWithCGPoint:current_Point]];
    self.previousPoint = current_Point;
    // We need to save the unmodified image to replace the jagged polylines with the smooth polylines
    self.cleanImage = self.imageView.image;
}
- (void)touchesMoved:(NSSet *)_touches withEvent:(UIEvent *)_event
{
    UITouch *_touch = [_touches anyObject];
    CGPoint current_Point = [_touch locationInView:self.imageView];
    [self.drawnPoints addObject:[NSValue valueWithCGPoint:current_Point]];
    self.imageView.image = [self drawLineFromPoint:self.previousPoint toPoint:current_Point image:self.imageView.image];
    self.previousPoint = current_Point;
}
EDIT: With these settings I get accuracy in the UIImageView, but the image blurs inward as I draw.
When I change [image drawInRect:CGRectMake(0, 0, sizeOf_Screen.width, sizeOf_Screen.height)]; to [image drawInRect:CGRectMake(0, self.imageView.frame.origin.y, sizeOf_Screen.width, sizeOf_Screen.height)];, the image flies out of the view as soon as I try to draw.
I've tried tweaking the sizeOf_Screen value and the CGRectMake of the image in the drawLineFromPoint method. I've also tried setting locationInView: to self.imageView in both touch handlers, but to no avail. It's very strange. Any ideas or code samples that might accomplish what I'm looking for? I can't seem to find anything.

How to show the touch location of imageView1 in imageView2 in iOS programmatically

I am displaying an image in imageView1. On touch, I am able to get the x,y coordinates of the touch point. Now I want to show the image area around the touch point in another image view, imageView2. My reference is the Snapseed application's Selective Adjust feature, which zooms in on the point tapped on imageView1.
I think you want to show a section of the first image in the second image, based on the touch. You have to get the touch location on the first image:
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:image1];
Suppose the image you want to crop is this:
UIImage *imageToCrop = [UIImage imageNamed:@"Abc.png"];
First, create a rect from the point:
CGRect cropRect = CGRectMake(point.x - 20, point.y - 20, 40, 40); // a 40 x 40 region centered on the touch point
UIImage *cropImage = [self crop:imageToCrop Rect:cropRect];
image2.image = cropImage;
(image2 is a UIImageView.)
Then call this function, which returns the cropped image for the given rect:
-(UIImage *)crop:(UIImage *)Imagecrop Rect:(CGRect)rect {
    CGImageRef imageRef = CGImageCreateWithImageInRect([Imagecrop CGImage], rect);
    UIImage *result = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return result;
}
All of this has to be done in the touch event handlers.
Hope this helps!
You should also look at this project.

Scratch an SKNode and reveal the SKNode underneath it, an effect like wiping misted glass

Basically I want to achieve something like this: https://github.com/moqod/iOS-Scratch-n-See in SpriteKit.
In an SKScene, I have a car image, 2-3 different layers of dirt images, one layer of soap, and one layer of water droplets. By layers I mean they are all UIImages equal to the car frame's size (which I can eventually use as SKTextures and SKNodes).
The project mentioned above stacks UIImageViews on one another and then erases the images.
I need to manage many layers: if the soap tool is selected, I want to bring the dirt image layer to the front, erase the dirt image wherever the user touches, and place a semi-transparent soap image below it, which will then become visible, with the car image below that.
After merging them (half-erased dirt + soap + car image) I will get another image and display it on top, which gives the user the impression of applying soap to the car and removing the dirt.
I hope you can see what I am trying to explain.
I want to take the approach of the project mentioned above and achieve these tasks in SpriteKit.
I can't use zPosition to bring images to the front and send them back, as it works only on SKSpriteNode, and the example above is coded in UIKit (UIImages) to erase images, not nodes.
I can't stack transparent SKScenes on one another (see: Making a SKScene's background transparent not working... is this a bug?) the way UIImageViews are stacked in that project, as I am working on iOS 7 and want my application to be compatible with it.
The last resort would be to drop SpriteKit and work in UIKit.
Is there any way to swipe over an SKSpriteNode and make the swiped area transparent by changing its alpha value or something similar?
Any help or suggestions are most welcome. Thank you.
You can implement "scratch and see" using Sprite Kit's SKCropNode. An SKCropNode applies a mask to its child nodes. The mask is used to hide part or all of the crop node's children. For this app, the child node is the image you would like to uncover by "scratching."
The basic steps:
1. Start with an empty image as the texture for the mask.
2. Add circles to the mask where the user touches the hidden image, to uncover the picture below.
Here's an example of how to do that:
First, define these properties:
@property UIImage *image;
@property SKSpriteNode *maskNode;
@property SKNode *node;
Then add the contents of the scene in didMoveToView:.
-(void)didMoveToView:(SKView *)view {
    self.node = [SKNode node];
    _node.name = @"tree";
    // Create a node that will hold the image that's hidden and later uncovered by "scratching"
    CGPoint position = CGPointMake(CGRectGetWidth(self.frame)/2, CGRectGetHeight(self.frame)/2);
    SKSpriteNode *imageNode = [SKSpriteNode spriteNodeWithImageNamed:@"hidden_pic.png"];
    imageNode.position = CGPointZero;
    CGSize size = imageNode.size;
    // This is the layer that you "scratch" off
    SKSpriteNode *background = [SKSpriteNode spriteNodeWithColor:[SKColor grayColor] size:size];
    background.position = position;
    background.name = @"background";
    [_node addChild:background];
    // This is the mask node. Initialize it with an empty image, so it completely hides the image
    UIImage *image = [self blankImageWithSize:size];
    self.image = image;
    SKTexture *texture = [SKTexture textureWithImage:image];
    SKSpriteNode *maskNode = [SKSpriteNode spriteNodeWithTexture:texture];
    maskNode.position = CGPointZero;
    maskNode.name = @"mask";
    self.maskNode = maskNode;
    [_node addChild:maskNode];
    // This is the node that crops its children
    SKCropNode *cropNode = [SKCropNode node];
    cropNode.position = position;
    cropNode.maskNode = maskNode;
    cropNode.zPosition = 100;
    cropNode.name = @"crop";
    [_node addChild:cropNode];
    [cropNode addChild:imageNode];
    [self addChild:_node];
}
This creates an empty image. It is used as the initial mask image so that the picture is completely hidden.
- (UIImage *)blankImageWithSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
This method draws a circle on an image at a specified point. It is used to update the mask node's image. Each circle drawn on the mask uncovers more of the hidden picture.
#define kCircleRadius 22

- (UIImage *)imageByDrawingCircleOnImage:(UIImage *)image atPoint:(CGPoint)point
{
    UIGraphicsBeginImageContext(image.size);
    [image drawAtPoint:CGPointZero];
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Flip the context to match Sprite Kit's coordinate system
    CGContextScaleCTM(context, 1, -1);
    CGContextTranslateCTM(context, 0, -image.size.height);
    CGRect rect = CGRectMake(point.x - kCircleRadius, point.y - kCircleRadius,
                             kCircleRadius * 2, kCircleRadius * 2);
    UIBezierPath *circlePath = [UIBezierPath bezierPathWithOvalInRect:rect];
    [[UIColor blackColor] setFill];
    [circlePath fill];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This method converts the specified point to the mask node's coordinates, calls a method to draw a circle in the mask node, and updates the mask node's texture.
- (void)drawCircleInImageAtPoint:(CGPoint)point
{
    CGPoint location = [self convertPoint:point toNode:_maskNode];
    location = CGPointMake(location.x + _maskNode.size.width/2, location.y + _maskNode.size.height/2);
    UIImage *newImage = [self imageByDrawingCircleOnImage:_image atPoint:location];
    SKTexture *texture = [SKTexture textureWithImage:newImage];
    self.image = newImage;
    _maskNode.texture = texture;
}
These methods handle the touch events. They add circles to the mask node's image wherever the user touches the screen.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch begins */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        NSArray *nodes = [self nodesAtPoint:location];
        for (SKNode *node in nodes) {
            if ([node.name isEqualToString:@"crop"]) {
                [self drawCircleInImageAtPoint:location];
            }
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch moves */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        NSArray *nodes = [self nodesAtPoint:location];
        for (SKNode *node in nodes) {
            if ([node.name isEqualToString:@"crop"]) {
                [self drawCircleInImageAtPoint:location];
            }
        }
    }
}

Cropping ImageView to Produce a New Image

I am trying to crop an image whenever the user touches the UIImageView on the screen. The UIImageView is a 640 x 300 area, and I allow the user to touch anywhere in it. Then I use the following code to view the cropped image, and it always shows me the wrong image. I think I am having trouble getting the correct coordinates.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    UIImage *originalImage = self.imageView.image;
    NSLog(@"x = %f, y = %f", location.x, location.y);
    CGRect cropRegion = CGRectMake(location.x, location.y, 10, 10);
    CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
    UIImage *croppedImage = [UIImage imageWithCGImage:subImage];
}
You're getting the coordinates relative to self.view, not relative to the image view. Try:
CGPoint location = [[touches anyObject] locationInView:self.imageView];
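If the underlying image is larger than the 640 x 300 view, you will also need the view-to-image scaling described in the first question above. A minimal sketch, assuming scale-to-fill content mode:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.imageView];
    UIImage *originalImage = self.imageView.image;
    // Convert view points to image pixels before cropping.
    CGFloat scaleX = (originalImage.size.width * originalImage.scale) / self.imageView.bounds.size.width;
    CGFloat scaleY = (originalImage.size.height * originalImage.scale) / self.imageView.bounds.size.height;
    CGRect cropRegion = CGRectMake(location.x * scaleX, location.y * scaleY, 10, 10);
    CGImageRef subImage = CGImageCreateWithImageInRect(originalImage.CGImage, cropRegion);
    UIImage *croppedImage = [UIImage imageWithCGImage:subImage];
    // ...use croppedImage...
    CGImageRelease(subImage);
}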

Touchpoint coordinates in iOS are not mapped properly

I'm trying to create an application that allows the user to move a frame over an image, so that I can apply some effects on the selected region.
I need to allow the user to precisely drag and scale the masked frame on the image. I need this to be exact, just like any other photo app does it.
My strategy is to get the user's touch points on a touch-moved event and scale my frame accordingly. That was pretty intuitive. I coded the following for handling the touch-moved event:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:[self view]];
    float currX = touchPoint.x;
    float currY = touchPoint.y;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
But the only problem is that the currX and currY coordinates are not quite where they are supposed to be. There is a parallax error, which varies from device to device. I also think the x and y coordinates get swapped in the case of an iPad.
Could you please help me figure out how to get the exact touch coordinates?
My background image is in one view (imageBG) and the masked frame is in a separate one (maskBG). I have tried:
CGPoint touchPoint = [touch locationInView:[maskBG view]];
and
CGPoint touchPoint = [touch locationInView:[imageBG view]];
...but the same problem persists. I have also noticed the error on touch being worse on an iPad than on an iPhone or iPod.
image.center = [[[event allTouches] anyObject] locationInView:self.view];
Hi, your issue is that the image and the iPhone screen are not necessarily in the same aspect ratio, so your touch point might not translate correctly to your actual image.
- (UIImage *)getCroppedImage {
    CGRect rect = self.movingView.frame;
    CGPoint a;
    a.x = rect.origin.x - self.imageView.frame.origin.x;
    a.y = rect.origin.y - self.imageView.frame.origin.y;
    a.x = a.x * (self.imageView.image.size.width / self.imageView.frame.size.width);
    a.y = a.y * (self.imageView.image.size.height / self.imageView.frame.size.height);
    rect.origin = a;
    rect.size.width = rect.size.width * (self.imageView.image.size.width / self.imageView.frame.size.width);
    rect.size.height = rect.size.height * (self.imageView.image.size.height / self.imageView.frame.size.height);
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Translated rectangle for drawing the sub image
    CGRect drawRect = CGRectMake(-rect.origin.x, -rect.origin.y, self.imageView.image.size.width, self.imageView.image.size.height);
    // Clip to the bounds of the image context
    // (not strictly necessary, as it will get clipped anyway)
    CGContextClipToRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));
    // Draw the image
    [self.imageView.image drawInRect:drawRect];
    // Grab the cropped image
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}
This is what I did to crop: movingView's frame is the rect I pass for cropping; see how it is translated so it maps correctly onto the image. Make sure the image view on which the user sees the image uses aspect-fit content mode.
Note: I make the image view's rect fit the aspect-fit image. Use this to do it:
- (CGSize)makeSize:(CGSize)originalSize fitInSize:(CGSize)boxSize
{
    float widthScale = boxSize.width / originalSize.width;
    float heightScale = boxSize.height / originalSize.height;
    float scale = MIN(widthScale, heightScale);
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);
    return newSize;
}
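For instance, sizing the image view to match the aspect-fit image could look like this (a sketch; the view and property names are assumptions):
// Hypothetical usage: center an aspect-fit-sized image view in its superview.
CGSize fitted = [self makeSize:self.imageView.image.size fitInSize:self.view.bounds.size];
self.imageView.frame = CGRectMake((self.view.bounds.size.width - fitted.width) / 2,
                                  (self.view.bounds.size.height - fitted.height) / 2,
                                  fitted.width, fitted.height);
self.imageView.contentMode = UIViewContentModeScaleAspectFit;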
Have you tried this:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:selectedImageView];
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
Or you could also use a UIPanGestureRecognizer.
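A minimal sketch of the gesture-recognizer alternative (handlePan: is an assumed selector name; selectedImageView is the image view from the snippet above):
// Hypothetical setup, e.g. in viewDidLoad: track drags with a pan gesture instead of raw touches.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[selectedImageView addGestureRecognizer:pan];
selectedImageView.userInteractionEnabled = YES;

- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    // locationInView: reports the point in the image view's own coordinate space.
    CGPoint touchPoint = [recognizer locationInView:selectedImageView];
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    // ...proceed with currX and currY as before...
}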
