Cropping image from UIImagePickerController in selected area - iOS

How do I crop an image from UIImagePickerController to a selected area when capturing a photo? I have added an overlay view to the UIImagePickerController.

Below is the code:
-(UIImage *)centerCropImage:(UIImage *)image
{
// Use smallest side length as crop square length
CGFloat squareLength = MIN(image.size.width, image.size.height);
// Center the crop area
CGRect clippedRect = CGRectMake((image.size.width - squareLength) / 2, (image.size.height - squareLength) / 2, squareLength, squareLength);
// Crop logic
CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], clippedRect);
UIImage * croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return croppedImage;
}
After that, in the UIImagePickerControllerDelegate method:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
// Assign the cropped result; the original code discarded the return value
yourImage.image = [self centerCropImage:image];
[picker dismissViewControllerAnimated:YES completion:nil];
}
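The centering arithmetic in centerCropImage: can be checked in isolation. This is a minimal sketch in plain C; the Size and Rect structs are stand-ins for CGSize and CGRect, since Core Graphics isn't assumed here:

```c
#include <assert.h>

/* Stand-ins for CGSize/CGRect so the arithmetic runs anywhere. */
typedef struct { double width, height; } Size;
typedef struct { double x, y, width, height; } Rect;

/* Same math as centerCropImage:: use the smaller side as the square
   length and offset the origin so the square sits in the middle. */
static Rect center_square_rect(Size image) {
    double side = image.width < image.height ? image.width : image.height;
    Rect r = { (image.width - side) / 2.0,
               (image.height - side) / 2.0,
               side, side };
    return r;
}
```

For a 3264x2448 landscape photo this yields a 2448x2448 square starting 408 points in from the left edge.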

Related

Cropping UIImage Complications in iOS 8.4 [duplicate]

I'm currently cropping a UIImage in my - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info method like so:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
picker.allowsEditing = YES;
UIImage *img = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect cropRect = CGRectMake(0, 0, 2448, 3264);
UIImage *croppedImage = [img crop:cropRect];
imageToPass = croppedImage;
NSLog(@"Here's imageToPass: %@", imageToPass);
NSLog(@"and here's imageToPass' width: %f", imageToPass.size.width);
NSLog(@"and here's imageToPass' height: %f", imageToPass.size.height);
NSLog(@"and here's imageToPass' scale: %f", imageToPass.scale);
UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
[postView storeImage:imageToPass];
[self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
My problem is that when I print the width and height of my imageToPass variable, I find that the values are listed in points.
I need to get an image back that is cropped to 320x320. With my code CGRect cropRect = CGRectMake(0, 0, 2448, 3264); I'm taking the original size of the photo, which by default with UIImagePickerController is, I'm assuming, 320x520 or something like that. Using point values I can see that the image is 2448 points wide and 3264 points tall. From Google:
iPhone 5 display resolution is 1136 x 640 pixels. Measuring 4 inches diagonally, the new touch screen has an aspect ratio of 16:9 and is branded a Retina display with 326 ppi (pixels per inch).
I'm not sure what to do here. Does the math 2448 points / 640 px = 3.825 tell me that there are 3.825 points per pixel on a 326 ppi screen?
PS: keep in mind I'm trying to grab the 320x320 picture in the middle of the UIImagePickerControllerOriginalImage, which means cutting off some number of pixels from the top and bottom, determined in points, I'm assuming.
EDIT
Here's the code for the crop: method in the fourth line of code above:
#import "UIImage+Crop.h"
@implementation UIImage (Crop)
- (UIImage *)crop:(CGRect)rect {
rect = CGRectMake(rect.origin.x*self.scale,
rect.origin.y*self.scale,
rect.size.width*self.scale,
rect.size.height*self.scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], rect);
UIImage *result = [UIImage imageWithCGImage:imageRef
scale:self.scale
orientation:self.imageOrientation];
CGImageRelease(imageRef);
return result;
}
@end
I have found that if I set my crop rect with CGRect cropRect = CGRectMake(264, 0, 2448, 3000); //3264 it actually removes 264 points from the top and bottom of the image. I understand that an iPhone 5s has a screen density of 326 ppi (pixels per inch); how can I use this to successfully remove the number of pixels that I need to remove?
You don't need to convert between points, pixels, and Retina densities yourself, because the image carries a property called scale. You do need Core Graphics to do the actual crop, though. Here's what it could look like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
picker.allowsEditing = YES;
UIImage *img = [info objectForKey:UIImagePickerControllerOriginalImage];
// You want your rect in the center of your image, not at (0,0).
// UIImage has no center property, so compute the origin from the size.
CGFloat cropSize = 320;
CGRect cropRect = CGRectMake((img.size.width - cropSize) / 2,
                             (img.size.height - cropSize) / 2,
                             cropSize, cropSize);
// Make a new CGImageRef, build the cropped UIImage with the image's own
// scale and orientation (this is how you handle Retina/orientation),
// then release the image ref.
CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], cropRect);
imageToPass = [UIImage imageWithCGImage:imageRef
                                  scale:img.scale
                            orientation:img.imageOrientation];
CGImageRelease(imageRef);
UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
[postView storeImage:imageToPass];
[self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
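The point/pixel bookkeeping the question worries about reduces to one multiplication: CGImageCreateWithImageInRect works in pixel coordinates, while UIImage reports sizes in points, so a point-space rect is converted by multiplying every component by the image's scale (exactly what the crop: category above does). A small sketch of that arithmetic in plain C, with an illustrative scale of 2.0:

```c
#include <assert.h>

typedef struct { double x, y, width, height; } Rect;

/* Convert a rect expressed in points to pixel space by multiplying
   every component by the image's scale factor. */
static Rect points_to_pixels(Rect r, double scale) {
    Rect out = { r.x * scale, r.y * scale,
                 r.width * scale, r.height * scale };
    return out;
}
```

So a 320-point square on a 2x Retina image becomes a 640-pixel square in CGImage space.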

Having trouble combining two images using UIGraphics, images not correctly positioned

I am trying to make a simple app that combines the camera overlay with the photo taken in UIImagePickerController. In this example I want to combine the overlay view of bear ears with the photo.
- (IBAction)pushTakePhoto:(id)sender {
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.cameraDevice=UIImagePickerControllerCameraDeviceFront;
//create overlay UIView
UIImage *bearEars = [UIImage imageNamed:@"bearEars"];
UIImageView *imageView = [[UIImageView alloc] initWithImage:bearEars];
UIView *camerOverlayView = [[UIView alloc] initWithFrame:imageView.frame];
[camerOverlayView addSubview:imageView];
[picker setCameraOverlayView:camerOverlayView];
[self presentViewController:picker animated:YES completion:NULL];
}
The code above creates the overlay, and it works fine. The code below combines the images:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
UIImage *chosenImage = info[UIImagePickerControllerOriginalImage];
//first we flip the image
UIImage * flippedImage = [UIImage imageWithCGImage:chosenImage.CGImage scale:chosenImage.scale orientation:UIImageOrientationLeftMirrored];
chosenImage=flippedImage;
//now we need to add the ears
//get pics
UIImage *backgroundImage = chosenImage;
UIImage *watermarkImage = [UIImage imageNamed:@"bearEars"];
//start a workspace
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
//position seems off for some reason
[watermarkImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
chosenImage=result;
self.finalImageView.image = chosenImage;
[picker dismissViewControllerAnimated:YES completion:NULL];
}
When I combine the images however the bearEars overlay is not positioned at (0,0) but instead in the middle of the final image. I'm not sure why this is happening. I just want it to overlay the images like in the camera view. Any ideas why this is happening? Thanks
I think the image appears centered because you are using the same rect size in the drawInRect: call for both images.
Try changing the size from backgroundImage.size to watermarkImage.size for watermarkImage:
[watermarkImage drawInRect:CGRectMake(0, 0, watermarkImage.size.width, watermarkImage.size.height)];
EDITED:
To draw watermarkImage at the correct size you need a scale factor: the ratio between the final image's size and its size in the on-screen preview. You can get it by dividing the width of backgroundImage by the width of self.view:
CGFloat scale = backgroundImage.size.width / self.view.frame.size.width;
Then you can use this value during drawing watermarkImage:
[watermarkImage drawInRect:CGRectMake(0, 0, watermarkImage.size.width * scale, watermarkImage.size.height * scale)];
Hope it helps.
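The ratio in the answer above can be sanity-checked numerically. A minimal sketch in plain C, assuming an illustrative 2448-pixel-wide photo previewed in a 320-point-wide view:

```c
#include <assert.h>
#include <math.h>

/* Scale between the full-resolution photo and the on-screen preview:
   the ratio of the photo width to the view width. */
static double preview_scale(double photo_width, double view_width) {
    return photo_width / view_width;
}
```

A 100-point-wide watermark in the preview would then need to be drawn 765 pixels wide in the full-resolution output.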

How to crop the center part of a UIImage in a circular, square, or triangular shape

I am using the code below to crop the image to a square:
- (void)imagePickerController:(UIImagePickerController *)picker1 didFinishPickingMediaWithInfo:(NSDictionary *)info {
tatooImage = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImageView *imageView = [[UIImageView alloc] initWithImage:tatooImage];
CGSize size = [tatooImage size];
[imageView setFrame:CGRectMake(0, 0, size.width, size.height)];
CGRect rect = CGRectMake(size.width / 4, size.height / 4 ,
(size.width / 2), (size.height / 2));
self.imageOverlay.image = [self croppedImage:tatooImage cropRect:rect];
}
- (UIImage *)croppedImage:(UIImage *)image cropRect:(CGRect)cropRect
{
CGImageRef croppedCGImage = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedCGImage scale:1.0f orientation:image.imageOrientation];
CGImageRelease(croppedCGImage);
return [croppedImage fixOrientation];
}
The problem I am facing is that I am not getting the center part of the image.
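One thing worth checking, consistent with the crop: category earlier on this page: CGImageCreateWithImageInRect takes a rect in pixels, while [tatooImage size] is in points, so on a Retina image the quarter-offset rect must also be multiplied by the image's scale or the crop lands in the wrong region. A sketch of the combined arithmetic in plain C, with illustrative numbers:

```c
#include <assert.h>

typedef struct { double x, y, width, height; } Rect;

/* Center-half rect (origin at size/4, extent size/2), converted to
   pixel space with the image's scale factor. */
static Rect center_half_rect_px(double w, double h, double scale) {
    Rect r = { (w / 4.0) * scale, (h / 4.0) * scale,
               (w / 2.0) * scale, (h / 2.0) * scale };
    return r;
}
```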

UIImagePickerController ImageIO_PNG takes massive memory

Although I resize my images once the UIImagePickerController has finished taking a photo, my Instruments profile shows that calls to ImageIO_PNG take massive amounts of memory (40 MB+) each time I take a photo. This is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
@autoreleasepool {
if (myImageView.image == nil) {
myImageView.contentMode = UIViewContentModeScaleAspectFill;
UIImage *topImage = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
CGRect rect = CGRectMake(0, 0, 320, 440);
UIGraphicsBeginImageContext(rect.size);
// use the local image variable to draw in context
[topImage drawInRect:rect];
UIImage *topResized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
myImageView.image = topResized;
image = nil;
info = nil;
[picker dismissViewControllerAnimated:NO completion:nil];
[picker removeFromParentViewController];
}
}
}
Remove the lines below from your code; that might help:
image = nil;
info = nil;
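The 40 MB+ spikes are plausible even before the resize happens: decoding the full-resolution photo into a bitmap costs roughly width x height x 4 bytes (RGBA). A back-of-the-envelope check in plain C, using a typical 8-megapixel capture as the illustrative size:

```c
#include <assert.h>

/* Approximate decoded-bitmap cost: 4 bytes (RGBA) per pixel. */
static unsigned long long bitmap_bytes(unsigned long width, unsigned long height) {
    return (unsigned long long)width * height * 4ULL;
}
```

A 3264x2448 photo decodes to about 32 MB, so one full-size copy plus drawing scratch space is already in the observed range; the resized 320x440 result, by contrast, is only about half a megabyte.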

Importing an image from camera roll and cropping it

I load an image from the camera roll and send it to a function which returns a section of the original image based on a rectangle I create. I can then add this to the scene with no issues.
The problem occurs when I want to load another section of the original image. I want to create 2 sprites, each with different sections of the original image, like a jigsaw, but when I send the original image and a different rectangle, I get the same image as the first time and have two images the same added to the scene.
Any ideas would be appreciated.
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
// newImage is a UIImage; do not try to use a UIImageView
UIImage *newImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
UIImage *newImage2 = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
//newImage2 = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
// Dismiss UIImagePickerController and release it
[picker dismissModalViewControllerAnimated:YES];
[picker.view removeFromSuperview];
[picker release];
CGRect newRect = CGRectMake(0, 0, 320, 460);
CGRect newRect2 = CGRectMake(600, 600, 100, 100);
UIImage *testImage = [self imageFromImage:newImage inRect:newRect];
UIImage *testImage2 = [self imageFromImage:newImage2 inRect:newRect2];
CCSprite *scaledImage = [CCSprite spriteWithCGImage:testImage.CGImage key:@"ImageFromPicker"];
scaledImage.position = ccp(s.width/2, s.height/2);
[self addChild:scaledImage];
CCSprite *scaledImage2 = [CCSprite spriteWithFile:@"play.png"];//[CCSprite spriteWithCGImage:testImage2.CGImage key:@"ImageFromPicker"];
scaledImage2.position = ccp(560,40);
[self addChild:scaledImage2];
}
And the method that crops the image:
- (UIImage *)imageFromImage:(UIImage *)image inRect:(CGRect)rect
{
CGImageRef sourceImageRef = [image CGImage];
CGImageRef newImageRef = CGImageCreateWithImageInRect(sourceImageRef, rect);
UIImage *tempImage = [UIImage imageWithCGImage:newImageRef];
CGImageRelease(newImageRef); // the original version leaked this image ref
return tempImage;
}
Thanks to LearnCocos, you were spot on with your answer. I created two separate textures from different parts of the larger image and could then add them separately.
