Set maximum image resolution for PickerController? - ios

I want to take an image with a fixed resolution. The user needs to be forced to crop the image; I need a square image.
I'm using the following code, but it just crops to the screen's width resolution:
self.imagepicker = [[UIImagePickerController alloc] init];
self.imagepicker.delegate = self;
self.imagepicker.allowsEditing = YES;
self.imagepicker.preferredContentSize = CGSizeMake(200, 200);
self.imagepicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
[self presentViewController:self.imagepicker animated:YES completion:nil];

preferredContentSize is not a property of the UIImagePickerController itself, but of any UIViewController, for when it is embedded in a UIPopoverController.
You'll need to present a cropping interface yourself.
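For reference, since allowsEditing is already YES, the picker presents Apple's built-in move-and-scale square crop, and the cropped result arrives in the delegate under the UIImagePickerControllerEditedImage key. A minimal sketch of that delegate method (the 200x200 target size is carried over from the question):
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // The square image produced by the built-in crop UI (falls back to the original)
    UIImage *picked = info[UIImagePickerControllerEditedImage]
                   ?: info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    // Redraw at the fixed 200x200 resolution the question asks for
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(200, 200), YES, 0);
    [picked drawInRect:CGRectMake(0, 0, 200, 200)];
    UIImage *squared = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // ... use `squared` ...
}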

If you want to do it after selecting the image, please try this code to crop the image at the center:
- (UIImage *)squareImageFromImage:(UIImage *)image scaledToSize:(CGFloat)newSize
{
    CGAffineTransform scaleTransform;
    CGPoint origin;
    if (image.size.width > image.size.height) {
        // Landscape: scale so the height maps to newSize, shift left to centre the crop
        CGFloat scaleRatio = newSize / image.size.height;
        scaleTransform = CGAffineTransformMakeScale(scaleRatio, scaleRatio);
        origin = CGPointMake(-(image.size.width - image.size.height) / 2.0f, 0);
    } else {
        // Portrait or square: scale so the width maps to newSize, shift up to centre the crop
        CGFloat scaleRatio = newSize / image.size.width;
        scaleTransform = CGAffineTransformMakeScale(scaleRatio, scaleRatio);
        origin = CGPointMake(0, -(image.size.height - image.size.width) / 2.0f);
    }
    CGSize size = CGSizeMake(newSize, newSize);
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(size, YES, 0);
    } else {
        UIGraphicsBeginImageContext(size);
    }
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextConcatCTM(context, scaleTransform);
    [image drawAtPoint:origin];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
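A hypothetical call site, e.g. from the picker delegate, requesting the 200-point square from the question:
UIImage *square = [self squareImageFromImage:info[UIImagePickerControllerOriginalImage]
                                scaledToSize:200.0f];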
Hope it helps you. Thanks.

Related

issue with cropImage method on image in UIScrollView yields wrong image

I am using the following crop method to crop a UIImage that's sitting in a UIImageView, which is in turn sitting in a UIScrollView.
- (UIImage *)cropImage:(UIImage *)image
{
    float scale = 1.0f / _scrollView.zoomScale;
    NSLog(@"Oh and heres that zoomScale: %f", _scrollView.zoomScale);
    CGRect visibleRect;
    visibleRect.origin.x = _scrollView.contentOffset.x * scale;
    visibleRect.origin.y = _scrollView.contentOffset.y * scale;
    visibleRect.size.width = _scrollView.bounds.size.width * scale;
    visibleRect.size.height = _scrollView.bounds.size.height * scale;
    NSLog(@"Oh and here's that CGRect: %f", visibleRect.origin.x);
    NSLog(@"Oh and here's that CGRect: %f", visibleRect.origin.y);
    NSLog(@"Oh and here's that CGRect: %f", visibleRect.size.width);
    NSLog(@"Oh and here's that CGRect: %f", visibleRect.size.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], visibleRect);
    UIImage *croppedImage = [[UIImage alloc] initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}
I need the image to be cropped to a CGSize of (321, 115). Upon cropping the image and checking the printed results, I can see that visibleRect is (0, 0, 321, 115), which is what it is supposed to be, and the resulting croppedImage has width 321 and height 115. For some reason, however, the image appears far too zoomed in (the method cropped a much smaller portion of the original image to a size of 321x115).
Why is this method not correctly cropping my image?
As a side note: I call this method like so: _croppedImage = [self cropImage:_imageView.image]; which sets a UIImage property of a custom UIView class to the cropped image.
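One thing worth checking: CGImageCreateWithImageInRect works in the image's pixel coordinates, while contentOffset and bounds are in points, so on a Retina device the rect must also be multiplied by the UIImage's scale. A minimal sketch of that adjustment, using the names from the method above:
// CGImageCreateWithImageInRect expects pixels; points must be scaled up
// by the UIImage's own scale factor on Retina screens.
CGFloat imageScale = image.scale;
CGRect pixelRect = CGRectMake(visibleRect.origin.x * imageScale,
                              visibleRect.origin.y * imageScale,
                              visibleRect.size.width * imageScale,
                              visibleRect.size.height * imageScale);
CGImageRef imageRef = CGImageCreateWithImageInRect(image.CGImage, pixelRect);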
Please try this function; it may help you.
Parameters:
UIImage - the source image
CGSize - the target size, e.g. (321, 115) or any size
// Crop image - the crop is taken from the full image
- (UIImage *)cropImageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    double ratio;
    double delta;
    CGPoint offset;
    // Make a new square size that is the resized image's width
    CGSize sz = CGSizeMake(newSize.width, newSize.width);
    // Figure out if the picture is landscape or portrait, then
    // calculate the scale factor and offset
    if (image.size.width > image.size.height) {
        ratio = newSize.width / image.size.width;
        delta = (ratio * image.size.width - ratio * image.size.height);
        offset = CGPointMake(delta / 2, 0);
    } else {
        ratio = newSize.width / image.size.height;
        delta = (ratio * image.size.height - ratio * image.size.width);
        offset = CGPointMake(0, delta / 2);
    }
    // Make the final clipping rect based on the calculated values
    CGRect clipRect = CGRectMake(-offset.x,
                                 -offset.y,
                                 (ratio * image.size.width) + delta,
                                 (ratio * image.size.height) + delta);
    // Start a new context with scale factor 0.0 so Retina displays get
    // a high-quality image
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
        UIGraphicsBeginImageContextWithOptions(sz, YES, 0.0);
    } else {
        UIGraphicsBeginImageContext(sz);
    }
    UIRectClip(clipRect);
    [image drawInRect:clipRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
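A hypothetical call with the size from the question:
UIImage *cropped = [self cropImageWithImage:pickedImage scaledToSize:CGSizeMake(321, 115)];
Note that, as written, the function only uses newSize.width (the square sz), so the result is a square; for a true 321x115 output, the height would have to factor into sz and clipRect as well.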
To crop only a selected portion of the image, please check this link.

Take squarish photos with UIImagePickerController

Is there an elegant way to only allow the user to take squarish photos with UIImagePickerController instead of the default rectangular ones? Something open source, maybe?
Here's the easiest way to do it (without reimplementing UIImagePickerController). First, use an overlay to make the camera field look square. Here's an example for 3.5" screens (you'd need to update it to work for iPhone 5):
UIImagePickerController *imagePickerController = [[UIImagePickerController alloc] init];
imagePickerController.sourceType = source;
if (source == UIImagePickerControllerSourceTypeCamera) {
    // Create a camera overlay: dark bars above and below a square window
    CGRect f = imagePickerController.view.bounds;
    f.size.height -= imagePickerController.navigationBar.bounds.size.height;
    CGFloat barHeight = (f.size.height - f.size.width) / 2;
    UIGraphicsBeginImageContext(f.size);
    [[UIColor colorWithWhite:0 alpha:.5] set];
    UIRectFillUsingBlendMode(CGRectMake(0, 0, f.size.width, barHeight), kCGBlendModeNormal);
    UIRectFillUsingBlendMode(CGRectMake(0, f.size.height - barHeight, f.size.width, barHeight), kCGBlendModeNormal);
    UIImage *overlayImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageView *overlayIV = [[UIImageView alloc] initWithFrame:f];
    overlayIV.image = overlayImage;
    // cameraOverlayView is nil by default, so assign it rather than adding a subview
    imagePickerController.cameraOverlayView = overlayIV;
}
imagePickerController.delegate = self;
[self presentViewController:imagePickerController animated:YES completion:nil];
Then, after you get a picture back from the UIImagePickerController, crop it to a square with something like this:
// Crop the image to a square
CGSize imageSize = image.size;
CGFloat width = imageSize.width;
CGFloat height = imageSize.height;
if (width != height) {
    CGFloat newDimension = MIN(width, height);
    CGFloat widthOffset = (width - newDimension) / 2;
    CGFloat heightOffset = (height - newDimension) / 2;
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(newDimension, newDimension), NO, 0.);
    [image drawAtPoint:CGPointMake(-widthOffset, -heightOffset)
             blendMode:kCGBlendModeCopy
                 alpha:1.];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
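A hypothetical delegate method showing where that snippet would run:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    // ... run the square-cropping snippet above on `image` here ...
    [picker dismissViewControllerAnimated:YES completion:nil];
}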
Maybe it will help you.

CGContext - scale and then crop image at point

Similar to Instagram I have a square crop view (UIScrollView) that has a UIImageView inside it. So the user can drag a portrait or landscape image inside the square rect (equal to the width of the screen) and then the image should be cropped at the scroll offset. The UIImageView is set to aspect fit. The UIScrollView content size is set to a scale factor for either landscape or portrait, so that it correctly renders with aspect fit ratio.
When the user is done dragging, I want to scale the image up based on a given size, let's say a 1000x1000px square, and then crop it at the scroll offset (using UIImage's drawAtPoint:).
The problem is I can't get the math right to get the right offset point. If I get it close on a 6+ it will be way off on a 4S.
Here's my code for the scale and crop:
- (UIImage *)squareImageFromImage:(UIImage *)image scaledToSize:(CGFloat)newSize {
    CGAffineTransform scaleTransform;
    CGPoint origin;
    if (image.size.width > image.size.height) {
        // Landscape
        CGFloat scaleRatio = newSize / image.size.height;
        scaleTransform = CGAffineTransformMakeScale(scaleRatio, scaleRatio);
        origin = CGPointMake((int)(-self.scrollView.contentOffset.x * scaleRatio), 0);
    } else if (image.size.width < image.size.height) {
        // Portrait
        CGFloat scaleRatio = newSize / image.size.width;
        scaleTransform = CGAffineTransformMakeScale(scaleRatio, scaleRatio);
        origin = CGPointMake(0, (int)(-self.scrollView.contentOffset.y * scaleRatio));
    } else {
        // Square
        CGFloat scaleRatio = newSize / image.size.width;
        scaleTransform = CGAffineTransformMakeScale(scaleRatio, scaleRatio);
        origin = CGPointMake(0, 0);
    }
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(newSize, newSize), YES, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextConcatCTM(context, scaleTransform);
    [image drawAtPoint:origin];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
So, for example, with a landscape image: if I drag the scroll view left so that the image is cropped all the way to the right, my offset will be close on a 6 Plus, but on a 4S it will be off by about 150-200 points.
Here is my code for setting up the scroll view and image view:
CGRect cropRect = CGRectMake(0.0f, 0.0, SCREEN_WIDTH, SCREEN_WIDTH);
CGFloat ratio = (int)self.image.size.height / self.image.size.width;
CGRect r = CGRectMake(0.0, 0.0, SCREEN_WIDTH, SCREEN_WIDTH);
if (ratio > 1.00) {
    // Portrait
    r = CGRectMake(0.0, 0.0, SCREEN_WIDTH, (int)(SCREEN_WIDTH * ratio));
} else if (ratio < 1.00) {
    // Landscape
    CGFloat size = (int)self.image.size.width / self.image.size.height;
    cropOffset = (SCREEN_WIDTH * size) - SCREEN_WIDTH;
    r = CGRectMake(0.0, 0.0, (int)(SCREEN_WIDTH * size), SCREEN_WIDTH);
}
NSLog(@"r.size.height == %.4f", r.size.height);
self.scrollView.frame = cropRect;
self.scrollView.contentSize = r.size;
self.imageView = [[UIImageView alloc] initWithFrame:r];
self.imageView.backgroundColor = [UIColor clearColor];
self.imageView.contentMode = UIViewContentModeScaleAspectFit;
self.imageView.image = self.image;
[self.scrollView addSubview:self.imageView];
Cropping math can be tricky. It's been a while since I've had to deal with this, so hopefully I'm pointing you in the right direction. Here is a chunk of code from Pixology that grabs a scaled visible rect from a UIScrollView. I think the missing ingredient here might be zoomScale.
CGRect visibleRect;
visibleRect.origin = _scrollView.contentOffset;
visibleRect.size = _scrollView.bounds.size;
// figure in the scale
float theScale = 1.0 / _scrollView.zoomScale;
visibleRect.origin.x *= theScale;
visibleRect.origin.y *= theScale;
visibleRect.size.width *= theScale;
visibleRect.size.height *= theScale;
You may also need to figure in device screen scale:
CGFloat screenScale = [[UIScreen mainScreen] scale];
See how far you can get with this info, and let me know.
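As an untested sketch of how that could plug into the squareImageFromImage: method above: convert the scroll offset from view points into image points (folding in zoomScale) before the scale transform is applied, instead of multiplying the offset by scaleRatio directly:
// Untested sketch: contentOffset is in view points; convert to image points.
// Because aspect fit exactly fills the image view here, one factor serves both axes.
CGFloat viewToImage = image.size.width /
                      (self.imageView.bounds.size.width * self.scrollView.zoomScale);
CGPoint origin = CGPointMake(-self.scrollView.contentOffset.x * viewToImage,
                             -self.scrollView.contentOffset.y * viewToImage);
// drawAtPoint: then consumes `origin` in image points, because the
// concatenated CTM already applies scaleRatio.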

When using avfoundation - crop visible portion of captured image represented in display layer

I am using the PBJVision library to capture images.
Under the hood it uses AVFoundation.
I set up the camera with the following options:
PBJVision *camera = [[PBJVision alloc] init];
self.camera = camera;
self.camera.delegate = self;
[self.camera setCameraMode:PBJCameraModePhoto];
[self.camera setCameraDevice:PBJCameraDeviceFront];
[self.camera setCameraOrientation:PBJCameraOrientationPortrait];
[self.camera setFocusMode:PBJFocusModeAutoFocus];
[self.camera setPresentationFrame:self.previewView.frame];
[self.camera previewLayer].frame = self.previewView.bounds;
[self.camera previewLayer].videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.previewView.layer addSublayer:[self.camera previewLayer]];
Because the preview layer's gravity is set to AVLayerVideoGravityResizeAspectFill, the captured image isn't identical to the previewed image.
How can I crop it according to the video gravity?
Based on Erica Sadun's excellent Cookbook, adding the code below to your view controller will let you do:
UIImage *newImage = [self applyAspectFillImage:image InRect:self.previewView.bounds];
You can obtain the maximum image size by using the smaller edge (width or height) of the original photo image to size your destination rectangle:
CGFloat scaleW = image.size.width / previewView.bounds.size.width;
CGRect destRect = CGRectMake(0, 0, image.size.width, previewView.bounds.size.height * scaleW);
UIImage *newImage = [self applyAspectFillImage:image InRect:destRect];
The code:
CGRect CGRectCenteredInRect(CGRect rect, CGRect mainRect)
{
    CGFloat xOffset = CGRectGetMidX(mainRect) - CGRectGetMidX(rect);
    CGFloat yOffset = CGRectGetMidY(mainRect) - CGRectGetMidY(rect);
    return CGRectOffset(rect, xOffset, yOffset);
}

// Calculate the destination scale for filling
CGFloat CGAspectScaleFill(CGSize sourceSize, CGRect destRect)
{
    CGSize destSize = destRect.size;
    CGFloat scaleW = destSize.width / sourceSize.width;
    CGFloat scaleH = destSize.height / sourceSize.height;
    return MAX(scaleW, scaleH);
}

CGRect CGRectAspectFillRect(CGSize sourceSize, CGRect destRect)
{
    CGSize destSize = destRect.size;
    CGFloat destScale = CGAspectScaleFill(sourceSize, destRect);
    CGFloat newWidth = sourceSize.width * destScale;
    CGFloat newHeight = sourceSize.height * destScale;
    CGFloat dWidth = ((destSize.width - newWidth) / 2.0f);
    CGFloat dHeight = ((destSize.height - newHeight) / 2.0f);
    return CGRectMake(dWidth, dHeight, newWidth, newHeight);
}

- (UIImage *)applyAspectFillImage:(UIImage *)image InRect:(CGRect)bounds
{
    UIGraphicsBeginImageContext(bounds.size);
    CGRect rect = CGRectAspectFillRect(image.size, bounds);
    CGRect destRect = CGRectCenteredInRect(rect, bounds);
    [image drawInRect:destRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
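A hypothetical call site (illustrative only), once the captured UIImage arrives from PBJVision's delegate callback:
UIImage *visible = [self applyAspectFillImage:capturedImage
                                       InRect:self.previewView.bounds];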

Merge Two UIImage that are Rotated & Scaled

I'm making an app that does something like this: you load a photo and put images over it, like balloons, etc.
When I merge one of these overlay images with only a resize, it works fine (it comes out about 10px bigger than it should be, but that's no problem).
The problem comes when you rotate the image (the UIImageView): in the merged result it appears much bigger than the image actually is. I have tried a lot of things and nothing works. I'll leave the code below; I hope someone can help.
Note: the image size is taken from inside the UIImageView, then multiplied by the scale of the main image.
- (UIImage *)mergeImage:(UIImageView *)mainImage withImageView:(UIImageView *)imageView {
    UIImage *temp = imageView.image;
    UIImage *tempMain = mainImage.image;
    CGFloat mainScale = [self imageViewScaleFactor:mainImage];
    CGFloat tempScale = 1 / mainScale;
    NSLog(@"%f", tempScale);
    // Rotate the overlay UIImage
    UIGraphicsBeginImageContext(temp.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGAffineTransform transform = CGAffineTransformIdentity;
    transform = CGAffineTransformTranslate(transform, temp.size.width / 2, temp.size.height / 2);
    CGFloat angle = atan2(imageView.transform.b, imageView.transform.a);
    transform = CGAffineTransformRotate(transform, angle);
    transform = CGAffineTransformScale(transform, 1.0, -1.0);
    CGContextConcatCTM(ctx, transform);
    // Draw the image into the context
    CGContextDrawImage(ctx, CGRectMake(-temp.size.width / 2, -temp.size.height / 2, temp.size.width, temp.size.height), temp.CGImage);
    // Get an image from the context
    temp = [UIImage imageWithCGImage:CGBitmapContextCreateImage(ctx)];
    NSLog(@"%f %f %f", mainScale, mainImage.frame.size.width, mainImage.frame.size.height);
    UIGraphicsBeginImageContextWithOptions(tempMain.size, NO, 1.0f);
    // Get the imageView size & position
    NSLog(@"%f %f %f %f", imageView.frame.origin.x, imageView.frame.origin.y, imageView.frame.size.width, imageView.frame.size.height);
    CGFloat offsetX = 0;
    CGFloat offsetY = -44;
    if (tempMain.size.height > tempMain.size.width) {
        offsetX = ((tempMain.size.width * mainScale) - 320) / 2;
    } else {
        offsetY = ((tempMain.size.height * mainScale) - 416) / 2;
        offsetY -= 44;
    }
    CGFloat imageViewX = (imageView.frame.origin.x + offsetX) * tempScale;
    CGFloat imageViewY = (imageView.frame.origin.y + offsetY) * tempScale;
    CGFloat imageViewW = imageView.frame.size.width * tempScale;
    CGFloat imageViewH = imageView.frame.size.height * tempScale;
    CGRect tempRect = CGRectMake(imageViewX, imageViewY, imageViewW, imageViewH);
    [tempMain drawAtPoint:CGPointZero];
    [temp drawInRect:tempRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Thanks
This is the solution that worked for me:
Merging a previously rotated-by-gesture UIImageView with another one. WYS is not WYG.
I just take a snapshot of the main screen and then crop it to the frame of the photo; it's faster and cleaner. The resolution is fine if the app runs on a Retina device, but on a non-Retina device it isn't great, and you need to prepare the code to work on both Retina and non-Retina screens.
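A minimal sketch of that snapshot-and-crop idea (untested; containerView and photoFrame are assumed names, with containerView holding the photo plus its overlays and photoFrame being the photo's frame inside it):
// Snapshot the composed view hierarchy at screen scale
UIGraphicsBeginImageContextWithOptions(containerView.bounds.size, NO,
                                       [UIScreen mainScreen].scale);
[containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Crop in pixel coordinates (the snapshot's scale accounts for Retina)
CGRect pixelRect = CGRectMake(photoFrame.origin.x * snapshot.scale,
                              photoFrame.origin.y * snapshot.scale,
                              photoFrame.size.width * snapshot.scale,
                              photoFrame.size.height * snapshot.scale);
CGImageRef cgCrop = CGImageCreateWithImageInRect(snapshot.CGImage, pixelRect);
UIImage *merged = [UIImage imageWithCGImage:cgCrop
                                      scale:snapshot.scale
                                orientation:UIImageOrientationUp];
CGImageRelease(cgCrop);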
