Cropping UIImage from photo library vs iPhone camera - ios

I can crop images from the iPhone's photo library no problem like this:
CGRect topRect = CGRectMake(0, 0, image.size.width, image.size.height / 2);
CGImageRef topCroppedCGImageRef = CGImageCreateWithImageInRect(image.CGImage,
topRect);
UIImage *croppedImage = [[UIImage alloc] initWithCGImage:topCroppedCGImageRef];
CGImageRelease(topCroppedCGImageRef);
However, this doesn't work when the image comes from the camera. Specifically, the cropped image is rotated and the cropped portion isn't what I expect. After reading around, it sounds like this problem is relatively common. However, I've tried the various code fixes and they're not quite working (I still have rotation, unexpected cropping and even distortion issues). So I'd like to actually understand why the above cropping doesn't just work for images coming from the camera.
Why doesn't the above cropping method work on images coming from the iPhone's camera?

As pointed out in this famous post, Resize a UIImage the right way, this is because the code above leaves out functionality such as EXIF orientation support, an absolute necessity when dealing with photographs taken by the iPhone's camera.
By default (picture taken in portrait), the image has an EXIF orientation flag of 6, which means the stored pixel data is rotated 90 degrees counterclockwise relative to the upright scene, and viewers are expected to rotate it 90 degrees clockwise on display:
$ identify -format "%[EXIF:orientation]" myimage.jpg
6
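In practice the fix is to redraw the photo into an upright context before cropping, because CGImageCreateWithImageInRect operates on the raw (un-rotated) pixel data and ignores the orientation flag entirely. A minimal sketch of the idea; the helper name is mine, and it also accounts for CGImage working in pixels while UIImage.size is in points:
#import <UIKit/UIKit.h>

// Hypothetical helper: normalize orientation, then crop the top half.
static UIImage *CroppedTopHalf(UIImage *image) {
    // Redraw upright; drawInRect: applies the imageOrientation transform.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:(CGRect){CGPointZero, image.size}];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // CGImage coordinates are pixels, so multiply the point size by scale.
    CGRect topRect = CGRectMake(0, 0,
                                upright.size.width * upright.scale,
                                upright.size.height * upright.scale / 2);
    CGImageRef topCGImage = CGImageCreateWithImageInRect(upright.CGImage, topRect);
    UIImage *cropped = [UIImage imageWithCGImage:topCGImage
                                           scale:upright.scale
                                     orientation:UIImageOrientationUp];
    CGImageRelease(topCGImage);
    return cropped;
}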

Related

Swift camera issue

I have a UIView that works as a camera preview, which is 320x180, and a UIImageView of the same size.
When I take a photo, it generates a UIImage of size 1080x1920, so when I show it in the imageView the photo is very compressed along its height, because the photo is very tall. It is like this:
██████ the black rectangle is the whole photo (1080x1920)
██████
█▒▒▒▒█ the gray is what camera show in screen
██████ (it shows only gray part but it stores
██████ all the black part 1080x1920)
I would like to store it as a UIImage exactly as I see it in the gray rectangle.
I'm not sure how to do this, since the photo is much bigger than the resolution of the screen (which is 320 x 568), so it is hard to crop correctly (and the crop is also rotating the image and introducing other bugs).
1080/6 = 180 and 1920/6 = 320, so the values are in the correct aspect ratio, but they are swapped: the photo is rotated 90 degrees relative to the preview. You need to apply the correct rotation to the image before cropping.
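A sketch of the cropping step, assuming the photo has first been normalized to UIImageOrientationUp (for instance by redrawing it, as in the first answer above); the function name and the centered aspect-fill choice are mine:
// Crop an upright photo to the preview view's aspect ratio, centered,
// so the stored image matches what the gray rectangle showed.
static UIImage *CropToPreviewAspect(UIImage *photo, CGSize previewSize) {
    CGFloat pixelW = photo.size.width * photo.scale;
    CGFloat pixelH = photo.size.height * photo.scale;
    CGFloat targetAspect = previewSize.width / previewSize.height;

    CGRect crop;
    if (pixelW / pixelH > targetAspect) {
        // Photo is wider than the preview: trim the sides.
        CGFloat w = pixelH * targetAspect;
        crop = CGRectMake((pixelW - w) / 2, 0, w, pixelH);
    } else {
        // Photo is taller than the preview: trim top and bottom.
        CGFloat h = pixelW / targetAspect;
        crop = CGRectMake(0, (pixelH - h) / 2, pixelW, h);
    }
    CGImageRef cg = CGImageCreateWithImageInRect(photo.CGImage, crop);
    UIImage *result = [UIImage imageWithCGImage:cg
                                          scale:photo.scale
                                    orientation:UIImageOrientationUp];
    CGImageRelease(cg);
    return result;
}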

App crashed when I display a large image in a UIImageView

I set a 10000 x 10000 pixel image on a UIImageView, loaded from the network with SDWebImage, and the app crashed because it allocated too much memory. I tried to resize the image after SDWebImage loaded it, so I added the code below:
// In a UIImage category, drawing self into a smaller context of `size`.
UIGraphicsBeginImageContext(size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextClearRect(context, CGRectMake(0, 0, size.width, size.height));
// The second argument is a CGInterpolationQuality enum, not a float.
CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
[self drawInRect:CGRectMake(0, 0, size.width, size.height) blendMode:kCGBlendModeNormal alpha:1];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Although the resulting image was smaller, my app crashed for the same reason.
It seems that some rendering happens during the resize: memory rose to about 600 MB and then fell to 87 MB a little while later.
How can I resize an image without rendering it?
Displaying the image in a UIWebView does not seem to have this problem. How does that work?
Any help and suggestions will be highly appreciated.
Resolution:
https://developer.apple.com/library/ios/samplecode/LargeImageDownsizing/
The resolution above works for jpg but not for png.
You can't unpack the image into memory because it's too big. This is what image tiling is for, so you would download a set of tiles for the image based on the part you're currently looking at (the zoom position and scale).
I.e. if you're looking at the whole image, you get one tile which is zoomed out and therefore low quality and small in size. As you zoom in, you get back other small images which each show less of the image's area.
The web view is likely using the image format to download a relatively small image size that is a scaled down version of the whole image, so it doesn't need to unpack the whole image to memory. It can do this because it knows your image is 10,000x10,000 but that it is going to be displayed on the page at 300x300 (for example).
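If tiling is more than you need, ImageIO can produce a downsized image without decoding the full-resolution bitmap into memory first, which addresses the "resize without rendering" question. A minimal sketch, assuming the downloaded file has been written to disk:
#import <ImageIO/ImageIO.h>
#import <UIKit/UIKit.h>

// Downsample via ImageIO: only the thumbnail-sized bitmap is materialized,
// never the full 10000 x 10000 image.
static UIImage *DownsampledImage(NSURL *fileURL, CGFloat maxPixelSize) {
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
    if (!source) return nil;

    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize),
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES  // honor EXIF orientation
    };
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                                           (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!thumb) return nil;

    UIImage *image = [UIImage imageWithCGImage:thumb];
    CGImageRelease(thumb);
    return image;
}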
Did you try using UIImageJPEGRepresentation (or UIImagePNGRepresentation)?
You can make your image's data size smaller with it.
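For example (0.5 is an arbitrary compression quality between 0.0 and 1.0):
// Re-encoding as JPEG shrinks the data size, though note the pixel
// dimensions, and thus the decoded memory footprint, stay the same.
NSData *smallerData = UIImageJPEGRepresentation(image, 0.5);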

UIImagePickerController Image returned is bigger than preview on iPhone 4

I'm taking a photo with UIImagePickerController. Everything works fine on iPad and on iPhone 5. The problem comes with an iPhone4: the photo that I get from the picker is "bigger" than what I saw on the screen when I took the photo.
What do I mean by bigger? I mean that at both sides of the photo, and at the bottom, I see parts of the scene that the camera didn't show when I was taking the photo. This is a problem for my app: I need to capture exactly the same scene that the user sees through the camera when taking the photo, not a bigger scene. As I said, on iPhone 5 and iPad 4 everything works fine. I don't understand this different behaviour. How can I solve it?
PS: I'm not applying any transformation to the image in the picker.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *originalImage = [info valueForKey:@"UIImagePickerControllerOriginalImage"];
    NSLog(@"Image Size Width %f Height %f", originalImage.size.width, originalImage.size.height);

    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    [originalImage drawInRect:CGRectMake(0, 0, 320, 480)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSLog(@"image Size h %f Width %f", [image size].height, [image size].width);
}
Here you can see what the original image size is, and then you can change the size as you wish.
It was a problem with the preview on the iPhone4... the preview on the iPhone4 is 16:9, but the final image is 4:3. On the iPhone5 and the iPad, both the preview and the final image are 4:3.
I didn't realize it before because my overlay was hiding part of the preview on the iPhone4, so I thought the preview was 4:3 too (and I think that the preview with iOS6 is 4:3, I'm not sure, I've got to find an iPhone4 with iOS6).
I made my overlay (a UIToolbar) translucent, and now I get what I want.
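The change amounts to something like this when configuring the picker (a sketch; picker and overlayToolbar are illustrative names):
UIToolbar *overlayToolbar = [[UIToolbar alloc] initWithFrame:CGRectMake(0, 0, 320, 44)];
overlayToolbar.translucent = YES;  // lets the full camera preview show through
picker.cameraOverlayView = overlayToolbar;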

iOS image sizes for iPad and iPhone

I have developed a small iOS app with an image named bg.png, which has the dimensions
1024 x 768 for iPad.
I have many images that were created at iPad size. Now I need this app to support iPhone. Do I need to create the same set of images again at iPhone size,
568 x 300 for iPhone,
or is there another way to do this?
Scaling down the iPad image assets will destroy the UX on iPhone. Also, images like the icon and splash screen usually contain the company logo; scaling down will tamper with the look of the logo and the overall image. The better way is to create separate images for the iPhone form factor. Trim the png files using http://tinypng.org/ to keep the binary size low.
Cheers! Amar.
You can resize the image with the following code:
CGSize newSize = CGSizeMake(568, 300);
UIGraphicsBeginImageContext(newSize);
[yourIpadImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newIphoneImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Or, wrapped up as a helper method:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // Pass 0.0 as the scale so the device's main screen scale is used.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
This gives you the option to change the size of your image.
sathiamoorthy's solution is a complicated way of rescaling your image. You can do it by simply creating a UIImageView, initializing it with a UIImage, and then changing its frame.
Note that your image will look scaled/distorted that way.
Follow these steps:
Open the image in Preview.
Go to Tools > Adjust Size.
Enter whatever size you want.
Save the image under a different name.
Yes, you should create duplicates and resize them for iPhone. Using the same images on iPhone will cause memory issues, because the images are unnecessarily big for it.
Use any software to resize them, or you can do it with Preview as Nikita described above.
If you are doing this to create a universal app, then you should suffix the iPad image file names with ~ipad.
Please visit this link; it may help you solve your issue.
It has tips on things like proportional scaling and resizing.
If you want your images to show up unscaled, you are going to need an additional image at the correct size.
So supporting iPad both with and without a retina screen requires one image of 768 x 1024 and one of 1536 x 2048. For a 3.5" iPhone you need 960 x 640 on a retina screen or 480 x 320 on a non-retina one. For the iPhone 5 (4" screen) you need 568 x 320.
If you use UIImage's method imageNamed:, there is help from Apple: on retina devices that method looks for the image you specified with the suffix '@2x'. So you can simply write:
UIImage *myImage = [UIImage imageNamed:@"myImage"];
If you make sure your project contains myImage.png for non-retina devices and myImage@2x.png for retina devices, the right image gets loaded at runtime.

Updating UIImage orientation metaData?

Basic Task: update the EXIF orientation property for the metaData associated with a UIImage. My problem is that I don't know where the orientation property is in all the EXIF info.
Convoluted Background: I am changing the orientation of the image returned by imagePickerController:didFinishPickingMediaWithInfo: so I am thinking that I also need to update the metaData before saving the image with writeImageToSavedPhotosAlbum:(CGImageRef)imageRef metadata:(NSDictionary *)metadata.
In other words, unless I change it, the metaData will contain the old/initial orientation and thus be wrong. The reason I am changing the orientation is because it keeps tripping me up when I run the Core Image face detection routine. Taking a photo with the iPhone (device) in Portrait mode using the front camera, the orientation is UIImageOrientationRight (3). If I rewrite the image so the orientation is UIImageOrientationUp(0), I get good face detection results. For reference, the routine to rewrite the image is below.
The whole camera orientation thing I find very confusing and I seem to be digging myself deeper into a code hole with all of this. I have looked at the posts here, here and here. And according to this post (https://stackoverflow.com/a/3781192/840992):
"The camera is actually landscape native, so you get up or down when
you take a picture in landscape and left or right when you take a
picture in portrait (depending on how you hold the device)."
...which is totally confusing. If the above is true, I would think I should be getting an orientation of UIImageOrientationLeftMirrored or UIImageOrientationRightMirrored with the front camera. And none of this would explain why the CIDetector fails on the virgin image returned by the picker.
I am approaching this ass-backwards but I can't seem to get oriented...
- (UIImage *)normalizedImage:(UIImage *)thisImage
{
    if (thisImage.imageOrientation == UIImageOrientationUp) return thisImage;

    // Redraw into an upright context; drawInRect: applies the orientation transform.
    UIGraphicsBeginImageContextWithOptions(thisImage.size, NO, thisImage.scale);
    [thisImage drawInRect:(CGRect){0, 0, thisImage.size}];
    UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalizedImage;
}
Take a look at my answer here:
Force UIImagePickerController to take photo in portrait orientation/dimensions iOS
and associated project on github (you won't need to run the project, just look at the readme).
It's more concerned with reading than writing metadata, but it includes a few notes on Apple's imageOrientation and the corresponding 'Exif' orientation metadata.
This might be worth a read also
Captured photo automatically rotated during upload in IOS 6.0 or iPhone
There are two different constant numbering conventions in play to indicate image orientation.
kCGImagePropertyOrientation constants as used in TIFF/IPTC image metadata tags
UIImageOrientation constants as used by UIImage imageOrientation property.
The iPhone's native camera orientation is landscape left (with the home button to the right). Native pixel dimensions always reflect this; rotation flags are used to orient the image correctly with the display orientation.
               Apple UIImage.imageOrientation    TIFF/IPTC kCGImagePropertyOrientation
iPhone native  UIImageOrientationUp    = 0       Landscape left  = 1
rotate 180deg  UIImageOrientationDown  = 1       Landscape right = 3
rotate 90CCW   UIImageOrientationLeft  = 2       Portrait down   = 8
rotate 90CW    UIImageOrientationRight = 3       Portrait up     = 6
UIImageOrientation values 4-7 map to kCGImagePropertyOrientation 2, 4, 5 and 7; these are the mirrored counterparts.
UIImage derives its imageOrientation property from the underlying kCGImagePropertyOrientation flags; that's why it is a read-only property. This means that as long as you get the metadata flags right, the imageOrientation will follow correctly. But if you are reading the numbers in order to apply a transform, you need to be aware which numbers you are looking at.
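To make the two conventions concrete, here is a small lookup in the spirit of the table above (a sketch; the function name is mine):
// Map a UIImageOrientation to the TIFF/EXIF kCGImagePropertyOrientation value.
static int ExifOrientationFromUIImageOrientation(UIImageOrientation o) {
    switch (o) {
        case UIImageOrientationUp:            return 1;
        case UIImageOrientationDown:          return 3;
        case UIImageOrientationLeft:          return 8;
        case UIImageOrientationRight:         return 6;
        case UIImageOrientationUpMirrored:    return 2;
        case UIImageOrientationDownMirrored:  return 4;
        case UIImageOrientationLeftMirrored:  return 5;
        case UIImageOrientationRightMirrored: return 7;
    }
    return 1;  // treat anything unexpected as upright
}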
A few gleanings from my world o' pain in looking into this:
Background: Core Image face detection was failing and it seemed to be related to using featuresInImage:options: with the UIImage.imageOrientation property as an argument. With an image adjusted to have no rotation and no mirroring, detection worked fine, but when passing in an image directly from the camera, detection failed.
Well... UIImage.imageOrientation is DIFFERENT from the actual orientation recorded in the image's metadata.
In other words...
UIImage *tmpImage = [self.imageInfo objectForKey:UIImagePickerControllerOriginalImage];
printf("tmpImage.imageOrientation: %d\n", (int)tmpImage.imageOrientation);
Reports a value of 3 or UIImageOrientationRight whereas using the metaData returned by the UIImagePickerControllerDelegate method...
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];
printf("metaData orientation %d\n", (int)[[metaData objectForKey:@"Orientation"] integerValue]);
Reports a value of 6 or UIImageOrientationLeftMirrored.
I suppose it seems obvious now that UIImage.imageOrientation is a display orientation, which appears to be determined by the source image orientation and the device rotation. (I may be wrong here.) Since the display orientation is different from the actual image data, using it will cause the CIDetector to fail. Ugh.
I'm sure all of that serves very good and important purposes for liquid GUIs etc., but it is too much to deal with for me, since all the CIDetector coordinates will also be in the original image orientation, which makes the CALayer drawing maddening. So, for posterity, here is how to change the Orientation property of the metaData AND the image contained therein. This metaData can then be used to save the image to the camera roll.
Solution
// NORMALIZE IMAGE
UIImage *tmpImage = [self normalizedImage:[self.imageInfo objectForKey:UIImagePickerControllerOriginalImage]];

// Make mutable copies of the picker info and its metadata.
NSMutableDictionary *tmpInfo = [self.imageInfo mutableCopy];
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];

// The image is now upright, so reset the metadata Orientation tag to match.
[metaData setObject:[NSNumber numberWithInt:0] forKey:@"Orientation"];
[tmpInfo setObject:tmpImage forKey:@"UIImagePickerControllerOriginalImage"];
[tmpInfo setObject:metaData forKey:@"UIImagePickerControllerMediaMetadata"];
self.imageInfo = tmpInfo;
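For completeness, the corrected image and metadata can then be saved with the writeImageToSavedPhotosAlbum:metadata:completionBlock: method mentioned in the question; a short usage sketch:
#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:tmpImage.CGImage
                             metadata:metaData
                      completionBlock:^(NSURL *assetURL, NSError *error) {
                          if (error) NSLog(@"Save failed: %@", error);
                      }];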
