Basic Task: update the EXIF Orientation property in the metadata associated with a UIImage. My problem is that I don't know where the orientation property lives in all the EXIF info.
Convoluted Background: I am changing the orientation of the image returned by imagePickerController:didFinishPickingMediaWithInfo:, so I am thinking that I also need to update the metadata before saving the image with writeImageToSavedPhotosAlbum:(CGImageRef)imageRef metadata:(NSDictionary *)metadata.
In other words, unless I change it, the metadata will contain the old/initial orientation and thus be wrong. The reason I am changing the orientation is that it keeps tripping me up when I run the Core Image face detection routine. Taking a photo with the iPhone (device) in portrait mode using the front camera, the orientation is UIImageOrientationRight (3). If I rewrite the image so the orientation is UIImageOrientationUp (0), I get good face detection results. For reference, the routine to rewrite the image is below.
The whole camera orientation thing I find very confusing, and I seem to be digging myself deeper into a code hole with all of this. I have looked at the related posts (here, here and here). And according to this post (https://stackoverflow.com/a/3781192/840992):
"The camera is actually landscape native, so you get up or down when
you take a picture in landscape and left or right when you take a
picture in portrait (depending on how you hold the device)."
...which is totally confusing. If the above is true, I would think I should be getting an orientation of UIImageOrientationLeftMirrored or UIImageOrientationRightMirrored with the front camera. And none of this explains why the CIDetector fails on the untouched image returned by the picker.
I am approaching this ass-backwards but I can't seem to get oriented...
// Redraws the image into a fresh context so the pixel data matches the
// display orientation; the returned image reports UIImageOrientationUp.
- (UIImage *)normalizedImage:(UIImage *)thisImage
{
    if (thisImage.imageOrientation == UIImageOrientationUp) return thisImage;

    UIGraphicsBeginImageContextWithOptions(thisImage.size, NO, thisImage.scale);
    [thisImage drawInRect:(CGRect){0, 0, thisImage.size}];
    UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalizedImage;
}
Take a look at my answer here:
Force UIImagePickerController to take photo in portrait orientation/dimensions iOS
and the associated project on GitHub (you won't need to run the project, just look at the readme).
It's more concerned with reading than writing metadata, but it includes a few notes on Apple's imageOrientation and the corresponding 'Orientation' Exif metadata.
This might be worth a read also:
Captured photo automatically rotated during upload in iOS 6.0 or iPhone
There are two different constant numbering conventions in play to indicate image orientation.
kCGImagePropertyOrientation constants as used in TIFF/IPTC image metadata tags
UIImageOrientation constants as used by the UIImage imageOrientation property.
The iPhone's native camera orientation is landscape left (with the home button to the right). Native pixel dimensions always reflect this; rotation flags are used to orient the image correctly relative to the display orientation.
                Apple UIImage.imageOrientation   TIFF/IPTC kCGImagePropertyOrientation
iPhone native   UIImageOrientationUp    = 0      Landscape left  = 1
rotate 180deg   UIImageOrientationDown  = 1      Landscape right = 3
rotate 90CCW    UIImageOrientationLeft  = 2      Portrait down   = 8
rotate 90CW     UIImageOrientationRight = 3      Portrait up     = 6
UIImageOrientation 4-7 map to kCGImagePropertyOrientation 2,4,5,7 - these are the mirrored counterparts.
UIImage derives its imageOrientation property from the underlying kCGImagePropertyOrientation flags; that's why it is a read-only property. This means that as long as you get the metadata flags right, imageOrientation will follow correctly. But if you are reading the numbers in order to apply a transform, you need to be aware of which numbering you are looking at.
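For convenience, here is a minimal sketch of that mapping as code. The function name is mine, not an Apple API; it follows the table above plus the mirrored cases.
#import <UIKit/UIKit.h>

// Hypothetical helper: map a TIFF/Exif orientation value (1-8) to the
// corresponding UIImageOrientation, per the table above.
static UIImageOrientation UIImageOrientationFromExif(NSInteger exifOrientation)
{
    switch (exifOrientation) {
        case 1: return UIImageOrientationUp;
        case 2: return UIImageOrientationUpMirrored;
        case 3: return UIImageOrientationDown;
        case 4: return UIImageOrientationDownMirrored;
        case 5: return UIImageOrientationLeftMirrored;
        case 6: return UIImageOrientationRight;
        case 7: return UIImageOrientationRightMirrored;
        case 8: return UIImageOrientationLeft;
        default: return UIImageOrientationUp; // treat invalid values as 'up'
    }
}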
A few gleanings from my world o' pain in looking into this:
Background: Core Image face detection was failing, and it seemed to be related to using featuresInImage:options: with the UIImage.imageOrientation property as an argument. With an image adjusted to have no rotation and no mirroring, detection worked fine, but when passing in an image directly from the camera, detection failed.
Well... the UIImage.imageOrientation value is DIFFERENT from the orientation value recorded in the image's metadata.
In other words...
UIImage *tmpImage = [self.imageInfo objectForKey:UIImagePickerControllerOriginalImage];
printf("tmpImage.imageOrientation: %d\n", (int)tmpImage.imageOrientation);
Reports a value of 3 (UIImageOrientationRight), whereas using the metaData returned by the UIImagePickerControllerDelegate method...
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];
printf("metaData orientation: %d\n", (int)[[metaData objectForKey:@"Orientation"] integerValue]);
Reports a value of 6. Beware: 6 here is not UIImageOrientationLeftMirrored; the "Orientation" metadata key uses the TIFF/Exif numbering, in which 6 is the counterpart of UIImageOrientationRight (see the mapping table above).
I suppose it seems obvious now that UIImage.imageOrientation is a display orientation: the pixel data is left as captured, and the flag tells consumers how to rotate it for display. (I may be wrong here.) Since the display orientation differs from the actual layout of the image data, handing the raw pixels to the CIDetector without accounting for that will cause it to fail. Ugh.
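Aside: featuresInImage:options: also accepts a CIDetectorImageOrientation hint, which uses the TIFF/Exif 1-8 numbering and lets the detector compensate without rewriting any pixels. A minimal sketch, assuming the tmpImage and metaData variables from the snippets above:
#import <CoreImage/CoreImage.h>

// Tell the detector how the pixel data is oriented instead of normalizing it.
// CIDetectorImageOrientation expects the TIFF/Exif numbering (1-8).
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
CIImage *ciImage = [CIImage imageWithCGImage:tmpImage.CGImage];
NSNumber *exifOrientation = [metaData objectForKey:@"Orientation"];
NSArray *features = [detector featuresInImage:ciImage
                                      options:@{CIDetectorImageOrientation : (exifOrientation ?: @1)}];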
I'm sure all of that serves very good and important purposes for liquid GUIs etc., but it is too much to deal with for me, since all the CIDetector coordinates will also be in the original image orientation, which makes CALayer drawing sick-making. So, for posterity, here is how to change the Orientation property of the metadata AND the image contained therein. This metadata can then be used to save the image to the camera roll.
Solution
// NORMALIZE IMAGE
UIImage *tmpImage = [self normalizedImage:[self.imageInfo objectForKey:UIImagePickerControllerOriginalImage]];

NSMutableDictionary *tmpInfo = [self.imageInfo mutableCopy];
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];

// 1 is the TIFF/Exif value for "up"; 0 is not a valid Exif orientation.
[metaData setObject:[NSNumber numberWithInt:1] forKey:@"Orientation"];

[tmpInfo setObject:tmpImage forKey:@"UIImagePickerControllerOriginalImage"];
[tmpInfo setObject:metaData forKey:@"UIImagePickerControllerMediaMetadata"];
self.imageInfo = tmpInfo;
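The corrected metadata can then be handed to the save call mentioned at the top of the question. A minimal sketch using the ALAssetsLibrary API from the question (the error handling is illustrative):
#import <AssetsLibrary/AssetsLibrary.h>

// Save the normalized image together with the corrected metadata.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:tmpImage.CGImage
                             metadata:metaData
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save failed: %@", error);
    } else {
        NSLog(@"Saved to %@", assetURL);
    }
}];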
Related
I am confused about UIImage orientation in my iOS app.
I simply load an image taken with my iPhone into a UIImageView using the storyboard and expect it to be shown exactly the same in the simulator. However, it rotates.
(The content mode is set to Aspect Fit.)
I tried other images downloaded from the internet; all of them work fine, except the one taken by the camera.
Does anyone have any idea why this happens?
edit:
I tried printing out the imageOrientation property of my image. It shows 0, which is the default value (.up).
It seems the picture has not been rotated, yet it looks different in the storyboard and in the simulator.
The UIImage is rotated automatically; the developer should rectify it.
Use the line below to get the image in the correct orientation.
UIImage *imageToDisplay = [UIImage imageWithCGImage:[originalImage CGImage] scale:[originalImage scale] orientation:UIImageOrientationUp];
A UIImage has an imageOrientation property, which instructs UIImageView and other UIImage consumers to rotate the raw image data. There's a good chance that this flag is being saved in the metadata of the uploaded JPEG image, but the program you use to view it is not honoring that flag. Try changing the orientation while fetching the image from UIImagePickerController.
To rotate the UIImage so it displays properly when uploaded, you can use a category; check @Anomie's answer in Objective-C (a simpler variant is sketched below). If you are using Swift, you can create your own extension for UIImage using the same logic.
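A minimal sketch of such a category. The category and method names are mine, and this uses the simple redraw approach from the normalizedImage: routine earlier in this thread rather than the transform-based version:
#import <UIKit/UIKit.h>

@interface UIImage (FixOrientation)
- (UIImage *)fixedOrientationImage;
@end

@implementation UIImage (FixOrientation)
// Redraw the image so the pixel data matches the display orientation.
- (UIImage *)fixedOrientationImage
{
    if (self.imageOrientation == UIImageOrientationUp) return self;

    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:(CGRect){CGPointZero, self.size}];
    UIImage *fixed = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return fixed;
}
@end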
I'm using UIImagePickerController to snap an image and uploading it to server.
When taking a photo with the front camera, the height/width get reversed somewhere.
The image is displayed correctly later, but height and width are reversed (and I'm using them for the UIImageView Auto Layout constraint).
The thing is, when looking at the UIImagePickerControllerMediaMetadata of front and back camera images, the EXIF and the rest of the metadata are the same (the front camera's resolution is smaller, but the height/width ratio is the same).
Any ideas what is the difference?
Apple images are always stored landscape left, and the orientation is specified in the EXIF metadata.
OK, so @zaph's comment is correct; apparently back camera images are "reversed" as well. The upload code on the server (CodeIgniter PHP) ignored the EXIF.
The problem surfaced only due to the front camera's low resolution...
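One thing worth checking on the client side: UIImage.size already applies the orientation flag, while the underlying CGImage reports the raw sensor dimensions. A minimal sketch, assuming info is the dictionary from the picker delegate:
// UIImage.size is expressed in display orientation; the CGImage is not.
UIImage *picked = [info objectForKey:UIImagePickerControllerOriginalImage];
CGSize displaySize = picked.size;                    // rotated per imageOrientation
size_t rawWidth  = CGImageGetWidth(picked.CGImage);  // unrotated pixel data
size_t rawHeight = CGImageGetHeight(picked.CGImage);
NSLog(@"display: %@, raw: %zux%zu", NSStringFromCGSize(displaySize), rawWidth, rawHeight);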
What is the most efficient way to iterate over the entire camera roll, open every single photo and resize it?
My naive attempt to iterate over the asset library and get the defaultRepresentation results took about 1 second per 4 images (iPhone 5). Is there a way to do better?
I need the resized images to do some kind of processing.
Resizing full-resolution photos is a rather expensive operation, but you can use images already resized to the screen resolution:
ALAsset *result = // ... do not forget to initialize it
ALAssetRepresentation *rawImage = [result defaultRepresentation];
UIImage *image = [UIImage imageWithCGImage:rawImage.fullScreenImage];
If you need another resolution, you can still start from fullScreenImage, since it is smaller than the original photo. From the ALAssetRepresentation documentation:
- (CGImageRef)fullScreenImage

Returns a CGImage of the representation that is appropriate for displaying full screen, or NULL if a CGImage representation could not be generated. The dimensions of the image are dependent on the device your application is running on; the dimensions may not, however, exactly match the dimensions of the screen.

In iOS 5 and later, this method returns a fully cropped, rotated, and adjusted image—exactly as a user would see in Photos or in the image picker.
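Putting it together, a minimal sketch of the iteration itself, enumerating the camera roll and grabbing the pre-resized fullScreenImage for each asset (assuming the same ALAssetsLibrary-era setup as the question):
#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (!group) return; // nil group signals the end of enumeration
    [group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *innerStop) {
        if (!result) return; // nil asset signals the end of the group
        ALAssetRepresentation *rawImage = [result defaultRepresentation];
        UIImage *image = [UIImage imageWithCGImage:rawImage.fullScreenImage];
        // ... hand `image` off to your processing here
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"Enumeration failed: %@", error);
}];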
I’m trying to take a screenshot of my iPad application, and create a UIImage, which I can then blur, and do fun stuff with. I’m taking the screenshot with the following code, which is working perfectly:
- (UIImage *)convertViewToImage
{
    UIGraphicsBeginImageContext(self.bounds.size);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
However, my app supports landscape orientation only, and the screenshot that comes out is rotated 90 degrees to the left (i.e. in portrait orientation). I'm calling this method on my UIWindow, and I understand that UIWindow uses a different coordinate system, but I can't quite figure out how to rotate that UIImage. I've read a couple of similar questions here and tried their solutions (using initWithCGImage:scale:orientation: and re-drawing the image), but since it's a full-screen iPad image, if I rotate it, it seems to keep the actual portrait dimensions but stretch the rotated image into them.
In a nutshell, the method above is giving me what I need, but at 768x1024 rather than 1024x768. I need to rotate it 90 degrees to the left, exactly as rotate would work in Preview, Photoshop, etc.
You can call it on the root view controller's view instead. That has the same content as the window but with the orientation of the device.
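A minimal sketch of that suggestion (assuming you can reach the app's key window from where the snapshot is taken):
// Snapshot the root view controller's view rather than the window itself,
// so the result matches the interface orientation.
UIWindow *window = [[UIApplication sharedApplication] keyWindow];
UIView *rootView = window.rootViewController.view;
UIGraphicsBeginImageContextWithOptions(rootView.bounds.size, NO, 0.0);
[rootView drawViewHierarchyInRect:rootView.bounds afterScreenUpdates:YES];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();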
I'm using UIImagePickerController to fetch images from the user's photo library and/or taken with the camera. Works great.
I'm noticing that fetched images are often (always?) coming back with their imageOrientation set to UIImageOrientationRight. But the image was captured with the device in portrait orientation. Why is this? This is an iPhone 4S on iOS 6, using the rear camera, so the resolution is 8 MP.
In the simulator, grabbing photos from the photo library, images come back UIImageOrientationUp.
When I display the image in a UIImageView the orientation looks correct (portrait/up). But when I go to crop the image the coordinate system isn't what I would expect. 0,0 is in the upper-right of the image, which I guess makes sense when it reports UIImageOrientationRight.
I'm looking for an explanation of what's going on and the correct approach to dealing with the odd coordinate system.
EDIT: it sure appears to me that, on the iPhone 4S at least, the camera always takes UIImageOrientationRight/"landscape" images, and that UIImageView respects the imageOrientation on display. However, if I save the image using UIImagePNGRepresentation, the orientation is not preserved (I think I read about this somewhere).
It has to do with the orientation the phone was in when the image was taken. The phone doesn't rotate the image data from the camera sensor so that "up" in the image matches "up" in the world; instead it sets imageOrientation, and UIImage then takes care of rendering things the right way.
When you try to crop, you typically convert the image to a CGImage, and that loses the orientation information, so suddenly you get the image with a strange orientation.
There are several categories on UIImage that you can get that will perform image cropping while taking imageOrientation into account.
Have a look at link or link
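If you'd rather not pull in a category, a minimal sketch of orientation-aware cropping (the helper name is mine): normalize the pixel data first, so CGImageCreateWithImageInRect sees the image the same way UIImageView displays it.
#import <UIKit/UIKit.h>

// Crop in display coordinates (points) by first redrawing the image upright.
static UIImage *CroppedImage(UIImage *source, CGRect cropRect)
{
    // Redraw so the pixel data matches the display orientation.
    UIGraphicsBeginImageContextWithOptions(source.size, NO, source.scale);
    [source drawInRect:(CGRect){CGPointZero, source.size}];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Convert the crop rect from points to pixels before cropping.
    CGRect pixelRect = CGRectMake(cropRect.origin.x * upright.scale,
                                  cropRect.origin.y * upright.scale,
                                  cropRect.size.width * upright.scale,
                                  cropRect.size.height * upright.scale);
    CGImageRef cgCrop = CGImageCreateWithImageInRect(upright.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:cgCrop
                                           scale:upright.scale
                                     orientation:UIImageOrientationUp];
    CGImageRelease(cgCrop);
    return cropped;
}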