Want fixed orientation, but the UIImage autorotates - iOS

I am confused about UIImage orientation in my iOS app.
I simply load an image taken with my iPhone into a UIImageView using the storyboard and expect it to be shown exactly the same in the simulator. However, it rotates.
(I chose the content mode Aspect Fit.)
I tried other images downloaded from the internet; all of them work fine except the one taken with the camera.
Does anyone have any idea why this happens?
Edit:
I tried printing the imageOrientation property of my example image. It shows 0, which is the default value (.up).
It seems the picture has not been rotated, yet it looks different in the storyboard and in the simulator.
The example is as follows:

The UIImage is rotated automatically; the developer should rectify it.
Use the function below to get the image in the correct orientation:
UIImage *imageToDisplay = [UIImage imageWithCGImage:[originalImage CGImage] scale:[originalImage scale] orientation: UIImageOrientationUp];

A UIImage has a property, imageOrientation, which instructs UIImageView and other UIImage consumers to rotate the raw image data. There's a good chance that this flag is being saved into the metadata of the uploaded JPEG image, but the program you use to view it is not honoring that flag. Try changing the orientation while fetching the image from UIImagePickerController.
To rotate the UIImage so it displays properly when uploaded, you can use a category. Check @Anomie's answer in Objective-C. If you are using Swift, you can create your own extension for UIImage using the same logic.
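For reference, a minimal sketch of such a category might look like the following. The category and method names are my own, and the redraw technique is the same one used by the normalizedImage: helper further down this page:

```objc
// UIImage+Normalize - hypothetical category name.
// Redraws the image so its pixel data matches its display orientation,
// leaving imageOrientation at the default UIImageOrientationUp.
@interface UIImage (Normalize)
- (UIImage *)normalizedImage;
@end

@implementation UIImage (Normalize)
- (UIImage *)normalizedImage
{
    if (self.imageOrientation == UIImageOrientationUp) return self;

    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:(CGRect){CGPointZero, self.size}]; // drawing applies the orientation flag
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end
```

A Swift extension would follow the same pattern: early-return on .up, redraw into an image context, and return the result.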

Related

How to override the automatic re-orientation of images in iOS?

iOS 7, Xcode 5
I'm loading photos into a UIImageView and noticed that they are automatically re-oriented to display "face-up". You can see this same effect by taking 4 pictures with your phone, each time rotating the phone 90 degrees more, then viewing the images on the phone.
The problem is that I'm trying to implement some rotations after loading the image and it has the wrong effect.
For example, I have a photo taken with the iPhone vertical (home button at bottom).
When I copy the image into a UIImageView it is re-oriented face-up (without my code).
I do not want this.
And when I make another copy and apply a rotation, the rotation appears to be applied to the original tmp image!
What I am trying to get is an image and its mirror.
Here's my code:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self dismissViewControllerAnimated:YES completion:nil];
    imageTmp = info[UIImagePickerControllerOriginalImage];
    NSLog(@"original orientation: %ld", (long)imageTmp.imageOrientation);
    self.imageA.image = imageTmp;
    self.imageB.image = [UIImage imageWithCGImage:[imageTmp CGImage]
                                            scale:1.0
                                      orientation:UIImageOrientationUpMirrored];
}
In the screenshot above, the image on the left was taken with the phone held vertically.
I then load it into a UIImage.
Next I copy it into a UIImageView "imageA".
Last, I transform and copy the "imageTmp" into another UIImageView "imageB".
Notice that the Left image is auto-re-oriented (I do not want this!), while the right image is transformed based on the actual landscape version.
How can I prevent the auto-re-orientation?
You can get the image without any implied rotation, by creating a new UIImage from the CGImage and not specifying an orientation:
naturalImage = [UIImage imageWithCGImage:[originalImage CGImage]];
Note that, as I mention in the comments above, this is always going to effectively be LandscapeRight (or possibly LandscapeLeft, I don't remember which is which), as that is how the camera records the images.

drawViewHierachyInRect draws iPad at wrong orientation

I’m trying to take a screenshot of my iPad application, and create a UIImage, which I can then blur, and do fun stuff with. I’m taking the screenshot with the following code, which is working perfectly:
- (UIImage *)convertViewToImage
{
UIGraphicsBeginImageContext(self.bounds.size);
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
}
However, my app supports landscape orientation only, and the screenshot that comes out is rotated 90 degrees to the left (in portrait orientation). I’m calling this method on my UIWindow, and I understand that UIWindow uses a different coordinate system, but I can’t quite figure out how to rotate that UIImage. I’ve read a couple of similar questions here and tried their solution (using initWithCGImage:scale:orientation, and re-drawing the image), but since it’s a full-screen iPad image, if I rotate it, it seems to keep the actual portrait dimension, but stretch the rotated image into it.
In a nutshell, the method above is giving me what I need, but at 768x1024, rather than 1024x768. I need to rotate it 90 degrees to the left, exactly the same as rotate would work in Preview, Photoshop, etc.
You can call it on the root view controller's view instead. That has the same content as the window but with the orientation of the device.
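A sketch of that suggestion, assuming the method lives on a view controller with access to the window (the method name is illustrative):

```objc
// Snapshot the root view controller's view rather than the window itself,
// so the result matches the device orientation (landscape here) instead
// of the window's portrait-based coordinate system.
- (UIImage *)screenshotImage
{
    UIView *rootView = self.view.window.rootViewController.view;
    UIGraphicsBeginImageContextWithOptions(rootView.bounds.size, NO, 0.0);
    [rootView drawViewHierarchyInRect:rootView.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```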

UIImagePickerController Image returned is bigger than preview on iPhone 4

I'm taking a photo with UIImagePickerController. Everything works fine on iPad and on iPhone 5. The problem comes with an iPhone4: the photo that I get from the picker is "bigger" than what I saw on the screen when I took the photo.
What do I mean with bigger? I mean that at both sides of the photo, and at the bottom, I see parts of the scene that the camera didn't show when I was taking the photo. This is a problem for my app. I need to capture exactly the same scene that the user sees through the camera when he takes the photo. Not a bigger scene. As I said, on iPhone5 and iPad4 everything works fine. I don't understand this different behaviour. How can I solve this?
PD: I'm not applying any transformation to the image on the picker.
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *originalImage = info[UIImagePickerControllerOriginalImage];
    NSLog(@"Image Size Width %f Height %f", originalImage.size.width, originalImage.size.height);

    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    [originalImage drawInRect:CGRectMake(0, 0, 320, 480)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSLog(@"image Size h %f Width %f", [image size].height, [image size].width);
}
Here you can see what the original image size is; after that, you can resize it as you wish.
It was a problem with the preview on the iPhone 4: the preview on the iPhone 4 is 16:9, but the final image is 4:3. On the iPhone 5 and the iPad, both the preview and the final image are 4:3.
I didn't realize it before because my overlay was hiding part of the preview on the iPhone 4, so I thought the preview was 4:3 too (and I think the preview on iOS 6 is 4:3; I'm not sure, I have to find an iPhone 4 with iOS 6).
I made my overlay (a UIToolbar) translucent, and now I get what I want.

Updating UIImage orientation metaData?

Basic task: update the EXIF orientation property in the metadata associated with a UIImage. My problem is that I don't know where the orientation property is in all the EXIF info.
Convoluted Background: I am changing the orientation of the image returned by imagePickerController:didFinishPickingMediaWithInfo: so I am thinking that I also need to update the metaData before saving the image with writeImageToSavedPhotosAlbum:(CGImageRef)imageRef metadata:(NSDictionary *)metadata.
In other words, unless I change it, the metaData will contain the old/initial orientation and thus be wrong. The reason I am changing the orientation is because it keeps tripping me up when I run the Core Image face detection routine. Taking a photo with the iPhone (device) in Portrait mode using the front camera, the orientation is UIImageOrientationRight (3). If I rewrite the image so the orientation is UIImageOrientationUp(0), I get good face detection results. For reference, the routine to rewrite the image is below.
The whole camera orientation thing I find very confusing, and I seem to be digging myself deeper into a code hole with all of this. I have looked at the posts (here, here, and here). And according to this post (https://stackoverflow.com/a/3781192/840992):
"The camera is actually landscape native, so you get up or down when
you take a picture in landscape and left or right when you take a
picture in portrait (depending on how you hold the device)."
...which is totally confusing. If the above is true, I would think I should be getting an orientation of UIImageOrientationLeftMirrored or UIImageOrientationRightMirrored with the front camera. And none of this would explain why the CIDetector fails the virgin image returned by the picker.
I am approaching this ass-backwards but I can't seem to get oriented...
-(UIImage *)normalizedImage:(UIImage *)thisImage
{
    if (thisImage.imageOrientation == UIImageOrientationUp) return thisImage;

    UIGraphicsBeginImageContextWithOptions(thisImage.size, NO, thisImage.scale);
    [thisImage drawInRect:(CGRect){0, 0, thisImage.size}];
    UIImage *normalizedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalizedImage;
}
Take a look at my answer here:
Force UIImagePickerController to take photo in portrait orientation/dimensions iOS
and associated project on github (you won't need to run the project, just look at the readme).
It's more concerned with reading rather than writing metadata - but it includes a few notes on Apple's imageOrientation and the corresponding orientation 'Exif' metadata.
This might be worth a read also
Captured photo automatically rotated during upload in IOS 6.0 or iPhone
There are two different constant numbering conventions in play to indicate image orientation.
kCGImagePropertyOrientation constants as used in TIFF/IPTC image metadata tags
UIImageOrientation constants as used by UIImage imageOrientation property.
The iPhone's native camera orientation is landscape left (with the home button to the right). Native pixel dimensions always reflect this; rotation flags are used to orient the image correctly with the display orientation.
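You can see this directly: UIImage.size reports dimensions with the orientation flag already applied, while the underlying CGImage keeps the native landscape pixels. A sketch (the sizes in the comments are illustrative, not guaranteed):

```objc
UIImage *photo = info[UIImagePickerControllerOriginalImage]; // e.g. a portrait shot
NSLog(@"UIImage size: %@", NSStringFromCGSize(photo.size));  // e.g. 2448 x 3264 (portrait)
NSLog(@"CGImage size: %zu x %zu",
      CGImageGetWidth(photo.CGImage),
      CGImageGetHeight(photo.CGImage));                      // e.g. 3264 x 2448 (native landscape)
```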
                Apple UIImage.imageOrientation     TIFF/IPTC kCGImagePropertyOrientation
iPhone native   UIImageOrientationUp    = 0        Landscape left  = 1
rotate 180deg   UIImageOrientationDown  = 1        Landscape right = 3
rotate 90CCW    UIImageOrientationLeft  = 2        Portrait down   = 8
rotate 90CW     UIImageOrientationRight = 3        Portrait up     = 6

UIImageOrientation values 4-7 map to kCGImagePropertyOrientation 2, 4, 5, 7 - these are the mirrored counterparts.
UIImage derives its imageOrientation property from the underlying kCGImagePropertyOrientation flags - that's why it is a read-only property. This means that as long as you get the metadata flags right, the imageOrientation will follow correctly. But if you are reading the numbers in order to apply a transform, you need to be aware of which numbers you are looking at.
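As a sketch, a conversion helper built from the table above might look like this (the function name is my own):

```objc
// Map a TIFF/IPTC 'Orientation' metadata value (1-8) to the
// corresponding UIImageOrientation, per the table above.
UIImageOrientation UIImageOrientationFromTIFF(NSInteger tiffOrientation)
{
    switch (tiffOrientation) {
        case 1: return UIImageOrientationUp;            // landscape left, native
        case 3: return UIImageOrientationDown;          // landscape right
        case 8: return UIImageOrientationLeft;          // portrait down
        case 6: return UIImageOrientationRight;         // portrait up
        case 2: return UIImageOrientationUpMirrored;    // mirrored counterparts
        case 4: return UIImageOrientationDownMirrored;
        case 5: return UIImageOrientationLeftMirrored;
        case 7: return UIImageOrientationRightMirrored;
        default: return UIImageOrientationUp;
    }
}
```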
A few gleanings from my world o' pain in looking into this:
Background: Core Image face detection was failing and it seemed to be related to using featuresInImage:options: and using the UIImage.imageOrientation property as an argument. With an image adjusted to have no rotation and not mirrored, detection worked fine but when passing in an image directly from the camera detection failed.
Well...UIImage.imageOrientation is DIFFERENT than the actual orientation of the image.
In other words...
UIImage *tmpImage = self.imageInfo[UIImagePickerControllerOriginalImage];
printf("tmpImage.imageOrientation: %ld\n", (long)tmpImage.imageOrientation);
Reports a value of 3 or UIImageOrientationRight whereas using the metaData returned by the UIImagePickerControllerDelegate method...
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];
printf("metaData orientation %ld\n", (long)[[metaData objectForKey:@"Orientation"] integerValue]);
Reports a value of 6 or UIImageOrientationLeftMirrored.
I suppose it seems obvious now that UIImage.imageOrientation is a display orientation, which appears to be determined by the source image orientation and the device rotation. (I may be wrong here.) Since the display orientation is different from the actual image data, using it will cause the CIDetector to fail. Ugh.
I'm sure all of that serves very good and important purposes for liquid GUIs etc., but it is too much to deal with for me, since all the CIDetector coordinates will also be in the original image orientation, which makes CALayer drawing maddening. So, for posterity, here is how to change the Orientation property of the metadata AND the image contained therein. This metadata can then be used to save the image to the camera roll.
Solution
// NORMALIZE IMAGE
UIImage *tmpImage = [self normalizedImage:self.imageInfo[UIImagePickerControllerOriginalImage]];

NSMutableDictionary *tmpInfo = [self.imageInfo mutableCopy];
NSMutableDictionary *metaData = [[tmpInfo objectForKey:@"UIImagePickerControllerMediaMetadata"] mutableCopy];
[metaData setObject:@0 forKey:@"Orientation"];

[tmpInfo setObject:tmpImage forKey:@"UIImagePickerControllerOriginalImage"];
[tmpInfo setObject:metaData forKey:@"UIImagePickerControllerMediaMetadata"];
self.imageInfo = tmpInfo;

UIImage from UIImagePickerController orientation issue

I'm using UIImagePickerController to fetch images from the user's photo library and/or taken with the camera. Works great.
I'm noticing that fetched images are often (always?) coming back with their imageOrientation set to UIImageOrientationRight. But the image was captured with the device in portrait orientation. Why is this? This is an iPhone4S, iOS6, using the rear camera - so the resolution is 8MP.
In the simulator, grabbing photos from the photo library, images come back UIImageOrientationUp.
When I display the image in a UIImageView the orientation looks correct (portrait/up). But when I go to crop the image the coordinate system isn't what I would expect. 0,0 is in the upper-right of the image, which I guess makes sense when it reports UIImageOrientationRight.
I'm looking for an explanation of what's going on and the correct approach to dealing with the odd coordinate system.
EDIT: it sure appears to me that, on iPhone4S at least, the camera always takes UIImageOrientationRight/"landscape" images, and that UIImageView is respecting the imageOrientation on display. However, if I save the image using UIImagePNGRepresentation the orientation is not preserved (I think I read about this somewhere.)
It has to do with the orientation the phone was in when the image was taken. The phone doesn't rotate the image data from the camera sensor so that "up" in the image is actually up; instead it sets imageOrientation, and UIImage will then take care of rendering things the right way.
When you try to crop, you typically convert the image to a CGImage, and that loses the orientation information, so suddenly you get the image with a strange orientation.
There are several categories on UIImage that you can get that will perform image cropping while taking imageOrientation into account.
Have a look at link or link
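One common approach, sketched here (this is not one of the linked categories, and the helper name is my own): redraw the image first so the pixel data matches the display orientation, then crop the resulting CGImage.

```objc
// Crop in display coordinates by first baking the orientation into
// the pixel data, then cropping the (now correctly oriented) CGImage.
- (UIImage *)croppedImage:(UIImage *)image toRect:(CGRect)rect
{
    // Redraw so imageOrientation becomes Up and pixels match the display.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:(CGRect){CGPointZero, image.size}];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // CGImage works in pixels, so scale the crop rect by the image scale.
    CGRect pixelRect = CGRectMake(rect.origin.x * upright.scale,
                                  rect.origin.y * upright.scale,
                                  rect.size.width * upright.scale,
                                  rect.size.height * upright.scale);
    CGImageRef cropped = CGImageCreateWithImageInRect(upright.CGImage, pixelRect);
    UIImage *result = [UIImage imageWithCGImage:cropped
                                          scale:upright.scale
                                    orientation:UIImageOrientationUp];
    CGImageRelease(cropped);
    return result;
}
```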
