iOS 7, Xcode 5
I'm loading photos into a UIImageView and noticed that they are automatically re-oriented to display "face-up". You can see this same effect by taking 4 pictures with your phone, each time rotating the phone 90 degrees more, then viewing the images on the phone.
The problem is that I'm trying to implement some rotations after loading the image and it has the wrong effect.
For example, I have a photo taken with the iPhone vertical (home button at bottom).
When I copy the image into a UIImageView it is re-oriented face-up (without my code).
I do not want this.
And when I make another copy and apply a rotation, the rotation appears to be against the original tmp image!
What I am trying to get is an image and its mirror.
Here's my code:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [self dismissViewControllerAnimated:YES completion:nil];
    imageTmp = info[UIImagePickerControllerOriginalImage];
    NSLog(@"original orientation: %li", (long)imageTmp.imageOrientation);
    self.imageA.image = imageTmp;
    self.imageB.image = [UIImage imageWithCGImage:[imageTmp CGImage]
                                            scale:1.0
                                      orientation:UIImageOrientationUpMirrored];
}
In the screenshot above, the image on the left was taken with the phone held vertically.
I then load it into a UIImage.
Next I copy it into a UIImageView "imageA".
Last, I transform and copy the "imageTmp" into another UIImageView "imageB".
Notice that the left image is auto-re-oriented (I do not want this!), while the right image is transformed based on the underlying landscape version.
How can I prevent the auto-re-orientation?
You can get the image without any implied rotation by creating a new UIImage from the CGImage and not specifying an orientation:
naturalImage = [UIImage imageWithCGImage:[originalImage CGImage]];
Note that, as I mention in the comments above, the result is effectively always going to be in landscape orientation, as that is how the camera sensor records images; you will likely need to apply your own rotation from there.
Related
I am confused about the UIImage orientation in my iOS design.
I simply load an image taken by my iPhone into UIImageView using storyboard and expect it would be shown exactly the same in the simulator. However, it rotates.
(I choose the content mode to be aspect fit)
I tried with other images downloaded from the internet; all of them work fine except the one taken with the camera.
Anyone have any idea why this happens?
edit:
I tried printing out the imageOrientation property of my image. It shows 0, which is the default value (UIImageOrientationUp).
It seems the picture has not been rotated, yet it looks different in the storyboard and in the simulator.
The example would be as following:
The UIImage is rotated automatically; the developer should rectify it.
Use the line below to get the image in the correct orientation:
UIImage *imageToDisplay = [UIImage imageWithCGImage:[originalImage CGImage] scale:[originalImage scale] orientation: UIImageOrientationUp];
A UIImage has a property imageOrientation, which instructs UIImageView and other UIImage consumers to rotate the raw image data. There's a good chance that this flag is being saved in the metadata of the uploaded JPEG image, but the program you use to view it is not honoring that flag. Try changing the orientation while fetching the image from the UIImagePickerController.
To rotate the UIImage so it displays properly when uploaded, you can use a category. Check @Anomie's answer in Objective-C. If you are using Swift, you can create your own extension for UIImage using the same logic.
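As an illustration (the category and method names are my own, not from the linked answer), a minimal sketch of such a category: redrawing the image into a graphics context bakes the orientation flag into the pixel data, so the result reports UIImageOrientationUp.

```objc
// UIImage+Normalize -- illustrative category name
@interface UIImage (Normalize)
- (UIImage *)normalizedImage;
@end

@implementation UIImage (Normalize)
- (UIImage *)normalizedImage {
    if (self.imageOrientation == UIImageOrientationUp) {
        return self; // already upright, nothing to do
    }
    // Drawing applies the orientation flag to the pixels themselves.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:(CGRect){CGPointZero, self.size}];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end
```

The same redraw trick is what most of the popular orientation-fixing categories do under the hood.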
I’m trying to take a screenshot of my iPad application, and create a UIImage, which I can then blur, and do fun stuff with. I’m taking the screenshot with the following code, which is working perfectly:
- (UIImage *)convertViewToImage
{
    UIGraphicsBeginImageContext(self.bounds.size);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
However, my app supports landscape orientation only, and the screenshot that comes out is rotated 90 degrees to the left (in portrait orientation). I’m calling this method on my UIWindow, and I understand that UIWindow uses a different coordinate system, but I can’t quite figure out how to rotate that UIImage. I’ve read a couple of similar questions here and tried their solution (using initWithCGImage:scale:orientation, and re-drawing the image), but since it’s a full-screen iPad image, if I rotate it, it seems to keep the actual portrait dimension, but stretch the rotated image into it.
In a nutshell, the method above is giving me what I need, but at 768x1024, rather than 1024x768. I need to rotate it 90 degrees to the left, exactly the same as rotate would work in Preview, Photoshop, etc.
You can call it on the root view controller's view instead. That has the same content as the window but with the orientation of the device.
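For example (a sketch; convertViewToImage is the method from the question, assumed to be available on the view):

```objc
// The root view controller's view shows the same content as the window,
// but it is rotated to match the device orientation, unlike the UIWindow.
UIView *rootView = [[[[UIApplication sharedApplication] keyWindow] rootViewController] view];
UIImage *screenshot = [rootView convertViewToImage];
```

This should yield the 1024x768 landscape image directly, with no rotation step needed.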
I'm taking a photo with UIImagePickerController. Everything works fine on iPad and on iPhone 5. The problem comes with an iPhone4: the photo that I get from the picker is "bigger" than what I saw on the screen when I took the photo.
What do I mean with bigger? I mean that at both sides of the photo, and at the bottom, I see parts of the scene that the camera didn't show when I was taking the photo. This is a problem for my app. I need to capture exactly the same scene that the user sees through the camera when he takes the photo. Not a bigger scene. As I said, on iPhone5 and iPad4 everything works fine. I don't understand this different behaviour. How can I solve this?
PD: I'm not applying any transformation to the image on the picker.
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *originalImage = [info valueForKey:@"UIImagePickerControllerOriginalImage"];
    NSLog(@"Image Size Width %f Height %f", originalImage.size.width, originalImage.size.height);
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));
    [originalImage drawInRect:CGRectMake(0, 0, 320, 480)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"image Size h %f Width %f", [image size].height, [image size].width);
}
This logs the original image size; you can then resize the image to whatever you wish.
It was a problem with the preview on the iPhone4... the preview on the iPhone4 is 16:9, but the final image is 4:3. On the iPhone5 and the iPad, both the preview and the final image are 4:3.
I didn't realize it before because my overlay was hiding part of the preview on the iPhone4, so I thought the preview was 4:3 too (and I think that the preview with iOS6 is 4:3, I'm not sure, I've got to find an iPhone4 with iOS6).
I made my overlay (a UIToolbar) translucent, and now I get what I want.
I'm using UIImagePickerController to fetch images from the user's photo library and/or taken with the camera. Works great.
I'm noticing that fetched images are often (always?) coming back with their imageOrientation set to UIImageOrientationRight. But the image was captured with the device in portrait orientation. Why is this? This is an iPhone4S, iOS6, using the rear camera - so the resolution is 8MP.
In the simulator, grabbing photos from the photo library, images come back UIImageOrientationUp.
When I display the image in a UIImageView the orientation looks correct (portrait/up). But when I go to crop the image the coordinate system isn't what I would expect. 0,0 is in the upper-right of the image, which I guess makes sense when it reports UIImageOrientationRight.
I'm looking for an explanation of what's going on and the correct approach to dealing with the odd coordinate system.
EDIT: it sure appears to me that, on iPhone4S at least, the camera always takes UIImageOrientationRight/"landscape" images, and that UIImageView is respecting the imageOrientation on display. However, if I save the image using UIImagePNGRepresentation the orientation is not preserved (I think I read about this somewhere.)
It has to do with the orientation the phone was in when the image was taken. The phone doesn't rotate the image data from the camera sensor so that up in the image is up; instead it sets imageOrientation, and UIImage takes care of rendering things the right way.
When you try to crop, you typically convert the image to a CGImage, and that loses the orientation information, so suddenly you get a crop with a strange orientation.
There are several categories on UIImage that you can get that will perform image cropping while taking imageOrientation into account.
Have a look at link or link
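As a sketch of what such a category does (this is illustrative, not taken from the linked categories): normalize the orientation first by redrawing, then crop with a plain CGImage call in the now-upright coordinate space.

```objc
// cropRect is expressed in the upright (as-displayed) coordinate space.
UIImage *orientationAwareCrop(UIImage *source, CGRect cropRect) {
    // Redraw to bake the imageOrientation flag into the pixel data.
    UIGraphicsBeginImageContextWithOptions(source.size, NO, source.scale);
    [source drawInRect:(CGRect){CGPointZero, source.size}];
    UIImage *upright = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Now 0,0 is the top-left as displayed, so a direct CGImage crop works.
    // Note: CGImageCreateWithImageInRect takes pixels, so apply the scale.
    CGRect pixelRect = CGRectMake(cropRect.origin.x * upright.scale,
                                  cropRect.origin.y * upright.scale,
                                  cropRect.size.width * upright.scale,
                                  cropRect.size.height * upright.scale);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(upright.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:upright.scale
                                     orientation:UIImageOrientationUp];
    CGImageRelease(croppedRef);
    return cropped;
}
```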
I am using UIImagePickerController to load an image that I have uploaded to the photo library (on my iPad); however, it loads without the alpha channel.
I have triple-checked to make sure the image has one. Unless it is removed when syncing with iTunes?
Here is the code I am using for the image picker when it finishes picking:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // dismiss the picker
    [imagePopOver dismissPopoverAnimated:YES];
    // get the picked image (we don't own it, so we must not release it)
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    // create an image view for the selected image
    UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
    imageView.frame = CGRectMake(0, 0, imageView.image.size.width, imageView.image.size.height);
    imageView.center = CGPointMake(512, 384);
    [imageLayer addSubview:imageView];
    // the superview retains the image view, so release our reference only
    [imageView release];
}
What am I doing wrong? I hope this isn't a 'Doh!' moment.
UIImagePicker does support images with an alpha channel, as Kangoo states in their own answer.
However, the conditions for it working correctly are more complicated than just downloading an image through a browser to the photo library.
First of all, it makes most sense to use iOS's image picker to pick only from the Camera Roll. On an iOS device, it's possible to save images from Mail or Safari to the Camera Roll, and other apps that support images with an alpha channel can also save to the Camera Roll.
Images in the rest of the Photos Library, other than the Camera Roll, would have gotten there by iTunes sync. And that's a problem: although iPhoto on OS X does support PNG images with an alpha channel, when iTunes sync pushes those images to an iOS device, the transparent areas of the background become opaque white.
Second, if you configure iOS's image picker to allow editing (square crop, move, scale), then the transparent areas of the image background will become opaque black!
But if you configure iOS's image picker to NOT allow editing, and you pick an image from the Camera Roll, then the picked image (UIImage) can have an alpha channel.
On the picked UIImage, use CGImageGetAlphaInfo(pickedImage.CGImage) to get the CGImageAlphaInfo. If that value is not kCGImageAlphaNone, kCGImageAlphaNoneSkipFirst, or kCGImageAlphaNoneSkipLast, then the image has an alpha channel.
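As a sketch, that check could be wrapped in a small helper (the function name is illustrative):

```objc
// Returns YES if the picked image carries an alpha channel.
BOOL imageHasAlpha(UIImage *image) {
    CGImageAlphaInfo info = CGImageGetAlphaInfo(image.CGImage);
    return info != kCGImageAlphaNone &&
           info != kCGImageAlphaNoneSkipFirst &&
           info != kCGImageAlphaNoneSkipLast;
}
```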
Never mind. It was as I suspected. The alpha channel is lost somehow when importing the png through itunes (syncing). Must be converting to 24bit? I uploaded my image to the web and then downloaded it through browser to photo lib and it's fine now.
Have you tried calling:
imageView.backgroundColor = [UIColor clearColor];
Probably you may need to make sure that imageLayer is transparent, too. I am not sure how this has been implemented.