I am trying to set different images on a UIImageView, but I don't know how to handle the image orientation: whenever I call image.imageOrientation the result is UIImageOrientationUp, even when the image is clearly not upright.
Here are some examples:
[Screenshot: an image displayed with the wrong orientation]
[Screenshot: another image displayed with the wrong orientation]
I want all the images to be displayed correctly, but I don't know how to do that if image.imageOrientation doesn't give me the correct value.
Here is my code:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *getImagePath = [documentsDirectory stringByAppendingPathComponent:@"porfileImage.png"];
UIImage *img = [UIImage imageWithContentsOfFile:getImagePath];
if (img == nil) {
    img = [UIImage imageNamed:@"user"];
}
NSLog(@"Orien: %ld", (long)img.imageOrientation);
self.porfileImage.image = [UIImage imageWithCGImage:[img CGImage] scale:[img scale] orientation:UIImageOrientationUp];
I found my error!
I was saving the image using UIImagePNGRepresentation(image);, which strips the orientation metadata, so when I loaded it again the image came back with the default orientation, which isn't the correct one.
The solution is to use UIImageJPEGRepresentation(image, 1.0); instead of UIImagePNGRepresentation(image);: JPEG data keeps the EXIF orientation, so the image is automatically saved the right way up.
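If you do need PNG output, a common workaround is to redraw the image into a bitmap context first, which bakes the orientation into the pixels before the metadata is lost. A minimal sketch (the helper name normalizedImage: is mine, not from the original post):

- (UIImage *)normalizedImage:(UIImage *)image {
    // drawInRect: honors imageOrientation, so the redrawn copy is upright.
    if (image.imageOrientation == UIImageOrientationUp) return image;
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}

After this, UIImagePNGRepresentation([self normalizedImage:image]) produces a PNG that displays the right way up.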
Nearly all images taken by a camera carry metadata (EXIF data) that includes the image orientation. As the image is passed along various channels such as email or messaging, some of this data may be stripped off. So whenever you load the image through CGImage, it falls back to the default orientation, UIImageOrientationUp.
One solution might be to create two versions of your image, a UIImage and a CGImage, and compare their heights. If they are equal, the CGImage version is not rotated and you are good. Otherwise, the CGImage conversion has messed up the orientation; in that case, swap the width/height of the origin rect and rotate the image accordingly.
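A rough sketch of that comparison (reusing getImagePath from the question; what you do on a mismatch is up to you):

// UIImage.size is orientation-aware and in points; CGImageGetHeight is raw pixels.
UIImage *uiImage = [UIImage imageWithContentsOfFile:getImagePath];
size_t cgHeight = CGImageGetHeight(uiImage.CGImage);
if ((size_t)(uiImage.size.height * uiImage.scale) != cgHeight) {
    // The raw bitmap is stored rotated relative to what UIImage reports:
    // swap width and height and rotate before working with the CGImage directly.
}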
I added pinch-to-zoom and rotate gestures to a UIImageView (myImageView). After zooming and rotating, there is an apply button to save the image. Below is the apply method. It saves the image correctly, with the exact rotation and scaling, just like a screenshot.
UIGraphicsBeginImageContextWithOptions(view.frame.size, view.opaque, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImageJPEGRepresentation(viewImage, 1.0); // quality is 0.0-1.0, not 0-100
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *strPath = [documentsDirectory stringByAppendingPathComponent:@"BorderedImage.jpg"];
[data writeToFile:strPath atomically:YES];
But after doing this, I reassign the saved image to myImageView:
myImageView.image = [self loadImage:@"BorderedImage.jpg"];
What happens is: if the image was zoomed out, it becomes even smaller; if it was zoomed in, it becomes too big; and if it was rotated, it shows up with a completely wrong rotation.
I know this is some kind of UIImageView problem, because I checked the image in the Documents folder, and it looks exactly like it did when I pressed the apply button.
I am new to iOS development; kindly help me out.
After a great headache, I solved it myself. Just before reassigning the transformed image, I reset the UIImageView's transform to its original value:
myImageView.transform = defaultTransform;
where defaultTransform is the transform captured back in viewDidLoad:
CGAffineTransform defaultTransform;
defaultTransform = myImageView.transform; // in viewDidLoad
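Putting the pieces together, the flow looks roughly like this (a sketch; loadImage: is the asker's own helper, and defaultTransform is assumed to be an instance variable):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Remember the untouched transform once.
    defaultTransform = myImageView.transform;
}

- (IBAction)apply:(id)sender {
    // ... render the layer and write BorderedImage.jpg as shown above ...
    // Reset the view before assigning the flattened image, otherwise the old
    // rotation/scale is applied on top of the one already baked into the pixels.
    myImageView.transform = defaultTransform;
    myImageView.image = [self loadImage:@"BorderedImage.jpg"];
}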
In my app I have to take a picture and overlay the following information on it:
Weather forecast
Temperature
GPS location
So far I have obtained this information using GPS and a weather web service (OpenWeatherMap). I did it like this:
I take the picture with the standard UIImagePicker
I put a button on my interface to show the picture to the user
When the user presses the button, the app opens a new view controller in which I show the picture just taken, plus two UILabels (one for the temperature and one for the location) and a UIImageView (to show a weather icon). The UILabels and the UIImageView are laid out directly in the storyboard.
Now I need to merge the picture with the two UILabels and the UIImageView. Is there a way to merge them into a single UIImage?
I have to do that in order to save the picture together with the weather forecast and location.
UPDATE
I created a button to save the picture with the labels and the image view; the code I wrote is this:
- (IBAction)buttonSavePicture:(UIButton *)sender {
    [self.imageView addSubview:self.labelPlace];
    [self.imageView addSubview:self.labelTemperature];
    [self.imageView addSubview:self.imageViewWeather];
    UIGraphicsBeginImageContext(self.imageView.bounds.size);
    [self.imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentationDirectory, NSUserDomainMask, YES);
    NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:self.filename];
    [UIImageJPEGRepresentation(finalImage, 1) writeToFile:filePath atomically:YES];
}
But when I look in the Documents directory to check whether the picture was saved correctly, I don't find it.
Yes, you can easily do it by capturing them. Follow these steps:
Create a small parent view in the storyboard and put all the controls you want to capture inside it. Create an outlet, say captureView.
Call the following method when you need it:
- (void)capture {
    UIGraphicsBeginImageContext(self.captureView.bounds.size);
    [self.captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // FINAL OUTPUT
    self.imageView.image = capturedImage;
}
Cheers.
If you are using iOS 7, have a look at snapshotViewAfterScreenUpdates: and drawViewHierarchyInRect:afterScreenUpdates:. The latter is the one to use when you want subviews such as labels included in the capture. snapshotViewAfterScreenUpdates: returns a single UIView of everything on screen, while drawViewHierarchyInRect: lets you draw the whole hierarchy into a context and save it as a UIImage.
CGSize imgSize = CGSizeMake(self.view.bounds.size.width, self.view.bounds.size.height);
UIGraphicsBeginImageContextWithOptions(imgSize, NO, 0.0f);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *weatherImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(weatherImage, nil, nil, nil); // save to the Saved Photos album
If all went right, you should have your "screenshot" in the photo album.
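For completeness, here is a sketch of the same capture using the iOS 7 method mentioned above; drawViewHierarchyInRect:afterScreenUpdates: renders the subviews without going through the layer:

UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0f);
// iOS 7+: draws the complete view hierarchy, labels and image views included.
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *weatherImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(weatherImage, nil, nil, nil); // save to the photo album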
I want to share an image to Instagram, but before sharing I need to let the user crop their own photo, so I used VPImageCropperViewController (https://github.com/windshg/VPImageCropper) to crop the image first and then share it to Instagram. However, the result is an over-scaled image:
[Screenshot: the crop area]
[Screenshot: the over-scaled result]
Here is my code:
- (IBAction)shareIt:(id)sender {
    UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO, 0.0);
    [captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    VPImageCropperViewController *imgCropperVC =
        [[VPImageCropperViewController alloc]
            initWithImage:image
                cropFrame:CGRectMake(0, 100.0f, self.view.frame.size.width, self.view.frame.size.width)
          limitScaleRatio:3.0];
    imgCropperVC.delegate = self;
    [self presentViewController:imgCropperVC animated:YES completion:nil];
}
And the VPImageCropperDelegate method:
- (void)imageCropper:(VPImageCropperViewController *)cropperViewController didFinished:(UIImage *)editedImage {
    [cropperViewController dismissViewControllerAnimated:YES completion:^{
        NSURL *instagramURL = [NSURL URLWithString:@"instagram://location?id=1"];
        if ([[UIApplication sharedApplication] canOpenURL:instagramURL]) {
            NSURL *url;
            docFile.delegate = self;
            // Save image to the Documents directory
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *documentsDirectory = [paths objectAtIndex:0];
            NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"savedImage.jpg"];
            NSData *imageData = UIImagePNGRepresentation(editedImage);
            [imageData writeToFile:savedImagePath atomically:NO];
            // Load image
            NSString *getImagePath = [documentsDirectory stringByAppendingPathComponent:@"savedImage.jpg"];
            UIImage *tempImage = [UIImage imageWithContentsOfFile:getImagePath];
            // Hook it up with Instagram
            NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Image.ig"];
            [UIImageJPEGRepresentation(tempImage, 1.0) writeToFile:jpgPath atomically:YES];
            url = [[NSURL alloc] initFileURLWithPath:jpgPath];
            docFile = [UIDocumentInteractionController interactionControllerWithURL:url];
            [docFile setUTI:@"com.instagram.photo"];
            docFile.annotation = @{ @"InstagramCaption" : @" #geometrica" };
            [docFile presentOpenInMenuFromRect:self.view.bounds inView:self.view animated:YES];
        } else {
            UIAlertView *errorToShare = [[UIAlertView alloc] initWithTitle:@"Instagram unavailable" message:@"You need to install Instagram" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [errorToShare show];
        }
    }];
}
It seems like the error here lies in the instantiation of your imgCropperVC. You are telling the cropper to crop a specific rectangle (your yellow border's size) out of the image you are providing. Since you said in a comment that the picture is the same size as the screen, and since your result appears to be EXACTLY half the width (it's cropped right through his face), I suggest trying:
[[VPImageCropperViewController alloc]
    initWithImage:image
        cropFrame:CGRectMake(0, 100.0f, self.view.frame.size.width * 2, self.view.frame.size.width * 2)
  limitScaleRatio:3.0];
This might have something to do with @2x, though I'm not really sure it solves your issue for other images in general.
I suspect the original picture is bigger than the screen and has been scaled down to fit it. If this is correct, seemingly weird things happen.
Your screen is probably 640px wide. Let's assume the original picture (picture.jpg) has a width of 1280 pixels. When it is shown in your UIImageView or UIScrollView, it will obviously be scaled to fit, or you wouldn't be able to see the entire picture. If you now use your line of code, you are asking the cropper to crop a width of 640 (the width of your screen and rectangle) out of a picture that is 1280px wide. The cropper doesn't care how wide your screen is; it just crops 640px out of a 1280px-wide image, because that's what you told it to do. This results in the image being cut in half. If your image is scaled down (which it is by default, I guess), you need to apply that scale in your line of code. If the original picture actually IS 1280px wide, then my code above will work, because I doubled the rectangle's size (the width of your view is probably 320 points, which is the same as 640 retina pixels, I believe).
If you multiply self.view.frame.size.width and height by the factor the picture has been scaled, you should get the correct image. This also applies to the "static" 100.0f you pass: for your VIEW it's 100, but if the image is scaled up or down this will not be correct, and you need to multiply it by the same factor. I'm sure you can get the zoom scale from the scroll view or the image view.
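As a sketch of that multiplication (scaleFactor here is my own illustration; derive it from your actual image and view sizes, or from the scroll view's zoomScale):

// If the underlying image is scaleFactor times larger than the view showing
// it, the crop rect must grow by the same factor, the y-offset included.
CGFloat scaleFactor = image.size.width / self.view.frame.size.width;
CGRect cropFrame = CGRectMake(0.0f,
                              100.0f * scaleFactor,
                              self.view.frame.size.width * scaleFactor,
                              self.view.frame.size.width * scaleFactor);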
I suggest testing your app with different image sizes. If you test an image with exactly the same pixel size as your UIView, I think it will work perfectly, which would confirm my suspicion.
I finally figured it out. To crop the image to a target size, there is a method in the project's demo called:
- (UIImage *)imageByScalingAndCroppingForSourceImage:(UIImage *)sourceImage targetSize:(CGSize)targetSize;
You can specify your target image size with it, something like this:
image = [self imageByScalingAndCroppingForSourceImage:image targetSize:CGSizeMake(1024, 1024)];
I need to create a calendar from an image (the images are 640 x 960 or 960 x 640). One approach I tried was to create a view, add an image view to it, present the image, "draw" on it with subviews, and then save the view to a file.
This works as planned, but when testing it in the simulator, the resolution of the saved image is 306 x 462 (the size of the view I'm saving). I just lost half my original resolution...
Is there any way to work around this?
The code that saves the view:
UIGraphicsBeginImageContext(self.backgroundImageView.bounds.size);
[self.backgroundImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
NSString *fullPath = [NSString stringWithFormat:@"%@/Calendar.png", basePath];
NSData *binaryImageData = UIImagePNGRepresentation(image);
[binaryImageData writeToFile:fullPath atomically:YES];
You're not accounting for the device's screen scale; that should do it: UIGraphicsBeginImageContextWithOptions(self.backgroundImageView.bounds.size, YES, [UIScreen mainScreen].scale);.
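Applied to the code above, that would look something like this (a sketch; the YES assumes the image view is opaque):

// Capturing at the device's native scale makes a retina screen produce a
// full-resolution bitmap instead of a point-sized one.
UIGraphicsBeginImageContextWithOptions(self.backgroundImageView.bounds.size,
                                       YES, [UIScreen mainScreen].scale);
[self.backgroundImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();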
If anyone has a similar problem, I've come up with two solutions:
1) Enlarge everything before saving (you'll have to play around with autoresizing masks).
2) Put everything into a scroll view and display it at full size, so the user can scroll around to see everything.
I really want to know if there's a way to get iOS's light linen background. Here's what it looks like:
[Screenshot: the light linen texture]
Is there a way to access this by merely using the built-in SDK? Or do I have to find an image like this somewhere?
EDIT: I didn't want to use private APIs, so here's what I did. I grabbed the image this way:
CGRect rect = CGRectMake(0.0f, 0.0f, 640.0f, 960.0f);
UIGraphicsBeginImageContext(rect.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetFillColorWithColor(context, [[UIColor underPageBackgroundColor] CGColor]);
CGContextFillRect(context, rect);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *imagepath = [documentsDirectory stringByAppendingPathComponent:@"underPageBackground.png"];
[UIImagePNGRepresentation(image) writeToFile:imagepath atomically:YES];
And here is the resulting PNG at retina resolution:
[Image: the captured underPageBackground texture]
Hope this is useful to someone. :)
Use [UIColor underPageBackgroundColor]. And here is a link with useful information and samples.
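For instance (a sketch; note that underPageBackgroundColor was introduced in iOS 5 and later deprecated in iOS 7):

// Fills the view with the system's "under the page" linen texture.
self.view.backgroundColor = [UIColor underPageBackgroundColor];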
I'm not sure it is exactly the same one (it seems darker to me), but you can choose the "Scroll View Textured Background Color" in Interface Builder. To do so, when selecting a color, choose the dropdown on the right instead of the color box on the left.
I believe you should be able to find this background in the SDK (it should be the default one for this kind of "flick page up" behavior) or on Google (try searching for UIStockImageUnderPageBackground.png).
Otherwise, it looks like a repeating pattern. What you could do is import the attached screenshot, cut out a piece without the shadows, and fill a blank canvas with it so that the sides match, re-forming the original pattern.
This isn't really a solution for getting the actual image, but you can easily make the texture yourself. I pulled up a tutorial on how to make it in Photoshop: iOS Linen tutorial here.
This way you can make it whatever color you want!