Preserving the mask on a UIImage when converting it to NSData - ios

I'm working on an iOS app with Parse that requires profile pictures to be in the shape of hexagons. Right now, I'm downloading the PFFile from Parse, grabbing that image and then masking it with a hexagon. This works well for simple views like the profile screen (masking is only required once), but the app suffers severe performance issues when masking a collection view with a list of followers' profile images.
In my mind, the best solution would be to upload the profile pictures to Parse already masked correctly, so all I have to do is pull them down and display them. Here's the code that I used to do this in my sign up view controller:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:nil];

    UIImage *chosenPicture = info[UIImagePickerControllerEditedImage];
    [EVNUtility maskImage:chosenPicture withMask:[UIImage imageNamed:@"MaskImage"] withCompletionBlock:^(UIImage *maskedImage) {
        self.profileImageView.image = maskedImage;
        self.pictureData = UIImagePNGRepresentation(maskedImage);
    }];
}
The image comes back correctly masked from my utility function; however, when I use UIImagePNGRepresentation (or the JPEG equivalent) to convert the UIImage to data (which is then uploaded to Parse), the image loses its mask and is square again.
How can I keep the mask when converting a UIImage to NSData?
I've tried a couple of things, but I'm guessing this is due to my fuzzy understanding of how masking is done and if it affects the underlying image. Here's the reference I used for masking my image: http://iosdevelopertips.com/cocoa/how-to-mask-an-image.html
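For reference, the masking approach from that article is essentially the following (my reconstruction, so treat it as a sketch). Note that the mask is attached to the CGImage rather than baked into the bitmap's alpha channel, which is consistent with the PNG coming out square until the image is flattened, as the answer below does:

+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    // Build a grayscale masking image from the mask's pixel data.
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    // Attach the mask to the source image; it is applied when the image is drawn.
    CGImageRef maskedRef = CGImageCreateWithMask(image.CGImage, mask);
    UIImage *masked = [UIImage imageWithCGImage:maskedRef];
    CGImageRelease(mask);
    CGImageRelease(maskedRef);
    return masked;
}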

Pass the view that contains your image and mask into this method (I use it as a category on UIImage):
+ (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
This will return your 'flattened', masked image (make sure the UIView has a clear background).
You will then need to convert it to PNG (not JPEG) before creating the PFFile; PNG retains the transparency.
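Putting it together, a minimal sketch (the container view name maskedContainerView is hypothetical; it is assumed to hold the masked image view on a clear background):

UIImage *flattened = [UIImage imageWithView:self.maskedContainerView];
NSData *pngData = UIImagePNGRepresentation(flattened); // PNG keeps the alpha channel
PFFile *pictureFile = [PFFile fileWithName:@"profile.png" data:pngData];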

Related

Image loses quality when scaled

I know that when scaling down an image you have to expect some loss of quality, but when I assign an image to a UIButton of size (75,75) it has great quality.
When I scale the image to size (75, 75) for copy/paste using UIPasteboard, it has really bad quality.
Background: My app is a keyboard extension, so I have buttons with assigned images and when they are clicked, I get the image from the button, scale it to be the right size, copy it to UIPasteboard, then paste.
Code:
Here is my code for detecting a button click and copying an image:
- (IBAction)clickedImage:(id)sender {
    UIButton *btn = sender;
    UIImage *scaledImage = btn.imageView.image;
    UIImage *newImage = [scaledImage imageWithImage:scaledImage andSize:CGSizeMake(75, 75)];
    NSData *imgData = UIImagePNGRepresentation(newImage);
    UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
    [pasteboard setData:imgData forPasteboardType:[UIPasteboardTypeListImage objectAtIndex:0]];
}
And I have a UIImage category with the imageWithImage:andSize: method for scaling the image. This is the scaling method:
- (UIImage *)imageWithImage:(UIImage *)image andSize:(CGSize)newSize {
    // Create a bitmap context.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
What doesn't make sense is that when I put the image in the UIButton it is scaled down to exactly the same size as when I scale it in code, yet the quality is far better in the UIButton than in the image my scaling method returns. Is there something wrong with my scaling code? Does anyone know why there is such a drop in quality between the two images?
A better way to do this is to use ImageIO to resize your images. It takes a little bit longer, but it is far better for scaling images than redrawing into a graphics context.
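For illustration, a minimal sketch of an ImageIO downscale using its thumbnail API (the method name scaledImageWithData:maxPixelSize: is mine, not from the answer):

#import <ImageIO/ImageIO.h>

- (UIImage *)scaledImageWithData:(NSData *)data maxPixelSize:(CGFloat)maxSize {
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (!source) return nil;
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // honor EXIF orientation
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxSize)
    };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!scaled) return nil;
    UIImage *result = [UIImage imageWithCGImage:scaled];
    CGImageRelease(scaled);
    return result;
}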
Did you try https://github.com/mbcharbonneau/UIImage-Categories?
There is an interesting method in the Resize category:
- (UIImage *)resizedImage:(CGSize)newSize
interpolationQuality:(CGInterpolationQuality)quality;
Setting quality to kCGInterpolationHigh seems to give a good result (it's a little slower).
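For example (assuming the Resize category header from that repository; the exact header name may differ):

#import "UIImage+Resize.h"

UIImage *resized = [originalImage resizedImage:CGSizeMake(75, 75)
                          interpolationQuality:kCGInterpolationHigh];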

How To implement double exposure for picking images?

In my app I have two image views. The first image view stays constant. For the second image view, after picking an image the user can zoom it and place it anywhere in the view; the second image is then merged into the first, after which the second image view is cleared. The first image view ends up holding both images, and the user can keep picking images for the second image view and move them anywhere, but only a single image can be picked at a time. How can I implement this? Thanks in advance.
A simple color image usually stores 24-32 bits of color information per pixel (8 bits each for red, green, and blue, e.g. ff ff ff, plus an optional alpha channel), whereas a grayscale image stores 8 bits per pixel.
It should be obvious that a conversion from full color to 8-bit grayscale is not reversible.
To achieve your goal you have to keep a copy of your original image:
When you select an image from the gallery, save it in a UIImage property as the original image before applying any grayscale to it.
Make another copy of the selected image, apply the grayscale to that copy, and assign it to the image view.
Now, when you touch that image view (detect the touch using gestures), set originalImage on the image view.
Edit:
You can have a method like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:nil];

    // The edited image works great (if you allowed editing)
    UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];
    // AND the original image works great
    UIImage *originalImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Do whatever you want with them; (NSDictionary *)info is fine now
}
In the code above you can declare both editedImage and originalImage at a scope the whole controller can see (e.g. as properties), assign editedImage to your image view, and on touch show originalImage.
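A minimal sketch of that touch handling, where photoImageView and originalImage are hypothetical property names:

// In viewDidLoad (or wherever the image view is configured):
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(showOriginal:)];
self.photoImageView.userInteractionEnabled = YES; // UIImageView ignores touches by default
[self.photoImageView addGestureRecognizer:tap];

// Handler: swap the unfiltered copy back in.
- (void)showOriginal:(UITapGestureRecognizer *)gesture {
    self.photoImageView.image = self.originalImage;
}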

NSData from UIImage. Uploading to server with wrong orientation

I'm trying to upload a UIImage taken from UIImagePickerController using UIImageJPEGRepresentation, but it shows up on the server with the wrong orientation. First of all, I'd like to ask: is there a way to fix it on my side? And if not, how do I handle it on the server?
I already saw iOS 4 Image Oriented Incorrectly When Posted To Server and also Save UIImage, Load it in Wrong Orientation, but I need the image shown with the right orientation directly on the server, not just in the application.
From what I've found, this seems to be the best answer: iOS UIImagePickerController result image orientation after upload.
Add the category UIImage+fixOrientation to your project and import "UIImage+fixOrientation.h" in your view controller.
Then you should be able to use something like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    image = [image fixOrientation];
    NSData *data = UIImageJPEGRepresentation(image, 1.0);
}
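If you don't want to pull in the whole category, the core of the fix is simply redrawing the image so the EXIF orientation gets baked into the pixels. A minimal sketch of such a method (not the full category, which special-cases each orientation):

- (UIImage *)fixOrientation {
    // Already upright: nothing to do.
    if (self.imageOrientation == UIImageOrientationUp) return self;

    // drawInRect: applies the orientation transform, producing normalized pixels.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}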
Try:
UIImage *orientationFixedImage = [UIImage imageWithCGImage:[originalImage CGImage] scale:1.0 orientation:originalImage.imageOrientation];

Rotation after selecting a UIImage from the library

When I choose a portrait photo from the library (example screenshot originally hosted on hostingpics.net), the UIImage returned by the library is rotated.
Here is my code :
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo {
    UIImage *originalImg = [editInfo objectForKey:UIImagePickerControllerOriginalImage];
    [self useImage:originalImg];
}
How can I avoid that?
self.imageData = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
CIImage *image = [[CIImage alloc] initWithImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"]];
self.imageData = [UIImage imageWithCIImage:image scale:self.imageData.scale orientation:self.imageData.imageOrientation];
You can use the below methods to retain the image's orientation:
[UIImage imageWithCIImage:(CIImage *) scale:(CGFloat) orientation:(UIImageOrientation)];
[UIImage imageWithCGImage:(CGImageRef) scale:(CGFloat) orientation:(UIImageOrientation)];
Did you previously take the image using an image picker in your app and then save the image?
If so, did you set the orientation when you saved the image? You need to.
Consider adding a log statement or debugging to check what orientation is set on the image.
If the orientation is set on the image then UIImageView will automatically render the image correctly.
If (in the future) you're going to take the image data and upload it somewhere then you would need to check the orientation and re-draw the image while applying an appropriate transform to the context.

UIImagePickerController image scaling while maintaining quality

An image captured using the camera is returned at a size of 720×960.
The captured image is displayed in a UIImageView of 320×436, like this:
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 436.0)];
imgView.image = img; // image received from camera
[self.view addSubview:imgView];
This works fine: the 720×960 image is scaled to 320×436 and displayed.
Now the actual problem starts. I have another image of size 72×72. This image is overlaid on the image received from the camera at some arbitrary coordinates, for example:
CGRectMake(0.0, 0.0, 72.0, 72.0);
I am not able to find a good way to handle scaling and applying an overlay of another image while at the same time maintaining quality.
The image needs to be sent to a server.
Use the following code to scale images:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
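The overlay from the question can be drawn in the same pass, so the base image is only resampled once. A sketch (overlayRect stands in for whatever coordinates you chose; passing 0.0 as the scale is my addition so the context matches the screen scale):

- (UIImage *)imageWithImage:(UIImage *)base
                    overlay:(UIImage *)overlay
                     inRect:(CGRect)overlayRect
               scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [base drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    [overlay drawInRect:overlayRect]; // drawn on top in the same pass
    UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return composite;
}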
