In my app I have two image views. The first image view stays constant; for the second one, after picking an image, the image can be zoomed and placed anywhere in the view. The second image is then merged into the first, after which the second image view is set to nil and the first image view contains both images. The user can keep picking more images into the second image view and moving them around, but only one image should be picked at a time. How can I implement this? Thanks in advance.
A simple color image usually stores 24-32 bits of color information per pixel (8 bits each for red, green, and blue, e.g. ff ff ff, plus an optional alpha channel), while a grayscale image stores only 8 bits per pixel.
It should be obvious that a conversion from 32bit to 8bit is not reversible.
To achieve your goal you have to keep a copy of your original Image.
When you select an image from the gallery, save it in one UIImage object as the original image before applying any grayscale conversion.
Make another copy of the selected image, apply the grayscale conversion to that copy, and assign it to the image view.
Now, when the image view is touched (detect the touch with a gesture recognizer), set originalImage back on the image view.
Edit:
You can use a delegate method like this:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissModalViewControllerAnimated:YES];
    [picker release];

    // Edited image works great (if you allowed editing)
    UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];

    // AND the original image works great
    UIImage *originalImage = [info objectForKey:UIImagePickerControllerOriginalImage];

    // AND do whatever you want with it, (NSDictionary *)info is fine now
}
In the code above, you can declare both editedImage and originalImage as instance variables, assign editedImage to your image view, and then show originalImage when the image view is touched.
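As a rough sketch of that flow (the names self.originalImage and self.photoView are illustrative, not from the original post, and the Core Graphics conversion below is just one possible way to do the "apply grayscale" step; scale handling is omitted for brevity):

    // Keep the untouched color image aside, show a grayscale copy in the view.
    - (UIImage *)grayscaleCopyOfImage:(UIImage *)image {
        CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     (size_t)image.size.width,
                                                     (size_t)image.size.height,
                                                     8, 0, graySpace,
                                                     (CGBitmapInfo)kCGImageAlphaNone);
        CGContextDrawImage(context,
                           CGRectMake(0, 0, image.size.width, image.size.height),
                           image.CGImage);
        CGImageRef grayRef = CGBitmapContextCreateImage(context);
        UIImage *grayImage = [UIImage imageWithCGImage:grayRef];
        CGImageRelease(grayRef);
        CGContextRelease(context);
        CGColorSpaceRelease(graySpace);
        return grayImage;
    }

    // Attach a UITapGestureRecognizer to the image view; on tap, restore the original.
    - (void)handleTap:(UITapGestureRecognizer *)tap {
        self.photoView.image = self.originalImage;
    }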
I'm working on an iOS app with Parse that requires profile pictures to be in the shape of hexagons. Right now, I'm downloading the PFFile from Parse, grabbing that image and then masking it with a hexagon. This works well for simple views like the profile screen (masking is only required once), but the app suffers severe performance issues when masking a collection view with a list of followers' profile images.
In my mind, the best solution would be to upload the profile pictures to Parse already masked correctly, so all I have to do is pull them down and display them. Here's the code that I used to do this in my sign up view controller:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:nil];

    UIImage *chosenPicture = info[UIImagePickerControllerEditedImage];
    [EVNUtility maskImage:chosenPicture withMask:[UIImage imageNamed:@"MaskImage"] withCompletionBlock:^(UIImage *maskedImage) {
        self.profileImageView.image = maskedImage;
        self.pictureData = UIImagePNGRepresentation(maskedImage);
    }];
}
The image comes back correctly masked from my utility function, however when I use UIImagePNGRepresentation (or the JPG equivalent) to convert the UIImage to data (this data is then uploaded to Parse), the image loses its mask and is square again.
How can I keep the mask when converting a UIImage to NSData?
I've tried a couple of things, but I'm guessing this is due to my fuzzy understanding of how masking works and whether it affects the underlying image. Here's the reference I used for masking my image: http://iosdevelopertips.com/cocoa/how-to-mask-an-image.html
Pass the view that contains your image and mask into this method (I use it as a category on UIImage):
+ (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
This will return your 'flattened', masked image (make sure the UIView has a clear background).
You will then need to convert it to PNG (not JPG) before creating a PFFile; PNG retains the transparency.
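For example, a minimal sketch of that upload step (self.maskedContainerView and the "profilePicture" key are illustrative assumptions, not from the original post):

    // Flatten the view that holds the masked image into a single UIImage.
    UIImage *flattened = [UIImage imageWithView:self.maskedContainerView];

    // PNG keeps the alpha channel, so the hexagon mask survives the round trip.
    NSData *pngData = UIImagePNGRepresentation(flattened);

    // Wrap it in a PFFile and attach it to the current user
    // (use whatever key your Parse schema actually expects).
    PFFile *pictureFile = [PFFile fileWithName:@"profile.png" data:pngData];
    [[PFUser currentUser] setObject:pictureFile forKey:@"profilePicture"];
    [[PFUser currentUser] saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        // Handle success / failure here.
    }];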
I'm trying to get a full-sized image from the photo library using UIImagePickerControllerOriginalImage.
The returned UIImage size is 750 x 1001, but when I extract the image using Image Capture it is actually 3264 x 2448. How can I get the real original image?
When you initialize the UIImagePickerController, make sure to set the allowsEditing property to NO, which lets you get the full-sized original image.
For example, if you use:
self.imagePickerController = [[UIImagePickerController alloc] init];
self.imagePickerController.delegate = self;
self.imagePickerController.allowsEditing = YES;
the resulting image you get in - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info from info[UIImagePickerControllerOriginalImage] will be smaller than the actual original, just as is happening to you.
But if you set self.imagePickerController.allowsEditing = NO;, the UIImage from info[UIImagePickerControllerOriginalImage] will be the exact same size as it is in the library.
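So, a sketch of the setup that returns the full-resolution photo (same property name as the snippet above):

    self.imagePickerController = [[UIImagePickerController alloc] init];
    self.imagePickerController.delegate = self;
    self.imagePickerController.allowsEditing = NO;   // NO => info[UIImagePickerControllerOriginalImage] is full size
    [self presentViewController:self.imagePickerController animated:YES completion:nil];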
I'm trying to upload a UIImage taken from UIImagePickerController using UIImageJPEGRepresentation, but it shows up on the server with the wrong orientation. First of all, is there a way to fix it on my side? If not, how should it be handled on the server?
I have already seen iOS 4 Image Oriented Incorrectly When Posted To Server and Save UIImage, Load it in Wrong Orientation, but I need the image to have the right orientation directly on the server, not only in the application.
From what I've found, this seems to be the best answer.
iOS UIImagePickerController result image orientation after upload
Add the category UIImage+fixOrientation to your project and import "UIImage+fixOrientation.h" in your view controller.
Then you should be able to use something like this.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    image = [image fixOrientation];
    NSData *data = UIImageJPEGRepresentation(image, 1.0);
}
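If you prefer not to pull in the whole category, here is a minimal sketch of the same idea (a simplified version that just redraws the image so its pixel data matches UIImageOrientationUp; the linked category achieves the equivalent with explicit transforms):

    @implementation UIImage (fixOrientation)

    - (UIImage *)fixOrientation {
        // Nothing to do if the pixel data already matches the reported orientation.
        if (self.imageOrientation == UIImageOrientationUp) return self;

        // -drawInRect: honours imageOrientation, so the redrawn bitmap
        // comes out physically rotated the right way.
        UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
        [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
        UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return normalized;
    }

    @end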
Try:
UIImage *orientationFixedImage = [UIImage imageWithCGImage:[originalImage CGImage] scale:1.0 orientation:originalImage.imageOrientation];
When I choose a portrait photo from the library, like this one:
[portrait photo screenshot] (source: hostingpics.net)
the UIImage returned by the library is rotated.
Here is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo {
    UIImage *originalImg = [editInfo objectForKey:UIImagePickerControllerOriginalImage];
    [self useImage:originalImg];
}
How can I avoid that?
self.imageData = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
CIImage *image = [[CIImage alloc] initWithImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"]];
self.imageData = [UIImage imageWithCIImage:image scale:self.imageData.scale orientation:self.imageData.imageOrientation];
You can use the below methods to retain the image's orientation:
[UIImage imageWithCIImage:(CIImage *) scale:(CGFloat) orientation:(UIImageOrientation)];
[UIImage imageWithCGImage:(CGImageRef) scale:(CGFloat) orientation:(UIImageOrientation)];
Did you previously take the image using an image picker in your app and then save the image?
If so, did you set the orientation when you saved the image? You need to.
Consider adding a log statement or debugging to check what orientation is set on the image.
If the orientation is set on the image then UIImageView will automatically render the image correctly.
If (in the future) you're going to take the image data and upload it somewhere then you would need to check the orientation and re-draw the image while applying an appropriate transform to the context.
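For example, a quick, purely illustrative check right after picking the image:

    UIImage *picked = info[UIImagePickerControllerOriginalImage];
    // UIImageOrientationUp is 0; anything else means the pixel data is stored
    // rotated and only the orientation metadata tells consumers how to display it.
    NSLog(@"imageOrientation = %ld", (long)picked.imageOrientation);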
I am working on an application for iPad, and it was working fine until I reached this point:
The app shows the popover for the photo library, but when I choose the photo, the popover doesn't hide. I also want to show the selected image in a UIImageView, but I do not know how.
I am sure there is something wrong in the didFinishpickingMediaWithInfo function. Here is the function's code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishpickingMediaWithInfo:(NSDictionary *)info {
    // bgImage is a UIImageView
    bgImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];

    // Dismiss UIImagePickerController and release it
    [picker dismissModalViewControllerAnimated:YES];
    [picker.view removeFromSuperview];
    [picker release];
}
My first question is: what am I supposed to add to this function to show the selected photo in the UIImageView? (When I click on a photo from the photo library in the simulator, the photo library does not hide, nor is the image shown in the specified UIImageView.)
2- I have read that I should have used UIImage instead of UIImageView. Is this true? If yes, what about Interface Builder? There is nothing called UIImage there.
To close the image picker, use:
[[picker parentViewController] dismissModalViewControllerAnimated:YES];
What you get from
bgImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
is not a UIImageView; it's a UIImage. To get it displayed, you need a UIImageView somewhere in your UI, and you set that view's image to what you just got from the picker:
imageView.image = bgImage;
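Putting it together, a minimal sketch of the delegate method (bgImageView is an illustrative UIImageView outlet; also note the selector must be spelled didFinishPickingMediaWithInfo: with a capital P, otherwise UIKit never calls your method):

    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
        // The picked photo is a UIImage, not a UIImageView.
        UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];

        // Display it in an existing UIImageView outlet.
        self.bgImageView.image = pickedImage;

        // Dismiss the picker. If it is presented inside a UIPopoverController on iPad,
        // dismiss that popover instead, e.g. [self.photoPopover dismissPopoverAnimated:YES];
        [[picker parentViewController] dismissModalViewControllerAnimated:YES];
    }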
Hope this helps