Background
I have a 53 KB image (.jpg) in the simulator.
I select this image using UIImagePickerController and save it locally.
The code for selecting the image and storing it locally is below:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [[UIApplication sharedApplication] setStatusBarHidden:YES];
    NSString *mediaType = info[UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        // Media is an image
        UIImage *image = info[UIImagePickerControllerOriginalImage];
        if (image) {
            NSData *jpegData = UIImageJPEGRepresentation(image, 1);
            // Store the image (filePath is defined elsewhere).
            BOOL isImageSave = [jpegData writeToFile:filePath atomically:NO];
            if (isImageSave) {
                DLog(@"Image saved");
            } else {
                DLog(@"Error");
            }
        }
    }
    // Dismiss the picker itself, not self.view (UIView cannot dismiss a controller).
    [picker dismissViewControllerAnimated:YES completion:nil];
}
Problem:
The original image is 53 KB, but when I select it, the image grows and the stored file is 142 KB. I used compression to reduce the size, but I checked twice: the UIImagePickerController delegate method itself returns a larger image.
Does anyone have an idea why it returns a bigger image? Is there any way to get the image at its original file size?
You can resize the image to fit a particular frame using the code below:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Then call this method as:
UIImage *myIcon = [self imageWithImage:myUIImageInstance scaledToSize:CGSizeMake(20, 20)];
As far as storage of the image goes, the fastest image format to use on the iPhone is PNG, because iOS has optimizations for that format. However, if you want to store these images as JPEGs, you can take your UIImage and do the following:
NSData *dataForJPEGFile = UIImageJPEGRepresentation(theImage, 0.6);
You're using a JPEG encoding quality of 1, which corresponds to 100% quality. Try a setting of 0.2-0.8 when you perform the JPEG encoding and check the file size then.
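To pick a quality setting empirically, a quick sketch like the following logs the encoded size at several settings so you can choose the smallest acceptable one. The quality values here are arbitrary examples, and `image` is assumed to be the UIImage returned by the picker:

```objc
// Log the JPEG size of `image` at several quality settings.
// `image` is assumed to be the UIImage from the picker delegate.
for (NSNumber *q in @[@0.2, @0.4, @0.6, @0.8, @1.0]) {
    NSData *data = UIImageJPEGRepresentation(image, q.floatValue);
    NSLog(@"quality %.1f -> %lu bytes", q.floatValue, (unsigned long)data.length);
}
```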
Related
When I take a photo with UIImagePickerController, the camera focuses, but the picture that is taken is immediately very blurry. Is there a way to change the statements used in this method to produce a sharper image?
#pragma mark - Image picker delegate methods

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    // Resize the image from the camera (resizedImageWithContentMode: comes
    // from a UIImage resizing category).
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill
                                                       bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height)
                                         interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2,
                                                                 (scaledImage.size.height - photo.frame.size.height) / 2,
                                                                 photo.frame.size.width,
                                                                 photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    [picker dismissViewControllerAnimated:NO completion:nil];
    photo.transform = CGAffineTransformMakeRotation(M_PI * 0.5);
}
I have the implementation below to save a UIView as an image to the device's photo album. It works correctly, but the saved image adapts to the device's screen resolution. For instance, if I run it on an iPhone 5, the saved image will be 640 x 640 px. My goal is to save custom-sized images, like 1800 x 1800 px, on every device. I would really appreciate an example or any guidance that helps me find the right solution. Any other tips are welcome; my goal is to produce images of a custom pixel size, and it doesn't matter if I have to use a different implementation.
- (IBAction)saveImg:(id)sender {
    // self.fullVw holds the image that I want to save
    UIImage *imageToSave = [self.fullVw pb_takeSnapshot];
    NSData *pngData = UIImagePNGRepresentation(imageToSave);
    UIImage *imageToSave2 = [UIImage imageWithData:pngData];
    UIImageWriteToSavedPhotosAlbum(imageToSave2, nil, nil, nil);
}
// This method is in a UIView category
- (UIImage *)pb_takeSnapshot {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.opaque, 0.0);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
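One way to get a fixed pixel size regardless of device is to pass an explicit scale to the image context instead of 0.0 (0.0 means "use the screen's scale"). This is a sketch under the question's assumptions: the 1800 px target is the example size from the question, and it assumes the view is square so that scaling by width alone does not distort the result.

```objc
// Render the view into a context whose pixel width is targetPixels,
// independent of the device's screen scale. Assumes a square view;
// for a non-square view, compute the scale per axis instead.
- (UIImage *)pb_snapshotWithPixelSize:(CGFloat)targetPixels {
    CGSize pointSize = self.bounds.size;
    // Choose a scale so that pointSize.width * scale == targetPixels.
    CGFloat scale = targetPixels / pointSize.width;
    UIGraphicsBeginImageContextWithOptions(pointSize, self.opaque, scale);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image; // e.g. targetPixels = 1800 for an 1800 x 1800 px image
}
```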
This question already has answers here:
Cropping center square of UIImage
(19 answers)
Closed 7 years ago.
I'm currently cropping a UIImage in my -imagePickerController:didFinishPickingMediaWithInfo: method like so:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    picker.allowsEditing = YES;
    UIImage *img = info[UIImagePickerControllerOriginalImage];
    CGRect cropRect = CGRectMake(0, 0, 2448, 3264);
    UIImage *croppedImage = [img crop:cropRect];
    imageToPass = croppedImage;
    NSLog(@"Here's imageToPass: %@", imageToPass);
    NSLog(@"and here's imageToPass' width: %f", imageToPass.size.width);
    NSLog(@"and here's imageToPass' height: %f", imageToPass.size.height);
    NSLog(@"and here's imageToPass' scale: %f", imageToPass.scale);
    UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
    RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
    [postView storeImage:imageToPass];
    [self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
My problem is that when I print the width and height of my imageToPass variable, I find that the values are listed in points. My console prints like this:
I need to get back an image that is cropped to 320x320. With my code CGRect cropRect = CGRectMake(0, 0, 2448, 3264); I'm taking the original size of the photo, which by default with UIImagePickerController is, I'm assuming, 320x520 or something like that. Using point values, I can see that the image is 2448 points wide and 3264 points tall. From Google:
iPhone 5 display resolution is 1136 x 640 pixels. Measuring 4 inches diagonally, the new touch screen has an aspect ratio of 16:9 and is branded a Retina display with 326 ppi (pixels per inch).
I'm not sure what to do here. Does the math 2448 points / 640 px = 3.825 tell me that there are 3.825 points per pixel on a 326 ppi screen?
PS: keep in mind I'm trying to grab the 320x320 picture in the middle of the UIImagePickerControllerOriginalImage, which means cutting off some number of pixels at the top and bottom, determined in points I'm assuming.
EDIT
Here's the code for the crop: method in the fourth line of code above:
#import "UIImage+Crop.h"

@implementation UIImage (Crop)

- (UIImage *)crop:(CGRect)rect {
    rect = CGRectMake(rect.origin.x * self.scale,
                      rect.origin.y * self.scale,
                      rect.size.width * self.scale,
                      rect.size.height * self.scale);
    CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], rect);
    UIImage *result = [UIImage imageWithCGImage:imageRef
                                          scale:self.scale
                                    orientation:self.imageOrientation];
    CGImageRelease(imageRef);
    return result;
}

@end
I have found that if I set my crop rect with CGRect cropRect = CGRectMake(264, 0, 2448, 3000); // 3264 it actually removes 264 points from the top and bottom of the image. I understand that an iPhone 5s has a screen density of 326 ppi (pixels per inch); how can I use this to remove the number of pixels I need to remove?
You don't need to convert between points, pixels, retina/non-retina, etc., because the image carries a scale property that accounts for this. You do need Core Graphics to do the actual crop, though. Here's what it could look like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    picker.allowsEditing = YES;
    UIImage *img = info[UIImagePickerControllerOriginalImage];
    // You want your rect in the center of the image, not at (0, 0).
    // The crop rect is in pixels of the underlying CGImage, so multiply
    // the point size by the image's scale.
    CGFloat cropSize = 320 * img.scale;
    CGRect cropRect = CGRectMake((img.size.width * img.scale - cropSize) / 2,
                                 (img.size.height * img.scale - cropSize) / 2,
                                 cropSize,
                                 cropSize);
    // Make a new CGImageRef from the picked image, then use that to make
    // the cropped UIImage. Make sure to release that image ref!
    CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], cropRect);
    // Pass the image's own scale and orientation - this is how you handle
    // retina screens and rotated photos.
    imageToPass = [UIImage imageWithCGImage:imageRef
                                      scale:img.scale
                                orientation:img.imageOrientation];
    CGImageRelease(imageRef);
    UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
    RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
    [postView storeImage:imageToPass];
    [self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
Although I resize my images once the UIImagePickerController has finished taking a photo, my Instruments profile shows that calls to ImageIO_PNG take a massive amount of memory (40 MB+) each time I take a photo. This is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    @autoreleasepool {
        if (myImageView.image == nil) {
            myImageView.contentMode = UIViewContentModeScaleAspectFill;
            UIImage *topImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
            UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
            CGRect rect = CGRectMake(0, 0, 320, 440);
            UIGraphicsBeginImageContext(rect.size);
            // use the local image variable to draw in context
            [topImage drawInRect:rect];
            UIImage *topResized = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            myImageView.image = topResized;
            image = nil;
            info = nil;
            [picker dismissViewControllerAnimated:NO completion:nil];
            [picker removeFromParentViewController];
        }
    }
}
Remove the lines below from your code; that might help:
image = nil;
info = nil;
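The large allocations come from decoding the full-resolution photo into memory before drawing it into the smaller context. One way to avoid materializing the full bitmap, sketched here as an alternative rather than the answer's fix, is to let ImageIO downscale during decode with CGImageSourceCreateThumbnailAtIndex. This assumes you have the image's encoded data (for example, loaded from a file on disk):

```objc
#import <ImageIO/ImageIO.h>

// Decode a downscaled version of the image directly from its encoded
// data, so the full-resolution bitmap is never held in memory.
// `imageData` is assumed to be the encoded image bytes.
UIImage *downscaledImage(NSData *imageData, CGFloat maxPixelSize) {
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize),
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES // respect EXIF orientation
    };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    UIImage *result = [UIImage imageWithCGImage:scaled];
    CGImageRelease(scaled);
    CFRelease(source);
    return result;
}
```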
I load an image from the camera roll and send it to a function which returns a section of the original image based on a rectangle I create. I can then add this to the scene with no issues.
The problem occurs when I want to load another section of the original image. I want to create 2 sprites, each with different sections of the original image, like a jigsaw, but when I send the original image and a different rectangle, I get the same image as the first time and have two images the same added to the scene.
Any ideas would be appreciated.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // newImage is a UIImage; do not try to use a UIImageView
    UIImage *newImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *newImage2 = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Dismiss UIImagePickerController and release it
    [picker dismissModalViewControllerAnimated:YES];
    [picker.view removeFromSuperview];
    [picker release];
    CGSize s = [[CCDirector sharedDirector] winSize];
    CGRect newRect = CGRectMake(0, 0, 320, 460);
    CGRect newRect2 = CGRectMake(600, 600, 100, 100);
    UIImage *testImage = [self imageFromImage:newImage inRect:newRect];
    UIImage *testImage2 = [self imageFromImage:newImage2 inRect:newRect2];
    CCSprite *scaledImage = [CCSprite spriteWithCGImage:testImage.CGImage key:@"ImageFromPicker"];
    scaledImage.position = ccp(s.width / 2, s.height / 2);
    [self addChild:scaledImage];
    CCSprite *scaledImage2 = [CCSprite spriteWithFile:@"play.png"]; // [CCSprite spriteWithCGImage:testImage2.CGImage key:@"ImageFromPicker"];
    scaledImage2.position = ccp(560, 40);
    [self addChild:scaledImage2];
}
And the method that crops the image:
- (UIImage *)imageFromImage:(UIImage *)image inRect:(CGRect)rect
{
    CGImageRef sourceImageRef = [image CGImage];
    CGImageRef newImageRef = CGImageCreateWithImageInRect(sourceImageRef, rect);
    UIImage *tempImage = [UIImage imageWithCGImage:newImageRef];
    CGImageRelease(newImageRef); // release the copy created above to avoid a leak
    return tempImage;
}
Thanks to LearnCocos, you were spot on with your answer. I created two separate textures from different parts of the larger image and could then add them separately.
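For reference, the likely cause of the duplicate sprites in the original code is the texture cache: both sprites were created with the same key @"ImageFromPicker", and spriteWithCGImage:key: returns the previously cached texture when a key is reused. A sketch of the fix, with distinct cache keys (the rects are the example values from the question):

```objc
// Each call to spriteWithCGImage:key: caches its texture under the given
// key, so reusing a key returns the first cached texture. Distinct keys
// give each jigsaw piece its own texture.
UIImage *piece1 = [self imageFromImage:newImage inRect:CGRectMake(0, 0, 320, 460)];
UIImage *piece2 = [self imageFromImage:newImage inRect:CGRectMake(600, 600, 100, 100)];
CCSprite *sprite1 = [CCSprite spriteWithCGImage:piece1.CGImage key:@"JigsawPiece1"];
CCSprite *sprite2 = [CCSprite spriteWithCGImage:piece2.CGImage key:@"JigsawPiece2"];
[self addChild:sprite1];
[self addChild:sprite2];
```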