EDIT 1: This problem happens specifically on iOS 7.
I'm trying to crop an image that was just taken with UIImagePickerController, using some Core Graphics calls inside the imagePickerController:didFinishPickingMediaWithInfo: method:
- (void)imagePickerController:(UIImagePickerController *)photoPicker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];

    if (modeSelected == kModeTypeAutomatic) {
        // cropping image
        if (image.imageOrientation == UIImageOrientationUp) {
            CGImageRef tmpImgRef = [image CGImage];
            CGImageRef topImgRef = CGImageCreateWithImageInRect(tmpImgRef, CGRectMake(image.size.width/2, 0, image.size.width/2, image.size.height));
            secondImage = [UIImage imageWithCGImage:topImgRef];
            CGImageRelease(topImgRef);
        }
        else {
            CGImageRef tmpImgRef = [image CGImage];
            CGImageRef topImgRef = CGImageCreateWithImageInRect(tmpImgRef, CGRectMake(0, 0, image.size.width, image.size.height/2.0));
            secondImage = [UIImage imageWithCGImage:topImgRef scale:image.scale orientation:UIImageOrientationRight];
            CGImageRelease(topImgRef);
        }
    }

    image = nil;
    [photoPicker dismissViewControllerAnimated:NO completion:nil];

    if (secondImage) {
        [self performSegueWithIdentifier:@"SecondPicTakenSegue" sender:nil];
    }
}
I ran the Allocations and Leaks instruments in Instruments, and I verified that the problem is inside this method.
Am I not releasing something here? Is something missing?
Thanks!
Hard to tell from that code snippet, but secondImage could be what's being leaked. No harm in explicitly setting it to nil in your dealloc method to see if that helps:
- (void)dealloc {
    secondImage = nil;
}
This may be your issue (from the CGImage reference, under CGImageCreateWithImageInRect):
The resulting image retains a reference to the original image, which means you may release the original image after calling this function.
https://developer.apple.com/library/ios/documentation/graphicsimaging/reference/CGImage/Reference/reference.html#jumpTo_6
So you may want to try a CGImageRelease(tmpImgRef) since it sounds like you may be incrementing the reference count with that call to CGImageCreateWithImageInRect. I haven't tested this myself to ensure that this is the case, but perhaps worth a shot.
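For reference, a minimal sketch of that create/release pairing (hypothetical variable names; assuming ARC is managing the UIImage objects, so only the ref returned by the Create call needs an explicit release):

// The ref from [image CGImage] is borrowed from the UIImage.
CGImageRef sourceRef = [image CGImage];
CGRect half = CGRectMake(0, 0, image.size.width, image.size.height / 2.0);
// CGImageCreateWithImageInRect follows the Create rule, so we own croppedRef.
CGImageRef croppedRef = CGImageCreateWithImageInRect(sourceRef, half);
// The resulting croppedRef keeps its own reference to the source image,
// and the UIImage below retains the ref it wraps...
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
// ...so we can balance the Create call immediately.
CGImageRelease(croppedRef);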
Related
The following code no longer works on iOS 10. The image data remains unrotated.
I have confirmed that this code works on iOS 8 and 9.
CIImage *i = [[CIImage alloc] initWithImage:image];
imageView.image = [UIImage imageWithCIImage:i scale:image.scale orientation:UIImageOrientationRight];
Has anyone run into this same problem? Is this a bug, or an intended change?
My guess is that something has changed in terms of how UIImageView handles image rotation flags. I can't find the changes mentioned anywhere, but at least the code below works. Taken from here.
- (UIImage *)rotateImage:(UIImage *)sourceImage clockwise:(BOOL)clockwise
{
    CGSize size = sourceImage.size;
    UIGraphicsBeginImageContext(CGSizeMake(size.height, size.width));
    [[UIImage imageWithCGImage:[sourceImage CGImage]
                         scale:1.0
                   orientation:clockwise ? UIImageOrientationRight : UIImageOrientationLeft]
        drawInRect:CGRectMake(0, 0, size.height, size.width)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
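A possible call site, assuming the same image and imageView variables as in the question above:

// Rotate the freshly picked image clockwise before displaying it.
imageView.image = [self rotateImage:image clockwise:YES];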
If you need to change the CIImage's orientation, please try this.
let newCIImage: CIImage
if #available(iOS 11.0, *) {
    newCIImage = myCIImage.oriented(.right)
} else {
    newCIImage = myCIImage.oriented(forExifOrientation: Int32(CGImagePropertyOrientation.right.rawValue))
}
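A rough Objective-C equivalent, in case it helps (hedged: imageByApplyingCGOrientation: is the iOS 11 API, while imageByApplyingOrientation: takes the raw EXIF value on earlier releases):

CIImage *newCIImage;
if (@available(iOS 11.0, *)) {
    newCIImage = [myCIImage imageByApplyingCGOrientation:kCGImagePropertyOrientationRight];
} else {
    // 6 is the EXIF value for "right" (90 degrees clockwise), matching .right above
    newCIImage = [myCIImage imageByApplyingOrientation:6];
}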
I'm currently cropping a UIImage in my imagePickerController:didFinishPickingMediaWithInfo: delegate method, like so:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    picker.allowsEditing = YES;
    UIImage *img = [info objectForKey:UIImagePickerControllerOriginalImage];

    CGRect cropRect = CGRectMake(0, 0, 2448, 3264);
    UIImage *croppedImage = [img crop:cropRect];
    imageToPass = croppedImage;

    NSLog(@"Here's imageToPass: %@", imageToPass);
    NSLog(@"and here's imageToPass' width: %f", imageToPass.size.width);
    NSLog(@"and here's imageToPass' height: %f", imageToPass.size.height);
    NSLog(@"and here's imageToPass' scale: %f", imageToPass.scale);

    UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
    RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
    [postView storeImage:imageToPass];
    [self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
My problem is that when I print the width and height of my imageToPass variable, I find that the values are listed in points. My console prints like this:
I need to get back an image that is cropped to 320x320. With my code CGRect cropRect = CGRectMake(0, 0, 2448, 3264); I'm taking the original size of the photo, which by default with UIImagePickerController is, I'm assuming, 320x520 or something like that. Using point values I can see that the image is 2448 points wide and 3264 points high. From Google:
iPhone 5 display resolution is 1136 x 640 pixels. Measuring 4 inches diagonally, the new touch screen has an aspect ratio of 16:9 and is branded a Retina display with 326 ppi (pixels per inch).
I'm not sure what to do here. Does the math 2448 points / 640 px = 3.825 tell me that there are 3.825 points per pixel on a 326 ppi screen?
PS: keep in mind I'm trying to grab the 320x320 picture in the middle of the UIImagePickerControllerOriginalImage, which means cutting off some number of pixels from the top and bottom, determined in points, I'm assuming.
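As an aside, the pixel dimensions of a UIImage can be computed from its scale property, without reasoning about the screen's ppi (a small sketch using the img variable from the code above):

// UIImage.size is reported in points; multiply by scale to get pixels.
CGFloat pixelWidth  = img.size.width  * img.scale;
CGFloat pixelHeight = img.size.height * img.scale;
NSLog(@"pixels: %.0f x %.0f", pixelWidth, pixelHeight);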
EDIT
Here's the code for the crop: method in the fourth line of code above:
#import "UIImage+Crop.h"
#implementation UIImage (Crop)
- (UIImage *)crop:(CGRect)rect {
rect = CGRectMake(rect.origin.x*self.scale,
rect.origin.y*self.scale,
rect.size.width*self.scale,
rect.size.height*self.scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([self CGImage], rect);
UIImage *result = [UIImage imageWithCGImage:imageRef
scale:self.scale
orientation:self.imageOrientation];
CGImageRelease(imageRef);
return result;
}
#end
I have found that if I set my cropRect with CGRect cropRect = CGRectMake(264, 0, 2448, 3000); //3264 it actually removes 264 points from the top and bottom of the image. I understand that an iPhone 5s has a screen resolution of 326 ppi (pixels per inch); how can I use this to remove the exact number of pixels I need to remove?
You don't need to worry about converting points/pixels/retina/non-retina yourself, because of the scale property. You do need to use Core Graphics to do the actual crop, though. Here's what it could look like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    picker.allowsEditing = YES;
    UIImage *img = [info objectForKey:UIImagePickerControllerOriginalImage];

    // You want your rect centered in the image, not at (0,0).
    CGFloat cropSize = 320;
    CGRect cropRect = CGRectMake((img.size.width - cropSize) / 2,
                                 (img.size.height - cropSize) / 2,
                                 cropSize, cropSize);

    // Make a new CGImageRef for the cropped region, then use that to make the cropped UIImage.
    // Make sure to release that image ref!
    CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], cropRect);
    croppedImage = [UIImage imageWithCGImage:imageRef];

    // Reapply the image's scale and orientation - this is how you handle retina/orientation.
    imageToPass = [UIImage imageWithCGImage:imageRef
                                      scale:img.scale
                                orientation:img.imageOrientation];
    CGImageRelease(imageRef);

    UINavigationController *postControl = [self.storyboard instantiateViewControllerWithIdentifier:@"postControl"];
    RGPostViewController *postView = (RGPostViewController *)postControl.topViewController;
    [postView storeImage:imageToPass];
    [self.presentedViewController presentViewController:postControl animated:NO completion:nil];
}
Although I resize my images once the UIImagePickerController has finished taking a photo, my Instruments profile says that calls to ImageIO_PNG are taking massive amounts of memory (40 MB+) each time I take a photo. This is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    @autoreleasepool {
        if (myImageView.image == nil) {
            myImageView.contentMode = UIViewContentModeScaleAspectFill;
            UIImage *topImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
            UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

            CGRect rect = CGRectMake(0, 0, 320, 440);
            UIGraphicsBeginImageContext(rect.size);
            // use the local image variable to draw in context
            [topImage drawInRect:rect];
            UIImage *topResized = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            myImageView.image = topResized;
            image = nil;
            info = nil;
            [picker dismissViewControllerAnimated:NO completion:nil];
            [picker removeFromParentViewController];
        }
    }
}
Removing the lines below from your code might help:

image = nil;
info = nil;
I load an image from the camera roll and send it to a function which returns a section of the original image based on a rectangle I create. I can then add this to the scene with no issues.
The problem occurs when I want to load another section of the original image. I want to create 2 sprites, each with a different section of the original image, like a jigsaw, but when I pass the original image and a different rectangle, I get the same image as the first time, and end up with two identical images added to the scene.
Any ideas would be appreciated.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // newImage is a UIImage; do not try to use a UIImageView
    UIImage *newImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    UIImage *newImage2 = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    //newImage2 = [info objectForKey:@"UIImagePickerControllerOriginalImage"];

    // Dismiss UIImagePickerController and release it
    [picker dismissModalViewControllerAnimated:YES];
    [picker.view removeFromSuperview];
    [picker release];

    CGRect newRect = CGRectMake(0, 0, 320, 460);
    CGRect newRect2 = CGRectMake(600, 600, 100, 100);

    UIImage *testImage = [self imageFromImage:newImage inRect:newRect];
    UIImage *testImage2 = [self imageFromImage:newImage2 inRect:newRect2];

    CGSize s = [[CCDirector sharedDirector] winSize]; // window size for positioning

    CCSprite *scaledImage = [CCSprite spriteWithCGImage:testImage.CGImage key:@"ImageFromPicker"];
    scaledImage.position = ccp(s.width/2, s.height/2);
    [self addChild:scaledImage];

    CCSprite *scaledImage2 = [CCSprite spriteWithFile:@"play.png"]; //[CCSprite spriteWithCGImage:testImage2.CGImage key:@"ImageFromPicker"];
    scaledImage2.position = ccp(560, 40);
    [self addChild:scaledImage2];
}
And the method that crops the image:
- (UIImage *)imageFromImage:(UIImage *)image inRect:(CGRect)rect
{
CGImageRef sourceImageRef = [image CGImage];
CGImageRef newImageRef = CGImageCreateWithImageInRect(sourceImageRef, rect);
UIImage *tempImage = [UIImage imageWithCGImage:newImageRef];
return tempImage;
}
Thanks to LearnCocos, you were spot on with your answer. I created two separate textures from different parts of the larger image and could then add them separately.
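For reference, a hedged sketch of that fix using the question's variables; the key: argument is what distinguishes cocos2d's cached textures, so each crop gets its own key (the key names here are made up):

// Distinct cache keys so cocos2d builds a separate texture for each cropped section.
CCSprite *piece1 = [CCSprite spriteWithCGImage:testImage.CGImage key:@"PuzzlePiece1"];
CCSprite *piece2 = [CCSprite spriteWithCGImage:testImage2.CGImage key:@"PuzzlePiece2"];
piece1.position = ccp(s.width / 2, s.height / 2);
piece2.position = ccp(560, 40);
[self addChild:piece1];
[self addChild:piece2];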
I'm using this function to crop an image, and when I use it, it doesn't release memory; when I don't use it, memory behaves fine. I can't seem to find the problem with it. Please help.
- (void)processImage:(UIImage *)image {
    haveImage = YES;

    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    __weak UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGRect cropRect = CGRectMake(0, 0, image.size.width, image.size.height - ((image.size.height - image.size.width)/2));
    CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);
    __weak UIImage *corteSuperior = [UIImage imageWithCGImage:imageRef];

    cropRect = CGRectMake(0, ((image.size.height - image.size.width)/2), corteSuperior.size.width, corteSuperior.size.height - ((image.size.height - image.size.width)/2));
    imageRef = CGImageCreateWithImageInRect([corteSuperior CGImage], cropRect);
    self.imagenCamara.image = [UIImage imageWithCGImage:imageRef];

    smallImage = nil;
    imageRef = nil;
    corteSuperior = nil;
    UIGraphicsEndImageContext();
    CGImageRelease(imageRef);
}
You have to make a CGImageRelease call for each CGImageCreateWithImageInRect call. You're never releasing the first one: before you assign imageRef the second time, release the first imageRef.
Also, at the end, make sure you call CGImageRelease before you nil the imageRef pointer. By setting imageRef to nil first, you lose your reference to the image, and the CGImageRelease then has nothing to release.
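A hedged sketch of the question's method with the releases balanced as described above (same names as the question; the __weak qualifiers and the duplicated UIGraphicsEndImageContext call are also dropped, assuming ARC):

- (void)processImage:(UIImage *)image {
    haveImage = YES;

    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGRect cropRect = CGRectMake(0, 0, image.size.width,
                                 image.size.height - ((image.size.height - image.size.width) / 2));
    CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);
    UIImage *corteSuperior = [UIImage imageWithCGImage:imageRef];
    // Release the first crop before reusing the variable for the second one.
    CGImageRelease(imageRef);

    cropRect = CGRectMake(0, (image.size.height - image.size.width) / 2,
                          corteSuperior.size.width,
                          corteSuperior.size.height - ((image.size.height - image.size.width) / 2));
    imageRef = CGImageCreateWithImageInRect([corteSuperior CGImage], cropRect);
    self.imagenCamara.image = [UIImage imageWithCGImage:imageRef];
    // Release while we still hold the pointer; nil-ing it first would leak the image.
    CGImageRelease(imageRef);
}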