My app lets the user take photos, and every photo carries a small watermark. The problem: the watermark appears bigger when the photo has been taken with the front camera. I want the watermark to be the same size no matter which camera was used.
Any ideas?
My code:
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - watermarkImage.size.width, backgroundImage.size.height - watermarkImage.size.height, watermarkImage.size.width, watermarkImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.imageView.image = result;
The watermark is the same size. The image is not, since the two cameras have different resolutions. You need to resize the watermark in proportion to the image size. I believe you can use scaleImage:toSize: for this.
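For instance, a minimal sketch of proportional scaling (the 20% width ratio is purely an illustrative choice; the variable names follow the question's code):
CGFloat ratio = 0.2; // illustrative: watermark spans 20% of the photo's width
CGFloat wmWidth = backgroundImage.size.width * ratio;
// Preserve the watermark's own aspect ratio.
CGFloat wmHeight = wmWidth * (watermarkImage.size.height / watermarkImage.size.width);
CGRect wmRect = CGRectMake(backgroundImage.size.width - wmWidth,
                           backgroundImage.size.height - wmHeight,
                           wmWidth, wmHeight);
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:wmRect]; // bottom-right corner, sized relative to the photo
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();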
Figured out a weird solution. I turned on edit mode: [picker setAllowsEditing:YES]; and now the watermark is the same size no matter what camera you use.
Related
I have an application that scans text from a picture using OCR. The problem is that my source image is 3024*3024. I copied the image into the iPhone 6 Simulator in Xcode. When I picked the image with UIImagePickerController, it was resized to 748*748 and its quality degraded.
When I tried to increase the image size, the quality still wasn't the same.
What I need is the image at its actual size, with no change in quality/resolution.
Try this code to resize the image:
- (UIImage *)imageWithImage:(UIImage *)img scaledToSize:(CGSize)newSize {
UIGraphicsBeginImageContext( newSize );
[img drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImg = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImg;
}
And for JPEG compression of the result:
NSData *dataForJPEGFile = UIImageJPEGRepresentation(theImage, 0.6);
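Putting the two together, usage might look like this (sourceImage and the 1024-point target size are illustrative, not values from the question):
UIImage *scaled = [self imageWithImage:sourceImage scaledToSize:CGSizeMake(1024.0, 1024.0)];
NSData *dataForJPEGFile = UIImageJPEGRepresentation(scaled, 0.6); // 0.6 = JPEG quality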
Thanks
Did you set the videoQuality property of your UIImagePickerController to UIImagePickerControllerQualityTypeHigh? The default is medium.
I have a UIImage that I'm loading into one of my app's views. It is a 10.7 MB image, but when it loads in the app, the app's resource usage suddenly jumps by 50 MB. Why does it do this? Shouldn't memory used increase by only about 10.7MB? I am certain that loading the image is what causes the jump in memory usage because I tried commenting these lines out and the memory usage went back to around 8 MB. Here's how I load the image:
UIImage *image = [UIImage imageNamed:@"background.jpg"];
self.backgroundImageView = [[UIImageView alloc] initWithImage:image];
[self.view addSubview:self.backgroundImageView];
If there is no way to decrease the memory used by this image, is there a way to force it to deallocate when I want it to? I'm using ARC.
No, it should not be 10.7 MB. The 10.7 MB is the compressed (file) size of the image.
The image loaded into the UIImage object is a decoded bitmap.
Each pixel uses 4 bytes (R, G, B, and alpha), so you can calculate the in-memory size as width × height × 4 = total bytes.
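For example, an image of roughly 3500 × 3700 pixels decodes to about 3500 × 3700 × 4 ≈ 52 MB, which would explain the ~50 MB jump you are seeing (the exact figure depends on your image's pixel dimensions).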
So the moment you load the image it takes up a lot of memory, and because the UIImageView presenting it is kept as a subview, the decoded image stays in memory.
You should resize the image to match the iOS screen size before displaying it.
As @rckoenes said:
Don't show the images with high file size.
You need to resize the image before you display it.
UIImage *image = [UIImage imageNamed:@"background.jpg"];
// Scale to the CGSize of your UIImageView (CGSizeMake(20, 20) is just a placeholder).
UIImage *scaledImage = [self imageWithImage:image scaledToSize:CGSizeMake(20, 20)];
self.backgroundImageView = [[UIImageView alloc] initWithImage:scaledImage];
[self.view addSubview:self.backgroundImageView];
-(UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
//UIGraphicsBeginImageContext(newSize);
// In next line, pass 0.0 to use the current device's pixel scaling factor (and thus account for Retina resolution).
// Pass 1.0 to force exact pixel size.
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
You can do one thing: if you can afford the 50 MB spike because this 10 MB image really is critical to your application, you can release it right after use to keep memory usage under control.
Since you are using ARC you cannot call release yourself, but you can do this:
#autoreleasepool {
UIImage *image = [UIImage imageNamed:@"background.jpg"];
self.backgroundImageView = [[UIImageView alloc] initWithImage:image];
[self.view addSubview:self.backgroundImageView];
}
Using the autoreleasepool block ensures that once the block ends, the memory for the fat image is deallocated, making your device's RAM happy again.
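One caveat: imageNamed: caches the decoded image internally, so it may stay resident even after the pool drains; if you need deterministic release, loading with imageWithContentsOfFile: (which bypasses that cache) is worth trying.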
Hope it helps!
I am taking a screenshot of a particular view in my XIB file with the following code:
UIView* captureView = self.view;
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO , 0.0f);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil);
It works fine and saves a JPG image to the camera roll.
The problem is that there is another UIImageView on top of my view, and that UIImageView contains a semi-transparent image.
My screenshot does not preserve that transparency.
I want to keep the transparency exactly as it appears on the actual screen.
How can I preserve the transparency in the screenshot?
If you pass NO for the opaque parameter, your image must include an alpha channel for this to work. Check that your image has an alpha channel.
JPGs don't support transparency, so as soon as the image is converted to JPG the alpha channel is gone.
This is a known limitation of UIImageWriteToSavedPhotosAlbum: it does not keep PNG data.
Try this; this code works for me:
UIGraphicsBeginImageContext(baseViewOne.frame.size);
[[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage * screenshota = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Also check cocoa coder's screen shots.
NSData* imdata = UIImagePNGRepresentation(_snapshotImgView.image);
UIImage* snapshotPNG = [UIImage imageWithData:imdata];
UIImageWriteToSavedPhotosAlbum(snapshotPNG, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
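Note that passing @selector(image:didFinishSavingWithError:contextInfo:) requires the target (self here) to implement that callback. A minimal sketch:
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error != nil) {
        NSLog(@"Save failed: %@", error.localizedDescription);
    } else {
        NSLog(@"Screenshot saved to the photo album.");
    }
}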
An image captured with the camera returns an image of size 720*960.
The captured image is displayed in a UIImageView of 320*436, like this:
UIImageView *imgView=[[UIImageView alloc] initWithFrame:CGRectMake(0.0,0.0,320.0,436.0)];
imgView.image = img; // Image received from camera.
[self.view addSubview:imgView];
This works fine: the 720*960 image is scaled to 320*436 and displayed.
Now the actual problem starts. I have another image of size 72*72. It is overlaid on the image received from the camera at some arbitrary coordinates, e.g.:
CGRectMake(0.0, 0.0, 72.0, 72.0);
I can't find a good way to handle scaling and applying the overlay image at the same time while maintaining quality.
The image then needs to be sent to a server.
Use the following code to scale images:
-(UIImage*) imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize
{
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
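That covers the scaling. To apply the 72*72 overlay without dropping to the view's resolution, one approach (a sketch; it assumes overlayRect is expressed in the full 720*960 image's coordinate space) is to composite both images at the camera image's original size:
- (UIImage *)imageByCompositingOverlay:(UIImage *)overlay onImage:(UIImage *)baseImage atRect:(CGRect)overlayRect {
    // Render at the base image's full size; scale 1.0 forces exact pixel dimensions.
    UIGraphicsBeginImageContextWithOptions(baseImage.size, NO, 1.0);
    [baseImage drawInRect:CGRectMake(0, 0, baseImage.size.width, baseImage.size.height)];
    [overlay drawInRect:overlayRect]; // e.g. CGRectMake(0.0, 0.0, 72.0, 72.0)
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combined;
}
The full-resolution composite can then be sent to the server, while the scaling method above is used only for display.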
I have one UIImageView. Its content mode is set to AspectFit.
[imageView setContentMode:UIViewContentModeScaleAspectFit];
I need to crop a subImage from this image. This is the code which crops the image:
CGImageRef imageRef = CGImageCreateWithImageInRect([imageView.image CGImage], customRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
where customRect is the rectangle from which I need to crop the image.
This is how I calculate it:
CGRect customRect = CGRectMake((cropView.frame.origin.x/xFactor),
(cropView.frame.origin.y/yFactor),
(cropView.frame.size.width/xFactor),
(cropView.frame.size.height/yFactor));
The problem comes in the cropping. CGImageCreateWithImageInRect crops the given area according to the actual image size, which in some cases is larger than the image view's size. I tried other approaches such as UIGraphicsGetImageFromCurrentImageContext, but those degrade the image quality.
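For reference, here is one way to compute the aspect-fit mapping yourself (a sketch; it assumes cropView's frame is expressed in the image view's coordinate space and that the image is centered, so the letterbox offsets must be subtracted before dividing by the scale):
// Aspect-fit uses one uniform scale factor for both axes.
CGFloat scale = MIN(imageView.bounds.size.width / imageView.image.size.width,
                    imageView.bounds.size.height / imageView.image.size.height);
// Letterbox offsets: the scaled image is centered inside the view.
CGFloat offsetX = (imageView.bounds.size.width - imageView.image.size.width * scale) / 2.0;
CGFloat offsetY = (imageView.bounds.size.height - imageView.image.size.height * scale) / 2.0;
// Map the crop rect from view coordinates back to image-pixel coordinates.
CGRect customRect = CGRectMake((cropView.frame.origin.x - offsetX) / scale,
                               (cropView.frame.origin.y - offsetY) / scale,
                               cropView.frame.size.width / scale,
                               cropView.frame.size.height / scale);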