The camera returns an image of size 720*960.
The captured image is displayed in a UIImageView of size 320*436, like this:
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 320.0, 436.0)];
imgView.image = img; // Image received from the camera.
[self.view addSubview:imgView];
This works fine: the 720*960 image is scaled to 320*436 and displayed.
Now the actual problem starts. I have another image of size 72*72. This image is overlaid on the image received from the camera at some arbitrary coordinates, e.g.
CGRectMake(0.0,0.0,72.0,72.0);
I cannot find a good way to handle the scaling and apply the overlay image at the same time while maintaining quality. The resulting image needs to be sent to a server.
Use the following code to scale images:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
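For the overlay part of the question, one approach (not shown above) is to composite at the camera image's full resolution and only scale for display, so the uploaded file keeps its quality. The sketch below assumes the overlay frame is given in the 320*436 view's coordinate space; the helper name and parameters are hypothetical:

// Hypothetical helper: composites 'overlay' onto the full-resolution camera
// image so the upload keeps the original 720*960 quality. 'overlayFrame' is
// assumed to be expressed in the 320*436 image-view coordinate space.
- (NSData *)jpegDataByCompositing:(UIImage *)overlay
                             onto:(UIImage *)cameraImage
                     overlayFrame:(CGRect)overlayFrame
                         viewSize:(CGSize)viewSize
{
    CGFloat scaleX = cameraImage.size.width / viewSize.width;   // e.g. 720 / 320
    CGFloat scaleY = cameraImage.size.height / viewSize.height; // e.g. 960 / 436

    // Map the overlay frame from view coordinates to full-size image coordinates.
    CGRect mappedFrame = CGRectMake(overlayFrame.origin.x * scaleX,
                                    overlayFrame.origin.y * scaleY,
                                    overlayFrame.size.width * scaleX,
                                    overlayFrame.size.height * scaleY);

    UIGraphicsBeginImageContextWithOptions(cameraImage.size, NO, cameraImage.scale);
    [cameraImage drawInRect:CGRectMake(0, 0, cameraImage.size.width, cameraImage.size.height)];
    [overlay drawInRect:mappedFrame];
    UIImage *composited = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // JPEG data ready to be sent to the server.
    return UIImageJPEGRepresentation(composited, 0.9);
}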
Related
I have an application that scans text from a picture using OCR. The problem is that my source image is 3024*3024. I copied the image into the iPhone 6 Simulator in Xcode. When I picked it with UIImagePickerController, the image was resized to 748*748 and its quality was degraded.
When I tried to scale the image back up, the quality was still not the same.
What I need is the image at its actual size; the quality/resolution should not change.
Try this code to resize the image:
- (UIImage *)imageWithImage:(UIImage *)img scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [img drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
And for JPEG compression of the image:
NSData *dataForJPEGFile = UIImageJPEGRepresentation(theImage, 0.6);
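A brief usage sketch combining the two snippets above (the target size and compression quality here are arbitrary placeholders, not values from the question):

// Scale to the desired size, then compress to JPEG data for storage or upload.
UIImage *resized = [self imageWithImage:theImage scaledToSize:CGSizeMake(1024, 1024)];
NSData *dataForJPEGFile = UIImageJPEGRepresentation(resized, 0.6);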
Thanks
Did you set the qualityType property of your UIImagePickerController to UIImagePickerControllerQualityTypeHigh? The default is medium.
My app lets the user take photos, and every photo gets a small watermark. The problem is that the watermark appears bigger when the photo is taken with the front camera. I want the watermark to have the same size no matter which camera is used.
Any ideas?
My code:
UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];
UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:CGRectMake(backgroundImage.size.width - watermarkImage.size.width, backgroundImage.size.height - watermarkImage.size.height, watermarkImage.size.width, watermarkImage.size.height)];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.imageView.image = result;
The watermark is the same size. The image is not, since the two cameras have different resolutions. You need to resize the watermark in proportion to the image size. I believe you can use scaleImage:toSize: for this.
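To make the watermark occupy the same fraction of the photo regardless of which camera produced it, you can size its draw rect relative to the background image. A minimal sketch of that idea (the 20% width ratio and the bottom-right placement are assumptions, not taken from the original code):

UIImage *backgroundImage = image;
UIImage *watermarkImage = [UIImage imageNamed:@"Watermark.png"];

// Make the watermark a fixed fraction of the photo width (here 20%),
// preserving the watermark's own aspect ratio.
CGFloat watermarkWidth = backgroundImage.size.width * 0.2;
CGFloat watermarkHeight = watermarkWidth * (watermarkImage.size.height / watermarkImage.size.width);
CGRect watermarkRect = CGRectMake(backgroundImage.size.width - watermarkWidth,
                                  backgroundImage.size.height - watermarkHeight,
                                  watermarkWidth,
                                  watermarkHeight);

UIGraphicsBeginImageContext(backgroundImage.size);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
[watermarkImage drawInRect:watermarkRect];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();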
Figured out a weird solution. I turned on edit mode: [picker setAllowsEditing:YES]; and now the watermark is the same size no matter what camera you use.
The generic problem I'm facing is this:
I have a stretchable 50x50 PNG. I'm stretching it to 300x100. I want to get three UIImages of size 100x100 cut from the stretched image, A, B & C in the picture below:
I'm trying to do it like this:
// stretchedImage is the 50x50 UIImage, abcImageView is the 300x100 UIImageView
UIImage *stretchedImage = [abcImageView.image stretchableImageWithLeftCapWidth:25 topCapHeight:25];
CGImageRef image = CGImageCreateWithImageInRect(stretchedImage.CGImage, bButton.frame);
UIImage *result = [UIImage imageWithCGImage:image];
[bButton setBackgroundImage:result forState:UIControlStateSelected];
CGImageRelease(image);
I'm trying to crop out the middle 100x100 square ("B") using CGImageCreateWithImageInRect, but this is not right, since stretchedImage is 50x50, not 300x100. How do I get the 300x100 image to crop from? If the original image were 300x100 there would be no problem, but then I would lose the advantage of a stretchable image.
I guess to generalize the problem even more, the question would be as simple as: if you scale or stretch an image to a bigger image view, how do you get the scaled/stretched image?
Background for the specific task I'd like to apply the solution for (if you can come up with an alternative solution):
I'm trying to implement a UI that's similar to the one you see during a call in native iPhone call application: a plate containing buttons for mute, speaker, hold, etc. Some of them are toggle type buttons with a different background color for selected state.
I have two graphics for the whole plate, for non-selected and selected states. I'm stretching both images to the desired size. For the buttons in selected state I want to get a piece of the stretched selected graphic.
You should be able to do this by rendering abcImageView into a UIImage:
UIGraphicsBeginImageContextWithOptions(abcImageView.bounds.size, NO, 0.f);
[abcImageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then, you can crop the image like this (given cropRect):
CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
// Do something with the image.
CGImageRelease(cgImage);
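Note that CGImageCreateWithImageInRect works in pixel coordinates, and the image rendered above uses the screen scale (the 0.f argument), so a frame expressed in points has to be multiplied by the image's scale first. A hedged sketch using the bButton frame from the question:

// bButton.frame is in points; convert it to pixels before cropping.
CGFloat scale = image.scale;
CGRect cropRect = CGRectMake(bButton.frame.origin.x * scale,
                             bButton.frame.origin.y * scale,
                             bButton.frame.size.width * scale,
                             bButton.frame.size.height * scale);

CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, cropRect);
// Pass the scale back so the cropped UIImage reports its size in points.
UIImage *croppedImage = [UIImage imageWithCGImage:cgImage scale:scale orientation:UIImageOrientationUp];
[bButton setBackgroundImage:croppedImage forState:UIControlStateSelected];
CGImageRelease(cgImage);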
I have one UIImageView. Its content mode is set to AspectFit.
[imageView setContentMode:UIViewContentModeScaleAspectFit];
I need to crop a subImage from this image. This is the code which crops the image:
CGImageRef imageRef = CGImageCreateWithImageInRect([imageView.image CGImage], customRect);
UIImage *cropped = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
where customRect is the rectangle from which I need to crop the image.
This is how I calculate it:
CGRect customRect = CGRectMake((cropView.frame.origin.x / xFactor),
                               (cropView.frame.origin.y / yFactor),
                               (cropView.frame.size.width / xFactor),
                               (cropView.frame.size.height / yFactor));
The problem comes in the cropping. CGImageCreateWithImageInRect crops the given area according to the actual image size, which in some cases is larger than the image view's size. I tried other approaches, such as UIGraphicsGetImageFromCurrentImageContext, but they degrade the image quality.
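The question does not show how xFactor and yFactor are computed. For reference, one way to map a rect from the aspect-fit image view's coordinate space back to the underlying image's coordinates is sketched below; the helper name is hypothetical and assumes the image is centered with UIViewContentModeScaleAspectFit:

// Maps a rect given in the image view's coordinate space to the coordinate
// space of the underlying image, assuming UIViewContentModeScaleAspectFit.
static CGRect imageRectForViewRect(CGRect viewRect, UIImageView *imageView)
{
    CGSize imageSize = imageView.image.size;
    CGSize viewSize = imageView.bounds.size;

    // Aspect-fit scale: the image is scaled by the smaller ratio and centered.
    CGFloat scale = MIN(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    CGFloat offsetX = (viewSize.width - imageSize.width * scale) / 2.0;
    CGFloat offsetY = (viewSize.height - imageSize.height * scale) / 2.0;

    // Undo the centering offset, then undo the scaling.
    return CGRectMake((viewRect.origin.x - offsetX) / scale,
                      (viewRect.origin.y - offsetY) / scale,
                      viewRect.size.width / scale,
                      viewRect.size.height / scale);
}

With a helper like that, customRect could be computed as imageRectForViewRect(cropView.frame, imageView), assuming cropView's frame is expressed in the image view's coordinate space.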
I am displaying an image in a table view cell (the image name is saved in a plist). Before setting it on the cell, I am resizing the image to
imageSize = CGSizeMake(32, 32);
But after resizing, the image quality is degraded on Retina displays.
I have both versions of the image added to the project (i.e. 1x and @2x).
This is how I am reducing the image to 32x32:
+ (UIImage *)scale:(UIImage *)image toSize:(CGSize)size
{
    UIGraphicsBeginImageContext(size);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
Any pointers on this are very much appreciated.
Thanks
Try this: instead of UIGraphicsBeginImageContext(size); use UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
From what I understand, you are resizing the image to 32x32 points regardless of the resolution. UIGraphicsBeginImageContextWithOptions scales the context to the device's screen scale, so the image is still resized to 32x32 points but the backing resolution is kept for Retina displays.
(Note that this is what I understood from Apple's UIKit reference; see the documentation for UIGraphicsBeginImageContextWithOptions for details.)
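A sketch of the scale: method from the question with that one-line change applied (otherwise unchanged):

+ (UIImage *)scale:(UIImage *)image toSize:(CGSize)size
{
    // Passing 0.0 as the scale uses the main screen's scale, so on Retina
    // devices the 32x32-point image is backed by a 64x64-pixel bitmap.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}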