I have a UIView and a UIImageView, both 40×40 points. I also have an image at 120×120 and a compressed 40×40 copy of it with worse quality.
When I use the UIView, its drawRect: method calls [backgroundImage drawInRect:rect]; and both images come out equally blurry.
When I use the UIImageView and set imageView.image = backgroundImage; the 120×120 image displays at much better quality.
Why is that? And what can I do to get the expected quality in the UIView without using a UIImageView?
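For reference, a minimal sketch of the drawRect: version described above (backgroundImage is assumed to be a UIImage property on the view; the name is illustrative):
- (void)drawRect:(CGRect)rect {
    // drawInRect: scales the image to fill the rect handed to drawRect:
    [self.backgroundImage drawInRect:rect];
}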
Related
I have a UIView containing UIImageViews and UILabels which I have to capture into an image and then export to the photo gallery. The image has a fixed size in pixels and must have an alpha channel, because the UIView's background color is clear.
Currently I use UIGraphicsBeginImageContextWithOptions with renderInContext: or drawViewHierarchyInRect:, then I resize the image to the given size and save it with UIImagePNGRepresentation. It works: I get a UIImage of the exact pixel size I need, with an alpha channel, saved to the gallery.
UIGraphicsBeginImageContextWithOptions(_templateView.bounds.size, NO, 0.0);
[_templateView drawViewHierarchyInRect:_templateView.bounds afterScreenUpdates:NO];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Resize to the fixed pixel size; this context has a 1.0 scale, so points equal pixels.
UIGraphicsBeginImageContext(CGSizeMake(1080.0f, 1080.0f));
[img drawInRect:CGRectMake(0, 0, 1080.0f, 1080.0f)];
img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *pngImageData = UIImagePNGRepresentation(img);
The problem is the size of the resulting image: it is way larger than expected. When I add only one UIImageView (filling the parent UIView) with an image of 1.2 MB, its capture results in 1.65 MB. This is crucial because I have a size limit for the image. How can I reduce its size? Is it possible to reduce the quality of an image that has an alpha channel?
I tried resizing it to 50% and then back to 100%, but that resulted in an even larger file.
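For reference, the resize step with explicit options (the NO keeps the alpha channel, and the 1.0 scale forces the bitmap to be exactly 1080×1080 pixels) might look like this sketch:
UIGraphicsBeginImageContextWithOptions(CGSizeMake(1080.0f, 1080.0f), NO, 1.0); // NO = not opaque, keeps alpha
[img drawInRect:CGRectMake(0, 0, 1080.0f, 1080.0f)];
UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();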
I am taking a UIImage from UIImagePickerController and showing it in a UIImageView. The UIImagePickerController is full screen, and thus so is the UIImage, but the UIImageView is not full screen because of a header bar.
The image is getting stretched.
How do I fix this? Thanks in advance.
The first image is the picker screen, and the second shows the UIImage captured from the picker displayed in a UIImageView.
![This is my UIImagePickerController][2]
![This is the image after capturing being shown in a UIImageview. ][3]
Note: both screens are in landscape mode.
Use this for your UIImageView:
imageView.contentMode = UIViewContentModeScaleAspectFill;
You won't get any empty space, and the aspect ratio is preserved. However, part of the image will be clipped off.
If you use:
imageView.contentMode = UIViewContentModeScaleAspectFit;
There will be some empty space, but the aspect ratio is preserved.
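Note that with aspect fill, the overflow is only actually clipped if the view also clips to its bounds:
imageView.contentMode = UIViewContentModeScaleAspectFill;
imageView.clipsToBounds = YES; // without this, the image spills outside the view's frame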
I am letting the user capture an image from the camera or pick one from the library.
I display this image in a UIImageView.
The user can now scale and position the image within a bounding box, exactly like you would do using the UIImagePickerController when allowsEditing is set to YES.
When the user is satisfied with the result and taps Done I would like to produce a cropped UIImage.
The problem arises when using CGImageCreateWithImageInRect as this does not take the scaling into account. The transform is applied to the imageView like this:
CGAffineTransform transform = CGAffineTransformScale(self.imageView.transform, newScale, newScale);
[self.imageView setTransform:transform];
(The transform is driven by a gesture recognizer.)
I assume what is happening is: the UIImageView is scaled and moved, it applies UIViewContentModeScaleAspectFit to the UIImage it holds, and when I ask it to crop the image it does exactly that, with no regard to the scaling or positioning. The reason I think this is that if I don't scale or move the image but just tap Done straight away, the cropping works.
I crop the image like this:
- (UIImage *)cropImage:(UIImage *)img toRect:(CGRect)rect {
    // Convert the rect from points to pixels on Retina screens.
    CGFloat scale = [[UIScreen mainScreen] scale];
    if (scale > 1.0) {
        rect = CGRectMake(rect.origin.x * scale, rect.origin.y * scale,
                          rect.size.width * scale, rect.size.height * scale);
    }
    CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], rect);
    UIImage *result = [UIImage imageWithCGImage:imageRef
                                          scale:self.imageView.image.scale
                                    orientation:self.imageView.image.imageOrientation];
    // UIImage *result = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return result;
}
I pass in a cropRect from a view that is a subview of my main view (the square overlay box, like in UIImagePickerController). The main UIView has a UIImageView that gets scaled and a UIView that displays the crop rectangle.
How can I get "what you see is what you get" cropping, and which factors must I take into account? Or perhaps suggestions on whether I should implement the hierarchy or the scaling differently.
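For what it's worth, one factor is mapping the on-screen crop rect back through both the view transform and the aspect-fit geometry before cropping. A rough sketch, assuming contentMode is UIViewContentModeScaleAspectFit (cropView and the other names are illustrative, not from the post):
// Undo the transform: convertRect:toView: maps through the view hierarchy.
CGRect rectInImageView = [self.view convertRect:self.cropView.frame toView:self.imageView];
UIImage *img = self.imageView.image;
// Recreate the aspect-fit geometry: scale factor and centered origin.
CGFloat fitScale = MIN(self.imageView.bounds.size.width / img.size.width,
                       self.imageView.bounds.size.height / img.size.height);
CGPoint fitOrigin = CGPointMake((self.imageView.bounds.size.width - img.size.width * fitScale) / 2.0,
                                (self.imageView.bounds.size.height - img.size.height * fitScale) / 2.0);
// Map into image points; multiply by img.scale before handing the rect
// to CGImageCreateWithImageInRect, which works in pixels.
CGRect cropRect = CGRectMake((rectInImageView.origin.x - fitOrigin.x) / fitScale,
                             (rectInImageView.origin.y - fitOrigin.y) / fitScale,
                             rectInImageView.size.width / fitScale,
                             rectInImageView.size.height / fitScale);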
Try a simple trick. Apple has samples on its site showing how to zoom into a photo in code. Once you're done zooming, use a graphics context: take the frame size of the bounding view and render the image with that. E.g. a UIView contains a scroll view which holds the zoomed image; the scroll view zooms and so does your image. Now take the frame size of your bounding UIView, create an image context from it, and save that as a new image. Tell me if that makes sense.
Cheers :)
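A minimal sketch of that idea (boundingView is an illustrative name for the view that contains the zooming scroll view):
UIGraphicsBeginImageContextWithOptions(boundingView.bounds.size, NO, 0.0);
[boundingView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();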
I'm searching for a way to draw a stretchable image as the background of my custom cell's background view. I would like to use the drawRect: method and draw an image stretched exactly as it would be stretched with stretchableImageWithLeftCapWidth: in a UIImageView. How can I continue this code to make it happen?
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    UIImage *bgImg = [[UIImage imageNamed:@"bg_table_top"] stretchableImageWithLeftCapWidth:3 topCapHeight:0];
    // How do I draw the image stretched to the self.bounds size?
    ....
}
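For reference, UIImage's drawing methods respect cap insets, so one candidate for the missing line is simply a sketch like:
[bgImg drawInRect:self.bounds]; // the end caps stay fixed, the middle stretches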
Any reason not to let UIImageView do this? (Include one as a child of your custom cell.) It's true that reducing child views can be a performance improvement in tables, but UIImageView is also pretty good at drawing images performantly.
My guess is that otherwise you're going to need multiple draw calls to get the ends and the middle drawn correctly.
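A minimal sketch of that suggestion, set up wherever the cell is configured (the image name comes from the question; the rest is illustrative):
UIImage *bgImg = [[UIImage imageNamed:@"bg_table_top"] stretchableImageWithLeftCapWidth:3 topCapHeight:0];
UIImageView *bgView = [[UIImageView alloc] initWithImage:bgImg];
bgView.frame = cell.bounds;
bgView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
cell.backgroundView = bgView; // the image view stretches the image to fill the cell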
How can I truncate the left side of an image stored in a UIImage object? Basically, in certain situations I just want to show part of an image.
How can I do this with the iOS SDK?
P.S. I tried changing the frame size, but that just scales the image and distorts it.
A very simple way is to load the image into a UIImageView and add that view to another view. You can then position the image view so that its frame.origin.x is negative, which places it off to the left. The parent view needs masksToBounds set to YES on its layer (clipsToBounds on the view), or else the image view will still be fully visible.
There are many other ways to achieve this effect as well, but this may be the simplest for you to implement.
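A minimal sketch of that approach, hiding the left 100 points of a 200×200 image (the names are illustrative):
UIView *container = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 100, 200)];
container.clipsToBounds = YES; // the UIView equivalent of masksToBounds on the layer
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
imageView.frame = CGRectMake(-100, 0, 200, 200); // negative x pushes the left half out of view
[container addSubview:imageView];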
To crop a UIImage you can use one of the UIImage categories available out there, such as http://www.hive05.com/2008/11/crop-an-image-using-the-iphone-sdk/
For example, this rect will remove 100 pixels from the left side of a 200×200 pixel UIImage:
CGRect clippedRect = CGRectMake(100, 0, 100, 200);
UIImage *cropped = [self imageByCropping:lightsOnImage toRect:clippedRect];
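Under the hood, such a category typically reduces to a Core Graphics crop; a minimal sketch (assuming an up orientation and a 1.0 scale):
CGRect clippedRect = CGRectMake(100, 0, 100, 200);
CGImageRef croppedRef = CGImageCreateWithImageInRect(lightsOnImage.CGImage, clippedRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);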