Combine two UIImageViews into one image without stretching - iOS

When I try to combine two UIImageViews, the images come out stretched. Here's the code I am using:
CGSize size = CGSizeMake(MAX(self.imgCapture.image.size.width, self.imgGallary.image.size.width), MAX(self.imgCapture.image.size.height, self.imgGallary.image.size.height));
UIGraphicsBeginImageContext(size);
[self.imgCapture.image drawInRect:CGRectMake(self.view.frame.origin.x, self.view.frame.origin.y, size.width/2, self.imgCapture.image.size.height)];
[self.imgGallary.image drawInRect:CGRectMake(self.view.frame.origin.x + (size.width/2), self.view.frame.origin.y, size.width/2, self.imgGallary.image.size.height)];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The first screenshot shows the two UIImageViews. The second screenshot shows the result of combining them into one image: the image is stretched, and I want to keep the aspect ratio from screenshot 1.

The image is not "stretching". It is being squeezed, and it's a matter of simple arithmetic. Looking at your image context size and your drawInRect calls: your image context is the size of just one image, yet you draw both images into it at half width, so they are squeezed horizontally. You need the image context to be as wide as both images added together.
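A minimal sketch of that fix, assuming imgCapture and imgGallary are the two image views and each image should be drawn at its natural size, side by side:

// Make the context wide enough for both images; the height is the taller of the two.
CGSize size = CGSizeMake(self.imgCapture.image.size.width + self.imgGallary.image.size.width,
                         MAX(self.imgCapture.image.size.height, self.imgGallary.image.size.height));
UIGraphicsBeginImageContextWithOptions(size, NO, 0); // 0 = screen scale, avoids a blurry 1x context

// Draw each image at its own size; the second starts where the first ends.
[self.imgCapture.image drawInRect:CGRectMake(0, 0, self.imgCapture.image.size.width, self.imgCapture.image.size.height)];
[self.imgGallary.image drawInRect:CGRectMake(self.imgCapture.image.size.width, 0, self.imgGallary.image.size.width, self.imgGallary.image.size.height)];

UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Because each drawInRect uses the image's own dimensions, nothing is squeezed; images of different heights are simply top-aligned.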

Related

Filling UIImage view with a small image

I'm working with a UIImage. When the image is larger than the image view, I'd like it to aspect-fill.
When the image is smaller, I'd like the same behavior: stretch it, keeping the aspect ratio, until it fills its UIImageView.
Currently, a smaller image is just left at its original size.
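A minimal sketch of one common approach, assuming the view is called imageView: the aspect-fill content mode scales the image both up and down while preserving its aspect ratio, and clipping hides the overflow on the longer axis.

imageView.contentMode = UIViewContentModeScaleAspectFill; // scales small images up, too
imageView.clipsToBounds = YES; // crop whatever spills past the view's bounds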

Objective-C: How does Snapchat make the text on top of an image/video so sharp and not pixelated?

In my app, users can place text on top of images, Snapchat-style, and then save the image to their device. I simply add the text view on top of the image and snapshot it using this code:
UIGraphicsBeginImageContext(imageView.layer.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* savedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But when I compare the text on my image to the text on a Snapchat image, it is significantly different. Snapchat's text on top of an image is significantly sharper than mine; mine looks very pixelated. I am not compressing the image at all, just saving it as-is using ALAssetLibrary.
Thank You
When you use UIGraphicsBeginImageContext, it defaults to a 1x scale (i.e. non-retina resolution). You probably want:
UIGraphicsBeginImageContextWithOptions(imageView.layer.bounds.size, YES, 0);
Which will use the same scale as the screen (probably 2x). The final parameter is the scale of the resulting image; 0 means "whatever the screen is".
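For context, a minimal end-to-end sketch of that retina-aware snapshot (same imageView as in the question):

UIGraphicsBeginImageContextWithOptions(imageView.layer.bounds.size, YES, 0); // 0 = render at screen scale
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *savedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();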
If your imageView is scaled to the size of the screen, then I think your jpeg will also be limited to that resolution. If setting the scale on UIGraphicsBeginImageContextWithOptions does not give you enough resolution, you can do your drawing in a larger offscreen image. Something like:
// imageSize is the larger, offscreen size you want for the final image.
UIGraphicsBeginImageContext(imageSize);
[image drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];
// Scale the context up so the screen-sized overlay renders at the offscreen
// image size instead of at screen resolution.
CGFloat scale = imageSize.width / textOverlay.bounds.size.width;
CGContextScaleCTM(UIGraphicsGetCurrentContext(), scale, scale);
[textOverlay.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The "scale" value maps the textOverlay view, which is probably at screen size, up to the offscreen image size.
Alternatively, and probably simpler, you can start with a larger UIImageView but put it within another UIView that scales it down to fit on screen. Do the same with your text overlay view. Then your code for creating the composite should work at whatever resolution you choose for the UIImageView.
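A hedged sketch of that container idea, with illustrative names (fullSizeImageView, container); the full-resolution view is shrunk with a transform for display only, while layer rendering still happens at full size:

// Image view laid out at the image's full size, not the screen's.
UIImageView *fullSizeImageView = [[UIImageView alloc] initWithImage:image];
fullSizeImageView.frame = CGRectMake(0, 0, image.size.width, image.size.height);

// Container that shrinks it to fit the screen for display purposes.
UIView *container = [[UIView alloc] initWithFrame:self.view.bounds];
CGFloat displayScale = self.view.bounds.size.width / image.size.width;
fullSizeImageView.transform = CGAffineTransformMakeScale(displayScale, displayScale);
fullSizeImageView.center = CGPointMake(CGRectGetMidX(container.bounds), CGRectGetMidY(container.bounds));
[container addSubview:fullSizeImageView];
[self.view addSubview:container];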

UIImageView Aspect Ratio?

I have an image of size 320x460 and I want to create a UIImageView whose height is 450. To maintain the aspect ratio I calculated the width of the UIImageView dynamically: (320/460)*450 = 313.043. I set the contentMode of the UIImageView to UIViewContentModeScaleAspectFit and assigned the 320x460 image to it, but it comes out somewhat blurry.
Note: if I don't resize the UIImageView to 313.043x450, the image is perfectly clear. So what mistake have I made?
If I understand the question, this should answer it.
First, set the content mode so the image keeps its aspect ratio in your image view:
myView.contentMode = .scaleAspectFit
Next, your image may blur because it is a .png or another rasterized format. You need to use .pdf, as recommended by Apple, or at the very least another vectorized format. Rasterized images have values for all pixels in the image, so when the image is stretched too far it just duplicates and blurs pixels. Vectorized images do not blur because they are really just a series of instructions on how to draw/render the corresponding image.
If you are resizing the UIImageView manually, set the content mode to UIViewContentModeScaleAspectFill.
If you want to keep the content mode UIViewContentModeScaleAspectFit, do not resize the image view to 313. The image will take the maximum possible width and height while keeping its aspect ratio.
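One more hedged aside worth checking: a fractional frame such as 313.043x450 puts the view on a subpixel boundary, which iOS renders with antialiasing and which can read as blur. Rounding the computed frame sidesteps that:

// Round the aspect-fit width so the view lands on whole points.
CGFloat width = round((320.0 / 460.0) * 450.0); // 313, not 313.043
imageView.frame = CGRectIntegral(CGRectMake(0, 0, width, 450));
imageView.contentMode = UIViewContentModeScaleAspectFit;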

UIImage distorted when using it for UIImageView

I have taken a photo, and then I'm initializing a UIImageView object with this photo. The only problem is, when I take the photo, the photo is being taken using the full iPhone screen (portrait).
The UIImageView being initialized with this photo is set to take up only the top 50% of the iPhone's screen, so you can imagine the image looks distorted.
I have been able to make it look a lot better by using the following code:
UIImageView *halfView = [[UIImageView alloc]initWithImage:image];
[self.view addSubview:halfView];
halfView.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.frame.size.height/2);
halfView.contentMode = UIViewContentModeScaleAspectFill;
The only problem is, the final UIImageView called "halfView" is still slightly distorted.
I have a feeling this is impossible to fix, because the original photo is taken with the full iPhone screen and can never perfectly scale to fit a UIImageView that only takes up the top 50% of the iPhone screen.
I was basically trying to copy the frontback app. Here is what it looks like when you are taking the original image in their app:
This is what my app's screen looks like when you are taking the picture:
And then right after you take the picture, my app's screen changes to look like the frontback screen and takes the picture you just took and places it in the top half and tries to scale it.
I hope that makes sense. I know it is a long question, but I really wanted to let the user use the full screen while taking the photo and then scale the result to half the screen.
Am I going about this all wrong? Am I crazy to think I could ever properly scale the image to half the screen when it was originally captured as a "full screen" image?
Thanks for the help.
For the sake of argument let's say your captured image size is 640x1136 (twice the size of an iPhone 5 screen) and you are trying to display it in a UIImageView of size 320x284 (half the size of an iPhone 5 screen).
As you can already see from these dimensions the captured image's width is smaller than its height whereas the UIImageView's width is larger than its height - the proportions are different.
Therefore, scaling the captured image to fit the UIImageView's width (scale by 0.5) means the captured image will be of size 320x568 - its height is larger than the UIImageView's height.
Scaling the captured image to fit the UIImageView's height (scale by 0.25) means the captured image will be of size 160x284 - its width is smaller than the UIImageView's width.
The image can't scale exactly like you want it to scale. However, you can use UIViewContentModeScaleAspectFill to fill the entire UIImageView but lose some of the image (image's height is too big to fit). You can also choose to use UIViewContentModeScaleAspectFit which will show the entire image but will leave some space on the sides (image's width is too small).
Another option you have is to actually capture the image in the proportions of your UIImageView in the first place but that means you won't be able to capture a full screen image.
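A minimal sketch of doing that aspect-fill crop by hand instead, keeping the full-screen capture but trimming it to the image view's proportions before display (names are illustrative, and EXIF-orientation complications of camera photos are ignored):

// Crop the centered band of the capture whose aspect ratio matches the view.
CGSize viewSize = halfView.bounds.size; // e.g. 320x284
CGFloat pixelWidth = CGImageGetWidth(image.CGImage);
CGFloat pixelHeight = CGImageGetHeight(image.CGImage);
CGFloat cropHeight = pixelWidth * (viewSize.height / viewSize.width);
CGRect cropRect = CGRectMake(0, (pixelHeight - cropHeight) / 2.0, pixelWidth, cropHeight);

CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                       scale:image.scale
                                 orientation:image.imageOrientation];
CGImageRelease(croppedRef);
halfView.image = cropped; // nothing outside the view's proportions is kept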
Try this function: pass in your UIImage along with the new size, and it will return the UIImage at the size you specify.
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
I guess this is what you want.
Hope this helps.
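For example, a hedged usage sketch that picks the target size from the capture's own aspect ratio rather than the view's:

// Scale a capture down by its aspect-fit factor for a 320x284 view.
CGFloat factor = MIN(320.0 / image.size.width, 284.0 / image.size.height);
CGSize newSize = CGSizeMake(image.size.width * factor, image.size.height * factor);
halfView.image = [self imageWithImage:image scaledToSize:newSize];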
You mention the image takes up the full size of the screen. If it is displayed in a UIImageView taking up half the screen, then you'll need to add this line to clip to the frame:
halfView.clipsToBounds = YES;
Despite the image view being half the size of the screen, the actual image will show outside the boundaries of the image view if its original size is bigger and an aspect-fill content mode is used. clipsToBounds fixes this.
I hope this is what you're looking for. Thanks, Jim.

UIImage rendering not clear on iPad Mini

The following code block is used in my application to take a screenshot of the current screen of an iPad mini (768 x 1024):
UIImage *img;
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return img;
In a different view controller, I present a UIScrollView with a width of 540 and a height of 290. I display the captured UIImage in a UIImageView that I create programmatically with initWithFrame:, using a rectangle 250 wide and 250 high. The content size of the scroll view is 768 by 250.
Now, running the application, I display four rectangles and screenshot the screen using the block of code above. Transitioning to the UIScrollView, the image is not clear (by "not clear" I mean some rectangles are missing sides while others are thicker). Is there a way to display the image more clearly? I know the image has to be scaled down from the original 768x1024 to 250x250. Could this be the problem? If so, what would be the best fix?
Edit:
Above is a screenshot of the image I want to capture.
Below is the UIImage in UIImageView within a UIScrollView:
Cast each coordinate to an int, or use CGRectIntegral to do that directly on a CGRect. Fractional coordinates force anti-aliasing and make images blurry.
Try changing the content mode of your UIImageViews. If you use UIViewContentModeScaleAspectFill, you shouldn't see any extra space around the edges.
Update: From the screenshots you posted, it looks like this is just an effect of UIKit's built-in downscaling. Try manually downscaling the image to fit using Core Graphics first. Alternatively, you might want to use something like the CILanczosScaleTransform Core Image filter (iOS 6+).
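A minimal sketch of that manual Core Graphics downscale, assuming img is the 768x1024 capture and 250x250 is the target; the interpolation-quality setting is the part that matters:

CGSize targetSize = CGSizeMake(250, 250);
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0);
CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationHigh);
[img drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaledImg = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();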
