Swift camera issue - iOS

I have a UIView that acts as a camera preview, sized 320x180, and a UIImageView of the same size.
When I take a photo, it produces a UIImage of size 1080x1920, so when I show it in the imageView the photo is squashed vertically, because the image is much taller than the view. It looks like this:
██████ the black rectangle is the whole photo (1080x1920)
██████
█▒▒▒▒█ the gray is what the camera shows on screen
██████ (only the gray part is shown, but the whole
██████ black part, 1080x1920, is stored)
I would like to store a UIImage of exactly what I see in the gray rectangle.
I'm not sure how to do this, since the photo is much bigger than the screen resolution (320x568), which makes it hard to crop correctly (and my crop attempts also rotate the image and introduce other bugs).

1080 / 6 = 180 and 1920 / 6 = 320, so the values are in the correct aspect ratio, but width and height are swapped. You need to apply the correct rotation (the image's orientation) to the image before displaying or cropping it.
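A minimal Swift sketch of that idea (Swift, since the question is Swift-tagged; croppedToPreview and previewSize are illustrative names, with previewSize being the 320x180 from the question): redraw the photo so the orientation flag is baked into the pixels, then take the centered crop matching the preview's aspect ratio.
import UIKit

// Bake the orientation flag into the pixels by redrawing, then take
// the centered crop that matches the preview's aspect ratio.
func croppedToPreview(_ photo: UIImage, previewSize: CGSize) -> UIImage? {
    // photo.size is already orientation-adjusted; draw(in:) applies the
    // orientation, so the rendered copy's pixels are upright.
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = 1
    let upright = UIGraphicsImageRenderer(size: photo.size, format: format)
        .image { _ in photo.draw(in: CGRect(origin: .zero, size: photo.size)) }
    // Largest centered rect with the preview's aspect ratio (aspect-fill).
    let scale = min(upright.size.width / previewSize.width,
                    upright.size.height / previewSize.height)
    let cropSize = CGSize(width: previewSize.width * scale,
                          height: previewSize.height * scale)
    let cropRect = CGRect(x: (upright.size.width - cropSize.width) / 2,
                          y: (upright.size.height - cropSize.height) / 2,
                          width: cropSize.width, height: cropSize.height)
    guard let cg = upright.cgImage?.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cg)
}
For the 1080x1920 photo here, the oriented size is 1920x1080, which is exactly 6x the 320x180 preview, so after the rotation fix the crop is the whole image.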

Related

Understanding image sizes in iOS

This is in regard to pictures taken with the iPhone's camera. No matter what, I can't understand why image sizes are on the order of thousands while the image scale is always 1.0.
For example, I printed out an image's details and this is what I got:
<UIImage: 0x134def110> size {3024, 4032} orientation 3 scale 1.000000
What does 3024x4032 mean? And why is the scale 1.0, when my screen size is really 375x667? Orientation 3 means the image is rotated 90º counterclockwise. So if the original image is 375x500 (in pixels), after rotation it should be 500x375. Then why does the size shown not change accordingly?
And on a similar note, how would I get the size of the image in pixels from the size that's printed out? Because no matter what the size of the camera preview is, if its ratio is 4:3, the resulting image size (image.size.width and image.size.height) is always 3024x4032.
What does 3024x4032 mean?
Those are the dimensions of the image. I think you're missing one point: the iPhone's camera can take photographs with a much higher resolution than its screen size. Just because an image is shown on the screen, it doesn't mean the image dimensions are that size.
Size: An uncropped, landscape 12.2MP photo (the default size when shot on the iPhone 7 rear camera) is 3024 * 4032 pixels, so that's where that number comes from. Extra crispy in case you want to frame it and hang it up on your wall! See source.
Scale: Generally 1.0 for camera photos. The scale property relates the image's size in points to its size in pixels: pixel dimensions = size * scale. Images loaded from @2x or @3x assets get a scale of 2.0 or 3.0; a camera image has a scale of 1.0, so its size property is already its size in pixels.
tl;dr: those dimensions are the pixel size of the photo in storage, not the size at which it's rendered on the phone.
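To answer the "size in pixels" part of the question directly, a small Swift sketch (pixelSize is an illustrative helper name):
import UIKit

// Multiply the point size by the scale, or read the backing CGImage.
func pixelSize(of image: UIImage) -> CGSize {
    // Camera photos have scale 1.0, so size is already in pixels;
    // @2x/@3x asset images have scale 2.0 or 3.0.
    return CGSize(width: image.size.width * image.scale,
                  height: image.size.height * image.scale)
}
// Alternative, straight from the bitmap (ignores the orientation flag):
// let width = image.cgImage?.width ?? 0
// let height = image.cgImage?.height ?? 0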

Maintain image quality in iOS Application

I am trying to create a photo application, but I am having a tough time formatting my photos so that they display crisply.
I have an imageView of size 320 x 500 points and an image of size 3648 x 2736 px (which of course I can scale down).
imageView.contentMode = UIViewContentModeScaleAspectFit;
With UIViewContentModeScaleAspectFit, I scaled the image down to 700 x 525 px (IMGA) and to 500 x 325 px (IMGB).
In this mode:
IMGA fills the entire image view but is somewhat distorted / not crisp.
IMGB does not fill the image view at the top and bottom, but the width is perfect and the image is crisp.
With UIViewContentModeScaleAspectFill, the image is made to fill the UIImageView, but it is again distorted, even though the image is scaled down rather than up.
I see many apps with crisp, large images, and I am hoping someone can help me with the sizing/contentMode needed to make my images look better, or correct my resizing.
P.S. I have been looking at this link for help, but I'm still missing my goal.
Difference between UIViewContentModeScaleAspectFit and UIViewContentModeScaleToFill?
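One guess, offered as a sketch rather than a definitive fix: resizing to 700x525 or 500x325 forces a second resample at display time, because neither matches the view's backing store (320x500 points is 640x1000 px on a 2x device). The Swift sketch below renders the image at exactly the view's size instead; renderedToFill is an illustrative name.
import UIKit

// Render the image at exactly the view's point size; the renderer
// defaults to the screen scale, so a 320x500-point view yields a
// 640x1000 px bitmap on a 2x device. Aspect-fill placement.
func renderedToFill(_ source: UIImage, in view: UIView) -> UIImage {
    let bounds = view.bounds
    let scale = max(bounds.width / source.size.width,
                    bounds.height / source.size.height)
    let drawSize = CGSize(width: source.size.width * scale,
                          height: source.size.height * scale)
    let origin = CGPoint(x: (bounds.width - drawSize.width) / 2,
                         y: (bounds.height - drawSize.height) / 2)
    return UIGraphicsImageRenderer(bounds: bounds).image { _ in
        source.draw(in: CGRect(origin: origin, size: drawSize))
    }
}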

Crop a UIImage that has been zoomed and rotated

I have an app where I take a photo, then zoom/rotate the photo. How can I crop the result to the screen's frame?
For example, say I am displaying a photo that is 1000px x 1000px. If I zoom in and rotate a bit, I want the resulting UIImage to be... say 800px x 800px (not 320x320). How can I accomplish this? If I use CGImageCreateWithImageInRect and pass in the UIImageView's rect, I get a super-zoomed, small corner of the image.
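CGImageCreateWithImageInRect works in the image's pixel coordinates, not the view's points, which is why passing the view's rect returns a tiny, zoomed-in corner. One hedged approach, assuming the zoom and rotation are applied as view transforms inside some container view: snapshot that container at an elevated scale, so the output keeps photo-level resolution. cropToFrame, container, and outputScale below are illustrative names.
import UIKit

// Snapshot the container at an elevated scale so the output keeps
// photo-level resolution, e.g. outputScale = 1000.0 / 320.0 to get
// roughly 1000 px across a 320-pt frame.
func cropToFrame(of container: UIView, outputScale: CGFloat) -> UIImage {
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = outputScale
    let renderer = UIGraphicsImageRenderer(bounds: container.bounds, format: format)
    return renderer.image { _ in
        // Renders the view tree with its transforms (zoom/rotation) applied.
        container.drawHierarchy(in: container.bounds, afterScreenUpdates: true)
    }
}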

UIImage distorted when using it for UIImageView

I take a photo and then initialize a UIImageView object with it. The only problem is that the photo is captured using the full iPhone screen (portrait).
The UIImageView being initialized with this photo is set to take up only the top 50% of the iPhone's screen, so you can imagine that the image looks distorted.
I have been able to make it look a lot better by using the following code:
UIImageView *halfView = [[UIImageView alloc] initWithImage:image];
[self.view addSubview:halfView];
halfView.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height / 2);
halfView.contentMode = UIViewContentModeScaleAspectFill;
The only problem is that the final UIImageView, "halfView", is still slightly distorted.
I have a feeling this is impossible to fix, because the original photo is taken with the full iPhone screen and can never perfectly scale to fit a UIImageView that takes up only the top 50% of the iPhone screen.
I was basically trying to copy the Frontback app. In their app, you take the original image full screen, and my app's capture screen looks the same. Then, right after you take the picture, my app's screen changes to the Frontback-style layout, placing the picture you just took in the top half and trying to scale it.
I hope that makes sense. I know it is a long question, but I just really wanted to let the user use the full screen while taking the photo and then just scale it to half the screen.
Am I going about this all wrong? Am I crazy to think I could ever properly scale the image to half the screen when it was originally captured as a "full screen" image?
Thanks for the help.
For the sake of argument let's say your captured image size is 640x1136 (twice the size of an iPhone 5 screen) and you are trying to display it in a UIImageView of size 320x284 (half the size of an iPhone 5 screen).
As you can already see from these dimensions the captured image's width is smaller than its height whereas the UIImageView's width is larger than its height - the proportions are different.
Therefore, scaling the captured image to fit the UIImageView's width (scale by 0.5) means the captured image will be of size 320x568 - its height is larger than the UIImageView's height.
Scaling the captured image to fit the UIImageView's height (scale by 0.25) means the captured image will be of size 160x284 - its width is smaller than the UIImageView's width.
The image can't scale exactly like you want it to scale. However, you can use UIViewContentModeScaleAspectFill to fill the entire UIImageView but lose some of the image (image's height is too big to fit). You can also choose to use UIViewContentModeScaleAspectFit which will show the entire image but will leave some space on the sides (image's width is too small).
Another option you have is to actually capture the image in the proportions of your UIImageView in the first place but that means you won't be able to capture a full screen image.
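If losing part of the image is acceptable, a third route is to center-crop the capture to the view's aspect ratio yourself before displaying it. A hedged Swift sketch (names are illustrative; it assumes the pixel data is already upright):
import UIKit

// Cut the capture down to the target aspect ratio with a center crop,
// so nothing is distorted and aspect-fill shows exactly what survived.
func centerCropped(_ image: UIImage, toAspectOf viewSize: CGSize) -> UIImage? {
    let scale = min(image.size.width / viewSize.width,
                    image.size.height / viewSize.height)
    let cropSize = CGSize(width: viewSize.width * scale,
                          height: viewSize.height * scale)
    let cropRect = CGRect(x: (image.size.width - cropSize.width) / 2,
                          y: (image.size.height - cropSize.height) / 2,
                          width: cropSize.width, height: cropSize.height)
    guard let cg = image.cgImage?.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cg, scale: image.scale, orientation: image.imageOrientation)
}
// For the 640x1136 example and a 320x284 view this yields 640x568:
// the full width, with 284 px trimmed from the top and the bottom each.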
Try this function: pass in your UIImage along with the new size, and it will return a UIImage scaled to the size you specify.
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    // 0.0 means "use the device's screen scale" for the bitmap context.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    // Note: this draws into newSize directly, so it will stretch the
    // image unless newSize has the same aspect ratio as the original.
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I guess this is what you want.
Hope this helps.
You mention the image takes up the full size of the screen. If it is displayed in a UIImageView taking up half the screen, then you'll need to add this code to clip the frame:
halfView.clipsToBounds = YES;
Despite making the size of the imageView half the screen, with aspect-fill the actual image will draw outside the boundaries of the imageView if its original size is bigger. clipsToBounds fixes this.
I hope this is what you're looking for. Thanks, Jim.

Again edit of edited UIImage in iOS app

In my iOS app, I am placing several UIImages on one background UIImage and then saving the whole composite, with all the subimages added, by taking a screenshot programmatically.
Now I want to change the positions of those subview UIImages in the saved image, so I need to know how to detect each subview image's position, given that I captured the whole thing as one screenshot.
Record each subview's frame converted to window coordinates. The image's pixels correspond to the frame origin at 1x scale, or double it on Retina. The screenshot is of the whole screen, so its dimensions are equivalent to the window frame. UIView has convenience methods to convert arbitrary view frames to another view's (or the window's) coordinates.
EDIT: to deal with the aspect-fit content mode, you have to do the math yourself. You know the frame of the imageView, and you can ask the image for its size. Comparing the two aspect ratios tells you in which dimension the image completely fits; you can then compute the other dimension, which will be smaller than the corresponding imageView dimension. Divide the difference between the view dimension and the image dimension by two, and that gives you the offset of the image inside the view. Now you can save the frame of the image as it is displayed in the view, as in the sketch below.
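A small Swift sketch of that math (aspectFitRect is an illustrative name; UIKit's convert(_:to:) handles the window conversion):
import UIKit

// The rect, in the image view's own coordinates, that an aspect-fit
// image actually occupies.
func aspectFitRect(imageSize: CGSize, in viewBounds: CGRect) -> CGRect {
    let scale = min(viewBounds.width / imageSize.width,
                    viewBounds.height / imageSize.height)
    let fitted = CGSize(width: imageSize.width * scale,
                        height: imageSize.height * scale)
    // Half the leftover space on each axis is the image's offset.
    return CGRect(x: viewBounds.minX + (viewBounds.width - fitted.width) / 2,
                  y: viewBounds.minY + (viewBounds.height - fitted.height) / 2,
                  width: fitted.width, height: fitted.height)
}
// Then convert to window coordinates:
// let onScreen = imageView.convert(aspectFitRect(imageSize: image.size,
//                                                in: imageView.bounds), to: nil)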
