Dynamically centering a UIImagePickerController viewfinder that has been scaled up to full screen - iOS

I'm making a full-screen camera for the iPhone 5 and have the following code to scale the 4:3 camera preview to fill the entire screen, which is taller than 4:3 (roughly 9:16), so the left and right sides bleed off the screen.
I have to move the camera view down 71 points for it to be centered on the screen; otherwise there's a black bar at the bottom. I'm not quite sure why, and because I don't know why this happens, I can't figure out how to compute the adjustment dynamically to accommodate the iPhone 6 and 6 Plus.
Any help is appreciated.
// Get the screen size.
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// The camera preview has a 4:3 height-to-width ratio.
CGFloat heightRatio = 4.0f / 3.0f;
// Calculate the height of the camera preview based on the screen width.
CGFloat cameraHeight = screenSize.width * heightRatio;
// Calculate the factor the camera preview needs to be scaled by.
CGFloat ratio = screenSize.height / cameraHeight;
// This slots the preview exactly in the middle of the screen by moving it
// down 71 points (for the iPhone 5).
CGAffineTransform translate = CGAffineTransformMakeTranslation(0.0, 71.0);
self.camera.cameraViewTransform = translate;
CGAffineTransform scale = CGAffineTransformScale(translate, ratio, ratio);
self.camera.cameraViewTransform = scale;

This finally clicked in my head. The 4:3 preview is always as wide as the screen and starts at the top, so on the iPhone 5 its height is 320 × 4/3 ≈ 426.67 points, which leaves 568 − 426.67 ≈ 141 points of black space below it; shifting the preview down by half of that (≈ 71 points) centers it. In general:
// Convenience macros for the main screen's size in points.
#define SCREEN_WIDTH ([UIScreen mainScreen].bounds.size.width)
#define SCREEN_HEIGHT ([UIScreen mainScreen].bounds.size.height)
// Getting the camera height from the 4:3 ratio.
CGFloat cameraViewHeight = SCREEN_WIDTH * (4.0 / 3.0);
// Half of the leftover vertical space centers the preview.
CGFloat adjustedYPosition = (SCREEN_HEIGHT - cameraViewHeight) / 2.0;
CGAffineTransform translate = CGAffineTransformMakeTranslation(0, adjustedYPosition);
self.imagePicker.cameraViewTransform = translate;
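To also fill the screen, the same idea combines with the scale step from the question. A minimal sketch (assuming self.imagePicker is the picker from above), mirroring the transform used later in this thread:
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
CGFloat cameraHeight = screenSize.width * (4.0 / 3.0);
CGFloat scale = screenSize.height / cameraHeight;
// Center the preview vertically, then scale it up to fill the screen
// height; the extra width bleeds off the left and right edges.
CGAffineTransform transform = CGAffineTransformMakeTranslation(0, (screenSize.height - cameraHeight) / 2.0);
self.imagePicker.cameraViewTransform = CGAffineTransformScale(transform, scale, scale);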

Related

Get camera preview position in UIImagePickerController for iPhone 12

I have a UIImagePickerController with showsCameraControls set to false. The camera preview is centered by using the cameraViewTransform property.
CGFloat cameraViewHeight = [UIScreen mainScreen].bounds.size.width * (4.0 / 3.0);
CGFloat adjustedYPosition = ([UIScreen mainScreen].bounds.size.height - cameraViewHeight) / 2.0;
self.imagePicker.cameraViewTransform = CGAffineTransformMakeTranslation(0, adjustedYPosition);
Normally the preview's origin is (0, 0) when showsCameraControls is false. This changes on the iPhone 12, where the preview keeps the same position as when showsCameraControls is true (almost centered).
Is there a way to access the position of the camera preview, in order to calculate the transformation needed to center it exactly?
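There is no public API that exposes the preview's position. One fragile workaround (an assumption, not a documented approach) is to walk the picker's view hierarchy after it has laid out and read the preview view's actual frame, then derive the offset from that instead of assuming an origin of (0, 0):
// Hypothetical helper: find the live preview by matching a private class
// name. String-matching private classes is brittle and may break in any
// iOS release.
static UIView *FindCameraPreview(UIView *root) {
    if ([NSStringFromClass([root class]) containsString:@"Preview"]) {
        return root;
    }
    for (UIView *subview in root.subviews) {
        UIView *match = FindCameraPreview(subview);
        if (match) {
            return match;
        }
    }
    return nil;
}

// After presenting the picker, wait one runloop turn for layout,
// then compensate for wherever the preview actually sits:
dispatch_async(dispatch_get_main_queue(), ^{
    UIView *preview = FindCameraPreview(self.imagePicker.view);
    if (preview) {
        CGFloat screenHeight = [UIScreen mainScreen].bounds.size.height;
        CGFloat currentY = [preview convertPoint:CGPointZero toView:self.imagePicker.view].y;
        CGFloat targetY = (screenHeight - preview.bounds.size.height) / 2.0;
        self.imagePicker.cameraViewTransform = CGAffineTransformMakeTranslation(0, targetY - currentY);
    }
});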

How to set the width and height of a UIView in pixels?

I am working on an Objective-C app in which I need to set the width and height of a UIView in pixels and then scale that UIView to match the device's width. The pixel dimensions are fixed (say, 900 px × 500 px).
Currently I am doing this:
UIScreen* mainScreen = [UIScreen mainScreen];
CGRect frame = CGRectMake(0, 0, 900.0, 500.0);
[view setFrame:frame];
CGFloat valScale = mainScreen.bounds.size.width/900.0;
[view setContentScaleFactor:valScale];
But this is not giving me the desired values.
What should I do?
(P.S. Does view.frame.size.width return the width in pixels or points?)
For scaling a view you can use the transform property. Say you wish to scale your UIView by a factor of 2; then you can use:
self.view.transform = CGAffineTransformScale(CGAffineTransformIdentity, 2, 2);
CGRect's width and height are in points, not pixels. One point corresponds to two pixels on 2x devices such as the iPhone 6 and 7, and to three pixels on 3x devices such as the iPhone 6 Plus and 7 Plus.
For more detail you can refer to Apple's documentation.
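Putting the two answers together, a minimal sketch (using the question's 900 × 500 px figures): convert the pixel size to points with the screen's scale, then apply a transform so the view spans the screen width.
UIScreen *mainScreen = [UIScreen mainScreen];
// Points = pixels / scale (scale is 2.0 on 2x devices, 3.0 on 3x).
CGFloat widthInPoints = 900.0 / mainScreen.scale;
CGFloat heightInPoints = 500.0 / mainScreen.scale;
view.frame = CGRectMake(0, 0, widthInPoints, heightInPoints);
// Scale the view so it spans the full screen width.
CGFloat fitScale = mainScreen.bounds.size.width / widthInPoints;
view.transform = CGAffineTransformScale(CGAffineTransformIdentity, fitScale, fitScale);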

CALayer frame gives strange position

I am currently trying to use a CALayer to show a mask and then crop the picture according to that mask, but I can't find a way to get the correct position and size of the mask within my image.
When I draw the mask I use kCAGravityResizeAspectFill to keep the ratio of my image. In this case the layer uses the height to fill my screen height and computes the proper width and (x, y) to keep the ratio.
CGRect screenViewRect = [self.viewForBaselineLayout bounds];
CGFloat screenViewWidth = screenViewRect.size.width;
CGFloat screenViewHeight = screenViewRect.size.height;
masqueLayer.frame = CGRectMake(screenViewWidth * 0.45, screenViewHeight * 0.05, screenViewWidth * 0.10, screenViewHeight * 0.92);
masqueLayer.contents = (__bridge id)([UIImage imageNamed:masqueJustif].CGImage);
masqueLayer.contentsGravity = kCAGravityResizeAspectFill;
[self.layer insertSublayer:masqueLayer atIndex:2];
Once the mask is on screen I can easily see that the screenViewWidth * 0.10 width is not respected as I wanted, but my real trouble is that when I read the layer's frame the width isn't updated, so I can't get the actual position of my layer on the screen.
Is there a method to get the real position of my layer on the screen?
I am actually trying to get the crop rectangle with the following (my ratio is 21/29.7, as it is an A4 mask). This code works on iPad but not on iPhone, as the ratio is different:
CGRect outputRect = [masqueLayer convertRect:masqueLayer.bounds toLayer:self.layer];
// The 2x factor below is hard-coded (presumably the retina scale), which
// breaks on devices with a different scale.
outputRect.origin.y *= 2;
outputRect.size.height *= 2;
outputRect.size.width = outputRect.size.height * (21 / 29.7);
I also tried using my mask's percentages (its is a size defined elsewhere in my code, not shown here):
CGRect outputRect = masqueLayer.frame;
outputRect.origin.y = its.size.height * 0.05;
outputRect.origin.x = its.size.width * 0.45 * masqueLayer.anchorPoint.x;
outputRect.size.height = its.size.height * 0.92;
outputRect.size.width = its.size.height * 0.92 * (21 / 29.7);
Here is a screenshot of my mask on another layer. I want to extract the image bounded by the blue corners (which mark the border of my layer).
Thanks.
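For reference, the aspect-fill geometry can be computed directly. A minimal sketch (not from the original post; assumes the mask image's size is known): kCAGravityResizeAspectFill scales the contents by the larger of the two bounds-to-image ratios and centers the result, so the rect the image actually occupies (possibly overflowing the layer's bounds) is:
// Rect that aspect-filled contents occupy within (or overflowing) bounds.
static CGRect AspectFillRect(CGSize imageSize, CGRect bounds) {
    CGFloat s = MAX(bounds.size.width / imageSize.width,
                    bounds.size.height / imageSize.height);
    CGSize scaled = CGSizeMake(imageSize.width * s, imageSize.height * s);
    return CGRectMake(CGRectGetMidX(bounds) - scaled.width / 2.0,
                      CGRectGetMidY(bounds) - scaled.height / 2.0,
                      scaled.width, scaled.height);
}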

iOS: Capturing a bigger image compared with camera preview mode

For the iPhone 6 and 6 Plus I used the following code to stretch the camera preview to full screen.
I basically stretch the camera to full screen using cameraViewTransform. Because of this, the captured image is more enlarged than what the preview showed. How can I capture only the part that is visible in preview mode?
Refer to the following screenshots for comparison:
On iPhone 6:
(screenshot: preview mode)
(screenshot: captured image)
On iPhone 5:
(screenshot: preview mode)
(screenshot: captured image)
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// The camera preview has a 4:3 height-to-width ratio.
CGFloat heightRatio = 4.0f / 3.0f;
// Calculate the height of the camera preview based on the screen width.
CGFloat cameraHeight = screenSize.width * heightRatio;
// Calculate the factor the preview needs to be scaled by.
CGFloat scale = screenSize.height / cameraHeight;
// Move the preview to the center of the screen.
self.imagePickerController.cameraViewTransform = CGAffineTransformMakeTranslation(0, (screenSize.height - cameraHeight) / 2.0);
// Concatenate the scale transform.
self.imagePickerController.cameraViewTransform = CGAffineTransformScale(self.imagePickerController.cameraViewTransform, scale, scale);
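One approach (a sketch, not from the original post): since the preview enlarges the 4:3 feed by scale, only the central 1/scale of the image's width was actually on screen, so cropping the captured photo to that band recovers what the preview showed. This assumes the UIImage has been normalized to UIImageOrientationUp first (camera images usually arrive rotated, which complicates CGImage cropping).
// Crop the captured photo to the horizontal band visible in the preview.
static UIImage *CropToVisibleBand(UIImage *image, CGFloat scale) {
    // Only 1/scale of the image's width was on screen, centered.
    CGFloat visibleWidth = image.size.width / scale;
    CGRect cropRect = CGRectMake((image.size.width - visibleWidth) / 2.0, 0,
                                 visibleWidth, image.size.height);
    // Convert from points to the CGImage's pixel grid.
    CGFloat px = image.scale;
    CGRect pixelRect = CGRectMake(cropRect.origin.x * px, cropRect.origin.y * px,
                                  cropRect.size.width * px, cropRect.size.height * px);
    CGImageRef cgCropped = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:cgCropped
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(cgCropped);
    return cropped;
}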

Why do lines smaller than 1.0 pt not render correctly on non-retina screens?

self.layer.borderWidth = 0.5;
on a UIButton or UITextField renders fine on a retina screen, but on a non-retina screen only the top and left borders render, while the right and bottom borders do not.
I assume it has something to do with the screen's pixel density and how sub-point lines are drawn (on a 1x screen, 0.5 pt is half a pixel, so each edge gets rounded to whole pixels and some edges disappear), but it's possible that there is a better explanation.
Question:
I'd like to know if it's possible to have all sides of a UIView's border show as expected on both retina and non-retina screens with borderWidth set to 0.5.
If you want a single pixel (not point) line always, you'll have to use a different border width depending on the scale of the screen.
E.g.:
CGFloat scale = [[UIScreen mainScreen] scale];
if (scale == 2.0) {
    // Retina screen: 0.5 pt == 1 px.
    self.layer.borderWidth = 0.5;
} else {
    // Non-retina screen: 1.0 pt == 1 px.
    self.layer.borderWidth = 1.0;
}
Now that multiple scales are supported (@3x), it is probably better to write Matt's answer like this:
CGFloat scale = [[UIScreen mainScreen] scale];
CGFloat width = scale > 0.0 ? 1.0 / scale : 1.0;
[self.layer setBorderWidth:width];
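A possible refinement (an assumption, not from the original answers): when the view may live on a screen other than the main one, the view's own trait collection reports the correct scale. displayScale has been available since iOS 8.
// Derive the hairline width from the view's trait collection; a value of
// 0.0 means the scale is unspecified.
CGFloat displayScale = self.traitCollection.displayScale;
self.layer.borderWidth = (displayScale > 0.0) ? 1.0 / displayScale : 1.0;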
