How do I scale my imagePickerController to fit the screen?

I've created an imagePickerController and am trying to build a custom overlay view for it. I've done this, but hiding the camera controls left me with a big black space.
So I tried to fill this by dynamically scaling the camera preview to fit the screen, no matter what device is used. The result is that on my iPhone 6 the camera does now fill the screen, but it is super zoomed in and I don't know how to counteract this. Help much appreciated. This is my code:
let screenBound = UIScreen.mainScreen().bounds.size
let cameraAR = 4.0/3.0 as CGFloat
let cameraVH = screenBound.width * cameraAR
let scale = screenBound.height / cameraAR
imagePicker.cameraViewTransform = CGAffineTransformMakeTranslation(0, screenBound.height - cameraVH / 2.0)
imagePicker.cameraViewTransform = CGAffineTransformScale(imagePicker.cameraViewTransform, scale, scale)

You cannot do this with the default imagePickerController. You have to build your own, unfortunately.
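That said, the full-screen camera question further down this page does adjust the default picker via cameraViewTransform. If you stay with that approach, the likely cause of the zoom in the snippet above is that scale divides the screen height by the aspect ratio (cameraAR) instead of by the computed camera height (cameraVH), and the translation is missing parentheses around the subtraction. A corrected sketch (modern Swift syntax; imagePicker is the controller from the question):

import UIKit

let screenSize = UIScreen.main.bounds.size
let cameraAspectRatio: CGFloat = 4.0 / 3.0
// Height of the 4:3 preview once its width matches the screen width.
let cameraHeight = screenSize.width * cameraAspectRatio
// Scale by screenHeight / cameraHeight, not by the aspect ratio.
let scale = screenSize.height / cameraHeight
// Center vertically first; note the parentheses around the subtraction.
imagePicker.cameraViewTransform = CGAffineTransform(translationX: 0, y: (screenSize.height - cameraHeight) / 2.0)
    .scaledBy(x: scale, y: scale)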

Related

Find screen resolution in Swift [duplicate]

I am writing an iOS game in Swift using SpriteKit and want to find the screen resolution to place my sprites properly. I found multiple ways on the internet, but none of them gives me a resolution that helps me place my sprites.
The one that works fine for an iPhone 13 Pro is the following:
screenSize = self.size
screenSize.width /= (UIScreen.main.bounds.height/screenSize.height) / (UIScreen.main.bounds.width/screenSize.width)
let background = SKSpriteNode(imageNamed: "Landscape.jpg")
background.position = CGPoint(x: 0, y: 0)
background.size = screenSize
If I use the recommended UIScreen.main.bounds, this is the outcome on an iPhone: [screenshot]
But on an iPad, for example, the dimensions are too big.
Is there a unique way of finding the screen resolution on all devices? Or is there a scene scaling that enters into the equation?
Try this:
// Get main screen bounds
let screenSize: CGRect = UIScreen.main.bounds
let screenWidth = screenSize.width
let screenHeight = screenSize.height
print("Screen width = \(screenWidth), screen height = \(screenHeight)")
This is the output of the iPhone 13 Pro simulator:
Screen width = 390.0, screen height = 844.0
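For the SpriteKit side of the question: yes, scene scaling enters into the equation. With the Xcode template's default .aspectFill scale mode, self.size stays at whatever size the scene was created with (for example the .sks file's size), so it will not match UIScreen.main.bounds. A sketch that sidesteps the mismatch by letting the scene resize to its view (GameScene is a hypothetical SKScene subclass; this runs in the hosting view controller):

import SpriteKit

// With .resizeFill the scene is resized to match the SKView, so inside
// the scene self.size equals the visible area in points on every device.
if let skView = self.view as? SKView {
    let scene = GameScene(size: skView.bounds.size)
    scene.scaleMode = .resizeFill
    skView.presentScene(scene)
}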

Size dimensions in iOS don't make sense

I'm trying to look up the bounds of my screen in portrait mode (set to Portrait only in Xcode) using Swift, but nothing I try places my sprite at a sensible position.
let screenWidth = self.size.width
let screenHeight = self.size.height
var posX: CGFloat = screenWidth
var posY: CGFloat = screenHeight / 2
The sprite is positioned off-screen in portrait. Rotating the screen reveals the sprite at the far right of the screen (width), in the middle (height).
So you'd think I should just swap these values.
let screenWidth = self.size.height
let screenHeight = self.size.width
Nope. Still off-screen in portrait. Rotating the screen again reveals it, but halfway between the middle and the right border of the screen (width), and not in the middle (height).
So I guess I could try to get the values in some other way.
let screenWidth = UIScreen.mainScreen().bounds.width
let screenHeight = UIScreen.mainScreen().bounds.height
It's visible in portrait!... But on the far left (not on the border) and not in the middle of the screen's height either. Rotating the screen places it a bit left of the middle, and still not centered vertically.
I think you can guess what I tried next.
let screenWidth = UIScreen.mainScreen().bounds.height
let screenHeight = UIScreen.mainScreen().bounds.width
Pretty far toward the bottom, slightly right of the middle in portrait. Rotating the screen makes no sense of it either.
I'm kinda lost. I've googled and tried things for hours with no result.
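The symptoms here match the scene-versus-screen mismatch in the previous question, compounded by the scene's anchorPoint: scenes loaded from an .sks file default to anchorPoint (0.5, 0.5), so (0, 0) is the center of the scene and a position of (width, height / 2) lands half a scene-width past the right edge. A sketch that positions the sprite relative to the scene's own frame, which is valid regardless of device, orientation, or anchor point (assumes an SKScene subclass):

import SpriteKit

class DemoScene: SKScene {
    override func didMove(to view: SKView) {
        let sprite = SKSpriteNode(color: .red, size: CGSize(width: 40, height: 40))
        // frame already accounts for the scene's size and anchorPoint,
        // so midX/midY is the visible center on any device.
        sprite.position = CGPoint(x: frame.midX, y: frame.midY)
        addChild(sprite)
    }
}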

Display UIView on scaled CGRect iOS

I am using the AVMetaData API to extract the bounds of an AVMetadataFaceObject. When printed to the console, this CGRect has the following values: bounds={0.2,0.3 0.4x0.5}. I'm having a fair amount of trouble mapping this to a UIView that displays over the face. I can hard-code some conversion values for my specific screen to get it crudely in the right spot, but I would like a solution that displays a UIView over the face shown in my previewView on any screen size.
Does anyone know how to map these to the frame of an on-screen UIView based upon the size of a previewView?
You should be able to take the size of the capture area (let's call it "captureSize") and then do this:
CGRect viewRect;
viewRect.origin.x = bounds.origin.x * captureSize.width;
viewRect.origin.y = bounds.origin.y * captureSize.height;
viewRect.size.width = bounds.size.width * captureSize.width;
viewRect.size.height = bounds.size.height * captureSize.height;
Now, this all depends on how your previewView is set up and whether or not it has any content scaling, etc., but this should give you a sense of the conversion.
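Worth noting: if previewView is backed by an AVCaptureVideoPreviewLayer, the layer can do this conversion itself via layerRectConverted(fromMetadataOutputRect:), which also accounts for video gravity and orientation. A Swift sketch:

import AVFoundation
import UIKit

// Converts an AVMetadataFaceObject's normalized bounds into the preview
// layer's coordinate space, including the video-gravity adjustment.
func overlayFrame(for face: AVMetadataFaceObject,
                  in previewLayer: AVCaptureVideoPreviewLayer) -> CGRect {
    return previewLayer.layerRectConverted(fromMetadataOutputRect: face.bounds)
}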

Dynamically centering a UIImagePicker viewfinder that has been scaled up to full screen

I'm making a full-screen camera for the iPhone 5 and have the following code to scale the 4:3 camera preview to fill the entire screen, which is roughly a 9:16 ratio. The left and right sides bleed off the screen.
I have to move the cameraView down 71 points in order for it to be centered on the screen; otherwise, there's a black bar at the bottom. I'm not quite sure why, and because I don't know why this is happening, I can't figure out how to compute the adjustment dynamically to accommodate the iPhone 6 and 6 Plus.
Any help is appreciated.
// get the screen size
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// establish the height to width ratio of the camera
float heightRatio = 4.0f / 3.0f;
// calculate the height of the camera based on the screen width
float cameraHeight = screenSize.width * heightRatio;
// calculate the ratio that the camera height needs to be scaled by
float ratio = screenSize.height / cameraHeight;
// This slots the preview exactly in the middle of the screen by moving it down 71 points (for the iPhone 5)
CGAffineTransform translate = CGAffineTransformMakeTranslation(0.0, 71.0);
self.camera.cameraViewTransform = translate;
CGAffineTransform scale = CGAffineTransformScale(translate, ratio, ratio);
self.camera.cameraViewTransform = scale;
This finally clicked in my head. Since we know that the camera view's width will always match the screen width:
// Getting the camera height from the 4:3 ratio; use CGFloat to avoid integer truncation
CGFloat cameraViewHeight = SCREEN_WIDTH * 4.0 / 3.0;
CGFloat adjustedYPosition = (SCREEN_HEIGHT - cameraViewHeight) / 2.0;
CGAffineTransform translate = CGAffineTransformMakeTranslation(0, adjustedYPosition);
self.imagePicker.cameraViewTransform = translate;
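Combining the two pieces (the scale from the question, the dynamic centering from this answer) gives a device-independent version. A Swift sketch, assuming a 4:3 camera feed and that imagePicker is the UIImagePickerController being presented:

import UIKit

let screenSize = UIScreen.main.bounds.size
// The camera view's width always matches the screen width, so its height is 4/3 of that.
let cameraViewHeight = screenSize.width * 4.0 / 3.0
// Vertical offset that centers the preview on any screen (the 71 points on an iPhone 5).
let adjustedYPosition = (screenSize.height - cameraViewHeight) / 2.0
// Scale so the preview's height fills the screen; the sides bleed off as intended.
let scale = screenSize.height / cameraViewHeight
imagePicker.cameraViewTransform = CGAffineTransform(translationX: 0, y: adjustedYPosition)
    .scaledBy(x: scale, y: scale)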

AVFoundation photo size and rotation

I'm having a nightmare of a time trying to correct a photo taken with AVFoundation's captureStillImageAsynchronouslyFromConnection so that its size and orientation match exactly what is shown on the screen.
I show the AVCaptureVideoPreviewLayer with this code to make sure it displays the correct way up at all rotations:
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
previewLayer.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height);
if ([[previewLayer connection] isVideoOrientationSupported])
{
    [[previewLayer connection] setVideoOrientation:(AVCaptureVideoOrientation)[UIApplication sharedApplication].statusBarOrientation];
}
[self.view.layer insertSublayer:previewLayer atIndex:0];
Now the returned image needs cropping, as it's much bigger than what was displayed.
I know there are loads of UIImage cropping examples, but the first hurdle I seem to have is finding the correct CGRect to use. When I simply crop to self.view.frame the image is cropped at the wrong location.
The preview uses AVLayerVideoGravityResizeAspectFill and I have my UIImageView also set to AspectFill.
So how can I get the correct frame that AVFoundation is displaying on screen from the preview layer?
EDIT ----
Here's an example of the problem I'm facing. Using the front camera of an iPad Mini, the camera uses a resolution of 720x1280 but the display is 768x1024. The view displays this (see the dado rail at the top of the image): [screenshot]
Then when I take the image and display it, it looks like this: [screenshot]
Obviously the camera display was centred in the view, but the cropped image is taken from the top section of the photo, which is not seen on screen.
I'm working on a similar project right now and thought I might be able to help, if you haven't already figured this out.
the first hurdle I seem to have is finding the correct CGRect to use. When I simply crop to self.view.frame the image is cropped at the wrong location.
Let's say your image is 720x1280 and you want it cropped to the rectangle of your display, which is a CGRect of size 768x1024. You can't just pass a rectangle of size 768x1024. First, your image isn't 768 pixels wide. Second, you need to specify the placement of that rectangle with respect to the image (i.e. by specifying the rectangle's origin point). In your example, self.view.frame is a CGRect that has an origin of (0, 0). That's why it's always cropping from the top of your image rather than from the center.
Calculating the cropping rectangle is a bit tricky because you have a few different coordinate systems.
You've got your view controller's view, which has...
...a video preview layer as a sublayer, which is displaying an aspect-filled image, but...
...the AVCaptureOutput returns a UIImage that not only has a different width/height than the video preview, but it also has a different aspect ratio.
So because your preview layer is displaying a centered and cropped preview image (i.e. aspect fill), what you basically want to find is the CGRect that:
Has the same aspect ratio as self.view.bounds
Has the same smaller dimension size as the smaller dimension of the UIImage (i.e. aspect fit)
Is centered in the UIImage
So something like this:
// Determine the width:height ratio of the crop rect, based on self.view.bounds
CGFloat widthToHeightRatio = self.view.bounds.size.width / self.view.bounds.size.height;
CGRect cropRect;
// Set the crop rect's smaller dimension to match the image's smaller dimension, and
// scale its other dimension according to the width:height ratio.
if (image.size.width < image.size.height) {
    cropRect.size.width = image.size.width;
    cropRect.size.height = cropRect.size.width / widthToHeightRatio;
} else {
    cropRect.size.width = image.size.height * widthToHeightRatio;
    cropRect.size.height = image.size.height;
}
// Center the rect within the image; one of these offsets will be zero.
cropRect.origin.x = (image.size.width - cropRect.size.width) / 2.0;
cropRect.origin.y = (image.size.height - cropRect.size.height) / 2.0;
So finally, to go back to your original example where the image is 720x1280 and you want it cropped to the rectangle of your display, which is 768x1024, you will end up with a CGRect of size 720x960, with an origin of x = 0, y = (1280 - 960) / 2 = 160.
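To actually apply the rect, one approach is to crop the underlying CGImage (a sketch; it assumes the UIImage's orientation is .up so that the rect computed above lines up with the CGImage, and it converts points to pixels via the image's scale):

import UIKit

// Crops `image` to `cropRect`, where the rect is given in the image's
// point coordinates (as computed above).
func cropped(_ image: UIImage, to cropRect: CGRect) -> UIImage? {
    // CGImage works in pixels while UIImage sizes are in points.
    let pixelRect = CGRect(x: cropRect.origin.x * image.scale,
                           y: cropRect.origin.y * image.scale,
                           width: cropRect.size.width * image.scale,
                           height: cropRect.size.height * image.scale)
    guard let cgImage = image.cgImage?.cropping(to: pixelRect) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}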
