I have a UIImagePickerController with showsCameraControls set to false. The camera preview is centered by applying a translation through the cameraViewTransform property:
int cameraViewHeight = [UIScreen mainScreen].bounds.size.width * (4.0 / 3.0);
int adjustedYPosition = ([UIScreen mainScreen].bounds.size.height - cameraViewHeight) / 2;
self.imagePicker.cameraViewTransform = CGAffineTransformMakeTranslation(0, adjustedYPosition);
Normally the preview's position is (0,0) when showsCameraControls is false. This changes on the iPhone 12, where the preview keeps the same position as when showsCameraControls is true (almost centered).
Is there a way to access the position of the camera preview, in order to calculate the corresponding transformation so it can be perfectly centered?
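The preview's default position isn't exposed by any public API, but here is a hedged workaround: after the picker has been presented, walk its view hierarchy for a subview whose bounds match the expected 4:3 preview size, read where it actually sits, and translate from there. The findPreviewViewIn: and centerCameraPreview names are mine, and the size-matching heuristic depends on private view layout that Apple can change at any time, so treat this as a sketch rather than a supported solution.
- (UIView *)findPreviewViewIn:(UIView *)root expectedSize:(CGSize)size {
    // Heuristic: assume the live preview is the subview whose bounds match
    // the 4:3 preview size (screen width x screen width * 4/3).
    if (fabs(root.bounds.size.width - size.width) < 1.0 &&
        fabs(root.bounds.size.height - size.height) < 1.0) {
        return root;
    }
    for (UIView *subview in root.subviews) {
        UIView *match = [self findPreviewViewIn:subview expectedSize:size];
        if (match) return match;
    }
    return nil;
}
- (void)centerCameraPreview {
    // Call this after the picker is on screen, while cameraViewTransform is still identity.
    CGSize screen = [UIScreen mainScreen].bounds.size;
    CGSize previewSize = CGSizeMake(screen.width, screen.width * 4.0 / 3.0);
    UIView *preview = [self findPreviewViewIn:self.imagePicker.view expectedSize:previewSize];
    if (!preview) return; // fall back to the static calculation above
    // Where the preview currently sits, including any default offset the OS
    // applied (as it appears to do on the iPhone 12).
    CGRect onScreen = [preview convertRect:preview.bounds toView:self.imagePicker.view];
    CGFloat desiredY = (screen.height - previewSize.height) / 2.0;
    self.imagePicker.cameraViewTransform =
        CGAffineTransformMakeTranslation(0, desiredY - CGRectGetMinY(onScreen));
}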
Related
I am working on an app that places city labels on top of the cameraView for augmented reality. When I open the camera I hide the controls, the navigationBar, and the toolBar.
I use the following code, which I found online, to resize the camera view to cover the whole screen. I only have an iPhone 7, used in landscape mode. When I use this, the reported screen height is 320 (instead of the 375 it should be) and the length is 677. How can I make the screen size match the actual size of the phone's screen, and how do I change it for each different iPhone? I can't check this in the Simulator, since augmented reality doesn't work there.
picker = [[UIImagePickerController alloc] init];
picker.allowsEditing = NO;
picker.sourceType = UIImagePickerControllerSourceTypeCamera ;
picker.showsCameraControls = NO;
self.picker.navigationBarHidden = YES;
self.picker.toolbarHidden = YES;
if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
    CGSize screenSize = [[UIScreen mainScreen] bounds].size; // 320 x 568
    float scale = screenSize.width / screenSize.height*5/3; // screen height divided by the pickerController height ... or: 568 / ( 320*4/3 )
    CGAffineTransform translate = CGAffineTransformMakeTranslation(0, (screenSize.height - screenSize.width*4/3)*0.5);
    CGAffineTransform fullScreen = CGAffineTransformMakeScale(scale, scale);
    picker.cameraViewTransform = CGAffineTransformConcat(fullScreen, translate);
}
I don't understand the logic behind this and think it was originally written for an iPhone SE. If there is a tutorial that explains this, or if someone can explain it to me, I would really appreciate it.
Thanks
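Going by the snippet's own comment ("screen height divided by the pickerController height ... 568 / ( 320*4/3 )"), the intent seems to be: the live preview is 4:3 and as wide as the screen, so its natural height is screenWidth * 4/3; scaling by screenHeight / (screenWidth * 4/3) makes it fill the screen vertically, and the translation centers it first. A hedged, device-independent restatement of that intent (variable names are mine):
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// The camera preview is 4:3 and as wide as the screen, so this is its natural height.
CGFloat cameraHeight = screenSize.width * 4.0 / 3.0;
// Scale needed for the preview to fill the screen height, e.g. 568 / (320 * 4/3) ≈ 1.33 on a 320x568 screen.
CGFloat scale = screenSize.height / cameraHeight;
// Center the preview vertically, then scale it up from that centered position.
CGAffineTransform translate = CGAffineTransformMakeTranslation(0, (screenSize.height - cameraHeight) / 2.0);
picker.cameraViewTransform = CGAffineTransformConcat(CGAffineTransformMakeScale(scale, scale), translate);
Note that this math assumes portrait-oriented bounds. Also, seeing 320 x 568 reported on an iPhone 7 is commonly a sign that the app is running in the scaled compatibility mode because it lacks a launch screen (or launch images) for that device; fixing that usually makes UIScreen report the real 375 x 667 points.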
I am working on an Objective-C app in which I need to set the width and height of a UIView in pixels and then scale that UIView to match the device's width. The width and height values in pixels are fixed (let's say 900 px x 500 px).
Currently I am doing this:
UIScreen* mainScreen = [UIScreen mainScreen];
CGRect frame = CGRectMake(0, 0, 900.0, 500.0);
[view setFrame:frame];
CGFloat valScale = mainScreen.bounds.size.width/900.0;
[view setContentScaleFactor:valScale];
But this is not giving me the desired values.
What should I do?
(P.S.) Does view.frame.size.width return the width in pixels or points?
Hi, for scaling a view you can use the transform property. Say you wish to scale your UIView by a factor of 2; then you can use:
self.view.transform = CGAffineTransformScale(CGAffineTransformIdentity, 2, 2);
A CGRect's width and height are in points, not pixels. One point corresponds to two pixels on 2x devices such as the iPhone 6 and 7, and to three pixels on 3x devices such as the iPhone 6 Plus and 7 Plus.
For more detail you can refer to Apple's documentation.
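Applied to the 900 px x 500 px case from the question, a minimal sketch (assuming the goal is "lay the view out at its fixed pixel size, then scale it so it spans the screen width"; variable names are mine):
UIScreen *mainScreen = [UIScreen mainScreen];
// Convert the fixed pixel dimensions to points using the screen's scale factor.
CGFloat widthInPoints = 900.0 / mainScreen.scale;
CGFloat heightInPoints = 500.0 / mainScreen.scale;
[view setFrame:CGRectMake(0, 0, widthInPoints, heightInPoints)];
// Scale the view so its rendered width matches the screen width.
CGFloat factor = mainScreen.bounds.size.width / widthInPoints;
view.transform = CGAffineTransformScale(CGAffineTransformIdentity, factor, factor);
(setContentScaleFactor: only affects how a layer's contents are rasterized, not the view's on-screen size, which is presumably why the original code had no visible effect.)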
For the iPhone 6 and 6 Plus I used the following code to stretch the camera to full screen.
I basically stretch the camera to full screen using cameraViewTransform. Because of this, the captured image contains more than what is visible in the preview. How can I capture only the part that is visible in preview mode?
Refer to the following screenshots for clarification: iPhone 6 (preview mode vs. captured image) and iPhone 5 (preview mode vs. captured image).
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// set the aspect ratio of the camera
float heightRatio = 4.0f / 3.0f;
// calculate the height of the camera based on the screen width
float cameraHeight = screenSize.width * heightRatio;
// calculate the ratio that the camera height needs to be scaled by
float scale = screenSize.height / cameraHeight;
// move the controller to the center of the screen
self.imagePickerController.cameraViewTransform = CGAffineTransformMakeTranslation(0, (screenSize.height - cameraHeight) / 2.0);
// concatenate the scale transform
self.imagePickerController.cameraViewTransform = CGAffineTransformScale(self.imagePickerController.cameraViewTransform, scale, scale);
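No answer to the cropping part is quoted here, but one hedged approach: because the scaled preview bleeds off the left and right edges, only a central vertical strip of the full 4:3 capture is visible on screen, and that strip can be cut out of the captured photo. The sketch below assumes capturedImage has already been normalized to the .up orientation (a raw camera capture is usually rotated, so its pixel buffer would need to be redrawn first); variable names are mine.
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
CGFloat scale = screenSize.height / (screenSize.width * 4.0f / 3.0f);
// Only 1/scale of the capture's width was visible inside the screen.
CGFloat visibleFraction = 1.0f / scale;
CGImageRef full = capturedImage.CGImage;
size_t fullWidth = CGImageGetWidth(full);
size_t fullHeight = CGImageGetHeight(full);
// Central vertical strip of the image, matching what the stretched preview showed.
CGRect cropRect = CGRectMake(fullWidth * (1.0 - visibleFraction) / 2.0, 0.0,
                             fullWidth * visibleFraction, fullHeight);
CGImageRef croppedRef = CGImageCreateWithImageInRect(full, cropRect);
UIImage *visiblePart = [UIImage imageWithCGImage:croppedRef
                                           scale:capturedImage.scale
                                     orientation:capturedImage.imageOrientation];
CGImageRelease(croppedRef);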
I'm making a full-screen camera for the iPhone 5 and have the following code to scale the 4:3 camera preview so it fills the entire 320 x 568 point screen. The left and right sides bleed off the screen.
I have to move the cameraView down 71 points for it to be centered on the screen; otherwise there's a black bar at the bottom. I'm not quite sure why. Because I don't know why this happens, I can't figure out how to compute the adjustment dynamically to accommodate the iPhone 6 and 6 Plus.
Any help is appreciated.
// get the screen size
CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// establish the height to width ratio of the camera
float heightRatio = 4.0f / 3.0f;
// calculate the height of the camera based on the screen width
float cameraHeight = screenSize.width * heightRatio;
// calculate the ratio that the camera height needs to be scaled by
float ratio = screenSize.height / cameraHeight;
//This slots the preview exactly in the middle of the screen by moving it down 71 points (for iphone 5)
CGAffineTransform translate = CGAffineTransformMakeTranslation(0.0, 71.0);
self.camera.cameraViewTransform = translate;
CGAffineTransform scale = CGAffineTransformScale(translate, ratio, ratio);
self.camera.cameraViewTransform = scale;
This finally clicked in my head. Since we know that the camera preview will always be the same width as the screen:
//getting the camera height by the 4:3 ratio
int cameraViewHeight = SCREEN_WIDTH * 1.333;
int adjustedYPosition = (SCREEN_HEIGHT - cameraViewHeight) / 2;
CGAffineTransform translate = CGAffineTransformMakeTranslation(0, adjustedYPosition);
self.imagePicker.cameraViewTransform = translate;
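(SCREEN_WIDTH and SCREEN_HEIGHT are presumably macros for [UIScreen mainScreen].bounds.size.width and .height.) Plugging in numbers shows where the hard-coded 71 came from: on an iPhone 5 the screen is 320 x 568 points and the 4:3 preview is 320 x 426.67 points, so (568 - 426.67) / 2 ≈ 70.7, which rounds to 71. The same formula gives (667 - 500) / 2 = 83.5 on an iPhone 6 and (736 - 552) / 2 = 92 on a 6 Plus.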
How do I get the screen width and height only in landscape orientation? I have two iPad 4 tablets; on one of them I get the landscape width and height, and on the other I get the portrait values even though its orientation is landscape.
At the moment I am using this, but it's not working well:
CGFloat width = self.view.bounds.size.width;
CGFloat height = self.view.bounds.size.height;
Use [UIScreen mainScreen].bounds and you'll see the same behavior.
Discussion
This rectangle is specified in the current coordinate space, which takes into account any interface rotations in effect for the device. Therefore, the value of this property may change when the device rotates between portrait and landscape orientations.
Use [UIScreen mainScreen].nativeBounds (iOS 8 and later only) to get the portrait-locked bounds.
Discussion
This rectangle is based on the device in a portrait-up orientation. This value does not change as the device rotates.
Swift 3
let pixelWidth = UIScreen.main.nativeBounds.width
let pixelHeight = UIScreen.main.nativeBounds.height
let pointWidth = pixelWidth / UIScreen.main.nativeScale
let pointHeight = pixelHeight / UIScreen.main.nativeScale
print ("Pixels: \(pixelWidth) x \(pixelHeight)")
print ("Points: \(pointWidth) x \(pointHeight)")
On a 6s Plus this will print...
Pixels: 1080.0 x 1920.0
Points: 414.0 x 736.0
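If all you need is the landscape-ordered dimensions regardless of the current rotation state, one simple option (back in Objective-C, to match the rest of the thread) is to take the larger and smaller of the two point values:
CGSize size = [UIScreen mainScreen].bounds.size;
// The landscape width is always the larger dimension, whatever the current rotation.
CGFloat landscapeWidth = MAX(size.width, size.height);
CGFloat landscapeHeight = MIN(size.width, size.height);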