I'm writing a Metal app for iPhone. I have tons of OpenGL experience, so it shouldn't be too hard, right?
WRONG.
I'm rendering a 2D scene of rectangles with no aspect-ratio correction - the vertices are in [-1,1] x [-1,1] coordinates, so it should fill the entire screen and distort the scene to fit.
BTW, this is running on a relatively new iPhone, on iOS 12.1.2 (16C101).
In landscape mode (width > height), this is what I get (screencapped image): https://imgur.com/r7gJXct. Half of the screen is blank.
In portrait mode (height > width), I get exactly what I expected (distorted squares): https://imgur.com/ZoPoHhR.
I think what is happening is that Metal just renders the portrait mode, and clips whatever goes off the screen, without "squishing" it to the screen viewport.
The code is just basic Metal code with no special configuration. I took the default Metal iOS project template, removed the code in Renderer, and followed the Hello Triangle tutorial, with uniforms sent to the shader and different vertex data.
How should I go about fixing this "bug"?
Is it even a bug?
Figured it out!
I guess in func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize), I have to add the line metalLayer.frame = view.layer.frame and it works.
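For reference, here's a minimal sketch of where that line goes (assuming metalLayer is the CAMetalLayer the renderer draws into; the names come from my project, not from the template):

func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
    // Keep the layer's frame in sync with the rotated/resized view;
    // otherwise the drawable keeps the old geometry and the render
    // gets clipped instead of scaled to the new viewport.
    metalLayer.frame = view.layer.frame
}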
I'm working on a universal iOS SpriteKit/Swift game, and I've run into a problem.
I have a 15x50 texture that needs to stay 15x50. When I render the scene with the resize-fill scale mode, the texture gets stretched vertically. But if I render the scene with the aspect-fill scale mode, the textures are perfect but the scene itself is too big: the view doesn't show everything, and elements that should be on the screen are way off. How do I tackle this problem? I can't seem to find anyone with a similar problem, so I'm feeling a bit lost.
Thanks, y'all.
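No answer was posted for this one, but a common approach (a sketch, not from this thread) is to derive the scene size from the view's aspect ratio at a fixed design height, so that aspect fill maps one scene point to one screen point and neither stretches textures nor pushes elements off screen:

import SpriteKit

// Sketch: size the scene to the view's aspect ratio at a hypothetical
// design height of 320 points, then present with .aspectFill. The
// 15x50 texture keeps its proportions and nothing is cropped.
func presentScene(in view: SKView) {
    let designHeight: CGFloat = 320
    let aspect = view.bounds.width / view.bounds.height
    let scene = SKScene(size: CGSize(width: designHeight * aspect,
                                     height: designHeight))
    scene.scaleMode = .aspectFill
    view.presentScene(scene)
}

The trade-off is that the visible horizontal extent then varies per device, so gameplay-critical elements should be positioned relative to the scene size rather than at fixed coordinates.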
I'm writing a project in Xcode 7 / Swift 2 that is optimized for iPhone 6/6s (i.e. the project has a launch screen file and launch screen images for iPhone 6/6s).
Fortunately or unfortunately, iPhone 6 users have the ability to turn on the ‘Display Zoom’ setting on the device, which enlarges elements of the interface. When turned on, this setting effectively enlarges a standard iPhone 5 screen size to fit the iPhone 6 screen space, upsampling by a factor of 1.171875 (750 px / 640 px). This upsampling causes raster-based elements such as images, icons, or views that contain UIBezierPath() drawings to display blurred (mildly, but noticeably).
A few questions:
1 - How can I instruct elements (e.g. a UIView) on the Storyboard, in code, to disregard the Display Zoom setting when the user has turned it on?
2 - What techniques are there to ensure pixel-perfect accuracy remains when Display Zoom is on? (e.g. is it possible to render graphics using OpenGL, and if so, how?)
3 - Is it possible to replace a x2 image with a x4 image to reduce any blurring when Display Zoom is on? (i.e. will iOS downsample a x4 image to a x2 image on iPhone 6?)
4 - How can UIBezierPath() drawings maintain pixel-perfect accuracy when Display Zoom is on?
Appreciate any experienced responses on this conundrum. Thanks.
There's nothing you can do about this. A user who chooses zoomed mode is deliberately throwing away pixel accuracy. The points in the drawing no longer match the pixels on the screen one-to-one (or one-to-two or one-to-three or any integral ratio). This choice therefore blurs the screen for everything the user does, not just your app.
Nor can you detect what is happening, because in effect zoomed iPhone 6 is presented to your app as an iPhone 5 (and a zoomed 6 Plus is presented to your app as a 6).
As @matt says, there's nothing you can do about this for normal UIKit content.
However, for OpenGL ES or Metal content, you are able to opt out of the sampling that the device does and render straight into the device's physical coordinates, allowing for pixel-perfect drawing.
In a graphics app that uses Metal or OpenGL ES, content can be easily rendered at the precise dimensions of the display without requiring an additional sampling stage. This is critical in high-performance 3D apps that perform many calculations for each rendered pixel. Instead, create buffers to render into that are the exact resolution of the display.
OpenGL ES
Set the contentsScale of the CAEAGLLayer to [UIScreen mainScreen].nativeScale, or use a GLKView, which will do this automatically.
You will then want to create your framebuffer with the size of the device's physical coordinates.
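A rough Swift sketch of the idea (the class and method names are placeholders; assumes an existing EAGLContext):

import UIKit
import OpenGLES

class PixelPerfectGLView: UIView {
    override class var layerClass: AnyClass { CAEAGLLayer.self }

    var colorRenderbuffer: GLuint = 0

    func setUpDrawable(context: EAGLContext) {
        guard let eaglLayer = layer as? CAEAGLLayer else { return }
        // Render at the panel's physical resolution, even with Display Zoom on.
        eaglLayer.contentsScale = UIScreen.main.nativeScale
        // Renderbuffer storage allocated from the layer now matches the
        // display's physical pixel dimensions.
        glGenRenderbuffers(1, &colorRenderbuffer)
        glBindRenderbuffer(GLenum(GL_RENDERBUFFER), colorRenderbuffer)
        context.renderbufferStorage(Int(GL_RENDERBUFFER), from: eaglLayer)
    }
}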
Metal
Set the contentsScale of your CAMetalLayer to [UIScreen mainScreen].nativeScale, or use an MTKView, which will do this automatically.
You will also want to adjust the drawable size to account for the scale (lifted from the docs):
CGSize drawableSize = self.bounds.size;
drawableSize.width *= self.contentScaleFactor;
drawableSize.height *= self.contentScaleFactor;
metalLayer.drawableSize = drawableSize;
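The same adjustment in Swift, as a sketch (assumes a UIView subclass backed by a CAMetalLayer):

import UIKit
import QuartzCore

final class PixelPerfectMetalView: UIView {
    override class var layerClass: AnyClass { CAMetalLayer.self }

    override func didMoveToWindow() {
        super.didMoveToWindow()
        guard let metalLayer = layer as? CAMetalLayer,
              let screen = window?.screen else { return }
        // nativeScale differs from scale when Display Zoom is on
        // (and on the iPhone 6 Plus), so use it for pixel-perfect output.
        contentScaleFactor = screen.nativeScale
        var drawableSize = bounds.size
        drawableSize.width *= contentScaleFactor
        drawableSize.height *= contentScaleFactor
        metalLayer.drawableSize = drawableSize
    }
}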
See also this interesting blog post on how the iPhone 6 Plus renders content, plus the follow-up post specifically about Display Zoom.
We're developing an AIR app for Android and iOS. An important part of the app is taking photos. Using flash.media.CameraUI works perfectly on Android, but we experience problems on iOS.
In the iOS camera application, when we rotate the iPad, the orientation is wrong: if we rotate the iPad clockwise, the image is rotated anticlockwise. The UI buttons have the correct orientation, though, and when the photo is taken, the resulting bitmap has the right orientation based on the orientation of the camera, not the actual view on the screen.
Looking at different camera apps, I notice that when the iPad is rotated so the orientation changes clockwise, the camera does three things: first, the displayed image immediately becomes rotated 90 degrees anticlockwise (so it looks wrong); then the image slowly rotates 90 degrees clockwise to restore the correct orientation; in addition, the UI buttons change orientation so the text is displayed correctly.
It seems that our app only does the slow rotation without the initial immediate one, so the end result is wrong.
Anyone know how to fix this?
After some more research, I found that the issue isn't just restricted to AIR.
UIImagePickerController camera view rotating strangely on iOS 8 (pictures)
I am working with OpenGL ES 2.0 and GLKit for iOS.
My app only needs to run at a resolution of 480 by 320, just like the pre-iPhone 4 displays, as it uses retro-style graphics.
The texture graphics are made according to this resolution and a GLKit projection matrix of (0, 480, 0, 320).
This all looks fine on the 3GS, but on later models the scene (understandably) gets resized to stretch across the larger screen. This resizing results in an undesirable blurring/smoothing of the graphics, probably due to some default interpolation scheme.
Is it possible to affect the way this resizing is done by OpenGL? Preferably setting it to no interpolation where the pixels are just directly enlarged.
You need to set the scaling filters on the view's layer, like this.
self.layer.magnificationFilter = kCAFilterNearest;
self.layer.minificationFilter = kCAFilterNearest;
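In Swift this would look something like the following (a sketch; glkView stands in for your GLKView or whatever view hosts the GL content):

import GLKit

// Nearest-neighbor filtering keeps the retro pixels crisp when the
// 480x320 layer is enlarged to the screen's native size.
func useNearestNeighborScaling(on glkView: GLKView) {
    glkView.layer.magnificationFilter = .nearest
    glkView.layer.minificationFilter = .nearest
}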
I want to render virtual content over the camera image from the back-facing camera of the iPad 2. To achieve this, OpenGL ES is used to transform the content into the correct screen coordinates.
projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(FOV),
                                             16.0f / 9.0f, 0.05f, 5.0f);
The problem is the field of view parameter: for the virtual content to line up with the camera image, FOV has to match the camera's actual field of view.
There were several posts about the iPhone or the iPad 1; however, I couldn't find one yet for the iPad 2.
What is the field of view of the iPad 2 in landscape 16:9 HD mode?
This blog will probably help you:
http://hunter.pairsite.com/blogs/20110317/
In particular,
Using some basic trigonometry, this allowed me to determine that 4:3 stills taken with the iPad 2 back camera have an approximate 34.1 degree vertical field of view and an approximate 44.5 degree horizontal field of view. This equates roughly to a hypothetical 43mm focal length lens on a 35mm camera.
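If you need a 16:9 figure from those numbers, here's a back-of-the-envelope sketch. It assumes the HD video mode keeps the full ~44.5 degree horizontal field and crops the sensor only vertically, which is an assumption, not a measurement:

import Foundation

// Derive a vertical FOV for 16:9 from the measured ~44.5 degree
// horizontal FOV of 4:3 stills (assumes a vertical-only crop).
let horizontalFOV = 44.5 * Double.pi / 180.0
let verticalFOV16x9 = 2 * atan(tan(horizontalFOV / 2) * 9.0 / 16.0)
print(verticalFOV16x9 * 180 / .pi)  // roughly 25.9 degrees

// Note: GLKMatrix4MakePerspective takes the *vertical* FOV in radians,
// so this is the value that would go in its first parameter.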