I'm using the Qt 5.1 beta on iOS. I am deploying my app on an iPad.
The problems I am having concern how touch events are sensed and processed by Qt. As long as I keep the iPad oriented straight (i.e. the front camera is up), everything works fine. In that configuration, if I touch the screen, the coordinates of the touch point sensed through mousePressEvent(QMouseEvent *e) indicate that, as expected, the origin of the coordinate system is in the upper left corner of the screen.
When I turn my iPad, let's say to the left so that the camera is on the left, my UI correctly rotates so that the buttons I have are aligned to the screen. However, if I sense the touch events as described above, the origin of the coordinate system has not changed, so it is now in the lower left corner of the screen. Because of this, the buttons act as if they were not aligned to the screen but had rotated along with it (even though they are rendered aligned). So if I tap on the button as it is displayed, the touch is not sensed, but it is if I tap where the button would be had it not changed orientation along with the screen.
Does anyone have any idea what might cause this or how it could be fixed?
Please ask if you would like to see code. I did not include any because I am not sure which parts would be of interest, and my app is already quite big.
Thanks,
Corneliu
You should file a bug report about it with the trolls (the Qt developers), and also check whether there is already a bug report for it:
http://bugreports.qt-project.org/
In the meantime, you could push your coordinates through a mapping function of some sort based on the orientation of the device.
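Not from the original answer, just a rough sketch of what such a mapping function could look like, assuming the raw touch coordinates stay in the screen's native (primary) orientation while the UI rotates; the helper name `mapTouchToUi` is made up, and the flips for the 90/270 cases may need to be swapped for your device:

```cpp
#include <QGuiApplication>
#include <QScreen>
#include <QPointF>
#include <QSizeF>

// Map a touch point reported in the screen's native (primary-orientation)
// coordinates into the coordinate system of the rotated UI.
// 'raw' and 'nativeSize' are both expressed in the native frame.
QPointF mapTouchToUi(const QPointF &raw, const QSizeF &nativeSize)
{
    QScreen *screen = QGuiApplication::primaryScreen();
    // Note: to get live orientation() values you may first need to call
    // screen->setOrientationUpdateMask(...) with the orientations you care about.
    const int delta = screen->angleBetween(screen->primaryOrientation(),
                                           screen->orientation()); // 0, 90, 180 or 270

    switch (delta) {
    case 90:  // one landscape direction: swap axes, flip across the native width
        return QPointF(raw.y(), nativeSize.width() - raw.x());
    case 180: // upside down: flip both axes
        return QPointF(nativeSize.width() - raw.x(), nativeSize.height() - raw.y());
    case 270: // the other landscape direction
        return QPointF(nativeSize.height() - raw.y(), raw.x());
    default:  // orientations already agree, nothing to remap
        return raw;
    }
}
```

You would call something like this from mousePressEvent() before hit-testing, and drop it once the underlying bug is fixed.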
Also, calling `adjustSize()` on widgets tends to fix sizing and positioning with layouts.
Hope that helps.
I maintain an OpenGL app that's been running on iOS since 2010. It uses the full screen and hides the status bar. It launches without any .nib file and creates an OpenGL view & controller that, in turn, displays all app content.
What changes do I need to make so the app will work on iPhone X using the new 'safe area' layout design? Presumably the only real change is just creating my "EAGL" surface/view with the same dimensions and location as the safe area instead of the entire screen?
How you respect the safe area in a "fullscreen" app (like most GL, Metal, etc games) is really two questions: one of design, and one of implementation. (But it's easier to tackle them in the reverse of that order, so here goes...)
Making fullscreen OpenGL views
If you have a fullscreen view (e.g. the window's root view controller's view) and you just set its layerClass to CAEAGLLayer (as is par for the course in most OpenGL ES work), you get a view that covers the entire 1125 x 2436 rectangle of the iPhone X screen. (Be sure to set the scale, too, so you actually get all those pixels... 375 x 812 @ 1x scale probably looks hideous on that screen.)
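As an illustration (not code from the original answer), a minimal Swift sketch of such a view, assuming UIKit and OpenGL ES; the class name GLView is arbitrary:

```swift
import UIKit
import QuartzCore

class GLView: UIView {
    // Back the view with a CAEAGLLayer instead of a plain CALayer.
    override class var layerClass: AnyClass {
        return CAEAGLLayer.self
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Render at the screen's full resolution; leaving this at the default
        // can give you a 1x drawable that looks blurry on a 3x screen.
        contentScaleFactor = UIScreen.main.scale
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        contentScaleFactor = UIScreen.main.scale
    }
}
```

On iPhone X the screen scale is 3.0, which is what gives you the full 1125 x 2436 drawable mentioned above.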
That's probably the user experience you want for your app/game (and it's the one Apple encourages)... your 3D content extends all the way to the edges of the screen, around the curves at the bottom and the 🤘 at the top. That makes a much nicer UX than leaving black borders around all your content.
Designing fullscreen content for iPhone X
On the other hand, the existing design of how your OpenGL content appears may or may not fit well with the curiously shaped screen of iPhone X. If you have anything along the very top that the user is expected to see, it'll be obscured behind the camera/sensor/speaker cutout. Similarly, if you have anything important at the bottom, its edges will be cut off behind the curved corners.
In that case, you'll want to leave the unimportant parts of your fullscreen content (like a game's view of a 3D gameplay world) fullscreen, but inset any important content like UI overlays or interactive 3D elements. As for how you might do that, there are a couple of feasible approaches, each with tradeoffs:
Hard-code the iPhone X obstruction dimensions, detect when you're running on iPhone X, and fix your layout accordingly. This is straightforward, but not robust. If Apple decides to change the way software UI elements around screen edges (like the swipe-to-home indicator) work, or makes iPhone XI (or X2? or XX?) next year with a slightly different shape, you'll need to update again to adapt.
Use the Safe Area guides even though you're not using UIKit or Auto Layout to draw/position onscreen content. Ask the view for its safeAreaLayoutGuide and convert that guide's bounds to whatever coordinate system you use for positioning the content you draw with OpenGL. This is a little more work, but it ensures that your app is ready for any curveballs Apple throws in the future.
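A minimal sketch of that second approach, assuming your GL view fills the window and that you position overlays in pixel coordinates with a top-left origin; the helper name safeRectInPixels is made up:

```swift
import UIKit

// Safe area of the given view, converted from points to pixels so it lines
// up with the GL drawable's resolution. Uses a top-left origin, like UIKit;
// if your GL code uses a bottom-left origin, flip the y coordinate.
func safeRectInPixels(of view: UIView) -> CGRect {
    // layoutFrame is expressed in the view's own (point) coordinate system.
    let safeRect = view.safeAreaLayoutGuide.layoutFrame
    let scale = view.contentScaleFactor
    return CGRect(x: safeRect.origin.x * scale,
                  y: safeRect.origin.y * scale,
                  width: safeRect.width * scale,
                  height: safeRect.height * scale)
}
```

You would typically re-read this whenever the layout changes (for example in viewSafeAreaInsetsDidChange() or viewDidLayoutSubviews()), since the safe area changes when the device rotates.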
One more thing...
It uses the full screen and hides the status bar.
When designing for iPhone X, it's worth rethinking whether a "fullscreen" app should hide the status bar. On other iOS devices, showing the status bar means taking away useful space from your app's content. But on iPhone X, most apps don't have anything useful they can do with those "devil horn" corners anyway — your user might appreciate still being able to see the clock, battery, etc.
I have been looking for days for how to make a small window appear on screen that shows a zoomed view of what is under the finger, so you can move the finger around and see what is under it.
You can see this feature in scanner applications, for example, when you are cropping the margins with precision.
Sorry there is no code; honestly, I have been doing the homework, but I do not know how to start coding this. Thanks.
Here is an example: a screenshot from a scanner app. The circle shows a zoomed view of what is under the finger, which is in the right corner of the picture.
When I run my app on the simulator, the slider works great and is accurate.
However, when the app runs on my phone, the slider is not accurate. For example, my finger points to the value 100.0, and the moment I release my finger from the screen, the value jumps to 102.2 or 98.2 or 91.5. It never stops on the right value.
Can I fix it? Has anyone faced this problem before?
I would appreciate any help!
The simulator allows precise control using a mouse cursor. It doesn't represent reality at all compared to an actual iOS device.
A real iOS device requires the use of your much fatter finger. As you lift your finger, you actually touch different parts of the screen. This is typical, and there is not much you can do about it.
I have a really strange issue here. One of my testers found that when they rotate the device in increments of 180 degrees, the UI stops responding (it does not crash), but when it is rotated by 90 degrees, it works fine.
After some exploration with the Reveal app, I found that after the app becomes unresponsive, a UISnapshotView is covering my UI; when I tell that layer to hide, I find that my app is still working fine behind it.
I am not creating that layer, and I believe it's the layer that iOS uses to animate transitions and rotations. So my question is: what could be happening to cause that layer to get stuck, and to get stuck only when you rotate 180 degrees (i.e. flip the device to the opposite orientation)?
I do not do any custom animations, and I have no code being called on rotate.
Apparently this was related to some code I had found to remove the rounded corners on the split view, in case anyone else has this problem.
I am facing a strange issue with screen scrolling on a BlackBerry 9810 device and in the simulator.
I have a complete-order screen, which is shown when the user's order is confirmed.
At the top there is a VerticalFieldManager, which contains another VerticalFieldManager (containing LabelFields and ButtonFields) and a FlowFieldManager (containing images).
Now the problem I am facing is that whenever I scroll the screen up and down, many gray lines appear on the screen. It seems as if there is some screen refresh issue with the device. I tested on previous OS versions (4.5, 4.7, 5.0), and everything works just fine on them. The problem arises on OS versions above 6.0.
The correct screen, however, should look like this:
As you can see, these gray lines appear whenever I scroll the screen up and down. Any ideas on how to rectify this issue?
In the first image, it looks like you are trying to add a shadow effect at the top of the screen. The VerticalFieldManager uses a graphics optimization to improve scroll performance: instead of repainting everything, it picks up the pixels on screen in the layout area and shifts them. This works as long as all the painting code is relative to the virtual extent.
Certain UI effects, like a shadow effect, are relative to the screen rather than to the virtual extent, so this optimization picks up those effects and copies them elsewhere, which looks bad. It also tends to look just like your first image.
There are two ways to fix this:
Turn off the optimization: override isScrollCopyable to return false. Your visual problems should go away, but scrolling performance will suffer. (A sketch of this follows after the list.)
Don't add UI effects on top of a scrollable area.
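A sketch of the first option, assuming the scrolling container is a VerticalFieldManager; isScrollCopyable is the method named above, while the class name and constructor here are illustrative:

```java
import net.rim.device.api.ui.container.VerticalFieldManager;

// Illustrative subclass: disable the "copy the pixels and shift them"
// scroll optimization so the screen is fully repainted while scrolling.
public class NonCopyScrollManager extends VerticalFieldManager {

    public NonCopyScrollManager(long style) {
        super(style);
    }

    public boolean isScrollCopyable() {
        // Returning false turns off the scroll-copy optimization.
        return false;
    }
}
```

Scrolling will be slower, as noted, but the copied shadow pixels should no longer appear.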
I am very sorry for the late reply. However, I fixed the issue myself: I just overrode the paintBackground method in my class and called graphics.clear() inside it. This seems to fix the scrolling issue. I will try Michael's method too, though.
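For reference, a sketch of that fix as described; paintBackground and Graphics.clear() are the calls mentioned above, while the class it lives in is illustrative:

```java
import net.rim.device.api.ui.Graphics;
import net.rim.device.api.ui.container.VerticalFieldManager;

// Illustrative manager that clears its background region before painting,
// wiping any stale pixels left behind by the scroll optimization.
public class ClearBackgroundManager extends VerticalFieldManager {

    public ClearBackgroundManager(long style) {
        super(style);
    }

    protected void paintBackground(Graphics graphics) {
        // Clear the clipping region to the background colour first.
        graphics.clear();
        super.paintBackground(graphics);
    }
}
```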