For days I have been looking for a way to make a small window appear on screen showing a zoomed view of what is under the finger, so you can move the finger around and see what is beneath it.
You can see this feature in scanner applications, for example, when you are cropping the margins with precision.
Sorry there is no code; honestly, I have been doing the homework but I don't know how to start coding this. Thanks.
Here is an example: a screenshot from a scanner app. The circle, in the right corner of the picture, shows a zoomed view of what is under the finger.
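One common approach to building such a loupe is to render the touched view's layer into a small circular view, translated and scaled so the touch point lands magnified at its center. Here is a minimal sketch; the class name `MagnifierView`, the 120-point size, the 2.0 zoom factor, and the 80-point offset above the finger are all illustrative assumptions, not a definitive implementation.

```swift
import UIKit

/// A rough sketch of a magnifier "loupe": a small circular view that
/// renders a zoomed-in snapshot of whatever lies under the finger.
class MagnifierView: UIView {
    weak var viewToMagnify: UIView?   // usually the view the user is touching
    var touchPoint: CGPoint = .zero {
        didSet {
            // Keep the loupe just above the finger and redraw its contents.
            center = CGPoint(x: touchPoint.x, y: touchPoint.y - 80)
            setNeedsDisplay()
        }
    }

    override init(frame: CGRect) {
        super.init(frame: CGRect(x: 0, y: 0, width: 120, height: 120))
        backgroundColor = .white
        layer.cornerRadius = 60       // makes the loupe circular
        layer.masksToBounds = true
        layer.borderWidth = 2
        layer.borderColor = UIColor.lightGray.cgColor
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) not supported") }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext(),
              let target = viewToMagnify else { return }
        // Center the drawing on the touch point, scale up 2x, and
        // render the target view's layer into the loupe.
        context.translateBy(x: bounds.midX, y: bounds.midY)
        context.scaleBy(x: 2.0, y: 2.0)
        context.translateBy(x: -touchPoint.x, y: -touchPoint.y)
        target.layer.render(in: context)
    }
}
```

The usual wiring would be: add the loupe to the window in `touchesBegan`, update `touchPoint` in `touchesMoved`, and remove it in `touchesEnded`.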
Related
When I double tap, I get this weird pill-shaped mini display that shows what my finger touches (without magnifying it or anything). Scrolling does not work while it is displayed. What's the point of it, and can I disable it in my PWA?
I can provide a partial answer to your question. This pill-shaped view essentially shows the user what is under their finger or thumb. It aids in text selection (for copying certain parts of text) or cursor placement while editing text.
As for whether you can disable it in your PWA, I am not sure, but essentially this is default WebView behaviour. I hope someone can help you out with this, or maybe you're able to figure it out.
I'm looking for some suggestions on how to approach an animation/functionality in an iPad app.
My client has tasked me with rebuilding an old app of theirs. They don't have the original code, only the app installed on a simulator on their Mac. The video below was made using the old app on a simulator.
The best way to describe what I'm trying to do is to simply show you a short clip, here: https://youtu.be/odft0pNGdvg
Basically, I need to slide the background/scene/sprite/whatever it may be, over to reveal another panel. Then move tiles over from the new panel onto the main panel. The main panel, while moved somewhat over, can also still be interacted with.
I have a good portion of the app built with SpriteKit already; this last big feature eludes me though, lol. Any ideas on how to approach this? One idea I had was stacked SKViews or SKScenes, but I can't really figure that out.
Thanks in advance for any ideas!
Speaking in UIKit terms, I can see a "slide menu" where the left view has a double animation of scaling and fading. Speaking in SpriteKit terms, if you want to build something similar to that video, you can make an SKScene with two SKNodes, with a couple of buttons on top of the first SKNode to change its behaviour. Each node should have a "background" sprite, where the left node takes 55% or 60% of the full scene width, and the second node takes the full size of the scene.
The initial state would be: the left node hidden and zoomed out, and the right node covering the full screen with a zPosition greater than the left node's.
Then you should implement the swipe gestures and animate the nodes to follow your demonstration video: on a right swipe, the right node moves to the right, and at the same time the left node appears, scaling and fading into the left part of the screen. It should be simple to do.
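The two-node layout above can be sketched roughly as follows; the node names, colors, 60% width, and 0.3-second animation timings are illustrative assumptions to show the structure, not the original app's code.

```swift
import SpriteKit

// A minimal sketch of a scene with a hidden side panel (leftNode)
// and a full-screen main panel (rightNode) that slides right.
class PanelScene: SKScene {
    let leftNode = SKNode()    // side panel, ~60% of the scene width
    let rightNode = SKNode()   // main panel, full scene size

    override func didMove(to view: SKView) {
        // Right node covers the whole scene and sits on top.
        let rightBackground = SKSpriteNode(color: .darkGray, size: size)
        rightBackground.position = CGPoint(x: size.width / 2, y: size.height / 2)
        rightNode.addChild(rightBackground)
        rightNode.zPosition = 1
        addChild(rightNode)

        // Left node starts hidden and slightly zoomed out.
        let leftSize = CGSize(width: size.width * 0.6, height: size.height)
        let leftBackground = SKSpriteNode(color: .gray, size: leftSize)
        leftBackground.position = CGPoint(x: leftSize.width / 2, y: size.height / 2)
        leftNode.addChild(leftBackground)
        leftNode.alpha = 0
        leftNode.setScale(0.9)
        leftNode.zPosition = 0
        addChild(leftNode)

        let swipeRight = UISwipeGestureRecognizer(target: self,
                                                  action: #selector(revealPanel))
        swipeRight.direction = .right
        view.addGestureRecognizer(swipeRight)
    }

    @objc func revealPanel() {
        // Slide the main panel right while the side panel fades and
        // scales in, roughly matching the video.
        rightNode.run(SKAction.moveBy(x: size.width * 0.6, y: 0, duration: 0.3))
        leftNode.run(SKAction.group([
            SKAction.fadeIn(withDuration: 0.3),
            SKAction.scale(to: 1.0, duration: 0.3)
        ]))
    }
}
```

Since both panels are nodes in the same scene, the main panel stays interactive while shifted over, which matches the behaviour described in the question.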
The iOS app Device 6 has really left a mark on me, and as an iOS developer, I am confused as to how the app was created.
Does anyone know how they were able to create parallax scrolling with non-interactive text all around the image?
Here's a link demoing the app; about 16 seconds into the video, you can see the parallax scrolling.
https://www.youtube.com/watch?v=-VdeB9_q9nU
P.S. For anyone confused: basically, I created a label with a few lines of text. Underneath that label, I want to place a box that's 2x2 inches. Inside that box should be an image, and as the user scrolls down, the image inside the box should scroll down too, just at a different speed. I have no idea how to do that! For a visual reference, just see the video link I posted above.
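One common way to get this effect is to put the image inside a clipping box and, in the scroll view's delegate, shift the image by a fraction of the scroll distance. This is a minimal sketch, assuming a plain `UIScrollView`; the 0.5 parallax factor, frame sizes, and view names are illustrative choices.

```swift
import UIKit

// Sketch: an image inside a clipping box scrolls at half the speed
// of the surrounding content, producing a parallax effect.
class ParallaxViewController: UIViewController, UIScrollViewDelegate {
    let scrollView = UIScrollView()
    let imageBox = UIView()        // the fixed-size box; clips its contents
    let imageView = UIImageView()  // taller than the box so it can slide

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.contentSize = CGSize(width: view.bounds.width, height: 2000)
        scrollView.delegate = self
        view.addSubview(scrollView)

        // The box scrolls normally with the rest of the content.
        imageBox.frame = CGRect(x: 40, y: 300, width: 200, height: 200)
        imageBox.clipsToBounds = true
        scrollView.addSubview(imageBox)

        // The image is taller than the box, leaving room to slide.
        imageView.frame = CGRect(x: 0, y: -100, width: 200, height: 400)
        imageBox.addSubview(imageView)
    }

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        // Shift the image by a fraction of the scroll distance relative
        // to the box, so it appears to move at a different speed.
        let relativeOffset = scrollView.contentOffset.y - imageBox.frame.minY
        imageView.frame.origin.y = -100 + relativeOffset * 0.5
    }
}
```

The surrounding text stays non-interactive simply because labels don't handle touches by default; only the scroll view consumes the gesture.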
Recently I found some weird iPad behavior with touches. I have a UITableView that slides in from the right edge of the screen on swipe (like the Facebook app has on the left side). In my implementation I have added a strip of UIView with a swipe gesture recognizer. My application is in landscape mode only.
Now, since the view comes out from the right edge, the natural gesture is to start swiping the finger from outside the edge of the screen. The menu shows up perfectly every time if the swipe starts from the edge that has the home button. However, in landscape-right mode, i.e. when the camera edge is on the right, and I swipe from that end, the gesture gets recognized only once in 3-4 attempts.
I implemented the touchesBegan method in the same class and got exactly the same behavior.
Does my application fail to identify touches from that edge (and only selectively, at that)? What exactly is happening? It looks like I've hit a dead end on the coding front.
Please help.
Thanks
Since you want the touch to begin from the edge of the screen, why not use UIScreenEdgePanGestureRecognizer? This class is specifically used for touches that begin near the edge of the screen.
There is a simple example here.
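Since the linked example may not survive, here is a minimal sketch of wiring up `UIScreenEdgePanGestureRecognizer` for the right edge; the controller name, handler name, and the menu logic inside the switch are placeholders to be replaced with your own sliding-menu code.

```swift
import UIKit

// Sketch: recognize pans that begin at the right screen edge.
class MenuViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let edgePan = UIScreenEdgePanGestureRecognizer(
            target: self, action: #selector(handleEdgePan(_:)))
        edgePan.edges = .right   // only fires for pans starting near the right edge
        view.addGestureRecognizer(edgePan)
    }

    @objc func handleEdgePan(_ gesture: UIScreenEdgePanGestureRecognizer) {
        switch gesture.state {
        case .began, .changed:
            // Track the finger: e.g. move your menu in from the right
            // by gesture.translation(in: view).x points.
            break
        case .ended:
            // Decide whether to finish revealing the menu or snap it back,
            // e.g. based on gesture.velocity(in: view).x.
            break
        default:
            break
        }
    }
}
```

Note that `edges` is interpreted in the current interface orientation, so `.right` keeps meaning the visually right edge after rotation.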
Is your app full-screen? If so, then the iOS system is trying to pull down SBSettings from that side.
I'm seeing the exact same thing. And I do use UIScreenEdgePanGestureRecognizer; I even apply the recognizer to the whole screen view.
Also, I see the same behavior in "portrait-upside-down" orientation.
I think it's a bug in the touch recognition in iOS 7 and 8 (actually, until I saw this post I thought it was the gesture recognition, but that doesn't explain the same behavior in generic touch handling).
I'm using the Qt 5.1 beta on iOS. I am deploying my app on an iPad.
The problems I am having concern how touch events are sensed and processed by Qt. As long as I keep the iPad oriented upright (i.e. the front camera is at the top), everything works fine. In that configuration, if I touch the screen, the coordinates of the touch point sensed through mousePressEvent(QMouseEvent *e) indicate that, as expected, the origin of the coordinate system is in the upper-left corner of the screen.
When I turn my iPad, say to the left, so that the camera is on the left, my UI correctly rotates so that my buttons stay aligned with the screen. However, if I sense the touch events as described above, the origin of the coordinate system has not changed, so it is now in the lower-left corner of the screen. Because of this, the buttons act as if they were not aligned to the screen but had turned around the same way I turned the screen (even though they are rendered aligned). So if I tap on a button as displayed, it won't sense the touch, but it will if I tap where the button would be had it not changed orientation with the screen.
Does anyone have any idea what might cause this or how it could be fixed?
Please ask if you would like to see code. I did not put any as I would not know what might be of interest and my app is quite big already.
Thanks,
Corneliu
You should file a bug report with the trolls (the Qt developers) about it. Also check whether there is already a bug report about it:
http://bugreports.qt-project.org/
In the meantime, you could push your coordinates through a mapping function of some sort, based on the orientation of the device.
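The mapping itself is just a 90-degree coordinate rotation; sketched here as a pure function (in Swift for readability, but the same arithmetic carries over to a Qt/C++ helper). Which case corresponds to which Qt orientation value is an assumption you would need to verify on the device.

```swift
import CoreGraphics

// The four standard 90-degree remappings of a raw touch point.
// `rawSize` is the screen size in the raw (portrait) coordinate system.
enum UIRotation { case none, left90, right90, upsideDown }

func mapTouch(_ p: CGPoint, rawSize: CGSize, rotation: UIRotation) -> CGPoint {
    switch rotation {
    case .none:
        return p
    case .left90:      // UI rotated 90° counterclockwise relative to raw coords
        return CGPoint(x: rawSize.height - p.y, y: p.x)
    case .right90:     // UI rotated 90° clockwise relative to raw coords
        return CGPoint(x: p.y, y: rawSize.width - p.x)
    case .upsideDown:  // UI rotated 180°
        return CGPoint(x: rawSize.width - p.x, y: rawSize.height - p.y)
    }
}
```

Calling this at the top of the mouse-press handler, before any hit testing, would make the button hit areas line up with where they are drawn.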
Also, calling `adjustSize()` on widgets tends to fix sizing and positioning with layouts.
Hope that helps.