I'm trying to build a set of buttons that behave slightly differently from regular buttons. The requirements are:
1. When a user's finger slides over a button, it should highlight (a custom image changes).
2. When a user's finger slides off the button, it reverts the highlight.
3. When a user's finger slides off the button and onto a new button (without lifting the finger), the new button highlights and the old one reverts.
4. If a user's finger is released while on top of the button, the button triggers and the highlight stays.
I think I can implement 1, 2 and 4 using the existing button framework.
However, 3 is not possible, as the system continues to register touches on the original button when I drag off it, and does not register touches on the new button unless I release. Any ideas?
Related
I have developed a custom tab bar (ready-to-paste playground source), but now, in addition to the user tapping the buttons to switch views, I would like to be able to swipe a finger over the tab bar and have views activated as I drag over the tab bar buttons.
I tried listening with DragGesture(minimumDistance: 0) on the individual buttons.
This helps in activating the view on touch down instead of touch up (which is how the default Button works), but it will only activate the button where the user started dragging.
I assume I would somehow need to add .simultaneousGesture(DragGesture(minimumDistance: 0)) to the whole tab bar, and I could then probably interpret the touch coordinates against individual button hit tests.
However, this doesn't feel like the SwiftUI way - is there an easier way to let SwiftUI do the heavy lifting?
(Please note - in the playground I now get AttributeGraph: cycle detected through attribute 2584 notices, and the actual selection lags one tap behind the last tapped button for some reason, but it works okay in an Xcode project.)
I don't know if it's the best solution, but I:
Saved each button's frame in global coordinates
Used DragGesture(minimumDistance: 0, coordinateSpace: .global) on the tab bar to manually hit test each button
And disabled hit testing on individual buttons with .allowsHitTesting(false), otherwise they would 'steal' the gesture.
Here's a diff of changes
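For reference, a rough, self-contained sketch of that approach (the tab count, titles, and layout here are illustrative, not the actual playground source):

```swift
import SwiftUI

// Each button reports its frame in global coordinates via a preference,
// and the bar itself owns a single drag gesture that hit-tests those frames.
struct TabBar: View {
    @State private var selection = 0
    @State private var frames: [Int: CGRect] = [:]   // index -> global frame

    var body: some View {
        HStack(spacing: 0) {
            ForEach(0..<3) { index in
                Text("Tab \(index)")
                    .frame(maxWidth: .infinity, minHeight: 44)
                    .background(selection == index ? Color.blue : Color.gray)
                    .background(GeometryReader { proxy in
                        // Record this button's frame in global coordinates.
                        Color.clear.preference(
                            key: FramesKey.self,
                            value: [index: proxy.frame(in: .global)]
                        )
                    })
                    .allowsHitTesting(false)  // let the bar own the gesture
            }
        }
        .onPreferenceChange(FramesKey.self) { frames = $0 }
        .gesture(
            DragGesture(minimumDistance: 0, coordinateSpace: .global)
                .onChanged { value in
                    // Manual hit test against the saved frames.
                    if let hit = frames.first(where: { $0.value.contains(value.location) }) {
                        selection = hit.key
                    }
                }
        )
    }
}

private struct FramesKey: PreferenceKey {
    static var defaultValue: [Int: CGRect] = [:]
    static func reduce(value: inout [Int: CGRect], nextValue: () -> [Int: CGRect]) {
        value.merge(nextValue(), uniquingKeysWith: { _, new in new })
    }
}
```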
In my React Native app I'm trying to have a button that the user can long press and then, without lifting their finger, interact with another view. Here is roughly what I want:
Think of it like how 3D Touch/long press worked prior to iOS 13/14 (depending on the place in the system and the device): the user either 3D touched or long pressed a button, for example an app icon, and a contextual menu popped up. Then, without lifting the finger, the user could slide onto one of the menu's buttons and release, triggering that button's tap.
I have complete control over my buttons, touchables, and views (even the tab bar is custom, as opposed to the illustrations I made above).
How can I achieve this? (I'm on React Native 0.63)
There may be a better solution to this, but off the top of my head I would use the Gesture Responder System:
https://reactnative.dev/docs/gesture-responder-system
You can have one container view that wraps the tab bar and buttons. Then listen to the onResponderMove event to decide when these buttons should appear. This may happen, for example, when locationY exceeds some value.
You can also use the onResponderRelease event (again with the help of the locationX and locationY parameters) to determine whether the finger was released over the button.
Example 1:
When invoking 3D Touch on an app icon, you are able to make selections without lifting your finger.
Example 2:
Long pressing on a keyboard key allows you to drag to different selections without lifting your finger.
If the app icon is the first view and the pop-up is the second view, how can I transfer the touch from the first view to the second view?
Normally, a view loses control of the touches when the finger leaves its area. But if you set isMultipleTouchEnabled to true, it will keep control over touches even if the finger leaves its area. If you use a button or another UIControl, you can assign actions to touchDragExit, touchUpOutside, touchDragOutside, etc. to handle events outside of the control.
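As a minimal sketch of that last point, assuming a plain UIButton for the icon and a hypothetical array of menu buttons standing in for the second view (all names here are made up for illustration):

```swift
import UIKit

final class IconViewController: UIViewController {
    private let appIcon = UIButton(type: .custom)
    private var menuButtons: [UIButton] = []   // populated when the menu is shown (omitted here)

    override func viewDidLoad() {
        super.viewDidLoad()
        appIcon.backgroundColor = .systemBlue
        appIcon.frame = CGRect(x: 40, y: 120, width: 64, height: 64)
        view.addSubview(appIcon)

        // Drag-related events keep firing on the control even while the
        // finger is outside its bounds, so they can drive the second view.
        appIcon.addTarget(self, action: #selector(dragged(_:event:)), for: .touchDragOutside)
        appIcon.addTarget(self, action: #selector(released(_:event:)), for: .touchUpOutside)
    }

    @objc private func dragged(_ sender: UIButton, event: UIEvent) {
        guard let location = event.allTouches?.first?.location(in: view) else { return }
        // Highlight whichever menu button the finger is currently over.
        for button in menuButtons {
            button.isHighlighted = button.frame.contains(location)
        }
    }

    @objc private func released(_ sender: UIButton, event: UIEvent) {
        guard let location = event.allTouches?.first?.location(in: view) else { return }
        // Trigger the menu button the finger was released over, if any.
        menuButtons.first(where: { $0.frame.contains(location) })?
            .sendActions(for: .touchUpInside)
    }
}
```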
So if any of you have used the Tumblr mobile app recently, you'll notice that the reblog function has a tap-and-hold capability. Essentially, when you tap and hold the reblog button, more buttons pop up around it, so the user can just drag their finger over one of the new buttons and release to select it. I've been digging around and no one seems to have an answer to this specific question. I've always seen this as a very elegant way to have a sub-menu and would like to implement it in my own apps. This is for iOS, by the way.
Add a gesture recognizer to the view, and write a method that creates the buttons when a long-press gesture is performed on the view.
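Something along these lines (a sketch only; the titles, layout, and pop-up buttons are made up):

```swift
import UIKit

// The long press pops up extra buttons, and the same recognizer keeps
// delivering location updates while the finger stays down, so we can
// highlight and trigger a pop-up button on release.
final class ReblogViewController: UIViewController {
    private let reblogButton = UIButton(type: .system)
    private var popUpButtons: [UIButton] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        reblogButton.setTitle("Reblog", for: .normal)
        reblogButton.frame = CGRect(x: 140, y: 400, width: 100, height: 44)
        view.addSubview(reblogButton)

        let longPress = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress(_:)))
        reblogButton.addGestureRecognizer(longPress)
    }

    @objc private func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        let location = gesture.location(in: view)
        switch gesture.state {
        case .began:
            showPopUpButtons(around: reblogButton.center)
        case .changed:
            // Highlight whichever pop-up button is under the finger.
            for button in popUpButtons {
                button.isHighlighted = button.frame.contains(location)
            }
        case .ended:
            // Trigger the button the finger was released on, then clean up.
            if let hit = popUpButtons.first(where: { $0.frame.contains(location) }) {
                hit.sendActions(for: .touchUpInside)
            }
            popUpButtons.forEach { $0.removeFromSuperview() }
            popUpButtons.removeAll()
        default:
            break
        }
    }

    private func showPopUpButtons(around center: CGPoint) {
        guard popUpButtons.isEmpty else { return }
        for (index, title) in ["Reblog", "Queue", "Draft"].enumerated() {
            let button = UIButton(type: .system)
            button.setTitle(title, for: .normal)
            button.frame = CGRect(x: center.x - 120 + CGFloat(index) * 90,
                                  y: center.y - 70, width: 80, height: 44)
            view.addSubview(button)
            popUpButtons.append(button)
        }
    }
}
```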
On the keyboard and in the native calculator app on iOS, it's possible to put your finger down on one button, like '0', and then move your finger to another button, like '1', release your finger, and have it enter '1'. On the calculator it darkens the button under your finger.
If you start pressing a button, drag your finger outside of the buttons, and then move it back in, it'll continue to highlight the buttons under your finger. However, if you don't start on a button—like you start dragging from the calculator results label—and drag onto the buttons, the buttons do not highlight.
What's the best approach to mimic the calculator's behaviour for buttons? I'm mostly looking for code structuring guidance rather than code examples here!
It seems I won't be able to encapsulate each button in its own view class; instead, I'll have to have a Keyboard view that handles all the touches and manually draws the buttons.
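Roughly, such a Keyboard view could look like this (a sketch; the key titles, layout, and callback are illustrative):

```swift
import UIKit

// One view owns all the touches: hit testing and highlighting happen
// in a single place, in touchesBegan/Moved/Ended.
final class KeypadView: UIView {
    private var keys: [UILabel] = []
    var keyTapped: ((String) -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        for (index, title) in ["7", "8", "9"].enumerated() {
            let key = UILabel(frame: CGRect(x: CGFloat(index) * 80, y: 0, width: 76, height: 76))
            key.text = title
            key.textAlignment = .center
            key.backgroundColor = .darkGray
            addSubview(key)   // UILabel doesn't intercept touches, so the keypad keeps them
            keys.append(key)
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        highlightKey(under: touches)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // The view keeps receiving the touch wherever it moves, so the key
        // under the finger can change mid-drag, like the calculator.
        highlightKey(under: touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let location = touches.first?.location(in: self),
           let key = keys.first(where: { $0.frame.contains(location) }) {
            keyTapped?(key.text ?? "")
        }
        keys.forEach { $0.backgroundColor = .darkGray }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        keys.forEach { $0.backgroundColor = .darkGray }
    }

    private func highlightKey(under touches: Set<UITouch>) {
        guard let location = touches.first?.location(in: self) else { return }
        for key in keys {
            key.backgroundColor = key.frame.contains(location) ? .lightGray : .darkGray
        }
    }
}
```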
I think the easiest way is to add handling for touch control events:
UIControlEventTouchDragInside
UIControlEventTouchDragOutside
UIControlEventTouchDragEnter
And link all the components with some kind of processing logic: the initial control event location and so on.
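For a single key, the wiring could look something like this (a hedged sketch using the Swift names for the events listed above; the cross-key tracking logic is left out, and all names are made up):

```swift
import UIKit

final class CalculatorViewController: UIViewController {
    private let zeroKey = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()
        zeroKey.setTitle("0", for: .normal)
        zeroKey.backgroundColor = .darkGray
        zeroKey.frame = CGRect(x: 20, y: 500, width: 80, height: 80)
        view.addSubview(zeroKey)

        // .touchDragEnter / .touchDragInside / .touchDragOutside are the Swift
        // names for the UIControlEventTouchDrag* constants listed above.
        zeroKey.addTarget(self, action: #selector(darken(_:)),
                          for: [.touchDown, .touchDragEnter, .touchDragInside])
        zeroKey.addTarget(self, action: #selector(undarken(_:)),
                          for: [.touchDragExit, .touchDragOutside, .touchCancel])
        zeroKey.addTarget(self, action: #selector(enterDigit(_:)), for: .touchUpInside)
    }

    @objc private func darken(_ key: UIButton) { key.backgroundColor = .lightGray }
    @objc private func undarken(_ key: UIButton) { key.backgroundColor = .darkGray }
    @objc private func enterDigit(_ key: UIButton) {
        undarken(key)
        print("entered \(key.currentTitle ?? "")")
    }
}
```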