I have a button that the user has to hold to perform its action.
When the user initially touches the button, an animation should start that updates the frame and colour of a given view.
While the user is still holding the button, after some time (2 seconds) the long press should be recognised and a new animation should take place.
When the user releases the button we need to reset the view to its original constraints and:
if not recognised : do nothing (for now)
if recognised : perform the action (unhide a view)
I tried UILongPressGestureRecognizer, but I didn't manage to get at the touch state before the long press is recognised.
I tried nesting UIView animations and stopping them with view.layer.removeAllAnimations(), but some animations were not being removed (I suspect because of the cycle of completion callbacks).
The effect, created on an apple watch, is what I want to recreate on my iphone app: http://natashatherobot.com/wp-content/uploads/MenuDemo.mp4
Can anyone point me in the right direction?
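One possible direction, sketched below under assumptions: use the button's plain touchDown/touchUp control events together with a Timer for the 2-second threshold, and a UIViewPropertyAnimator (which is interruptible and reversible, avoiding removeAllAnimations() entirely). All names, colours and the scale transform here are placeholders, not the asker's actual values:

```swift
import UIKit

class HoldButtonViewController: UIViewController {
    @IBOutlet var holdButton: UIButton!   // hypothetical outlets
    @IBOutlet var animatedView: UIView!
    @IBOutlet var hiddenView: UIView!

    private var animator: UIViewPropertyAnimator?
    private var timer: Timer?
    private var recognised = false

    override func viewDidLoad() {
        super.viewDidLoad()
        holdButton.addTarget(self, action: #selector(touchDown), for: .touchDown)
        holdButton.addTarget(self, action: #selector(touchUp),
                             for: [.touchUpInside, .touchUpOutside, .touchCancel])
    }

    @objc private func touchDown() {
        recognised = false
        // Interruptible press animation; transform avoids fighting Auto Layout.
        animator = UIViewPropertyAnimator(duration: 2.0, curve: .easeOut) {
            self.animatedView.transform = CGAffineTransform(scaleX: 1.2, y: 1.2)
            self.animatedView.backgroundColor = .systemRed
        }
        animator?.startAnimation()
        // After 2 seconds of holding, treat the long press as recognised.
        timer = Timer.scheduledTimer(withTimeInterval: 2.0, repeats: false) { _ in
            self.recognised = true
            // Start the second ("recognised") animation here.
        }
    }

    @objc private func touchUp() {
        timer?.invalidate()
        timer = nil
        if let animator = animator, animator.isRunning {
            // Still mid-press: just play the same animation backwards.
            animator.isReversed = true
        } else {
            // Press animation already finished; animate back explicitly.
            UIViewPropertyAnimator(duration: 0.3, curve: .easeIn) {
                self.animatedView.transform = .identity
                self.animatedView.backgroundColor = .systemBlue // original colour placeholder
            }.startAnimation()
        }
        if recognised {
            hiddenView.isHidden = false
        }
    }
}
```

Because the property animator is reversed rather than removed, there is no callback cycle to fight when the user releases early.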
In my React Native app I'm trying to have a button that the user can long press and then, without lifting their finger, interact with another view. Here is roughly what I want:
Think of it like how 3D Touch/long press worked prior to iOS 13/14 (depending on the place in the system and the device): the user either 3D touched or long pressed a button, for example an app icon, and a contextual menu popped up. Then, without lifting the finger, the user could slide onto one of the menu buttons and release, triggering that button's tap.
I have complete control over my buttons, touchables, and views (even the tab bar is custom, as opposed to the illustrations I made above).
How can I achieve this? (I'm on React Native 0.63)
There may be a better solution to this, but off the top of my head I would use the Gesture Responder System:
https://reactnative.dev/docs/gesture-responder-system
You can have one container view that wraps the tab bar and the buttons. Then listen to the onResponderMove event to decide when these buttons should appear. This may happen, for example, when locationY exceeds some value.
You can also use the onResponderRelease event (again with the help of the locationX and locationY parameters) to determine whether the finger was released above the button.
My question is that I have a UIButton inside my app. When I touch it and move my finger outside the button, the highlight persists up to a certain distance around it and only then returns to normal.
My expectation is that when my finger goes outside the button's boundary, the button should return to normal immediately.
Do you have any idea how to fix this?
I think you need to handle the drag events: when the finger drags outside the boundary of the button, the button should release its highlighted state. You can check the touchDragExit event here:
Triggering a UIButton's method when user drags finger into button's area?
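A minimal sketch of that idea: by default UIKit keeps a button highlighted within a grace zone (roughly 70pt) around its bounds, so to drop the highlight immediately you can subclass UIButton and re-check the finger's position on every tracking update. The class name here is illustrative:

```swift
import UIKit

// Drops the highlight as soon as the finger leaves the button's own bounds,
// instead of waiting for UIKit's extended tracking area to be exited.
class TightHighlightButton: UIButton {
    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        let keepTracking = super.continueTracking(touch, with: event)
        // Override whatever highlight state super just applied.
        isHighlighted = bounds.contains(touch.location(in: self))
        return keepTracking
    }
}
```

Alternatively, adding targets for .touchDragExit/.touchDragEnter works, but those events also respect the extended zone, which is why the subclass approach gives the "immediate" behaviour asked for.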
I'm creating a UIViewController that houses some of RosyWriter's functionality, which I'm reworking into a video recorder. The screen contains a title, a clipped-to-bounds subview that contains the AVCaptureVideoPreviewLayer (the video content's CALayer is added as a sublayer of that subview, so I think it's quite deeply nested), and two buttons acting as a toggle - Start and Stop buttons placed in the Storyboard for the UIViewController.
The Start button works fine, even though the video's preview layer is on screen and showing the camera. When I start recording, though, I swap the buttons, hiding the Start button and setting the Stop button's hidden to false.
The start button works - this is pressed when the video preview is on-screen and updating, but the actual recording (grabbing samples in buffers and writing them to a file - nothing UIKit related as far as I can see) has not started.
When the video recording is active, with the Stop button showing and the Start button hidden, the visible stop button isn't pressable, but the hidden start button can still be pressed.
I've tried moving the Stop button above the UIView containing the video, in case the CALayer or something else stretches outside the clipped UIView bounds. This doesn't help. The Stop button always acts as though it's not enabled - but it is enabled, and nothing appears to overlap the button. The button works fine if the UIView containing the video (which, I'll reiterate, is lower than the broken button) is never shown.
Can anyone think why this'd happen? I immediately thought about setNeedsLayout and setNeedsDisplay and tried just throwing some of those in, because it's almost as though the view had updated with my request to hide or show buttons, but an interaction layer hadn't updated.
Thanks
I now think this is due to my ignorant use (or misuse) of dispatch queues. I'm still not sure how I was able to interact with buttons that aren't showing - perhaps their state change is partly applied immediately, while the rest (the visual refresh, for example) can only take place on the main queue.
So I solved the problem, in a sense, by forcing a particular asynchronous delegate method to be called on the main queue. This doesn't affect the operation of this particular step in my recording process.
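The shape of that fix, sketched with hypothetical names (`RecorderViewController`, `recordingDidStart`, and the two outlets are placeholders for whatever the capture delegate actually calls): any UIKit mutation made from the background capture queue must hop to the main queue first.

```swift
import UIKit

extension RecorderViewController {
    // Called from the capture/writer queue, not the main queue.
    func recordingDidStart() {
        DispatchQueue.main.async {
            // UIKit state changes are only safe on the main queue.
            self.startButton.isHidden = true
            self.stopButton.isHidden = false
        }
    }
}
```

Toggling `isHidden` off the main queue can leave exactly the half-updated state described above: the views repaint eventually, but hit-testing and visuals fall out of sync in the meantime.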
I have a question about programming my own third-party keyboard for iOS 8. My keyboard already looks pretty good, but some functionality is missing. If you look at the standard keyboard on an iPhone, you can press any button, and if you swipe your finger to another button, the touch event of the first button gets cancelled and the second button becomes "active". So e.g. if I press the button "E", swipe my finger to "R" and release, the letter "R" is the selected one. But I don't know how to implement this.
Right now in my app, when I press a button and swipe my finger around, this button never gets "released". It seems like I'm stuck on that button as long as I keep my finger on the display.
I think I need these touch events:
TouchUpInside: when the user taps a button and releases it inside the button's frame, this event gets fired (that's when I want to write a letter).
TouchDragInside: the event fired when I already have my finger on the display and swipe it inside the button's frame.
TouchDragOutside: same as above, just swiping outside the button's frame.
But here's the problem: TouchDragInside only gets fired for the button I initially tapped. So when TouchDragOutside gets fired, I have to "release" that button and make the button under my finger the active one.
I hope you understand my question, if you need some further information or some details just let me know.
You could consider not using UIControl at all.
Just use a transparent overlay on top of all the keys so that you can manually deal with which key the message should be dispatched towards.
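A sketch of that overlay approach, under the assumption that `keys` holds the key buttons laid out beneath the overlay (the overlay sits on top, so it receives every touch and the keys never grab one for themselves):

```swift
import UIKit

// Transparent view above all the keys; it owns the touch and decides which
// key is active by hit-testing the finger's location on every move.
class KeyboardOverlayView: UIView {
    var keys: [UIButton] = []          // the key buttons underneath
    private var activeKey: UIButton?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateActiveKey(for: touches.first)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateActiveKey(for: touches.first)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Fire the key under the finger at release time, then clear state.
        activeKey?.sendActions(for: .touchUpInside)
        activeKey?.isHighlighted = false
        activeKey = nil
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeKey?.isHighlighted = false
        activeKey = nil
    }

    private func updateActiveKey(for touch: UITouch?) {
        guard let point = touch?.location(in: self) else { return }
        // Find the key whose frame contains the finger (frames live in the
        // key's superview's coordinate space, so convert first).
        let newKey = keys.first { $0.frame.contains(convert(point, to: $0.superview)) }
        if newKey !== activeKey {
            activeKey?.isHighlighted = false
            newKey?.isHighlighted = true
            activeKey = newKey
        }
    }
}
```

Because the overlay tracks the touch for the whole gesture, moving from "E" to "R" just changes `activeKey`, and releasing fires whichever key the finger ended on.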
I'm trying to build a set of buttons that behave slightly different than regular buttons. The requirements are:
When a user's finger slides over a button, it should highlight (a custom image changes).
When a user's finger slides off the button, it reverts the highlight.
When a user's finger slides off the button and onto a new button (without lifting the finger), a new button highlights and the old one reverts.
If a user's finger is released while on top of the button, the button triggers and the highlight stays.
I think I can implement 1, 2 and 4 using the existing button framework.
However, 3 is not possible: the system keeps routing touches to the button I first pressed even when I drag off it, and does not register touches on the new button unless I release. Any ideas?
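One way to get all four behaviours, sketched with illustrative names: since the first-touched button keeps the touch, let the buttons' container own the touch instead and move the highlight itself. This assumes each button has `isUserInteractionEnabled = false` so the container receives the events:

```swift
import UIKit

// Container that tracks one touch and slides the highlight between its
// buttons; the buttons themselves must not intercept touches.
class SlidingButtonBar: UIView {
    var buttons: [UIButton] = []       // direct subviews, interaction disabled
    private var current: UIButton?

    private func button(at touch: UITouch) -> UIButton? {
        let point = touch.location(in: self)
        return buttons.first { $0.frame.contains(point) }
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        current = button(at: touch)
        current?.isHighlighted = true          // requirement 1
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let next = button(at: touch)
        if next !== current {
            current?.isHighlighted = false     // requirements 2 and 3
            next?.isHighlighted = true
            current = next
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Requirement 4: trigger the button and keep it lit. isHighlighted
        // resets once tracking ends, so use isSelected (with a selected image).
        current?.sendActions(for: .touchUpInside)
        current?.isSelected = true
        current = nil
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        current?.isHighlighted = false
        current = nil
    }
}
```

The custom highlight images map onto the buttons' highlighted/selected states, so requirements 1-3 use `isHighlighted` and the persistent post-trigger look in 4 uses `isSelected`.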