How to manually control animation progress in an iOS app? - ios

I am planning a "tutorial/demo app" that should show the user how to control other apps/software by presenting a screenshot walkthrough, e.g.
Show Screenshot 1
Fade in a label above the screenshot that tells the user "Touch this button"
Run some click/touch animation above the button
Fade in Screenshot 2
Zoom in to some area of the screenshot
...
Creating this using basic UIView animation is not a big deal:
Start animation to show the label
Run click/touch animation with some delay
...
However, the user should be able to manually control the progress of this animation: move forward and backward, pause, etc., just as if the presentation were a movie that could be controlled using a timeline.
If the user does not do anything, the animation simply runs and some view (e.g. a UISlider) shows the progress. But if the user touches the slider the animation stops. By moving the slider back or forth, the user can rewind or fast forward the presentation.
How can this be done?
Stopping the animation is no problem, but UIView animations do not have any properties to control the progress, do they?

You can try this framework: https://github.com/IFTTT/RazzleDazzle
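Alternatively, if you can target iOS 10 or later, UIViewPropertyAnimator supports exactly this kind of scrubbing on its own. Below is a minimal sketch (not RazzleDazzle's API; the outlet names, durations, and keyframe times are made up for illustration) that drives the whole walkthrough with one animator and lets a UISlider scrub it. Connect the slider's Value Changed event to sliderChanged and its Touch Up Inside/Outside events to sliderReleased.

```swift
import UIKit

class WalkthroughViewController: UIViewController {
    // Hypothetical outlets for the walkthrough; connect them in the Storyboard.
    @IBOutlet var hintLabel: UILabel!
    @IBOutlet var screenshot2: UIImageView!
    @IBOutlet var progressSlider: UISlider!

    private var animator: UIViewPropertyAnimator!

    override func viewDidLoad() {
        super.viewDidLoad()
        hintLabel.alpha = 0
        screenshot2.alpha = 0

        // One animator encapsulates the whole sequence as keyframes;
        // the relative start times/durations map each step onto 0...1.
        animator = UIViewPropertyAnimator(duration: 6, curve: .linear) {
            UIView.animateKeyframes(withDuration: 6, delay: 0, options: [], animations: {
                UIView.addKeyframe(withRelativeStartTime: 0.0, relativeDuration: 0.2) {
                    self.hintLabel.alpha = 1        // fade in "Touch this button"
                }
                UIView.addKeyframe(withRelativeStartTime: 0.5, relativeDuration: 0.3) {
                    self.screenshot2.alpha = 1      // fade in Screenshot 2
                }
            }, completion: nil)
        }
        animator.startAnimation()
    }

    // While the user drags the slider, pause the animator and scrub it.
    @IBAction func sliderChanged(_ sender: UISlider) {
        animator.pauseAnimation()
        animator.fractionComplete = CGFloat(sender.value)
    }

    // When the slider is released, let the animation continue from that point.
    @IBAction func sliderReleased(_ sender: UISlider) {
        animator.continueAnimation(withTimingParameters: nil, durationFactor: 0)
    }
}
```

Conveniently, UISlider's value range defaults to 0...1, which matches fractionComplete directly.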

Related

React Native press and hold, drag finger to another touchable and capture touch by that view

In my React Native app I'm trying to have a button that the user can long press and, without lifting their finger, interact with another view. Here is roughly what I want:
Think of it like how 3D touch/long press worked prior to iOS 13/14 (depending on place in system and device): user either 3D touched or long pressed a button, for example an app icon, and a contextual menu popped up. Then, users could, without lifting the finger, hover onto one of the buttons and release their finger, triggering the button tap.
I have complete control over my buttons, touchables, and views (even the tab bar is custom, as opposed to the illustrations I made above).
How can I achieve this? (I'm on React Native 0.63)
There may be a better solution to this but off the top of my head I would use the Gesture Responder System
https://reactnative.dev/docs/gesture-responder-system
You can have one container view that wraps the tab bar and the buttons. Then listen to the onResponderMove event to decide when these buttons should appear. This may happen, for example, when locationY exceeds some value.
You can also use the onResponderRelease event (again with the help of locationX and locationY parameters) to determine if the finger was released above the button.

Animate view and stop midway

I want to make a view/header collapse/expand animation, but I also want to be able to pause or stop the expanding/collapsing when the user takes his finger off the screen.
I have a current implementation for this, but the animation is not very smooth.
I will also add some images to explain better what I want to do.
So the first image is how it should look when expanded.
The second image is how it should look if the user decides to take his finger off the screen while scrolling slowly, so the animation/view should stop in a state similar to that.
The third image is how it should look when it is collapsed.
If someone has any recommendation on how I could achieve this, I would be very grateful.
If you are targeting iOS 10+, use UIViewPropertyAnimator which encapsulates an animation and allows scrubbing and controlling the animation:
Start, pause, resume, and stop animations; see the methods of the UIViewAnimating protocol.
Add animation blocks after the original animations start using the addAnimations(_:) and addAnimations(_:delayFactor:) methods.
Scrub through a paused animation by modifying the fractionComplete property.
Change the animation’s direction using the isReversed property.
Modify the timing and duration of a partially complete animation by pausing the animation and using the continueAnimation(withTimingParameters:durationFactor:) method to finish it.
You just need to wire up touches to the UIViewPropertyAnimator object encapsulating the animation.
There are many good tutorials on how to start with it, e.g. this one.
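As a rough illustration of wiring touches to the animator, here is a sketch that scrubs a header-collapse animation with a pan gesture and then finishes or reverses it when the finger lifts. The constraint outlet, the 200/60 pt heights, and the 200-point drag distance are assumptions for illustration, not taken from the question.

```swift
import UIKit

class CollapsingHeaderViewController: UIViewController {
    // Assumed layout: a header whose height constraint goes from 200 pt (expanded)
    // down to 60 pt (collapsed). All names and numbers here are illustrative.
    @IBOutlet var headerHeightConstraint: NSLayoutConstraint!
    private var animator: UIViewPropertyAnimator?

    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        // Map 200 pt of upward drag onto animation progress 0...1.
        let progress = min(max(-translation.y / 200, 0), 1)

        switch gesture.state {
        case .began:
            // Create the animator paused, so it can be scrubbed interactively.
            let animator = UIViewPropertyAnimator(duration: 0.4, dampingRatio: 1) {
                self.headerHeightConstraint.constant = 60
                self.view.layoutIfNeeded()
            }
            animator.addCompletion { position in
                // If the animation was reversed, restore the expanded model value.
                if position == .start { self.headerHeightConstraint.constant = 200 }
            }
            animator.pauseAnimation()
            self.animator = animator
        case .changed:
            // Scrub the paused animation as the finger moves.
            animator?.fractionComplete = progress
        case .ended, .cancelled:
            // Past halfway: finish collapsing; otherwise reverse back to expanded.
            animator?.isReversed = progress < 0.5
            animator?.continueAnimation(withTimingParameters: nil, durationFactor: 0)
        default:
            break
        }
    }
}
```

Because the animation is created paused and only driven by fractionComplete, the header always stops exactly where the finger leaves it, which is what makes this smoother than restarting UIView animations.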

UIView: Inheriting Touch

Example 1:
When invoking 3D Touch on an app icon, you are able to make selections without lifting your finger.
Example 2:
Long pressing on a keyboard key allows you to drag to different selections without lifting your finger.
If the app icon is the first view and the pop up is the second view, how can I transfer touch down from first to second view?
Normally, a view loses control of the touches when the finger leaves its area. But if you set isMultipleTouchEnabled to true, it will keep control over the touches even if the finger leaves its area. If you use a button or another UIControl, you can assign actions to touchDragExit, touchUpOutside, touchDragOutside etc. to handle events outside of the control.
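Building on the UIControl suggestion, here is a minimal sketch of the idea: a custom control keeps tracking the touch after it leaves the control's bounds and hands the "tap" to whichever menu item the finger is released over. AppIconButton, menuView, and the flat button-subview layout are invented for illustration.

```swift
import UIKit

class AppIconButton: UIControl {
    // Hypothetical popup menu shown while the touch is held down.
    var menuView: UIView?

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        // Long-press/3D Touch detection would go here; just show the menu for the sketch.
        menuView?.isHidden = false
        return true          // keep receiving tracking updates, even outside our bounds
    }

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        // The control keeps tracking the touch after the finger leaves it,
        // so we can highlight whichever menu item the finger is currently over.
        if let menu = menuView {
            let point = touch.location(in: menu)
            for case let item as UIButton in menu.subviews {
                item.isHighlighted = item.frame.contains(point)
            }
        }
        return true
    }

    override func endTracking(_ touch: UITouch?, with event: UIEvent?) {
        super.endTracking(touch, with: event)
        // Releasing the finger over a menu item "transfers" the tap to it.
        if let menu = menuView, let touch = touch {
            let point = touch.location(in: menu)
            for case let item as UIButton in menu.subviews where item.frame.contains(point) {
                item.sendActions(for: .touchUpInside)
            }
        }
        menuView?.isHidden = true
    }
}
```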

ForceTouch / Tap & Hold animation

I have a button that the user has to hold to perform its action.
When the user touches the button initially, an animation should start and update the frame and colour of a given view.
If the user is still holding the button after some time (2 seconds), the long press should be recognised and a new animation should take place.
When the user releases the button we need to update the view to reset its original constraints and:
if not recognised : do nothing (for now)
if recognised : perform the action (unhide a view)
I tried UILongPressGestureRecognizer, but I didn't manage to get its state before the long press is recognised.
I tried nesting UIView animations and stopping them with view.layer.removeAllAnimations(), but some animations were not being removed (I suspect because of the cycle of callbacks).
The effect, created on an Apple Watch, is what I want to recreate in my iPhone app: http://natashatherobot.com/wp-content/uploads/MenuDemo.mp4
Can anyone point me in the right direction?
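One direction, in the spirit of the UIViewPropertyAnimator answers above rather than the gesture-recognizer or nested-UIView-animation attempts: drive the hold animation from the button's touchDown/touchUp control events and treat the animator's completion as "recognised". This is only a sketch; all outlet names and colours are hypothetical, and only the 2-second duration comes from the question.

```swift
import UIKit

class HoldButtonViewController: UIViewController {
    // Hypothetical outlets.
    @IBOutlet var holdButton: UIButton!
    @IBOutlet var progressView: UIView!
    @IBOutlet var hiddenMenu: UIView!

    private var animator: UIViewPropertyAnimator?
    private var recognised = false

    override func viewDidLoad() {
        super.viewDidLoad()
        holdButton.addTarget(self, action: #selector(touchDown), for: .touchDown)
        holdButton.addTarget(self, action: #selector(touchUp),
                             for: [.touchUpInside, .touchUpOutside, .touchCancel])
    }

    @objc private func touchDown() {
        recognised = false
        // Animate the view's size/colour over the 2-second hold period.
        let animator = UIViewPropertyAnimator(duration: 2, curve: .easeInOut) {
            self.progressView.transform = CGAffineTransform(scaleX: 1.3, y: 1.3)
            self.progressView.backgroundColor = .green
        }
        animator.addCompletion { position in
            // The completion only fires if the user held long enough.
            if position == .end { self.recognised = true }
        }
        animator.startAnimation()
        self.animator = animator
    }

    @objc private func touchUp() {
        if recognised {
            hiddenMenu.isHidden = false                  // perform the action
        } else {
            animator?.stopAnimation(true)                // abandon the animation mid-flight
            progressView.transform = .identity           // reset to the original state
            progressView.backgroundColor = .gray
        }
    }
}
```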

UIView subviews updating visually but interact as though in previous state

I'm creating a UIViewController that houses some of RosyWriter's functionality, which I'm reworking to create a video recorder. The screen contains a title, a clipped-to-bounds subview that holds the AVCaptureVideoPreviewLayer (the video content's CALayer is added as a sublayer of that subview, so I think it's quite deeply nested), and two buttons acting as a toggle, in the form of Start and Stop buttons placed in the Storyboard for the UIViewController.
The Start button works fine, even though the video's preview layer is on screen and showing the camera. When I start recording, though, I switch the buttons round, making the Start button hidden and the Stop button hidden=false.
The start button works - this is pressed when the video preview is on-screen and updating, but the actual recording (grabbing samples in buffers and writing them to a file - nothing UIKit related as far as I can see) has not started.
When the video recording is active, with the Stop button showing and the Start button hidden, the visible stop button isn't pressable, but the hidden start button can still be pressed.
I've tried moving the Stop button above the UIView containing the video, in case the CALayer or something else stretches outside the clipped UIView bounds. This doesn't help. The Stop button always acts as though it's not enabled - but it is enabled, and nothing appears to overlap the button. The button works fine if the UIView containing the video (which, I'll reiterate, is lower than the broken button) is never shown.
Can anyone think why this'd happen? I immediately thought about setNeedsLayout and setNeedsDisplay and tried just throwing some of those in, because it's almost as though the view had updated with my request to hide or show buttons, but an interaction layer hadn't updated.
Thanks
I now think this is due to my ignorant use (or misuse) of dispatch queues. I'm still not sure how I was able to interact with buttons that aren't showing - perhaps part of their state change happens immediately, and the rest (the visual refresh, for example) can only take place on the main queue.
So I solved the problem, in a sense, by forcing a particular asynchronous delegate method to be called on the main queue. This doesn't affect the operation of this particular step in my recording process.
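In case it helps someone, this is the general shape of that fix, shown here with a hypothetical AVCaptureFileOutputRecordingDelegate rather than RosyWriter's actual pipeline: hop to the main queue before touching any UIKit state inside a callback that arrives on a background queue.

```swift
import UIKit
import AVFoundation

// Hypothetical recorder controller; the same rule applies to any delegate
// or completion handler that is delivered on a background queue.
class RecorderViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {
    let startButton = UIButton(type: .system)
    let stopButton = UIButton(type: .system)

    func fileOutput(_ output: AVCaptureFileOutput,
                    didStartRecordingTo fileURL: URL,
                    from connections: [AVCaptureConnection]) {
        // UIKit state (hidden, enabled, hit-testing) must only be mutated on the main
        // queue; otherwise a view can render one state while still responding to
        // touches in another.
        DispatchQueue.main.async {
            self.startButton.isHidden = true
            self.stopButton.isHidden = false
        }
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        DispatchQueue.main.async {
            self.startButton.isHidden = false
            self.stopButton.isHidden = true
        }
    }
}
```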
