Recognize long press in Swift without UILongPressGestureRecognizer

I'm building an app around gesture recognition.
I've already built recognition of taps, swipes (even with multiple fingers), and pinches.
Now I'd like to recognize a long-press gesture without using UILongPressGestureRecognizer, because it conflicts with my recognition of the other gestures afterwards (I tried).
What I'm currently doing is recording the time in touchesBegan; in touchesMoved I calculate the time difference, and if it's greater than 400 ms (for example), I call a function.
The problem is that this function is only called when the finger has moved a bit, not when it's perfectly static.
Another option is to set some kind of delay in touchesBegan and check whether the finger is still on the screen after 400 ms, then call the function.
How could I do that without blocking the rest of the gesture recognition?
The aim of this long press is to vary the intensity of a light or something like that (over 0 to 1 s the light increases until the maximum is reached, then decreases until the minimum, and so on).
Next, I'll try to recognize a rotation gesture (with only one finger), so if you also have an answer for that, that'd be perfect.
Thanks!

Do not set a delay. Start a timer that fires after 400 ms. Invalidate the timer in touchesEnded, in case the touch ends before the 400 ms elapse. When the timer fires, call the desired function.
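A minimal sketch of that timer approach, assuming a UIView subclass; the 400 ms threshold comes from the question, and longPressDetected() is an illustrative name:

    import UIKit

    class GestureView: UIView {

        private var longPressTimer: Timer?

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            // One-shot timer: if the finger is still down after 400 ms, it's a long press.
            longPressTimer = Timer.scheduledTimer(withTimeInterval: 0.4, repeats: false) { [weak self] _ in
                self?.longPressDetected()
            }
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesEnded(touches, with: event)
            // Finger lifted before the timer fired: not a long press.
            longPressTimer?.invalidate()
            longPressTimer = nil
        }

        override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesCancelled(touches, with: event)
            longPressTimer?.invalidate()
            longPressTimer = nil
        }

        private func longPressDetected() {
            // Start ramping the light intensity up and down here.
        }
    }

Because the timer is scheduled on the run loop rather than blocking, touchesMoved and the rest of your gesture code keep running while it is pending.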
As for your second question: you will probably need to track the trajectory of the points in the touchesMoved method. If the moves resemble a rotation (you will need some kind of threshold for that), call the appropriate function.
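For the one-finger rotation, a rough sketch of that trajectory idea, assuming the angle is measured around the view's midpoint; the wrap-around handling is the part that is easy to get wrong:

    import UIKit

    class RotationView: UIView {

        private var previousAngle: CGFloat = 0
        private(set) var accumulatedRotation: CGFloat = 0

        private func angle(of touch: UITouch) -> CGFloat {
            let point = touch.location(in: self)
            let center = CGPoint(x: bounds.midX, y: bounds.midY)
            return atan2(point.y - center.y, point.x - center.x)
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            guard let touch = touches.first else { return }
            previousAngle = angle(of: touch)
            accumulatedRotation = 0
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)
            guard let touch = touches.first else { return }
            let current = angle(of: touch)
            var delta = current - previousAngle
            // Wrap into (-pi, pi] so crossing the atan2 discontinuity doesn't jump.
            if delta > .pi { delta -= 2 * .pi }
            if delta < -.pi { delta += 2 * .pi }
            previousAngle = current
            accumulatedRotation += delta
            // Once abs(accumulatedRotation) exceeds some threshold, treat it as a rotation.
        }
    }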

Related

UILongPressGestureRecognizer only checks for numberOfTouchesRequired at initial touch

I have an app that has three long-press gestures, for 1, 2, and 3 touches.
The issue: in testing, my success rate of getting 2 or 3 touches to register correctly at the start of gesture recognition was far less than 100%. Further testing suggests that UILongPressGestureRecognizer (LPGR) only checks the number of touches at the initial touch and fails instantly if that number is not what it expects.
My (potential) solution: I have started building a generic UIGestureRecognizer that checks the number of touches at the end of a time interval, then passes the touches to whichever branch of code handles 1, 2, or 3 touches (a sketch of this idea follows below).
My question: is there a better way? And barring that, is there a way to reuse the code already in the existing gesture recognizers? I have not been able to change the state through a reference (someRecognizer.state = .changed does not seem to do anything).
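A rough sketch of the generic recognizer described above, deferring the touch count until a timer fires; the class name and 400 ms interval are illustrative assumptions:

    import UIKit
    // Exposes the setter for `state` to subclasses.
    import UIKit.UIGestureRecognizerSubclass

    class DeferredTouchCountGestureRecognizer: UIGestureRecognizer {

        var interval: TimeInterval = 0.4
        private(set) var touchCount = 0
        private var timer: Timer?

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
            super.touchesBegan(touches, with: event)
            // Start the timer on the first touch only; later touches just join in.
            if timer == nil {
                timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: false) { [weak self] _ in
                    guard let self = self else { return }
                    // Whatever number of fingers is down when the timer fires
                    // is the count the rest of the code branches on.
                    self.touchCount = self.numberOfTouches
                    self.state = .began
                }
            }
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
            super.touchesEnded(touches, with: event)
            if state == .possible {
                // Fingers lifted before the interval elapsed.
                timer?.invalidate()
                timer = nil
                state = .failed
            } else {
                state = .ended
            }
        }

        override func reset() {
            super.reset()
            timer?.invalidate()
            timer = nil
            touchCount = 0
        }
    }

Note the UIKit.UIGestureRecognizerSubclass import: the setter for state is intended for subclasses and is exposed through that import, which may be why assigning state from outside a recognizer seems to do nothing.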

Avoid triggering touchesBegan: until a swipe gesture recognizer fails

I'm making a game on the iPad where the player swipes up, down, left, or right to move the character. An attack is controlled by touchesBegan:withEvent:.
My problem is that the character attacks whenever he moves.
Is there a way to set up the swipe gestures so that the code in touchesBegan:withEvent: doesn't run until it's clear whether the motion is the beginning of a swipe?
This is not too easy a task. Without writing a custom gesture recognizer, I would suggest trying the combination of a UISwipeGestureRecognizer and a UILongPressGestureRecognizer. I know this sounds silly, but it is not: a UILongPressGestureRecognizer behaves much like a pan gesture, so you keep receiving events even while the finger is dragged. You need to set a suitable minimum duration before it fires (depending on the swipe gesture) and a large allowable movement so it doesn't get cancelled by dragging. Then remove the touch-event methods and move that code into the long-press gesture's action.
To explain the result: your long-press gesture will (if set up correctly) work just like the touch events, except that it waits for the specified duration. If a swipe is detected within that duration, your long-press gesture will not fire. That seems to be just what you need; a sketch follows.
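A minimal sketch of that combination, assuming a view controller; the durations are illustrative and would need tuning against your actual swipes, and require(toFail:) makes the ordering between the two recognizers explicit:

    import UIKit

    class GameViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe))
            swipe.direction = .up                      // one recognizer per direction in practice
            view.addGestureRecognizer(swipe)

            let press = UILongPressGestureRecognizer(target: self, action: #selector(handlePress))
            press.minimumPressDuration = 0.15          // long enough for a swipe to win
            press.allowableMovement = .greatestFiniteMagnitude  // don't cancel on dragging
            press.require(toFail: swipe)               // attack only if the swipe fails
            view.addGestureRecognizer(press)
        }

        @objc private func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
            // Move the character.
        }

        @objc private func handlePress(_ gesture: UILongPressGestureRecognizer) {
            if gesture.state == .began {
                // Attack: this replaces the old touchesBegan:withEvent: code.
            }
        }
    }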

Accurate start position for UIPanGestureRecognizer?

I am using a UIPanGestureRecognizer to implement drag and drop. When the drag starts I need to identify the object being dragged. However, the objects are relatively small, and if the user doesn't hit the object right in the centre, it isn't dragged.
The problem is that when the gesture handler is first called with the state UIGestureRecognizerStateBegan, the finger has already moved several pixels, and so [UIPanGestureRecognizer locationInView:] returns that point, which is not where the gesture truly started. That makes sense as it can only recognize a pan after a few pixels of movement. However, I need the absolute start of the gesture, not the position after the gesture has first been recognized.
I'm thinking that maybe I need to implement a tap gesture recognizer as well, purely to capture the first touch. But that seems like a hack for what is not an unusual requirement. Is there no other way of getting that first touch from within the pan gesture recognizer?
The UIGestureRecognizerDelegate protocol provides the methods gestureRecognizerShouldBegin: and gestureRecognizer:shouldReceiveTouch:, which let you evaluate the touches before the pan has transitioned to UIGestureRecognizerStateBegan.
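A minimal sketch of that approach: capture the location in the delegate callback, which sees the touch before the pan has moved; DragController and initialTouchPoint are illustrative names:

    import UIKit

    class DragController: NSObject, UIGestureRecognizerDelegate {

        private(set) var initialTouchPoint: CGPoint = .zero

        func attach(to view: UIView) {
            let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
            pan.delegate = self
            view.addGestureRecognizer(pan)
        }

        // Called for each touch before the recognizer has begun, so this is
        // the true starting point, before any movement.
        func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                               shouldReceive touch: UITouch) -> Bool {
            initialTouchPoint = touch.location(in: gestureRecognizer.view)
            return true
        }

        @objc private func handlePan(_ pan: UIPanGestureRecognizer) {
            if pan.state == .began {
                // Hit-test with initialTouchPoint rather than pan.location(in:),
                // which is already several pixels into the drag.
            }
        }
    }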

iOS - How to make an animation track touches

What is the best way to implement a smooth, reversible animation that tracks touches? I am referring to those animations in which, for example, if the user performs a swipe gesture, some elements animate smoothly on screen and others off, but if the user instead slowly drags back and forth with a pan gesture, the same objects move forward/backward by a percentage in accordance with the touch position. This is seen in many app intros and also in transitions. I have found:
- A tutorial that discusses the built-in facility for this, but only for view controller transitions, without the full granular control I see in many apps (http://www.doubleencore.com/2013/09/ios-7-custom-transitions/)
- JazzHands, a kit by IFTTT, but this is a packaged solution that may not show how the technique is best implemented at a lower level (https://github.com/IFTTT/JazzHands)
- A question here for which one answer shows how you might run an animation after a gesture ends (iOS Touch, Gestures, Animation)
What I don't grasp - and I'm comfortable using CAAnimations and gestures - is how something can be both animated and interactive.
Typically, when I create an animation, I commit the animation and it goes from start to finish. While I could interrupt the animation as touches continue, that seems like it would be stilted.
On the other hand, moving things in response to user input is easy, but that is not animated.
How is the effect achieved where something changes according to an animation, yet that exact same animation is tied to touches? And how can the animation reach completion without really "finishing" (becoming irreversible) unless the user lifts the touch, so that if the user stops panning at any point, the animation either reverts to its starting position or runs to completion, depending on the last touch location and velocity? These requirements are baffling.
The glimpses of this technique I see all involve keyframe animations, but what I don't understand is where the touch events intersect with an animation to create these smooth effects I see.
Any tips, examples, or tutorials are most welcome.
"What I don't grasp - and I'm comfortable using CAAnimations and gestures - is how something can be both animated and interactive."
It is because, having set up an animation, you can set that animation to any "frame" you wish. Thus you can track the animation in correspondence to the movement of a gesture.
The way this works is that an animation is a feature of the render tree, belonging to a CALayer. CALayer implements the CAMediaTiming protocol. The timeOffset of a CALayer thus determines what "frame" of an animation that layer displays. If a complex animation involves many different layers, no problem; just set the timeOffset of their mutual superlayer, to control the frame of the entire animation.
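As a minimal illustration of that idea (not the transition machinery itself, just the timeOffset trick): freeze the layer by setting its speed to 0, then scrub timeOffset from a gesture. Names and values here are illustrative:

    import UIKit

    class ScrubbableView: UIView {

        private let duration: CFTimeInterval = 1.0

        func installAnimation() {
            let anim = CABasicAnimation(keyPath: "position.x")
            anim.fromValue = layer.position.x
            anim.toValue = layer.position.x + 300
            anim.duration = duration
            layer.speed = 0        // freeze the layer's clock...
            layer.timeOffset = 0   // ...and park it at the first frame
            layer.add(anim, forKey: "scrub")
        }

        @objc func handlePan(_ pan: UIPanGestureRecognizer) {
            guard let container = superview else { return }
            // Map the finger's horizontal position to a fraction of the animation.
            let fraction = max(0, min(1, pan.location(in: container).x / container.bounds.width))
            layer.timeOffset = fraction * duration   // display that "frame"
        }
    }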
That in fact is exactly how the new iOS 7 interactive custom transition feature works (to which you rightly refer in your question). For instance, in this example code:
https://github.com/mattneub/Programming-iOS-Book-Examples/blob/master/iOS7bookExamples/bk2ch06p296customAnimation2/ch19p620customAnimation1/AppDelegate.m
... I keep updating the UIPercentDrivenInteractiveTransition to tell it how far through the gesture the user is, and the animation therefore tracks the gesture. Now ask yourself: how the heck is this possible???
Well, the UIPercentDrivenInteractiveTransition, in turn, behind the scenes, keeps adjusting the layer's timeOffset to portray the animation at the corresponding frame. (You can actually add logging code to my example to see that this is true.)
Moreover, when I end the gesture at an incomplete point, the animation either hurries to its end or runs backwards to its beginning - again, this is because of the CAMediaTiming protocol, which lets you change the speed of the animation, including a negative value to run it backwards.
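The gesture-driving side of that dance looks roughly like this; a sketch paraphrasing the linked example, where interaction is an assumed stored UIPercentDrivenInteractiveTransition and the 0.5 threshold is arbitrary:

    import UIKit

    class TransitionDriver: NSObject {

        // Assumed to be created when the interactive transition starts and
        // returned from the transitioning delegate.
        let interaction = UIPercentDrivenInteractiveTransition()

        @objc func handlePan(_ pan: UIPanGestureRecognizer) {
            guard let view = pan.view else { return }
            // Map the drag distance to a completion fraction.
            let percent = max(0, min(1, pan.translation(in: view).x / view.bounds.width))

            switch pan.state {
            case .changed:
                interaction.update(percent)   // behind the scenes: adjusts timeOffset
            case .ended, .cancelled:
                if percent > 0.5 {
                    interaction.finish()      // hurry to the end
                } else {
                    interaction.cancel()      // run backwards to the beginning
                }
            default:
                break
            }
        }
    }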

CABasicAnimation speed -- Keeping up with user input

Update: It really was as simple as not animating the UI element when utilizing touches. It perfectly follows touches now with no lag.
I'm currently attempting to implement a UI feature using a CALayer subclass inside a UIView subclass. I receive touch events in the custom UIView's corresponding view controller and notify the UIView about the touches, which in turn notifies the CALayer in order to animate the UI elements drawn in the layer.
It all works, but I have noticed that when there is a big delta in movement (as in when quickly scrolling a finger), the CABasicAnimation lags behind. Ideally I want the animation to stay perfectly aligned with the user's finger.
I've come up with a hacky way of just setting the animation's speed arbitrarily high, as in
anim.speed = 10.0f;
which essentially keeps up with the user's finger, but I feel that this is a total hack and not a shippable solution. Should I be artificially limiting how many touch events are processed in order to solve this problem? Is there some sort of calculation I should be doing for the speed/duration of the animation that I'm not aware of?
Thanks for any help with this!
During the continuous gesture, one shouldn’t animate movements, but rather just move directly to the gesture’s location. When the gesture finishes, if you want it to settle in some other position, then animate that final, post-gesture, destination. But don’t animate during the gesture itself.
In rare cases, where rendering a single frame is incredibly slow, there can still be perceived lagginess. Obviously, one should optimize the draw(_:) process so that it isn't slow (or take a snapshot and animate the snapshot view rather than the complicated view). But during the gesture you can also use "predictive touches," where the OS estimates where the user's gesture will be in the near future. For example, you can implement touchesMoved(_:with:) and call the event's predictedTouches(for:). Moving the view to the predicted touch location reduces perceived lagginess.
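A minimal sketch of both points: move the layer directly with implicit animation disabled, and substitute the OS's predicted touch when one is available; trackedLayer is an illustrative name:

    import UIKit

    class TrackingView: UIView {

        private let trackedLayer = CALayer()

        override init(frame: CGRect) {
            super.init(frame: frame)
            trackedLayer.bounds = CGRect(x: 0, y: 0, width: 44, height: 44)
            trackedLayer.backgroundColor = UIColor.systemBlue.cgColor
            layer.addSublayer(trackedLayer)
        }

        required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            // Prefer the OS's prediction of where the finger will be, if available.
            let effectiveTouch = event?.predictedTouches(for: touch)?.last ?? touch

            // Move directly, with implicit animation disabled, so the element
            // stays glued to the finger instead of trailing a 0.25 s animation.
            CATransaction.begin()
            CATransaction.setDisableActions(true)
            trackedLayer.position = effectiveTouch.location(in: self)
            CATransaction.commit()
        }
    }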
