Update UISnapBehavior Point - iOS

In my app, I have small views that can be moved around by dragging them. When the user begins dragging on one of the small views, it stays in place, but is 'stretched' for a certain distance (~200 points). After the distance is exceeded, then the small view should perform a 'snap' effect to the user's finger. The whole animation is best described as a rubber band breaking.
So far, I've implemented the snap effect like so:
UISnapBehavior *snapBehavior = [[UISnapBehavior alloc] initWithItem:iconView snapToPoint:[[touches anyObject] locationInView:self.view]];
[self.snapAnimator addBehavior:snapBehavior];
The problem is, if the user continues moving their finger while the snap animation is being performed, the view continues to move towards the point where the finger was when the effect began.
Is there any way to update UISnapBehavior's destination when touches are moved?

No, there isn't. Snap behavior is extremely simple: it just snaps to the point you tell it, right then. That's all it does. One behavior, one point, one item, one snap.
You would need, therefore, to remove the snap behavior and substitute a different snap behavior with a different point. Or use a different kind of behavior, one where you can move the target point as the user's finger moves (such as UIAttachmentBehavior).
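For example, a rough sketch of the attachment approach, ignoring the 200-point rubber-band threshold for brevity (the _attachment ivar is a placeholder, not something from the question; iconView and snapAnimator come from the question's code):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // One attachment behavior per gesture; its anchor can be moved while active.
    _attachment = [[UIAttachmentBehavior alloc] initWithItem:iconView attachedToAnchor:point];
    [self.snapAnimator addBehavior:_attachment];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Unlike a pre-iOS 9 snap behavior, the anchor point can be updated live.
    _attachment.anchorPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.snapAnimator removeBehavior:_attachment];
}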

In iOS 9 and above, you can. There is now:
@property (nonatomic, assign) CGPoint snapPoint NS_AVAILABLE_IOS(9_0);
It works as expected.
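For instance, a minimal sketch that keeps the behavior from the question in a property and retargets it as the touch moves:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // iOS 9 and later: retarget the live behavior instead of removing and re-adding it.
    self.snapBehavior.snapPoint = [[touches anyObject] locationInView:self.view];
}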

Related

How to get correct screen coordinate from UITapGestureRecognizer while VoiceOver is on

I'm currently working on an interactive view that relies heavily on the user's touch location. I have found that there are a few ways to interact with the UITapGestureRecognizer while VoiceOver is on, but when I tap a point, the values I get are very wrong. I've looked elsewhere, but my use case is outside the norm, so there is not a lot out there to tell me what is going on. Has anyone experienced this before?
I am aware that I can change the accessibilityTraits to UIAccessibilityTraitAllowsDirectInteraction, which will give me the correct screen point when used, but I would like to know what is causing this issue, at the very least for the sake of knowledge. To interact with the UITapGestureRecognizer I either double tap or do a 3D Touch by pressing hard on the screen. The latter method doesn't work for the tap gesture but will work for the pan gesture.
This is the only line I use to get my screen points. My map view is a UIImageView:
CGPoint screenPoint = [tapGesture locationInView:map];
I'm using a map of a building and I try to tap the same corner or landmark for my testing. I know I can't hit the same exact point every time, but I do use a stylus and I can get pretty close.
Without VoiceOver on I would get the result: (35.500, 154.363)
With VoiceOver on and tapping in generally the same spot, I get: (187.500, 197.682)
The point I am using to test is on the left side of the screen, and the result with VoiceOver on is in the middle of the screen. I believe the y-axis value may have changed because of my toolbar's size, but I have no idea what is throwing off the x-axis value. If more information is needed, let me know.
UPDATE: Upon further investigation, it turns out that the UITapGestureRecognizer will always return (187.500, 197.682) no matter where I touch in the map view when VoiceOver is on. That point seems to be the middle of the map view. Oddly enough though, the UIPanGestureRecognizer will give me the correct (x,y) for my view if I use the 3D touch while VoiceOver is on.
On a side note not relating to the problem at hand, it seems if I use the accessibility trait UIAccessibilityTraitAllowsDirectInteraction the method UIAccessibilityConvertFrameToScreenCoordinates returns a frame that is higher than my view. It works fine if I do not change the trait.
Your problem may have to do with the reference point used when VoiceOver is on.
Verify what your point coordinates are referring to: view or screen coordinates?
I suggest you take a look at the following elements:
accessibilityFrame
accessibilityFrameInContainerSpace
UIAccessibilityConvertFrameToScreenCoordinates
Depending on your project, these elements may help you achieve what you're after.
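For example, if you expose a landmark on the map as its own accessibility element, its accessibilityFrame must be given in screen coordinates, which is exactly what UIAccessibilityConvertFrameToScreenCoordinates does (a sketch; landmarkRectInMap is a hypothetical rect in the map view's coordinate space):

UIAccessibilityElement *landmark = [[UIAccessibilityElement alloc] initWithAccessibilityContainer:map];
landmark.accessibilityLabel = @"Landmark";
// accessibilityFrame is expected in screen coordinates, not view coordinates.
landmark.accessibilityFrame = UIAccessibilityConvertFrameToScreenCoordinates(landmarkRectInMap, map);
map.accessibilityElements = @[landmark];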

Dragging an uiview like facebooks menu slide

I know this has probably been asked before, but I've seen many approaches and I don't know which is best for me, so please don't send me a link to another post unless it addresses my problem directly.
I have a controller with a UIView at the top (like a header). This header is bigger than it seems because it is partially hidden above the top edge. On that view I have a UIButton which, on touch up inside, shows the entire header view; tapping again returns it to its starting position (changing the frame with an animation). I also want to be able to drag the view, but only changing its position on the y axis (dragging up and down). I was thinking of adding the dragInside/dragOutside events to the button, but those don't give me the position of the finger, and I also need to know when the user releases the drag so the view animates to one of its two possible states (fully shown or partially hidden).
Is this a touchesBegan / touchesMoved / touchesEnded thing? If it is, please provide a code example. I also want to do the same with another view on the left side, which would move on the x axis instead. Or maybe it can be done with the drag events if I can just save a CGPoint of the last touch; maybe that's better. Any other suggestions are appreciated.
Look at using a UIPanGestureRecognizer to detect the touch movements. Use the gesture's translationInView: to set the view's y position. The translation is the total movement since the start of the gesture, so you don't need to remember and accumulate the offset yourself.
The main thing to worry about while implementing this is bounding the y position of the view, so that no matter how far the user drags, the view won't go too high or low on the screen.
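A sketch of that approach (headerView, startY, shownCenterY, and hiddenCenterY are placeholder properties for your header, the center position at the start of the drag, and the two resting positions, assuming the header sits at the top so hiddenCenterY < shownCenterY):

- (void)handlePan:(UIPanGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        self.startY = self.headerView.center.y; // remember where the drag started
    }
    CGPoint translation = [gesture translationInView:self.view];
    // Clamp so the header can never be dragged past its two resting positions.
    CGFloat y = MAX(self.hiddenCenterY, MIN(self.shownCenterY, self.startY + translation.y));
    self.headerView.center = CGPointMake(self.headerView.center.x, y);
    if (gesture.state == UIGestureRecognizerStateEnded) {
        // Settle into whichever state is closer; velocityInView: could refine this.
        BOOL show = y > (self.hiddenCenterY + self.shownCenterY) / 2.0;
        [UIView animateWithDuration:0.25 animations:^{
            self.headerView.center = CGPointMake(self.headerView.center.x,
                                                 show ? self.shownCenterY : self.hiddenCenterY);
        }];
    }
}

The same pattern works for the left-side view; just clamp and animate x instead of y.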
Use a UIPanGestureRecognizer; that's a class dedicated to handling such drag/pan gestures.
Everything is described in Apple's documentation, including examples, so you should find your answer there.
There is also some sample code in the Apple Developer Library that shows you how to use gesture recognizers if needed.

CABasicAnimation speed -- Keeping up with user input

Update: It really was as simple as not animating the UI element while tracking touches. It now follows touches perfectly with no lag.
I'm currently attempting to build a UI feature by implementing a CALayer subclass inside of a UIView subclass. I receive touch events in the custom UIView's corresponding view controller and notify the UIView about the touches, which in turn notifies the CALayer in order to animate the UI elements drawn in the layer.
It all works, but I have noticed that when there is a big delta in movement (as in when quickly scrolling a finger), the CABasicAnimation lags behind. Ideally I want the animation to stay perfectly aligned with the user's finger.
I've come up with a hacky way of just setting the animation's speed arbitrarily high as in
anim.speed = 10.0f;
which essentially keeps up with the user's finger, but I feel that this is a total hack and not a shippable solution. Should I be artificially limiting how many touch events are processed in order to solve this problem? Is there some sort of calculation I should be doing for the speed/duration of the animation that I'm not aware of?
Thanks for any help with this!
During the continuous gesture, one shouldn’t animate movements, but rather just move directly to the gesture’s location. When the gesture finishes, if you want it to settle in some other position, then animate that final, post-gesture, destination. But don’t animate during the gesture itself.
In rare cases, where rendering of a single frame is incredibly slow, there can still be perceived lagginess. Obviously, one should optimize the draw(_:) process so that it isn’t slow (or take a snapshot and animate the snapshot view rather than the complicated view). But during the gesture, you can also use “predictive touches,” where the OS estimates where user’s gesture is going to be in the future. For example, you can implement touchesMoved(_:with:) and then call predictedTouches(for:). By moving the view to the predicted touch location, it reduces perceived lagginess.
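In Objective-C those are -touchesMoved:withEvent: and -[UIEvent predictedTouchesForTouch:] (iOS 9 and later). A sketch combining both points, where customLayer stands in for the layer from the question:

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    // If the OS can predict where the touch is heading, lead the finger slightly.
    UITouch *predicted = [[event predictedTouchesForTouch:touch] lastObject];
    if (predicted) {
        point = [predicted locationInView:self.view];
    }
    // Move the layer directly, with implicit animation disabled, so nothing lags.
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    self.customLayer.position = point;
    [CATransaction commit];
}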

UIPanGestureRecognizer get translation for each touch point

I have a UIPanGestureRecognizer attached to a parent view, with various CCSprites I want to move around in the parent when panned. Using [gesture locationOfTouch:i inView:recognizer.view] I can get the location of the touch, but if I assign that to my subview's center it often makes the subview move unexpectedly, since the original touch is probably not in the exact center of the sprite. What I really want is [gesture translationInView:recognizer.view] for each of the touch locations. That works perfectly when there is only one panning touch, but with more than one there appears to be no way to get a translation per touch, and each touch can be panning in a different direction at a different speed. The user can use two fingers to move two different sprites completely independently of each other. -[UIPanGestureRecognizer translationInView:] doesn't let me get the individual translations.
How should I do this?
I found a category CCNode-SFGestureRecognizers that adds the ability to attach UIGestureRecognizers to any CCNode. This way I can get around needing multiple translation values.
Now, if you're only interested in the pan gesture, why not use the cocos touch methods?
In touchBegan:, calculate (and save) the location of the touch in the subview; in touchMoved:, the translation is just the current location minus the previous location. Move your subview by that amount.
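A sketch of that idea using plain UIKit touch handling (the cocos2d touch callbacks are analogous); spriteForTouch is a hypothetical NSMutableDictionary, and each UITouch is wrapped in an NSValue because touches can't be dictionary keys directly:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Remember which sprite this particular finger grabbed.
        UIView *hit = [self.view hitTest:[touch locationInView:self.view] withEvent:event];
        if (hit && hit != self.view) {
            self.spriteForTouch[[NSValue valueWithNonretainedObject:touch]] = hit;
        }
    }
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Per-touch translation: current location minus the previous one.
        CGPoint now = [touch locationInView:self.view];
        CGPoint before = [touch previousLocationInView:self.view];
        UIView *sprite = self.spriteForTouch[[NSValue valueWithNonretainedObject:touch]];
        sprite.center = CGPointMake(sprite.center.x + (now.x - before.x),
                                    sprite.center.y + (now.y - before.y));
    }
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self.spriteForTouch removeObjectForKey:[NSValue valueWithNonretainedObject:touch]];
    }
}

Remember to set multipleTouchEnabled to YES on the view, or only the first finger will be reported.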

hacking ios ui responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view containing two subviews (A and B), each of which has a lot of sub-subviews 1, 2, 3, ...:
MainView
  SubView(A)
    1
    2
    3
  SubView(B)
    1
    2
    3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to change between A and B with a two finger pan.
I have tried attaching a UIPanGestureRecognizer to the MainView, but the scroll views cancel the touches and it only works sometimes.
I need a consistent way to capture the touches first, detect whether it is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but I couldn't build a solution to this problem from their answers.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
You can create a top-level view to capture the touches and their coordinates, then check whether the touch coordinates fall inside any of the subviews. You can do that using the function
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
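For example, a sketch of that check inside the top-level view's touch handling (subview is a placeholder for whichever view you are testing against):

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // locationInView:nil gives the touch in window coordinates.
    CGPoint point = [[touches anyObject] locationInView:nil];
    // Convert the subview's frame into the same (window) coordinate system.
    CGRect frameInWindow = [subview.superview convertRect:subview.frame toView:nil];
    if (CGRectContainsPoint(frameInWindow, point)) {
        // The touch landed inside this subview; decide whether to forward it.
    }
}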
Or maybe this can be more helpful:
Receiving touch events on more than one UIView simultaneously
