Puzzle swipe gesture - iOS

I'm trying to figure out the best way available in iOS to solve the following:
Basically I've built a 4x6 tile matrix with UIButtons, each containing a letter. The buttons are contained within a UIView. (The hidden words are Apple, Fast, and Tree.)
A
P F
P A
L S
E T
T R E E
All the UIButtons have userInteractionEnabled set to NO so that the containing view receives the touchesBegan calls. On creation, all the UIButtons are placed into an NSMutableArray.
My challenge is how to swipe and drag from a starting letter and move to a destination letter, trying to "find" the complete word. Kind of like the Ruzzle app, but with only horizontal and vertical swipes.
The UIButtons that are being "multi-selected" have to change background color as a visual indication.
I'm receiving the touch location via touchesMoved. Does the entire detection code have to be triggered from touchesMoved?
What would be the best, and least processor-intensive, approach for this?

Instead of using the touchesBegan/touchesMoved methods, why don't you look into the UIControlEventTouchDragEnter/UIControlEventTouchDragInside events? That will give you better performance, as the associated action will only be called when the touch enters the button or is dragged inside it. In those actions you can keep pushing new buttons onto an array as the touch enters their bounds, and check their positions against the buttons pushed previously. With this approach I think you will have to handle the first button touch using the UIControlEventTouchDown event.
I would love to know how you implement it finally.
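Whichever event hooks end up being used, the grid bookkeeping can live entirely outside UIKit. A minimal sketch of that selection model (names are illustrative, not from the original post), assuming tiles are indexed row-major in a rows x cols grid:

```swift
// Selection model for the word-swipe grid. As the touch drags into a new
// tile, we append its index only if it is a horizontal or vertical neighbor
// of the previously selected tile and has not been selected already.
struct TileSelection {
    let rows: Int
    let cols: Int
    private(set) var selected: [Int] = []

    init(rows: Int, cols: Int) {
        self.rows = rows
        self.cols = cols
    }

    // Returns true if the tile was accepted: the first tile always is;
    // later tiles must be unvisited orthogonal neighbors of the last one.
    mutating func tryAppend(_ index: Int) -> Bool {
        guard index >= 0 && index < rows * cols else { return false }
        guard !selected.contains(index) else { return false }
        if let last = selected.last {
            let (r1, c1) = (last / cols, last % cols)
            let (r2, c2) = (index / cols, index % cols)
            let isOrthogonalNeighbor = (r1 == r2 && abs(c1 - c2) == 1)
                                    || (c1 == c2 && abs(r1 - r2) == 1)
            if !isOrthogonalNeighbor { return false }
        }
        selected.append(index)
        return true
    }
}
```

Each accepted index is the moment to change that button's background color; on touch-up you would map the selected indices back to letters and test the assembled word.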

Related

Is it possible to attach a gesture recognizer to a button, so that the user swipes up after/during the button press?

I read through a few similar questions here, but most of them are for much older versions of Swift.
This tutorial shows how to create a gesture recognizer and works pretty well: https://www.ioscreator.com/tutorials/swipe-gesture-ios-tutorial-ios11
What I'd like to accomplish is to add functionality that would allow the user to swipe up or down after pressing a button, while still holding the button, and have my app react to the combination of the specific button being pressed and the upward or downward swipe gesture.
Here's the specific design I'm trying to implement. Basically I'd like the user to press the "A" button and then swipe up or down to get the "#" or "b".
Is this possible? The # & b could be image views or buttons (though if they're buttons, I don't want them to be pressable on their own). If this is a crazy design, I welcome suggestions for improvement.
You want to use a UILongPressGestureRecognizer (probably in conjunction with image views). It has the advantage that first it recognizes a finger held down in one spot (the "A") and then it tracks the movement of that finger (panning up to the sharp or down to the flat). Where the finger is held down — i.e., is it in the "A" or not — will determine whether to recognize in the first place. Then if you do recognize, you watch where the finger goes and decide whether it has entered the sharp or the flat.
I ended up using a Pan Gesture Recognizer, and it worked out really well! I am simply using the y coordinate of the pan gesture to determine if the user is moving his/her finger up to the sharp or down to the flat.
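The sharp/flat decision reduces to the sign and magnitude of the pan's vertical translation. A sketch of that logic (the threshold value is an assumption to tune, not from the original answer):

```swift
// Maps a pan gesture's vertical translation to an accidental.
// Negative y means the finger moved up (sharp); positive means down (flat).
enum Accidental { case sharp, flat, natural }

func accidental(forTranslationY y: Double, threshold: Double = 20) -> Accidental {
    if y <= -threshold { return .sharp }
    if y >= threshold { return .flat }
    return .natural
}
```

In the UIPanGestureRecognizer action you would feed translation(in:).y into this and update the highlighted symbol accordingly.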

Custom UIGestureRecognizer conflicting with UITapGestureRecognizer

So I have this project that I took over from somebody else, and they implemented this OneFingerRotationGestureRecognizer (https://github.com/melle/OneFingerRotationGestureDemo/blob/master/OneFingerRotationGestureDemo/OneFingerRotationGestureRecognizer.m) for a circular slider. Additionally, they added a UITapGestureRecognizer on top of that, so you could tap a value within the circular slider and the value would jump to that specific one. Now the problem is, when I drag the control just a very small amount (imagine putting your thumb onto it and tilting left/right), the UITapGestureRecognizer also fires! This is a problem because I want to be able to grab the circular slider wherever I want (there is no handle or anything), and when I only drag it a little, the value just jumps to the spot where I did that small drag. Somehow I need to cancel the tap gesture as soon as the OneFingerRotationGestureRecognizer starts registering touches. I tried what is described here: https://developer.apple.com/documentation/uikit/touches_presses_and_gestures/coordinating_multiple_gesture_recognizers/preferring_one_gesture_over_another?language=objc but didn't have any success with that :-(.
What can I do? I'm afraid the solution is so simple that I just don't see it.
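One common fix (an assumption, not from the original post) is to keep the custom recognizer in the .possible state until the finger has moved beyond a small slop distance from where it went down; a quick touch can then still complete as a tap, and once the slop is exceeded the rotation recognizer moves to .began, which by default cancels the tap. The distance check itself is plain geometry (name and threshold are illustrative):

```swift
// Treat the gesture as a drag, and suppress the tap, once the finger has
// moved more than `slop` points from the touch-down location. dx/dy are
// the offsets from the initial touch point.
func exceedsSlop(dx: Double, dy: Double, slop: Double = 8) -> Bool {
    return dx * dx + dy * dy > slop * slop
}
```

Inside the rotation recognizer's touchesMoved you would compute dx/dy from the initial touch location and only set state = .began once this returns true.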

Dragging a UIView like Facebook's slide menu

I know this has probably been asked before, but I've seen many approaches and I don't know which is best for me, so please don't send me a link to another post unless it addresses my problem directly.
I have a controller with a UIView at the top (like a header). This header is bigger than it seems, because it is partially hidden above the top of the screen. On that view I have a UIButton which, on touch-up-inside, shows the entire header view, and tapping again returns it to its starting position (changing the frame with an animation). I also want to be able to drag the view, but only changing its position on the y axis (dragging up and down). I was thinking of adding the dragInside/dragOutside events to the button, but those don't give me the position of the finger. I also need to know when the user releases the drag, so the view can animate to either of its two possible states (shown or partially hidden). Is this a touchesBegan/touchesMoved/touchesEnded thing? If it is, please provide a code example. I also want to do this with another view, on the left side; same thing, but that one moves on the x axis. Or maybe it can be done with the drag events if I save a CGPoint of the last touch; maybe that's better. Any help or other suggestions are appreciated.
Look at using a UIPanGestureRecognizer to detect the touch movements. Use the translationInView: of the gesture to set the view's y position. The translation is the total movement since the start of the gesture, so you don't need to remember and accumulate the offset position yourself.
The main thing to worry about while implementing this is bounding the y position of the view, so that no matter how far the user drags, the view won't go too high or too low on the screen.
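The translation-plus-clamp arithmetic described above can be sketched as a pure function (names and bounds are illustrative):

```swift
// New y position for the header during a pan: the view's y when the gesture
// began, plus the gesture's total translation, clamped to the allowed range.
func clampedY(start: Double, translation: Double,
              minY: Double, maxY: Double) -> Double {
    return min(max(start + translation, minY), maxY)
}
```

In the gesture action you would record the view's y when the gesture begins, feed translation(in:).y in as the translation while it changes, and on .ended animate to whichever of the two resting states is closer.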
Use a UIPanGestureRecognizer, that's a class dedicated to handling such drag/pan gestures.
Everything is described here in Apple's documentation, including examples, so you should find your answer here.
There is also some sample code in Apple Developer Library that shows you how to use Gesture Recognizers if needed.

Hacking the iOS UI responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view containing subviews A and B, each with a lot of sub-subviews 1, 2, 3, ...
MainView
  SubView(A)
    1
    2
    3
  SubView(B)
    1
    2
    3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to change between A and B with a two finger pan.
I have tried attaching a UIPanGestureRecognizer to the MainView, but the scroll views cancel the touches and it only works sometimes.
I need a consistent method to first capture the touches, detect whether it is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried creating a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but couldn't derive a solution to this problem from theirs.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
You can create a top-level view to capture the touches and their coordinates, and then check whether the touch coordinates fall inside any of the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
function, where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
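The containment check itself is simple geometry; a sketch with an illustrative Rect type standing in for CGRect (in real UIKit code you would use CGRectContainsPoint, with convertPoint:toView: handling the coordinate conversion mentioned above):

```swift
// Stand-in for CGRect/CGRectContainsPoint: a point is inside if it lies
// within the half-open ranges [x, x+width) and [y, y+height), with all
// values already expressed in the same (window) coordinate system.
struct Rect {
    var x, y, width, height: Double
    func contains(px: Double, py: Double) -> Bool {
        return px >= x && px < x + width && py >= y && py < y + height
    }
}
```

The capturing top-level view would run this against each subview's window-space frame to decide where to forward the touch.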
Or maybe this will be more helpful:
Receiving touch events on more than one UIView simultaneously

Detecting gestures on overlapping screen areas in iOS5

As shown in the diagram below, my app has a few UIViews, B, C and D, side by side, and all contained in an enveloping UIView A:
I have a UIPinchGestureRecognizer in each of B, C, and D. What I'd also like to do is recognize a different gesture over the entire area of A (without hindering the other gesture recognizers from working).
What's the best strategy for this? I'm targeting iOS 5+, no backwards compatibility needed.
It's also worth noting that the gesture recognizer for A will probably have to be a custom gesture recognizer, since I want to detect a pinch but with > 2 fingers involved.
Thought:
If installing a gesture recognizer for A doesn't work well, it might be possible to do it the old way using touchesBegan etc. As the UIResponder docs note, you can have a subclass of UIView just call [super touchesBegan:touches withEvent:event] to pass the touch on in the responder chain if you're not interested in it.
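For the custom more-than-two-finger pinch, one workable definition of "scale" (an assumption, not from the original posts) is the ratio of the touch points' current mean distance from their centroid to that same mean distance when the gesture began. The geometry is UIKit-independent:

```swift
// Mean distance of a set of touch points from their centroid. A custom
// UIGestureRecognizer subclass could compute this in touchesBegan and
// touchesMoved; current/initial gives the pinch scale for any finger count.
func meanDistanceFromCentroid(_ points: [(x: Double, y: Double)]) -> Double {
    guard !points.isEmpty else { return 0 }
    let n = Double(points.count)
    let cx = points.reduce(0.0) { $0 + $1.x } / n
    let cy = points.reduce(0.0) { $0 + $1.y } / n
    let total = points.reduce(0.0) { sum, p in
        let dx = p.x - cx, dy = p.y - cy
        return sum + (dx * dx + dy * dy).squareRoot()
    }
    return total / n
}
```

Spreading all fingers apart doubles the mean distance and hence reports a scale of 2, regardless of how many fingers are down.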
Add the gesture recognizer to A as you would normally do.
Now you need to start by hit-testing what was touched.
First, test the z-index of the items. For example, if you touch B, your function will loop over and hit-test all the views affected, in this case A and B.
After your function detects that both A and B (B over A) pass the hit test, it should compare their z-indexes. For example, B's z-index is 2 and A's z-index is 1. Now you know that B is what the user intended to touch, because its z-index is higher, which means it is on top.
Once you have identified the target (B), before executing its gesture recognizer, temporarily disable the gesture recognizer on A to eliminate any conflict between the overlapping recognizers. After the touch on B completes/ends, re-enable A's gesture recognizer.
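The z-order hit test described above can be sketched UIKit-independently (types and names are illustrative; real code would lean on hitTest:withEvent: or the subview order):

```swift
// Pick the topmost view containing a point: filter to the views whose
// frames contain it, then take the one with the highest z-index.
struct View {
    let name: String
    let zIndex: Int
    let frame: (x: Double, y: Double, w: Double, h: Double)
    func contains(_ px: Double, _ py: Double) -> Bool {
        return px >= frame.x && px < frame.x + frame.w
            && py >= frame.y && py < frame.y + frame.h
    }
}

func topmostView(at px: Double, _ py: Double, in views: [View]) -> View? {
    return views.filter { $0.contains(px, py) }
                .max { $0.zIndex < $1.zIndex }
}
```

With A covering the whole area and B overlapping it, a touch inside B resolves to B, so you would disable A's recognizer for the duration of that touch.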
It turns out just adding gesture recognizers in the straightforward obvious way works, at least for the gestures I want to recognize. I imagined it would be more complicated.
