Control multiple buttons with one swipe gesture (Swift, iOS)

I have an array of UIImageViews printed on the main view (as subviews) in a matrix.
These UIImageViews react when I tap on them; each works like a pixel: when I touch one, it turns on (goes from black to green).
But I want to do this with a swipe gesture, so that one swipe can trigger more than one "pixel" (UIImageView).
I found this for Android: triggering-multiple-buttonsonclick-event-with-one-swipe-gesture
I wonder whether there is something like that in iOS with Swift that recognises a general touch (not a tap or a swipe) that I can hook into.
The main purpose of all this is to draw "shapes" on a matrix of pixels with one swipe gesture.
If there is another way that you think would help, I would be happy to hear about it.
Many thanks.

You are looking for UIGestureRecognizer.
With it you can add many types of gesture, such as swipes and touches, and you can also get the position, duration, and basically all the information about the gesture.
You can find a step-by-step tutorial at this link:
http://www.raywenderlich.com/76020/using-uigesturerecognizer-with-swift-tutorial
and also in the Apple documentation:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIGestureRecognizer_Class/
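As a minimal sketch (class and method names are illustrative), attaching a pan recognizer and reading the finger's location as it moves could look like this:

import UIKit

class PixelGridViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // A pan recognizer fires continuously while the finger moves, which
        // suits "one swipe hits many pixels" better than a swipe recognizer
        // (which fires only once per gesture).
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
        // Current finger position in the view's coordinate system.
        let location = recognizer.location(in: view)
        print("Pan at \(location)")
    }
}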

I managed to implement the swipe action using touchesMoved and touchesEnded.
The main idea is to update the UIImageViews using the coordinates of the touch, comparing them against each UIImageView's coordinates in the touchesMoved function,
and to use flags to skip already-edited UIImageViews (while the finger is still on the screen in the same touch session), resetting the UIImageViews to be editable again in touchesEnded:
func swipeTouches(_ touches: Set<UITouch>) {
    // Get the first touch and its location in this view controller's view coordinate system
    guard let touch = touches.first else { return }
    let touchLocation = touch.location(in: self.view)
    for pixel in pixelArrays {
        // Convert the pixel view's frame to this view controller's view coordinate system
        let pixelViewFrame = self.view.convert(pixel.pixelImage.frame, from: pixel.pixelImage.superview)
        // Check whether the touch is inside the pixel view
        if pixelViewFrame.contains(touchLocation) {
            // Only update pixels not yet edited in this touch session
            // (the flag is set once a pixel has been edited, despite the name)
            if !pixel.isEditable {
                let index = pixel.index
                // Replace the pixel at its index with the updated version
                pixelArrays[index] = updatePixel(index)
            }
        }
    }
}
The only problem I have now is that if the swipe begins on one of the UIImageViews, touchesMoved treats that view as the one whose coordinates to work in, and the other UIImageViews are not affected.
My idea for solving it is to add a layer on top of all the UIImageViews, disable the tap recognition they already have, and implement the tap through coordinates as well.
I would be happy to hear if there is another way to do it.
Update:
I managed to solve the problem above roughly as described, except that instead of adding another layer I disabled touch handling on all of the UIImageViews and drive them from the coordinates of the touch instead.
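For completeness, here is a sketch of the touch overrides that drive swipeTouches and reset the flags (the reset is illustrative; it mirrors the inverted isEditable check in the code above):

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    swipeTouches(touches)
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Make every pixel updatable again for the next touch session
    // (swipeTouches treats isEditable == false as "not yet edited").
    for index in pixelArrays.indices {
        pixelArrays[index].isEditable = false
    }
}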
Many thanks.

Related

A way to pass gestures below an overlay view in Swift, depending on the input gesture type

I am developing an application which has a map as its core feature. On this map, users can draw and edit polygons, lines, and add points of interest (POIs). To edit a polygon, for instance, one taps on it, and the application then enters an editing mode.
To accomplish this kind of behaviour, I have a transparent overlay view (UIView) that lies just above the map. This view can either ‘capture’ the user’s gesture (i.e. tap, long press, etc.), if it hits an area of the screen that contains a polygon, or pass it down to the map, if there is no polygon at the point of the tap. This behaviour is achieved by overriding the UIView method point(inside:with:) (docs here). The pseudocode for the implementation goes like this:
override public func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    // Claim the touch only when it lands on a polygon;
    // otherwise let it fall through to the map below.
    return containsPolygons(at: point)
}
However, I have an edge case that has got me stuck a bit. In addition to the previously described behaviour, I want to be able to move the map when I put my finger down on a polygon and start sliding it around, and the same goes for pinch-to-zoom. So, basically, depending on the type of gesture, I want point(inside:with:) to return either true (for a tap gesture) or false (for a pan/pinch gesture). Or, speaking more generically, I want my view to be ‘physically present’ when I tap on a polygon, and ‘physically absent’ when the received gesture is anything but a tap.
P.S.: I am not sure that my idea with the point method is 100% correct, or that it is the only possible way of accomplishing this behaviour. Maybe there is a way to capture all the gestures and dispatch them to the view that lies below. Any idea is great as long as it works. Thanks!
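One possible direction, purely as a sketch (mapContainerView, containsPolygons(at:), and enterEditingMode(at:) are illustrative stand-ins for the real names): skip the overlay for gesture discrimination and attach the tap recognizer above the map, deciding in its handler whether a polygon was hit; pans and pinches then never involve the recognizer at all.

import UIKit

class MapViewController: UIViewController {
    let mapContainerView = UIView() // hypothetical container holding the map

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        mapContainerView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ recognizer: UITapGestureRecognizer) {
        let point = recognizer.location(in: mapContainerView)
        if containsPolygons(at: point) {
            enterEditingMode(at: point) // hypothetical editing entry point
        }
        // A pan or pinch never fires this tap recognizer, so the map
        // keeps its native scrolling and zooming behaviour.
    }

    func containsPolygons(at point: CGPoint) -> Bool { return false } // stub
    func enterEditingMode(at point: CGPoint) {} // stub
}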

Absolute position of touches on iOS

I want to find the absolute position of touches on the iOS screen. The main screen is OpenGL with some webviews on it, which complicates getting the overall touch position in screen coordinates. Is there a simple thing like a global screen touch position that I can access?
I've tried subclassing touchesBegan, touchesEnded, and touchesMoved on the webviews, but the webviews don't pass the touches through reliably, depending on whether they decide they are trying to recognize a gesture of their own.
Actually, you can convert any CGPoint or CGRect from any UIView to any other UIView. Check something like:
myView.convert(myPoint, to: anotherView)
Assuming the point is in myView's coordinate space, this converts its coordinates to anotherView's. So you could just as well use UIApplication.shared.keyWindow as the target view:
myView.convert(gestureRecognizer.location(in: myView), to: UIApplication.shared.keyWindow)
But I believe that in your case you do not even need the window. Passing nil instead uses global coordinates and works exactly the same for single-window applications.
myView.convert(gestureRecognizer.location(in: myView), to: nil)
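Putting the above together, a minimal sketch (the controller and handler names are illustrative; any view works as the source of the conversion):

import UIKit

class TouchReportingViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        view.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ recognizer: UITapGestureRecognizer) {
        // Passing nil as the destination converts to window (global) coordinates.
        let globalPoint = view.convert(recognizer.location(in: view), to: nil)
        print("Tap at \(globalPoint) in window coordinates")
    }
}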

How to connect one UIPanGestureRecognizer to multiple views programmatically?

How can I make this work with just one UIPanGestureRecognizer? I have multiple views, and every one of them has its own UIPanGestureRecognizer. How can I make it so that, when the user holds a finger on one view and moves it across the screen, the current view (and its tag) changes as the finger passes over each view, until the user lifts the finger from the screen? It's a little bit hard to explain... I hope you understand. Thank you for your answers.
I believe you are trying to create a single pan gesture recogniser that works with multiple views, and to know which view is currently under the user's finger during the pan. If that is the case, then this should help:
Create a view to act as a container for all the views you want to participate in the pan.
I've given each view its own colour to make it visually obvious.
I've also added a label to each view to show its tag.
Hook up a single UIPanGestureRecognizer to the container view and attach its selector to a method in your view controller class:
- (IBAction)panGestureRecognizerTriggered:(UIPanGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:recognizer.view];
    // Find the view that is currently under the user's finger
    for (UIView *view in recognizer.view.subviews) {
        if (CGRectContainsPoint(view.frame, location)) {
            NSLog(@"View %d at %@", (int)view.tag, NSStringFromCGPoint(location));
            // Found the view, stop searching :)
            break;
        }
    }
}
This method iterates over the subviews of the view attached to the gesture recogniser and determines which subview is currently under the user's finger, printing its tag and the current location.
Admittedly, this probably isn't going to be particularly efficient if you're dealing with lots of views, but for a simple case such as this it gets the job done.
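For reference, a Swift sketch of the same handler (assuming, as above, that the recognizer is attached to the container view):

import UIKit

class ContainerViewController: UIViewController {
    @objc func panGestureRecognizerTriggered(_ recognizer: UIPanGestureRecognizer) {
        guard let container = recognizer.view else { return }
        let location = recognizer.location(in: container)
        // Find the subview that is currently under the user's finger.
        for view in container.subviews where view.frame.contains(location) {
            print("View \(view.tag) at \(location)")
            break // found the view, stop searching
        }
    }
}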

UIPanGestureRecognizer get translation for each touch point

I have a UIPanGestureRecognizer attached to a parent view, with various CCSprites I want to move around in the parent when panned. Using [gesture locationOfTouch:i inView:recognizer.view] I can get the location of each touch, but if I assign that to my subview's center it often makes the subview jump unexpectedly, since the original touch is probably not at the exact center of the sprite. What I really want is [gesture translationInView:recognizer.view] for each of the touch locations. That works perfectly when there is only one panning touch, but with more than one there appears to be no way to get per-touch translations, because each touch can be panning in a different direction at a different speed. The user can use two fingers to move two different sprites completely independently of each other, and -[UIPanGestureRecognizer translationInView:] doesn't let me get separate translations for them.
How should I do this?
I found a category, CCNode-SFGestureRecognizers, that adds the ability to attach UIGestureRecognizers to any CCNode. This way I can get around needing multiple translation values.
Now, if you're only interested in the pan gesture, why not use the cocos touch methods?
In touchesBegan:, calculate (and save) the location of the touch in the subview; in touchesMoved:, calculate the translation, which is just the current location minus the previous location, and move your subview by that amount.
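Here is a sketch of that per-touch bookkeeping, with plain UIViews standing in for the CCSprites and a hypothetical spriteUnder(_:) hit-test:

import UIKit

class SpriteBoardView: UIView {
    // Remember each touch's previous location so every finger gets its
    // own translation. (Remember to set isMultipleTouchEnabled = true.)
    private var previousLocations: [UITouch: CGPoint] = [:]
    private var spriteForTouch: [UITouch: UIView] = [:]

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: self)
            previousLocations[touch] = location
            spriteForTouch[touch] = spriteUnder(location)
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let current = touch.location(in: self)
            if let sprite = spriteForTouch[touch], let previous = previousLocations[touch] {
                // The per-touch translation is simply the current location
                // minus the previous one.
                sprite.center.x += current.x - previous.x
                sprite.center.y += current.y - previous.y
            }
            previousLocations[touch] = current
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            previousLocations[touch] = nil
            spriteForTouch[touch] = nil
        }
    }

    private func spriteUnder(_ point: CGPoint) -> UIView? {
        // Hypothetical: return the frontmost subview containing the point.
        return subviews.last { $0.frame.contains(point) }
    }
}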

Hacking the iOS UI responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible; I just don't know how...
I have a main view containing subviews (A) and (B), each with a lot of subviews 1, 2, 3, ...:
MainView
  SubView(A)
    1
    2
    3
  SubView(B)
    1
    2
    3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to switch between A and B with a two-finger pan.
I have tried attaching a UIPanGestureRecognizer to MainView, but the scroll views cancel the touches, so it only works sometimes.
I need a consistent way to first capture the touches, detect whether the gesture is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems but couldn't derive a solution to mine from theirs.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
You can create a top-level view to capture the touches and their coordinates, then check whether each touch falls inside one of the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
function, where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
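A minimal sketch of that approach in Swift (targetViews is an illustrative list of the views to hit-test against):

import UIKit

class TouchCaptureView: UIView {
    // Views to hit-test; populate with the real subviews.
    var targetViews: [UIView] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let window = self.window else { return }
        // Work in window coordinates so views from different branches of
        // the hierarchy can be compared directly.
        let pointInWindow = touch.location(in: window)
        for view in targetViews {
            let frameInWindow = view.superview?.convert(view.frame, to: window) ?? view.frame
            if frameInWindow.contains(pointInWindow) {
                print("Touch is inside \(view)")
            }
        }
    }
}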
Or maybe this will be more helpful:
Receiving touch events on more than one UIView simultaneously
