Absolute position of touches on iOS

I want to find the absolute position of touches on the iOS screen. The main screen is OpenGL with some webviews on it, which complicates getting the overall touch position in screen coordinates. Is there a simple thing like a global screen touch position that I can access?
I've tried overriding touchesBegan, touchesEnded and touchesMoved in subclasses of the webviews, but the webviews don't pass the touches through reliably, depending on whether they decide they are trying to recognize a gesture themselves.

Actually you can convert any CGPoint or CGRect from any UIView to any UIView. Check something like:
myView.convert(myPoint, to: anotherView)
Assuming the point is in myView's coordinate space, this will convert it to anotherView's coordinate space. You could also use UIApplication.shared.keyWindow as the target view:
myView.convert(gestureRecognizer.location(in: myView), to: UIApplication.shared.keyWindow)
But I believe in your case you do not even need the window. Passing nil instead should already give you global (screen) coordinates, and works exactly the same for single-window applications.
myView.convert(gestureRecognizer.location(in: myView), to: nil)
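As a minimal sketch of how this might look in practice (the handler and view names here are illustrative, not from the question):

import UIKit

class TouchReportingViewController: UIViewController {

    // Illustrative handler; a UIPanGestureRecognizer targeting it is assumed
    // to be attached to myView elsewhere.
    @objc func handlePan(_ gestureRecognizer: UIPanGestureRecognizer) {
        guard let myView = gestureRecognizer.view else { return }

        // Location in the gesture's own view.
        let localPoint = gestureRecognizer.location(in: myView)

        // Converted to window ("global") coordinates; nil means the window.
        let globalPoint = myView.convert(localPoint, to: nil)

        print("local: \(localPoint), global: \(globalPoint)")
    }
}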

Related

How to get correct screen coordinate from UITapGestureRecognizer while VoiceOver is on

I'm currently working on an interactive view that relies heavily on the user's touch location. I have found that there are a few ways to interact with the UITapGestureRecognizer while VoiceOver is on, but when I tap my point the values given are very wrong. I've looked elsewhere, but my use case is outside of the norm so there is not a lot to tell me what is going on. Has anyone experienced this before?
I am aware that I can change accessibilityTraits to UIAccessibilityTraitAllowsDirectInteraction, which will give me the correct screen point when used, but I would like to know what is causing this issue, at the very least for the sake of knowledge. To interact with the UITapGestureRecognizer I either double tap or do a 3D Touch by pressing hard on the screen. The latter method doesn't work for the tap gesture but does work for the pan gesture.
This is the only line I use to get my screen points. My map view is a UIImageView:
CGPoint screenPoint = [tapGesture locationInView:map];
I'm using a map of a building and I try to tap the same corner or landmark for my testing. I know I can't hit the same exact point every time, but I do use a stylus and I can get pretty close.
Without VoiceOver on I would get the result: (35.500, 154.363)
With VoiceOver on and tapping in generally the same spot, I get: (187.500, 197.682)
The point I am using to test is on the left side of the screen and the result from VoiceOver being on is in the middle of the screen. I believe the y-axis value may have changed because of my tool bar's size, but I have no idea what is throwing off the x-axis value. If more information is needed let me know.
UPDATE: Upon further investigation, it turns out that the UITapGestureRecognizer will always return (187.500, 197.682) no matter where I touch in the map view when VoiceOver is on. That point seems to be the middle of the map view. Oddly enough though, the UIPanGestureRecognizer will give me the correct (x,y) for my view if I use the 3D touch while VoiceOver is on.
On a side note not relating to the problem at hand, it seems if I use the accessibility trait UIAccessibilityTraitAllowsDirectInteraction the method UIAccessibilityConvertFrameToScreenCoordinates returns a frame that is higher than my view. It works fine if I do not change the trait.
Your problem may be related to the reference point used when VoiceOver is on.
Verify what your point coordinates are referring to: view or screen coordinates?
I suggest you take a look at the following elements:
accessibilityFrame
accessibilityFrameInContainerSpace
UIAccessibilityConvertFrameToScreenCoordinates
Depending on your project, these elements may be useful for achieving your purpose.
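As a rough illustration of comparing the two coordinate spaces, using the Swift spelling of UIAccessibilityConvertFrameToScreenCoordinates (the mapView name and the helper function are assumptions, not from the original answer):

import UIKit

// Log a tap's location in the view's own space alongside the view's frame in
// screen coordinates, to see which space each value lives in.
func logCoordinateSpaces(for tap: UITapGestureRecognizer, in mapView: UIImageView) {
    let viewPoint = tap.location(in: mapView)   // mapView coordinates
    let screenFrame = UIAccessibility.convertToScreenCoordinates(mapView.bounds, in: mapView)
    print("tap in mapView: \(viewPoint)")
    print("mapView frame in screen coordinates: \(screenFrame)")
}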

control multiple buttons with one swipe gesture swift

I have an array of UIImageViews drawn on the main view (as subviews) in a matrix.
These UIImageViews respond while I tap on them (each works like a pixel: when I touch one of them it turns on, going from black to green),
but I want to do it with a swipe gesture, so that with one swipe I can trigger more than one "pixel" (UIImageView).
I found this for Android: triggering-multiple-buttonsonclick-event-with-one-swipe-gesture
and I wonder if there is something like that in iOS with Swift that recognises a general touch (not a tap or swipe) that I can look for.
The main purpose of all of this is to draw "shapes" on a matrix of pixels with one swipe gesture.
If there is another way that you think will help, I will be happy to hear about it.
Many thanks
You are looking for the UIGestureRecognizer.
With this you can add many types of gestures, such as swipe, touch, etc.
You can also get position, duration, and basically all the information about it.
You can find a step-by-step tutorial at this link:
http://www.raywenderlich.com/76020/using-uigesturerecognizer-with-swift-tutorial
And also in the Apple documentation:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIGestureRecognizer_Class/
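As a minimal sketch of the idea (class and handler names are illustrative): attaching a pan recognizer to the containing view lets a single swipe report many locations, which you can then match against your "pixels".

import UIKit

class PixelGridViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // A pan recognizer fires continuously while the finger moves,
        // so a single swipe can visit many "pixels".
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        let location = pan.location(in: view)   // current finger position in the view
        print("pan at \(location)")
    }
}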
I managed to do the swipe action using touchesMoved and touchesEnded.
The main idea is to activate the UIImageViews using the coordinates of the touch, comparing them to the UIImageViews' coordinates in the touchesMoved function,
using flags to disable already-edited UIImageViews (while I am in the same touch session, i.e. the finger is still on the screen), and making the UIImageViews editable again in touchesEnded:
func swipeTouches(touches: NSSet!) {
    // Get the first touch and its location in this view controller's view coordinate system
    let touch = touches.allObjects[0] as! UITouch
    let touchLocation = touch.locationInView(self.view)
    for pixel in pixelArrays {
        // Convert the pixel view's frame to this view controller's view coordinate system
        let pixelViewFrame = self.view.convertRect(pixel.pixelImage.frame, fromView: pixel.pixelImage.superview)
        // Check if the touch is inside the pixel view
        if CGRectContainsPoint(pixelViewFrame, touchLocation) {
            // Check if the pixel is editable
            if !pixel.isEditable {
                let index = pixel.index
                pixelArrays.insert(updatePixel(index), atIndex: index)
            }
        }
    }
}
The only problem I have now is that if the swipe begins on one of the UIImageViews, the touchesMoved function treats that view as the one to use for the coordinates, and the other UIImageViews are not affected.
My idea to solve it is to add a layer on top of all the UIImageViews, disable the tap recognition they already have, and implement the tap with the coordinates approach as well.
I will be happy to hear if there is another way to do it.
Update:
I managed to solve the problem above roughly as described, but instead of adding another layer I disabled touch on all of the UIImageViews and activate them using the coordinates of the touch (a sketch of this approach follows below).
Many thanks
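For reference, a hedged modern-Swift sketch of that final approach: user interaction disabled on the image views, hit-testing done from the parent's touchesMoved (pixelViews and turnOn(_:) are assumed names, not from the original code):

import UIKit

class PixelCanvasViewController: UIViewController {

    var pixelViews: [UIImageView] = []          // the "pixel" image views, laid out elsewhere
    private var touchedThisStroke = Set<UIImageView>()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Let the parent view receive the touches instead of the image views.
        pixelViews.forEach { $0.isUserInteractionEnabled = false }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: view) else { return }
        for pixel in pixelViews where !touchedThisStroke.contains(pixel) {
            // Compare everything in the parent view's coordinate space.
            let frame = view.convert(pixel.frame, from: pixel.superview)
            if frame.contains(point) {
                touchedThisStroke.insert(pixel)
                turnOn(pixel)
            }
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchedThisStroke.removeAll()           // pixels become editable again for the next stroke
    }

    func turnOn(_ pixel: UIImageView) {
        pixel.backgroundColor = .green          // stand-in for the real "turn on" behaviour
    }
}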

UIPanGestureRecognizer get translation for each touch point

I have a UIPanGestureRecognizer attached to a parent view, with various CCSprites I want to move around in the parent when panned. Using [gesture locationOfTouch:i inView:recognizer.view] I can get the location of the touch, but if I assign that to my subview's center it often makes the subview move unexpectedly, since the original touch is probably not in the exact center of the sprite. Really what I want is to get the [gesture translationInView:recognizer.view] for each of the touch locations and use that. It works perfectly when you only have one panning touch, but with more than one there appears to be no way to get translations for them, because each touch can be panning in a different direction/speed. The user can use two fingers to move two different sprites completely independently of each other. -[UIPanGestureRecognizer translationInView:] doesn't allow me to get the different translations.
How should I do this?
I found a category CCNode-SFGestureRecognizers that adds the ability to attach UIGestureRecognizers to any CCNode. This way I can get around needing multiple translation values.
Now, if you're only interested in the pan gesture, why not use the Cocos touch methods?
In touchBegan:, calculate (and save) the location of the touch in the subview; in touchMoved:, calculate the translation, which is just the current location minus the previous location, and move your subview by that amount.
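A minimal sketch of that idea in plain UIKit terms (using UIViews instead of CCSprites; all names here are illustrative):

import UIKit

class DraggableSpritesView: UIView {

    // Per touch, remember which subview it grabbed and where it was last seen.
    private var draggedViews: [UITouch: UIView] = [:]
    private var lastLocations: [UITouch: CGPoint] = [:]

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // needed to receive more than one touch at a time
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let point = touch.location(in: self)
            if let hit = subviews.first(where: { $0.frame.contains(point) }) {
                draggedViews[touch] = hit
                lastLocations[touch] = point
            }
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            guard let dragged = draggedViews[touch], let last = lastLocations[touch] else { continue }
            let point = touch.location(in: self)
            // The translation is just the current location minus the previous one.
            dragged.center.x += point.x - last.x
            dragged.center.y += point.y - last.y
            lastLocations[touch] = point
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            draggedViews[touch] = nil
            lastLocations[touch] = nil
        }
    }
}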

hacking ios ui responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view with subviews (A) and (B), each with a lot of sub-subviews 1, 2, 3, 4, 5, ...
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to change between A and B with a two finger pan.
I have tried to attach a UIPanGestureRecognizer to MainView but the scrollviews cancel the touches and it only works sometimes.
I need a consistent method to first capture the touches, detect if it is a two finger pan, and only then decide if it will pass the touches downwards (or upwards... i'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't manage to have the touches passed through that view.
I have found a lot of people with similar problems but couldn't find a solution to this problem from their solutions.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
You can create a top-level view to capture the touches and their coordinates, then check whether the touch coordinates are inside the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
method, where rect is the frame of the view and point is the point of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
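A minimal sketch of that top-level view, under the assumption that the views to test are exposed as an array (names are illustrative):

import UIKit

// Top-level view that inspects every touch it receives before deciding what to do with it.
class TouchCapturingView: UIView {

    var trackedSubviews: [UIView] = []   // the views whose frames we want to test

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let window = self.window else { return }

        // Convert the touch to window coordinates so every frame is compared
        // in the same coordinate system.
        let pointInWindow = touch.location(in: window)

        for subview in trackedSubviews {
            let frameInWindow = window.convert(subview.frame, from: subview.superview)
            if frameInWindow.contains(pointInWindow) {
                print("touch began inside \(subview)")
            }
        }
    }
}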
Or maybe this can be more helpful:
Receiving touch events on more than one UIView simultaneously

Touch Coordinates from UIPanGestureRecognizer

Is it possible to get the absolute touch coordinates from [UIPanGestureRecognizer translationInView]? I'm working on an iPad app and have been searching a lot to get the touch coordinate values from UIPanGestureRecognizer!
I've also tried offsetting using the values we get from translationInView, but I'm not really able to comprehend the math behind it...
Any suggestions guys?
Ravi
translationInView is the delta change of a gesture. If you move your finger to the left by 20 pt, you'll get (-20.0, 0.0); it's already "absolute" in that sense.
What you probably mean is that you want locationInView, which is relative to the view passed through the argument, even if said view is not the one recognizing the events. Typically, you would pass the view of the view controller, the view that will take care of the event, or whichever subview makes more sense for your implementation.
Also, keep in mind that if you need truly absolute coordinates, you can pass nil as the argument, and it returns the location relative to the window (i.e. "absolute").
And if you need to do logic with other views, you can convert the coordinates from one view to another with the UIView instance methods convertRect:fromView:, convertRect:toView:, convertPoint:fromView: and convertPoint:toView:. These methods also accept nil as the view argument to mean "absolute", i.e. relative to the window.
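For instance, a short hedged sketch showing the difference between the two values (the class and handler names are illustrative):

import UIKit

class PanLoggingViewController: UIViewController {

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        // How far the finger has moved since the gesture began (a delta, not a position).
        let translation = pan.translation(in: pan.view)

        // Where the finger currently is; nil means window ("absolute") coordinates.
        let locationInWindow = pan.location(in: nil)

        print("translation: \(translation), location in window: \(locationInWindow)")
    }
}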
Here is an easier way:
gesture.locationInView(myView)
This returns the point computed as the location, in the given view, of the gesture represented by the receiver, as a CGPoint.
