I am quite new to iOS and Swift, so please bear with me while I explain; if anything is unclear, let me know and I will clarify.
I have an iOS application that includes some UIImageViews, UIButtons, etc., and they interact when touched.
I would like to dim the screen when the application has not been touched (anywhere on the screen) for a while (let's say 10 seconds), and then, as soon as a touch is detected anywhere on the screen, increase the brightness again.
I have found that the following line can be used to adjust the screen brightness:
UIScreen.main.brightness = CGFloat(0.5)
but I don't know how to detect a touch anywhere on the screen (without disturbing all the other buttons, etc.), or how to combine that with a timer.
(My app is for a specific purpose only; it will not be distributed, and it runs only on an iPhone 7 device with iOS 10.)
Well, there is an API for that purpose.
func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?
Returns the farthest descendant of the receiver in the view hierarchy (including itself) that contains a specified point.
This method traverses the view hierarchy by calling the point(inside:with:) method of each subview to determine which subview should receive a touch event. If point(inside:with:) returns true, then the subview’s hierarchy is similarly traversed until the frontmost view containing the specified point is found. If a view does not contain the point, its branch of the view hierarchy is ignored. You rarely need to call this method yourself, but you might override it to hide touch events from subviews.
You need to create a subclass of UIView or UIWindow in which you want to detect touches.
Then override the hitTest function and implement the required logic inside it.
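For the dimming use case from the question, a minimal sketch could look like the following; it assumes a UIWindow subclass, and the class name, dim level and 10-second interval are illustrative choices, not fixed API:

import UIKit

// Sketch: a window subclass that restores brightness on every touch
// that gets hit-tested, and dims the screen after 10 seconds of
// inactivity. hitTest may run more than once per touch, which is
// harmless here.
class DimmingWindow: UIWindow {

    private var idleTimer: Timer?
    private let idleInterval: TimeInterval = 10

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Any touch reaching the window restores brightness and restarts
        // the countdown; returning super's result keeps buttons and other
        // controls working as before.
        UIScreen.main.brightness = CGFloat(1.0)
        restartIdleTimer()
        return super.hitTest(point, with: event)
    }

    private func restartIdleTimer() {
        idleTimer?.invalidate()
        idleTimer = Timer.scheduledTimer(withTimeInterval: idleInterval,
                                         repeats: false) { _ in
            UIScreen.main.brightness = CGFloat(0.1) // dimmed level (assumed)
        }
    }
}

Timer.scheduledTimer(withTimeInterval:repeats:block:) requires iOS 10 or later, which matches the iPhone 7 / iOS 10 constraint in the question.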
Update:
See the runnable Xcode 8 project to get an idea. Run the app and try to press on the views and outside of them; you will see logs in the debugger console with info about the touched view. Hope it helps.
The Question
When the user taps on a view, which of these two functions is called first: touchesBegan(_:with:) or point(inside:with:)?
The Context
I want to subclass PKCanvasView (which inherits from UIScrollView) to allow interaction through the view (i.e. allow interaction with the view below and disable interaction with the PKCanvasView) when the point of the touch is outside the UIBezierPath of any stroke on the canvas.
This is easy enough by overriding point(inside:with:). My issue is that I only want to allow interaction with the view below if the UITouch.TouchType of the event is not .pencil (so that the user can draw with the Apple Pencil and interact with the view below using their finger).
The only way I think I can get this information is by also overriding touchesBegan(_:with:). There I can access the event and its touch type, and I would then somehow pass this along to be read inside point(inside:with:).
However, that all relies on extracting the UITouch.TouchType information before I check whether the touch point overlaps any PKStroke paths.
So: Is touchesBegan(_:with:) called before point(inside:with:)?
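For reference, the pass-through part the question calls "easy enough" might look like this minimal sketch; testing against PKStroke.renderBounds is a coarse stand-in for hit-testing the actual paths, and the margin value is an arbitrary assumption:

import PencilKit

// Sketch: keep the touch only if it lands on (or near) an existing
// stroke; otherwise return false so hit-testing falls through to the
// view below. Requires iOS 14+ for PKDrawing.strokes.
class PassThroughCanvasView: PKCanvasView {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        let margin: CGFloat = 8 // illustrative hit slop
        let hitArea = CGRect(x: point.x - margin, y: point.y - margin,
                             width: margin * 2, height: margin * 2)
        return drawing.strokes.contains { $0.renderBounds.intersects(hitArea) }
    }
}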
I want to find the absolute position of touches on the iOS screen. The main screen is OpenGL with some webviews on it, which complicates getting the overall touch position in screen coordinates. Is there something simple like a global screen touch position that I can access?
I've tried overriding touchesBegan, touchesEnded and touchesMoved on the webviews, but the webviews don't pass the touches through reliably, depending on whether they decide they are trying to recognize a gesture on the webview.
Actually you can convert any CGPoint or CGRect from any UIView to any UIView. Check something like:
myView.convert(myPoint, to: anotherView)
Assuming the point is in myView's coordinate space, this will convert it to anotherView's. So you could as well use UIApplication.shared.keyWindow as the target view and do:
myView.convert(gestureRecognizer.location(in: myView), to: UIApplication.shared.keyWindow)
But I believe in your case you do not even need a window. Passing nil instead should already use global (window base) coordinates and work exactly the same for single-window applications.
myView.convert(gestureRecognizer.location(in: myView), to: nil)
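Putting it together, a minimal sketch (the overlay class name is illustrative): a view that logs every touch in window/global coordinates via location(in: nil):

import UIKit

// Sketch: log touches in window base coordinates. location(in: nil)
// reports the point relative to the window rather than to this view.
class TouchLoggingView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        if let touch = touches.first {
            print("Global touch position:", touch.location(in: nil))
        }
    }
}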
A similar SO question exists for this problem, but unfortunately no suitable answers were provided there.
I have a google maps view (GMSMapView) that is entirely covered by a transparent sibling view that acts as a container for thumbnail images. The thumbnails are child views of the container view, not the map view. These child views are randomly scattered about the map and therefore partially hide portions of the map's surface.
Tapping on one of these thumbnails triggers a segue to a different VC that shows a zoomed view of the image.
The problem:
Given that these thumbnails lie on top of the map, they prevent the normal map gestures from occurring if the gesture intersects one of the thumbnails. For example, if a user wishes to pinch-zoom, rotate or pan the map and one of their fingers begins on top of a thumbnail, the touches are intercepted by the thumbnail.
Non-starters:
Obviously, I can't set userInteractionEnabled to false on a thumbnail because I need to detect tap gestures to trigger the segue.
I don't think I can customize the responder chain using UIView's hitTest:withEvent: and pointInside:withEvent: methods on the thumbnail view, because the thumbnails are not in the same branch of the view hierarchy as the map view, AND the dispatching logic depends on the type of gesture, which I don't think is available at that point (touchesBegan, etc. are called once the appropriate view has been chosen to receive the event). Please correct me if I'm wrong...
Attempted Solution:
Given the above, the strategy I'm attempting is to overlay all other views in the view controller with a transparent "touch interceptor view". This view's only purpose is to receive all touch messages -- by overriding touchesBegan(), touchesMoved() and touchesEnded() -- and dispatch them to other views as appropriate.
In other words, depending on the type of gesture recognized (tap vs. other), I could call the appropriate target view's (either one of the thumbnails or the map) touchesBegan(), touchesMoved() and touchesEnded() methods directly by forwarding the touches and event parameters.
Unfortunately, while this works when the target view is a simple UIView, it seems most UIView subclasses (including GMSMapView) don't allow forwarding of touch events in this manner, as described in the following article (see section: A Tempting Non-Solution).
Any ideas would be greatly appreciated.
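One hedged alternative to forwarding (not from the question, and it only helps gestures that begin outside a thumbnail): make the transparent container itself pass through hit-testing, so touches that miss every thumbnail reach the sibling map view by the normal mechanism:

import UIKit

// Sketch: a container that keeps touches on its thumbnail subviews but
// lets everything else fall through to views behind it.
class PassThroughContainerView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let view = super.hitTest(point, with: event)
        // A hit on the container itself (rather than on a thumbnail)
        // returns nil, so hit-testing continues to the sibling GMSMapView.
        return view === self ? nil : view
    }
}

This does not address the case where a pinch or pan finger starts on top of a thumbnail; that part still needs gesture-level dispatching.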
What are the standard UISystemGestureGateGestureRecognizers installed on the top level UIView of an iOS app for?
My app consists of two views: one fills the top half of the screen, the other is a custom keyboard and fills the bottom half. I found that taps on the space bar didn't always work, and after some investigation found that the timing of tap events in the bottom 20 pixels or so was different from the rest of the view. For most of the view the period between touchesBegan/Ended was about 100ms, whereas for the space bar it was 1-2ms. (My app is an emulator, and this is too fast for it to detect the key press.)
After some more digging I found that the main UIView of the application (i.e. my main view's superview) has two UISystemGestureGateGestureRecognizers installed. By removing them in viewDidAppear, the bottom of the screen is no longer affected. (Presumably these are cancelling the touch press events to my keyboard, hence the faster timing.)
These system recognizers are present on at least iOS 5 through 7 and on both iPad and iPhone. I thought they may be related to swipe from top/bottom but this functionality still works with them removed.
So I have a fix, but I'd like to understand more about what's going on here - in particular what I might be breaking by removing these.
These delayed touches bothered me too.
Just as an addition to what's been said before,
here's a simple fix:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // The window's two system gesture recognizers delay touches near
    // the screen edges; turn the delay off instead of removing them.
    guard let recognizers = view.window?.gestureRecognizers,
          recognizers.count >= 2 else { return }
    recognizers[0].delaysTouchesBegan = false
    recognizers[1].delaysTouchesBegan = false
}
No need to remove those gesture recognizers; just add this to the main view controller.
It appears that these recognizers are meant to prevent accidental touches near the top and bottom of the screen. They aren't configured with any targets, but they can still delay or cancel touches before those touches are delivered up the responder chain.
Notes (tested on iOS 7.1):
Both gesture recognizers are always present in the key window.
I inspected both gestures' _targets ivar and found they aren't configured with any targets at all. I swizzled addTarget:action: to verify that targets weren't being added or removed on the fly.
delegate is always nil for both instances.
If you disable the gesture recognizers, they will re-enable themselves.
The one that doesn't delay content touches fires when you drag up from the bottom or drag down from the top. I couldn't trigger the instance that delays touches.
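These observations can be reproduced with a short check; a sketch (UIApplication.shared.keyWindow matches the iOS 7-era context here, though it is deprecated on modern iOS):

import UIKit

// Print each recognizer on the key window along with its delay setting.
for recognizer in UIApplication.shared.keyWindow?.gestureRecognizers ?? [] {
    print(type(of: recognizer), "delaysTouchesBegan:", recognizer.delaysTouchesBegan)
}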
I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a MainView containing SubView(A) and SubView(B), each with a lot of sub-subviews (1, 2, 3, ...):
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to switch between A and B with a two-finger pan.
I have tried to attach a UIPanGestureRecognizer to MainView but the scrollviews cancel the touches and it only works sometimes.
I need a consistent method to first capture the touches, detect whether it is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but couldn't derive a solution to this problem from their solutions.
If anyone could shed some light on this, that would be great, as I'm already getting desperate.
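One hedged sketch of a commonly used approach (not taken from this thread): attach a two-finger UIPanGestureRecognizer to MainView and let it recognize simultaneously with the scroll views' own pan recognizers, so they no longer cancel it. The names below are illustrative:

import UIKit

// Sketch: a two-finger pan on MainView that coexists with the scroll
// views' single-finger pan recognizers.
class MainViewController: UIViewController, UIGestureRecognizerDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self,
                                         action: #selector(handleTwoFingerPan(_:)))
        pan.minimumNumberOfTouches = 2
        pan.maximumNumberOfTouches = 2
        pan.delegate = self
        view.addGestureRecognizer(pan)
    }

    @objc private func handleTwoFingerPan(_ recognizer: UIPanGestureRecognizer) {
        // Switch between SubView(A) and SubView(B) here.
        print("two-finger pan:", recognizer.translation(in: view))
    }

    // Let this pan be recognized together with the scroll views' pans
    // instead of being cancelled by them.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        return true
    }
}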
You can create a top-level view to capture the touches and their coordinates, and then check whether the touch coordinates lie inside the subviews. You can do that using the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
method, where rect is the frame of the view and point is the touch location.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
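A minimal sketch of that idea in Swift (InterceptorView and targetView are illustrative names): convert both the touch point and the target's frame to window coordinates before comparing:

import UIKit

// Sketch: an overlay that tests whether a touch falls inside another
// view's frame, with both converted to window coordinates.
class InterceptorView: UIView {

    weak var targetView: UIView? // hypothetical view to test against

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        guard let touch = touches.first,
              let target = targetView,
              let window = window else { return }
        let pointInWindow = touch.location(in: window)
        // A view's frame lives in its superview's coordinates, so the
        // superview does the conversion to the window.
        let frameInWindow = target.superview?.convert(target.frame, to: window)
        if frameInWindow?.contains(pointInWindow) == true {
            print("Touch is inside targetView")
        }
    }
}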
Or maybe this can be more helpful:
Receiving touch events on more then one UIView simultaneously