Sorry for the long-winded explanation, but this question - or something similar - has been asked a few times and I haven't found a satisfactory answer. I am writing an iPad app in iOS 8 that implements UISplitViewController. Recently I have been attempting to get it to work on the iPhone. It transferred over pretty well: everything collapses automatically, and a back button is included on the left side of my navigation bar.
My problem is that I want to keep the back button's behavior of popping one view off the stack, but also be able to pan back to the primary view even if there are several detail views on top of it. Ideally, I want to override or redirect the interactivePopGestureRecognizer so that the gesture smoothly pans to the primary view (which in some cases has anywhere from one to four detail views stacked on top of it), but I cannot figure out how to do this.
My current solution (code below) is to disable the interactivePopGestureRecognizer in the detail view controller and implement my own UIScreenEdgePanGestureRecognizer subclass that, when triggered, calls popToRootViewControllerAnimated:. The subclass treats the screen-edge pan as a discrete "swipe": once a large enough edge swipe is detected, everything is popped off the stack so the primary view is visible.
Code in detail view controller to stop interactivePopGestureRecognizer:
-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Stop the navigation controller from responding to the default back-swipe gesture
    if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
        self.navigationController.interactivePopGestureRecognizer.enabled = NO;
        self.navigationController.interactivePopGestureRecognizer.delegate = self;
    }
}
// Disable the default back-swipe gesture tied to the automatically included back button
-(BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isEqual:self.navigationController.interactivePopGestureRecognizer]) {
        return NO;
    } else {
        return YES;
    }
}
I didn't think it was necessary to include my screen-edge pan gesture recognizer subclass because it has nothing to do with the solution I am asking about. Here is some pseudocode that shows what my @selector does in the detail view controller:
- (IBAction)leftEdgeSwipe:(ScreenEdgeSwipeGestureRecognizer *)sender {
    if (sender.swipeIsValid) {
        [(UINavigationController *)self.splitViewController.viewControllers[0]
            popToRootViewControllerAnimated:YES];
    }
}
I tried to use a continuous pan, but I cannot find a way to present the primary view in the background while pulling the current view aside to give that clean, smooth panning effect. I can move the current view around, but there is just a grey background behind it where I would want my primary view to be.
In summary: if there is indeed no way to change the interactivePopGestureRecognizer to always jump to my primary view (the ideal solution), then any info on how I can implement my own smooth pan back to the primary view would be much appreciated.
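One direction I have been looking at (not in the code above, and not verified yet) is driving the pop with a custom interactive transition, so the primary view is the transition's destination and is drawn behind the detail view while the gesture runs. A minimal sketch, assuming the detail view controller is also the navigation controller's delegate and has a popInteraction property of type UIPercentDrivenInteractiveTransition; PopToRootAnimator and handleEdgePan: are illustrative names:

// Sketch: custom animator that slides the detail view away, revealing the primary view behind it.
@interface PopToRootAnimator : NSObject <UIViewControllerAnimatedTransitioning>
@end

@implementation PopToRootAnimator
- (NSTimeInterval)transitionDuration:(id<UIViewControllerContextTransitioning>)ctx {
    return 0.35;
}
- (void)animateTransition:(id<UIViewControllerContextTransitioning>)ctx {
    UIView *fromView = [ctx viewForKey:UITransitionContextFromViewKey];
    UIView *toView = [ctx viewForKey:UITransitionContextToViewKey];
    UIViewController *toVC = [ctx viewControllerForKey:UITransitionContextToViewControllerKey];
    toView.frame = [ctx finalFrameForViewController:toVC];
    [ctx.containerView insertSubview:toView belowSubview:fromView]; // primary view sits behind
    CGRect offscreen = fromView.frame;
    offscreen.origin.x = CGRectGetWidth(ctx.containerView.bounds);
    [UIView animateWithDuration:[self transitionDuration:ctx] animations:^{
        fromView.frame = offscreen; // slide the detail view out to the right
    } completion:^(BOOL finished) {
        [ctx completeTransition:![ctx transitionWasCancelled]];
    }];
}
@end

// In the detail view controller (which would also be set as self.navigationController.delegate):
- (void)handleEdgePan:(UIScreenEdgePanGestureRecognizer *)pan {
    CGFloat progress = [pan translationInView:pan.view].x / CGRectGetWidth(pan.view.bounds);
    progress = MIN(MAX(progress, 0.0), 1.0);
    if (pan.state == UIGestureRecognizerStateBegan) {
        self.popInteraction = [[UIPercentDrivenInteractiveTransition alloc] init]; // assumed property
        [self.navigationController popToRootViewControllerAnimated:YES];
    } else if (pan.state == UIGestureRecognizerStateChanged) {
        [self.popInteraction updateInteractiveTransition:progress];
    } else if (pan.state == UIGestureRecognizerStateEnded || pan.state == UIGestureRecognizerStateCancelled) {
        if (progress > 0.5) [self.popInteraction finishInteractiveTransition];
        else [self.popInteraction cancelInteractiveTransition];
        self.popInteraction = nil;
    }
}

// UINavigationControllerDelegate
- (id<UIViewControllerAnimatedTransitioning>)navigationController:(UINavigationController *)nc
                                   animationControllerForOperation:(UINavigationControllerOperation)op
                                                fromViewController:(UIViewController *)fromVC
                                                  toViewController:(UIViewController *)toVC {
    return (op == UINavigationControllerOperationPop && self.popInteraction) ? [PopToRootAnimator new] : nil;
}

- (id<UIViewControllerInteractiveTransitioning>)navigationController:(UINavigationController *)nc
                          interactionControllerForAnimationController:(id<UIViewControllerAnimatedTransitioning>)animator {
    return self.popInteraction;
}

Returning the animator only while popInteraction is non-nil should leave the normal back button pop untouched, but I have not tested this against the collapsed split view controller yet, so treat it as an outline rather than a working answer.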
So I have been messing around with making a smooth panning gesture subclass. Currently it functions similarly to Apple's back gesture except it jumps all the way back to the root view controller instead of popping one view off the stack. The only problem is that it does not yet show the primary view in the background while panning. I will update the answer once I get that worked out.
Here is the subclass:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>
#import "ScreenEdgeSwipeGestureRecognizer.h"
@interface ScreenEdgeSwipeGestureRecognizer ()
@property (nonatomic) UINavigationController *navController;
@end

@implementation ScreenEdgeSwipeGestureRecognizer {
    CGPoint _screenCenter;
    CGPoint _cumulativePanDistance;
}
- (id)initWithNavigationController:(UINavigationController *)navController {
    self = [super initWithTarget:self action:@selector(leftEdgePan:)];
    if (self) {
        _screenCenter = CGPointZero;
        _cumulativePanDistance = CGPointZero;
        self.edges = UIRectEdgeLeft;
        self.navController = navController;
    }
    return self;
}
- (IBAction)leftEdgePan:(ScreenEdgeSwipeGestureRecognizer *)sender {
    assert(sender == self);
    switch (self.state) {
        case UIGestureRecognizerStateBegan:
            [self initializePositions];
            break;
        case UIGestureRecognizerStateChanged:
            [self updatePositions];
            break;
        case UIGestureRecognizerStateEnded:
            [self animateViewBasedOnCurrentLocation];
            break;
        case UIGestureRecognizerStateCancelled:
            [self animateViewToCenter];
            break;
        default:
            break;
    }
    // Reset the translation of the pan so the current delta does not compound with the next cycle's
    [sender setTranslation:CGPointZero inView:sender.view];
}
- (void)initializePositions {
    _screenCenter = self.view.center;
    _cumulativePanDistance = CGPointZero;
}

- (void)updatePositions {
    // Track the movement of the user's touch since the last cycle
    CGPoint deltaSinceLastCycle = [self translationInView:self.view];
    // New view center = center at last cycle + horizontal distance moved since last cycle
    self.view.center = CGPointMake(self.view.center.x + deltaSinceLastCycle.x, self.view.center.y);
    // Update the total distance traveled by the user's touch.
    _cumulativePanDistance.x += deltaSinceLastCycle.x;
}
- (void)animateViewBasedOnCurrentLocation {
    if (_cumulativePanDistance.x >= (_screenCenter.x - 50)) {
        [self reset];
        [_navController popToRootViewControllerAnimated:YES];
    } else {
        [self animateViewToCenter];
        [self reset];
    }
}

- (void)animateViewToCenter {
    [UIView animateWithDuration:0.25 animations:^{ self.view.center = self->_screenCenter; }];
}

- (void)reset {
    [super reset];
    _cumulativePanDistance = CGPointZero;
    self.state = UIGestureRecognizerStatePossible;
}

@end
Here is how I instantiate the recognizer in my view controller:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Initialize the screen-edge pan gesture recognizer.
    _masterNavigationController = self.splitViewController.viewControllers[0];
    _edgePanRecognizer = [[ScreenEdgeSwipeGestureRecognizer alloc] initWithNavigationController:_masterNavigationController];
    // Add the recognizer to the view this controller is bound to.
    [self.view addGestureRecognizer:_edgePanRecognizer];
}
I'm trying to track hits on UI elements (tap and long press) using UIGestureRecognizer. After a hit is tracked (let's say logged via NSLog), the UI element should still do its job.
I'm creating gesture recognizers like this:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(OnGesture:)];
tap.cancelsTouchesInView = NO;
tap.delegate = self;
[view addGestureRecognizer:tap];

UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(OnGesture:)];
longPress.cancelsTouchesInView = NO;
longPress.delegate = self;
[view addGestureRecognizer:longPress];
I've implemented some of the gesture recognizer delegate methods:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)_recognizer shouldReceiveTouch:(UITouch *)_touch
{
    return YES;
}

-(BOOL)gestureRecognizer:(UIGestureRecognizer *)_recognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)_otherRecognizer
{
    return YES;
}
Inside the gesture recognizer handler, I'm trying to find the exact subview of the tap by using the hitTest method.
-(void)OnGesture:(UIGestureRecognizer *)_recognizer
{
    if (_recognizer.state == UIGestureRecognizerStateEnded)
    {
        if ([_recognizer isKindOfClass:[UITapGestureRecognizer class]]
            || [_recognizer isKindOfClass:[UILongPressGestureRecognizer class]])
        {
            CGPoint location = [_recognizer locationOfTouch:0 inView:_recognizer.view];
            // my problem occurs here:
            //---------------------------------------------------------------------------
            UIView *hitView = [_recognizer.view hitTest:location withEvent:nil];
            //---------------------------------------------------------------------------
            NSLog(@"Hit on view: %@", hitView);
        }
    }
}
So my problem is:
Sometimes (about 1 out of 10 cases) when I press the UIButton, the OnGesture method fires, but the IBAction for the button's "Touch Up Inside" event does not fire.
But when I comment out the hitTest call:
//UIView* hitView = [_recognizer.view hitTest:location withEvent:nil];
the bug is no longer reproducible, and the IBAction always gets called.
Why is this happening? How can I fix this?
P.S. there could be some typos in the sample code above.
According to the docs, in order for hitTest: to return your view:
This method ignores view objects that are hidden, that have disabled user interactions, or have an alpha level less than 0.01. This method does not take the view’s content into account when determining a hit. Thus, a view can still be returned even if the specified point is in a transparent portion of that view’s content.
So you might want to set self.someSubview.userInteractionEnabled = YES;
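For example, a minimal illustration based on the documentation quoted above (someSubview is just a placeholder name) - the view you expect hitTest: to return needs all of these to hold:

// hitTest:withEvent: will skip this subview unless all of the following are true:
someSubview.hidden = NO;                    // not hidden
someSubview.alpha = 1.0;                    // alpha of at least 0.01
someSubview.userInteractionEnabled = YES;   // user interaction enabled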
On my map view I draw polygon overlays that belong to a specific annotation. I want that annotation to be selected when the overlay is tapped. My first attempt was to add a UITapGestureRecognizer to the map view, test whether the tapped point is inside a polygon, and call [mapView selectAnnotation:myAnnotation animated:YES] on success. The problem is that after this, the map view decides the tap was not on any annotation, so it deselects the annotation again.
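A sketch of that tap handler (not my exact code, but the same idea, using MKPolygonRenderer for the point-in-polygon test; myAnnotation stands for the annotation that owns the tapped polygon):

- (void)handleMapTap:(UITapGestureRecognizer *)tap {
    CGPoint tapPoint = [tap locationInView:self.mapView];
    CLLocationCoordinate2D coord = [self.mapView convertPoint:tapPoint toCoordinateFromView:self.mapView];
    MKMapPoint mapPoint = MKMapPointForCoordinate(coord);
    for (id<MKOverlay> overlay in self.mapView.overlays) {
        if (![overlay isKindOfClass:[MKPolygon class]]) continue;
        MKPolygonRenderer *renderer = (MKPolygonRenderer *)[self.mapView rendererForOverlay:overlay];
        CGPoint polygonPoint = [renderer pointForMapPoint:mapPoint];
        if (CGPathContainsPoint(renderer.path, NULL, polygonPoint, false)) {
            [self.mapView selectAnnotation:myAnnotation animated:YES]; // annotation owning this overlay
            break;
        }
    }
}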
My question is how best to prevent this from happening; I can't seem to find a nice solution. What I have tried:
Creating a new UIGestureRecognizer subclass that recognizes only taps inside overlays, then iterating through mapView.gestureRecognizers and calling requireGestureRecognizerToFail: on each. However, the map view does not expose any recognizers through that property.
Returning YES from shouldBeRequiredToFailByGestureRecognizer: in my custom recognizer for any other recognizer that is a kind of tap recognizer. However, there still seems to be another recognizer that is not passed in there.
Placing a transparent view on top and doing the polygon check in pointInside:withEvent:, but that also blocks all other gestures besides taps.
EDIT:
After poking around a bit more, I have code that is almost working, and I know where it goes wrong. I have a custom recognizer as before. In its delegate I do:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    [otherGestureRecognizer requireGestureRecognizerToFail:gestureRecognizer]; // could possibly do this in the custom recognizer itself instead
    return YES;
}
Now taps inside polygons successfully prevent deselection. However, when I then do:
- (void)mapView:(MKMapView *)mapView didSelectAnnotationView:(MKAnnotationView *)view
{
    // displayRegion is chosen to center the annotation
    [mapView setRegion:self.displayRegion animated:YES];
}
it breaks again, and the annotation gets deselected.
It seems we have the same problem (a little different: I'm trying to manually select an annotation in a gesture recognizer).
I'm doing it like this (and it works, but it seems too complex to me; feel free to ask more if it's not clear):
I'm working with a long-press event:
...
_lp1 = [[UILongPressGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleOverlayLp1:)];
((UILongPressGestureRecognizer *)_lp1).minimumPressDuration = 0.05;
_lp1.delegate = self;
[_mapView addGestureRecognizer:_lp1];
...
I collect all gesture recognizers in a global var:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    if (_gestureRecognizers == nil)
        _gestureRecognizers = [NSMutableSet set];
    [_gestureRecognizers addObject:otherGestureRecognizer];
    return YES;
}
// When I recognize the gesture, disable everything and call an asynchronous task where I re-enable it all
- (void)handleOverlayLp1:(UIGestureRecognizer *)recognizer
{
    // Do your thing.
    if (recognizer.state == UIGestureRecognizerStateBegan)
    {
        BOOL found = NO;
        ...
        if (found) {
            // Disable gestures; this forces them to fail. They are re-enabled in selectOverlayAnnotation, which is called asynchronously.
            for (UIGestureRecognizer *otherRecognizer in _gestureRecognizers) {
                otherRecognizer.enabled = NO;
            }
            [self performSelector:@selector(selectOverlayAnnotation:) withObject:polyline afterDelay:0.1];
        }
    }
}
- (void)selectOverlayAnnotation:(id<MKAnnotation>)polyline
{
    [_mapView selectAnnotation:polyline animated:NO];
    for (UIGestureRecognizer *otherRecognizer in _gestureRecognizers) {
        otherRecognizer.enabled = YES;
    }
}
I have a UITapGestureRecognizer that will hide and show a toolbar over my MKMapView when the user taps the map - simple.
However, when the user taps on an annotation, I do not want the map to respond to the tap in the normal way (above). Additionally, when the user taps elsewhere on the map to deselect an annotation callout, I also don't want the toolbar to respond. So the toolbar should only respond when there are no annotations currently in a selected state, and it should not respond when the user taps an annotation directly.
So far, I have been trying the following action that reacts to the tap gesture on the map - however, the annotation view is never detected (the first if statement), and the annotation is also selected regardless of this method.
-(void)mapViewTapped:(UITapGestureRecognizer *)tgr
{
    CGPoint p = [tgr locationInView:self.mapView];
    UIView *v = [self.mapView hitTest:p withEvent:nil];
    id<MKAnnotation> ann = nil;
    if ([v isKindOfClass:[MKAnnotationView class]]) // <-- THIS CONDITION IS NEVER MET, BUT ANNOTATIONS ARE SELECTED ANYWAY
    {
        // annotation view was tapped, select it…
        ann = ((AircraftAnnotationView *)v).annotation;
        [self.mapView selectAnnotation:ann animated:YES];
    }
    else
    {
        // annotation view was not tapped, deselect if some annotation is selected...
        if (self.mapView.selectedAnnotations.count != 0)
        {
            ann = [self.mapView.selectedAnnotations objectAtIndex:0];
            [self.mapView deselectAnnotation:ann animated:YES];
        }
        // If no annotation view is currently selected then assume control of
        // the navigation bar.
        else {
            [self showToolBar:self.navigationController.toolbar.hidden];
        }
    }
}
I need to control the launch of the annotation callout programmatically and detect when the tap event has hit an annotation in order to achieve this.
Any help would be appreciated.
I think you will find the following links very useful:
http://blog.asynchrony.com/2010/09/building-custom-map-annotation-callouts-part-2/
How do I make a MKAnnotationView touch sensitive?
The first link discusses (among other things) how to prevent the propagation of touches to the annotations so that they selectively respond, and the second one how to detect the touches.
I think that because MKAnnotationViews are on top of the MKMapView, they will get the touch event and respond to it (be selected), so I don't think you need to select your annotation manually.
Then, if you have a look at the Advanced Gesture Recognizers video from WWDC 2010, you will see that your MKMapView will receive the tap event anyway, even if it's below the annotation view. That's probably why your -(void)mapViewTapped:(UITapGestureRecognizer *)tgr method gets called.
Apart from that, I can't see why your if ([v isKindOfClass:[MKAnnotationView class]]) is never true. I do the exact same thing in my code and it works fine!
Finally, to answer your last question, if you don't want to do anything when the user is just trying to close the callout, you could keep track of a custom isCalloutOpen boolean value like this:
- (void)mapView:(MKMapView *)mapView didSelectAnnotationView:(MKAnnotationView *)view {
    // some code
    _isCalloutOpen = YES;
}

- (void)mapView:(MKMapView *)mapView didDeselectAnnotationView:(MKAnnotationView *)view {
    // Delay the reset because didDeselectAnnotationView can be (and often is) called before your gesture recognizer handler method gets called.
    [self performSelector:@selector(resetCalloutOpenState) withObject:nil afterDelay:0.1];
}

- (void)resetCalloutOpenState {
    _isCalloutOpen = NO;
}

- (void)mapViewTapped:(UITapGestureRecognizer *)tgr {
    if (_isCalloutOpen) {
        return;
    }
    // otherwise handle the tap normally...
}
Let's say we have a view controller with one subview. The subview takes up the center of the screen, with 100 px margins on all sides. We then add a bunch of little stuff to tap on inside that subview. We are only using the subview to take advantage of the new coordinate frame (x=0, y=0 inside the subview is actually 100,100 in the parent view).
Then, imagine that we have something behind the subview, like a menu. I want the user to be able to select any of the "little stuff" in the subview, but if there is nothing there, I want touches to pass through it (since the background is clear anyway) to the buttons behind it.
How can I do this? It looks like touchesBegan goes through, but buttons don't work.
Create a custom view for your container and override the pointInside: message to return false when the point isn't within an eligible child view, like this:
Swift:
class PassThroughView: UIView {
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
for subview in subviews {
if !subview.isHidden && subview.isUserInteractionEnabled && subview.point(inside: convert(point, to: subview), with: event) {
return true
}
}
return false
}
}
Objective-C:
@interface PassthroughView : UIView
@end

@implementation PassthroughView
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *view in self.subviews) {
        if (!view.hidden && view.userInteractionEnabled && [view pointInside:[self convertPoint:point toView:view] withEvent:event])
            return YES;
    }
    return NO;
}
@end
Using this view as a container will allow any of its children to receive touches but the view itself will be transparent to events.
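A possible way to wire it up in code, in case you are not assigning the class in Interface Builder (the names here are illustrative):

// The overlay container uses the PassthroughView class above: its own buttons
// still receive taps, while empty (clear) areas let touches fall through to the menu behind it.
PassthroughView *overlay = [[PassthroughView alloc] initWithFrame:self.view.bounds];
overlay.backgroundColor = [UIColor clearColor];
[self.view addSubview:overlay];
[overlay addSubview:someButton]; // someButton is whatever "little stuff" you want tappable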
I also use
myView.userInteractionEnabled = NO;
No need to subclass. Works fine.
From Apple:
Event forwarding is a technique used by some applications. You forward touch events by invoking the event-handling methods of another responder object. Although this can be an effective technique, you should use it with caution. The classes of the UIKit framework are not designed to receive touches that are not bound to them .... If you want to conditionally forward touches to other responders in your application, all of these responders should be instances of your own subclasses of UIView.
Apple's best practice:
Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let the UIKit handle responder-chain traversal.
Instead you can override:
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
in your UIView subclass and return NO if you want that touch to be passed on up the responder chain (i.e. to views behind your view where it has nothing in it).
A far simpler way is to uncheck "User Interaction Enabled" in Interface Builder, if you are using a storyboard.
Recently I wrote a class that helps me with just that. Using it as a custom class for a UIButton or UIView will pass through touch events that landed on a transparent pixel.
This solution is somewhat better than the accepted answer because you can still tap a UIButton that is under a semi-transparent UIView, while the non-transparent part of the UIView still responds to touch events.
As you can see in the GIF, the Giraffe button is a simple rectangle but touch events on transparent areas are passed on to the yellow UIButton underneath.
Link to class
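The class itself is behind the link, but the underlying idea is roughly this (a sketch, not the linked implementation): override pointInside:withEvent: and sample the alpha of the tapped pixel by rendering the layer into a 1x1 bitmap.

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    // Shift the context so the tapped point lands on the single pixel we render.
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];
    CGContextRelease(context);
    return pixel[3] > 0; // alpha of 0 means fully transparent, so let the touch pass through
}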
The top-voted solution did not fully work for me. I guess it was because I had a TabBarController in the hierarchy (as one of the comments points out): it was in fact passing along touches to some parts of the UI, but it was messing with my table view's ability to intercept touch events. What finally did it was overriding hitTest: in the view whose touches I want to ignore and letting the subviews of that view handle them:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *view = [super hitTest:point withEvent:event];
    if (view == self) {
        return nil; // avoid delivering touch events to the container view (self)
    }
    else {
        return view; // the subviews will still receive touch events
    }
}
Building on what John posted, here is an example that will allow touch events to pass through all subviews of a view except for buttons:
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Allow buttons to receive press events. All other views will be ignored.
    for (id foundView in self.subviews)
    {
        if ([foundView isKindOfClass:[UIButton class]])
        {
            UIButton *foundButton = foundView;
            if (foundButton.isEnabled && !foundButton.hidden && [foundButton pointInside:[self convertPoint:point toView:foundButton] withEvent:event])
                return YES;
        }
    }
    return NO;
}
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
for subview in subviews {
if subview.frame.contains(point) {
return true
}
}
return false
}
According to the 'iPhone Application Programming Guide':
Turning off delivery of touch events. By default, a view receives touch events, but you can set its userInteractionEnabled property to NO to turn off delivery of events. A view also does not receive events if it's hidden or if it's transparent.
http://developer.apple.com/iphone/library/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/EventHandling/EventHandling.html
Updated: Removed example - reread the question...
Do you have any gesture processing on the views that may be processing the taps before the button gets it? Does the button work when you don't have the transparent view over it?
Any code samples of non-working code?
As far as I know, you are supposed to be able to do this by overriding the hitTest: method. I did try it but could not get it to work properly.
In the end I created a series of transparent views around the touchable object so that they did not cover it. Bit of a hack, but for my issue it worked fine.
Taking tips from the other answers and reading up on Apple's documentation, I created this simple library for solving your problem:
https://github.com/natrosoft/NATouchThroughView
It makes it easy to draw views in Interface Builder that should pass touches through to an underlying view.
I think method swizzling is overkill and very dangerous to do in production code because you are directly messing with Apple's base implementation and making an application-wide change that could cause unintended consequences.
There is a demo project, and hopefully the README does a good job explaining what to do. To address the OP, you would change the clear UIView that contains the buttons to class NATouchThroughView in Interface Builder. Then find the clear UIView that overlays the menu that you want to be tappable. Change that UIView to class NARootTouchThroughView in Interface Builder. It can even be the root UIView of your view controller if you intend those touches to pass through to the underlying view controller. Check out the demo project to see how it works. It's really quite simple, safe, and non-invasive.
I created a category to do this.
A little method swizzling and the view is golden.
The header
// UIView+PassthroughParent.h
@interface UIView (PassthroughParent)
- (BOOL) passthroughParent;
- (void) setPassthroughParent:(BOOL) passthroughParent;
@end
The implementation file
#import "UIView+PassthroughParent.h"
#implementation UIView (PassthroughParent)
+ (void)load{
Swizz([UIView class], #selector(pointInside:withEvent:), #selector(passthroughPointInside:withEvent:));
}
- (BOOL)passthroughParent{
NSNumber *passthrough = [self propertyValueForKey:#"passthroughParent"];
if (passthrough) return passthrough.boolValue;
return NO;
}
- (void)setPassthroughParent:(BOOL)passthroughParent{
[self setPropertyValue:[NSNumber numberWithBool:passthroughParent] forKey:#"passthroughParent"];
}
- (BOOL)passthroughPointInside:(CGPoint)point withEvent:(UIEvent *)event{
// Allow buttons to receive press events. All other views will get ignored
if (self.passthroughParent){
if (self.alpha != 0 && !self.isHidden){
for( id foundView in self.subviews )
{
if ([foundView alpha] != 0 && ![foundView isHidden] && [foundView pointInside:[self convertPoint:point toView:foundView] withEvent:event])
return YES;
}
}
return NO;
}
else {
return [self passthroughPointInside:point withEvent:event];// Swizzled
}
}
#end
You will need to add my Swizz.h and Swizz.m
located Here
After that, you just import UIView+PassthroughParent.h in your {Project}-Prefix.pch file, and every view will have this ability.
Every view will take touches, but none of the blank space will.
I also recommend using a clear background.
myView.passthroughParent = YES;
myView.backgroundColor = [UIColor clearColor];
EDIT
I created my own property bag, and that was not included previously.
Header file
// NSObject+PropertyBag.h
#import <Foundation/Foundation.h>

@interface NSObject (PropertyBag)
- (id) propertyValueForKey:(NSString*) key;
- (void) setPropertyValue:(id) value forKey:(NSString*) key;
@end
Implementation File
// NSObject+PropertyBag.m
#import "NSObject+PropertyBag.h"

@implementation NSObject (PropertyBag)

+ (void)load {
    [self loadPropertyBag];
}

+ (void)loadPropertyBag {
    @autoreleasepool {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            Swizz([NSObject class], NSSelectorFromString(@"dealloc"), @selector(propertyBagDealloc));
        });
    }
}

__strong NSMutableDictionary *_propertyBagHolder; // Properties for every class will go in this property bag

- (id)propertyValueForKey:(NSString *)key {
    return [[self propertyBag] valueForKey:key];
}

- (void)setPropertyValue:(id)value forKey:(NSString *)key {
    [[self propertyBag] setValue:value forKey:key];
}

- (NSMutableDictionary *)propertyBag {
    if (_propertyBagHolder == nil) _propertyBagHolder = [[NSMutableDictionary alloc] initWithCapacity:100];
    NSMutableDictionary *propBag = [_propertyBagHolder valueForKey:[[NSString alloc] initWithFormat:@"%p", self]];
    if (propBag == nil) {
        propBag = [NSMutableDictionary dictionary];
        [self setPropertyBag:propBag];
    }
    return propBag;
}

- (void)setPropertyBag:(NSDictionary *)propertyBag {
    if (_propertyBagHolder == nil) _propertyBagHolder = [[NSMutableDictionary alloc] initWithCapacity:100];
    [_propertyBagHolder setValue:propertyBag forKey:[[NSString alloc] initWithFormat:@"%p", self]];
}

- (void)propertyBagDealloc {
    [self setPropertyBag:nil];
    [self propertyBagDealloc]; // Swizzled
}

@end
Try setting the backgroundColor of your transparent view to UIColor(white: 0.0, alpha: 0.02). Then you can get touch events in the touchesBegan/touchesMoved methods. Place the code below somewhere your view is initialized:
self.alpha = 1
self.backgroundColor = UIColor(white: 0.0, alpha: 0.02)
self.isMultipleTouchEnabled = true
self.isUserInteractionEnabled = true
Try this
class PassthroughToWindowView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        var view = super.hitTest(point, with: event)
        if view != self {
            return view
        }
        while !(view is PassthroughWindow) {
            view = view?.superview
        }
        return view
    }
}
I use that instead of overriding the point(inside:with:) method:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    guard self.point(inside: point, with: event) else { return nil }
    return self
}
If you don't want to bother with a category or subclassing UIView, you could also just bring the button forward so that it is in front of the transparent view. This won't always be possible depending on your application, but it worked for me. You can always send the button back again or hide it.
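In code that can be as simple as this (the view names are illustrative):

// Lift the button above the transparent overlay so it receives the touch first.
[containerView bringSubviewToFront:myButton];
// ...and later, if needed:
[containerView sendSubviewToBack:myButton]; // or myButton.hidden = YES;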