Sorry for the long-winded explanation, but this question - or something similar - has been asked a few times and I haven't found a satisfactory answer. I am writing an iPad app in iOS 8 that implements UISplitViewController. Recently I have been attempting to get it to work on the iPhone. It transferred over pretty well: everything collapses automatically and a back button is included on the left side of my nav bar.
My problem is that I want to keep the back button functionality to pop one view off the stack, but also be able to pan back to the primary view even if there are several detail views on top of it. Ideally, I want to be able to overwrite or redirect the interactivePopGestureRecognizer so that the gesture smoothly pans to the primary view (in some cases it can have anywhere from 1 to 4 detail views stacked on top of it). But, I cannot figure out how to do this.
My current solution (code below) is to disable the interactivePopGestureRecognizer in the detail view controller and implement my own screen-edge pan gesture recognizer that, when triggered, executes popToRootViewController. I've subclassed UIScreenEdgePanGestureRecognizer so it treats the screen edge pan as a discrete "swipe" (i.e. once a large enough screen edge swipe is detected, pop everything off the stack so the primary view is visible).
Code in detail view controller to stop interactivePopGestureRecognizer:
-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Stop the navigation controller from responding to the default back swipe gesture
    if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
        self.navigationController.interactivePopGestureRecognizer.enabled = NO;
        self.navigationController.interactivePopGestureRecognizer.delegate = self;
    }
}
// Disable the default back swipe gesture tied to automatically included back button
-(BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isEqual:self.navigationController.interactivePopGestureRecognizer]) {
        return NO;
    } else {
        return YES;
    }
}
I didn't think it was necessary to include my screen edge pan gesture recognizer subclass because it has nothing to do with the solution I am asking about. Here is some pseudocode that shows what my @selector does in the detail view controller:
- (IBAction)leftEdgeSwipe:(ScreenEdgeSwipeGestureRecognizer*)sender {
    if (sender.swipeIsValid) {
        [(UINavigationController *)self.splitViewController.viewControllers[0]
            popToRootViewControllerAnimated:YES];
    }
}
I tried to use a continuous pan, but I cannot find a way to present the primary view in the background as I pull the current view aside to give that clean, smooth panning effect. I am able to move the current view around, but there is just a grey background behind it where I would want my primary view to be.
Summation: If there is indeed no way to change the interactivePopGestureRecognizer to always jump to my primary view (ideal solution), then any info on how I can make my own smooth pan back to my primary view would be much appreciated.
So I have been messing around with making a smooth panning gesture subclass. Currently it functions similarly to Apple's back gesture except it jumps all the way back to the root view controller instead of popping one view off the stack. The only problem is that it does not yet show the primary view in the background while panning. I will update the answer once I get that worked out.
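One idea I'm exploring for that (not wired into the subclass below yet) is to snapshot the primary navigation controller's view when the gesture begins and slide it in underneath the detail view. Something along these lines, where _primarySnapshot would be a new UIView ivar added to the recognizer:
// Sketch only: show the primary (master) view behind the detail view while panning.
// _primarySnapshot is an assumed new ivar, declared alongside _screenCenter.
- (void)initializePositions {
    _screenCenter = self.view.center;
    _cumulativePanDistance = CGPointZero;

    // Snapshot the primary navigation controller's view and insert it
    // underneath the view being dragged so it shows through as the detail
    // view moves aside.
    _primarySnapshot = [_navController.view snapshotViewAfterScreenUpdates:YES];
    _primarySnapshot.frame = self.view.frame;
    [self.view.superview insertSubview:_primarySnapshot belowSubview:self.view];
}
The snapshot would then be removed again in reset (or once popToRootViewControllerAnimated: fires).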
Here is the subclass:
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>
#import "ScreenEdgeSwipeGestureRecognizer.h"
@interface ScreenEdgeSwipeGestureRecognizer ()
@property (nonatomic) UINavigationController* navController;
@end
@implementation ScreenEdgeSwipeGestureRecognizer {
    CGPoint _screenCenter;
    CGPoint _cumulativePanDistance;
}

- (id)initWithNavigationController:(UINavigationController*)navController {
    self = [super initWithTarget:self action:@selector(leftEdgePan:)];
    if (self) {
        _screenCenter = CGPointZero;
        _cumulativePanDistance = CGPointZero;
        self.edges = UIRectEdgeLeft;
        self.navController = navController;
    }
    return self;
}
- (IBAction)leftEdgePan:(ScreenEdgeSwipeGestureRecognizer*)sender {
    assert(sender == self);
    switch (self.state) {
        case UIGestureRecognizerStateBegan:
            [self initializePositions];
            break;
        case UIGestureRecognizerStateChanged:
            [self updatePositions];
            break;
        case UIGestureRecognizerStateEnded:
            [self animateViewBasedOnCurrentLocation];
            break;
        case UIGestureRecognizerStateCancelled:
            [self animateViewToCenter];
            break;
        default:
            break;
    }
    // Reset the translation of the pan so this cycle's movement does not compound with the next cycle's
    [sender setTranslation:CGPointMake(0, 0) inView:sender.view];
}
- (void)initializePositions {
    _screenCenter = self.view.center;
    _cumulativePanDistance = CGPointZero;
}

- (void)updatePositions {
    // Track the movement of the user's touch since the last cycle
    CGPoint deltaSinceLastCycle = [self translationInView:self.view];
    // New view center = view center at last cycle + horizontal distance moved by the touch since last cycle
    self.view.center = CGPointMake(self.view.center.x + deltaSinceLastCycle.x, self.view.center.y);
    // Update the total distance traveled by the touch.
    _cumulativePanDistance.x = _cumulativePanDistance.x + deltaSinceLastCycle.x;
}
- (void)animateViewBasedOnCurrentLocation {
    if (_cumulativePanDistance.x >= (_screenCenter.x - 50)) {
        [self reset];
        [_navController popToRootViewControllerAnimated:YES];
    } else {
        [self animateViewToCenter];
        [self reset];
    }
}

- (void)animateViewToCenter {
    [UIView animateWithDuration:0.25 animations:^{ self.view.center = self->_screenCenter; }];
}

- (void)reset {
    [super reset];
    _cumulativePanDistance = CGPointZero;
    self.state = UIGestureRecognizerStatePossible;
}

@end
Here is how I instantiate the recognizer in my view controller:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Initialize the screen edge pan gesture recognizer.
    _masterNavigationController = self.splitViewController.viewControllers[0];
    _edgePanRecognizer = [[ScreenEdgeSwipeGestureRecognizer alloc] initWithNavigationController:_masterNavigationController];
    // Add the recognizer to the view this controller is bound to.
    [self.view addGestureRecognizer:_edgePanRecognizer];
}
I have a UIButton subclass that does some custom drawing and animations. That is all working fine and dandy.
However, most of my buttons dismiss the current view via their superview calling [self dismissViewControllerAnimated] once it is confirmed with the model that whatever the button push was supposed to accomplish was actually done, and I want there to be a delay to allow the animation to complete before dismissing the view.
I am able to easily enough animate the UIButton subclass on touchesEnded and then call [super touchesEnded], which works fine except that it doesn't let my animations finish before dismissing the view. Like this:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CABasicAnimation *myAnimation = [CABasicAnimation animationWithKeyPath:@"transform.foo"];
    // set up myAnimation's properties
    [self.layer addAnimation:myAnimation forKey:nil];
    [super touchesEnded:touches withEvent:event]; // works! but no delay
}
My first attempt at creating a delay was by using CATransaction, as follows:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CABasicAnimation *myAnimation = [CABasicAnimation animationWithKeyPath:@"transform.foo"];
    // set up myAnimation's properties
    [CATransaction begin];
    [CATransaction setCompletionBlock:^{
        [super touchesEnded:touches withEvent:event]; // doesn't seem to do anything :-/
    }];
    [self.layer addAnimation:myAnimation forKey:nil];
    [CATransaction commit];
}
Which, as far as I can tell, is executing CATransaction's completionBlock, but it just isn't doing anything.
I've also tried assigning the touches and event arguments from touchesEnded to both properties and global variables, and then executing [super touchesEnded] in another method called by an NSTimer. The same thing seems to be occurring where the code is executing, but my call to [super touchesEnded] isn't doing anything.
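For reference, that NSTimer attempt looked roughly like this (savedTouches/savedEvent and the 0.3 s delay are placeholders, not my exact code):
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Stash the arguments so the timer callback can replay them later.
    self.savedTouches = touches;
    self.savedEvent = event;
    // ...kick off the button animation here...
    [NSTimer scheduledTimerWithTimeInterval:0.3
                                     target:self
                                   selector:@selector(forwardTouchesEnded:)
                                   userInfo:nil
                                    repeats:NO];
}

-(void)forwardTouchesEnded:(NSTimer *)timer
{
    // Runs on the main run loop, but the super call still has no visible effect.
    [super touchesEnded:self.savedTouches withEvent:self.savedEvent];
}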
I've dug around online for hours. Added stubs of the other touches methods from UIResponder which just contain [super touches...]. Tried setting up my global variables for the method called by NSTimer differently (I very well may be missing something in regards to global variables...). This button is being created by the Storyboard, but I've set the class to my custom class, so I don't think UIButton's +(UIButton *)buttonWithType method is affecting this.
What am I missing? Is there some small thing I'm forgetting about or is there just no way to delay the call to [super touchesEnded] from a UIButton subclass?
I was not able to solve this, only find a way around it.
My last stab at solving this was to figure out whether the [super touchesEnded...] inside the completion block was being executed in a thread that was different from the thread it was executed in when it was outside the completion block... and no, they both appear to be the main thread (Apple's documentation on CATransaction does state that its completionBlock is always run in the main thread).
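That check was just a pair of logs around the super call, something like:
NSLog(@"outside completion block, main thread? %d", [NSThread isMainThread]);
[CATransaction setCompletionBlock:^{
    // Both of these log 1, i.e. the main thread.
    NSLog(@"inside completion block, main thread? %d", [NSThread isMainThread]);
    [super touchesEnded:touches withEvent:event];
}];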
So in case someone else is banging their head against this, here's my less-than-elegant solution:
1.) In my subclass of UIButton I created a weak property called containingVC.
2.) In every single (ugh) VC that uses the custom button class I have to do this:
@implementation VCThatUsesCustomButtonsOneOfWayTooMany

-(void)viewDidLayoutSubviews
{
    [super viewDidLayoutSubviews];
    self.firstCustomButton.containingVC = self;
    self.secondCustomButton.containingVC = self;
    self.thirdCustomButton.containingVC = self;
    ....
    self.lastCustomButton.containingVC = self;
    // you're probably better off using an IBOutletCollection and NSArray's makeObjectsPerformSelector:withObject:...
}

@end
3.) Then in my custom UIButton class I have something like this:
-(void)animateForPushDismissCurrView
{
    CAAnimation *buttonAnimation = ...make animation
    [CATransaction begin];
    [CATransaction setCompletionBlock:^{
        [self.containingVC performSegueWithIdentifier:@"segueID" sender:self.containingVC];
    }];
    [self.layer addAnimation:buttonAnimation forKey:nil];
    [CATransaction commit];
}
4.) Then in whatever VC the user is currently interacting with, after making sure the button has done whatever it was supposed to do (in my case it checks with the model to confirm that the relevant change was made), each button calls [someCustomButton animateForPushDismissCurrView], which animates the button press and then fires the UIStoryboardSegue that actually dismisses the view.
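For example, a confirm-then-dismiss action in the presenting VC ends up looking roughly like this (the model call and outlet names are placeholders):
- (IBAction)doneTapped:(CustomButton *)sender {
    // Only animate-and-dismiss once the model confirms the change went through.
    if ([self.model confirmChange]) {   // placeholder model check
        [sender animateForPushDismissCurrView];
    }
}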
Obviously this would work for going deeper, not just unwinding, but you'd need additional logic in the custom button's -(void)animateForPush method or a separate method entirely.
Again, if I'm missing something here, I'd love to know what it is. This seems like an absurd number of hoops to jump through to accomplish what seems like a simple task.
Lastly, and most importantly, if it just won't work with the [super touchesEnded...] method in CATransaction's completionBlock, I'd like to know WHY. I suspect that it has something to do with threads or the weirdness that is Objective-C's super keyword.
I'm working on code for an expandable tray view that uses UIDynamicAnimator to achieve a nice expand/contract animation.
To achieve a realistic acceleration I use UIGravityBehavior to make my tray fall, until the "tab" of the tray hits the bottom of the screen.
This works well, but even though all items in the scene have stopped moving, the UIDynamicAnimatorDelegate method dynamicAnimatorDidPause: is never called. This means that the animator continues using CPU cycles to animate the scene (the delegate is set correctly).
I tried removing the UIGravityBehavior from the scene, which did indeed cause the animator to stop in the end. I can't time the removal of the gravity behavior right though, since I need to remove it from the scene once everything has stopped moving.
I understand that gravity is a constant force, but I still assumed it would stop the animator once everything has 0 velocity and 0 acceleration.
Is this last assumption false?
Anyone having similar problems?
You are correct that the animator should pause once everything comes to rest.
Check what items are attached to your gravity behavior, and make sure that there aren't other items still falling. For example, it is easy to accidentally create the following bug:
Add a view to gravity and collision
Remove view from superview and from collision
Fail to remove view from gravity
In this situation, the "ghost item" will fall forever.
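A minimal cleanup sketch (collisionBehavior and gravityBehavior here are assumed properties holding your behaviors); the easily forgotten line is the second one:
// Remove the item from *every* behavior it was added to before discarding it,
// otherwise gravity keeps accelerating the invisible "ghost item" forever.
- (void)tearDownItemView:(UIView *)itemView {
    [self.collisionBehavior removeItem:itemView];
    [self.gravityBehavior removeItem:itemView];
    [itemView removeFromSuperview];
}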
Another possible problem (though less likely given your description) is if your items are attached to other behaviors that are causing infinite but small "bounce." I would check the full list of behaviors on your animator (remember to check child behaviors, too). In particular I'd be interested in any UIDynamicItemBehavior that adds elasticity.
EDIT:
You may also want to go the other way. Start with a very basic dynamics system and add components from yours until you can reproduce the problem. For instance, the following does converge quite quickly (logging "pause"):
@interface PTLViewController () <UIDynamicAnimatorDelegate>
@property (nonatomic, strong) UIDynamicAnimator *animator;
@end

@implementation PTLViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
    view.backgroundColor = [UIColor lightGrayColor];
    [self.view addSubview:view];

    self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];
    self.animator.delegate = self;

    UICollisionBehavior *collisionBehavior = [[UICollisionBehavior alloc] initWithItems:@[view]];
    collisionBehavior.translatesReferenceBoundsIntoBoundary = YES;
    [self.animator addBehavior:collisionBehavior];

    UIGravityBehavior *gravityBehavior = [[UIGravityBehavior alloc] initWithItems:@[view]];
    [self.animator addBehavior:gravityBehavior];
}

- (void)dynamicAnimatorDidPause:(UIDynamicAnimator *)animator {
    NSLog(@"pause");
}

@end
To your question about getting all item velocities, I don't know of an easy way to do that. Unfortunately, UIDynamicAnimator doesn't directly know all of its items. This is indirectly because UIDynamicBehavior doesn't include an items property. If this bothers you as much as it does me, consider duping radar://15054405.
But there is a solution if you just want to know the current linear velocity of specific items. Just add a UIDynamicItemBehavior with a custom action to log it:
UIDynamicItemBehavior *dynamicItemBehavior = [[UIDynamicItemBehavior alloc] initWithItems:@[view]];
__weak UIDynamicItemBehavior *weakBehavior = dynamicItemBehavior;
dynamicItemBehavior.action = ^{
    NSLog(@"Velocity: %@", NSStringFromCGPoint([weakBehavior linearVelocityForItem:view]));
};
[self.animator addBehavior:dynamicItemBehavior];
I had a similar issue recently. In the end I used a UICollisionBehavior with boundaries instead of items (because otherwise the moving items were bumping into the others...) and implemented the delegate method collisionBehavior:beganContactForItem:withBoundaryIdentifier:atPoint: to know when I should remove the gravity:
UICollisionBehavior *collide = [[UICollisionBehavior alloc] initWithItems:borders];
[collide addItem:movingItem];
[collide setCollisionMode:UICollisionBehaviorModeBoundaries];
[collide setTranslatesReferenceBoundsIntoBoundary:YES];
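The delegate method then just removes gravity once the moving item reaches the boundary, roughly like this (gravity here is an assumed property holding the UIGravityBehavior):
- (void)collisionBehavior:(UICollisionBehavior *)behavior
      beganContactForItem:(id<UIDynamicItem>)item
   withBoundaryIdentifier:(id<NSCopying>)identifier
                  atPoint:(CGPoint)point {
    // Everything has arrived; drop the constant force so the animator can pause.
    [self.animator removeBehavior:self.gravity];
}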
If you find a better solution, let me know :) !
My problem is the same. My animator never comes to rest so once started, my app consumes 3 to 4% CPU forever. My views all appear to stop moving within 1/2 second. So rather than figure out why I'm not reaching equilibrium, I just hit it with a hammer and kill the animator with a timer. I give it 2 seconds.
- (void)createAnimator {
    if (_timer) {
        [_timer invalidate];
    }
    if (_animator) {
        [_animator removeAllBehaviors];
    }
    _animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

    // create all behaviors

    // kill the animator in 2 seconds
    _timer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                              target:self
                                            selector:@selector(killAnimator:)
                                            userInfo:nil
                                             repeats:NO];
}

- (void)killAnimator:(NSTimer *)timer {
    [_animator removeAllBehaviors];
    _animator = nil;
    _timer = nil;
}
Let's say we have a view controller with one subview. The subview takes up the center of the screen with 100 px margins on all sides. We then add a bunch of little stuff to click on inside that subview. We are only using the subview to take advantage of the new frame (x=0, y=0 inside the subview is actually 100,100 in the parent view).
Then, imagine that we have something behind the subview, like a menu. I want the user to be able to select any of the "little stuff" in the subview, but if there is nothing there, I want touches to pass through it (since the background is clear anyway) to the buttons behind it.
How can I do this? It looks like touchesBegan goes through, but buttons don't work.
Create a custom view for your container and override the pointInside: message to return false when the point isn't within an eligible child view, like this:
Swift:
class PassThroughView: UIView {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        for subview in subviews {
            if !subview.isHidden && subview.isUserInteractionEnabled && subview.point(inside: convert(point, to: subview), with: event) {
                return true
            }
        }
        return false
    }
}
Objective C:
@interface PassthroughView : UIView
@end

@implementation PassthroughView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *view in self.subviews) {
        if (!view.hidden && view.userInteractionEnabled && [view pointInside:[self convertPoint:point toView:view] withEvent:event])
            return YES;
    }
    return NO;
}

@end
Using this view as a container will allow any of its children to receive touches but the view itself will be transparent to events.
I also use
myView.userInteractionEnabled = NO;
No need to subclass. Works fine.
From Apple:
Event forwarding is a technique used by some applications. You forward touch events by invoking the event-handling methods of another responder object. Although this can be an effective technique, you should use it with caution. The classes of the UIKit framework are not designed to receive touches that are not bound to them .... If you want to conditionally forward touches to other responders in your application, all of these responders should be instances of your own subclasses of UIView.
Apple's Best Practice:
Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let the UIKit handle responder-chain traversal.
Instead you can override:
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
in your UIView subclass and return NO if you want that touch to be sent up the responder chain (i.e. to views behind your view where your view has nothing in it).
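A minimal version of that subclass, assuming the whole view should be transparent to touches (TouchTransparentView is just an example name):
@interface TouchTransparentView : UIView
@end

@implementation TouchTransparentView

// Returning NO here makes hit-testing skip this view and everything inside it,
// so touches land on whatever sits behind it; add a condition if some subviews
// should still receive touches.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return NO;
}

@end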
A far simpler way is to uncheck "User Interaction Enabled" in Interface Builder (if you are using a storyboard).
Lately I wrote a class that will help me with just that. Using it as a custom class for a UIButton or UIView will pass through touch events that land on a transparent pixel.
This solution is somewhat better than the accepted answer because you can still click a UIButton that is under a semi-transparent UIView, while the non-transparent part of the UIView will still respond to touch events.
As you can see in the GIF, the Giraffe button is a simple rectangle but touch events on transparent areas are passed on to the yellow UIButton underneath.
Link to class
The top-voted solution was not fully working for me. I guess it was because I had a TabBarController in the hierarchy (as one of the comments points out): it was in fact passing along touches to some parts of the UI, but it was messing with my tableView's ability to intercept touch events. What finally did it was overriding hitTest in the view whose touches I want to ignore, letting the subviews of that view handle them:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *view = [super hitTest:point withEvent:event];
    if (view == self) {
        return nil; // avoid delivering touch events to the container view (self)
    } else {
        return view; // the subviews will still receive touch events
    }
}
Building on what John posted, here is an example that will allow touch events to pass through all subviews of a view except for buttons:
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Allow buttons to receive press events. All other views will get ignored
    for (id foundView in self.subviews)
    {
        if ([foundView isKindOfClass:[UIButton class]])
        {
            UIButton *foundButton = foundView;
            if (foundButton.isEnabled && !foundButton.hidden && [foundButton pointInside:[self convertPoint:point toView:foundButton] withEvent:event])
                return YES;
        }
    }
    return NO;
}
Swift 3
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    for subview in subviews {
        if subview.frame.contains(point) {
            return true
        }
    }
    return false
}
According to the 'iPhone Application Programming Guide':
Turning off delivery of touch events. By default, a view receives touch events, but you can set its userInteractionEnabled property to NO to turn off delivery of events. A view also does not receive events if it's hidden or if it's transparent.
http://developer.apple.com/iphone/library/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/EventHandling/EventHandling.html
Updated: Removed example - reread the question...
Do you have any gesture processing on the views that may be processing the taps before the button gets it? Does the button work when you don't have the transparent view over it?
Any code samples of non-working code?
As far as I know, you are supposed to be able to do this by overriding the hitTest: method. I did try it but could not get it to work properly.
In the end I created a series of transparent views around the touchable object so that they did not cover it. A bit of a hack, but for my issue it worked fine.
Taking tips from the other answers and reading up on Apple's documentation, I created this simple library for solving your problem:
https://github.com/natrosoft/NATouchThroughView
It makes it easy to draw views in Interface Builder that should pass touches through to an underlying view.
I think method swizzling is overkill and very dangerous to do in production code because you are directly messing with Apple's base implementation and making an application-wide change that could cause unintended consequences.
There is a demo project and hopefully the README does a good job explaining what to do. To address the OP, you would change the clear UIView that contains the buttons to class NATouchThroughView in Interface Builder. Then find the clear UIView that overlays the menu that you want to be tappable. Change that UIView to class NARootTouchThroughView in Interface Builder. It can even be the root UIView of your view controller if you intend those touches to pass through to the underlying view controller. Check out the demo project to see how it works. It's really quite simple, safe, and non-invasive.
I created a category to do this.
A little method swizzling and the view is golden.
The header
// UIView+PassthroughParent.h
@interface UIView (PassthroughParent)

- (BOOL) passthroughParent;
- (void) setPassthroughParent:(BOOL) passthroughParent;

@end
The implementation file
#import "UIView+PassthroughParent.h"
@implementation UIView (PassthroughParent)

+ (void)load {
    Swizz([UIView class], @selector(pointInside:withEvent:), @selector(passthroughPointInside:withEvent:));
}

- (BOOL)passthroughParent {
    NSNumber *passthrough = [self propertyValueForKey:@"passthroughParent"];
    if (passthrough) return passthrough.boolValue;
    return NO;
}

- (void)setPassthroughParent:(BOOL)passthroughParent {
    [self setPropertyValue:[NSNumber numberWithBool:passthroughParent] forKey:@"passthroughParent"];
}

- (BOOL)passthroughPointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Allow subviews to receive press events. All other touches will get ignored
    if (self.passthroughParent) {
        if (self.alpha != 0 && !self.isHidden) {
            for (id foundView in self.subviews) {
                if ([foundView alpha] != 0 && ![foundView isHidden] && [foundView pointInside:[self convertPoint:point toView:foundView] withEvent:event])
                    return YES;
            }
        }
        return NO;
    } else {
        return [self passthroughPointInside:point withEvent:event]; // Swizzled
    }
}

@end
You will need to add my Swizz.h and Swizz.m
located Here
After that, you just Import the UIView+PassthroughParent.h in your {Project}-Prefix.pch file, and every view will have this ability.
Every view will accept touch points, but none of the blank space will.
I also recommend using a clear background.
myView.passthroughParent = YES;
myView.backgroundColor = [UIColor clearColor];
EDIT
I created my own property bag, and that was not included previously.
Header file
// NSObject+PropertyBag.h
#import <Foundation/Foundation.h>
@interface NSObject (PropertyBag)

- (id) propertyValueForKey:(NSString*) key;
- (void) setPropertyValue:(id) value forKey:(NSString*) key;

@end
Implementation File
// NSObject+PropertyBag.m
#import "NSObject+PropertyBag.h"
@implementation NSObject (PropertyBag)

+ (void) load {
    [self loadPropertyBag];
}

+ (void) loadPropertyBag {
    @autoreleasepool {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            Swizz([NSObject class], NSSelectorFromString(@"dealloc"), @selector(propertyBagDealloc));
        });
    }
}

__strong NSMutableDictionary *_propertyBagHolder; // Properties for every class will go in this property bag

- (id) propertyValueForKey:(NSString*) key {
    return [[self propertyBag] valueForKey:key];
}

- (void) setPropertyValue:(id) value forKey:(NSString*) key {
    [[self propertyBag] setValue:value forKey:key];
}

- (NSMutableDictionary*) propertyBag {
    if (_propertyBagHolder == nil) _propertyBagHolder = [[NSMutableDictionary alloc] initWithCapacity:100];
    NSMutableDictionary *propBag = [_propertyBagHolder valueForKey:[[NSString alloc] initWithFormat:@"%p", self]];
    if (propBag == nil) {
        propBag = [NSMutableDictionary dictionary];
        [self setPropertyBag:propBag];
    }
    return propBag;
}

- (void) setPropertyBag:(NSDictionary*) propertyBag {
    if (_propertyBagHolder == nil) _propertyBagHolder = [[NSMutableDictionary alloc] initWithCapacity:100];
    [_propertyBagHolder setValue:propertyBag forKey:[[NSString alloc] initWithFormat:@"%p", self]];
}

- (void)propertyBagDealloc {
    [self setPropertyBag:nil];
    [self propertyBagDealloc]; // Swizzled
}

@end
Try setting the backgroundColor of your transparent view to UIColor(white: 0.0, alpha: 0.02). Then you will get touch events in the touchesBegan/touchesMoved methods. Place the code below somewhere your view is initialized:
self.alpha = 1
self.backgroundColor = UIColor(white: 0.0, alpha: 0.02)
self.isMultipleTouchEnabled = true
self.isUserInteractionEnabled = true
Try this
class PassthroughToWindowView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        var view = super.hitTest(point, with: event)
        if view != self {
            return view
        }
        while !(view is PassthroughWindow) {
            view = view?.superview
        }
        return view
    }
}
I use that instead of overriding the method point(inside: CGPoint, with: UIEvent):
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    guard self.point(inside: point, with: event) else { return nil }
    return self
}
If you can't be bothered to use a category or subclass UIView, you could also just bring the button forward so that it is in front of the transparent view. This won't always be possible depending on your application, but it worked for me. You can always send the button back again or hide it.
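Something like this, assuming the button and the transparent overlay are siblings (the outlet names are made up):
// Put the button in front of the transparent overlay so it receives the touch...
[self.view bringSubviewToFront:self.actionButton];

// ...and later tuck it back underneath the overlay (or simply hide it).
[self.view insertSubview:self.actionButton belowSubview:self.transparentOverlay];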
It seems that UIView has no methods like "didRemoveFromSuperview" or "willRemoveFromSuperview". So how can I listen for the event when a UIView is removed from its superview? Should I use KVO? Thanks in advance!
This works (tested on iOS8):
-(void) didMoveToWindow {
    [super didMoveToWindow]; // (does nothing by default)
    if (self.window == nil) {
        // YOUR CODE FOR WHEN UIVIEW IS REMOVED
    }
}
According to the UIView docs:
The default implementation of this method does nothing. Subclasses can override it to perform additional actions whenever the window changes.
The window property may be nil... This occurs when the receiver has just been removed from its superview or when the receiver has just been added to a superview that is not attached to a window.
This topic is quite old, but I found a way to do it. Since a Google search wasn't helpful enough, here it is (taken from UIView's docs):
Observing View-Related Changes
– didAddSubview:
– willRemoveSubview:
– willMoveToSuperview:
– didMoveToSuperview
– willMoveToWindow:
– didMoveToWindow
- (void) willMoveToSuperview:(UIView *)newSuperview {
    if (newSuperview == nil) {
        // UIView was removed from superview
    } else {
        // UIView was added to superview
    }
}
You can subclass your UIView and post a notification from its overridden - (void)removeFromSuperview method.
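A small sketch of that approach (the class and notification names are made up):
static NSString * const MyViewDidRemoveFromSuperviewNotification = @"MyViewDidRemoveFromSuperviewNotification";

@interface NotifyingView : UIView
@end

@implementation NotifyingView

- (void)removeFromSuperview {
    // Announce the removal just before the view actually leaves its superview.
    [[NSNotificationCenter defaultCenter] postNotificationName:MyViewDidRemoveFromSuperviewNotification
                                                        object:self];
    [super removeFromSuperview];
}

@end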