How do I bringToFront a subview when I touch it? - ios

I have a rotating carousel menu made up of 6 UIViews that were added as subviews to self.view. When you rotate the carousel, some subviews end up partially behind the subview closest to the user. The problem is that the subview closest to the user may not have been added after the one behind it, so when I touch it, the one behind it gets triggered.
My question is: is there a way to programmatically bring a subview to the front whenever it is touched, so that it doesn't matter whether it was added to the view first or last?

When you have two views responding to touch events and one is in front of the other, the one behind will never receive the touch.
Instead of messing around with the event chain (i.e. subclassing UIView and overriding hitTest:withEvent:…), I'd suggest you reorder the views while spinning the carousel.
The view which appears to be in front should be the topmost view in the view hierarchy.
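If it helps, here is a minimal Objective-C sketch of that reordering, not code from the question: it assumes the carousel controller keeps its six item views in an itemViews array and can report each item's current rotation angle through a hypothetical angleForItem: helper, and that it calls this method after every rotation step.

- (void)updateCarouselZOrder
{
    NSArray<UIView *> *backToFront = [self.itemViews sortedArrayUsingComparator:
        ^NSComparisonResult(UIView *a, UIView *b) {
            // cos(angle) grows as an item rotates toward the viewer, so sorting
            // ascending puts the farthest item first and the nearest one last.
            double da = cos([self angleForItem:a]);
            double db = cos([self angleForItem:b]);
            if (da < db) return NSOrderedAscending;
            if (da > db) return NSOrderedDescending;
            return NSOrderedSame;
        }];

    // Bringing the views to the front in back-to-front order leaves the
    // nearest item on top, so it is the one that receives the touch.
    for (UIView *item in backToFront) {
        [self.view bringSubviewToFront:item];
    }
}

Because the loop promotes the views in back-to-front order, the view facing the user always ends up topmost, so hit-testing resolves to it without any per-touch bringSubviewToFront: calls.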

Related

Make a UIView receive taps but pass swipes to the underlying view

I've got a UITableView and a big "button" view in front of it. The "button" view, which has transparent areas, should be able to respond to a tap. But enabling user interaction for this view blocks any scrolling touches from reaching the table view underneath it.
The upper view is a UIView (not UIButton). Given how the two views work together, the upper view is essentially part of what's going on with the table view and reacts to the table view being scrolled. But scrolling is the main thing and I'd like the user to have the largest scrolling area possible.
How do I best resolve this conflict so that the table view is scrollable as usual?
I guess you could subclass the common superview of your "button" view and the UITableView and override its hitTest:withEvent: to check which view was actually hit, for example based on whether the touch landed on a clear or an opaque part of the button.
As pbush25 mentions, however, this goes more or less against Apple's recommendations.
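If you do go that route, a rough sketch could look like the following. Everything here is a placeholder rather than the poster's code: OverlayPassthroughView stands for the common superview, the buttonOverlay / tableView outlets for the two views in question, and the opacity check is deliberately naive.

#import <UIKit/UIKit.h>

@interface OverlayPassthroughView : UIView
@property (nonatomic, weak) IBOutlet UIView *buttonOverlay;   // the tappable "button" view
@property (nonatomic, weak) IBOutlet UITableView *tableView;  // the scrollable table underneath
@end

@implementation OverlayPassthroughView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];

    // If the touch landed on a transparent part of the overlay, hand it to the
    // table view instead so scrolling keeps working.
    if (hit == self.buttonOverlay &&
        ![self isPoint:point onOpaquePartOf:self.buttonOverlay]) {
        CGPoint tablePoint = [self convertPoint:point toView:self.tableView];
        return [self.tableView hitTest:tablePoint withEvent:event];
    }
    return hit;
}

// Placeholder check: treat the point as "opaque" when it falls on one of the
// overlay's visible subviews; swap in whatever notion of opacity fits your view.
- (BOOL)isPoint:(CGPoint)point onOpaquePartOf:(UIView *)overlay
{
    CGPoint overlayPoint = [self convertPoint:point toView:overlay];
    for (UIView *sub in overlay.subviews) {
        if (!sub.hidden && CGRectContainsPoint(sub.frame, overlayPoint)) {
            return YES;
        }
    }
    return NO;
}

@end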

Opaque UIView not letting me scroll UIView behind it

I have a view hierarchy that looks like this:
buttonsView <-- UIView with 1-3 small buttons
MKMapView <-- bottom-most view
When my buttonsView is shown I still want the user to be able to scroll the MKMapView if the user is NOT touching any of the buttons.
I have tried different combinations of userInteractionEnabled = NO but nothing helps.
You have several ways to solve this:
The top view with the three small buttons can be made much smaller, so that it only covers the area the three buttons actually need. That way the top view won't cover the map view, and you can still scroll the map around.
Implement your own hitTest:withEvent: / pointInside:withEvent: overrides to let the top view decide whether it wants to catch an event (when you tap one of its buttons) or let it pass through to the map view underneath (when the user taps anywhere else). See for example here for possible ways to do it: Allowing interaction with a UIView under another UIView
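A minimal sketch of that second option, assuming the small buttons are direct subviews of the buttons container (the class name is just an example):

#import <UIKit/UIKit.h>

// Returning NO from pointInside: makes this container invisible to touches that
// don't land on one of its buttons, so those touches reach the MKMapView below.
@interface ButtonsPassthroughView : UIView
@end

@implementation ButtonsPassthroughView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    for (UIView *button in self.subviews) {
        if (!button.hidden && button.userInteractionEnabled &&
            CGRectContainsPoint(button.frame, point)) {
            return YES;   // the touch is on a button, keep it
        }
    }
    return NO;            // anything else falls through to the map
}

@end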
The view in front takes all the interactions, no matter whether it is transparent or not, so you would have to pass the gestures through to the MKMapView at the bottom.
Another approach is to resize the view with the buttons so that it does not cover the whole MKMapView, only the part with the buttons. That way the user will be able to scroll in the area where there are no buttons.

Gesture Recognizer on sub-view outside view's bounds

I'm not sure if this is possible, but I have a view that can be dragged around the screen via pan gestures. Once the view is selected, little grippers appear on its corners that let the user resize the view. The problem is that those grippers extend outside the bounds of the view (they still show up because clipsToBounds is off), but the gesture recognizers on those grippers don't fire when you touch the part of them that is drawn outside the view. Making the view bigger so it actually contains the grippers would break a lot of existing logic based on the view's size, so that is a last resort for me.
Is there any other way to get gesture recognizers to work on views that are drawn outside of their parent view?
You could try overriding hitTest:withEvent: in the parent view's UIView subclass and returning the gripper view when the touch lands on it.
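Something along these lines, as a sketch inside the draggable view's subclass; gripperViews is an assumed property holding the corner grippers:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Give the grippers a chance first, even when the point lies outside this
    // view's own bounds (where the default implementation would return nil).
    for (UIView *gripper in self.gripperViews) {
        CGPoint gripperPoint = [self convertPoint:point toView:gripper];
        if ([gripper pointInside:gripperPoint withEvent:event]) {
            return gripper;   // gesture recognizers on the gripper now fire
        }
    }
    return [super hitTest:point withEvent:event];
}

This works because the superview still hit-tests the draggable view as long as the touch falls inside the superview's own bounds, which is the case here since the grippers are drawn on screen.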

Scroll View doesn't scroll when touching and holding then swiping

I have a UIScrollView with other UIView elements inside, mostly segmented controls. If I touch a UISegmentedControl, hold for a second and then try to scroll, no scrolling happens. I can only scroll when my finger touches and swipes immediately. I checked other iOS applications such as Mail: you touch and hold on a mail and it gets highlighted, but as soon as the finger moves, scrolling happens and the highlight is undone. How can I implement this behavior?
The issue was a property of the UIScrollView: canCancelContentTouches was set to NO. Because of that, touch events were handled by the scroll view's subviews and swiping didn't cause scrolling.
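For reference, a minimal sketch of that fix. The subclass is only needed because, by default, a scroll view will not cancel touches that a UIControl such as a UISegmentedControl has already begun handling:

#import <UIKit/UIKit.h>

@interface CancellingScrollView : UIScrollView
@end

@implementation CancellingScrollView

// Allow a touch that started on a segmented control to be cancelled once the
// finger starts to drag, so the drag turns into a scroll (the Mail behaviour
// described in the question).
- (BOOL)touchesShouldCancelInContentView:(UIView *)view
{
    if ([view isKindOfClass:[UIControl class]]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end

// Elsewhere, make sure the property mentioned above is back to its default:
//   scrollView.canCancelContentTouches = YES;
//   scrollView.delaysContentTouches = YES;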
You can follow one of these steps:
If you are using a UISegmentedControl over your UIScrollView, add the UISegmentedControl to your controller's view instead.
If you want to keep the UISegmentedControl over your scroll view, then you have to create a custom scroll view by subclassing UIScrollView and use an image view with labels acting as the segments instead of a UISegmentedControl. This is because the UISegmentedControl is itself a touch handler and interrupts the responder chain, so scrolling can run into issues during its touch events.
Please let me know if any of these works. Thanks :)

Recognize a touch for a UIView that is only created at the moment that touch occurs

I created a "slide view" (a UIView subclass) which animates on screen by dragging it up. The animation and everything else related to the animation works perfectly fine. This question targets only the very first touch on the screen when the slide view itself will be initialized:
The slide view itself uses the UIPanGestureRecognizerto recognize touches. The thing is, my slide view will be initialized only at the time when the user touches down a UIButton. Parts of the slide view are initially locates on that button, so that when the user touches that button, the touch is also located inside the slide view's frame.
I only want to create the view at the time the touch occurs, because the view is pretty heavy. I don't want to waste resources cause often the button is not even used.
How can I make the slide view recognize that first touch, the same touch that initializes the slide view and adds it as a subview to its superview?
You can check this out for more details:
Gestures
You can add both a pan gesture recognizer and a tap gesture recognizer. It will work because a tap is not the first phase of a pan gesture, so there is no need to wait for the tap gesture to fail.
In short, you can add both gestures and handle them separately.
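One way to read that advice, sketched below: attach both recognizers to the button, which already exists, and build the heavy slide view lazily the first time either gesture begins. SlideView, slideButton, slideView and the handler names are placeholders, not APIs from the question.

#import <UIKit/UIKit.h>
#import "SlideView.h"   // placeholder for the heavy slide view class

@interface MenuViewController : UIViewController
@property (nonatomic, weak) IBOutlet UIButton *slideButton;  // the trigger button
@property (nonatomic, strong) SlideView *slideView;          // created lazily
@end

@implementation MenuViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleSlidePan:)];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleSlideTap:)];
    [self.slideButton addGestureRecognizer:pan];
    [self.slideButton addGestureRecognizer:tap];
}

- (void)handleSlidePan:(UIPanGestureRecognizer *)pan
{
    if (pan.state == UIGestureRecognizerStateBegan) {
        [self loadSlideViewIfNeeded];   // heavy view is built on first use
    }
    // Drive the slide animation from the pan translation.
    CGPoint translation = [pan translationInView:self.view];
    self.slideView.transform = CGAffineTransformMakeTranslation(0, translation.y);
}

- (void)handleSlideTap:(UITapGestureRecognizer *)tap
{
    [self loadSlideViewIfNeeded];
    // e.g. animate the slide view fully open from here
}

- (void)loadSlideViewIfNeeded
{
    if (self.slideView == nil) {
        self.slideView = [[SlideView alloc] initWithFrame:self.view.bounds];
        [self.view addSubview:self.slideView];
    }
}

@end

Because the recognizers live on the button rather than on the slide view, the very first touch is already handled by something that exists, and the expensive view is only created once a gesture actually starts.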
