I've found that the touchDown event is rather slow: it requires a firm, fairly long touch and does not respond to a light tap. Why is that?
touchesBegan, on the other hand, responds exactly when I need it to, i.e. even to very light, quick touches. But that's not an event; it's a method that has to be overridden.
The problem is that touchesBegan apparently requires me to either 1) subclass the label (I need to respond to touches on a label), or 2) analyze the event to figure out whether it came from the right label. I'm wondering whether this is a code smell and whether there shouldn't simply be an event for a plain touch.
Try adding a UITapGestureRecognizer to your label.
First of all, allow the label to handle user interaction:
label.isUserInteractionEnabled = true
Then assign the tap gesture to the label. In the handler method you can check the recognizer's state property; for a tap recognizer the handler fires once the tap has been recognized (.ended), which is the event you need.
The nice thing about this approach is that you can use one handler for all of your labels. Inside the handler, you can get the touched label like this:
let label = tapRecognizer.view as! UILabel
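A minimal sketch of the whole setup might look like this (the label and handler names are just placeholders):

let tap = UITapGestureRecognizer(target: self, action: #selector(handleLabelTap(_:)))
priceLabel.isUserInteractionEnabled = true
priceLabel.addGestureRecognizer(tap)

@objc func handleLabelTap(_ recognizer: UITapGestureRecognizer) {
    // The recognizer's view is the label it was attached to.
    guard let label = recognizer.view as? UILabel else { return }
    print("Tapped: \(label.text ?? "")")
}

Note that each label needs its own recognizer instance; a gesture recognizer can only be attached to one view at a time.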
"Code smell"? No, it's a user interface smell. A user interface stink.
If you make a button in your user interface behave differently from buttons in every other application, people will hate your app. Do what people are used to.
Related
I have a custom control to increment and decrement values. Now that I've added support for VoiceOver, I've stumbled upon a problem.
My customView has the accessibility trait .adjustable, and I've implemented the correct methods for increasing and decreasing the values.
However, a VoiceOver user can also double tap on that view to activate it. The problem is that this triggers a gesture which is irrelevant to VoiceOver users.
Is there a way to prevent an adjustable accessibility view from being activated so that the element is only adjustable, not double-tappable like a button?
There are two important pieces of API to know about when a double tap occurs:
accessibilityActivate.
accessibilityActivationPoint.
In your case, you could simply override accessibilityActivate and return true; if that's not enough, also provide a CGPoint coordinate that triggers nothing (this depends on your custom control and its surroundings).
Otherwise, use the accessibilityElementIsFocused instance method to know whether you can trigger actions, as this complete example shows.
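As a rough sketch, assuming a custom view class along these lines (the name is illustrative), swallowing the activation looks like this:

import UIKit

class AdjustableStepperView: UIView {
    // VoiceOver's double tap lands here; returning true marks it as handled,
    // so it never falls through to the control's own tap gesture.
    override func accessibilityActivate() -> Bool {
        return true
    }

    // The .adjustable trait routes swipe up/down to these two methods.
    override func accessibilityIncrement() {
        // increase the value here
    }

    override func accessibilityDecrement() {
        // decrease the value here
    }
}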
I ended up using UIAccessibility.isVoiceOverRunning to stop any tasks that would otherwise be triggered by a double tap on that specific element.
In our current UI, next to certain labels, we have a help-tip button that when clicked, explains the details of what the label references. As such, VoiceOver identifies these two items as separate accessibility items.
However, when using accessibility, we're hoping we can just do everything in the label itself. This way, when the label gets focused, the user will hear 'Account value, $20' (the accessibilityLabel), then 'double-tap for help' (the accessibilityHint).
However, unlike a button, a label doesn't have an action associated with it so I'm not sure how to wire up actually triggering the accessibility gesture indicating I want to do something.
Short of converting all of our labels over to buttons, is there any way to listen to the accessibility 'action' method on our labels?
My current work-around is to make only the help-tip buttons accessible, then move all the relevant information to their accessibility properties, but that seems like a code smell as it's easy for a developer to miss when updating the code.
In your UILabel subclass, override accessibilityActivate() and implement whatever double-tapping should do:
override func accessibilityActivate() -> Bool {
// do things...
return true
}
If the action can fail, return false in those instances.
Have you tried adding a UITapGestureRecognizer to the labels?
Something like:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse(_:)))
tapGesture.numberOfTapsRequired = 1
sampleLabel.isUserInteractionEnabled = true
sampleLabel.addGestureRecognizer(tapGesture)

@objc func tapResponse(_ recognizer: UITapGestureRecognizer) {
    print("tap")
}
OK, this was easier than I thought. To make a UILabel respond to accessibility actions similarly to how a button does, you simply attach a UITapGestureRecognizer. The accessibility framework uses it just like it would on any other UIView.
let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(labelTapped))
testLabel.isUserInteractionEnabled = true
testLabel.addGestureRecognizer(tapGestureRecognizer)
Once you do that, your label will respond to accessibility actions.
Group your label and your hint button into a single accessible element.
Once that's done, you can use:
The accessibilityActivationPoint property to define the hint button as the spot that is triggered when the double tap occurs.
The accessibilityActivate method to indicate the action to be performed when you double tap your new element.
Given your environment, I don't recommend implementing a custom action for such a simple use case; the two solutions above should do the job.
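A rough sketch of that grouping, assuming a container view that owns both the label and the help button (all names are illustrative):

import UIKit

class LabeledHelpView: UIView {
    let valueLabel = UILabel()
    let helpButton = UIButton(type: .infoLight)

    func configureAccessibility() {
        // Collapse the pair into one accessible element.
        isAccessibilityElement = true
        accessibilityLabel = valueLabel.text        // e.g. "Account value, $20"
        accessibilityHint = "Double tap for help."
        valueLabel.isAccessibilityElement = false
        helpButton.isAccessibilityElement = false
    }

    override func accessibilityActivate() -> Bool {
        // Route the VoiceOver double tap to whatever the help button normally does.
        helpButton.sendActions(for: .touchUpInside)
        return true
    }
}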
Absolutely! You can do this by using UIAccessibilityCustomActions on the accessibility element rather than tap gesture recognizers. Accessibility input works differently from normal touch handling: a single tap while VoiceOver focus is on an element will not give you the result it would for a sighted user, nor does it let you offer multiple actions on the same accessibility element.
At a recent WWDC, Apple put out an excellent video explaining how to add UIAccessibilityCustomActions to any kind of accessibility element. If you start the video 33 minutes in, you will see how this is implemented.
Once in place, your VoiceOver users will be able to scroll through the options and select the one that suits their intention, making multiple actions accessible from the same UILabel.
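A minimal sketch of wiring up such a custom action on a label (the action name and selector are placeholders):

let helpAction = UIAccessibilityCustomAction(name: "Show help",
                                             target: self,
                                             selector: #selector(showHelp))
accountLabel.isAccessibilityElement = true
accountLabel.accessibilityCustomActions = [helpAction]

// VoiceOver users pick "Show help" from the actions listed on the label.
@objc func showHelp() -> Bool {
    // present the help-tip content here
    return true
}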
I'm rather confident that editable UITextViews become first responder when a long press or tap gesture occurs within the scroll view. I want to identify where in the view this touch occurred. Digging through the documentation and source code didn't yield much. I might be going about this wrong. My concern is a race condition if I just add my own tap recognizer (how can I be sure it is called before the textView's delegate methods?).
To clarify the practical goal: from a delegate method (editingDidBegin) I want to call one of two similar functions, depending on whether the touch landed in the left or the right half of the text view.
Is there a way to begin a UIPanGestureRecognizer's gesture if the finger is already pressed down at the time the object is instantiated?
I have a situation where, when a user holds their finger on the screen, I create a UIView under their finger.
I want them to be able to drag that around, and so I have attached a UIPanGestureRecognizer to the UIView.
The problem is that I need to lift my finger and put it back down to get the UIPanGestureRecognizer to start; I need it to start from the already-pressed state.
Do you know how I can activate a UIPanGestureRecognizer from an already-pressed state, i.e. can I get the touch event that's already active at the time of instantiation and pass it along?
You can do it, but the UIPanGestureRecognizer will need to exist already on the view behind the view you create (and you will then have to adjust your calculations based on this; not difficult).
The reason is that, under the circumstances you describe, the touch does not belong to the UIView you create - it belongs to the UIView behind it, the one that the user was originally touching. And given the nature of iOS touch delivery, you can't readily change that. So it will be simpler to let that view, the actual original touch view, do the processing of this touch.
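One way to sketch that arrangement, with the pan recognizer living on the background view the finger is already touching (names are illustrative, and the view creation is folded into the pan's .began state for brevity):

var draggedView: UIView?

// Attached to the background view, e.g.:
// view.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    let location = recognizer.location(in: view)
    switch recognizer.state {
    case .began:
        // Create the view under the finger; the touch still belongs to the background view.
        let newView = UIView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        newView.center = location
        view.addSubview(newView)
        draggedView = newView
    case .changed:
        draggedView?.center = location
    default:
        draggedView = nil
    }
}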
I think Matt's solution is best so I am going to mark it as correct.
However, my code structure wasn't going to let me implement it cleanly. Compounding the issue, the listening object was listening with a UILongPressGestureRecognizer.
So my solution was as follows (sketched in code after the list):
Create a callback in my ViewController that would handle the longGestureOverride call
Add a callback to the object listening for the longGesture that would call the longGestureOverride callback and pass along the point
Manually move the object based on the point passed back
If the user lifts their finger, I disable the longGestureOverride callback and begin using the UIPanGestureRecognizer inside the new object
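Very roughly, and with all names being placeholders, the shape of it was:

// On the object that already owns the UILongPressGestureRecognizer:
var longGestureOverride: ((CGPoint) -> Void)?

@objc func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
    let point = recognizer.location(in: recognizer.view)
    switch recognizer.state {
    case .began, .changed:
        // While the override is set, the view controller moves the new view itself.
        longGestureOverride?(point)
    default:
        // Finger lifted: stop overriding and let the new view's own
        // UIPanGestureRecognizer take over on the next touch.
        longGestureOverride = nil
    }
}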
I'm trying to handle touch events with touchesBegan in an overlay to a parent UIView, but also allow the touch input to pass through to the sibling UIViews underneath. I expected there would be some straightforward way to process a touch event and then say "now send it to the next responder as if this one didn't exist," but all I can find is the nextResponder method, which appears to give back the parent of my overlay view. That parent then doesn't really pass the touch on to the next sibling of the overlay view, so I'm stuck, uncertain how to do what seems like a simple task: the kind usually accomplished with a touch callback whose true or false return value tells the framework whether to keep processing down the widget hierarchy.
Am I missing something obvious?
Late answer, but I think you would be better off overriding hitTest:withEvent: instead of touchesBegan. It seems to me that touchesBegan is a pretty "high-level" method that is there to do one simple thing, so at that level you cannot alter whether the event is propagated further. The right place to do that is hitTest:withEvent:.
Also have a look at this S.O. answer for more details about this point.
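As a sketch of the hitTest approach, here is the common pass-through variant, where the overlay opts out of receiving touches entirely so they fall through to the siblings beneath it (the class name is illustrative):

import UIKit

class PassThroughOverlay: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let hit = super.hitTest(point, with: event)
        // Returning nil when the overlay itself would be hit tells the window
        // to keep looking, so the sibling underneath receives the touch instead.
        return hit == self ? nil : hit
    }
}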
I understand the desired behavior you're looking for Joey - I haven't found something in the API that supports this automatic messaging-up-the-chain behavior with sibling views.
What I originally wrote below was with respect to just informing a parent UIView about a touch. This still applies, but I believe you need to take it a step further and have the parent UIView use the hit-testing technique that Sergio described on each of its subviews that are siblings of the overlay, and have the parent UIView manually invoke a "do something" method on each of its subviews that pass the hit test. Each of those sibling views can return a BOOL value indicating whether to stop informing other siblings or continue down the chain.
If you find yourself using this pattern a lot, consider adding a category method on UIView that encapsulates the hit testing and asking views to perform a selector.
My Original Answer
With a little bit of manual work, you can wire this together yourself. I've had to do this, and it worked for me, because I had an oft-repeated use case (an overlay view on a button), where it made sense to create some custom classes. If your situation is similar, one of these techniques will suffice.
Option 1:
If the overlay doesn't need to do anything but look pretty, have it opt out of touch handling completely with userInteractionEnabled = NO. This will make the touch event go to its parent UIView (the one it is an overlay to).
Option 2:
Have the overlay absorb the touch event (as it would by default), and then invoke a method on the parent UIView indicating that a touch or certain gesture was recognized, and here's what it is. This way, the UIView behind the overlay still gets to act on the touch recognition, even if someone else did the interception.
Option 2 is more of a fit for simple UIControlEvent types, like UIControlEventTouchDown and UIControlEventTouchUpInside. In my case (a custom UIButton subclass with a custom overlay view on top of it), I wire the button's touch down and touch up events to two separate methods. These fire if a touch down or touch up inside event occurs on the button itself. But they are also hooks I can invoke from the overlay view if I need to simulate that a button press occurred.
Depending on your needs, you could have a known protocol between the overlay and its parent UIView, or just have the overlay test the UIView informally, with a respondsToSelector: check before invoking performSelector: on it with the custom method you want called, the one that would have fired automatically if the UIView weren't covered by an overlay.
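The answer above is phrased in Objective-C terms; a Swift sketch of the same informal check might look like this (the selector name stands in for whatever method the covered view actually exposes):

import UIKit

class OverlayView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        // Absorb the touch, then offer it to the view behind us if it cares.
        let action = NSSelectorFromString("overlayDidReceiveTouchDown")
        if let host = superview, host.responds(to: action) {
            host.perform(action)
        }
    }
}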