UIImageView: add tap event from Interface Builder

I know how to programmatically add a gesture recognizer. I was wondering why, when I Ctrl-drag from a UIImageView in Interface Builder into the code, I'm only given the possibility to link Outlets and Outlet Collections but not Actions. I have enabled user interaction on the image view, so I would like to know if it's possible to access its actions in a "drag to add" way.

Setting user interaction enabled only allows the image view to work with gesture recognizers. Image views expose no IBActions because they don't handle touches on their own. You would have to drag a gesture recognizer onto the image view to get it to work.
From the UIImageView class ref:
Image views ignore user events by default. Normally, you use image views only to present visual content in your interface. If you want an image view to handle user interactions as well, change the value of its userInteractionEnabled property to true. After doing that, you can attach gesture recognizers or use any other event handling techniques to respond to touch events or other user-initiated events.
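For reference, here is a minimal Swift sketch of that recognizer-based approach, written inside a view controller (the imageView outlet and handler name are illustrative):

// e.g. in viewDidLoad:
imageView.isUserInteractionEnabled = true  // image views ignore touches by default
let tap = UITapGestureRecognizer(target: self, action: #selector(imageTapped(_:)))
imageView.addGestureRecognizer(tap)

@objc func imageTapped(_ recognizer: UITapGestureRecognizer) {
    // Respond to the tap here
}

The same wiring can be done entirely in Interface Builder: drop a Tap Gesture Recognizer onto the image view, then Ctrl-drag from the recognizer (not the image view) to an @IBAction in your code.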

Related

iOS - change animation when user taps screen

Let's say that I have an animation: an image is going from the left side of the screen to the right. I would like to make it a little bit interactive: when the user taps the screen, I want to change the direction of the image's movement. What's the best approach to implement it?
What I do in some cases is take the main view of the View Controller, in Storyboard, and change the class type of that UIView to UIControl.
In code, that view is accessed as myViewController.view, so you can write:
let viewAsControl = myViewController.view as! UIControl
in Swift, or some equivalent of that.
The UIControl subclass of UIView is the hierarchical layer (class) that adds the action/target facilities to a view. For example, UIButton is a UIControl, because it generates events (actions), and it is also a UIView so it can be added as a subview.
Then from the Connections Inspector, accessed via the far-right icon of the utilities panel (the panel to the right of the storyboard editor), I'd select the Touch Up Inside event (or some other event) and drag it to an @IBAction-tagged function added to the View Controller's source code to receive the tap event. From that tap notification you can cancel the current animation, add a new one, and so on.
Alternatively, you can create an IBOutlet for the view if you've turned it into a UIControl in IB, and use the addTarget() method to assign an action handler for a specific event, e.g. to make it call a function in your code.
Either way, the effect will be that any time the view is tapped, it will generate an event for you to respond to.
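A minimal sketch of the addTarget() alternative, assuming the view's class was changed to UIControl in IB (the outlet and handler names are illustrative):

@IBOutlet weak var tapControl: UIControl!   // the storyboard view, retyped as UIControl

override func viewDidLoad() {
    super.viewDidLoad()
    // Equivalent to dragging Touch Up Inside to an @IBAction in IB
    tapControl.addTarget(self, action: #selector(viewTapped), for: .touchUpInside)
}

@objc func viewTapped() {
    // Cancel the current animation and start one in the opposite direction
}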

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that, I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way that I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, as it now has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to make the subviews of the drawer disabled while panning. Essentially, loop through the drawer's subviews and disable interaction on them.
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
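In Swift terms, that state-based toggle might look like this minimal sketch inside the drawer view (handleDrawerPan and contentView are illustrative names, assuming the pan recognizer is attached to the drawer):

@objc func handleDrawerPan(_ recognizer: UIPanGestureRecognizer) {
    switch recognizer.state {
    case .began, .changed:
        contentView.isUserInteractionEnabled = false  // block the half-visible map
    case .ended, .cancelled, .failed:
        contentView.isUserInteractionEnabled = true
    default:
        break
    }
    // ... drive the drawer's position from recognizer.translation(in: self) here
}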

What methods to call for handling events for UIObject?

Is there a pattern Apple follows for which methods to call when handling an event for a UI object?
What I mean is, to set an action for a UIButton it's [button addTarget:action:forControlEvents:],
and for a UIImageView I have to use [imageView addGestureRecognizer:].
I can never remember what methods to call. Is there an easy way to remember?
The "pattern" is that it is normal behavior for a button to respond to control events while it is not very common for an image view to respond to user interaction. Normally if you want to have a tappable image, you would use a UIButton and set an image on it. Apple decided to write the control events into UIButtons but not normal UIViews or UIImageViews.
So basically, you can use a control event if it is a button, otherwise you must use a different method. For normal views, gesture recognizers are a good option.
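Side by side, the two patterns look like this in Swift (the handler names are illustrative):

// UIControl subclasses (UIButton, UIStepper, ...) use target/action for control events:
button.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)

// Plain views (UIView, UIImageView, ...) need a gesture recognizer instead:
imageView.isUserInteractionEnabled = true
imageView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(imageTapped)))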

How to get stepper and longpress to coexist?

I tried setting up a view with a longpress gesture and a stepper configured for continuous updates. With the longpress, the continuous feature of the stepper does not occur. For now, I've disabled the longpress. I guess I don't need it. But for future reference, how would I allow for both to coexist?
Just to be clear, here is the way the screen was set up when I tried this.
App was set up with a simple view controller.
A subview was added to this view (could have been a controller, but I just made it a UIView).
Several labels and steppers were added to this subview.
The steppers were wired up as outlets and actions.
A longpress recognizer was added to the main view in IB.
For completeness, a tap gesture was also added to the main view in IB.
Taps on the main view function as expected. Taps on the steppers function as expected. Longpress on the main view functions as expected. Longpress on the stepper does not.
I modified the code called by the longpress to check for the frame of the subview and not act if the touch location was within that rectangle, but that didn't make a difference. I did not try getting the longpress to fail in that situation, but I suppose I'll try that next. EDIT: OK, maybe not. There doesn't seem to be an API for that. However, there is this kludge, which I'm not going to try.
Attached is a screen shot from profiler with an inverted call tree so you can see what each item is being called by.
darkStepped: is the IBAction that is called by the stepper. If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
The stack trace reveals that the stepper's _updateCount method is dispatched through a timer.
This could be related to the fact that a stepper has an autorepeat mode where, as long as you keep it pressed, it will update at a given (varying) rate. So, instead of simply calling _updateCount, the stepper sets up a timer to handle this behaviour.
For whatever reason the timer is used, the timer explains why you do not see the gesture recogniser in the stack trace.
In your case, what happens is that the stepper gets the touches, handles them, and does not forward them to any gesture recognisers attached to it.
This can be explained as follows, although this snippet does not explicitly mention a long press recogniser in relation to a UIStepper control:
According to Apple Docs:
Interacting with Other User Interface Controls
In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button’s parent view, and the user taps the button, then the button’s action method receives the touch event instead of the gesture recognizer. This applies only to gesture recognition that overlaps the default action for a control, which includes:
A single finger single tap on a UIButton, UISwitch, UIStepper, UISegmentedControl, and UIPageControl.
...
If you have a custom subclass of one of these controls and you want to change the default action, attach a gesture recognizer directly to the control instead of to the parent view. Then, the gesture recognizer receives the touch event first. As always, be sure to read the iOS Human Interface Guidelines to ensure that your app offers an intuitive user experience, especially when overriding the default behavior of a standard control.
So, it seems you can attach the gesture recogniser directly to the control (possibly you need to subclass UIStepper for this to work, I am not really sure how to interpret the last paragraph). Hopefully this will not disable the basic workings of the stepper (but maybe it will).
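For example, a two-line sketch of that direct attachment (stepper and handleLongPress(_:) are illustrative names; whether a stock UIStepper tolerates this without subclassing is the open question above):

let longPress = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress(_:)))
stepper.addGestureRecognizer(longPress)  // attach to the control itself, not its parent view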
After carefully reviewing Apple's docs again, I've found the solution. I added the view controller as the delegate of the longpress gesture recognizer:
self.longPress.delegate = self;
(and, of course, added <UIGestureRecognizerDelegate> to the interface), and then added this method to the view controller:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Determine if the touch is inside the custom subview
    if (gestureRecognizer == self.longPress) {
        CGPoint touchLocation = [touch locationInView:self.view];
        if (CGRectContainsPoint(self.antControl.frame, touchLocation)) {
            return NO;
        }
    }
    return YES;
}
This way the gesture recognizer doesn't even get called when the longpress occurs within the frame of self.antControl, which is the subview mentioned in the question.

How to react to a "touch up inside" event in UIAccessibilityElement subclass?

I have a map with drawn items. The map handles touch events and determines the touched items as if they were buttons.
I made the map a container and implemented the UIAccessibilityContainer methods to return accessibility elements. For each item I create one instance of a UIAccessibilityElement subclass.
It seems UIAccessibilityAction protocol has no callback for a "tap" or "button pressed" event.
How would I mimic the effect of a UIButton with UIAccessibilityElement then?
Assuming you are running under iOS 5 or iOS 6, consider the following workaround. It is not optimal, but will work until there is a better way:
Create a dummy view that is not, itself, an accessibility element.
On this view, implement your "tap" handling in -touchesEnded:withEvent:.
Set your accessibility element's accessibilityActivationPoint to a value that falls within this dummy view.
Your dummy view will receive touch events when the corresponding accessibility element is activated. Make sure to ignore touch handling in your dummy view if VoiceOver and other assistive technologies are not running.
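A rough Swift sketch of that workaround (DummyTapView, mapView, and dummyView are illustrative names, assuming the dummy view is already laid out somewhere in the hierarchy):

class DummyTapView: UIView {
    var onActivate: (() -> Void)?

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        // Only treat this as an activation when an assistive technology is running
        guard UIAccessibility.isVoiceOverRunning else { return }
        onActivate?()
    }
}

// When building the accessibility element for a map item:
let element = UIAccessibilityElement(accessibilityContainer: mapView)
element.accessibilityTraits = .button
// Route activations at the dummy view, whose touch handling fires on activate
let screenFrame = UIAccessibility.convertToScreenCoordinates(dummyView.bounds, in: dummyView)
element.accessibilityActivationPoint = CGPoint(x: screenFrame.midX, y: screenFrame.midY)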
EDIT: Another less hacky approach is to implement a tap gesture recognizer on the view you're concerned with, convert the tap's coordinate to screen coordinates, and manually hit-test the point against the frames of your accessibility elements.
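A sketch of that alternative inside the container view (activateItem(for:) is a hypothetical handler for the drawn item):

@objc func handleMapTap(_ recognizer: UITapGestureRecognizer) {
    // location(in: nil) gives window coordinates; for a full-screen window these
    // match the screen coordinates used by accessibilityFrame
    let point = recognizer.location(in: nil)
    for case let element as UIAccessibilityElement in accessibilityElements ?? [] {
        if element.accessibilityFrame.contains(point) {
            activateItem(for: element)  // hypothetical: trigger the item's action
            break
        }
    }
}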
