I have implemented a vertical slider in one of my apps by subclassing UISlider. When scrolling ends, I send commands to the server with the slider's value. Sometimes, when I scroll up/down quickly and release, the slider value read before sending the command does not match the value read after sending it.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Value of slider before sending command=%f", self.value);
    [self sendCommand]; // Here the value is something else
    [super touchesEnded:touches withEvent:event];
    NSLog(@"Slider value after sending command=%f", self.value); // Here the value has changed
}
But if I place the super call before sending the command, everything works fine. Can anyone explain why this happens?
[super touchesEnded:touches withEvent:event];
An even more interesting fact: if I don't call super at all, everything also works well.
This is because [super touchesEnded:touches withEvent:event]; is where the slider updates its value based on the interaction. So the value is stale until super has been called.
A call to super should usually be the first line of an overridden method.
Even though the name touchesEnded: implies that the slider is done updating, it still might update the value depending on how the user lifted up their finger. This makes sense if you think about a user sliding quickly and then lifting up their finger—they probably expect it to go slightly farther in that direction than their finger actually went.
UIKit calls this method when a finger or Apple Pencil is no longer touching the screen. Many UIKit classes override this method and use it to clean up state involved in the handling of the corresponding touch events. The default implementation of this method forwards the message up the responder chain. When creating your own subclasses, call super to forward any events that you do not handle yourself. For example, [super touchesEnded:touches withEvent:event];

If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, even if your implementations do nothing.
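Putting this together, a minimal sketch of the fixed override: call super first so self.value is final before the command is sent (sendCommand is the asker's own method, not part of UIKit):

```objectivec
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Let UISlider finish updating its value for this touch sequence first.
    [super touchesEnded:touches withEvent:event];

    // Now self.value reflects the final position, including any momentum
    // adjustment UIKit applies when the finger lifts.
    NSLog(@"Slider value at end of touch=%f", self.value);
    [self sendCommand]; // asker's own method
}
```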
Related
Recently I put a breakpoint in a UIView's
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
}
method and checked whether the debugger stops there when a user taps on the UIView while VoiceOver is on, but it never hit the breakpoint. Does anyone know what gets called instead, and how the touch can be intercepted?
The standard hitTest mechanism is not used when VoiceOver is on. Instead, UIView has an _accessibilityHitTest:withEvent: method, but unlike macOS, it is private and can't easily be overridden or called.
Similar to hitTest, _accessibilityHitTest uses _accessibilityPointInside:withEvent:, which, in turn, calls pointInside:withEvent: (which is public).
First of all, note that users must double-tap to "activate" or "tap" a view when VoiceOver is enabled. If you still aren't hitting hitTest:…, then break on accessibilityActivate(). This is the default accessibility action triggered by a double-tap. You may also be interested in the activationPoint, which is the default location of the simulated touch VoiceOver emits upon activation. Note that the activation point isn't relevant to all VoiceOver interactions (e.g., adjustable controls).
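As a concrete starting point, a minimal sketch of intercepting the VoiceOver double-tap by overriding accessibilityActivate in a subclass (the class name MyAccessibleView is hypothetical):

```objectivec
@interface MyAccessibleView : UIView
@end

@implementation MyAccessibleView

// Called by VoiceOver when the user double-taps while this element is focused.
// Return YES to indicate the activation was handled (VoiceOver then skips the
// default simulated touch at the activationPoint).
- (BOOL)accessibilityActivate {
    NSLog(@"Activated via VoiceOver double-tap");
    return YES;
}

@end
```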
The hit-test view is given the first opportunity to handle a touch event. If the hit-test view cannot handle an event, the event travels up that view’s chain of responders as described in “The Responder Chain Is Made Up of Responder Objects” until the system finds an object that can handle it. Please look at this.
I have a UIButton in a view, and over that button I have a UILabel (TTTAttributedLabel, https://github.com/mattt/TTTAttributedLabel). (Though this is not a question specific to the TTTAttributedLabel custom class.)
This UILabel performs some actions on touch, and if it doesn't like the touch coordinates it forwards them to super in the corresponding methods:
[super touchesBegan:touches withEvent:event];
[super touchesMoved:touches withEvent:event];
[super touchesEnded:touches withEvent:event];
[super touchesCancelled:touches withEvent:event];
I see that the button below is being pressed in that case. I also see that all of those methods on UIButton are getting called perfectly. I also log the UITouch objects to verify they match.
The problem is that the action set on the UIButton is not getting called (not even for UIControlEventTouchDown). What triggers those actions? I thought a matching touchesBegan and touchesEnded should have called the selector added as the target action for that button.
Touches trigger the control events, and I would say your button does not receive touches simply because there is a label over the button.
There are two ways you could get around this. The first, and probably the preferred way, is to use the attributedTitle property of UIButton (via setAttributedTitle:forState:) instead of layering an additional label where it isn't necessary:
https://developer.apple.com/library/ios/documentation/uikit/reference/UIButton_Class/UIButton/UIButton.html#//apple_ref/occ/instm/UIButton/attributedTitleForState:
The other way is to delve deeper into UIControl, possibly subclass it, and handle the touch events at a finer granularity, although this is far more complex than the solution above:
https://developer.apple.com/library/ios/documentation/uikit/reference/uicontrol_class/reference/reference.html
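A quick sketch of the first suggestion, assuming `button` is an existing UIButton: set a styled title directly on the button rather than overlaying a separate label, so the button itself receives the touches.

```objectivec
// Build a styled title (attributes here are just an example).
NSAttributedString *title =
    [[NSAttributedString alloc] initWithString:@"Tap me"
                                    attributes:@{ NSForegroundColorAttributeName : [UIColor redColor] }];

// Apply it to the button; no extra label needed, so nothing blocks the touches.
[button setAttributedTitle:title forState:UIControlStateNormal];
```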
I have a UIView subclass where I handle touches using touchesBegan, touchesMoved, touchesEnded.
I noticed that when I start a touch inside the UIView and then drag outside it, touchesEnded is not triggered. Is there a way to have it called when I'm dragging outside the UIView's frame?
Thanks
Use touchesCancelled instead.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Clean up any state established in touchesBegan here.
}
Possibly it is calling - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event.
Please check the UIResponder class reference:
touchesCancelled:withEvent:
Sent to the receiver when a system event (such as a low-memory warning) cancels a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
Parameters
touches — A set of UITouch instances that represent the touches for the ending phase of the event represented by event.
event — An object representing the event to which the touches belong.
Discussion
This method is invoked when the Cocoa Touch framework receives a system interruption requiring cancellation of the touch event; for this, it generates a UITouch object with a phase of UITouchPhaseCancel. The interruption is something that might cause the application to be no longer active or the view to be removed from the window.
When an object receives a touchesCancelled:withEvent: message it should clean up any state information that was established in its touchesBegan:withEvent: implementation.
The default implementation of this method does nothing. However, immediate UIKit subclasses of UIResponder, particularly UIView, forward the message up the responder chain. To forward the message to the next responder, send the message to super (the superclass implementation); do not send the message directly to the next responder. For example, [super touchesCancelled:touches withEvent:event];
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
Availability
Available in iOS 2.0 and later.
You should still be getting touchesMoved in your view controller even when the touch is outside your view's frame. I would add an array to your custom UIView that keeps the touch objects assigned to that view, and implement the following logic:
When you receive touchesBegan, add the touch object to the array of the UIView whose frame contains the touch coordinates (you may have to iterate over all your subviews and match coordinates against frames to find the right one).
When you receive touchesMoved, remove the touch from the array of the UIView at the previous location and add it to the UIView at the current location, if any.
When you receive touchesEnded or touchesCancelled, remove the touches from all UIViews' arrays.
When you remove or add a touch object to your custom UIView's array, you may call your delegate methods (implement the observer pattern), because at that point you know whether it's really a pressed or released event. Keep in mind you may have many touch objects in one array, so before calling your delegate, check whether you added the first object or deleted the last one, because that's the only case in which you should call your delegate.
I have implemented such solution in my last game, but instead of UIViews I had virtual buttons.
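A rough sketch of one way to keep a per-view array of active touches, along the lines described above. TouchTrackingView is a hypothetical name, and for brevity the touches are tracked by the view itself rather than dispatched from the view controller:

```objectivec
@interface TouchTrackingView : UIView
@property (nonatomic, strong) NSMutableArray *activeTouches;
@end

@implementation TouchTrackingView

- (NSMutableArray *)activeTouches {
    if (!_activeTouches) {
        _activeTouches = [NSMutableArray array];
    }
    return _activeTouches;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        if (CGRectContainsPoint(self.bounds, [touch locationInView:self])) {
            [self.activeTouches addObject:touch]; // touch started over this view
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        BOOL inside = CGRectContainsPoint(self.bounds, [touch locationInView:self]);
        if (inside && ![self.activeTouches containsObject:touch]) {
            [self.activeTouches addObject:touch];    // touch entered this view
        } else if (!inside && [self.activeTouches containsObject:touch]) {
            [self.activeTouches removeObject:touch]; // touch left this view
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self.activeTouches removeObject:touch];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self.activeTouches removeObject:touch];
    }
}

@end
```

Notifying a delegate on the transitions where the array goes from empty to non-empty (pressed) or non-empty to empty (released) gives the press/release semantics the answer describes.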
I upgraded my iPad to 5.0 a couple days ago, and upgraded Xcode to 4.2 at the same time so I could continue to test my apps. Now I am having problems with touch code in several apps that worked with previous versions of Xcode.
I subclassed UIImageView to add some dragging features by overriding -(void)touchesBegan and -(void)touchesMoved. I did not override -(void)touchesEnded in the subclass, but handled that in the view controller for the view that contains the image view.
I pulled the subclassed UIImageView into a new project for testing, and have narrowed down the issue to the fact that the parent UIView (the template created by Xcode) does not seem to be forwarding touch events to the view controller (also created by Xcode).
If I add this to my subclass:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [self.nextResponder touchesEnded:touches withEvent:event];
}
and this to my parent view's view controller:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ViewController");
}
when I let go of the image I am dragging around the screen, I get the "touches ended event in ImageToDrag", but not the log from the view controller.
However, if I intentionally skip over the view by doing this in the subclassed view:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [[self.nextResponder nextResponder] touchesEnded:touches withEvent:event];
}
then I get both log entries.
The only explanation I can come up with is that for some reason, UIView is consuming the touchesEnded event and not passing it to the view controller.
I have verified that exclusiveTouch is set to NO, and userInteractionEnabled is set to YES on the parent view and the subclassed UIImageView.
I have also tested compiling for iOS 5.0 and iOS 4.2, and deploying the test app to both an iOS 5 iPad and iOS 4.3.1 iPad.
The only way I have been able to get the touch event to the viewController is by skipping over the view and using the double nextResponder in the subclass. Although that method functions, it seems like a hack to me and I'm sure it will come back to bite me later.
Has anybody else seen this behavior? Is there any way for me to find out what the UIView is doing with my touch events?
Thanks,
Dan
I've been trying to track down a similar issue for the last few hours. Finally managed to solve it with the help of this post:
Actually it looks like I just managed to solve it, using the hint from
https://devforums.apple.com/message/519690#519690
Earlier, I just forwarded the touchesEnded event to self.nextResponder. When I added touchesBegan, Moved, and Cancelled handlers with implementations similar to the touchesEnded one, the event bubbles up to the root view controller.
So I guess on iOS 5, views discard touchesEnded events when they have not seen the relevant touchesBegan.
I didn't even need to add Moved etc.; I just forwarded touchesBegan as well, and then touchesEnded started working again!
Some touch handling did change in iOS 5.0, especially if you re-link your application against the 5.0 SDK.
There's a section in the documentation of UIView's touch handling methods that says this:
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
So if you do one, you need to do them all. I know UIKit started taking steps to make sure this was the case in 5.0.
So I'd start there - override all the methods on your view and see what happens.
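Concretely, a hedged sketch of that advice for the forwarding subclass: forward the full touch sequence, not just touchesEnded, so the view above never sees an end event without a matching begin.

```objectivec
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Forwarding the begin phase is what makes the later end phase accepted.
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesCancelled:touches withEvent:event];
}
```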
I have a UITableView which I present in a UIPopoverController. The table view presents a list of elements that can be dragged and dropped onto the main view.
When the user begins a pan gesture that is principally vertical at the outset, I want the UITableView to scroll as usual. When it's not principally vertical at the outset, I want the application to interpret this as a drag-and-drop action.
My unfortunately lengthy journey down this path has compelled me to create a custom UIGestureRecognizer. In an attempt to get the basics right, I left this custom gesturer as an empty implementation at first, one that merely calls the super version of each of the five custom methods Apple says should be overridden:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)reset;
This results in nothing happening, i.e. the custom gesture's action method is never called, and the table view scrolls as usual.
For my next experiment, I set the gesture's state to UIGestureRecognizerStateBegan in the touchesBegan method.
This caused the gesture's action method to fire, making the gesture appear to behave just like the standard UIPanGestureRecognizer. This obviously suggested I was responsible for managing the gesture's state.
Next up, I set the gesture's state to UIGestureRecognizerStateChanged in the touchesMoved method. Everything still fine.
Now, instead, I tried setting the gesture's state to UIGestureRecognizerStateFailed in the touchesMoved method. I was expecting this to terminate the gesture and restore the flow of events to the table view, but it didn't. All it did was stop firing the gesture's action method.
Lastly, I set the gesture's state to UIGestureRecognizerStateFailed in the touchesBegan method, immediately after I had set it to UIGestureRecognizerStateBegan.
This causes the gesture to fire its action method exactly once, then pass all subsequent events to the table view.
So... sorry for such a long question... but why, if I cause the gesture to fail in the touchesBegan method (after first setting the state to UIGestureRecognizerStateBegan), does it redirect events to the table view as expected, yet if I try the same technique in touchesMoved (the only place I can detect that a move is principally vertical), this redirection doesn't occur?
Sorry for making this more complicated than it actually was. After much reading and testing, I've finally figured out how to do this.
First, creating the custom UIGestureRecognizer was one of the proper solutions to this issue, but when I made my first test of the empty custom recognizer, I made a rookie mistake: I forgot to call [super touches...:touches withEvent:event] for each of the methods I overrode. This caused nothing to happen, so I set the state of the recognizer to UIGestureRecognizerStateBegan in touchesBegan, which did result in the action method being called once, thus convincing me I had to explicitly manage states, which is only partially true.
In truth, if you create an empty custom recognizer and call the appropriate super method in each method your override, your program will behave as expected. In this case, the action method will get called throughout the dragging motion. If, in touchesMoved, you set the recognizer's state to UIGestureRecognizerStateFailed, the events will bubble up to the super view (in this case a UITableView), also as expected.
The mistake I made and I think others might make is thinking there is a direct correlation between setting the gesture's state and the chronology of the standard methods when you subclass a gesture recognizer (i.e. touchesBegan, touchesMoved, etc.). There isn't - at least, it's not an exact mapping. You're better off to let the base behavior work as is, and only intervene where necessary. So, in my case, once I determined the user's drag was principally vertical, which I could only do in touchesMoved, I set the gesture recognizer's state to UIGestureRecognizerStateFailed in that method. This took the recognizer out of the picture and automatically forwarded a full set of events to the encompassing view.
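A condensed sketch of the working approach described above: call super in every override, and set UIGestureRecognizerStateFailed only once the drag proves principally vertical. The class name VerticalFailRecognizer is hypothetical; the subclass header import is required to assign to `state`.

```objectivec
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface VerticalFailRecognizer : UIGestureRecognizer
@property (nonatomic) CGPoint startPoint;
@end

@implementation VerticalFailRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.startPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint p = [[touches anyObject] locationInView:self.view];
    CGFloat dx = fabs(p.x - self.startPoint.x);
    CGFloat dy = fabs(p.y - self.startPoint.y);
    // Principally vertical: fail, so the full event stream is forwarded to
    // the encompassing view (the UITableView) and it scrolls as usual.
    if (dy > dx) {
        self.state = UIGestureRecognizerStateFailed;
    }
}

@end
```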
For the sake of brevity, I've left out a ton of other stuff I learned through this exercise, but would like to point out that, of six or seven books on the subject, Matt Neuburg's Programming iOS 4 provided the best explanation of this subject by far. I hope that referral is allowed on this site. I am in no way affiliated with the author or publisher, just grateful for an excellent explanation!
That probably happens because responders expect to see an entire touch from beginning to end, not just part of one. Often, -touchesBegan:... sets up some state that's then modified in -touchesMoved..., and it really wouldn't make sense for a view to get a -touchesMoved... without having previously received -touchesBegan.... There's even a note in the documentation that says, in part:
All views that process touches, including your own, expect (or should expect) to receive a full touch-event stream. If you prevent a UIKit responder object from receiving touches for a certain phase of an event, the resulting behavior may be undefined and probably undesirable.