canPerformAction:withSender: method called on tap in iOS 9

One thing I observed in iOS 9.0 is that when I tap on a button or a table view, the canPerformAction:withSender: method is called with a sender of type UIButton. I am using this method to prepare my customized options menu.
I did not observe this in previous iOS versions. Can anyone point me to the relevant API change? I went through the overall iOS changes, but I did not find the behavior mentioned above in any change log or release notes.

Per Apple's documentation:
iOS 3.0 introduced system capabilities for generating motion events,
specifically the motion of shaking the device. The event-handling
methods for these kinds of events are motionBegan:withEvent:,
motionEnded:withEvent:, and motionCancelled:withEvent:. Additionally
for iOS 3.0, the canPerformAction:withSender: method allows responders
to validate commands in the user interface while the undoManager
property returns the nearest NSUndoManager object in the responder
chain.
So all UIResponder subclasses are entitled to receive a callback for canPerformAction:withSender:. You should use the sender parameter to decide how to handle each call in this method.
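For example, a responder subclass (such as a view controller) could inspect the sender before deciding which actions to expose; a minimal sketch, where myCustomAction: is a hypothetical selector:

- (BOOL)canPerformAction:(SEL)action withSender:(id)sender
{
    // On iOS 9 this can also be invoked for taps on controls such as UIButton,
    // so only expose the custom menu actions when the sender is the menu controller.
    if ([sender isKindOfClass:[UIMenuController class]]) {
        return (action == @selector(copy:) || action == @selector(myCustomAction:));
    }
    // Defer to the superclass for any other sender (e.g. a UIButton tap).
    return [super canPerformAction:action withSender:sender];
}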

Related

How do we use Watchkit Touch Events?

I want to use touch events in my app. I know gesture recognizers cannot be used in WatchKit. Is it possible to use functions like touchesBegan, touchesMoved, etc.?
An Apple Watch app uses the WatchKit framework, so UIKit touch events are not applicable here.
An alternative is to use the force touch event, which triggers the context menu (if one is available):
Instead of just tapping items on the screen, pressing the screen with a small amount of force activates the context menu (if any) associated with the current interface controller.
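A minimal sketch of wiring that up in code from a WKInterfaceController subclass (menuItemTapped is a hypothetical action name):

- (void)awakeWithContext:(id)context
{
    [super awakeWithContext:context];
    // Add a menu item that appears when the user force-touches this scene.
    [self addMenuItemWithItemIcon:WKMenuItemIconAccept
                            title:@"Confirm"
                           action:@selector(menuItemTapped)];
}

- (void)menuItemTapped
{
    NSLog(@"Context menu item tapped");
}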
There is no API like touchesBegan or touchesMoved.
The only way to respond to a button event is to use an IBAction, for example:
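A minimal sketch (buttonTapped is a hypothetical method name; WatchKit button actions take no sender parameter and must be wired to the button in the storyboard):

- (IBAction)buttonTapped
{
    NSLog(@"Button was tapped");
}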
A little late to the party, but it is possible to use SceneKit with WatchKit, and SceneKit allows you to add gesture handlers. See the Apple example project here:
https://developer.apple.com/library/content/samplecode/WatchPuzzle/Introduction/Intro.html#//apple_ref/doc/uid/TP40017284-Intro-DontLinkElementID_2
Edit: it looks like tap, swipe, long-press, and pan gesture recognizers can be added to any view added to the InterfaceController, as sketched below.
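Assuming watchOS 3 or later, a minimal sketch of handling such a gesture: wire a WKTapGestureRecognizer to a group in the storyboard and route its action to a method like this (handleTap: is a hypothetical name):

- (IBAction)handleTap:(WKTapGestureRecognizer *)sender
{
    // Where in the tapped interface object the gesture landed.
    CGPoint location = [sender locationInObject];
    NSLog(@"Tapped at %@", NSStringFromCGPoint(location));
}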

Detect iOS8 Reachability Gesture

Is there any way to detect the new Reachability gesture of iOS 8 in Objective-C?
The gesture is activated by double-tapping the Touch ID button on the iPhone 6 and iPhone 6 Plus.
There are no public APIs for it.
There are two related private API methods on UIApplication that I can find (using either of these should get your app rejected from the App Store):
_setReachabilitySupported:, which presumably would enable/disable Reachability (like Spotlight)
_deactivateReachability, which would return the view to its normal place on the screen
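If you only want to experiment outside the App Store, a hedged sketch of invoking one of these private selectors dynamically (the signature is an assumption inferred from the name; again, shipping this will get an app rejected):

SEL deactivate = NSSelectorFromString(@"_deactivateReachability");
if ([[UIApplication sharedApplication] respondsToSelector:deactivate]) {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Warc-performSelector-leaks"
    [[UIApplication sharedApplication] performSelector:deactivate];
#pragma clang diagnostic pop
}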
I don't see anything that informs your application that the user has performed the gesture, however.
You could also experiment with subclassing UIWindow and overriding setFrame:. Set a breakpoint in this method, and if it fires when you enable Reachability, you can look at the stack trace for more information.
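A minimal sketch of that experiment (DebugWindow is a hypothetical class name; install it as your app's window to try it):

@interface DebugWindow : UIWindow
@end

@implementation DebugWindow

- (void)setFrame:(CGRect)frame
{
    // Log (or set a breakpoint on) every frame change; Reachability slides the
    // window down, so the stack trace here shows what drives the animation.
    NSLog(@"Window frame changing to %@", NSStringFromCGRect(frame));
    [super setFrame:frame];
}

@end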

TOUCH_TAP triggered by TOUCH_END

The title says it all. I have both events registered for the same object. The intent is to differentiate a drag from a tap, but when TOUCH_END fires, so does TOUCH_TAP, which is a problem because my program cannot tell the two apart. I have tried unregistering the TOUCH_TAP handler inside TOUCH_BEGIN and re-registering it in TOUCH_END, but that does not seem to solve the problem.
How can I differentiate between a drag and a tap? My target SDK is AIR 4.0 for iOS.

UISearchBar and dictation support

I have a user interface with a UISearchBar, and I implement the UISearchBarDelegate method searchBarSearchButtonClicked: to perform the search. I do not have a device with dictation support to test this, so I am going to speculate here...
On devices with dictation support, I would like to perform the search as soon as the dictation ends, without requiring the user to hit the search button manually.
Does this work out of the box?
Or do I need to handle it programmatically?
Since iOS 5.1 there are new methods in the UITextInput protocol, and I could theoretically hook into dictationRecordingDidEnd. Is that the way to go?
Yes, you would want to use the dictationRecordingDidEnd protocol method. Apple's documentation says this about dictationRecordingDidEnd:
Implement this optional method if you want to respond to the
completion of the recognition of a dictated phrase.
That said, I have yet to find in Apple's human interface guidelines anything that talks about the expected use of this method.
You may also want to look at dictationRecognitionFailed, as well as the UIDictationPhrase class.
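UISearchBar does not itself conform to UITextInput (its internal text field does), so here is a minimal sketch of the idea using a UITextField subclass instead; DictationTextField and the dictationDidEnd block property are hypothetical names:

#import <UIKit/UIKit.h>

@interface DictationTextField : UITextField
// Hypothetical callback fired when a dictated phrase has been recognized.
@property (nonatomic, copy) void (^dictationDidEnd)(void);
@end

@implementation DictationTextField

// Optional UITextInput method: recognition of a dictated phrase completed.
- (void)dictationRecordingDidEnd
{
    if (self.dictationDidEnd) {
        self.dictationDidEnd();  // e.g. kick off the search programmatically
    }
}

// Optional UITextInput method: dictation recognition failed.
- (void)dictationRecognitionFailed
{
    NSLog(@"Dictation recognition failed");
}

@end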
