CoronaSDK - How to handle touch and tap at the same time

I have a case I couldn't find a solution for. In my game, moving your finger moves the objects around. But there is another scenario: if you are in a certain "mode", tapping a certain object should do something specific, but if the user doesn't tap that specific object, I need to reset the mode to normal.
I have a system event touch handler that handles the move. I also have an event handler on the objects that are mode-aware. Now the problem is resetting the mode back to normal.
The system touch event handler is called before the object's tap event, so I cannot handle it there, as I am not sure whether the tap event is going to fire or not. And if the tap didn't happen on the specific object, I have no way of handling it.
What to do?

I've handled a similar situation by setting a Boolean to true when you enter the certain "mode" and setting it to false on every tap/touch end. Throw a conditional in there to check whether the current object is a recipient of the "mode"; if not, set the dragging Boolean to false.
Really, nothing fancy is needed from the SDK.

An object can have both a touch and a tap handler on it. A tap is a well-defined, short-duration touch and release, though, and it may not do what you want: if you touch, hold too long, and then release, it won't register as a tap event.
Your best bet is to code your touch handler to watch for move phases: if you get an ended phase after a began phase without any moved phases in between, act as if you were tapped.
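A minimal sketch of that touch handler in Corona's Lua (obj and resetMode are illustrative names, not from the original post):

local wasMoved = false

local function onTouch(event)
    if event.phase == "began" then
        wasMoved = false
        display.getCurrentStage():setFocus(event.target)
    elseif event.phase == "moved" then
        wasMoved = true
        -- drag logic goes here
    elseif event.phase == "ended" or event.phase == "cancelled" then
        display.getCurrentStage():setFocus(nil)
        if not wasMoved then
            -- began followed by ended with no moved phases: treat it as a tap
            resetMode()
        end
    end
    return true
end

obj:addEventListener("touch", onTouch)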

Related

Make UIAccessibilityTrait .adjustable ignore double-tap (like a button) in Swift

I have a custom control to increment and decrement values. Now that I've added support for VoiceOver, I've stumbled upon a problem.
My custom view has the accessibility trait .adjustable, and I implemented the correct methods for increasing and decreasing the values.
However, a VoiceOver user can also double-tap on that view to activate it. The problem is that this triggers a gesture which is irrelevant to VoiceOver users.
Is there a way to prevent an adjustable accessibility view from being activated, so that the element is only adjustable, not double-tappable like a button?
There are two important APIs to know about when a double-tap occurs:
accessibilityActivate
accessibilityActivationPoint
In your case, you could just return true from an override of accessibilityActivate and, if that's not enough, also provide a CGPoint coordinate that triggers nothing (this depends on your custom control and its neighborhood).
Otherwise, use the accessibilityElementIsFocused instance method to know whether you can trigger actions, as this complete example shows.
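A minimal sketch of the accessibilityActivate override, assuming a hypothetical custom adjustable control named StepperView:

import UIKit

class StepperView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityTraits = .adjustable
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isAccessibilityElement = true
        accessibilityTraits = .adjustable
    }

    override func accessibilityIncrement() {
        // increase the control's value here
    }

    override func accessibilityDecrement() {
        // decrease the control's value here
    }

    // Returning true tells VoiceOver the activation (double-tap) was
    // handled, so nothing further is triggered on the view.
    override func accessibilityActivate() -> Bool {
        return true
    }
}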
I ended up using UIAccessibility.isVoiceOverRunning to stop any tasks that would be triggered by a double-tap on that specific element.
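For example, the tap handler can simply bail out while VoiceOver is running (handleTap is an illustrative name):

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    // Skip the double-tap-driven action while VoiceOver is active;
    // VoiceOver users adjust the control instead of tapping it.
    guard !UIAccessibility.isVoiceOverRunning else { return }
    // normal tap handling here
}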

Detect a touch on a UISwitch before it changes value

I have a UISwitch in settings that allows users to view some sensitive things in the app. Before allowing users to change the switch through a tap, I want to prompt them for a password.
I tried disabling the switch, but it's impossible (without some kludgy workarounds) to detect a tap/touch on a disabled switch.
Is there an event or a way to detect the touch/tap before the UISwitch changes value, so that I can prevent it from changing value and show the prompt?
I tried using the Touch Down and Touch Up Inside control events, but they don't stop the change in value.
Thanks for any suggestions.
Here's an idea. Disable the switch, as you said. But also place another view behind the switch: a view that is the same size as the switch and has a UITapGestureRecognizer attached.
Do you see what will happen? If the switch is disabled, the touch will fall through to the view behind it, and the tap gesture recognizer will fire. Thus you will know that the user tried to tap the switch.
But once the password has been given and the switch is enabled, you can ignore the tap gesture recognizer (in fact, it will probably never fire again, because the switch, being in front of it, will eat the touch).
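A minimal sketch of that arrangement; sensitiveSwitch, tapCatcher, and promptForPassword are illustrative names:

import UIKit

class SettingsViewController: UIViewController {

    let sensitiveSwitch = UISwitch()
    let tapCatcher = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()

        sensitiveSwitch.frame.origin = CGPoint(x: 20, y: 100)
        sensitiveSwitch.isEnabled = false // disabled until authenticated

        tapCatcher.frame = sensitiveSwitch.frame // same size and position
        view.addSubview(tapCatcher)              // behind...
        view.addSubview(sensitiveSwitch)         // ...the switch in front

        let tap = UITapGestureRecognizer(target: self, action: #selector(switchTapped))
        tapCatcher.addGestureRecognizer(tap)
    }

    // Fires only while the switch is disabled; an enabled switch in
    // front consumes the touch itself.
    @objc func switchTapped() {
        promptForPassword { [weak self] ok in
            self?.sensitiveSwitch.isEnabled = ok
        }
    }

    // Hypothetical password prompt; replace with your real flow.
    func promptForPassword(completion: @escaping (Bool) -> Void) {
        completion(true)
    }
}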

How to respond to a very quick touch?

I found that the touchDown control event is somewhat slow, in that it requires a firm, fairly long touch and does not respond to a light tap. Why is that?
touchesBegan, on the other hand, responds just when I need it to, i.e. even to very light, quick touches. But that's not an event; it's a method that can be overridden.
The problem is that touchesBegan apparently requires me either to 1) subclass a label (I need to respond to touches on a label), or 2) analyze the event to figure out whether it came from the right label. I am wondering whether this is a code smell and whether there should be an event for a simple touch.
Try adding a UITapGestureRecognizer to your label.
First of all, allow the label to handle user interaction:
label.isUserInteractionEnabled = true
Then assign the tap gesture to the label. In the handler method, you can switch over the recognizer's state property; if it is .began, you got the event you need.
The cool thing about this approach is that you can use the same handler for all of your labels. Inside the handler, you can get the tapped label like this:
let label = tapRecognizer.view as! UILabel
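Putting it together, a minimal sketch (the labels outlet collection is an assumption):

import UIKit

class ViewController: UIViewController {

    @IBOutlet var labels: [UILabel]!

    override func viewDidLoad() {
        super.viewDidLoad()
        for label in labels {
            // Labels ignore touches unless interaction is enabled.
            label.isUserInteractionEnabled = true
            label.addGestureRecognizer(
                UITapGestureRecognizer(target: self, action: #selector(labelTapped)))
        }
    }

    // One handler serves every label; the recognizer's view tells you
    // which label was tapped.
    @objc func labelTapped(_ recognizer: UITapGestureRecognizer) {
        guard let label = recognizer.view as? UILabel else { return }
        print("Tapped:", label.text ?? "")
    }
}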
"Code smell"? No, it's a user interface smell. A user interface stink.
If you make a button in your user interface behave differently from buttons in every other application, people will hate your app. Do what people are used to.

Begin UIPanGesture Event From A Pressed State At Time Of Instantiation

Is there a way to begin a UIPanGestureRecognizer's event stream if the finger is already pressed at the time the object is instantiated?
I have a situation where, when a user holds their finger on the screen, I create a UIView under their finger.
I want them to be able to drag that around, and as such I have put a UIPanGestureRecognizer inside the UIView.
The problem is that I need to take my finger off and put it back down to trigger the UIPanGestureRecognizer. I need it to start from an already-pressed state.
Do you know how I can activate a UIPanGestureRecognizer from an already-pressed state, i.e. can I get the touch event that's already active at the time of instantiation and pass it along?
You can do it, but the UIPanGestureRecognizer will need to exist already on the view behind the view you create (and you will then have to adjust your calculations based on this; not difficult).
The reason is that, under the circumstances you describe, the touch does not belong to the UIView you create - it belongs to the UIView behind it, the one that the user was originally touching. And given the nature of iOS touch delivery, you can't readily change that. So it will be simpler to let that view, the actual original touch view, do the processing of this touch.
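A minimal sketch of that suggestion, simplified to create the view in the pan's began phase (all names are illustrative):

import UIKit

class CanvasViewController: UIViewController {

    var draggedView: UIView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // The pan recognizer lives on the background view, which owns
        // the touch; the created view never needs to receive it.
        view.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(handlePan)))
    }

    @objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
        let location = recognizer.location(in: view)
        switch recognizer.state {
        case .began:
            // Create the view under the finger.
            let newView = UIView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
            newView.center = location
            newView.backgroundColor = .systemBlue
            view.addSubview(newView)
            draggedView = newView
        case .changed:
            // Drag it by updating its center from the background's
            // coordinate space.
            draggedView?.center = location
        default:
            draggedView = nil
        }
    }
}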
I think Matt's solution is best, so I am going to mark it as correct.
However, my code structure wasn't going to allow me to implement it cleanly. Compounding the issue, the listening object was using a UILongPressGestureRecognizer.
So my solution was as follows:
1. Create a callback in my ViewController that handles the longGestureOverride call.
2. Add a callback to the object listening for the long gesture that calls the longGestureOverride callback and passes along the point.
3. Manually move the object based on the point passed back.
4. If the user lifts their finger, disable the longGestureOverride callback and begin using the UIPanGestureRecognizer inside the new object.
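A rough sketch of that callback arrangement (all class and method names here are illustrative):

import UIKit

class ListeningView: UIView {
    // Set while the override is active; receives the long-press point.
    var longGestureOverride: ((CGPoint) -> Void)?

    @objc func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
        guard let callback = longGestureOverride else { return }
        callback(recognizer.location(in: self))
    }
}

class OwnerViewController: UIViewController {
    var newObject: UIView?

    func installOverride(on listener: ListeningView) {
        listener.longGestureOverride = { [weak self] point in
            // Manually move the created object to the reported point.
            self?.newObject?.center = point
        }
    }

    func fingerLifted(on listener: ListeningView) {
        // Step 4: disable the override; the UIPanGestureRecognizer
        // inside the new object takes over from here.
        listener.longGestureOverride = nil
    }
}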

Presentation overlay for iOS

When I demo my touch apps to remote teams, the people on the other end don't know where I am touching. To remedy this, I have been working on an event-intercepting view/window that can display touches over applications. No matter how many variations on nextResponder I call, I am unable to react to the touch and then pass it along to the controllers underneath. Specifically, scroll views don't react, nor do buttons.
Is there a way to take an event, get its position, then pass it along to what ever component would have been responding to it initially (the controller underneath)?
Update:
I am making some progress with a UIView. The new view is always returning NO from pointInside. This works great for when the touch starts, but it doesn't track moves or releases. Is there a strategy for adding gesture recognizers to the touch in order to track its event lifecycle?
Joe
You could try creating your own subclass of UIApplication that overrides sendEvent:. Your implementation should call [super sendEvent:event] as well as process the event as needed.
Update your main.m and pass the name of your custom UIApplication class as the third parameter to the call to UIApplicationMain.
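In Swift, a minimal sketch of the same technique (TouchTrackingApplication is an illustrative name; a Swift project registers it from main.swift rather than main.m):

import UIKit

class TouchTrackingApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches {
                // Record or display the touch location here to drive
                // the overlay that visualizes touches.
                _ = touch.location(in: nil)
            }
        }
        // Always forward the event so normal delivery continues.
        super.sendEvent(event)
    }
}

// main.swift (remove @main/@UIApplicationMain from the app delegate):
// UIApplicationMain(CommandLine.argc, CommandLine.unsafeArgv,
//                   NSStringFromClass(TouchTrackingApplication.self),
//                   NSStringFromClass(AppDelegate.self))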
After some more due diligence, I found my oversight. In the layer that was on top and displaying the touches, user interaction needed to be set to false. Once I set that to false, I was able to use that layer for display while catching events on the layers below. The project still isn't done but I am one step closer.
Take care,
Joe
