UIView touchesMoved not called if first finger released - ios

I want to achieve a simple task - detect up to 10 touches in a UIView.
Using these:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
It all works great.
The problem is that touchesMoved:withEvent: is no longer called once the first finger that touched the screen is lifted.
Is it possible to fix this, so that touchesMoved:withEvent: keeps being called as long as at least one of the (up to 10) fingers is still touching the screen?
If it is not possible in UIKit, is it possible in Cocos2d, and how? (Some links or code snippets would be really helpful.)

You probably just need to set theView.multipleTouchEnabled to YES.
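For example, a minimal sketch (theView stands in for the UIView subclass in question; the handler walks [event allTouches], which contains every finger currently on the screen):

// Somewhere during setup, e.g. in the view controller:
theView.multipleTouchEnabled = YES;   // allow more than one finger

// In the UIView subclass:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // [event allTouches] contains every finger currently on the screen,
    // not only the touches that moved during this callback.
    for (UITouch *touch in [event allTouches]) {
        CGPoint point = [touch locationInView:self];
        NSLog(@"Finger at %@", NSStringFromCGPoint(point));
    }
}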

Related

Detect finger touches movement on AppleWatch

I want to detect touches on the Apple Watch to get the touch location. We can achieve this behaviour in an iPhone app by using the following UIResponder methods:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
//Any Logic
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
//Any Logic
}
So I tried using these methods with WatchKit, but they are not called.
Is there any way to achieve this behaviour?
I want to apply some logic based on the touch location in WatchKit.
Any direction of work will be appreciated.
At the moment (this may change in the future), there are no touch recognisers like that in WatchKit. You can only respond to taps on buttons or on a menu item via a selector.
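For reference, the kind of interaction that is available looks roughly like this (MyInterfaceController and buttonTapped are made-up names):

#import <WatchKit/WatchKit.h>

@interface MyInterfaceController : WKInterfaceController
@end

@implementation MyInterfaceController

// Wired to a WKInterfaceButton in the storyboard. WatchKit actions
// take no sender and expose no touch location.
- (IBAction)buttonTapped
{
    NSLog(@"Button tapped");
}

@end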

UITableView touch position or several gestures at a time

I have a UITableView and added the ability to move cells.
http://s017.radikal.ru/i402/1503/76/3442a9517cec.png
So, when the long press gesture is recognized, I hide the cell, take a snapshot of it (highlighted in grey in the picture), and change its position while the long press gesture's state changes. But the movement animation looks bad.
If I add a pan gesture while the long press has begun, it doesn't work until I lift my finger and touch down again; only after that does the pan gesture's state start changing and the movement become smooth.
I need the pan gesture to start working while the long press is still active, or some other way to get the touch position on the screen.
The difficulty is that
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
etc. don't work on a UITableView.
I have read a lot, but found nothing. This is a custom case: I need, for example, to recognize a pinch gesture only after a long press gesture has been recognized, without lifting the finger.
Does anybody know how to solve this?
For UITableView, use the built-in methods for moving cells. You can find more information in the documentation.
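A minimal sketch of that built-in reordering support (self.items is a made-up NSMutableArray backing the table):

// Put the table into editing mode so the reorder controls appear.
[self.tableView setEditing:YES animated:YES];

// Allow rows to be moved.
- (BOOL)tableView:(UITableView *)tableView canMoveRowAtIndexPath:(NSIndexPath *)indexPath
{
    return YES;
}

// Keep the data source in sync when the user drags a row elsewhere.
- (void)tableView:(UITableView *)tableView moveRowAtIndexPath:(NSIndexPath *)sourceIndexPath toIndexPath:(NSIndexPath *)destinationIndexPath
{
    id object = self.items[sourceIndexPath.row];
    [self.items removeObjectAtIndex:sourceIndexPath.row];
    [self.items insertObject:object atIndex:destinationIndexPath.row];
}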

Subclassing UIControl's endTracking for multitouch

I'm creating a custom control* by subclassing UIControl and overriding the following methods:
- beginTrackingWithTouch:withEvent:
- continueTrackingWithTouch:withEvent:
- endTrackingWithTouch:withEvent:
- cancelTrackingWithEvent:
This control will support multitouch interaction. Specifically, one or two fingers may touch down, drag for some duration, and then touch up, all independently.
My issue comes at the end of these multitouch events, in endTrackingWithTouch:withEvent:. The system calls this method on my control exactly once per multitouch sequence, and it is called on the first touch up that occurs. This means that the second touch, which could still be dragging, will no longer be tracked.
This sequence of actions serves as an example of the phenomenon:
1. Finger A touches down. beginTracking is called.
2. Finger A drags around. continueTracking is called repeatedly to update my control.
3. Finger B touches down. beginTracking is called.
4. Fingers A and B drag around. continueTracking is called repeatedly to update my control.
5. Finger B touches up. endTracking is called.
6. Finger A drags around. No methods are called.
7. Finger A touches up. No methods are called.
The issue lies in the final three steps in this sequence. The documentation for UIControl says the following of endTrackingWithTouch:withEvent::
Sent to the control when the last touch for the given event completely ends, telling it to stop tracking.
Yet it seems to me that after finger B touches up in step 5, above, the last touch for the given event has not yet completely ended. That last touch is finger A!
*A range slider, if you must ask. But there are already plenty of open source range sliders, you say. Yes, true, but this one will support multitouch and Auto Layout.
I had the same issue in a custom control with multitouch.
Because UIControl is a subclass of UIView, which inherits from UIResponder, feel free to use the
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
methods.
These methods work fine in a UIControl subclass.
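A rough sketch of that approach (the class and property names are invented; a real control would update its handles where the comments indicate):

#import <UIKit/UIKit.h>

@interface MultiTouchControl : UIControl
@property (nonatomic, strong) NSMutableSet *activeTouches;
@end

@implementation MultiTouchControl

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.multipleTouchEnabled = YES;   // required for a second finger
        _activeTouches = [NSMutableSet set];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.activeTouches unionSet:touches];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self];
        NSLog(@"Touch moved to %@", NSStringFromCGPoint(point));
        // Update whichever handle this particular touch is dragging here.
    }
    [self sendActionsForControlEvents:UIControlEventValueChanged];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Only the lifted fingers are removed; the remaining touches keep
    // being delivered, unlike with endTrackingWithTouch:withEvent:.
    [self.activeTouches minusSet:touches];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.activeTouches minusSet:touches];
}

@end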

Multitouch for 3x3 Buttons

I am working on a hobby project to learn more about iOS programming.
My main view has 9 buttons (3x3 grid).
What I need is a way to know whether the user pressed one button or two, and which buttons those were.
I am making a memory game where sometimes the user is required to press 2 of the buttons at the same time (one after the other will not suffice).
I need to be able to distinguish between the user pressing 1 of the 9 buttons or 2.
If they press more than 2, that is considered cheating and will count as a strike.
I have been reading this:
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/multitouch_background/multitouch_background.html
But I'm not sure how to go about it.
How do I link up my 9 buttons into 1 nice multitouch event that can tell me:
Button 3 only was pressed.
or
Buttons 4 and 5 were pressed.
?
From there the logic is perfectly clear, but I'm having trouble with iOS multitouch events.
Thanks
The touch interface is sensitive enough that humans will not be able to press two buttons at the "exact" same time.
What you'll probably have to do is see if they touched two and only two buttons within a "reasonable" window of time to be considered simultaneous.
One approach would be to log the exact moment each button is pressed with an NSDate *press = [NSDate date]; call, and then compare the presses with NSTimeInterval interval = [press2 timeIntervalSinceDate:press1];
Some clever batching of time intervals and a bit of conditional logic later, and you should have a good test of "simultaneous" presses.
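A rough sketch of that idea (the 150 ms window and the handlePairWithButton:andButton: helper are arbitrary placeholders):

// In the view controller's interface:
@property (nonatomic, strong) NSDate *lastPressDate;
@property (nonatomic, strong) UIButton *lastPressedButton;

// Shared action for all nine buttons.
- (IBAction)buttonPressed:(UIButton *)sender
{
    NSDate *now = [NSDate date];
    NSTimeInterval interval = self.lastPressDate
        ? [now timeIntervalSinceDate:self.lastPressDate]
        : DBL_MAX;

    if (self.lastPressedButton && self.lastPressedButton != sender && interval < 0.15) {
        // Two different buttons pressed within 150 ms: treat as a simultaneous pair.
        [self handlePairWithButton:self.lastPressedButton andButton:sender];
        self.lastPressedButton = nil;
    } else {
        self.lastPressedButton = sender;
    }
    self.lastPressDate = now;
}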
If I understood the docs correctly, it should work like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
methods of the parent view.
In the (UIEvent *)event you should have an NSSet of UITouch objects:
NSArray *arrTouches = [[event allTouches] allObjects];
Iterate through the array and check touch.view for each UITouch; touch.view should be the view in which the touch happened, so you can check whether it is one of your buttons.
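A sketch of that idea in the buttons' superview (whether the superview actually receives these callbacks depends on how the buttons are configured, so treat this as an outline rather than a drop-in solution):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSMutableArray *pressedButtons = [NSMutableArray array];

    // [event allTouches] holds every finger currently down, so two
    // presses that land together show up in the same event.
    for (UITouch *touch in [event allTouches]) {
        if ([touch.view isKindOfClass:[UIButton class]]) {
            [pressedButtons addObject:touch.view];
        }
    }
    NSLog(@"%lu button(s) pressed", (unsigned long)pressedButtons.count);
}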

CAEmitterLayer - disable animation on touch

I am trying to add a CAEmitterLayer animation to my application. Everything works fine when testing on the simulator; however, when I run the code on a device and start touching the screen, additional cells are emitted, completely ignoring the birthRate I set. Is there a way to disable or override this touch behaviour? I tried adding:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
or
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
but this does not help. It seems like the touch is handled by a different "layer".
