UILongPressGestureRecognizer and touchesBegan - iOS

I have the following problem.
I am using a UILongPressGestureRecognizer to put a UIView into a "toggle mode". While the UIView is in "toggle mode" the user can drag it around the screen. For the dragging I am using the methods touchesBegan, touchesMoved and touchesEnded.
It works, but I have to lift my finger before I can drag: touchesBegan has already been called for the long press and is not called again, so I cannot drag the UIView within the same touch.
Is there any way to manually call touchesBegan after the UILongPressGestureRecognizer has triggered? (The recognizer sets a BOOL, and touchesBegan only does its dragging work when this BOOL is YES.)

UILongPressGestureRecognizer is a continuous gesture recognizer, so rather than resorting to touchesMoved or UIPanGestureRecognizer, just check for UIGestureRecognizerStateChanged, e.g.:
- (void)viewDidLoad
{
    [super viewDidLoad];

    UILongPressGestureRecognizer *gesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
    [self.view addGestureRecognizer:gesture];
}

- (void)handleGesture:(UILongPressGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:gesture.view];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        // user held down their finger on the screen
        // gesture started, entering the "toggle mode"
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        // user did not lift the finger, but has now started to move it
        // do here whatever you wanted to do in touchesMoved
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        // user lifted their finger
        // all done, leaving the "toggle mode"
    }
}
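If the view itself should follow the finger, the Changed branch is where the move happens. A minimal sketch, assuming draggedView is the UIView being toggled and a direct subview of self.view (the view the recognizer is attached to):

- (void)handleGesture:(UILongPressGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:self.view];

    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        // draggedView is an assumed property pointing at the view being dragged;
        // move it so it tracks the finger for as long as the finger stays down
        self.draggedView.center = location;
    }
}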

I would suggest using a UIPanGestureRecognizer, as it is the recommended recognizer for dragging.
You can configure the minimum and maximum number of touches required for panning with the following properties:
maximumNumberOfTouches
minimumNumberOfTouches
You can handle the states Began, Changed and Ended, for example to run an animation in a particular state.
Use the following method to set (or reset) the translation in the coordinate system of the view you are interested in:
- (void)setTranslation:(CGPoint)translation inView:(UIView *)view
Example:
Keep the view's original position (for example in an instance variable) when the state is UIGestureRecognizerStateBegan. When the state is UIGestureRecognizerStateChanged, use the translation to work out the new position:
- (void)panningMyView:(UIPanGestureRecognizer *)panGesture
{
    if (panGesture.state == UIGestureRecognizerStateBegan) {
        // retain the original state, e.g. store the view's starting center
        // (originalCenter is assumed to be an instance variable/property)
        self.originalCenter = panGesture.view.center;
    } else if (panGesture.state == UIGestureRecognizerStateChanged) {
        CGPoint translatedPoint = [panGesture translationInView:self.view];
        // here you get your new drag position: the original center plus the translation
        panGesture.view.center = CGPointMake(self.originalCenter.x + translatedPoint.x,
                                             self.originalCenter.y + translatedPoint.y);
    }
}
You can also read the velocity of the drag; based on the velocity you can add an animation, for example to make the UIView bounce:
- (CGPoint)velocityInView:(UIView *)view
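As a rough sketch of that idea, the Ended branch of the handler above could use the velocity for a small overshoot-and-settle animation (the 0.05 multiplier and the durations are arbitrary values, not anything prescribed):

- (void)panningMyView:(UIPanGestureRecognizer *)panGesture
{
    // ... Began/Changed handling as above ...
    if (panGesture.state == UIGestureRecognizerStateEnded) {
        CGPoint velocity = [panGesture velocityInView:self.view];
        UIView *dragged = panGesture.view;
        CGPoint restCenter = dragged.center;
        // overshoot slightly in the direction of the drag, then settle back
        [UIView animateWithDuration:0.15 animations:^{
            dragged.center = CGPointMake(restCenter.x + velocity.x * 0.05,
                                         restCenter.y + velocity.y * 0.05);
        } completion:^(BOOL finished) {
            [UIView animateWithDuration:0.25 animations:^{
                dragged.center = restCenter;
            }];
        }];
    }
}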

Related

TouchesBegan delay on left hand side of the display

On iPhones with 3D Touch enabled, pressing the left-hand edge of the screen with enough force lets you switch which app is active. Because of this, when a non-moving touch happens on the left-hand side of the screen, the touch event is delayed for a second or two until the iPhone verifies that the user is not trying to switch tasks and is actually interacting with the app.
This is a major problem when developing a game with SpriteKit, as these touches are delayed by a second every time a user taps or holds a finger on the left edge of the screen. I was able to solve this by registering a UILongPressGestureRecognizer in the main Scene of the game, thus bypassing touchesBegan and implementing a custom touches function (used as the selector for the gesture recognizer):
- (void)handleLongPressGesture:(UILongPressGestureRecognizer *)gesture {
    CGPoint location = [gesture locationInView:self.view];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        //
    }
    else if (gesture.state == UIGestureRecognizerStateCancelled)
    {
        //
    }
}
- (void)didMoveToView:(SKView *)view {
    /* Setup your scene here */
    UILongPressGestureRecognizer *longPressGestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPressGesture:)];
    longPressGestureRecognizer.delaysTouchesBegan = NO;
    longPressGestureRecognizer.minimumPressDuration = 0;
    [view addGestureRecognizer:longPressGestureRecognizer];
    // continue
}
The problem with this is that I would have to implement a gesture recognizer for every touch (including simultaneous ones) that I expect the user to enter. This interferes with the touchesBegan methods in my subclasses of SKSpriteNode, SKScene, etc., and kills a lot of functionality.
Is there any way to disable this delay? When registering the gesture recognizer I was able to set its delaysTouchesBegan property to NO. Can I do the same somehow for my SKScene?
To see this issue in action, you can run the default SpriteKit project, and tap (hold for a second or two) near the left hand side of the screen. You will see that there is a delay between when you touch the screen and when the SKShapeNodes are rendered (as opposed to touching anywhere else on the screen).
* Edit 1 *
For those trying to get around this for now: you can keep the gesture recognizer but set its cancelsTouchesInView property to NO. Use the gesture recognizer to do everything you need until touchesBegan kicks in (touchesBegan will receive the same touch event about a second after the gesture recognizer recognizes the touch). Once touchesBegan kicks in, you can disable everything happening in the gesture recognizer. This seems like a sloppy fix to me, but it works for now.
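In code, the workaround from the edit amounts to configuring the recognizer roughly like this (a sketch; the handler is the one shown above, and the setup sits in the scene's didMoveToView:):

- (void)didMoveToView:(SKView *)view {
    UILongPressGestureRecognizer *recognizer =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleLongPressGesture:)];
    recognizer.minimumPressDuration = 0;     // fire as soon as the finger goes down
    recognizer.delaysTouchesBegan = NO;      // do not hold back touch delivery
    recognizer.cancelsTouchesInView = NO;    // touchesBegan still receives the same touch (~1s later)
    [view addGestureRecognizer:recognizer];
}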
Still trying to find a more-or-less formal solution.
I have experienced this as a user and it is really annoying. The only thing that worked for me was to disable 3D Touch. Otherwise the left side of the touch screen is almost useless.

How to get coordinates of the tap in 2 views simultaneously

I am working on an iOS map app that includes an interactive map. The interactive map is a subclass of UIImageView and is placed inside a scroll view, which in turn sits in a screen-sized parent view.
When the user taps some part of the map, the ViewController performs an animated segue (like zooming in to that area of the map). I can start the segue from any point of the screen, but to do this properly I need the exact coordinates of the user's tap relative to the screen itself. As the ImageView sits inside the ScrollView, it uses a different coordinate system, larger than the screen. It does not matter which area of the map has been tapped; what matters is the tapped CGPoint on the (physical) screen.
The ImageView uses its own code to get the coordinates of a tap:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    // cancel previous touch ended event
    [NSObject cancelPreviousPerformRequestsWithTarget:self];

    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];

    // perform new one
    [self performSelector:@selector(_performHitTestOnArea:)
               withObject:touchValue
               afterDelay:0.1];
}
If I add a gesture recognizer instead, it works, but then the ImageView does not receive any touches and therefore cannot trigger the segue.
The gesture recognizer code I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];

// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateRecognized)
    {
        CGPoint point = [recognizer locationInView:recognizer.view];
        // again, point.x and point.y have the coordinates
    }
}
So, is there any way to get the tap coordinates in two different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to another's:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Here self.parentView is the screen-sized parent "View" from the hierarchy, so pointInViewCoords is the tap expressed in screen coordinates.
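For reference, both coordinates can also be read directly from the UITouch, since locationInView: accepts whichever view's reference system you want (passing nil gives window, i.e. screen, coordinates). A minimal sketch inside the image view's touchesEnded:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    // point in the map image view's own (larger-than-screen) coordinate system
    CGPoint pointInImageView = [touch locationInView:self];
    // the same touch expressed in window coordinates, i.e. relative to the physical screen
    CGPoint pointOnScreen = [touch locationInView:nil];

    NSLog(@"map point %@, screen point %@",
          NSStringFromCGPoint(pointInImageView),
          NSStringFromCGPoint(pointOnScreen));
}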

UIPanGestureRecognizer in SKScene

I've been experimenting with UIGestureRecognizers and the new SKScene/SKNode classes in SpriteKit. I've had one problem, and I got close to fixing it, but I am confused about one thing. Essentially, I have a pan gesture recognizer that allows the user to drag a sprite on the screen.
The single problem I have is that it takes ONE tap to actually initialize the pan gesture, and only on the SECOND tap does it work correctly. I'm thinking that this is because my pan gesture is initialized in touchesBegan. However, I don't know where else to put it, since initializing it in the SKScene's initWithSize method stopped the gesture recognizer from actually working.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.pan) {
        self.pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragPlayer:)];
        self.pan.minimumNumberOfTouches = 1;
        self.pan.delegate = self;
        [self.view addGestureRecognizer:self.pan];
    }
}
- (void)dragPlayer:(UIPanGestureRecognizer *)gesture {
    CGPoint trans = [gesture translationInView:self.view];
    SKAction *moveAction = [SKAction moveByX:trans.x y:-trans.y duration:0];
    [self.player runAction:moveAction];
    [gesture setTranslation:CGPointMake(0, 0) inView:self.view];
}
That's because you're adding the gesture in touchesBegan, so the gesture doesn't exist until the screen has been tapped at least once. Additionally, I would verify that you're actually using initWithSize: as your initializer, because you shouldn't have any problems adding the gesture there.
Another option is to move the code that adds the gesture into -[SKScene didMoveToView:], which gets called immediately after the scene has been presented. More info in the docs.
- (void)didMoveToView:(SKView *)view
{
    [super didMoveToView:view];
    // add gesture here!
}
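Applied to the dragging example above, the setup might look roughly like this (a sketch reusing the same pan property and dragPlayer: action from the question):

- (void)didMoveToView:(SKView *)view
{
    [super didMoveToView:view];

    // create and attach the pan gesture as soon as the scene is presented,
    // so the very first touch already drives dragPlayer:
    self.pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragPlayer:)];
    self.pan.minimumNumberOfTouches = 1;
    self.pan.delegate = self;
    [view addGestureRecognizer:self.pan];
}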
This is my first post! Hoping to not trip over my own toes...
Hi guys, so I was having an issue with a UISwipeGestureRecognizer not working. I was initializing it in my initWithSize method so based on this post I moved it to my didMoveToView method. Now it works (thanks 0x7fffffff). All I did was cut the following two lines from one method and paste them in the other.
_warpGesture = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(warpToNextLevel:)];
[self.view addGestureRecognizer:_warpGesture];
In my "investigation" I came across userInteractionEnabled and tried to set it to YES in my initWithSize method...
self.view.userInteractionEnabled = YES;
NSLog(@"User interaction enabled %s", self.view.userInteractionEnabled ? "Yes" : "No");
This would log NO even though I'd just set it to YES. Further investigation found that if I don't try to manually set userInteractionEnabled, then it's NO during initWithSize (and I can't seem to change it there), and it automatically gets set to YES by the time I'm in didMoveToView.
This all strikes me as relevant but I would love for someone in the know to explain just what's going on here. Thanks!

Detect when finger dragging UIButton overlaps UIImageView

I am using UIPanGestureRecognizer to drag a UIButton around the screen. The idea is that the user can drag it over a folder to insert it in the folder (like iOS icons). This code I found works fine if I want to detect when the button overlaps with the image:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (CGRectIntersectsRect([imageViewA frame], [imageViewB frame])) {
        NSLog(@"Do something.");
    }
}
But since the button is big and there are several images next to each other, the button may overlap more than one of them. I therefore want to detect when the user's finger holding the UIButton is over an image, so the right action gets triggered. Any ideas?
The pan gesture recognizer will recognize the pan, and when it ends you can use locationInView: to find the finger's position in the button's superview. You can then check whether they overlap with CGRectContainsPoint(frame, point):
- (void)handlePanGesture:(UIPanGestureRecognizer *)recognizer {
    if ([recognizer state] == UIGestureRecognizerStateEnded) {
        CGPoint fingerPoint = [recognizer locationInView:someImageView.superview];
        if (CGRectContainsPoint(someImageView.frame, fingerPoint)) {
            NSLog(@"Do something");
        }
    }
}
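If several folder images sit next to each other, the same check can be run against each candidate view, so only the folder directly under the finger reacts even when the dragged button's frame overlaps more than one of them. A sketch, where folderImageViews is a hypothetical array of those image views, assumed to share a superview with the recognizer's view:

- (void)handlePanGesture:(UIPanGestureRecognizer *)recognizer {
    if (recognizer.state != UIGestureRecognizerStateEnded) {
        return;
    }
    CGPoint fingerPoint = [recognizer locationInView:self.view];
    for (UIImageView *folderView in self.folderImageViews) {   // hypothetical array of folder views
        // only the folder directly under the finger wins
        if (CGRectContainsPoint(folderView.frame, fingerPoint)) {
            NSLog(@"Dropped onto %@", folderView);
            break;
        }
    }
}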

Four-finger Multitasking gesture activates UIPinchGestureRecognizer gesture

I am using a UIPinchGestureRecognizer, which uses 2 fingers by default. If a user performs the multitasking gesture, the pinch gesture's action is also triggered.
Is there a way to cancel the pinch gesture if four or more UITouch instances are detected?
Edit: Removed sample code as it was the wrong approach.
Since you're not subclassing UIPinchGestureRecognizer, you shouldn't be using touchesBegan:withEvent:. Instead you should handle it in the method that is called when a pinch occurs.
- (void)handlePinch:(UIPinchGestureRecognizer *)pinchGestureRecognizer
{
    // if there are 2 fingers being used
    if ([pinchGestureRecognizer numberOfTouches] == 2) {
        // do stuff
    }
}
With a multitask gesture, the numberOfTouches returned by the UIPinchGestureRecognizer is 2 instead of 4 or 5, because some touches are ignored.
You can subclass UIPinchGestureRecognizer and override ignoreTouch:forEvent: to cancel the recognizer if the event has 4 or 5 touches:
- (void)ignoreTouch:(UITouch *)touch forEvent:(UIEvent *)event
{
    [super ignoreTouch:touch forEvent:event];

    // Cancel recognizer during a multitask gesture
    if ([[event allTouches] count] > 3)
    {
        self.state = UIGestureRecognizerStateCancelled;
    }
}
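Note that assigning to self.state in a subclass requires importing UIKit's UIGestureRecognizerSubclass.h header, which exposes the writable state property and declares ignoreTouch:forEvent: for overriding. A minimal subclass might look like this (the class name is just an example):

// MultitaskAwarePinchRecognizer.h
#import <UIKit/UIKit.h>

@interface MultitaskAwarePinchRecognizer : UIPinchGestureRecognizer
@end

// MultitaskAwarePinchRecognizer.m
#import "MultitaskAwarePinchRecognizer.h"
#import <UIKit/UIGestureRecognizerSubclass.h>   // makes self.state assignable in subclasses

@implementation MultitaskAwarePinchRecognizer

- (void)ignoreTouch:(UITouch *)touch forEvent:(UIEvent *)event
{
    [super ignoreTouch:touch forEvent:event];
    // cancel the pinch while a four- or five-finger multitasking gesture is in flight
    if ([[event allTouches] count] > 3) {
        self.state = UIGestureRecognizerStateCancelled;
    }
}

@end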
