`touchesBegan:withEvent:` is delayed at the left edge of the screen - iOS

I'm experiencing an issue where the first call to touchesBegan:withEvent: on a UIView or UIViewController is delayed when you touch the left edge of the screen. This seems to be a new issue with iOS 10, and it only happens on devices with 3D Touch (iPhone 6s and newer). In fact, if you disable 3D Touch in Settings > General > Accessibility, the issue goes away.
However, the issue doesn't seem to happen when you use UIGestureRecognizers. My workaround at the moment is to create a UIGestureRecognizer subclass that overrides the touches* methods and forwards them to my old implementation.
Is this just a bug or is there a way to get rid of the delay?

Try adding this to the viewDidAppear method; it might fix the issue. It happened to me as well, and this code from Stack Overflow fixed it. Hope it helps you too:
let window = view.window!
let gr0 = window.gestureRecognizers![0]
let gr1 = window.gestureRecognizers![1]
gr0.delaysTouchesBegan = false
gr1.delaysTouchesBegan = false

Like danialias, I'm working on a game. The solution I found (currently working, tested on an iPhone with 3D Touch enabled, where this was a real issue up to this point) works for both games and apps:
It seems that UITapGestureRecognizer doesn't suffer from this delay, so simply add one to your view, and use it to handle taps.
In my game, I store touches and handle them on every interval update, so I overrode
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
and there I stored the UITouch instance and returned NO.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    [self.touches addObject:touch];
    return NO;
}

Purteeek's solution seems to work nicely in my case too. This is an Objective-C implementation for SpriteKit:
- (void)didMoveToView:(SKView *)view {
    UIGestureRecognizer *gr0 = view.window.gestureRecognizers[0];
    UIGestureRecognizer *gr1 = view.window.gestureRecognizers[1];
    gr0.delaysTouchesBegan = NO;
    gr1.delaysTouchesBegan = NO;
}
This doesn't mess with other gesture recognizers, and the system 3D Touch still works fine. I wonder why this is not the default behavior.

In iOS 13.2 it seems that trick is no longer possible:
[Warning] Trying to set delaysTouchesBegan to NO on a system gate gesture
recognizer - this is unsupported and will have undesired side effects
Looks like the only solution is to disable 3D Touch in Settings.

This works for me:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    if let window = view.window,
       let recognizers = window.gestureRecognizers {
        recognizers.forEach { r in
            r.delaysTouchesBegan = false
            r.cancelsTouchesInView = false
            r.isEnabled = false
        }
    }
}

Related

iOS 13 UIPanGestureRecognizer behave differently from iOS 12

I have a custom scroll view, built on UIPanGestureRecognizer, that worked well before iOS 13:
_panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
_panRecognizer.delegate = self;
- (void)handlePan:(UIGestureRecognizer *)gestureRecognizer
{
    UIPanGestureRecognizer *pgr = (UIPanGestureRecognizer *)gestureRecognizer;
    if (pgr.state == UIGestureRecognizerStateChanged) {
        // do something
    }
}
It no longer works well on iOS 13: the handlePan function is not called until three fingers pan together, whereas on iOS 12 it was called when just one finger moved.
I have tried setting minimumNumberOfTouches/maximumNumberOfTouches, but it didn't help. Has anything changed?
It sounds like your gesture is now competing with a system gesture. Did you check the gestureRecognizers property of the view to see if something changed?
You might have to implement the gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:) delegate method; by default it returns false.
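A minimal sketch of that delegate method (the class name ScrollingViewController is hypothetical; returning true lets your pan recognize alongside the competing system gesture):

```swift
import UIKit

// Hypothetical view controller that owns the custom pan recognizer
// and acts as its delegate.
class ScrollingViewController: UIViewController, UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        // Allow the custom pan to run alongside system gestures
        // instead of losing to them.
        return true
    }
}
```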

Disable user interaction on one UIView as the other UIView is swiped

Before anything starts: I understand yourView.userInteractionEnabled = NO; is an option, but let me explain the circumstances first.
I have these 2 UIView objects, stoneOne and stoneTwo. I have 4 UISwipeGestureRecognizer objects attached to them for up, down, left and right. Imagine swiping these 'stones' around a 5x5 grid.
What I don't want is to be able to swipe both at the same time.
Currently, that bug is still an issue. I'll show you a method called swipeLeft: which represents the layout for all swipe directions.
- (IBAction)swipeLeft:(UISwipeGestureRecognizer *)recognizer {
    _oldMove1 = _move1;
    _oldMove2 = _move2;
    if (recognizer.view == _oneStone
        && recognizer.direction == UISwipeGestureRecognizerDirectionLeft) {
        _twoStone.userInteractionEnabled = NO;
        _oneStone = recognizer.view;
        [self moveOne:CGPointMake(-1, 0) withView:_oneStone];
        self.move1++;
        // 'causeADelay:' runs _twoStone.userInteractionEnabled = YES;
        [self performSelector:@selector(causeADelay:) withObject:_twoStone afterDelay:1];
    } else if (recognizer.view == _twoStone
               && recognizer.direction == UISwipeGestureRecognizerDirectionLeft) {
        _oneStone.userInteractionEnabled = NO;
        _twoStone = recognizer.view;
        [self moveTwo:CGPointMake(-1, 0) withView:_twoStone];
        self.move2++;
        [self performSelector:@selector(causeADelay:) withObject:_oneStone afterDelay:1];
    }
    self.moveCount++;
}
One of the things I tried was adding a delay before the UIView objects became interactive again. This worked only if I waited a split second before interacting; then the full delay would occur and everything would work.
The bug appears when you swipe both at the same time. Is that because of the swipe gestures attached to them?
I also tried removing and re-adding the objects as subviews, which didn't work. I really need this to work, otherwise I have a dead-end game. I was very new to coding when I started and never considered Cocos2d or other game-oriented frameworks, so everything here came from on-the-fly thinking.
There are several solutions, but here's a particularly easy one:
Remove the swipe gesture recognizers from the stones and attach them instead to the common superview of the stones. This solves the problem, because only one gesture recognizer on the same view will recognize at any one time.
Of course, you will now have to use hit-testing to find out which stone (if any) is being swiped. But that's an easy implementation detail, and is a small price to pay.
And of course another cool feature is that you now only need four gesture recognizers total!
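A sketch of that approach; the class name, the stone properties, and the empty move comments are placeholders for the asker's own code:

```swift
import UIKit

// Hypothetical controller: one set of swipe recognizers on the common
// superview, hit-testing to find which stone (if any) was swiped.
class BoardViewController: UIViewController {
    let oneStone = UIView()
    let twoStone = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Four recognizers total, attached to the superview rather than the stones.
        for direction in [UISwipeGestureRecognizer.Direction.up, .down, .left, .right] {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        let point = recognizer.location(in: view)
        // Hit-test: which stone is under the swipe's location?
        if oneStone.frame.contains(point) {
            // move oneStone in recognizer.direction
        } else if twoStone.frame.contains(point) {
            // move twoStone in recognizer.direction
        }
    }
}
```

Because only one recognizer on the superview can recognize at a time, simultaneous swipes on both stones can no longer fire together.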

Google Maps SDK iOS - prevent map from changing location on zoom

I have a problem that I haven't been able to solve for some time.
I have a GMSMapView with an imageView centered in front of it. As a result, I can drag the map and always have a centered pin. But problems come when I zoom the map:
on zooming, the position of the map target changes and my imageView points to another location.
I can detect whether the zoom changed, but I can't force GMSMapView to zoom without any change of location.
- (void)mapView:(GMSMapView *)mapView didChangeCameraPosition:(GMSCameraPosition *)position
{
    if (mZoomLevel != mapView.camera.zoom)
    {
        mZoomLevel = mapView.camera.zoom;
    }
}
So basically, I want an always-centered pin, even while zooming.
I tried GMSMarker, but it has performance problems when following the map center: it doesn't follow instantly, so I decided to use an imageView.
Main question: how do I lock the current location of the map while zooming?
Google fixed this issue in version 1.10.0 of the Google Maps SDK for iOS.
The solution is to simply add this line when configuring the GMSMapView:
_mapView.settings.allowScrollGesturesDuringRotateOrZoom = NO;
Well, after 10 days of coming back to this problem, I finally solved it! The answer was pretty easy. A few steps:
1. Add the imageView as a marker on top of the GMSMapView.
2. Add a UIPanGestureRecognizer to the GMSMapView (don't forget to set the delegate; it is important).
3. Then use this code:
- (void)didPan:(UIPanGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        _mapView.settings.scrollGestures = YES;
    }
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    if (gestureRecognizer.numberOfTouches > 1)
    {
        _mapView.settings.scrollGestures = NO;
    }
    else
    {
        _mapView.settings.scrollGestures = YES;
    }
    return YES;
}
Swift 3:
mapView.settings.scrollGestures = false

Using gesture recognizers that dynamically stick to one touch among many

I have a view with four pan gestures attached. The first has both max and min number of touches set to 1, the second to 2, etc. This makes it so each will only recognize one touch while up to four fingers slide around on the screen.
That's working dandy. What isn't working is detecting when individual touches end. Anything I have set to happen when a gesture ends only happens when all gestures have ended completely.
Example action method:
- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    // What happens when a gesture is recognized as beginning
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // ...whatever happens, bunnies follow your finger or whatever
    } else if ((recognizer.state == UIGestureRecognizerStateEnded) ||
               (recognizer.state == UIGestureRecognizerStateCancelled) ||
               (recognizer.state == UIGestureRecognizerStateFailed)) {
        // What happens when a gesture ends
        NSLog(@"end");
    }
}
What should be happening is that I see "end" in the console whenever any finger is lifted. Instead, I see nothing until all fingers are lifted, at which point I see "end" repeated four times (or as many times as there were fingers on the screen).
Is there any way I can make this work the way I intend?
Edit: After fiddling, I see that I may not have been analyzing my problem correctly. The whole reason I want to detect when a gesture's touch ends is that I want gestures to be able to become active when there is more than one touch on screen, while each gesture only tracks one touch itself. I was setting an "active" flag on gestures that were tracking touches and toggling it off after touches ended, but that wasn't working, because touch-end detection was hard to implement well.
But if there's a different way to achieve the same thing, that's the real thing I'm looking for: among many overlapping touches, have each gesture recognizer track one and only one.
You may want to do something like this; it catches the change in the number of fingers on the screen for the given gesture. You may need to add some more logic around which gesture you're working with:
switch (recognizer.numberOfTouches) {
    case 1:
        NSLog(@"1");
        break;
    case 2:
        NSLog(@"2");
        break;
    case 3:
        NSLog(@"3");
        break;
    case 4:
        NSLog(@"4");
        break;
    default:
        NSLog(@"0");
}
This is what eventually worked.
In short, I made a flag that flipped whenever a gesture recognizer was assigned a touch, ensuring no other recognizers accepted that touch. I also tested each recognizer to make sure it only accepted a touch when it wasn't already following a touch. So I made each touch only get assigned once, and each recognizer only accept one touch. Worked like a charm.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Set this to NO every time a new touch happens, meaning it isn't taken yet.
    touchTaken = NO;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // If the touch is taken or the gesture is already following a touch, say no.
    if (touchTaken || ([gestureRecognizer numberOfTouches] > 0)) {
        return NO;
    } else {
        touchTaken = YES;
        return YES;
    }
}

iOS Detect User Touch Release

This may have been posted here somewhere, but I can't find it. I am writing a simple iOS app with two UIViews. The user first presses and holds a certain area of the screen, then releases that touch and quickly touches a second view below.
The first UIView has a UILongPressGestureRecognizer attached to it and works fine. The second UIView has a UITapGestureRecognizer attached to it and also works fine. I cannot however, get either of these gesture recognizers to return anything that states that the user released their touch.
I have tried this code to no avail:
- (void)holdAction:(UILongPressGestureRecognizer *)holdRecognizer
{
    if (UIGestureRecognizerStateRecognized) {
        holdLabel.text = @"Holding Correctly. Release Touch when ready.";
        holdView.backgroundColor = [UIColor greenColor];
    } else if (UIGestureRecognizerStateCancelled) {
        holdLabel.text = @"Ended";
        holdView.backgroundColor = [UIColor redColor];
    }
}
Any suggestions would be great and especially if someone knows how to implement a call that returns the state of a user touching the device. I've looked over the developer docs and have come up empty.
After tinkering for a couple of hours, I found a way that works, though I'm not sure it's the best way. Turns out I needed to write it like the code below; I wasn't checking the state of the specific UIGestureRecognizer I declared in the viewDidLoad() method.
- (void)holdAction:(UILongPressGestureRecognizer *)holdRecognizer
{
    if (holdRecognizer.state == UIGestureRecognizerStateBegan) {
        holdLabel.text = @"Holding Correctly. Release when ready.";
        holdView.backgroundColor = [UIColor greenColor];
    } else if (holdRecognizer.state == UIGestureRecognizerStateEnded) {
        holdLabel.text = @"You let go!";
        holdView.backgroundColor = [UIColor redColor];
    }
}
You need to use manual touch handling here (as opposed to using a gesture recognizer). Any UIResponder subclass can implement the following four methods:
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
– touchesCancelled:withEvent:
By using these methods, you get access to every phase of the touch events. You might have to implement your own logic to detect the long press, but you have full access to all touches.
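As a rough sketch of that manual approach (the 1-second threshold, class name, and empty action comments are assumptions, not part of the answer):

```swift
import UIKit

// Sketch: detect press, long-press, and release manually in a UIResponder subclass.
class HoldView: UIView {
    private var touchStart: Date?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchStart = Date()
        // Touch down: e.g. show the "holding" state.
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let start = touchStart, Date().timeIntervalSince(start) >= 1.0 {
            // Released after holding for at least 1 second.
        }
        touchStart = nil
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // The system cancelled the touch (e.g. an incoming call); reset.
        touchStart = nil
    }
}
```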
For more information on touch handling, this session from WWDC 2011 is golden (requires dev account):
https://developer.apple.com/itunes/?destination=adc.apple.com.8270634034.08270634040.8367260921?i=1527940296
Swift 4+
let gesture = UILongPressGestureRecognizer(target: self, action: #selector(self.checkAction))
self.view.addGestureRecognizer(gesture)
@objc func checkAction(sender: UILongPressGestureRecognizer) {
    if sender.state == .ended || sender.state == .cancelled || sender.state == .failed {
        // The touch was released here.
    }
}