As shown in the picture below, the UIPanGestureRecognizer does not pan when the drag starts inside the "dead zone" at the top of the screen. This is most likely caused by Notification Center.
The touchesMoved:withEvent: method does get called, however, so there should be a way to get pan gestures recognized in this area.
Has anyone else come across this issue, and are there any workarounds out there yet? Thanks for any help!
It is possible that on the top edge of the screen you have multiple gesture recognizers that interfere. The other interfering gesture recognizer may have been added by the system for some UI element.
Try implementing UIGestureRecognizerDelegate on your gesture recognizer's delegate.
Especially this method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
NSLog(@"1st recognizer class: %@ ; 2nd recognizer class: %@", NSStringFromClass([gestureRecognizer class]), NSStringFromClass([otherGestureRecognizer class]));
return YES;
}
https://developer.apple.com/library/ios/documentation/uikit/reference/UIGestureRecognizerDelegate_Protocol/Reference/Reference.html#//apple_ref/occ/intfm/UIGestureRecognizerDelegate/gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:
Solved the problem. The touchesMoved:withEvent: method should be implemented this way (in a UIScrollView subclass, since it uses self.contentOffset):
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self];
CGPoint previousLocation = [touch previousLocationInView:self];
CGPoint contentOffset = self.contentOffset;
contentOffset.x -= location.x - previousLocation.x;
[self setContentOffset:contentOffset animated:NO];
}
Related
I am using a UISwipeGestureRecognizer to swipe left and right using UISwipeGestureRecognizerDirectionRight and UISwipeGestureRecognizerDirectionLeft, and it's working totally fine, but the default swipe is too sensitive; it's just a little bit more than a tap.
Is there any way I can make it work like a scroll view does when its paging-enabled property is true, where nothing happens until the user lifts their finger?
It would be fine if the view doesn't move along with the finger, and the gesture only fires once a left or right swipe is performed and the finger is lifted. Thanks.
A UISwipeGestureRecognizer will not give you any option to change the sensitivity or any other aspect of the swipe. To gain more control over the swipe, you could use one of the following:
UIPanGestureRecognizer
Here is something to try with UIPanGestureRecognizer:
- (void)handleGesture:(UIPanGestureRecognizer *)gestureRecognizer{
CGPoint velocity = [gestureRecognizer velocityInView:yourView];
if(velocity.x > 0){
//Right gesture
}else{
//Left gesture
}
}
UIResponder
All UIViews inherit from UIResponder, so you can use the touchesBegan, touchesMoved and touchesEnded methods to track the touch yourself. Try something like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
start = [touch locationInView:self.view]; // `start` is a CGPoint ivar
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *touch = [touches anyObject];
CGPoint end = [touch locationInView:self.view];
//Compare start and end points here.
}
How can I get a CGPoint of a users touch?
It's actually an MKMapView, but since it inherits from UIView, I'm thinking it's the same.
I can add a UIGestureRecognizer. But what I'm looking to, is to track the finger's position as it moves in the map view.
Thank you!
Use a gesture recognizer that uses this handler:
- (void)handleGesture:(UIGestureRecognizer*)gesture
{
CGPoint locationInView = [gesture locationInView:self.view];
// do stuff
}
The implementation below updates the coordinates as you drag your finger across the screen.
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint coordinate = [[touches anyObject] locationInView:self.view];
self.coordinateLabel.text = [NSString stringWithFormat:@"%f, %f", coordinate.x, coordinate.y];
}
I have a UIScrollView called templateView. I must add a swipe gesture to it to allow the user to swipe left/right to see other templates. The problem is that most of the time the user can't swipe easily, because the view scrolls up/down instead of swiping to another view. The finger needs to move strictly horizontally to swipe to another page, and this isn't acceptable from a user-experience perspective.
Any idea how to handle such cases? Is there a way to allow an angle tolerance when detecting the swipe gesture? Or is there a way to write a custom UIGestureRecognizer that detects oblique drags within a specific angle?
Thanks in advance.
Try implementing this UIGestureRecognizerDelegate method. It is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other gesture recognizer from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
Reference: UIGestureRecognizerDelegate Protocol
Do not forget to assign the delegate when you initialize your swipe gesture.
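For instance, a minimal sketch of that setup, assuming self adopts UIGestureRecognizerDelegate, handleSwipe: is a hypothetical action method, and templateView is the scroll view from the question:

```objc
// Assumes self conforms to UIGestureRecognizerDelegate and
// implements a (hypothetical) handleSwipe: action method.
UISwipeGestureRecognizer *swipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionLeft |
                  UISwipeGestureRecognizerDirectionRight;
swipe.delegate = self; // without this, the delegate method is never called
[self.templateView addGestureRecognizer:swipe];
```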
UPDATE 1: Creating your own gesture
You can always subclass UIGestureRecognizer and implement the touchesBegan, touchesMoved and touchesEnded methods, manually managing the states of the gesture depending on your own needs. (Remember to #import <UIKit/UIGestureRecognizerSubclass.h> in the subclass; that header is what makes self.state assignable.)
I am posting some sample code implementing a custom EdgeGestureRecognizer for your better understanding.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesBegan:touches withEvent:event];
// if not a single finger, then fail
if ([touches count] != 1)
{
self.state = UIGestureRecognizerStateFailed;
return;
}
// register the first touch location here; it will help
// you calculate the angle later
UITouch *touch = [touches anyObject];
self.startPoint = [touch locationInView:self.view];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesMoved:touches withEvent:event];
if (self.state == UIGestureRecognizerStateFailed) return;
UITouch *touch = touches.anyObject;
self.previousPoint = self.currentPoint;
self.previousPointTime = self.currentPointTime;
self.currentPoint = [touch locationInView:self.view];
self.currentPointTime = touch.timestamp;
if (self.state == UIGestureRecognizerStatePossible)
{
CGPoint translate = CGPointMake(self.currentPoint.x - self.startPoint.x, self.currentPoint.y - self.startPoint.y);
// see if we've moved the necessary minimum distance
if (sqrt(translate.x * translate.x + translate.y * translate.y) >= self.minimumRecognitionDistance)
{
// recognize if the angle is roughly horizontal, otherwise fail
double angle = atan2(translate.y, translate.x);
if ([self isAngleCloseEnough:angle])
self.state = UIGestureRecognizerStateBegan;
else
self.state = UIGestureRecognizerStateFailed;
}
}
else if (self.state == UIGestureRecognizerStateBegan)
{
self.state = UIGestureRecognizerStateChanged;
}
}
I need to get the touch position from a UIScrollView. But when I create a subclass of my scroll view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self];
self.position = point;
}
The function touchesBegan isn't always called.
If I swipe fast, touchesBegan is never called, but scrollViewDidScroll gets directly called instead.
I can't use gesture recognizers (UITapGestureRecognizer, ...) because users don't tap or swipe.
Are there other methods to get the touch position from a UIScrollView?
EDIT :
Thanks ThXou : https://stackoverflow.com/a/15763450/1752843
UIScrollView subclass :
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event{
self.position = point;
return self;
}
To detect the touch location inside a scroll view, try this code:
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTapGestureCaptured:)];
[scrollView addGestureRecognizer:singleTap];
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
CGPoint touchPoint = [gesture locationInView:scrollView];
}
You could alternatively use the locationInView: method of scrollView.panGestureRecognizer to get the coordinates of a touch in your view. This could be done in any of the scroll view's delegate methods, depending on your requirements.
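For example, a sketch of reading it during scrolling, assuming you have set yourself as the scroll view's delegate:

```objc
// Sketch: read the built-in pan gesture's location from a
// UIScrollViewDelegate callback (assumes self is the delegate).
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    CGPoint location = [scrollView.panGestureRecognizer locationInView:scrollView];
    NSLog(@"touch location: %@", NSStringFromCGPoint(location));
}
```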
I saw an answer on SO that I think can solve your problem. It seems that the SDK doesn't pass the touch-handling methods to UIScrollView subclasses anymore, so you should perform a hitTest: to pass the touches to your subclass.
See the answer for more information.
scrollView.delaysContentTouches = NO;
This makes touchesBegan and the other touch events get triggered before scrollViewDidScroll.
The hitTest: override is a hack and should not be the preferred solution.
I have a UIImageView that is added as a subview. It shows up when a button is pressed.
When someone taps outside of the UIImageView in any part of the application, I want the UIImageView to go away.
@interface SomeMasterViewController : UITableViewController <clip>
<clip>
@property (strong, nonatomic) UIImageView *someImageView;
There are some hints on Stack Overflow and in Apple's documentation that sound like what I need.
Apple's : Gesture Recognizers
Apple's : UIView hitTest:withEvent
Apple's : UITouch Class Reference
Stackoverflow: Listening to UITouch event along with UIGestureRecognizer
(not likely needed but..) - CGRectContainsPoint as mentioned in the following post titled: Comparing a UITouch location to UIImageView rectangle
However, I want to check my approach here. It's my understanding that the code needs to
Register a UITapGestureRecognizer to get all touch events that can happen in an application
UITapGestureRecognizer should have its cancelsTouchesInView and
delaysTouchesBegan and delaysTouchesEnded set to NO.
Compare those touch events with the someImageView (how? Using UIView hitTest:withEvent?)
Update: I am registering a UITapGestureRecognizer with the main UIWindow.
Final Unsolved Part
I have a handleTap:(UITapGestureRecognizer *) method that the UITapGestureRecognizer will call. How can I take the UITapGestureRecognizer it is given and see whether the tap falls outside of the UIImageView? The recognizer's locationInView: looks promising, but I do not get the results I expect: I expect to see a certain UIImageView when I click on it, and not see the UIImageView when I click in another spot. I get the feeling that the locationInView: method is being used incorrectly.
Here is my call to the locationInView method:
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer
{
if (gestureRecognizer.state != UIGestureRecognizerStateEnded) {
NSLog(@"handleTap NOT given UIGestureRecognizerStateEnded so nothing more to do");
return;
}
UIWindow *mainWindow = [[[UIApplication sharedApplication] delegate] window];
CGPoint point = [gestureRecognizer locationInView:mainWindow];
NSLog(@"point x,y computed as the location in a given view is %f %f", point.x, point.y);
UIView *touchedView = [mainWindow hitTest:point withEvent:nil];
NSLog(@"touchedView = %@", touchedView);
}
I get the following output:
<clip>point x,y computed as the location in a given view is 0.000000 0.000000
<clip>touchedView = <UIWindow: 0x8c4e530; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <UIWindowLayer: 0x8c4c940>>
I think you can just say [event touchesForView:<image view>]. If that returns an empty array, dismiss the image view. Do this in the table view controller's touchesBegan:withEvent:, and be sure to call [super touchesBegan:touches withEvent:event] or your table view will completely stop working. You probably don't even need to implement touchesEnded:/Cancelled:..., or touchesMoved:....
UITapGestureRecognizer definitely seems like overkill in this case.
You can use touch functions to do that:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
When the user touches the screen, your touchesBegan function is called first.
In touchesBegan:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint pt = [[touches anyObject] locationInView:self];
}
Now you have the point the user touched. Then you must find out whether that point is inside your UIImageView or not.
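A sketch of that containment test with CGRectContainsPoint, assuming pt comes from the touchesBegan above and that the image view's frame is expressed in the same coordinate space as pt:

```objc
// pt is the touch location from touchesBegan above; yourImageView's
// frame is assumed to be in the same coordinate space as pt.
if (CGRectContainsPoint(yourImageView.frame, pt)) {
    // the touch landed inside the image view
} else {
    // the touch was outside; dismiss the image view
    [yourImageView removeFromSuperview];
}
```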
But if you can give tags to your UIImageViews, that will be pretty easy.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
if (yourImageView.tag == [touch view].tag) {
[[self.view viewWithTag:yourImageView.tag] removeFromSuperview];
[yourImageView release];
}
}