iOS detect drag outside left edge of screen

On the iPad, when you put your finger outside the top or bottom edge of the screen and then drag it onto the screen, a menu is revealed. How can I implement that?

There is a gesture recognizer class specifically for this, introduced in iOS 7: UIScreenEdgePanGestureRecognizer. The documentation for it is here. Check it out.
To test this in the simulator, just start the drag from near the edge (~15 points).
Also, you will have to create a gesture recognizer for each edge. You can't OR edges together, so UIRectEdgeAll won't work.
There is a simple example here. Hope this helps!
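For reference, a minimal sketch of that setup (handleEdgePan: is an assumed handler name; this would typically go in viewDidLoad):
UIScreenEdgePanGestureRecognizer *leftEdgeGesture = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleEdgePan:)];
leftEdgeGesture.edges = UIRectEdgeLeft;
[self.view addGestureRecognizer:leftEdgeGesture];

// A separate recognizer is needed for the right edge (edges can't be OR'd together).
UIScreenEdgePanGestureRecognizer *rightEdgeGesture = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleEdgePan:)];
rightEdgeGesture.edges = UIRectEdgeRight;
[self.view addGestureRecognizer:rightEdgeGesture];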

You can do something like this. This example covers the case where you want your pan gesture to work only when the user swipes in from within 20 points of the right-hand side of the screen.
First of all, add the gesture to your view:
- (void)addGestures {
    if (!_panGesture) {
        _panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
        [_panGesture setDelegate:self];
        [self.view addGestureRecognizer:_panGesture];
    }
}
After adding it, check whether the touch you received belongs to the pan gesture, and then perform your action accordingly:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    CGPoint point = [touch locationInView:self.view];
    if (gestureRecognizer == _panGesture) {
        return [self slideMenuForGestureRecognizer:gestureRecognizer withTouchPoint:point];
    }
    return YES;
}
Here is how you can check whether the touch is contained in the region where you want it to be:
-(BOOL)isPointContainedWithinBezelRect:(CGPoint)point {
    CGRect bezelRect;
    CGRect tempRect;
    // The bezel is the strip of this width along the CGRectMaxXEdge (right-hand) side
    // of the screen, thus identifying the region.
    CGFloat bezelWidth = 20.0;
    CGRectDivide(self.view.bounds, &bezelRect, &tempRect, bezelWidth, CGRectMaxXEdge);
    return CGRectContainsPoint(bezelRect, point);
}
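The slideMenuForGestureRecognizer:withTouchPoint: helper called in the delegate method isn't shown in the answer; a minimal sketch of what it might do, assuming it just gates the gesture on the bezel check above, could be:
// Hypothetical helper: only let the pan through when the touch starts in the bezel strip.
- (BOOL)slideMenuForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
                       withTouchPoint:(CGPoint)point {
    return [self isPointContainedWithinBezelRect:point];
}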

Related

How to get coordinates of the tap in 2 views simultaneously

I am working on an iOS map app that includes an interactive map. The interactive map is a subclass of UIImageView and is placed in a scroll view. My view hierarchy is View > ScrollView > ImageView.
When the user taps some part of the map, the ViewController performs an animated segue (like a zoom-in to that area of the map). I can start the segue from any point of the screen, but to do this properly I need the exact coordinates of the user's tap relative to the screen itself. Since the ImageView sits on top of the ScrollView, it uses a different coordinate system, larger than the screen size. It doesn't matter which area of the map has been tapped; what matters is the tapped CGPoint on the (physical) screen.
The ImageView uses its own code to get the coordinates of a tap:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    // cancel previous touch ended event
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];
    // perform new one
    [self performSelector:@selector(_performHitTestOnArea:)
               withObject:touchValue
               afterDelay:0.1];
}
If I attach a gesture recognizer instead, it works, but then the ImageView no longer receives any touches and therefore can't trigger the segue.
Here is the gesture recognizer code I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];
// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateRecognized)
    {
        CGPoint point = [recognizer locationInView:recognizer.view];
        // again, point.x and point.y have the coordinates
    }
}
So, is there any way to get the two coordinates in their different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to another:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Here self.parentView is the root "View" of the hierarchy above, which has the size of the screen.
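As for the other half of the question, letting the recognizer and the ImageView's touch handling coexist: a UIGestureRecognizerDelegate can opt in to simultaneous recognition. A minimal sketch, assuming the recognizer's delegate is set to the view controller:
// Let the tap recognizer fire without swallowing the ImageView's own touches.
rec.cancelsTouchesInView = NO;
rec.delegate = self;

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}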

Is there a built-in way to add a gesture recognizer (specifically pan) just to the right or left edges of a view?

Throughout iOS 7 there are many situations where users can slide a finger in from the left or right edge of the screen to perform an action, such as popping a view controller or showing a sidebar.
Is there a built-in way to do this that I've completely overlooked somehow (yes, I've searched extensively)? Or is the only way to check the frame position of where the pan started?
I ask because I want to perform distinct actions depending on whether the user pulls from the edge or from, say, the middle.
You have UIScreenEdgePanGestureRecognizer, which was added in iOS 7 to detect, well, panning from the edges of the screen. For panning from the middle, a normal pan gesture recognizer will suffice; you can check whether the pan originated close enough to the middle.
Use the UIScreenEdgePanGestureRecognizer, but check that it's available first (since it's iOS 7+):
if (NSClassFromString(@"UIScreenEdgePanGestureRecognizer")) {
    UIScreenEdgePanGestureRecognizer *panRecognizer =
        [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handleScreenEdgePanGesture:)];
    panRecognizer.edges = UIRectEdgeLeft;
    // Don't forget to attach the recognizer to the view.
    [self.view addGestureRecognizer:panRecognizer];
}
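A handler for it might look like the following; this is only a sketch, and handleScreenEdgePanGesture: is just the selector name assumed above:
- (void)handleScreenEdgePanGesture:(UIScreenEdgePanGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // The pan started at the left screen edge; kick off your edge action here.
        NSLog(@"Edge pan began at %@", NSStringFromCGPoint([gesture locationInView:self.view]));
    }
}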
Set up gestures
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanning:)];
[self.view addGestureRecognizer:panGesture];
Handling gesture states
- (void)handlePanning:(UIPanGestureRecognizer *)gestureRecognizer
{
    switch ([gestureRecognizer state])
    {
        case UIGestureRecognizerStateBegan:
            [self startDragging:gestureRecognizer];
            break;
        // you won't need the following cases
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
        case UIGestureRecognizerStateFailed:
            [self stopDragging:gestureRecognizer];
            break;
        default:
            break;
    }
}
Recognizing the start point of the drag
- (void)startDragging:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint pointInSrc = [gestureRecognizer locationInView:yourVIEW];
}
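To distinguish an edge drag from a middle drag with a plain pan recognizer, you can compare that start point against a bezel width. A minimal sketch, assuming a hypothetical 20-point edge zone:
// Hypothetical check: did the drag start within 20 points of the left or right edge?
- (BOOL)dragStartedAtEdge:(CGPoint)startPoint inView:(UIView *)view
{
    CGFloat bezelWidth = 20.0;
    return (startPoint.x <= bezelWidth) ||
           (startPoint.x >= CGRectGetWidth(view.bounds) - bezelWidth);
}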
The pan gesture recognizer is a predefined recognizer; about all you can do with it is determine how fast the user moved their finger. If you want to tell where the movement started, you'll have to code your own recognizer. It's not that difficult; you'll be notified when the touch started and where it ended.

UIButton subview of a gesture recognizer

I have a UIView with two gesture recognizers. Both recognize a two-finger tap: one for the upper half of the screen, the other for the bottom half.
In that UIView, I have 4 buttons that cover the entire screen (each button is a quarter of the screen).
I'm using the gesture recognizer to detect when the user presses 2 buttons at the same time, and I still want to recognize the normal touches on the buttons.
I've set everything up, and it works fine. However, when pressing with just one finger, the shadow on the button appears on Touch Up rather than on Touch Down, and it feels weird. I've tried changing delaysTouchesBegan with no success.
Is there a way to have both behaviours? Detect the touches with two fingers, but keep the normal behaviour when there's only one finger? Otherwise, can I force the pressed state of a UIButton?
Here's how I set up my gestures:
-(void)initGestureRecognition {
    handClapTapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handClapDetected:)];
    handClapTapGestureRecognizer.numberOfTouchesRequired = 2;
    handClapTapGestureRecognizer.numberOfTapsRequired = 1;
    handClapTapGestureRecognizer.cancelsTouchesInView = YES;
    [self.gestureRecognitionView addGestureRecognizer:handClapTapGestureRecognizer];
    handClapTapGestureRecognizer.delegate = self;

    jumpTapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(jumpDetected:)];
    jumpTapGestureRecognizer.numberOfTouchesRequired = 2;
    jumpTapGestureRecognizer.numberOfTapsRequired = 1;
    jumpTapGestureRecognizer.cancelsTouchesInView = YES;
    [self.gestureRecognitionView addGestureRecognizer:jumpTapGestureRecognizer];
    jumpTapGestureRecognizer.delegate = self;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return NO;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if ([gestureRecognizer isEqual:handClapTapGestureRecognizer] && [touch locationInView:self.view].y > self.view.frame.size.height/2)
        return NO;
    if ([gestureRecognizer isEqual:jumpTapGestureRecognizer] && [touch locationInView:self.view].y < self.view.frame.size.height/2)
        return NO;
    return YES;
}
I know my problem is similar to this one: UIButton inside a view that has a UITapGestureRecognizer. But the difference is that in my case the behaviour is OK, and I'm just trying to get the shadow on the button on Touch Down rather than on Touch Up.
Thanks
Could you manually set [button setHighlighted:YES] when the tap gesture is first recognized and its location matches the button's, and then set it back to NO when the gesture ends?
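A minimal sketch of that idea, assuming the four buttons are reachable through a hypothetical self.buttons array property:
- (void)handClapDetected:(UITapGestureRecognizer *)recognizer
{
    // Highlight whichever buttons the individual touches landed on.
    for (NSUInteger i = 0; i < recognizer.numberOfTouches; i++) {
        for (UIButton *button in self.buttons) { // self.buttons is assumed, not from the question
            CGPoint point = [recognizer locationOfTouch:i inView:button];
            if ([button pointInside:point withEvent:nil]) {
                button.highlighted = YES;
            }
        }
    }
    // Clear the pressed state shortly afterwards, once the tap has been handled.
    [self performSelector:@selector(clearButtonHighlights) withObject:nil afterDelay:0.15];
}

- (void)clearButtonHighlights
{
    for (UIButton *button in self.buttons) {
        button.highlighted = NO;
    }
}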

Some buttons fail hitTest

My interface sometimes has buttons around its periphery. Areas without buttons accept gestures.
Gesture recognizers are added to the container view in viewDidLoad. Here's how the tap recognizer (tapGR) is set up:
UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(playerReceived_Tap:)];
[tapGR setDelegate:self];
[self.view addGestureRecognizer:tapGR];
In order to prevent the gesture recognizers from intercepting button taps, I implemented shouldReceiveTouch to return YES only if the view touched is not a button:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
       shouldReceiveTouch:(UITouch *)touch {
    // Get the topmost view that contains the point where the gesture started.
    // (Buttons are topmost, so if they were touched, they will be returned as viewTouched.)
    CGPoint pointPressed = [touch locationInView:self.view];
    UIView *viewTouched = [self.view hitTest:pointPressed withEvent:nil];
    // If that topmost view is a button, the GR should not take this touch.
    if ([viewTouched isKindOfClass:[UIButton class]])
        return NO;
    return YES;
}
This works fine most of the time, but there are a few buttons that are unresponsive. When these buttons are tapped, hitTest returns the container view, not the button, so shouldReceiveTouch returns YES and the gestureRecognizer commandeers the event.
To debug, I ran some tests...
The following tests confirmed that the button was a sub-subview of the container view, that it was enabled, and that both the button and its superview were userInteractionEnabled:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
      shouldReceiveTouch:(UITouch *)touch {
    // Test that hierarchy is as expected: containerView > vTop_land > btnSkipFwd_land.
    for (UIView *subview in self.view.subviews) {
        if ([subview isEqual:self.playComposer.vTop_land])
            printf("\nViewTopLand is a subview."); // this prints
    }
    for (UIView *subview in self.playComposer.vTop_land.subviews) {
        if ([subview isEqual:self.playComposer.btnSkipFwd_land])
            printf("\nBtnSkipFwd is a subview."); // this prints
    }
    // Test that the problem button is enabled.
    printf("\nbtnSkipFwd enabled? %d", self.playComposer.btnSkipFwd_land.enabled); // prints 1
    // Test that all views in the hierarchy are interaction-enabled.
    printf("\nvTopLand interactionenabled? %d", self.playComposer.vTop_land.userInteractionEnabled); // prints 1
    printf("\nbtnSkipFwd interactionenabled? %d", self.playComposer.btnSkipFwd_land.userInteractionEnabled); // prints 1
    // etc
}
The following test confirms that the point pressed is actually within the button’s frame.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
       shouldReceiveTouch:(UITouch *)touch {
    CGPoint pointPressed = [touch locationInView:self.view];
    CGRect rectSkpFwd = self.playComposer.btnSkipFwd_land.frame;
    // Get the pointPressed relative to the button's frame.
    CGPoint pointRelSkpFwd = CGPointMake(pointPressed.x - rectSkpFwd.origin.x, pointPressed.y - rectSkpFwd.origin.y);
    printf("\nIs relative point inside skipfwd? %d.", [self.playComposer.btnSkipFwd_land pointInside:pointRelSkpFwd withEvent:nil]); // prints 1
    // etc
}
So why is hitTest returning the container view rather than this button?
SOLUTION: The one thing I wasn't testing was that the intermediate view, vTop_land, was framed properly. It looked OK because it had an image that extended across the screen, past the bounds of its frame (I didn't know this was possible). The frame was set to the portrait width rather than the landscape width, so buttons on the far right were outside the frame.
Hit testing is not reliable in most cases, and it is generally not advisable to use it along with gesture recognizers.
Why don't you set exclusiveTouch to YES for each button? This should make sure that the buttons are always chosen.
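For example (a sketch, assuming the buttons live in the vTop_land view from the question):
// Mark every button as exclusive-touch so it wins over other touch handling.
for (UIView *subview in self.playComposer.vTop_land.subviews) {
    if ([subview isKindOfClass:[UIButton class]]) {
        subview.exclusiveTouch = YES;
    }
}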

iOS Animate sliding an imageview from off-screen to on-screen with gesture

I'm looking to animate bubbles with text on them to slide on and off the screen. The ideal implementation for this animation is iOS's horizontal scroll with paging enabled. I definitely want the "bounce" when I reach the end of the speech bubbles, and I definitely want the bubbles to track the finger until a certain point before they slide off the screen. I believe this is not the same as a swipe (which is just a flick in one direction).
However, the problem with the horizontal scroll is that it is optimized for a static number of images. I will have a dynamic number of images, and as far as I can tell you cannot dynamically append images to the horizontal scroller. The idea is that the app dynamically adds content to the scroller as you progress through it.
The scroller was easy enough to get going, but I'm going to have to tear it down now. How can I get started with the gesture (I'm not sure the standard gesture recognizers will work for me at this point) as well as the animation? I've never worked with this portion of iOS code before.
I'm not sure if I follow your question entirely, but if you want to animate the movement of something based upon a gesture, you can use a UIPanGestureRecognizer and change the center of whatever subview you want. For example, in viewDidLoad you would:
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(movePiece:)];
[whateverViewYouWantToAnimate addGestureRecognizer:panGesture];
You can then have your gesture recognizer move it wherever you want:
- (void)movePiece:(UIPanGestureRecognizer *)gestureRecognizer
{
    static CGPoint originalCenter;
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan)
    {
        originalCenter = [gestureRecognizer view].center;
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint translation = [gestureRecognizer translationInView:self.view];
        gestureRecognizer.view.center = CGPointMake(originalCenter.x + translation.x, originalCenter.y);
        // if you wanted to animate both left/right and up/down, it would be:
        // gestureRecognizer.view.center = CGPointMake(originalCenter.x + translation.x, originalCenter.y + translation.y);
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        // replace this offscreen CGPoint with something that makes sense for your app
        CGPoint offscreen = CGPointMake(480, gestureRecognizer.view.center.y);
        [UIView animateWithDuration:0.5
                         animations:^{
                             gestureRecognizer.view.center = offscreen;
                         }
                         completion:^(BOOL finished){
                             // when you're done, you might want to do whatever cleanup
                             // is appropriate for your app (e.g. do you want to remove it?)
                             [gestureRecognizer.view removeFromSuperview];
                         }];
    }
}
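Since you want the bubbles to track the finger only up to a point before committing to the slide-off, one refinement to consider (my suggestion, not part of the original answer) is to look at the pan's velocity when the gesture ends, and either finish the slide or snap back. You could replace the UIGestureRecognizerStateEnded branch above with something like:
else if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
{
    // Hypothetical threshold: fast flicks slide the bubble off screen, slow releases snap back.
    CGPoint velocity = [gestureRecognizer velocityInView:self.view];
    BOOL flungOff = fabs(velocity.x) > 500.0;
    CGPoint target = flungOff
        ? CGPointMake(velocity.x > 0 ? 480 : -480, gestureRecognizer.view.center.y)
        : originalCenter; // originalCenter is the static variable captured in StateBegan
    [UIView animateWithDuration:0.3
                     animations:^{
                         gestureRecognizer.view.center = target;
                     }];
}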
