UIPanGestureRecognizer in Texture/AsyncDisplayKit's ASCellNode prevents the table from scrolling - ios

I am using the Texture/AsyncDisplayKit library:
https://github.com/texturegroup/texture
I am trying to implement "swipe to perform action" in an ASCellNode. The problem is that the UIPanGestureRecognizer prevents the table from scrolling.
I am able to get this working in UIKit using UITableViewCell, but for some reason it does not work with Texture's ASCellNode. I can demonstrate the issue easily with the ASDKgram example provided with this library, which has a UIKit example in one tab and a Texture example in another tab:
https://github.com/TextureGroup/Texture/tree/master/examples/ASDKgram
For the UIKit example, all I had to do was:
Add <UIGestureRecognizerDelegate> to PhotoTableViewCell.h
Add UIPanGestureRecognizer *_panGestureRecognizer; inside @implementation PhotoTableViewCell
Add the following to - (instancetype)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier::
_panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panned:)];
_panGestureRecognizer.delegate = self;
[self.contentView addGestureRecognizer:_panGestureRecognizer];
Add the following methods:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == _panGestureRecognizer) {
        CGPoint v = [_panGestureRecognizer velocityInView:_panGestureRecognizer.view];
        return fabs(v.x) > fabs(v.y);
    }
    return NO;
}
- (void)panned:(UIPanGestureRecognizer *)sender {
    NSLog(@"Panned!");
}
This was enough to print Panned! when panning horizontally, while still letting the UITableView scroll when the pan is vertical.
The same does not work for PhotoCellNode. I did the following:
Add <UIGestureRecognizerDelegate> to PhotoCellNode.h
Add UIPanGestureRecognizer *_panGestureRecognizer; inside @implementation PhotoCellNode
Add the following to PhotoCellNode.m:
- (void)didLoad {
    [super didLoad];
    _panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panned:)];
    _panGestureRecognizer.delegate = self;
    [self.view addGestureRecognizer:_panGestureRecognizer];
}
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == _panGestureRecognizer) {
        CGPoint v = [_panGestureRecognizer velocityInView:_panGestureRecognizer.view];
        return fabs(v.x) > fabs(v.y);
    }
    return NO;
}
- (void)panned:(UIPanGestureRecognizer *)sender {
    NSLog(@"Panned!");
}
This makes Panned! print when panning horizontally, but the table does not scroll at all. Why do the two behave differently? How can I make the table scroll when the touches are vertical?
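Not from the original thread, but one hedged workaround worth trying: instead of relying on gestureRecognizerShouldBegin: alone, allow the scroll view's own pan recognizer to run alongside the cell's, and filter direction inside the action. The isKindOfClass: check is an assumption about how to identify the table's recognizer; this is a sketch, not a confirmed Texture fix.

```objectivec
// Sketch: cooperate with the enclosing scroll view's pan recognizer so the
// table is never blocked, then ignore mostly-vertical pans in the handler.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Only cooperate with a scroll view's pan; leave other interactions unchanged.
    return [otherGestureRecognizer.view isKindOfClass:[UIScrollView class]];
}

- (void)panned:(UIPanGestureRecognizer *)sender {
    CGPoint v = [sender velocityInView:sender.view];
    if (fabs(v.x) <= fabs(v.y)) {
        return; // mostly vertical: let the table scroll, do nothing here
    }
    NSLog(@"Panned!");
}
```

The trade-off is that both recognizers can fire during a diagonal pan, so the handler must tolerate the table moving underneath it.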

Related

Put Swipe Gesture over UIWebView to Get Scroll Direction in iOS

I want to get the scroll direction in webView. I read the code about getting scroll direction here.
- (void)userDidScrollWebView:(id)scrollPoint {
    // NSLog(@"scrolled:::");
    NSString *x1 = [webView stringByEvaluatingJavaScriptFromString:@"scrollX"];
    NSString *y1 = [webView stringByEvaluatingJavaScriptFromString:@"scrollY"];
    NSLog(@"scroll x=%@ y=%@", x1, y1);
    if ([y1 isEqualToString:@"0"]) {
        NSLog(@"RELOAD ME");
    }
}
I have two questions:
About this code, I don't know where to call the userDidScrollWebView method so that I get regular updates about scrolling.
As another approach, I thought I could place a swipe gesture over the web view, but that's not working.
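For the first question, one possible answer (not from the original post, and assuming iOS 5+ where UIWebView exposes its internal scroll view) is to observe the web view's scroll view directly instead of polling JavaScript:

```objectivec
// Sketch: become the delegate of UIWebView's internal scroll view to get
// continuous scroll callbacks. Caveat: UIWebView manages this delegate
// internally, so reassigning it can have side effects; treat as a sketch.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.webView.scrollView.delegate = self; // requires <UIScrollViewDelegate>
}

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    [self userDidScrollWebView:nil]; // called repeatedly while scrolling
}
```

This fires for every scroll movement, so the reload check in userDidScrollWebView: runs without any swipe gestures at all.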
This is how I implemented the swipe gesture in the UIWebView.
Add the <UIGestureRecognizerDelegate> protocol to ViewController.h
In the viewDidLoad method, add the following code:
UISwipeGestureRecognizer *swipeGestureDown = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeDown)];
swipeGestureDown.numberOfTouchesRequired = 1;
swipeGestureDown.direction = UISwipeGestureRecognizerDirectionDown;
swipeGestureDown.delegate = self;
[self.webView addGestureRecognizer:swipeGestureDown];
Add the delegate method in ViewController.m:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
- (void)swipeDown
{
    NSLog(@"swipe down in webView");
}
Similarly, you can add another gesture for swipe up.
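The swipe-up counterpart mentioned above could look like this (the selector name swipeUp is a placeholder, mirroring the pattern already shown):

```objectivec
// In viewDidLoad, alongside the swipe-down recognizer:
UISwipeGestureRecognizer *swipeGestureUp =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipeUp)];
swipeGestureUp.numberOfTouchesRequired = 1;
swipeGestureUp.direction = UISwipeGestureRecognizerDirectionUp;
swipeGestureUp.delegate = self; // same delegate returning YES as above
[self.webView addGestureRecognizer:swipeGestureUp];

// Elsewhere in the view controller:
- (void)swipeUp
{
    NSLog(@"swipe up in webView");
}
```

Note that each direction needs its own recognizer instance; a single UISwipeGestureRecognizer only matches the direction(s) set on it.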

iOS MKAnnotationView LongPressGestureRecognizer

Hi to everyone and thanks in advance =)
I have a doubt related to MKMapView and MKAnnotationView. I need to show annotations with custom images on an MKMapView. To do this, following several tutorials and other Stack Overflow answers, I created my own class. EDAnnotation.h:
@interface EDAnnotation : MKAnnotationView
//@property (nonatomic, strong) UIImageView *imageView;
- (id)initWithAnnotation:(id <MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier;
@end
EDAnnotation.m:
#import "EDAnnotation.h"
@implementation EDAnnotation
- (id)initWithAnnotation:(id <MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier {
    self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier];
    if (self != nil) {
        CGRect frame = self.frame;
        frame.size = CGSizeMake(15.0, 15.0);
        self.frame = frame;
        self.backgroundColor = [UIColor clearColor];
        self.centerOffset = CGPointMake(-5, -5);
    }
    return self;
}
- (void)drawRect:(CGRect)rect {
    NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
    [style setAlignment:NSTextAlignmentCenter];
    [[UIImage imageNamed:@"train4_transparent.png"] drawInRect:CGRectMake(0, 0, 15, 15)];
}
@end
I've added several of these annotations to my map and everything works as expected. Whenever I tap on an image, a bubble showing some information appears. The problem is that I need to detect a long-press gesture over one of these annotations (in addition to the tap gesture that shows the bubble). To achieve this, I've tried to add a UILongPressGestureRecognizer to almost everything possible:
The UIImageView commented out in the class above.
The EDAnnotation instance retrieved using (EDAnnotation *)[mapView dequeueReusableAnnotationViewWithIdentifier:identifier]; in the viewForAnnotation callback. I've even tried to make this instance draggable and to listen for didChangeDragState calls in order to cancel them as soon as MKAnnotationViewDragStateStarting is triggered, but this didn't work as expected either.
Basically what I need is:
If the user taps the image drawn in EDAnnotation's drawRect method, the bubble shows.
If the user long-presses the image drawn in EDAnnotation's drawRect method, I receive a callback that lets me add a new MKPointAnnotation to the map.
Thanks in advance for your help =)
The problem could also be that your gesture recognizer conflicts with the gesture recognizers in the mapView. This can happen because the annotation views are subviews of the mapView. To solve this problem, use the UIGestureRecognizerDelegate. When you initialize your gesture recognizer, set the delegate property to the class where you implement that protocol, more precisely these two methods:
#pragma mark GestureRecognizerDelegate
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    return YES;
}
By simply returning YES in both methods, the gesture recognizer should react. Some other gesture recognizers from the mapView may now fire their actions too, but unfortunately it's not possible to delegate the mapView's own gesture recognizers.
This workaround helped me when I was adding a long-press recognizer to the mapView. I think it could help you with your issue too.
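Putting that advice together, the wiring could look like the sketch below. The selector name handleLongPress: and the attachment point (the annotation view, e.g. inside viewForAnnotation) are illustrative choices, not from the original answer:

```objectivec
// Sketch: attach a long-press recognizer to the annotation view; the delegate
// is the class implementing the two YES-returning methods shown above.
UILongPressGestureRecognizer *longPress =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handleLongPress:)];
longPress.delegate = self; // so simultaneous recognition with the map applies
[annotationView addGestureRecognizer:longPress];

// Elsewhere in the same class:
- (void)handleLongPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // Long press detected on the annotation view; a good place to add
        // an MKPointAnnotation, as the question asks.
    }
}
```

Checking for UIGestureRecognizerStateBegan matters because a long-press recognizer fires continuously through Changed and Ended states as well.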
Have you tried the delegate way of calling the annotation?
Create a delegate in the annotation class:
@protocol AnnotationDelegate <NSObject>
@optional
- (void)shouldContinueAnimate;
@end
In the implementation file:
- (void)shouldContinueAnimate {
    // add code for animating
}
Adopt the delegate wherever required: <AnnotationDelegate>
In the image view class you can add both a UILongPressGestureRecognizer and a UITapGestureRecognizer for the image.
_longPressGestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                                            action:@selector(handleLongPressGestureRecognizer:)];
_tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                action:@selector(handleTapGestureRecognizer:)];
[self.imageView addGestureRecognizer:_longPressGestureRecognizer];
[self.imageView addGestureRecognizer:_tapGestureRecognizer];
Handle the methods:
- (void)handleTapGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}
- (void)handleLongPressGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}
Thanks.

Subview and UIWebView UIGesture doesn't work

I recently ran into trouble with a view controller that has a UIWebView in it, plus a subview I would like to add to the view controller.
That's the View with the UIWebView:
http://cl.ly/image/03473l0e3a2L
My goal is to add a share menu, which works without problems:
http://cl.ly/image/3b273t2o3P00
But now I have the problem that the gesture recognizers I set for the social icons + labels (Twitter, Facebook, Mail) don't do anything.
The ShareView is a UIView subclass, and I add the gesture recognizers this way:
UITapGestureRecognizer *fbTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(fbTapped:)];
fbTap.numberOfTapsRequired = 1;
fbTap.numberOfTouchesRequired = 1;
fbTap.delegate = self;
[fbImage addGestureRecognizer:fbTap];
[fbLabel addGestureRecognizer:fbTap];
UITapGestureRecognizer *twTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(twTapped:)];
twTap.numberOfTapsRequired = 1;
twTap.numberOfTouchesRequired = 1;
twTap.delegate = self;
[twImage addGestureRecognizer:twTap];
[twLabel addGestureRecognizer:twTap];
UITapGestureRecognizer *mailTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(mailTapped:)];
mailTap.numberOfTapsRequired = 1;
mailTap.numberOfTouchesRequired = 1;
mailTap.delegate = self;
[mailImage addGestureRecognizer:mailTap];
[mailLabel addGestureRecognizer:mailTap];
I think the label and UIImageView names are self-explanatory. Every label and image view has userInteractionEnabled set to YES. The ShareView is also enabled for user interaction, and I set UIGestureRecognizerDelegate on it.
The fbTapped:, mailTapped: and twTapped: functions send a notification to the main view (the view which has the web view and the ShareView in it).
But now when I tap on the labels or image views, nothing happens.
I read on Stack Overflow that the UIWebView in the main view could interrupt the recognition, but I don't know how to solve this problem.
I would be really happy if you could help me or point me in the right direction.
I hope this piece of code will point you to the right direction:
1. Add a gesture recognizer delegate:
@interface myclass <UIGestureRecognizerDelegate>
{
    // Whatever you are doing with gestures
}
2. Implement the delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Do your stuff
    return YES;
}
Check whether control reaches this delegate method.

MKMapView Not Calling regionDidChangeAnimated on Pan

I have an app with a MKMapView and code that is called each time the map changes locations (in regionDidChangeAnimated). When the app initially loads, regionDidChangeAnimated is called on pans (swipes), pinches, taps and buttons that explicitly update the map coordinates. After loading other views and coming back to the map the regionDidChangeAnimated is only called for taps and the buttons that explicitly update the map. Panning the map and pinches no longer call regionDidChangeAnimated.
I have looked at this Stack Overflow post, which did not solve the issue. The forum posts on devforums and iphonedevsdk also did not work. Does anyone know what causes this issue? I am not adding any subviews to the MKMapView.
I did not want to do it this way initially, but it appears to work with no problems so far (taken from the devforums post in question):
Add the UIGestureRecognizerDelegate to your header.
Now add a check for the version number... if we're on iOS 4 we can do this:
if (NSFoundationVersionNumber >= 678.58) {
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchGestureCaptured:)];
    pinch.delegate = self;
    [mapView addGestureRecognizer:pinch];
    [pinch release];
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGestureCaptured:)];
    pan.delegate = self;
    [mapView addGestureRecognizer:pan];
    [pan release];
}
Add the delegate methods to handle the gestures:
#pragma mark -
#pragma mark Gesture Recognizers
- (void)pinchGestureCaptured:(UIPinchGestureRecognizer *)gesture {
    if (UIGestureRecognizerStateEnded == gesture.state) {
        // [self doWhatYouWouldDoInRegionDidChangeAnimated];
    }
}
- (void)panGestureCaptured:(UIPanGestureRecognizer *)gesture {
    if (UIGestureRecognizerStateEnded == gesture.state) {
        // [self doWhatYouWouldDoInRegionDidChangeAnimated];
    }
}
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    return YES;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    return YES;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

Swipe a mapkit away from view

By default, when you swipe a MapKit view, the map moves. This is great.
If I want to swipe away from the map view and load another view controller's view, how do I accomplish that? I could add a button to do that, but I'd like to use a gesture.
Thanks
THE FOLLOWING CODE WORKED:
(1) In the map view controller's header file, I added UIGestureRecognizerDelegate to adopt its protocol.
(2) In the map view controller's .m file, I added:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
(3) In the map view controller's viewDidLoad method, I added:
UISwipeGestureRecognizer *leftSwipe =
    [[[UISwipeGestureRecognizer alloc]
      initWithTarget:self action:@selector(leftSwipeReceiver:)] autorelease];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.delegate = self;
[self.view addGestureRecognizer:leftSwipe];
(4) The following function is called for a left swipe:
- (void)leftSwipeReceiver:(UIGestureRecognizer *)recognizer
{
    NSLog(@"leftSwipeReceiver:");
}
