Hi everyone, and thanks in advance =)
I have a question about MKMapView and MKAnnotationView. I need to show annotations with custom images on an MKMapView. To do this, following several tutorials and other Stack Overflow answers, I created my own class. EDAnnotation.h:
#import <MapKit/MapKit.h>

@interface EDAnnotation : MKAnnotationView

//@property (nonatomic, strong) UIImageView *imageView;

- (id)initWithAnnotation:(id<MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier;

@end
EDAnnotation.m:
#import "EDAnnotation.h"

@implementation EDAnnotation

- (id)initWithAnnotation:(id<MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier {
    self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier];
    if (self != nil) {
        CGRect frame = self.frame;
        frame.size = CGSizeMake(15.0, 15.0);
        self.frame = frame;
        self.backgroundColor = [UIColor clearColor];
        self.centerOffset = CGPointMake(-5, -5);
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
    [style setAlignment:NSTextAlignmentCenter];
    [[UIImage imageNamed:@"train4_transparent.png"] drawInRect:CGRectMake(0, 0, 15, 15)];
}

@end
I've added several of these annotations to my map and everything works as expected. Whenever I tap on an image, a bubble showing some information appears. The problem is that I need to detect a long-press gesture on one of these annotations (in addition to the tap gesture that shows the bubble). To achieve this, I've tried to add a UILongPressGestureRecognizer to almost everything possible:
The UIImageView commented out in the class above.
The EDAnnotation instance retrieved using (EDAnnotation *)[mapView dequeueReusableAnnotationViewWithIdentifier:identifier]; in the viewForAnnotation callback. I've even tried to make this instance draggable and to listen for didChangeDragState calls in order to cancel them as soon as MKAnnotationViewDragStateStarting is triggered, but this didn't work as expected either.
Basically what I need is:
If the user taps the image drawn in EDAnnotation's drawRect method, the bubble shows.
If the user long-presses that image, I receive a callback that lets me add a new MKPointAnnotation to the map.
Thanks in advance for your help =)
The problem could also be that your gesture recognizer conflicts with the gesture recognizers in the map view. This can happen because the annotation views are subviews of the map view. To solve this problem, use the UIGestureRecognizerDelegate protocol. When you initialize your gesture recognizer, set its delegate property to the class where you implement that protocol, more precisely these two methods:
#pragma mark - GestureRecognizerDelegate

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    return YES;
}
By simply returning YES in both methods, your gesture recognizer should react. Some other gesture recognizers from the map view may now fire their actions too, but unfortunately it's not possible to take over delegation of the map view's own gesture recognizers.
This workaround helped me when I was adding a UILongPressGestureRecognizer to the map view. I think it could help you with your issue too.
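As a sketch of how the two pieces might fit together (hedged: EDAnnotation comes from the question above; handleLongPress: is a hypothetical selector name, not an API), the recognizer could be attached in the viewForAnnotation callback:

```
// Sketch only: attach a long-press recognizer to the annotation view and
// make its delegate the class that returns YES from the two methods above,
// so it can coexist with the map view's own recognizers.
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
    static NSString *identifier = @"EDAnnotation";
    EDAnnotation *view = (EDAnnotation *)[mapView dequeueReusableAnnotationViewWithIdentifier:identifier];
    if (view == nil) {
        view = [[EDAnnotation alloc] initWithAnnotation:annotation reuseIdentifier:identifier];
        view.canShowCallout = YES; // a plain tap still shows the bubble
        UILongPressGestureRecognizer *longPress =
            [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handleLongPress:)];
        longPress.delegate = self; // the UIGestureRecognizerDelegate implementer
        [view addGestureRecognizer:longPress];
    }
    view.annotation = annotation;
    return view;
}

// Hypothetical handler name; fires once when the long press begins.
- (void)handleLongPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // e.g. add a new MKPointAnnotation to the map here
    }
}
```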
Have you tried the delegate way of calling the annotation?
Create a delegate protocol in the annotation class:
@protocol AnnotationDelegate <NSObject>

@optional
- (void)shouldContinueAnimate;

@end
In the implementation file:
- (void)shouldContinueAnimate {
    // add code for animating
}
Adopt the delegate protocol (<AnnotationDelegate>) wherever required.
In the image view class you can add both a UILongPressGestureRecognizer and a UITapGestureRecognizer for the image:
_longPressGestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                                            action:@selector(handleLongPressGestureRecognizer:)];
_tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                action:@selector(handleTapGestureRecognizer:)];
[self.imageView addGestureRecognizer:self.longPressGestureRecognizer];
[self.imageView addGestureRecognizer:self.tapGestureRecognizer];
Handle the methods:
- (void)handleTapGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}

- (void)handleLongPressGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}
Thanks.
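To tie this together, the class that should react (for example, the view controller) adopts the protocol and is assigned as the delegate. A minimal sketch, assuming the image view class exposes a delegate property declared as id<AnnotationDelegate> (that property name is an assumption, not shown in the answer above):

```
// Sketch: the annotation/image-view class is assumed to declare
// a `delegate` property of type id<AnnotationDelegate>.
@interface MyViewController : UIViewController <AnnotationDelegate>
@end

@implementation MyViewController

- (void)setupAnnotationView:(MyAnnotationImageView *)annotationView {
    annotationView.delegate = self; // callbacks now reach this controller
}

- (void)shouldContinueAnimate {
    // react to the tap / long press forwarded by the image view
}

@end
```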
I am using the Texture/AsyncDisplayKit library:
https://github.com/texturegroup/texture
I am trying to implement "swipe to perform action" in an ASCellNode. The problem is that the UIPanGestureRecognizer prevents the table from scrolling.
I am able to get this to work successfully in UIKit using UITableViewCell, but for some reason it's not working with Texture's ASCellNode. I can demonstrate the issue easily with the ASDKgram example provided with the library, which has a UIKit example in one tab and a Texture example in another:
https://github.com/TextureGroup/Texture/tree/master/examples/ASDKgram
For the UIKit example, all I had to do was:
Add <UIGestureRecognizerDelegate> to PhotoTableViewCell.h
Add UIPanGestureRecognizer *_panGestureRecognizer; in the @implementation PhotoTableViewCell
Add the following to - (instancetype)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier:
_panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panned:)];
_panGestureRecognizer.delegate = self;
[self.contentView addGestureRecognizer:_panGestureRecognizer];
Add the following:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == _panGestureRecognizer) {
        CGPoint v = [_panGestureRecognizer velocityInView:_panGestureRecognizer.view];
        return fabs(v.x) > fabs(v.y);
    }
    return NO;
}

- (void)panned:(UIPanGestureRecognizer *)sender {
    NSLog(@"Panned!");
}
This was enough to get it to print Panned! when panning horizontally, while still letting the UITableView scroll when the pan is vertical.
The same does not work for the PhotoCellNode. I did the following:
Add <UIGestureRecognizerDelegate> to PhotoCellNode.h
Add UIPanGestureRecognizer *_panGestureRecognizer; in the @implementation PhotoCellNode
Add the following to PhotoCellNode.m:
- (void)didLoad {
    [super didLoad];
    _panGestureRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panned:)];
    _panGestureRecognizer.delegate = self;
    [self.view addGestureRecognizer:_panGestureRecognizer];
}

- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == _panGestureRecognizer) {
        CGPoint v = [_panGestureRecognizer velocityInView:_panGestureRecognizer.view];
        return fabs(v.x) > fabs(v.y);
    }
    return NO;
}

- (void)panned:(UIPanGestureRecognizer *)sender {
    NSLog(@"Panned!");
}
This allows Panned! to print when panning horizontally but the table does not scroll at all. Why are they working differently? How can I make the table scroll when the touches are vertical?
I am displaying a simple MKMapView with a collection of discovered places near the user's location. Upon getting results, I add custom annotation views, let's say of class MyAnnotationView, to the map.
These custom views are displayed nicely, and I have ironed out all of the intricate handlers for a very nice UI. As most would assume (or hope), upon touching one of my markers, a separate (and custom) MKAnnotationView pops up to display more detail. This custom view has much more detail regarding the location found, with several buttons the user is able to interact with. The interaction itself is not the issue (thankfully, having overcome that obstacle).
My issue is that, for whatever reason, there seems to be a bit of "lag" between the TouchDown and the TouchUpInside events firing, a delay of roughly ~0.5 seconds to be more precise.
I have tried firing my anticipated method on the TouchDown event only, and it fires almost immediately (with the micro-delay any UIButton naturally creates).
My guess is that the MKMapView is the culprit. Given that it is intercepting / monitoring many different UIGestureRecognizers, I'm assuming the framework is just a bit "behind" in delivering my TouchUpInside event.
Unfortunately, assumptions don't really help anyone, especially if they don't come with a solution. So if anyone has any ideas / workarounds as to why the event handling experiences this delay, I would love to hear them. Thanks!
CODE REFERENCES
Here is some of my code to help:
Custom annotation view (w/ buttons) .h
#import <UIKit/UIKit.h>
@import MapKit;

@interface MyAnnotationView : MKAnnotationView

// Assumed declarations: the .m below sets self.contentView, and the
// view-controller code further down references view.detailButton.
@property (strong, nonatomic) UIView *contentView;
@property (strong, nonatomic) UIButton *detailButton;

@end
.m
#import "MyAnnotationView.h"

@implementation MyAnnotationView {
    CGFloat width, height;
}

- (instancetype)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        width = frame.size.width;
        height = frame.size.height;
        self.contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, width, height)];
        [self addSubview:self.contentView];
    }
    return self;
}
Adding the views
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
    // Here I simply create an annotation, assign it to a new `MyAnnotationView`
    // and add the view.
    MyAnnotationView *view = [[MyAnnotationView alloc] init];
    ...
    // Note: the detailButton is just a UIButton
    // This lags...
    [view.detailButton addTarget:self action:@selector(didTouchCell) forControlEvents:UIControlEventTouchUpInside];
    // No lag... hmm
    // [view.detailButton addTarget:self action:@selector(didTouchCell) forControlEvents:UIControlEventTouchDown];
}
The reason for the delay is that the map view is waiting to see whether you're going to double-tap to zoom in. Even if you double-tap an annotation, the map will still zoom in. If you're not bothered about double-tap-to-zoom, you can remove the delay by removing the double-tap gesture recognizers from the view hierarchy.
- (void)removeDoubleTapGestures:(UIView *)view {
    NSArray *gestureRecognizers = [view gestureRecognizers];
    for (UIGestureRecognizer *recognizer in gestureRecognizers) {
        if ([recognizer isKindOfClass:[UITapGestureRecognizer class]] &&
            [(UITapGestureRecognizer *)recognizer numberOfTapsRequired] == 2) {
            [view removeGestureRecognizer:recognizer];
        }
    }
    for (UIView *subview in view.subviews) {
        [self removeDoubleTapGestures:subview];
    }
}
In your viewDidLoad call:
[self removeDoubleTapGestures:myMapView];
Remember, though, that you're modifying MKMapView's view hierarchy, so if Apple changes things in the future this could stop working.
When tapping a specific button in my app, I want an image to show; I did this using a UIImageView. Then I want to hide that image by tapping it, but I don't understand how to do this.
I tried the following code, but it doesn't work.
@implementation ViewController

- (IBAction)pic {
    UIImage *img = [UIImage imageNamed:@"test.png"];
    [imageView setImage:img];
    imageView.userInteractionEnabled = YES;
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc]
                                             initWithTarget:self action:@selector(handleTap:)];
    tapRecognizer.delegate = self;
    [imageView addGestureRecognizer:tapRecognizer];
}

- (void)handleTap:(UITapGestureRecognizer *)tapGestureRecognizer {
    // handle tap
}
It's pretty simple.
Use a UIImageView and check that userInteractionEnabled is YES on it. Then you can add a gesture recognizer.
Your .h file should have at least something like this:
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController <UIGestureRecognizerDelegate>

@property (weak, nonatomic) IBOutlet UIImageView *touchImageView;

@end
Don't forget to connect the UIImageView from your storyboard to the property declared above.
In your .m file:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.touchImageView.userInteractionEnabled = YES;
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc]
                                             initWithTarget:self action:@selector(handleTap:)];
    tapRecognizer.delegate = self;
    [self.touchImageView addGestureRecognizer:tapRecognizer];
}

- (void)handleTap:(UITapGestureRecognizer *)tapGestureRecognizer {
    // handle tap
    self.touchImageView.alpha = 0.0f;
}
You could put an image on a button instead, though I think using a UIImageView is the right decision. You need to hook up a gesture to it programmatically. You can do this with something similar to the code below:
let singleFingerTap = UITapGestureRecognizer(target: self, action: "viewTapped:")
imageView.addGestureRecognizer(singleFingerTap)
You can add a tap gesture recognizer to the UIImageView that contains the image.
var tapGesture = UITapGestureRecognizer(target: <#AnyObject#>, action: <#Selector#>)
In the method you assign as the action, just set myImageView.alpha = 0. This essentially "hides" your image view. You could also set the height of the image view to 0 if you wanted to hide it that way.
An alternative could be to import an open-source project, such as AKImageViewer, to show posts full screen (giving the user a better full view) and allow them to swipe or cancel to dismiss the image (similar to viewing images in the Twitter app).
I want to get rid of magnification and text selection in a UITextView, but I need the phone number, link, and address detectors. I am using
- (void)addGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isKindOfClass:[UILongPressGestureRecognizer class]]) {
        gestureRecognizer.enabled = NO;
    }
    [super addGestureRecognizer:gestureRecognizer];
}
to stop magnification, but it also stops selection of the phone numbers / addresses / links detected by the text view.
If I do [_txtView setSelectable:NO];, it stops both magnification and text selection, as well as data detection.
After quite a long time trying, I managed to stop text selection and magnification while keeping data detection (links clickable etc.) by overriding addGestureRecognizer in a UITextView subclass, allowing only UILongPressGestureRecognizers that delay touch ending:
UIUnselectableTextView.m
- (void)addGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isKindOfClass:[UILongPressGestureRecognizer class]] && gestureRecognizer.delaysTouchesEnded)
    {
        [super addGestureRecognizer:gestureRecognizer];
    }
}
Put an image on your UITextView in the .xib file, then add the code below.
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.navigationController.navigationBarHidden = YES;
    UITapGestureRecognizer *tapPress = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(longPressed:)];
    img.userInteractionEnabled = YES;
    [img addGestureRecognizer:tapPress];
}

- (void)longPressed:(UITapGestureRecognizer *)sender
{
    [yourtextview becomeFirstResponder];
}
In my code, img is a UIImageView.
Try this:
Set the delegate of the text view to your view controller
and add this method:
- (void)textViewDidChangeSelection:(UITextView *)textView
{
    NSRange selected;
    selected.location = 0;
    selected.length = 0;
    textView.selectedRange = selected;
}
This disables the magnification but still keeps links clickable.
You just need to make sure you have set the right parameters on the UITextView (there is no need to override the gestures). If you change the "Behavior" and "Detection" attributes in Interface Builder accordingly, you should get your desired behavior.
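For reference, a rough programmatic equivalent of those Interface Builder attributes (these are standard UITextView properties, though whether this alone removes selection while keeping links tappable may vary by iOS version):

```
// Configure the text view without overriding any gesture recognizers.
_txtView.editable = NO;                             // no keyboard / editing
_txtView.dataDetectorTypes = UIDataDetectorTypeAll; // phone, link, address detection
_txtView.selectable = YES;                          // detectors need this to stay tappable
```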
I am using a CCTableView to make a table with CCNodes as cells. Each CCNode has a button. I want to be able to detect when a user taps on a cell and when they tap on the button. But CCTableView doesn't have a tableView:didSelectRowAtIndexPath: method, so how can I do this? Do you know of any open-source class that has this method?
P.S. I am using version 3 of cocos2d
I took a different approach after trying numerous things:
@interface WKTableCell : CCTableViewCell
@end

@implementation WKTableCell

- (instancetype)initWithTitle:(NSString *)title
{
    self = [super init];
    if (!self)
        return nil;
    [self.button setTitle:title];
    // This is a transparent png (400x200) for my needs
    CCSpriteFrame *frame = [CCSpriteFrame frameWithImageNamed:@"cell.png"];
    [self.button setPreferredSize:CGSizeMake(frame.originalSize.width, frame.originalSize.height)];
    [self.button setContentSizeType:CCSizeTypePoints];
    [self.button setBackgroundSpriteFrame:frame forState:CCControlStateNormal];
    return self;
}

@end
// then in your table
[table setBlock:^(id sender) {
    CCLOG(@"yup, this gets called.. ");
}];
This did work for me.
Your CCTableView responds to the CCTouchDelegate, so you can use ccTouchBegan etc.
to detect the touch point and then calculate which cell was at that point. Here is the class reference:
http://docs.huihoo.com/doxygen/cocos2d-x/2.1.2/d0/d38/classcocos2d_1_1_c_c_touch_delegate.html
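As a rough illustration of that idea (hypothetical: the touch-handler signature and the cellHeight variable depend on your cocos2d 3.x setup and are not verified API; scroll offset is ignored for brevity), a cell index could be derived from the touch point like this:

```
// Sketch: convert the touch into the table node's coordinate space and
// derive a row index from a fixed cell height. `cellHeight` and the
// exact handler signature are assumptions for your cocos2d version.
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:touch];
    NSUInteger index = (NSUInteger)(location.y / cellHeight);
    CCLOG(@"Tapped cell at index %lu", (unsigned long)index);
    // then dispatch to the cell or its button as needed
}
```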