Custom MapKit Annotation Lagging - iOS

I am displaying a simple MKMapView to show a collection of discovered places near the user's location. Upon getting results, I add custom annotation views, let's say of class MyAnnotationView, to the map.
These custom views display nicely, and I have ironed out all of the intricate handlers for a very nice UI. Like most would assume (or hope..), upon touching one of my markers, a separate (and custom) MKAnnotationView pops up to display more detail. This custom view has much more detail regarding the location found, with several buttons the user can interact with. The interaction is not the issue here (thankfully having overcome that obstacle).
My issue is that, for whatever reason, there seems to be a bit of "lag" between the TouchDown and the TouchUpInside event firing, roughly a ~0.5 second delay to be more precise.
I have tried firing my anticipated method on only the TouchDown event, and it fires almost immediately (with the micro-delay any UIButton naturally has).
My guess is that the MKMapView is the culprit. Given that it is intercepting/monitoring many different UIGestureRecognizers, I'm assuming the framework is just a bit "behind" in delivering my TouchUpInside event.
Unfortunately, assumptions don't really help anyone, especially if they don't come with a solution. So if anyone has any ideas/workarounds as to why the event handling is experiencing this delay, I would love to hear them. Thanks!
CODE REFERENCES
Here is some of my code to help:
Custom annotation view (w/ buttons) .h
#import <UIKit/UIKit.h>
@import MapKit;

@interface MyAnnotationView : MKAnnotationView
@property (nonatomic, strong) UIView *contentView;
@property (nonatomic, strong) UIButton *detailButton;
@end
.m
#import "MyAnnotationView.h"
#implementation MyAnnotationView {
CGFloat width, height;
}
- (instancetype)initWithFrame:(CGRect)frame {
if (self = [super initWithFrame:frame]) {
width = frame.size.width, height = frame.size.height;
self.contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, width, height)];
[self addSubview:self.contentView];
}
return self;
}
Adding the views
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation {
    // Here I simply create an annotation, assign it to a new `MyAnnotationView`
    // and add the view.
    MyAnnotationView *view = [[MyAnnotationView alloc] init];
    ...
    // Note: the detailButton is just a UIButton
    // This lags...
    [view.detailButton addTarget:self action:@selector(didTouchCell) forControlEvents:UIControlEventTouchUpInside];
    // No lag... hmm
    // [view.detailButton addTarget:self action:@selector(didTouchCell) forControlEvents:UIControlEventTouchDown];
}

The reason for the delay is that the map view is waiting to see whether you're going to double-tap to zoom in. Even if you double-tap an annotation, the map will still zoom in. If you're not bothered about double-tap-to-zoom, you can remove the delay by stripping the double-tap gesture recognizer out of the map's view hierarchy.
- (void)removeDoubleTapGestures:(UIView *)view {
    NSArray *gestureRecognizers = [view gestureRecognizers];
    for (UIGestureRecognizer *recognizer in gestureRecognizers) {
        if ([recognizer isKindOfClass:[UITapGestureRecognizer class]] &&
            [(UITapGestureRecognizer *)recognizer numberOfTapsRequired] == 2) {
            [view removeGestureRecognizer:recognizer];
        }
    }
    for (UIView *subview in view.subviews) {
        [self removeDoubleTapGestures:subview];
    }
}
In your viewDidLoad call:
[self removeDoubleTapGestures:myMapView];
Remember though that you're modifying MKMapView's view hierarchy, so if Apple change things in the future it could stop working.
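If removing Apple's recognizers feels too fragile, a less invasive variant is to drive the detail action from your own single-tap recognizer and allow it to recognize alongside the map's recognizers (the same simultaneous-recognition trick described in one of the related answers below). This is only a sketch; `detailTapped:` is a placeholder selector, and the recognizer is assumed to be added where the annotation view is configured in viewForAnnotation:.
// Sketch only: `detailTapped:` is a placeholder; `view` is the MyAnnotationView being configured.
UITapGestureRecognizer *singleTap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(detailTapped:)];
singleTap.delegate = self; // self adopts UIGestureRecognizerDelegate
[view.detailButton addGestureRecognizer:singleTap];

// Let our tap recognize alongside the map's own recognizers so it is not
// cancelled by them or forced to wait for the double-tap to fail.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
Because the tap recognizer is not waiting for the map's double-tap recognizer to fail, it should fire as soon as the single tap ends.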

Related

iOS tap recognizer to catch all taps

I want a pretty simple thing: in my top controller (which is a navigation controller) to set up a tap gesture recognizer which will catch all taps anywhere on the view. Currently, when I tap on a button, the system doesn't even bother my recognizer (except for the gestureRecognizer:shouldReceiveTouch: delegate method, where I return YES). Instead, it just executes the button's action. So I want to install "the strongest" recognizer on the view hierarchy, no matter what.
You might try putting an empty UIView on top of all other views and add the UITapGestureRecognizer to it. We do something similar with help overlays. The biggest issue is figuring out how and when to ignore the touches so the underlying buttons get them when needed.
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    UIButton *b = [UIButton buttonWithType:UIButtonTypeInfoDark];
    b.frame = CGRectMake(50, 50, b.bounds.size.width, b.bounds.size.height);
    [self.view addSubview:b];

    UIView *invisibleView = [[UIView alloc] initWithFrame:self.view.bounds];
    invisibleView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
    [invisibleView addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapHit:)]];
    [self.view addSubview:invisibleView];
}

- (void)tapHit:(UITapGestureRecognizer *)tap {
    NSLog(@"tapHit");
}

@end
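If you need the overlay to notice touches while still letting the buttons underneath receive them, one pattern (a sketch, not from the original answer; the class name and block property are made up) is to report the touch from hitTest: and return nil so the touch falls through:
@interface TouchObservingView : UIView
@property (nonatomic, copy) void (^onTouch)(void); // hypothetical callback
@end

@implementation TouchObservingView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Report that something was touched (may be called more than once per touch)...
    if (self.onTouch) self.onTouch();
    // ...then return nil so controls below the overlay still receive the touch.
    return nil;
}
@end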

iOS MKAnnotationView LongPressGestureRecognizer

Hi to everyone and thanks in advance =)
I have a question related to MKMapView and MKAnnotationView. I need to show annotations with custom images on an MKMapView. To do this, following several tutorials and other Stack Overflow answers, I created my own class. EDAnnotation.h:
@interface EDAnnotation : MKAnnotationView
//@property (nonatomic, strong) UIImageView *imageView;
- (id)initWithAnnotation:(id <MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier;
@end
EDAnnotation.m:
#import "EDAnnotation.h"
#implementation EDAnnotation
- (id)initWithAnnotation:(id <MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier{
self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier];
if (self != nil) {
CGRect frame = self.frame;
frame.size = CGSizeMake(15.0, 15.0);
self.frame = frame;
self.backgroundColor = [UIColor clearColor];
self.centerOffset = CGPointMake(-5, -5);
}
return self;
}
-(void) drawRect:(CGRect)rect {
NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
[style setAlignment:NSTextAlignmentCenter];
[[UIImage imageNamed:#"train4_transparent.png"] drawInRect:CGRectMake(0, 0, 15, 15)];
}
#end
I've added several of these annotations to my map and everything works as expected. Whenever I tap on an image, a bubble showing some information is shown. The problem is that I need to be able to detect a long press gesture over one of these annotations (in addition to the tap gesture that shows the bubble). To achieve this, I've tried to add a UILongPressGestureRecognizer to almost everything possible:
The UIImageView commented out in the class above.
The EDAnnotation instance retrieved using (EDAnnotation *)[mapView dequeueReusableAnnotationViewWithIdentifier:identifier]; in the viewForAnnotation callback. I've even tried to make this instance draggable and to listen for didChangeDragState calls in order to cancel them as soon as MKAnnotationViewDragStateStarting is triggered, but this didn't work as expected either.
Basically, what I need is:
if the user taps the image drawn in EDAnnotation's drawRect: method, the bubble shows.
if the user long-presses the image drawn in EDAnnotation's drawRect: method, I receive a callback that lets me add a new MKPointAnnotation to the map.
Thanks in advance for your help =)
The problem could also be that your gesture recognizer conflicts with the gesture recognizers in the mapView. This can happen because the annotation views are subviews of the mapView. To solve this, use UIGestureRecognizerDelegate: when you initialize your gesture recognizer, set its delegate property to the class where you implement that protocol, more precisely these two methods:
#pragma mark GestureRecognizerDelegate

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    return YES;
}
By simply returning YES in both methods, your gesture recognizer should react. Maybe some of the mapView's other gesture recognizers will now fire their actions too, but unfortunately it's not possible to hook into the delegation of the mapView's own gesture recognizers.
This workaround helped me when I was adding a long press recognizer to the mapView. I think it could help you with your issue too.
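For example, a sketch of the setup (assuming the recognizer is attached where you create or dequeue the annotation view in viewForAnnotation:, and `handleAnnotationLongPress:` is a name of your choosing):
UILongPressGestureRecognizer *longPress =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handleAnnotationLongPress:)];
longPress.delegate = self; // the class implementing the two delegate methods above
[annotationView addGestureRecognizer:longPress];

- (void)handleAnnotationLongPress:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // e.g. add a new MKPointAnnotation at the pressed location
    }
}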
Did you try the delegate way of calling the annotation?
Create a delegate in the annotation class:
@protocol AnnotationDelegate <NSObject>
@optional
- (void)shouldContinueAnimate;
@end
In the implementation file:
- (void)shouldContinueAnimate {
    // add code for animating
}
Adopt the delegate wherever it is required: <AnnotationDelegate>
In the image view class you can add both a UILongPressGestureRecognizer and a UITapGestureRecognizer for the image.
_longPressGestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                                            action:@selector(handleLongPressGestureRecognizer:)];
_tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                action:@selector(handleTapGestureRecognizer:)];
[self.imageView addGestureRecognizer:self.longPressGestureRecognizer];
[self.imageView addGestureRecognizer:self.tapGestureRecognizer];
Handle the method:
- (void)handleTapGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}

- (void)handleLongPressGestureRecognizer:(UIGestureRecognizer *)sender {
    if ([self.delegate respondsToSelector:@selector(shouldContinueAnimate)]) {
        [self.delegate shouldContinueAnimate];
    }
}
Thanks.

Detecting a touch anywhere on the screen

I want to know when a user has touched anywhere on the screen of my app.
I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, as I am also reloading a table automatically, so it gets triggered when that occurs.
I have also tried a gesture recognizer with the following code, but this will only recognise touches on the view, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recogniser, or code for this, to every button and segmented control I have on the screen:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];

- (void)tapOnView:(UITapGestureRecognizer *)sender
{
    // do something
}
I have also tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.
I was wondering if there is any way I could achieve this. I was hoping I might be able to recognise the type of event from within nextResponder, so I could detect whether it came from a button, for example.
EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, but then return the brightness to its original level once the app is touched. I need this feature to occur on only one of my view controllers.
As mentioned by Ian MacDonald, using hitTest: is a great solution to detect user interaction on an app-wide scale, including when buttons, text fields, etc., are selected.
My solution was to subclass UIWindow and implement the hitTest method.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // do your stuff here
    // return nil if you want to prevent interaction with UI elements
    return [super hitTest:point withEvent:event];
}
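To make the app actually use that subclass (a sketch; `TouchSensingWindow` is a made-up name, and the root controller is whatever your app already uses), create the key window from it in the app delegate instead of a plain UIWindow:
// AppDelegate.m, assuming a UIWindow subclass named TouchSensingWindow
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[TouchSensingWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[YourRootViewController alloc] init]; // placeholder controller
    [self.window makeKeyAndVisible];
    return YES;
}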
You could attach your UITapGestureRecognizer to your [[UIApplication sharedApplication] keyWindow].
Alternatively, you could override hitTest: of your root UIView.
Is there a particular task you are hoping to accomplish? There may be a better way than assigning an "anywhere" gesture.
Edit: Use hitTest:.
@interface PassthroughView : UIView
@property (readonly) id target;
@property (readonly) SEL selector;
@end

@implementation PassthroughView

- (void)setTarget:(id)target selector:(SEL)selector {
    _target = target;
    _selector = selector;
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    [_target performSelector:_selector];
    return nil;
}

@end
@implementation YourUIViewController {
    PassthroughView *anytouchView;
}

- (void)viewDidLoad {
    // Add this at the end so it's above all other views.
    anytouchView = [[PassthroughView alloc] initWithFrame:self.view.bounds];
    [anytouchView setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
    [anytouchView setTarget:self selector:@selector(undim)];
    [anytouchView setHidden:YES];
    [self.view addSubview:anytouchView];
}

- (void)undim {
    [anytouchView setHidden:YES];
}

- (void)dim {
    [anytouchView setHidden:NO];
}

@end
Your edit adds more clarity to your question.
The reason I am working on this is that my app needs to stay active
and the screen cannot be locked (so I have disabled screen locking).
To avoid excessive use of power, I need to dim the screen, but then
return the brightness back to the original level once the app is
touched.
Since you are controlling the screen brightness, you can add one transparent view controller on top of your root controller before dimming the screen. It does only one job: listen for a tap using a tap gesture recognizer. On tap, you can dismiss the view controller and restore the brightness to its previous level.
By doing so you don't have to worry about buttons being clicked, as they will be below the transparent view controller. Since it's a whole new view controller sitting on top of the stack, you don't have to modify your existing code either.
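A rough sketch of that overlay controller (the class name is made up, and the brightness value to restore is assumed to be 1.0):
@interface DimmingOverlayViewController : UIViewController
@end

@implementation DimmingOverlayViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor clearColor]; // fully transparent, sits above everything

    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped)];
    [self.view addGestureRecognizer:tap];
}

- (void)tapped {
    [[UIScreen mainScreen] setBrightness:1.0f]; // restore brightness
    [self dismissViewControllerAnimated:NO completion:nil];
}
@end
Present it over your current content right before dimming (on iOS 8+ with UIModalPresentationOverCurrentContext, or simply add it as a child view controller); the first tap anywhere restores the brightness and removes it.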
OK, I have had a similar problem before.
As I remember, I subclassed UIWindow for full-screen detection and made it the first responder.
Then I overrode the touch handling from the subclass.
You can also use code to identify the control that has been touched:
#import <QuartzCore/QuartzCore.h>

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.view setMultipleTouchEnabled:YES];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Enumerate over all the touches
    [touches enumerateObjectsUsingBlock:^(id obj, BOOL *stop) {
        // Get a single touch and its location
        UITouch *touch = obj;
        CGPoint touchPoint = [touch locationInView:self.view];
        ...
    }];
}
To disable the locking of the screen I used the code below:
[[UIApplication sharedApplication] setIdleTimerDisabled:YES];
I used the following calls to dim or restore the screen brightness:
[[UIScreen mainScreen] setBrightness:0.0f]; // and
[[UIScreen mainScreen] setBrightness:1.0f];

how to make an object move by touching it in iOS

I am a newbie in Objective-C and am trying to make a simple app in which, when you touch an object, it moves randomly for a while and stops. Then you need to touch it again so it moves again and stops after a while.
I have searched for touch methods and some tutorials, but the problem is I don't know where to start. It's like I need a function to move the object and one to detect the touch, but I don't know how to connect them and use them.
Here is a tutorial which helped me a lot to get a view of the functionality, and it actually works in the opposite way to my app, but I still cannot start programming on my own:
http://xcodenoobies.blogspot.se/2010/11/under-construction.html
Any help would be appreciated regarding how to program my logic, how to find the right methods, and what types of variables I need.
Step 1: Put this code in your viewDidLoad method, in which I create some UIImageViews and add them to the view at random positions:
[self.view setTag:1];
for (int i = 0; i < 4; i++)
{
    int x = arc4random() % 300;
    int y = arc4random() % 400;
#warning set Image here
    UIImageView *imgview = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"someImage.png"]];
    [imgview setFrame:CGRectMake(x, y, 25, 25)];
    [imgview setUserInteractionEnabled:YES];
    [self.view addSubview:imgview];
}
Step 2: Define the touchesBegan method to handle the touch and move objects around the view. We set tag = 1 on the view controller's view because we don't want to move our main view; only subviews will be moved:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([[touch view] tag] != 1)
    {
        [UIView animateWithDuration:0.25f animations:^{
            int x = arc4random() % 300;
            int y = arc4random() % 400;
            [[touch view] setCenter:CGPointMake(x, y)];
        }];
    }
}
What you need is to add a gesture recognizer to the view you want to be able to touch:
// In a view controller or in a UIView subclass
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
[self addGestureRecognizer:tapGestureRecognizer];
// Or [self.view addGestureRecognizer:tapGestureRecognizer];
- (void)handleTap:(UITapGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded) {
        // Animate the view, move it around for a while, etc.
        // For animating a view use animateWithDuration:animations:
    }
}
If I understand you correctly, the easiest way to achieve what you request is to create a UIButton in Interface Builder and connect it to an IBAction which moves it to a random spot. You can then add a custom graphic to the button.
Create a public method in your view controller with return type IBAction.
Create a button in IB and connect its "Touch Up Inside" event to your IBAction.
In your IBAction method, generate random x and y coordinates within the screen's bounds and animate the movement to this point (a rough sketch follows below).
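A minimal sketch of that IBAction (the method name is a placeholder, not from the question):
// Connected to the button's "Touch Up Inside" in Interface Builder
- (IBAction)moveButtonRandomly:(UIButton *)sender {
    CGSize area = self.view.bounds.size;
    CGFloat x = arc4random_uniform((uint32_t)(area.width - sender.bounds.size.width));
    CGFloat y = arc4random_uniform((uint32_t)(area.height - sender.bounds.size.height));

    [UIView animateWithDuration:0.5 animations:^{
        sender.frame = CGRectMake(x, y, sender.bounds.size.width, sender.bounds.size.height);
    }];
}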
Beyond that sketch, I won't go into details on the specific code since it would take too much space.
Note that your question is very open and vague, which is not considered good style on Stack Overflow. Also, you might want to save stuff like animations until you are a bit more experienced with iOS and Objective-C.
-V
Actually Apple made a demo for this.
http://developer.apple.com/library/ios/#samplecode/MoveMe/Introduction/Intro.html
You can try to modify this code to your needs. The actual functions you were looking for are:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
If this is the answer you were looking for, please mark it as accepted so this question can be considered closed :-).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
is a method called when the user touches the view. If you override this in the main view (the big view) you will have to find out, with some helper method as described in your link, whether the touched point is where the object is. Alternatively, you can override your object's class and implement the method there, so you don't have to explicitly check whether the touched point is on the object or not.
For your requirement, I'd say subclass the UIImageView and put the touchesBegan implementation inside it; it will work just fine.
.h file
@interface StarView : UIImageView
@end
.m file
@implementation StarView

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // your code in the link with the related methods
    Destination = CGPointMake(arc4random() % 320, arc4random() % 480);
    // calculate steps based on the speed specified and the distance between
    // the current location of the sprite and the new destination
    xamt = ((Destination.x - self.center.x) / speed);
    yamt = ((Destination.y - self.center.y) / speed);
    // ask the timer to execute moveBall repeatedly every 0.02 seconds
    mainTimer = [NSTimer scheduledTimerWithTimeInterval:0.02 target:self selector:@selector(moveBall) userInfo:nil repeats:YES];
}
Don't forget to copy the moveBall method after this.
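The moveBall method itself isn't shown here; a rough reconstruction consistent with the variables used above (Destination, xamt, yamt, mainTimer and speed are assumed to be ivars) might look like this:
- (void)moveBall {
    // step toward the destination by the per-tick amounts computed in touchesBegan
    self.center = CGPointMake(self.center.x + xamt, self.center.y + yamt);

    // stop once we are within one step of the destination
    if (fabs(Destination.x - self.center.x) <= fabs(xamt) &&
        fabs(Destination.y - self.center.y) <= fabs(yamt)) {
        [mainTimer invalidate];
        mainTimer = nil;
    }
}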
In your main view just make an instance of StarView and add it to the main view.

UIView and UIViewController

I know this is really basic stuff, but I need to understand whether my understanding of this is correct.
So what I want to do is this: I want a view with a label which, when double-tapped, flips and loads another view. On the second view I want a UIPickerView and, above it, a button saying back. Both views will be the same size as a UIPickerView, which is 320px x 216px.
What I am thinking of doing is creating two UIView classes named labelView and pickerView. I would then create a view controller which on loadView loads labelView; then, when the user double-taps the labelView, I get an event in the labelView class which is sent to my view controller, which can then unload the labelView and load the pickerView.
Does this sound like the best way to do this? Is there a simpler way? I am also unsure how I route the event from the labelView class to the view controller class.
I don't exactly know the most efficient way to do it (as I am also new to this language), but I have solved your problem. I made a simple program for that. The three classes involved in my example are BaseViewController (which will show the two views), LabelView and PickerView (according to your requirements).
In LabelView.h
@protocol LabelViewDelegate
- (void)didTapTwiceLabelView;
@end

@interface LabelView : UIView {
    id <LabelViewDelegate> delegate;
}
@property (nonatomic, retain) id <LabelViewDelegate> delegate;
- (void)didTouch;
@end
In LabelView.m
@synthesize delegate;

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self)
    {
        UILabel *labl = [[UILabel alloc] initWithFrame:CGRectMake(10, 5, frame.size.width - 20, 20)];
        labl.text = @"Some Text";
        [self addSubview:labl];
        [labl release]; labl = nil;
        self.backgroundColor = [UIColor grayColor];
        UITapGestureRecognizer *ges = [[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didTouch)] autorelease];
        ges.numberOfTapsRequired = 2;
        [self addGestureRecognizer:ges];
    }
    return self;
}

- (void)didTouch
{
    [delegate didTapTwiceLabelView];
}
//=============================================================
In Pickerview.h
@protocol PickerViewDelegate
- (void)didTapBackButton;
@end

@interface PickerView : UIView <UIPickerViewDelegate, UIPickerViewDataSource> {
    id <PickerViewDelegate> delegate;
}
@property (nonatomic, retain) id <PickerViewDelegate> delegate;
@end
In Pickerview.m
@implementation PickerView
@synthesize delegate;

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self)
    {
        UIPickerView *picker = [[UIPickerView alloc] initWithFrame:CGRectMake(0, 30, 320, 216)];
        picker.delegate = self;
        picker.dataSource = self;
        [self addSubview:picker];
        [picker release]; picker = nil;
        self.frame = CGRectMake(frame.origin.x, frame.origin.y, 320, 250);
        UIButton *btn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
        [btn setFrame:CGRectMake(10, 1, 50, 27)];
        [btn setTitle:@"Back" forState:UIControlStateNormal];
        [btn addTarget:self action:@selector(backButton) forControlEvents:UIControlEventTouchUpInside];
        [self addSubview:btn];
    }
    return self;
}

- (void)backButton
{
    [delegate didTapBackButton];
}
//====================================================================
In BaseViewController.h
#import "LabelView.h"
#import "PickerView.h"

@interface VarticalLabel : UIViewController <UITextFieldDelegate, PickerViewDelegate, LabelViewDelegate> {
    PickerView *myPickerView;
    LabelView *myLabelView;
}
@end
In BaseViewController.m
- (void)viewDidLoad
{
    [super viewDidLoad];

    myPickerView = [[PickerView alloc] initWithFrame:CGRectMake(0, 50, 320, 250)];
    [self.view addSubview:myPickerView];
    myPickerView.delegate = self;

    myLabelView = [[LabelView alloc] initWithFrame:CGRectMake(0, 50, 320, 250)];
    [self.view addSubview:myLabelView];
    myLabelView.delegate = self;

    myPickerView.hidden = YES;
}

#pragma mark PickerViewDelegate
- (void)didTapBackButton
{
    myPickerView.hidden = YES;
    myLabelView.hidden = NO;
}

#pragma mark LabelViewDelegate
- (void)didTapTwiceLabelView
{
    myPickerView.hidden = NO;
    myLabelView.hidden = YES;
}
To get events from a button to the view controller, just hook up the button's event, e.g. touch up inside, to a method in the view controller, using interface builder. (Double tapping is probably more complicated though.)
When you say 'flips', do you mean it actually shows an animation of flipping over a view to show a 'reverse' side? Like in the weather app when you hit the 'i' button? I'm assuming this is what you mean.
Perhaps check TheElements sample example on the iPhone Reference Library, it has an example of flip animation.
Btw, it's not strictly necessary to unload the view that is being 'hidden' when you flip; keeping it around saves you having to construct it again when you flip back, but unloading may be pertinent if you have memory use concerns and/or the system warns you about memory being low.
Also, what do you mean by "create a UIView"? Do you mean subclass UIView, or just instantiate a UIView and add child view objects to it? The latter is the usual strategy. Don't subclass UIView just because you want to add some things to a UIView.
If you've got one screen of information that gives way to another screen of information, you'd normally make them separate view controllers. So in your case you'd have one view controller with the label and upon receiving the input you want, you'd switch to the view controller composed of the UIPickerView and the button.
Supposing you use Interface Builder, you would probably have a top level XIB (which the normal project templates will have provided) that defines the app delegate and contains a reference to the initial view controller in a separate XIB (also supplied). In the separate XIB you'd probably want to add another view controller by reference (so, put it in, give it the class name but indicate that its description is contained in another file) and in that view controller put in the picker view and the button.
The point of loadView, as separate from the normal class init, is to facilitate naming and linking to an instance in one XIB while having the layout defined in another. View controllers are alloced and inited when something that has a reference to them is alloced and inited. But the view is only loaded when it is going to be presented, and may be unloaded and reloaded while the app is running (though not while it is showing). Generally speaking, views will be loaded when needed and unnecessary views will be unloaded upon a low memory warning. That's all automatic, even if you don't put anything in the XIBs and just create a view programmatically within loadView or as a result of viewDidLoad.
I've made that all sound more complicated than your solution, but it's actually simpler because of the amount you can do in Interface Builder, once you're past the curve of learning it. It may actually be worth jumping straight to the Xcode 4 beta, as it shakes things up quite a lot in this area and sites have reported that a gold master was seeded at one point, so is likely to become the official thing very soon.
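For the flip itself, a rough sketch (the class name is a placeholder; on newer SDKs you'd use presentViewController:animated:completion:, while code of that era used presentModalViewController:animated:):
// From the label screen's controller, present the picker screen with a flip
PickerViewController *pickerVC = [[PickerViewController alloc] init]; // placeholder class
pickerVC.modalTransitionStyle = UIModalTransitionStyleFlipHorizontal;
[self presentViewController:pickerVC animated:YES completion:nil];

// ...and from the picker screen, flip back when "back" is tapped
[self dismissViewControllerAnimated:YES completion:nil];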
With respect to catching the double tap, the easiest thing is a UITapGestureRecognizer (see here). You'd do something like:
// create a tap gesture recogniser, tell it to send events to this instance
// of this class, and to send them via the 'handleGesture:' message, which
// we'll implement below...
UITapGestureRecognizer *tapGestureRecognizer = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleGesture:)];

// we want double taps
tapGestureRecognizer.numberOfTapsRequired = 2;

// attach the gesture recogniser to the view we want to catch taps on
[labelView addGestureRecognizer:tapGestureRecognizer];

// we have an owning reference to the recogniser but have now given it to
// the label. We don't intend to talk to it again without being prompted,
// so should relinquish ownership
[tapGestureRecognizer release];

/* ... elsewhere ... */

// the method we've nominated to receive gesture events
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer
{
    // could check 'gestureRecognizer' against tapGestureRecognizer above if
    // we set the same message for multiple recognisers

    // just make sure we're getting this because the gesture occurred
    if (gestureRecognizer.state == UIGestureRecognizerStateRecognized)
    {
        // do something to present the other view
    }
}
Gesture recognisers are available as of iOS 3.2 (which was for iPad only; so iOS 4.0 on iPhone and iPod Touch).
