I want to add an MKAnnotation to my MKMapView when a user taps the map, so they can choose a location.
I've read about drag & drop, but that's a bit annoying if you want to move to the other corner of the city, because you have to move the pin step by step.
How can I get the coordinate where the user taps and move my pin there?
Thanks!
Use a UITapGestureRecognizer to get the CGPoint and map coordinate of the tapped point.
UITapGestureRecognizer *recognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(addPin:)];
[recognizer setNumberOfTapsRequired:1];
[map addGestureRecognizer:recognizer];
[recognizer release];
Then add your target action:
- (void)addPin:(UITapGestureRecognizer*)recognizer
{
CGPoint tappedPoint = [recognizer locationInView:map];
NSLog(@"Tapped At : %@", NSStringFromCGPoint(tappedPoint));
CLLocationCoordinate2D coord = [map convertPoint:tappedPoint toCoordinateFromView:map];
NSLog(@"lat %f", coord.latitude);
NSLog(@"long %f", coord.longitude);
// add an annotation with coord
}
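If you also want the pin to appear right away, here is a minimal sketch of what the commented-out step might look like, using MKPointAnnotation (the title is just an example):
// Minimal sketch: drop a standard pin at the tapped coordinate.
MKPointAnnotation *pin = [[MKPointAnnotation alloc] init];
pin.coordinate = coord;
pin.title = @"Dropped Pin"; // example title
[map addAnnotation:pin];
// (release pin here if you are not using ARC)
If you only ever want a single pin on the map, you could remove the previous annotations (e.g. with removeAnnotations:) before adding the new one.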
On iOS < 3.2, you can use this snippet:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
if (touch.tapCount == 1) {
if (CGRectContainsPoint([map frame], [touch locationInView:self.view]))
{
CGPoint tappedPoint = [touch locationInView:map];
CLLocationCoordinate2D coord =
[map convertPoint:tappedPoint toCoordinateFromView:map];
NSLog(@"lat %f", coord.latitude);
NSLog(@"long %f", coord.longitude);
// add an annotation with coord
// or (as in the example above)
// [self addPin];
}
}
}
It's similar, but doesn't use UIGestureRecognizer.
Hope this helps.
Related
In touchesBegan I have my logic for a single touch, and I'm trying to add the ability to change the camera position with a pan. For the pan I use touchesMoved. Everything is mostly okay, but once I pan, the action for the touch gets executed too.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
CGPoint location = [touch locationInNode:self];
NSArray *sprites = [self nodesAtPoint:location];
for (SKSpriteNode *sprite in sprites)
{
//*
//* How to stop executing this block when panning?
//*
}
}
}
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint positionInScene = [touch locationInNode:self];
CGPoint previousPosition = [touch previousLocationInNode:self];
CGPoint translation = CGPointMake((-1)*(positionInScene.x - previousPosition.x), (-1)*(positionInScene.y - previousPosition.y));
CGPoint cameraPos = [self camera].position;
[self camera].position = CGPointAdd(cameraPos, translation);
}
Look at how to use the pan gesture recognizer that is built into iOS; with it you will have the option to allow it to also execute the touch event or not.
I will give you an answer using the view controller; you may use it somewhere else, though.
Objective-C:
Open up ViewController.h and add the following declaration:
@interface ViewController : UIViewController <UIGestureRecognizerDelegate>
...
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer;
Then implement it in ViewController.m as follows:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
CGPoint translation = [recognizer translationInView:self.view];
recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
recognizer.view.center.y + translation.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
http://www.raywenderlich.com/6567/uigesturerecognizer-tutorial-in-ios-5-pinches-pans-and-more
At this point you can link it via the UI as in the tutorial above, or declare it somewhere in the beginning:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.view addGestureRecognizer:pan];
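Regarding allowing or suppressing the original touch handling while panning: UIGestureRecognizer exposes cancelsTouchesInView and delaysTouchesBegan for exactly that. A small sketch of how they might be configured (the values shown are one option, not the only one):
// With cancelsTouchesInView = YES (the default), the view receives
// touchesCancelled: once the pan is recognized, so your touch logic stops.
pan.cancelsTouchesInView = YES;
// With delaysTouchesBegan = YES, touchesBegan: is withheld until the
// recognizer has decided whether the touches are a pan at all.
pan.delaysTouchesBegan = YES;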
Swift:
class ViewController : UIViewController, UIGestureRecognizerDelegate
... then in your code
@IBAction func handlePan(recognizer: UIPanGestureRecognizer) {
let translation = recognizer.translationInView(self.view)
if let view = recognizer.view {
view.center = CGPoint(x:view.center.x + translation.x,
y:view.center.y + translation.y)
}
recognizer.setTranslation(CGPointZero, inView: self.view)
}
http://www.raywenderlich.com/76020/using-uigesturerecognizer-with-swift-tutorial
At this point you can link it via the UI as in the tutorial above, or declare it somewhere in the beginning stages, like init:
let pan = UIPanGestureRecognizer(target: self, action: "handlePan:")
self.view.addGestureRecognizer(pan);
Is it possible to get the x and y coordinates of a touch? If so, could someone please provide a very simple example where the coordinates are just logged to the console?
Using touchesBegan Event
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchPoint = [touch locationInView:self.view];
NSLog(#"Touch x : %f y : %f", touchPoint.x, touchPoint.y);
}
This event is triggered when touch starts.
Using Gesture
Register your UITapGestureRecognizer in the viewDidLoad method:
- (void)viewDidLoad {
[super viewDidLoad];
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapGestureRecognizer:)];
[self.view setUserInteractionEnabled:YES];
[self.view addGestureRecognizer:tapGesture];
}
Setting up the tapGestureRecognizer function
// Tap GestureRecognizer function
- (void)tapGestureRecognizer:(UIGestureRecognizer *)recognizer {
CGPoint tappedPoint = [recognizer locationInView:self.view];
CGFloat xCoordinate = tappedPoint.x;
CGFloat yCoordinate = tappedPoint.y;
NSLog(#"Touch Using UITapGestureRecognizer x : %f y : %f", xCoordinate, yCoordinate);
}
Sample Project
First you need to add a gesture recognizer to the view you want.
UITapGestureRecognizer *myTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(myTapRecognizer:)];
[self.myView setUserInteractionEnabled:YES];
[self.myView addGestureRecognizer:myTap];
Then in the gesture recognizer method you make a call to locationInView:
- (void)myTapRecognizer:(UIGestureRecognizer *)recognizer
{
CGPoint tappedPoint = [recognizer locationInView:self.myView];
CGFloat xCoordinate = tappedPoint.x;
CGFloat yCoordinate = tappedPoint.y;
}
You may want to take a look at Apple's UIGestureRecognizer Class Reference.
Here's a very basic example (place it inside your view controller):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self.view];
NSLog(#"%#", NSStringFromCGPoint(currentPoint));
}
This triggers every time the touch moves. You can also use touchesBegan:withEvent: which triggers when a touch starts, and touchesEnded:withEvent: which triggers when a touch ends (i.e. a finger is lifted).
You can also do this using a UIGestureRecognizer, which in many cases is more practical.
I am trying to develop an analysing app that determines if you are "clever".
What this involves is taking a picture of yourself and dragging points onto your face where the nose, mouth, and eyes are. However, the code I have tried does not work:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:self.view];
if ([touch view] == eye1)
{
eye1.center = location;
}
else if ([touch view] == eye2)
{
eye2.center = location;
}
else if ([touch view] == nose)
{
nose.center = location;
}
else if ([touch view] == chin)
{
chin.center = location;
}
else if ([touch view] == lip1)
{
lip1.center = location;
}
else if ([touch view] ==lip2)
{
lip2.center = location;
}
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
[self touchesBegan:touches withEvent:event];
}
What is happening? When I just have a single image it works, but that is not helpful for me. What can I do to make it work? The spots start at the bottom of the screen in a "toolbar" and then the user drags them onto the face. I kind of want the finished result to look like:
There are two basic approaches:
You can use the various touches methods (e.g. touchesBegan, touchesMoved, etc.) in your controller or the main view, or you can use a single gesture recognizer on the main view. In touchesBegan (or, with a gesture recognizer, in the UIGestureRecognizerStateBegan state), determine the locationInView of the superview, then test whether the touch is over one of your views with CGRectContainsPoint, passing the frame of each view as the first parameter and the location as the second.
Having identified the view the gesture began on, move it in touchesMoved (or, with a gesture recognizer, in the UIGestureRecognizerStateChanged state) based upon how far the touch has moved (translationInView in the gesture recognizer case).
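A rough sketch of that first approach, assuming the feature views are direct subviews of self.view and that you add an ivar or property such as draggedView (the name is illustrative) to remember which view was grabbed:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self.view];
    for (UIView *view in @[eye1, eye2, lip1, lip2, chin, nose]) {
        if (CGRectContainsPoint(view.frame, location)) {
            self.draggedView = view; // remember the view the gesture began on
            break;
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (self.draggedView) {
        // follow the finger while it moves
        self.draggedView.center = [[touches anyObject] locationInView:self.view];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.draggedView = nil; // stop dragging
}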
Alternatively (and easier, IMHO), you can attach an individual gesture recognizer to each of the subviews. This latter approach might look like the following. First, add your gesture recognizers:
NSArray *views = @[eye1, eye2, lip1, lip2, chin, nose];
for (UIView *view in views)
{
view.userInteractionEnabled = YES;
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
[view addGestureRecognizer:pan];
}
Then you implement a handlePanGesture method:
- (void)handlePanGesture:(UIPanGestureRecognizer *)gesture
{
CGPoint translation = [gesture translationInView:gesture.view];
if (gesture.state == UIGestureRecognizerStateChanged)
{
gesture.view.transform = CGAffineTransformMakeTranslation(translation.x, translation.y);
[gesture.view.superview bringSubviewToFront:gesture.view];
}
else if (gesture.state == UIGestureRecognizerStateEnded)
{
gesture.view.transform = CGAffineTransformIdentity;
gesture.view.center = CGPointMake(gesture.view.center.x + translation.x, gesture.view.center.y + translation.y);
}
}
I have implemented the drag effect on an image, but during my testing I see that the image only moves on the mouse click event.
I cannot drag my image around the screen with the mouse; but when I click somewhere on the screen, the image jumps to the place where I clicked.
I followed many tutorials on YouTube, but in the end I don't get the same behavior.
This is my code:
ScreenView1.h
IBOutlet UIImageView *image;
ScreenView1.m
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:touch.view];
image.center = location;
[self ifCollision];
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
[self touchesBegan:touches withEvent:event];
}
If you want to drag an image view, you will be so much happier using a UIPanGestureRecognizer. It makes this sort of thing trivial. Using touchesBegan is so iOS 4!
UIPanGestureRecognizer* p =
[[UIPanGestureRecognizer alloc] initWithTarget:self
action:@selector(dragging:)];
[imageView addGestureRecognizer:p];
// ...
- (void) dragging: (UIPanGestureRecognizer*) p {
UIView* vv = p.view;
if (p.state == UIGestureRecognizerStateBegan ||
p.state == UIGestureRecognizerStateChanged) {
CGPoint delta = [p translationInView: vv.superview];
CGPoint c = vv.center;
c.x += delta.x; c.y += delta.y;
vv.center = c;
[p setTranslation: CGPointZero inView: vv.superview];
}
}
You're not doing the right thing in the touchesMoved:withEvent:, which is why the drag won't work. Here's a little code that works:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self];
[CATransaction begin];
[CATransaction setDisableActions:YES];
[image setCenter:location];
[CATransaction commit];
}
For the others, I implemented it this way:
- (IBAction)catchPanEvent:(UIPanGestureRecognizer *)recognizer{
CGPoint translation = [recognizer translationInView:self.view];
recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
recognizer.view.center.y + translation.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
Thank you again, Matt!
I'm working on a graphing calculator app for the iPad, and I wanted to add a feature where a user can tap an area in the graph view to make a text box pop up displaying the coordinate of the point they touched. How can I get a CGPoint from this?
You have two ways:
1.
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
CGPoint location = [touch locationInView:touch.view];
}
Here you can get the tapped location as a point in the current view.
2.
UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped:)];
[tapRecognizer setNumberOfTapsRequired:1];
[tapRecognizer setDelegate:self];
[self.view addGestureRecognizer:tapRecognizer];
Use this approach when you want to do something with a particular object or subview of your main view.
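The tapped: handler that the selector refers to isn't shown above; a minimal sketch might look like this (the method name just has to match the selector):
- (void)tapped:(UITapGestureRecognizer *)recognizer
{
    // Location of the tap in the view the recognizer is attached to
    CGPoint location = [recognizer locationInView:recognizer.view];
    NSLog(@"Tapped at %@", NSStringFromCGPoint(location));
}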
Try This
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
// Get the specific point that was touched
CGPoint point = [touch locationInView:self.view];
NSLog(#"X location: %f", point.x);
NSLog(#"Y Location: %f",point.y);
}
You can use "touchesEnded" if you'd rather see where the user lifted their finger off the screen instead of where they touched down.
Just want to toss in a Swift 4 answer because the API is quite different looking.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
if let touch = event?.allTouches?.first {
let loc:CGPoint = touch.location(in: touch.view)
//insert your touch based code here
}
}
OR
let tapGR = UITapGestureRecognizer(target: self, action: #selector(tapped))
view.addGestureRecognizer(tapGR)
@objc func tapped(gr: UITapGestureRecognizer) {
let loc:CGPoint = gr.location(in: gr.view)
//insert your touch based code here
}
In both cases loc will contain the point that was touched in the view.
It's probably better and simpler to use a UIGestureRecognizer with the map view instead of trying to subclass it and intercept touches manually.
Step 1 : First, add the gesture recognizer to the map view:
UITapGestureRecognizer *tgr = [[UITapGestureRecognizer alloc]
initWithTarget:self action:@selector(tapGestureHandler:)];
tgr.delegate = self; //also add <UIGestureRecognizerDelegate> to @interface
[mapView addGestureRecognizer:tgr];
Step 2 : Next, implement shouldRecognizeSimultaneouslyWithGestureRecognizer and return YES so your tap gesture recognizer can work at the same time as the map's (otherwise taps on pins won't get handled automatically by the map):
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer
:(UIGestureRecognizer *)otherGestureRecognizer
{
return YES;
}
Step 3 : Finally, implement the gesture handler:
- (void)tapGestureHandler:(UITapGestureRecognizer *)tgr
{
CGPoint touchPoint = [tgr locationInView:mapView];
CLLocationCoordinate2D touchMapCoordinate
= [mapView convertPoint:touchPoint toCoordinateFromView:mapView];
NSLog(#"tapGestureHandler: touchMapCoordinate = %f,%f",
touchMapCoordinate.latitude, touchMapCoordinate.longitude);
}
If you use a UIGestureRecognizer or UITouch object, you can use the locationInView: method to retrieve the CGPoint within the given view that the user touched.
func handleFrontTap(gestureRecognizer: UITapGestureRecognizer) {
print("tap working")
if gestureRecognizer.state == UIGestureRecognizerState.Recognized {
print(gestureRecognizer.locationInView(gestureRecognizer.view))
}
}