UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
[self addGestureRecognizer:panRecognizer];
- (void)pan:(UIPanGestureRecognizer *)gesture {
NSLog(@"%f", [gesture translationInView:self].x);
}
The above code will log the relative position of my current pan, but how can I get the absolute position for the view I'm in?
I simply want to slide a UIImageView to wherever the user's finger is.
translationInView gives you the pan translation (how much x has changed) and not the position of the pan in the view (the value of x). If you need the position of the pan, you have to use the method locationInView.
You can find the coordinates relative to the view as follows:
- (void)pan:(UIPanGestureRecognizer *)gesture {
NSLog(@"%f", [gesture locationInView:self].x);
}
Or relative to the superview:
- (void)pan:(UIPanGestureRecognizer *)gesture {
NSLog(@"%f", [gesture locationInView:self.superview].x);
}
Or relative to the window:
- (void)pan:(UIPanGestureRecognizer *)gesture {
NSLog(@"%f", [gesture locationInView:self.window].x);
}
Swift 5
Use the method location(in:), which returns a CGPoint value (see the UIGestureRecognizer documentation).
For example, the location of your gesture relative to self.view:
let relativeLocation = gesture.location(in: self.view)
print(relativeLocation.x)
print(relativeLocation.y)
A simple way to do something like this is to get the x and y of the touch and track them; once you have a point (say x: 230, y: 122), set the UIScrollView's content offset to that x and y.
Related
I have a scrollview and I'm watching user input. I would like to know where their finger currently is on the X plane. Is this possible through the ScrollView and/or its delegate or do I have to override touchesBegan, etc?
I'm assuming you've set your scroll view's delegate. If you have, then you only need to implement the scrollViewDidScroll: method from the UIScrollViewDelegate protocol...
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
CGPoint touchPoint = [scrollView.panGestureRecognizer locationInView:scrollView];
NSLog(@"Touch point: (%f, %f)", touchPoint.x, touchPoint.y);
}
This will update while the user is scrolling.
Additional info: note that if you have something like a UIPageControl that your scroll view pages between, you may have to calculate the x position based on the page number (i.e. if each page is 100 points wide, page 0 starts at x = 0, page 1 at x = 100, and so on).
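As a sketch of that calculation (assuming a horizontally paging scroll view whose page width equals its frame width; the variable names here are illustrative, not from the original answer):

```objc
// Derive the current page from the horizontal content offset,
// assuming each page is pageWidth points wide.
CGFloat pageWidth = scrollView.frame.size.width;
NSInteger page = (NSInteger)floor(scrollView.contentOffset.x / pageWidth);

// Page 0 starts at x = 0, page 1 at x = pageWidth, etc.
CGFloat pageOriginX = page * pageWidth;
```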
CGPoint touchPoint = [scrollView.panGestureRecognizer locationInView:scrollView];
CGFloat x = touchPoint.x;
// touchIndex: 0 for the first touch, 1 for the second, 2 for the third, and so on.
// If inView: is nil, the point is returned in the window's base coordinate system.
CGPoint point = [_scrollView locationOfTouch:touchIndex inView:_scrollView];
CGFloat x = point.x;
CGFloat y = point.y;
I use the scroll view's panGestureRecognizer because it's more accurate when you want to know the x position even though the scroll view only scrolls vertically.
[self.scrollView.panGestureRecognizer addTarget:self action:@selector(handlePanForScrollView:)];
and then:
- (void)handlePanForScrollView:(UIPanGestureRecognizer *)gesture {
CGPoint positionInView = [gesture locationInView:self.scrollView];
}
I have a UIPanGestureRecognizer attached to a map view in my window (recognizing gestures simultaneously so I can receive events when the map pans). However, for some reason I can't figure out, [UIPanGestureRecognizer translationInView:] always returns coordinates relative to the origin of the pan. In fact, it behaves identically regardless of what view parameter I pass to it, even nil!
I can't figure out why it's not telling me coordinates relative to the view I pass in.
self.mapViewPan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
self.mapViewPan.delegate = self;
[self.mapView addGestureRecognizer:self.mapViewPan];
- (void)pan:(UIPanGestureRecognizer *)sender {
CGPoint translatedPoint = [sender translationInView:self.view];
NSLog(@"%f, %f", translatedPoint.x, translatedPoint.y);
}
Never mind, I seem to have misunderstood what translationInView: actually represents. It appears to be relative to the origin of the touch regardless (which makes me wonder what its parameter actually affects).
It turns out what I really want is locationInView::
CGPoint touchPoint = [sender locationInView:self.view];
This returns the location of the touch in the view's coordinates.
Is there any reason you're not using the MKMapViewDelegate methods for this?
mapView:regionDidChangeAnimated: should let you get the map region when it scrolls.
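As a minimal sketch of that delegate callback (assuming your controller conforms to MKMapViewDelegate and is set as the map view's delegate):

```objc
// Called whenever the visible map region changes, e.g. after a pan or zoom.
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
{
    MKCoordinateRegion region = mapView.region;
    NSLog(@"Map now centered at %f, %f",
          region.center.latitude, region.center.longitude);
}
```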
I am working on an iOS map app and it includes interactive map. The interactive map is a subclass of UIImageView and placed on a scrollView. My view hierarchy is shown below:
When the user taps some part of the map, the ViewController performs an animated segue (like a zoom-in to that area of the map). I can start the segue from any point of the screen, but to do this properly I need the exact coordinates of the user's tap relative to the screen itself. As the ImageView sits on top of the ScrollView, it uses a different coordinate system, larger than the screen size. It doesn't matter which area of the map has been tapped; what matters is the tapped CGPoint on the (physical) screen.
ImageView uses its own code to get coordinates of a tap:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesEnded:touches withEvent:event];
// cancel previous touch ended event
[NSObject cancelPreviousPerformRequestsWithTarget:self];
CGPoint touchPoint = [[touches anyObject] locationInView:self];
NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];
// perform new one
[self performSelector:@selector(_performHitTestOnArea:) withObject:touchValue afterDelay:0.1];
}
And if I attach a gesture recognizer instead, it works, but the ImageView then can't receive any touches and, therefore, can't trigger the segue.
The code for gesture recognizer, I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];
// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
if(recognizer.state == UIGestureRecognizerStateRecognized)
{
CGPoint point = [recognizer locationInView:recognizer.view];
// again, point.x and point.y have the coordinates
}
}
So, is there any way to get the coordinates in two different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to the other's:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Where self.parentView is the "View" in the hierarchy image, with the size of the screen.
How would you properly determine if a point is inside a rotated CGRect/frame?
The frame is rotated with Core Graphics.
So far I've found an algorithm that calculates if a point is inside a triangle, but that's not quite what I need.
The frame being rotated is a regular UIView with a few subviews.
Let's imagine that you use transform property to rotate a view:
self.sampleView.transform = CGAffineTransformMakeRotation(M_PI_2 / 3.0);
If you then have a gesture recognizer, for example, you can see if the user tapped in that location using locationInView with the rotated view, and it automatically factors in the rotation for you:
- (void)handleTap:(UITapGestureRecognizer *)gesture
{
CGPoint location = [gesture locationInView:self.sampleView];
if (CGRectContainsPoint(self.sampleView.bounds, location))
NSLog(@"Yes");
else
NSLog(@"No");
}
Or you can use convertPoint:
- (void)handleTap:(UITapGestureRecognizer *)gesture
{
CGPoint locationInMainView = [gesture locationInView:self.view];
CGPoint locationInSampleView = [self.sampleView convertPoint:locationInMainView fromView:self.view];
if (CGRectContainsPoint(self.sampleView.bounds, locationInSampleView))
NSLog(@"Yes");
else
NSLog(@"No");
}
The convertPoint method obviously doesn't need to be used in a gesture recognizer, but rather it can be used in any context. But hopefully this illustrates the technique.
Use CGRectContainsPoint() to check whether a point is inside a rectangle or not.
I'm learning iOS and I can't find how to add drag and drop behavior to a UIView.
I tried:
[_view addTarget:self action:@selector(moved:withEvent:) forControlEvents:UIControlEventTouchDragInside];
It says "no visible interface for UIView declares selector addTarget (etc)"
I also tried adding a pan gesture recognizer, but I'm not sure if that's what I need:
- (IBAction)test:(id)sender {
NSLog(@"dfsdfsf");
}
It's called, but I don't know how to get the coordinates of the event. What's the standard, simple way in iOS to register a callback for move events / do drag and drop?
Thanks in advance.
A UIPanGestureRecognizer is definitely the way to go. If you want the user to drag the view around, you'll need the “translation” (movement) of the gesture in the superview's coordinate system:
- (IBAction)panWasRecognized:(UIPanGestureRecognizer *)recognizer {
CGPoint translation = [recognizer translationInView:_view.superview];
Once you have the translation, you can move (“drag”) the view by changing its center:
CGPoint center = _view.center;
center.x += translation.x;
center.y += translation.y;
_view.center = center;
Finally, you want to set the pan gesture recognizer's translation back to zero, so the next time you get the message, it only tells you how much the gesture has moved since the last message:
[recognizer setTranslation:CGPointZero inView:_view.superview];
}
Here it is all together for easy copy/paste:
- (IBAction)panWasRecognized:(UIPanGestureRecognizer *)recognizer {
CGPoint translation = [recognizer translationInView:_view.superview];
CGPoint center = _view.center;
center.x += translation.x;
center.y += translation.y;
_view.center = center;
[recognizer setTranslation:CGPointZero inView:_view.superview];
}
Start with touchesBegan, touchesMoved, touchesEnded. Override these in your UIView subclass and you'll be on your way to learning the event system. You can get the event coordinates like so:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint point = [[touches anyObject] locationInView:self];
float x = point.x;
float y = point.y;
}
Then there's a ton of stuff for converting coordinates between different views and so on. Once you've understood that, you can work with the UIGestureRecognizer stuff you've already found which is what you need.
You are going to need a pan gesture recognizer to do drag/drop. You can use the locationInView: selector in UIPanGestureRecognizer to find out where you are at any given moment.
You add your gesture recognizer like so, not with the target-action stuff you were trying:
UIPanGestureRecognizer *dragDropRecog = [[UIPanGestureRecognizer alloc] initWithTarget:yourView action:@selector(thingDragged:)];
[yourView addGestureRecognizer:dragDropRecog];
Then you have to implement the selector thingDragged: in your view:
- (void) thingDragged:(UIPanGestureRecognizer *) gesture
{
CGPoint location = [gesture locationInView:self];
if ([gesture state] == UIGestureRecognizerStateBegan) {
// Drag started
} else if ([gesture state] == UIGestureRecognizerStateChanged) {
// Drag moved
} else if ([gesture state] == UIGestureRecognizerStateEnded) {
// Drag completed
}
}
You'll be translating the view being dragged in the changed bit, and handling the drop in the ended section.
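As a sketch of those two branches (assuming, as in the snippet above, that the handler lives in the view being dragged, so self is that view):

```objc
- (void)thingDragged:(UIPanGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateChanged) {
        // Move the view by the accumulated translation, then reset it so the
        // next callback reports only the movement since this one.
        CGPoint translation = [gesture translationInView:self.superview];
        self.center = CGPointMake(self.center.x + translation.x,
                                  self.center.y + translation.y);
        [gesture setTranslation:CGPointZero inView:self.superview];
    } else if (gesture.state == UIGestureRecognizerStateEnded) {
        // Handle the drop here, e.g. hit-test the final location to decide
        // where the dragged view lands.
    }
}
```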