Point inside a rotated CGRect - ios

How would you properly determine if a point is inside a rotated CGRect/frame?
The frame is rotated with Core Graphics.
So far I've found an algorithm that calculates if a point is inside a triangle, but that's not quite what I need.
The frame being rotated is a regular UIView with a few subviews.

Let's imagine that you use the transform property to rotate a view:
self.sampleView.transform = CGAffineTransformMakeRotation(M_PI_2 / 3.0);
If you then have a gesture recognizer, for example, you can check whether the user tapped inside that view by calling locationInView: with the rotated view; it automatically factors in the rotation for you:
- (void)handleTap:(UITapGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:self.sampleView];
    if (CGRectContainsPoint(self.sampleView.bounds, location))
        NSLog(@"Yes");
    else
        NSLog(@"No");
}
Or you can use convertPoint:
- (void)handleTap:(UITapGestureRecognizer *)gesture
{
    CGPoint locationInMainView = [gesture locationInView:self.view];
    CGPoint locationInSampleView = [self.sampleView convertPoint:locationInMainView fromView:self.view];
    if (CGRectContainsPoint(self.sampleView.bounds, locationInSampleView))
        NSLog(@"Yes");
    else
        NSLog(@"No");
}
The convertPoint:fromView: method obviously doesn't have to be used inside a gesture recognizer handler; it can be used in any context. But hopefully this illustrates the technique.
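For instance, here is a minimal sketch (my own, not from the original answer) of the same check outside a gesture handler, assuming the point to test is given in self.view's coordinate system:
- (BOOL)rotatedSampleViewContainsPoint:(CGPoint)pointInSuperview
{
    // convertPoint:fromView: applies the view's transform, so the result
    // is expressed in sampleView's own (unrotated) bounds coordinates.
    CGPoint pointInSampleView = [self.sampleView convertPoint:pointInSuperview
                                                     fromView:self.view];
    return CGRectContainsPoint(self.sampleView.bounds, pointInSampleView);
}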

Use CGRectContainsPoint() to check whether a point is inside a rectangle or not.

Related

How to get coordinates of the tap in 2 views simultaneously

I am working on an iOS map app that includes an interactive map. The interactive map is a subclass of UIImageView placed on a scroll view. My view hierarchy, top to bottom: a screen-sized View, containing a ScrollView, containing the ImageView.
When the user taps some part of the map, the ViewController performs an animated segue (like a zoom-in to that area of the map). I can start the segue from any point of the screen, but to do this properly I need the exact coordinates of the user's tap relative to the screen itself. Because the ImageView sits on top of the ScrollView, it uses a different coordinate system, larger than the screen. It doesn't matter which area of the map has been tapped; what matters is the tapped CGPoint on the physical screen.
The ImageView uses its own code to get the coordinates of a tap:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    // cancel the previous touch-ended event
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];
    // perform a new one
    [self performSelector:@selector(_performHitTestOnArea:)
               withObject:touchValue
               afterDelay:0.1];
}
If I add a gesture recognizer instead, it works, but then the ImageView no longer receives any touches and therefore can't trigger the segue.
The gesture recognizer code I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];

// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateRecognized)
    {
        CGPoint point = [recognizer locationInView:recognizer.view];
        // again, point.x and point.y have the coordinates
    }
}
So, is there any way to get the coordinates in two different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to another's:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Where self.parentView is the "View" at the top of the hierarchy, i.e. the view with the size of the screen.
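As a hedged aside (my own addition, not part of the fix above): since the goal was coordinates relative to the physical screen, the stored touch point can also be converted straight to the window by passing nil as the target view:
// Passing nil to convertPoint:toView: converts to the window's coordinates;
// touchPoint is the same property referenced in the line above.
CGPoint pointOnScreen = [self.imageView convertPoint:self.imageView.touchPoint toView:nil];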

Is it possible to know when [UIDynamicItemBehavior addLinearVelocity:forItem:] has finished running?

I'm using a UIPanGestureRecognizer and UIAttachmentBehavior to move a UIView around the screen. When the user ends the gesture I apply the velocity of the gesture recognizer to the view using a UIDynamicItemBehavior and the addLinearVelocity:forItem: method.
Here is the code I use:
- (void)_handlePanGestureRecognized:(UIPanGestureRecognizer *)panGestureRecognizer
{
    if (panGestureRecognizer.state == UIGestureRecognizerStateBegan)
    {
        _attachmentBehavior.anchorPoint = panGestureRecognizer.view.center;
        [_dynamicAnimator addBehavior:_attachmentBehavior];
    }
    else if (panGestureRecognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint point = [panGestureRecognizer locationInView:panGestureRecognizer.view.superview];
        _attachmentBehavior.anchorPoint = point;
    }
    else if (panGestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        [_dynamicAnimator removeBehavior:_attachmentBehavior];
        CGPoint velocity = [panGestureRecognizer velocityInView:panGestureRecognizer.view.superview];
        [_dynamicItemBehavior addLinearVelocity:velocity forItem:self];
    }
}
When the view stops moving I would then like to have it snap to the closest edge of the screen but I currently have no way of knowing when it has stopped moving short of polling the view's center with a CADisplayLink.
Have you tried attaching a UIDynamicAnimatorDelegate to your animator, and using the dynamicAnimatorDidPause: method to trigger snapping to the closest edge?
From reading on the developer forums, it sounds like some have had problems with their views staying in motion for a very long time (jiggling back and forth by 1 pixel, for example), but perhaps this will work for your case.
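A minimal sketch of that delegate approach (the property names _dynamicAnimator and _itemView, and the edge math, are my assumptions rather than part of the answer):
// During setup, assuming the controller owns the animator and adopts
// UIDynamicAnimatorDelegate:
// _dynamicAnimator.delegate = self;

- (void)dynamicAnimatorDidPause:(UIDynamicAnimator *)animator
{
    // The animator pauses once all of its items have come to rest.
    CGRect bounds = self.view.bounds;
    CGPoint center = _itemView.center;

    // Snap horizontally to whichever screen edge is closer (illustrative only).
    CGFloat halfWidth = CGRectGetWidth(_itemView.bounds) / 2.0;
    CGFloat snapX = (center.x < CGRectGetMidX(bounds))
        ? halfWidth
        : CGRectGetWidth(bounds) - halfWidth;

    UISnapBehavior *snap = [[UISnapBehavior alloc] initWithItem:_itemView
                                                    snapToPoint:CGPointMake(snapX, center.y)];
    [animator addBehavior:snap];
    // In a real implementation you would guard against re-adding the snap
    // when the animator pauses again after the snap itself finishes.
}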

UIPanGestureRecognizer not working as expected with multiple pans

Essentially, what I want to do is to move a view around to follow the user's pan. This works fine as long as the same pan object is being used. The problem comes when the user releases and then starts another pan.
According to the documentation, the value in translationInView is relative to the position at the start of the pan.
So my strategy for handling this was to add two properties to my view so I can tell whether the same pan object is being used and what the reference location is. The self object is the object being moved. It is a UIView subclass.
CGPoint originalPoint;
if (pan == self.panObject) {
    // If the pan object is the same as the one in the property, use the saved value as the reference point.
    originalPoint = CGPointMake(self.panStartLocation.x, self.panStartLocation.y);
} else {
    // If the pan object is DIFFERENT, set the originalPoint from the existing center.
    // self.center is in self.superview's coordinate system.
    originalPoint = CGPointMake(self.center.x, self.center.y);
    self.panStartLocation = CGPointMake(originalPoint.x, originalPoint.y);
    self.panObject = pan;
}
CGPoint translation = [pan translationInView:self.superview];
self.center = CGPointMake(originalPoint.x + translation.x, originalPoint.y + translation.y);
This scheme doesn't work because each pan apparently delivers the same recognizer object. I've spent a bit of time in the debugger verifying this, and it seems to be true. I thought the pan object would be different for each touch. Since this doesn't work, what is the alternative?
I solved it. Here is the corrected code:
CGPoint originalPoint;
if (pan.state == UIGestureRecognizerStateBegan) {
    originalPoint = CGPointMake(self.center.x, self.center.y);
    self.panStartLocation = CGPointMake(originalPoint.x, originalPoint.y);
} else {
    originalPoint = CGPointMake(self.panStartLocation.x, self.panStartLocation.y);
}
CGPoint translation = [pan translationInView:self.superview];
self.center = CGPointMake(originalPoint.x + translation.x, originalPoint.y + translation.y);
EDIT: A better approach is to take advantage of the fact that the gesture recognizer allows you to set the translation:
[sender setTranslation:CGPointMake(0.0, 0.0) inView:self.pieceBeingMoved];
Do this when you move your item, and then the new translation next time will be relative to the position you just moved to.
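Put together, a hedged sketch of that approach inside the UIView subclass from the question (so self is the piece being moved; the handler name handlePan: is my own) might look like:
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGPoint translation = [pan translationInView:self.superview];
    self.center = CGPointMake(self.center.x + translation.x,
                              self.center.y + translation.y);
    // Zero the translation so the next callback reports only the movement
    // since this one.
    [pan setTranslation:CGPointZero inView:self.superview];
}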

Simple Drag & Drop for a View?

I'm learning iOS and I can't find how to add drag and drop behavior to a UIView.
I tried:
[_view addTarget:self action:@selector(moved:withEvent:) forControlEvents:UIControlEventTouchDragInside];
It says "no visible interface for UIView declares selector addTarget (etc)"
Also, I tried adding a pan gesture recognizer, but I'm not sure if that's what I need:
- (IBAction)test:(id)sender {
    NSLog(@"dfsdfsf");
}
It's called, but I don't know how to get the coordinates of the event. What's the standard, simple way in iOS to register a callback for move events / do drag and drop?
Thanks in advance.
A UIPanGestureRecognizer is definitely the way to go. If you want the user to drag the view around, you'll need the “translation” (movement) of the gesture in the superview's coordinate system:
- (IBAction)panWasRecognized:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:_view.superview];
Once you have the translation, you can move (“drag”) the view by changing its center:
    CGPoint center = _view.center;
    center.x += translation.x;
    center.y += translation.y;
    _view.center = center;
Finally, you want to set the pan gesture recognizer's translation back to zero, so the next time you get the message, it only tells you how much the gesture has moved since the last message:
    [recognizer setTranslation:CGPointZero inView:_view.superview];
}
Here it is all together for easy copy/paste:
- (IBAction)panWasRecognized:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:_view.superview];
    CGPoint center = _view.center;
    center.x += translation.x;
    center.y += translation.y;
    _view.center = center;
    [recognizer setTranslation:CGPointZero inView:_view.superview];
}
Start with touchesBegan, touchesMoved, touchesEnded. Override these in your UIView subclass and you'll be on your way to learning the event system. You can get the event coordinates like so:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    float x = [[touches anyObject] locationInView:self].x;
    float y = [[touches anyObject] locationInView:self].y;
}
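Building on that, a minimal touches-based drag might look like this (a sketch of my own, under the assumption that the same UIView subclass drags itself; the locations are read in the superview so that moving the view doesn't shift the reference frame mid-gesture):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];

    // Move the view by the delta since the last touch event.
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}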
Then there's a ton of stuff for converting coordinates between different views and so on. Once you've understood that, you can work with the UIGestureRecognizer stuff you've already found which is what you need.
You are going to need a pan gesture recognizer to do drag/drop. You can use the locationInView: selector in UIPanGestureRecognizer to find out where you are at any given moment.
You add your gesture recognizer like so, not with the target-action stuff you were trying:
UIPanGestureRecognizer *dragDropRecog = [[UIPanGestureRecognizer alloc] initWithTarget:yourView action:@selector(thingDragged:)];
[yourView addGestureRecognizer:dragDropRecog];
Then you have to implement the selector thingDragged: in your view:
- (void)thingDragged:(UIPanGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:self];
    if ([gesture state] == UIGestureRecognizerStateBegan) {
        // Drag started
    } else if ([gesture state] == UIGestureRecognizerStateChanged) {
        // Drag moved
    } else if ([gesture state] == UIGestureRecognizerStateEnded) {
        // Drag completed
    }
}
You'll be translating the view being dragged in the changed bit, and handling the drop in the ended section.
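A hedged sketch of what those two branches might contain (the drop handling here is illustrative only, not prescribed by the answer):
- (void)thingDragged:(UIPanGestureRecognizer *)gesture
{
    if ([gesture state] == UIGestureRecognizerStateChanged) {
        // Translate the view being dragged by the movement since the last callback.
        CGPoint translation = [gesture translationInView:self.superview];
        self.center = CGPointMake(self.center.x + translation.x,
                                  self.center.y + translation.y);
        [gesture setTranslation:CGPointZero inView:self.superview];
    } else if ([gesture state] == UIGestureRecognizerStateEnded) {
        // Handle the drop, e.g. test the final position against a target's frame.
        CGPoint dropPoint = [gesture locationInView:self.superview];
        NSLog(@"Dropped at %@", NSStringFromCGPoint(dropPoint));
    }
}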

ios UIPanGestureRecognizer pointer position

UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
[self addGestureRecognizer:panRecognizer];

- (void)pan:(UIPanGestureRecognizer *)gesture {
    NSLog(@"%f", [gesture translationInView:self].x);
}
The above code will log the relative position of my current pan, but how can I get the absolute position in the view I'm in?
I simply want to slide a UIImageView to wherever the user's finger is.
translationInView gives you the pan translation (how much x has changed) and not the position of the pan in the view (the value of x). If you need the position of the pan, you have to use the method locationInView.
You can find the coordinates relatively to the view as follows:
- (void)pan:(UIPanGestureRecognizer *)gesture {
    NSLog(@"%f", [gesture locationInView:self].x);
}
Or relatively to the superview:
- (void)pan:(UIPanGestureRecognizer *)gesture {
    NSLog(@"%f", [gesture locationInView:self.superview].x);
}
Or relatively to the window:
- (void)pan:(UIPanGestureRecognizer *)gesture {
    NSLog(@"%f", [gesture locationInView:self.window].x);
}
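To tie this back to the goal of sliding the image view under the finger, a small sketch (assuming the recognizer is attached to the image view's superview and self.imageView is the view being moved; both names are my own):
- (void)pan:(UIPanGestureRecognizer *)gesture {
    // Move the image view's center to wherever the finger currently is.
    self.imageView.center = [gesture locationInView:self.imageView.superview];
}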
Swift 5
Use the location(in:) method, which returns a CGPoint value.
For example, the location of your gesture relative to self.view:
let relativeLocation = gesture.location(in: self.view)
print(relativeLocation.x)
print(relativeLocation.y)
I think a simple way to do something like this is to get the x and y of the touch and track them; once you have the point (say x: 230, y: 122), you set the scroll offset of the UIScrollView to that x and y.
