Get vector in SCNNode environment from touch location (Swift, iOS)

I have the position and orientation of my camera and the CGPoint touch location on the screen. I need the line (preferably a vector) pointing in the direction that I touched on the screen, in my 3D SCNNode environment. How can I get this?
A code snippet would be very helpful.

You can use the SCNSceneRenderer.unprojectPoint(_:) method for this.
This method, which is implemented by SCNView, takes the coordinates of your point as an SCNVector3. Set the first two elements in the coordinate space of your view. Apple describes the use of the third element:
The z-coordinate of the point parameter describes the depth at which to unproject the point relative to the near and far clipping planes of the renderer's viewing frustum (defined by its pointOfView node). Unprojecting a point whose z-coordinate is 0.0 returns a point on the near clipping plane; unprojecting a point whose z-coordinate is 1.0 returns a point on the far clipping plane.
You are not looking for the location of these points, but for the line that connects them. Subtract the near point from the far point to get the direction vector of that line.
func getDirection(for point: CGPoint, in view: SCNView) -> SCNVector3 {
    // unprojectPoint expects Float components, so convert the CGPoint's CGFloat values.
    let farPoint = view.unprojectPoint(SCNVector3Make(Float(point.x), Float(point.y), 1))
    let nearPoint = view.unprojectPoint(SCNVector3Make(Float(point.x), Float(point.y), 0))
    return SCNVector3Make(farPoint.x - nearPoint.x,
                          farPoint.y - nearPoint.y,
                          farPoint.z - nearPoint.z)
}
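As a usage sketch (assuming a UITapGestureRecognizer attached to the SCNView; handleTap and the 10-unit distance are illustrative assumptions, not part of the answer above), you can combine the near point with this direction to form a ray and pick a point along it:

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    guard let scnView = gesture.view as? SCNView else { return }
    let location = gesture.location(in: scnView)
    // Ray origin on the near clipping plane, direction from the helper above.
    let origin = scnView.unprojectPoint(SCNVector3Make(Float(location.x), Float(location.y), 0))
    let direction = getDirection(for: location, in: scnView)
    // Normalise the direction so distances along the ray are in scene units.
    let length = (direction.x * direction.x + direction.y * direction.y + direction.z * direction.z).squareRoot()
    let unit = SCNVector3Make(direction.x / length, direction.y / length, direction.z / length)
    // For example, a point 10 units in front of the camera along the touch ray:
    let target = SCNVector3Make(origin.x + unit.x * 10,
                                origin.y + unit.y * 10,
                                origin.z + unit.z * 10)
    print(target)
}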

Related

iOS ARKit Swift: how can I get the distance between me and an SCNNode when I move?

I have an issue with ARKit. I use ARWorldTrackingConfiguration, and when I have an SCNNode one meter away from me and walk towards it with an iPhone X, the Z position of my camera node updates and I know that I am closer to the node; but with other iPhones (iPhone 8) the Z position does not update.
GPS is not accurate enough either, even with kCLLocationAccuracyBestForNavigation.
How can I know that I am closer to the SCNNode? Thank you.
By implementing ARSCNViewDelegate in your view controller, the renderer(_:updateAtTime:) callback will be called once per frame.
Inside this function you can get the position of your camera relative to the real world by calling sceneView.session.currentFrame.camera.transform, which is a simd_float4x4. You can read the camera's current position from transform.columns.3 by taking its x, y and z fields. With these coordinates you can calculate the distance, for example the Euclidean distance, with these functions:
func distanceTravelled(xDist: Float, yDist: Float, zDist: Float) -> Float {
    return sqrt((xDist * xDist) + (yDist * yDist) + (zDist * zDist))
}

func distanceTravelled(between v1: SCNVector3, and v2: SCNVector3) -> Float {
    let xDist = v1.x - v2.x
    let yDist = v1.y - v2.y
    let zDist = v1.z - v2.z
    return distanceTravelled(xDist: xDist, yDist: yDist, zDist: zDist)
}
Remember to convert the nodes' coordinates to world coordinates (node.worldPosition, where node is an instance of SCNNode).
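A minimal sketch of tying this together in the per-frame callback (assuming sceneView is your ARSCNView and targetNode is the SCNNode you want to measure against; both names are hypothetical):

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let transform = sceneView.session.currentFrame?.camera.transform else { return }
    // The camera's world position is the translation column of its transform.
    let cameraPosition = SCNVector3Make(transform.columns.3.x,
                                        transform.columns.3.y,
                                        transform.columns.3.z)
    let distance = distanceTravelled(between: cameraPosition, and: targetNode.worldPosition)
    print("Distance to node: \(distance) m")
}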

Check if node is visible on the screen

I currently have a large map that goes off the screen; because of this, its coordinate system is very different from my other nodes'. This has led me to a problem: I need to generate a random CGPoint within the bounds of this map, and if that point is in frame/on screen I place a visible node there. However, the check for whether or not the node is on screen continuously fails.
I'm checking if the node is in frame with the following code: CGRectContainsPoint(self.frame, values) (with values being the random CGPoint I generated). This is where my problem comes in: the coordinate system of the frame is completely different from the coordinate system of the map.
For example, in the picture below the ball with the arrows pointing to it is at coordinates (479, 402) in the scene's coordinates, but it is actually at (9691, 9753) in the map's coordinates.
I determined the coordinates using the touchesBegan event, for those who are wondering. So basically, how do I convert that map coordinate system to one that will work for the frame?
Because, as seen below, the dot is obviously in the frame, yet CGRectContainsPoint always fails. I've tried doing scene.convertPoint(position, fromNode: map) but it didn't work.
Edit: (to clarify some things)
My view hierarchy looks something like this:
The map node goes off screen and is about 10,000 x 10,000 in size (I have it as a scrolling-type map). The origin (0, 0) of this node is in the bottom-left corner, where the map starts, meaning the origin is off screen. In the picture above, I'm near the top-right part of the map. I'm generating a random CGPoint with the following code (passing it the map's frame), written as an extension on CGPoint:
static func randPoint(within: CGRect) -> CGPoint {
    var point = within.origin
    point.x += CGFloat(arc4random() % UInt32(within.size.width))
    point.y += CGFloat(arc4random() % UInt32(within.size.height))
    return point
}
I then have the following code (called in didMoveToView; note that I'm applying this to nodes I'm generating, I just left that code out), where values is the random position:
let values = CGPoint.randPoint(map.totalFrame)
if !CGRectContainsPoint(self.frame, convertPointToView(scene!.convertPoint(values, fromNode: map))) {
    color = UIColor.clearColor()
}
This is meant to make nodes that are off screen invisible (since the user can scroll the map background). But the check always evaluates as true, making all nodes invisible, even though some nodes are indeed within the frame (as seen in the picture above, where I commented out the clear-color code).
If I understand your question correctly, you have an SKScene that contains an SKSpriteNode that is larger than the scene's view, and that you are randomly generating coordinates within that sprite's coordinate system that you want to map to the view.
You're on the right track with SKNode's convertPoint(_:fromNode:) (where your scene is the SKNode and your map is the fromNode). That should get you from the generated map coordinate to the scene coordinate. Next, convert that coordinate to the view's coordinate system using your scene's convertPointToView(_:). The point will be out of bounds if it is not in view.
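A hedged sketch of that double conversion in modern Swift (assuming map is the large SKSpriteNode, scene is the SKScene, skView is the presenting SKView, and node is the sprite being placed; names taken from or assumed for the question):

let randomPoint = CGPoint.randPoint(within: map.frame)
// Map coordinates -> scene coordinates -> view coordinates.
let scenePoint = scene.convert(randomPoint, from: map)
let viewPoint = scene.convertPoint(toView: scenePoint)
// The point is on screen only if it falls inside the view's bounds.
node.isHidden = !skView.bounds.contains(viewPoint)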
Using a worldNode which includes a playerNode, and having the camera centre on this node, you can check whether an object is on or off screen with this code:
float left = player.position.x - 700;
float right = player.position.x + 700;
float up = player.position.y + 450;
float down = player.position.y - 450;

if ((object.position.x > left) && (object.position.x < right) && (object.position.y > down) && (object.position.y < up)) {
    if ((object.parent == nil) && (object.dead == false)) {
        [worldNode addChild:object];
    }
} else {
    if (object.parent != nil) {
        [object removeFromParent];
    }
}
The numbers I used above are static. You can also make them dynamic:
CGRect screenRect = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenRect.size.width;
CGFloat screenHeight = screenRect.size.height;
Divide the screenWidth by 2 for left and right, and do the same with screenHeight for up and down.
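For reference, a hedged Swift equivalent of the dynamic version (assuming player, object, and worldNode as in the snippet above, and that half the screen width/height on each side of the player counts as visible):

let screenSize = UIScreen.main.bounds.size
let visibleRect = CGRect(x: player.position.x - screenSize.width / 2,
                         y: player.position.y - screenSize.height / 2,
                         width: screenSize.width,
                         height: screenSize.height)
if visibleRect.contains(object.position) {
    // Add the object to the world only while it is on screen.
    if object.parent == nil { worldNode.addChild(object) }
} else if object.parent != nil {
    object.removeFromParent()
}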

Passing UIBezierPath to view class for drawing

I am making a level based game with many different objects, all different. In each level, there will be different amounts of each type of object. Thus, I have been trying to make the drawing part as generic as possible so that all I have to do is pass in the coords and it will automatically draw. To do this, I have made a protocol that forces each object class to implement the method getBP(), which returns the UIBezierPath to draw for each. Then, the view class just has to say
Object.getBP().fill()
However, this has been leading to some strange problems. The object does not draw at the correct coordinates: the y coordinate is correct, but the x coordinate always puts it at the left of the screen. I think it may be because the Bézier path is not being created in the view class. Here is my code in Surface.swift (this is meant to draw a surface in the game):
func getBP() -> UIBezierPath {
    var rect: CGRect
    var length: Double = getSurfaceVector().getMagnitude() // length of the surface
    var cx = points.1.x + (points.0.x - points.1.x) // center coords of the surface
    var cy = points.1.y + (points.0.y - points.1.y)
    var bp = UIBezierPath(roundedRect: CGRectMake(CGFloat(cx - length/2), CGFloat(cy - RECT_HEIGHT/2), CGFloat(length), CGFloat(RECT_HEIGHT)), cornerRadius: CGFloat(5))
    let transform: CGAffineTransform = CGAffineTransformMakeRotation(CGFloat(Double(angle) * (Double(M_PI) / Double(180))))
    bp.applyTransform(transform)
    return bp
}
points is just a tuple with the start and end points of the surface. RECT_HEIGHT is the height of the rectangle that is drawn to represent the surface. angle is the angle from horizontal of the surface.
Creating the surface in View.swift, I do this:
Surface(fixed: true, points: (Vector(x: 50, y:100), Vector(x: Double(UIScreen.mainScreen().bounds.width), y: 100)))
I add that surface to the array of objects in the game. I draw it in the View.swift file by saying
surface.stroke()
The surface draws on the screen with a y value of 100, but it is centered at x = 0 so that it is half on and half off of the screen. Also, it doesn't draw at the angle - it is always horizontal. Is there some better way of doing this? What is happening?
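As a hedged sketch of one possible fix (keeping the question's points, getSurfaceVector(), RECT_HEIGHT and angle, and assuming angle is in degrees, not as a definitive answer): the centre should be the midpoint of the two endpoints, and the rotation should be applied about that centre rather than about the origin of the coordinate system, for example:

func getBP() -> UIBezierPath {
    let length = getSurfaceVector().getMagnitude()
    // Midpoint of the two endpoints (the original added the full difference, landing back on points.0).
    let cx = (points.0.x + points.1.x) / 2
    let cy = (points.0.y + points.1.y) / 2
    let rect = CGRect(x: CGFloat(cx - length / 2), y: CGFloat(cy - RECT_HEIGHT / 2),
                      width: CGFloat(length), height: CGFloat(RECT_HEIGHT))
    let bp = UIBezierPath(roundedRect: rect, cornerRadius: 5)
    // Rotate about the rectangle's centre, not about (0, 0).
    var transform = CGAffineTransform(translationX: CGFloat(cx), y: CGFloat(cy))
    transform = transform.rotated(by: CGFloat(angle) * .pi / 180)
    transform = transform.translatedBy(x: CGFloat(-cx), y: CGFloat(-cy))
    bp.apply(transform)
    return bp
}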

Calculating point coordinates from user tap with constraints

I am trying to calculate the coordinates along a circle corresponding to the tap location. The coordinates should be on the border of the circle, at the point nearest to the tap location (i.e. the intersection that is less distant from the tap). To facilitate this, I only detect taps whose distance from the circle's center is at least 80% of the radius.
Input:
P (CGPoint) - center of the circle
P1 (CGPoint) - current position of an image displayed
r (float) - radius of the circle
P3 (CGPoint) - user tap coordinate
Desired output:
P2 (CGPoint) - new coordinates for the image, corresponding to P3 but along the circle. Sorry for the bad explanation; in other words: once the user taps on the screen, I would like to move the image to P2. P2 should be derived by moving P3 to the border of the circle, which should be possible using the radius information.
The idea is to create from the P3 coordinates a new coordinate P2 as described above. The key is that P2's distance from the centre should correspond exactly to the radius, and the angle should be the same as the tap point's.
Would anyone be able to suggest a formula to calculate this coordinate given a tap? I simply need to calculate P2 using the input I have.
Code so far:
- (void)tapInImageView:(UITapGestureRecognizer *)tap
{
    CGPoint tapPoint = [tap locationInView:tap.view];
    if ([self isInOuternCircle:tapPoint]) {
        // create from tapPoint a new coordinate P2 as described above - but I have no idea how.
        // The key is that P2's distance from the centre should correspond exactly to the radius
        // and the angle should be the same as tapPoint's.
    }
}

- (BOOL)isInOuternCircle:(CGPoint)point
{
    double distanceToCenter = sqrt((point.x - _timerView.center.x) * (point.x - _timerView.center.x) + (point.y - _timerView.center.y) * (point.y - _timerView.center.y));
    if (distanceToCenter < _innerCircleRadius) {
        return false;
    }
    return true;
}
I've done this once before, but the math usually depends on how you've set up your coordinate system, so I'll just outline what I did. You'll need a bit of geometry, and a few formulae to determine the new coordinate along the circle.
Calculate the formula of a line passing through the center (P) and your tap point (P3) using this: http://en.wikipedia.org/wiki/Linear_equation#Two-point_form
Determine the equation for your circle: http://en.wikipedia.org/wiki/Circle#Equations
Using the above two equations, you'll have a system of a linear and a quadratic equation: http://www.mathsisfun.com/algebra/systems-linear-quadratic-equations.html
Once you have the equations above, you need to solve the system. The result will yield two possible points (the line intersects the circle in two places), and the point you are looking for is the one closer to the tap point. Just compare the distances to P3 between the two solutions; the shorter distance gives your required solution, P2.
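A hedged Swift sketch of the same idea (center, radius, and tapPoint stand in for P, r, and P3; the function name is illustrative): since P2 lies on the ray from the centre through the tap, you can also get it directly by normalising that vector and scaling it by the radius, which picks the nearer intersection automatically.

import CoreGraphics

// P2 = P + r * (P3 - P) / |P3 - P|
func pointOnCircle(center: CGPoint, radius: CGFloat, tapPoint: CGPoint) -> CGPoint {
    let dx = tapPoint.x - center.x
    let dy = tapPoint.y - center.y
    let distance = (dx * dx + dy * dy).squareRoot()
    // Degenerate case: the tap is exactly on the centre, so any angle would do.
    guard distance > 0 else { return CGPoint(x: center.x + radius, y: center.y) }
    return CGPoint(x: center.x + dx / distance * radius,
                   y: center.y + dy / distance * radius)
}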

Best practice for using lat/long within a UIView (not MKMapView)

Basically I have a list of POIs (name, lat, long) and I want to draw them on a UIView, relative to my current lat/long. I'm looking for some best practice for mapping these POIs (lat/long) to a UIView.
I don't want to use MKMapView (there is no need to display map data).
I was reading:
http://developer.apple.com/library/ios/#documentation/general/conceptual/Devpedia-CocoaApp/CoordinateSystem.html
But I'm clueless how to get from a CLLocation to an (x, y) on my UIView. I only want to draw those POIs around my current location. So, for example, if my screen represented a 20 by 30 km region, how would I map my POIs to their corresponding (x, y) coordinates?
Thanks.
What you're doing is a little strange, but you can convert latitude/longitude to a CGPoint-like struct called an MKMapPoint. An MKMapPoint has an x and y value which correspond to points on a map. Imagine if you laid out a flat map of the world, and 0,0 was the top left. MKMapPoint is a point on that map using that coordinate system.
Use the function MKMapPointForCoordinate() to convert a CLLocationCoordinate2D to an MKMapPoint:
MKMapPoint myMapPoint = MKMapPointForCoordinate(myLocationCoordinate);
When you get the list of points, you'll have to do something like finding the max and min x and y values, then fitting all the points into your view using those values, otherwise you'll end up with a load of very close points in one place in your view.
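A hedged sketch of that fitting step (assuming you have already converted your POIs with MKMapPointForCoordinate and want to place them inside a given view size; the function name and parameters are illustrative):

import UIKit
import MapKit

// Normalise a set of MKMapPoints against their bounding box and scale them to a view.
func viewPoints(for mapPoints: [MKMapPoint], in viewSize: CGSize) -> [CGPoint] {
    guard let minX = mapPoints.map({ $0.x }).min(),
          let maxX = mapPoints.map({ $0.x }).max(),
          let minY = mapPoints.map({ $0.y }).min(),
          let maxY = mapPoints.map({ $0.y }).max(),
          maxX > minX, maxY > minY else { return [] }
    return mapPoints.map { point in
        CGPoint(x: CGFloat((point.x - minX) / (maxX - minX)) * viewSize.width,
                y: CGFloat((point.y - minY) / (maxY - minY)) * viewSize.height)
    }
}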
My guess is that, for a 20 km by 30 km region, you can consider the earth to be flat and therefore linearly interpolate the coordinates. You can look up how much distance a difference of 0.00001 in latitude or longitude corresponds to.
So if 20 km is to be represented on the X axis, your current location is 30.1234567 in latitude, and (say) 0.0000001 corresponds to 1 km, then you can put your coordinate in the center of the screen, use 30.1234557 as the left-most X coordinate, and so on.
I am not trying to provide a full answer here, just thinking out loud, because I wanted to do something similar as well and did it as an Internet-based app (without display though), where, given two coordinates, I had to find the distance between them.
There are many (many) different approaches to modelling the planet and translating 3D coordinates onto a 2D surface, and the errors introduced by the various methods vary depending on what part of the globe you are. This question seems to cover most of what you are after though:
Converting Longitude & Latitude to X Y on a map with Calibration points
I think this is the best way (it works correctly for a Mercator projection map):
extension UIView {
    func addLocation(coordinate: CLLocationCoordinate2D) {
        // Maximum MKMapPoint values
        let maxY = Double(267995781)
        let maxX = Double(268435456)

        let mapPoint = MKMapPointForCoordinate(coordinate)
        let normalizedPointX = CGFloat(mapPoint.x / maxX)
        let normalizedPointY = CGFloat(mapPoint.y / maxY)

        let pointView = UIView(frame: CGRectMake(0, 0, 5, 5))
        pointView.center = CGPointMake(normalizedPointX * frame.width, normalizedPointY * frame.height)
        pointView.backgroundColor = UIColor.blueColor()

        addSubview(pointView)
    }
}
My simple project for adding coordinate on UIView: https://github.com/Glechik/MapCoordinateDrawer
