Show direction to CGPoint SpriteKit - ios

I need an arrow (a white circle) that shows the direction to a CGPoint while the user moves their icon and the camera.
In other words, the arrow (white circle) should sit on the edge of the visible screen and point the way back to the followed CGPoint.
[Demo GIF]

You are trying to do two things:
place the arrow on the correct side of the screen given the target CGPoint's position
orient the arrow towards the target CGPoint
In your touchesMoved(_:) method you can update the arrow's position and rotation. Not tested, but the principle should work:
private func placeArrow(at sourceNode: SKNode, for targetNode: SKNode) {
    let screenSizeX = UIScreen.main.bounds.width
    let screenSizeY = UIScreen.main.bounds.height
    // do not display the arrow if the target is already on screen
    let targetIsOnScreen = targetNode.position.x > cameraNode.position.x - screenSizeX/2
        && targetNode.position.x < cameraNode.position.x + screenSizeX/2
        && targetNode.position.y > cameraNode.position.y - screenSizeY/2
        && targetNode.position.y < cameraNode.position.y + screenSizeY/2
    guard !targetIsOnScreen else {
        arrowNode.isHidden = true
        return
    }
    arrowNode.isHidden = false
    // find the arrow position: pin it to the left edge if the target is to
    // the left, otherwise to the right edge, at the mid y between the two
    // points, clamped to the visible screen
    let ymin = cameraNode.position.y - screenSizeY/2 + 10
    let ymax = cameraNode.position.y + screenSizeY/2 - 10
    let midY = (sourceNode.position.y + targetNode.position.y) / 2
    let clampedMidY = min(max(midY, ymin), ymax)
    let edgeX = (targetNode.position.x < sourceNode.position.x)
        ? cameraNode.position.x - screenSizeX/2
        : cameraNode.position.x + screenSizeX/2
    arrowNode.position = CGPoint(x: edgeX, y: clampedMidY)
    // find the arrow orientation (angle from the "up" vector to the
    // source-to-target vector), see
    // https://stackoverflow.com/questions/38411494/rotating-a-sprite-towards-a-point-and-move-it-towards-it-with-a-duration
    let v1 = CGVector(dx: 0, dy: 1)
    let v2 = CGVector(dx: targetNode.position.x - sourceNode.position.x,
                      dy: targetNode.position.y - sourceNode.position.y)
    arrowNode.zRotation = atan2(v2.dy, v2.dx) - atan2(v1.dy, v1.dx)
}
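A hedged usage sketch: assuming playerNode is the user's icon and targetNode sits at the followed CGPoint (both names are assumptions, not from the original post), call the method whenever the player or camera moves, e.g. from update(_:):
override func update(_ currentTime: TimeInterval) {
    // playerNode and targetNode are assumed names, not from the original post
    placeArrow(at: playerNode, for: targetNode)
}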

Related

Keep pannable UIView inside bounds

I am developing an iOS application which contains a scalable and pannable UIView. Since the user is allowed to pan and scale the view, I'd like to keep the UIView within the boundaries of the screen.
I searched a lot on the internet, but there don't seem to be any examples that really fit my need.
Below is the panning code I wrote:
private void HandlePan(UIPanGestureRecognizer recognizer)
{
    if (recognizer.State != UIGestureRecognizerState.Began &&
        recognizer.State != UIGestureRecognizerState.Changed)
        return;
    var translation = recognizer.TranslationInView(this);
    _posX += translation.X;
    _posY += translation.Y;
    var maxX = (Bounds.Size.Width / 2) * _currentScale;
    var maxY = (Bounds.Size.Height / 2) * _currentScale;
    // TODO: The min values are wrong
    var minX = (Bounds.Size.Width / 2) / _currentScale;
    var minY = (Bounds.Size.Height / 2) / _currentScale;
    if (_posX > maxX)
        _posX = maxX;
    else if (_posX < minX)
        _posX = minX;
    if (_posY > maxY)
        _posY = maxY;
    else if (_posY < minY)
        _posY = minY;
    var translatedCenter = new CGPoint(_posX, _posY);
    Center = translatedCenter;
    recognizer.SetTranslation(CGPoint.Empty, this);
}
I managed to get the boundaries working for only two sides of the screen: maxX and maxY are correct. I just can't figure out how to calculate the correct minX and minY values.
Below I added a screen recording to show what is going wrong:
You can see that the coordinates get stuck (even more so as the scale grows above 1) when I try to drag to the right/bottom side of the image (which is governed by the minX and minY values).
What is wrong in this calculation, or what should the correct calculation be? Note that maxX and maxY work perfectly.
var minX = (Bounds.Size.Width / 2) / _currentScale;
var minY = (Bounds.Size.Height / 2) / _currentScale;
Another approach for calculating boundaries could also be:
int centerPoint = (int)(Bounds.Size.Width / 2);
var minX = centerPoint - ((_currentScale - 1) * centerPoint);
if (_posX < minX)
    _posX = minX;
After lots of trial and error I managed to fix it by moving the image manually to the borders of the frame and writing down all the values.
The values looked like this:
Bounds = 375
maxX = 187.5, minX = 187.5, scale = 1
maxX = 549.039223765595, minX = -173.989914508274, scale = 2.92820919341651
maxX = 915.278849493097, minX = -539.668274239639, scale = 4.88052876267261
maxX = 1575.12891219963, minX = -1198.9316584188, scale = 8.40338022601214
Based on these values, the following formula yields the min values:
var minX = -(maxX - Bounds.Size.Width);
var minY = -(maxY - Bounds.Size.Height);
Now it's working fine.
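For reference, here is a minimal Swift sketch of the whole clamp using that corrected formula (the names are assumptions, and the view is assumed to scale around its own center):
import CoreGraphics
// Clamp a proposed center point so the scaled view stays on screen;
// minX/minY mirror maxX/maxY around the screen size, as derived above
func clampedCenter(_ proposed: CGPoint, bounds: CGRect, scale: CGFloat) -> CGPoint {
    let maxX = (bounds.width / 2) * scale
    let maxY = (bounds.height / 2) * scale
    let minX = -(maxX - bounds.width)
    let minY = -(maxY - bounds.height)
    return CGPoint(x: min(max(proposed.x, minX), maxX),
                   y: min(max(proposed.y, minY), maxY))
}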

Camera is not following the airplane in Scenekit

I have a flying aircraft which I am following, and I am also showing the path the aircraft has followed. I draw cylinders as lines to trace the path, which is essentially drawing a line between 2 points. I have a cameraNode which is initially set to (0, 200, 200). At that point I can see the aircraft, but when I start my flight it goes out of the screen. I want 2 things:
Follow just the aircraft (Path won't matter).
Show whole path and also the aircraft.
I tried finding the min and max x, y, and z and taking the average, but it didn't work. As you can see in the gif below, the view is too zoomed in and the aircraft has moved out of the screen.
Here is how I set my camera:
- (void)setUpCamera {
    SCNScene *workingScene = [self getWorkingScene];
    _cameraNode = [[SCNNode alloc] init];
    _cameraNode.camera = [SCNCamera camera];
    _cameraNode.camera.zFar = 500;
    _cameraNode.position = SCNVector3Make(0, 60, 50);
    [workingScene.rootNode addChildNode:_cameraNode];
    SCNNode *frontCameraNode = [SCNNode node];
    frontCameraNode.position = SCNVector3Make(0, 100, 50);
    frontCameraNode.camera = [SCNCamera camera];
    frontCameraNode.camera.xFov = 75;
    frontCameraNode.camera.zFar = 500;
    [_assetActivity addChildNode:frontCameraNode]; // _assetActivity is the aircraft node
}
Here is how I am changing the camera position, which is not working:
- (void)showRealTimeFlightPath {
    DAL3DPoint *point = [self.aircraftLocation convertCooridnateTo3DPoint];
    DAL3DPoint *previousPoint = [self.previousAircraftLocation convertCooridnateTo3DPoint];
    self.minCoordinate = [self.minCoordinate findMinPoint:self.minCoordinate currentPoint:point];
    self.maxCoordinate = [self.minCoordinate findMaxPoint:self.maxCoordinate currentPoint:point];
    DAL3DPoint *averagePoint = [[DAL3DPoint alloc] init];
    averagePoint = [averagePoint averageBetweenCoordiantes:self.minCoordinate maxPoint:self.maxCoordinate];
    SCNVector3 positions[] = {
        SCNVector3Make(point.x, point.y, point.z),
        SCNVector3Make(previousPoint.x, previousPoint.y, previousPoint.z)
    };
    SCNScene *workingScene = [self getWorkingScene];
    DALLineNode *lineNodeA = [[DALLineNode alloc] init];
    [lineNodeA init:workingScene.rootNode v1:positions[0] v2:positions[1] radius:0.1 radSegementCount:6 lineColor:[UIColor greenColor]];
    [workingScene.rootNode addChildNode:lineNodeA];
    self.previousAircraftLocation = [self.aircraftLocation mutableCopy];
    self.cameraNode.position = SCNVector3Make(averagePoint.x, averagePoint.y, averagePoint.z);
    self.pointOfView = self.cameraNode;
}
Code in Swift or Objective-C is welcome.
Thanks!!
The first behavior you describe would most easily be achieved by chaining a look-at constraint and a distance constraint, both targeting the aircraft.
let lookAtConstraint = SCNLookAtConstraint(target: aircraft)
let distanceConstraint = SCNDistanceConstraint(target: aircraft)
distanceConstraint.minimumDistance = 10 // set to whatever minimum distance between the camera and aircraft you'd like
distanceConstraint.maximumDistance = 10 // set to whatever maximum distance between the camera and aircraft you'd like
camera.constraints = [lookAtConstraint, distanceConstraint]
For iOS 10 and earlier, you can implement a distance constraint using SCNTransformConstraint. Here's a basic (though slightly ugly 😛) implementation that uses linear interpolation to update the node's position.
func normalize(_ value: Float, in range: ClosedRange<Float>) -> Float {
    return (value - range.lowerBound) / (range.upperBound - range.lowerBound)
}
func interpolate(from start: Float, to end: Float, alpha: Float) -> Float {
    return (1 - alpha) * start + alpha * end
}
let target = airplane
let minimumDistance: Float = 10
let maximumDistance: Float = 15
let distanceConstraint = SCNTransformConstraint(inWorldSpace: false) { (node, transform) -> SCNMatrix4 in
    let distance = abs(sqrt(pow(target.position.x - node.position.x, 2)
        + pow(target.position.y - node.position.y, 2)
        + pow(target.position.z - node.position.z, 2)))
    let normalizedDistance: Float
    switch distance {
    case ...minimumDistance:
        normalizedDistance = self.normalize(minimumDistance, in: 0 ... distance)
    case maximumDistance...:
        normalizedDistance = self.normalize(maximumDistance, in: 0 ... distance)
    default:
        return transform
    }
    node.position.x = self.interpolate(from: target.position.x, to: node.position.x, alpha: normalizedDistance)
    node.position.y = self.interpolate(from: target.position.y, to: node.position.y, alpha: normalizedDistance)
    node.position.z = self.interpolate(from: target.position.z, to: node.position.z, alpha: normalizedDistance)
    return transform
}
The second behavior could be implemented by determining the bounding box of your aircraft and all of its path segments in the camera's local coordinate space, then updating the camera's distance from the center of that bounding box to frame all of those nodes in the viewport. frameNodes(_:), a convenience method that implements this functionality, was introduced in iOS 11 and is defined on SCNCameraController. I'd recommend using it if possible, unless you want to dive into the trigonometry yourself. You could use your scene view's default camera controller or create a temporary instance, whichever suits the needs of your app.
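A minimal sketch of that approach (scnView, aircraftNode, and pathNode are assumed names for your SCNView, the aircraft, and a parent node holding the path segments; frameNodes(_:) requires iOS 11):
// Frame the aircraft plus every path segment using the built-in camera controller
let cameraController = scnView.defaultCameraController
cameraController.frameNodes([aircraftNode] + pathNode.childNodes)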
You need to calculate the angle of the velocity so that the camera points in the direction of the moving SCNNode.
This code will point you in the right direction.
func renderer(_ aRenderer: SCNSceneRenderer, didSimulatePhysicsAtTime time: TimeInterval) {
    // get the velocity angle (in radians) from the vehicle's velocity
    let angle = convertVectorToAngle(vector: vehicle.chassisBody.velocity)
    // keep the current camera rotation on the X and Z axes
    let eX = cameraNode.eulerAngles.x
    let eZ = cameraNode.eulerAngles.z
    // offset the rotation on the Y axis by 90 degrees
    // this needs work, buggy
    let ninety = deg2rad(90)
    // default camera Y Euler angle faces north at 0
    var eY: Float = 0.0
    if angle != 0 {
        eY = Float(-angle) - Float(ninety)
    }
    // rotate the camera via its Euler angles, with eY following the velocity direction
    cameraNode.eulerAngles = SCNVector3Make(eX, eY, eZ)
    // put the camera 25 points behind the vehicle, facing the direction of the velocity
    let dir = calculateCameraDirection(cameraNode: vehicleNode)
    let pos = pointInFrontOfPoint(point: vehicleNode.position, direction: dir, distance: 25)
    // the camera follows the driver's view from 25 points behind and 10 points above the vehicle
    cameraNode.position = SCNVector3Make(pos.x, vehicleNode.position.y + 10, pos.z)
}
func convertVectorToAngle(vector: SCNVector3) -> CGFloat {
    // atan2 returns the angle in radians
    let angle = atan2(vector.z, vector.x)
    return CGFloat(angle)
}
func pointInFrontOfPoint(point: SCNVector3, direction: SCNVector3, distance: Float) -> SCNVector3 {
    let x = point.x + distance * direction.x
    let y = point.y + distance * direction.y
    let z = point.z + distance * direction.z
    return SCNVector3Make(x, y, z)
}
func calculateCameraDirection(cameraNode: SCNNode) -> SCNVector3 {
    // build a rotation matrix from the node's axis-angle rotation
    let x = -cameraNode.rotation.x
    let y = -cameraNode.rotation.y
    let z = -cameraNode.rotation.z
    let w = cameraNode.rotation.w
    let cameraRotationMatrix = GLKMatrix3Make(cos(w) + pow(x, 2) * (1 - cos(w)),
                                              x * y * (1 - cos(w)) - z * sin(w),
                                              x * z * (1 - cos(w)) + y * sin(w),
                                              y * x * (1 - cos(w)) + z * sin(w),
                                              cos(w) + pow(y, 2) * (1 - cos(w)),
                                              y * z * (1 - cos(w)) - x * sin(w),
                                              z * x * (1 - cos(w)) - y * sin(w),
                                              z * y * (1 - cos(w)) + x * sin(w),
                                              cos(w) + pow(z, 2) * (1 - cos(w)))
    // rotate the default "forward" vector (0, 0, -1) by the camera's rotation
    let cameraDirection = GLKMatrix3MultiplyVector3(cameraRotationMatrix, GLKVector3Make(0.0, 0.0, -1.0))
    return SCNVector3FromGLKVector3(cameraDirection)
}
func deg2rad(_ number: Double) -> Double {
    return number * .pi / 180
}

How do I wrap x & y axis in SpriteKit?

So in my GameScene I have these lines at the beginning of func didMoveToView:
let border = SKPhysicsBody(edgeLoopFromRect: self.frame)
border.friction = 0
self.physicsBody = border
self.physicsWorld.contactDelegate = self
It works fine for preventing my player from going outside the screen. However, I want my player, or anything else in the scene, to wrap to the other side of the screen when it touches the border. In other words: if I keep going right, I keep going right until I hit the border, then my player appears from the left border and continues going right in an endless loop.
The same goes for the y axis.
How is this possible in code?
Try this code:
if player.position.x + player.size.width / 2 > size.width {
    player.position.x = -player.size.width
}
else if player.position.x + player.size.width / 2 < 0 {
    player.position.x = size.width + player.size.width / 2
}
The same logic applies to the Y axis.
For whoever is interested in reproducing this: I had to delete the following code:
let border = SKPhysicsBody(edgeLoopFromRect: self.frame)
border.friction = 0
self.physicsBody = border
self.physicsWorld.contactDelegate = self
Then, in touchesMoved I added:
if Player.position.x + Player.size.width / 16 > size.width {
    Player.position.x = Player.size.width / 16
}
else if Player.position.x + Player.size.width / 16 < 0 {
    Player.position.x = size.width - Player.size.width / 16
}
for the x axis. The same can be done for the y axis, except we change x to y and width to height.
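For reference, a generalized sketch of the same idea, assuming the scene's anchorPoint is (0, 0); wrap is a hypothetical helper (not from the answers above) that you would call from update(_:) for every node that should wrap:
// Wrap a sprite across both screen axes once it has fully left the scene
func wrap(_ node: SKSpriteNode, in size: CGSize) {
    let halfW = node.size.width / 2
    let halfH = node.size.height / 2
    if node.position.x - halfW > size.width {
        node.position.x = -halfW
    } else if node.position.x + halfW < 0 {
        node.position.x = size.width + halfW
    }
    if node.position.y - halfH > size.height {
        node.position.y = -halfH
    } else if node.position.y + halfH < 0 {
        node.position.y = size.height + halfH
    }
}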

Using anchorPoint in spritekit to each node

I'm wondering about my ball.position-setting function. The goal is to set the ball's position relative to another sprite node (on its left/middle/right side). I tried to write my own function, but it was full of bugs (it's commented out below). After reviewing the developer library I found anchorPoint, which is relative to the parent's position, but it's not working, and I don't know exactly how to set the current parent for the ball. I'd be really grateful for some advice.
if (ball.position.y < block.position.y + 10 && ball.position.y > block.position.y - 10) {
    midX = CGRectGetMidX(self.frame);
    side = block1.size.width / 2;
    if (emitter.position.x < midX - side + 5) {
        // fixedEmitterPos = emitter.position.x + side;
        ball.anchorPoint = CGPointMake(0, 0.5);
    }
    if (emitter.position.x > midX + side - 5) {
        // fixedEmitterPos = emitter.position.x - side;
        ball.anchorPoint = CGPointMake(1, 0.5);
    }
    if (emitter.position.x == 160) {
        // fixedEmitterPos = emitter.position.x;
        ball.anchorPoint = CGPointMake(0.5, 0.5);
    }
}
I don't know exactly what your question is, but to change the anchor point you don't change anything in the parent node. For example, if the anchor point is
ball.anchorPoint = CGPointMake(0.5, 0.5);
the anchor point is exactly in the middle, and if you change the position of the ball like:
ball.position = CGPointMake(50, 50);
the ball's center will be exactly at that point (50, 50).
But if the anchor point is:
ball.anchorPoint = CGPointMake(1, 1);
and you change the position to:
ball.position = CGPointMake(50, 50);
the ball's centre will be at
X = 50 - (ball width / 2)
Y = 50 - (ball height / 2)
If you want to set up the ball's position based on another sprite node, you can do something like this:
// Attach the left centre side of the ball to the other sprite:
ball.anchorPoint = CGPointMake(0, 0.5);
ball.position = CGPointMake(otherSprite.position.x + otherSprite.size.width, otherSprite.position.y);
// Attach the right centre side of the ball to the other sprite:
ball.anchorPoint = CGPointMake(1, 0.5);
ball.position = CGPointMake(otherSprite.position.x, otherSprite.position.y);
Hope this is what you were asking about.
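The same placement idea as a short Swift sketch (ball and block are assumed SKSpriteNodes, not names from the question):
// Pin the ball's left edge to the block's right edge; with
// anchorPoint (0, 0.5) the ball's position is its left-middle point
ball.anchorPoint = CGPoint(x: 0, y: 0.5)
ball.position = CGPoint(x: block.frame.maxX, y: block.position.y)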

Get angle from 2 positions

I have 2 objects and when I move one, I want to get the angle from the other.
For example:
Object1X = 211.000000, Object1Y = 429.000000
Object2X = 246.500000, Object2Y = 441.500000
I have tried the following and every variation under the sun:
double radians = ccpAngle(Object1,Object2);
double degrees = ((radians * 180) / Pi);
But I just get 2.949023 returned, where I want something like 45 degrees.
Does this other answer help?
How to map atan2() to degrees 0-360
I've written it like this:
- (CGFloat)pointPairToBearingDegrees:(CGPoint)startingPoint secondPoint:(CGPoint)endingPoint
{
    CGPoint originPoint = CGPointMake(endingPoint.x - startingPoint.x, endingPoint.y - startingPoint.y); // translate so the starting point is at the origin
    float bearingRadians = atan2f(originPoint.y, originPoint.x); // get bearing in radians
    float bearingDegrees = bearingRadians * (180.0 / M_PI); // convert to degrees
    bearingDegrees = (bearingDegrees > 0.0 ? bearingDegrees : (360.0 + bearingDegrees)); // correct discontinuity
    return bearingDegrees;
}
Running the code:
CGPoint p1 = CGPointMake(10, 10);
CGPoint p2 = CGPointMake(20,20);
CGFloat f = [self pointPairToBearingDegrees:p1 secondPoint:p2];
And this returns 45.
Hope this helps.
Here's how I'm doing it in Swift for those interested. It's based on @bshirley's answer above, with a few modifications to match the CALayer rotation system:
extension CGFloat {
    var degrees: CGFloat {
        return self * CGFloat(180) / .pi
    }
}
extension CGPoint {
    func angle(to comparisonPoint: CGPoint) -> CGFloat {
        let originX = comparisonPoint.x - x
        let originY = comparisonPoint.y - y
        let bearingRadians = atan2f(Float(originY), Float(originX))
        var bearingDegrees = CGFloat(bearingRadians).degrees
        while bearingDegrees < 0 {
            bearingDegrees += 360
        }
        return bearingDegrees
    }
}
This provides a coordinate system like this:
        90
  180        0
       270
Usage:
point.angle(to: point2)
CGPoint.zero.angle(to: CGPoint(x: 0, y: 1)) // 90
I modified @tomas' solution to streamline it. It's likely (it was for me) that this math is going to be called frequently.
In my incarnation, you have to perform the difference between the two points yourself (or, if you're lucky, (0,0) is already one of your points). The value calculated is the direction of the point from (0,0). Yes, that's simple enough, and you could inline it if you really want to. My preference is for more readable code.
I also converted it to a function call:
CGFloat CGPointToDegree(CGPoint point) {
    // Provides a directional bearing from (0,0) to the given point.
    // Axes as used here: X goes up, Y goes right.
    // Result is in degrees, -180 to 180 ish: 0 degrees = up, -90 = left, 90 = right
    CGFloat bearingRadians = atan2f(point.y, point.x);
    CGFloat bearingDegrees = bearingRadians * (180. / M_PI);
    return bearingDegrees;
}
If you don't want negative values, you need to convert it yourself. Negative values were fine for me - no need to make unneeded calculations.
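If you do want the 0-360 range, a one-line Swift sketch (bearingDegrees is assumed to come from a function like the one above):
// Map a bearing from -180...180 into 0...360
let positiveBearing = bearingDegrees < 0 ? bearingDegrees + 360 : bearingDegrees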
I was using this in a cocos2d environment. This is how I call it (mathematically, we are translating the plane to make p0 the origin, i.e. subtracting p0 from p1, since p0 - p0 = {0,0}; angles are unchanged when the plane is translated):
CGPoint p0 = self.position;
CGPoint p1 = other.position;
CGPoint pnormal = ccpSub(p1, p0);
CGFloat angle = CGPointToDegree(pnormal);
ccpSub is provided by cocos2d; it subtracts one point from another. You can do that yourself if it isn't available.
Aside: it's generally not polite style to name a function with the CG___ prefix as above, since that scheme identifies functions as part of CoreGraphics. So if you want to rename it to MyConvertCGPointToBearing() or FredLovesWilma(), you should do that.
Tomas' answer in Swift 5
func angle(between starting: CGPoint, ending: CGPoint) -> CGFloat {
    let center = CGPoint(x: ending.x - starting.x, y: ending.y - starting.y)
    let radians = atan2(center.y, center.x)
    let degrees = radians * 180 / .pi
    return degrees > 0 ? degrees : 360 + degrees
}
There is no angle between two points. If you want to know the angle between the vectors from the origin (0,0) to the objects, use the scalar (dot) product:
theta = arccos( (veca dot vecb) / ( |veca| * |vecb| ) )
The math standard library of the language you are using surely provides functions for the arc cosine, the scalar product, and vector length.
The vertex of the angle is the point (0,0).
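A minimal Swift sketch of that dot-product formula (angleBetween is a hypothetical helper; it returns radians in 0...pi and ignores direction):
import CoreGraphics
// Angle between the vectors from (0,0) to a and from (0,0) to b:
// theta = arccos((a . b) / (|a| * |b|)), vertex at the origin
func angleBetween(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
    let dot = a.x * b.x + a.y * b.y
    let magA = sqrt(a.x * a.x + a.y * a.y)
    let magB = sqrt(b.x * b.x + b.y * b.y)
    return acos(dot / (magA * magB))
}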
Consider object1X = x1, ..., object2Y = y2. Then:
Angle(object1 - object2) =
      90 * ( (1 + sign(x1)) * (1 - sign(y1^2))
           - (1 + sign(x2)) * (1 - sign(y2^2)) )
    + 45 * ( (2 + sign(x1)) * sign(y1)
           - (2 + sign(x2)) * sign(y2) )
    + 180/pi() * sign(x1*y1) * atan( (abs(x1) - abs(y1)) / (abs(x1) + abs(y1)) )
    - 180/pi() * sign(x2*y2) * atan( (abs(x2) - abs(y2)) / (abs(x2) + abs(y2)) )
I'll leave this here: corrected code, with the axes additionally rotated by 90 degrees counterclockwise. I've used it for touches. viewCenter is just the center of the view.
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let location = touch.location(in: self)
        guard let viewCenter = self.viewCenter else { return }
        let angle = angle(between: CGPoint(x: location.x, y: location.y), ending: viewCenter)
        print(angle)
    }
}
func angle(between starting: CGPoint, ending: CGPoint) -> CGFloat {
    let center = CGPoint(x: ending.x - starting.x, y: ending.y - starting.y)
    let angle90 = deg2rad(90)
    // rotate the axes by 90 degrees counterclockwise
    let rotatedX = center.x * cos(angle90) + center.y * sin(angle90)
    let rotatedY = -center.x * sin(angle90) + center.y * cos(angle90)
    let radians = atan2(rotatedY, rotatedX)
    let degrees = radians * 180 / .pi
    return degrees > 0 ? degrees : degrees + 360
}
func deg2rad(_ number: CGFloat) -> CGFloat {
    return number * .pi / 180
}
