Why is X the longitude and Y latitude for com.datastax.driver.dse.geometry.Point? - datastax-enterprise

The constructor for com.datastax.driver.dse.geometry.Point is Point(double x, double y)
The documentation says
the X coordinate is the longitude and the Y is the latitude.
Why is X the longitude and Y the latitude? It seems counterintuitive.

I am just guessing here, but it feels intuitive to me. In mathematics, the Y-axis is usually the vertical one and the X-axis the horizontal one. Lines of latitude intersect the Y-axis, so they have a Y-value and run parallel to the X-axis.
All points on the equator have coordinates (x=alpha, y=0) for alpha in [-180, 180]:
y
^
|
|
|- - - - - - - - - - - - (equator)
|
|
+--------------------> x
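A small Python sketch of the argument order this implies: the driver's constructor Point(double x, double y) takes longitude first, latitude second. The helper name below is mine, purely illustrative, and not part of the driver API.

```python
# point_from_latlon is a hypothetical helper, not a driver function.
# It swaps the familiar (lat, lon) order into the driver's (x, y) order.
def point_from_latlon(lat, lon):
    # Point(x, y) expects x = longitude, y = latitude
    return (lon, lat)

# Berlin is roughly lat 52.52 N, lon 13.40 E:
x, y = point_from_latlon(52.52, 13.40)
# x (longitude) is 13.40, y (latitude) is 52.52
```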

Related

Getting angle between 2 lines in swift [duplicate]

I have two points in my coordinate system (x, y), and I want to know the angle between the line joining them and the x-axis.
I am using Swift to solve this but I can't get the angle.
I need this angle in radians to use it in the following equation:
(x0 + r cos theta, y0 + r sin theta)
r : radius of circle
If you have two points, (x0, y0) and (x1, y1), then the angle of the line joining them (relative to the X axis) is given by:
theta = atan2((y1 - y0), (x1 - x0))
The angle between a line A, defined by two points p1 = (x1, y1) and p2 = (x2, y2), and the x-axis is related to finding the slope/gradient of line A.
"To solve a problem you sometimes have to simplify it and then work up to the full solution."
Let's start by obtaining the gradient of the line A.
The gradient of line A:
slope = (y2 - y1)/(x2 - x1)
For a straight line that makes an angle theta with the x-axis:
tan(theta) = slope = (change in y) / (change in x)
Therefore:
theta = atan(slope)
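A minimal Python sketch of the atan2 approach from the first answer (function name is mine); atan2 handles vertical lines and picks the correct quadrant, which a plain atan(slope) does not.

```python
import math

def line_angle(x0, y0, x1, y1):
    """Angle, in radians, between the line (x0, y0)-(x1, y1) and the x-axis."""
    # atan2 avoids dividing by zero for vertical lines (x1 == x0)
    # and distinguishes e.g. 45 degrees from 225 degrees.
    return math.atan2(y1 - y0, x1 - x0)

theta = line_angle(0.0, 0.0, 1.0, 1.0)  # a 45-degree line
# theta == pi / 4
```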

Track a spot on a turning circle

I have a question about tracking a spot on a turning circle. As you can see in the image, I am trying to calculate x2, and the only known parameters are θ1, L and x1. The challenge is to track that spot on each turn of the circle, where each step has size θ1.
The calculation which gives approximately correct answer is:
x2 = x1 - (L/2 - L/2 * cos(θ1))
[Image: Spot Tracking]
The problem is that as the circle turns, x1 deviates more and more from the correct answer. Is there any way to calculate θ2 as the circle turns?
Hint:
The spot motion is described by
X = Xc + r cos Θ
Y = Yc + r sin Θ
Hence the angle seen from the origin,
φ = arctan((Yc + r sin Θ)/(Xc + r cos Θ)).
Notice that your problem is indeterminate, as the center of the circle is free to move at distance L of the origin, giving different intersections with the vertical at x1.
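The hint above can be sketched in Python; variable names (xc, yc for the center, r for the radius) are mine, and atan2 is used in place of a raw arctan of the ratio so the quadrant comes out right when the x-component is zero or negative.

```python
import math

def spot(xc, yc, r, theta):
    """Position of the spot after turning by theta, plus the angle
    phi at which it is seen from the origin."""
    x = xc + r * math.cos(theta)
    y = yc + r * math.sin(theta)
    # equivalent to arctan(y / x), but safe for x <= 0
    phi = math.atan2(y, x)
    return x, y, phi
```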

Revert a function to get a specific value

I have this function which returns an (x, y) position; by just adding up degrees, it makes objects move around in circular paths, like a satellite around a planet.
In my case it moves in an ellipse because I added +30 to dist.
- (CGPoint)circularMovement:(float)degrees moonDistance:(CGFloat)dist
{
    if (degrees >= 360) degrees = 0;
    // Note: cos()/sin() expect radians; the parameter is named "degrees"
    // but is passed through unconverted here.
    float x = _moon.position.x + (dist + 30 + _moon.size.height / 2) * cos(degrees);
    float y = _moon.position.y + (dist + _moon.size.height / 2) * sin(degrees);
    CGPoint position = CGPointMake(x, y);
    return position;
}
What I would like is to reverse this function, giving the x and y position of an object and getting back the dist value.
Is this possible?
If so, how would I go about achieving it?
If you have an origin at (x1, y1) and a target at (x2, y2), the distance between them is found using the Pythagorean theorem:
the distance between the points is the square root of the squared difference between x2 and x1 plus the squared difference between y2 and y1.
In most languages this would look something like this:
x = x2 - x1;
y = y2 - y1;
distance = Math.SquareRoot(x * x + y * y);
Where Math is your language's math library.
float x = _moon.position.x + (dist+30 + _moon.size.height/2) *cos(degrees);
float y = _moon.position.y + (dist + _moon.size.height/2) *sin(degrees);
is the way you have originally calculated the values, so the inverse formula would be:
dist = ((y - _moon.position.y) / (sin(degrees))) - _moon.size.height/2
You could calculate it based on x as well, but there is no point, it is simpler based on y.
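A Python sketch of that inversion (function and variable names are mine; it assumes the same formula for y as the original code, and that sin of the angle is non-zero):

```python
import math

def dist_from_y(y, moon_y, moon_height, angle):
    """Invert y = moon_y + (dist + moon_height / 2) * sin(angle) for dist.
    Undefined when sin(angle) == 0 (the spot is level with the center)."""
    return (y - moon_y) / math.sin(angle) - moon_height / 2

# Round trip with made-up numbers:
moon_y, moon_h, angle, dist = 100.0, 20.0, 0.7, 55.0
y = moon_y + (dist + moon_h / 2) * math.sin(angle)
recovered = dist_from_y(y, moon_y, moon_h, angle)
# recovered equals dist, up to floating point
```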

Kivy garden-Graph absolute position of 0,0 on the graph

I'd like to find the relative position of the graph origin in order to display the correct mouse position when it's inside the graph. But the size and pos properties of the graph class are relative to the labels too.
How can I find the absolute position of the point 0,0 on the graph?
There is a member Graph._plot_area. It is a StencilView, and its position and size are equal to the plot area of the graph, meaning it spans from (xmin, ymin) to (xmax, ymax) in pixel position.
If the graph point x, y = 0, 0 is in view, it should be at _plot_area.pos; however, if 0, 0 is not in view, you will have to calculate where it is expected to be using a ratio for the x and y axes, which can be calculated by:
x_ratio = (xmax - xmin) / _plot_area.width
y_ratio = (ymax - ymin) / _plot_area.height
(i.e. data units per pixel on each axis) and the pixel position of a graph point (x, y) calculated by:
pixel_x = _plot_area.x + (x - xmin) / x_ratio
pixel_y = _plot_area.y + (y - ymin) / y_ratio
Hope this helps!
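Putting the above together, a Python sketch of locating graph point (0, 0) in window pixels. It assumes you can read _plot_area.pos / _plot_area.size and the axis bounds xmin..xmax, ymin..ymax; the function name is mine.

```python
def origin_pixel(plot_x, plot_y, plot_w, plot_h, xmin, xmax, ymin, ymax):
    """Pixel position of graph point (0, 0), even when it is out of view."""
    # data units per pixel on each axis
    x_ratio = (xmax - xmin) / plot_w
    y_ratio = (ymax - ymin) / plot_h
    # offset from the plot area's bottom-left corner, converted to pixels
    px = plot_x + (0 - xmin) / x_ratio
    py = plot_y + (0 - ymin) / y_ratio
    return px, py
```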

Get map position from WGS-84 lat/lon when upper left and lower right corners' lat/lon are given

Suppose I have a map, for example from openstreetmaps.org.
I know the WGS-84 lat/lon of the upper left and lower right corner of the map.
How can I find other positions on the map from given WGS-84 lat/lon coordinates?
If the map is roughly street/city level, uses a mercator projection (as openstreetmap.org seems to), and isn't too close to the poles, linear interpolation may be accurate enough. Assuming the following:
TL = lat/lon of top left corner
BR = lat/lon of bottom right corner
P = lat/lon of the point you want to locate on the map
(w,h) = width and height of the map you have (pixels?)
the origin of the map image, (0,0), is at its top-left corner
then we can interpolate the (x, y) position corresponding to P as:
x = w * (P.lon - TL.lon) / (BR.lon - TL.lon)
y = h * (P.lat - TL.lat) / (BR.lat - TL.lat)
Common gotchas:
The lat/lon notation convention lists the latitude first and the longitude second, i.e. "vertical" before "horizontal". This is opposite to the common x,y notation of image coordinates.
Latitude values increase going north ("up"), whereas y coordinates in your map image may increase going down.
If the map covers a larger area, linear interpolation will not be as accurate for latitudes. For a map that spans one degree of latitude in the earth's habitable zones (e.g. the bay area), the center latitude will be off by 0.2% or so, which is likely to be less than a pixel (depending on the map size).
If that's precise enough for your needs, you can stop here!
The more precise math for getting from P's latitude to a pixel y position would start with the mercator math. We know that for a latitude P.lat, the Y position on a projection starting at the equator would be as follows (I'll use a capital Y as unlike the y value we're looking for, Y starts at the equator and increases towards the north):
Y = k * ln((1 + sin(P.lat)) / (1 - sin(P.lat)))
The constant k depends on the vertical scaling of the map, which we may not know. Luckily, it can be deduced by observing that Y(TL) - Y(BR) = h. That gets us:
k = h / (ln((1 + sin(TL.lat)) / (1 - sin(TL.lat))) - ln((1 + sin(BR.lat)) / (1 - sin(BR.lat))))
(yikes! that's four levels of brackets!) With k known, we now have the formula to find out the Y position of any latitude. We just need to correct for: (1) our y value starts at TL.lat, not the equator, and (2) y grows towards the south, rather than to the north. This gets us:
Y(TL.lat) = k * ln((1 + sin(TL.lat)) / (1 - sin(TL.lat)))
Y(P.lat) = k * ln((1 + sin(P.lat )) / (1 - sin(P.lat )))
y(P.lat) = -(Y(P.lat) - Y(TL.lat))
So this gets you:
x = w * (P.lon - TL.lon) / (BR.lon - TL.lon) // like before
y = -(Y(P.lat) - Y(TL.lat)) // where Y(anything) depends just on h, TL.lat and BR.lat
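The full Mercator correction above, sketched in Python (points are (lat, lon) tuples; function names are mine):

```python
import math

def mercator_y(lat_deg):
    """Unscaled Mercator Y (the ln term, without k) for a latitude in degrees."""
    lat = math.radians(lat_deg)
    return math.log((1 + math.sin(lat)) / (1 - math.sin(lat)))

def latlon_to_pixel(p, tl, br, w, h):
    """Map (lat, lon) point p to (x, y) pixels on a w-by-h image whose
    top-left corner is tl and bottom-right corner is br."""
    # x: plain linear interpolation on longitude, like before
    x = w * (p[1] - tl[1]) / (br[1] - tl[1])
    # deduce k from Y(TL) - Y(BR) = h
    k = h / (mercator_y(tl[0]) - mercator_y(br[0]))
    # shift so y = 0 at TL, and flip so y grows southward
    y = -(k * mercator_y(p[0]) - k * mercator_y(tl[0]))
    return x, y
```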
