I have a series of lat/long coordinates. I need to transform them into X, Y coordinates. I've read about UTM, but the problem is that UTM coordinates are relative to a single zone.
For example, these two UTM coordinates have the same easting (x) and northing (y) but different zone codes, so each points to a completely different location (one in Spain and one in Italy):
UTM: 33T 292625m E 4641696m N
UTM: 30U 292625m E 4641696m N
I need a method to automatically transform those zone-relative coordinates into absolute X, Y coordinates. Ideas?
Does it have to be UTM? If not, you can also use Mercator, which is a simpler projection that doesn't rely on zones.
See, for example, the Bing Maps system.
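For illustration, here is a minimal sketch of the standard spherical (Web) Mercator forward projection in Python; the formula is standard, the function name is mine:

import math

def web_mercator(lat_deg, lon_deg):
    # Spherical (Web) Mercator, EPSG:3857; zone-free, valid for |lat| below ~85.05 degrees
    R = 6378137.0  # WGS84 equatorial radius in meters
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y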
You should be able to use the ProjNET library.
What you need is to find the WKT (well-known text) that defines your projections; then you should be able to convert between them:
var utmCoordinateSystem = CoordinateSystemWktReader.Parse("WKT for correct UTM zone") as IProjectedCoordinateSystem;
var wgs84CoordinateSystem = CoordinateSystemWktReader.Parse(MappingTransforms.WGS84) as IGeographicCoordinateSystem;
var ctfac = new CoordinateTransformationFactory();
_utmToWgsTransformation = ctfac.CreateFromCoordinateSystems(utmCoordinateSystem, wgs84CoordinateSystem);
double[] transform = _utmToWgsTransformation.MathTransform.Transform(new double[] { y, x });
Note: you have to find the correct WKTs, but they can be found on the project site.
Also, you may have to flip the order of the inputs, depending on the transforms.
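If you are in Python rather than .NET, the same idea can be sketched with the pyproj library (my suggestion, not part of the ProjNET answer above); EPSG:32633 is WGS84 / UTM zone 33N:

from pyproj import Transformer

# Build a transformer from the UTM zone of your data to WGS84 lat/lon
utm33_to_wgs84 = Transformer.from_crs("EPSG:32633", "EPSG:4326", always_xy=True)
lon, lat = utm33_to_wgs84.transform(292625, 4641696)  # easting, northing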
If you want to compute the bearing and distance between two points, you can use the polar conversion (POL) method on a scientific calculator:
For the distance: POL(N2 - N1, E2 - E1) — the first result (r) is the distance.
For the bearing: after the same POL(N2 - N1, E2 - E1), recall the angle result (RCL tan⁻¹, the stored θ).
Now you have the correct bearing.
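For reference, the same computation in Python (a sketch; bearing measured clockwise from grid north):

import math

def bearing_and_distance(n1, e1, n2, e2):
    dn, de = n2 - n1, e2 - e1
    distance = math.hypot(dn, de)
    # atan2(dE, dN) is the angle from the north axis, i.e. the grid bearing
    bearing = math.degrees(math.atan2(de, dn)) % 360
    return bearing, distance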
The triangulatePoints() method returns 4D homogeneous coordinates (X, Y, Z, W). Real object coordinates can be calculated as (X/W, Y/W, Z/W). I tried to plot points (chessboard corners) generated by this method before and after dividing by W. Interestingly, the points are only in the correct locations without the division:
But these coordinates are not real (the real distance between adjacent corners is 1).
After dividing:
There is definitely something wrong.
I think it may be caused by the strange values of W:
[ 1.42646418e-03 1.22798549e-03 1.02968386e-03 8.39846092e-04
6.36201818e-04 4.52465261e-04 2.69547454e-04 8.46875409e-05
-9.68981258e-05 1.40832004e-03 1.21079874e-03 1.01654453e-03
8.17881722e-04 6.27299945e-04 4.34311369e-04 2.59211694e-04
8.54686004e-05 -8.65304610e-05 1.40546728e-03 1.20577158e-03
1.01246696e-03 8.14260624e-04 6.32434676e-04 4.47672734e-04
2.72056146e-04 9.63734783e-05 -8.87211063e-05 1.40579767e-03
1.20654306e-03 1.01806328e-03 8.29431403e-04 6.48407149e-04
4.67954233e-04 2.88052834e-04 1.05378487e-04 -8.11762657e-05
1.42593682e-03 1.23078423e-03 1.04424497e-03 8.57530802e-04
6.73743780e-04 4.87769896e-04 3.05575493e-04 1.14500137e-04
-7.35641006e-05 1.46166584e-03 1.27260783e-03 1.07531622e-03
8.86362221e-04 6.96056406e-04 5.09601785e-04 3.19138955e-04
1.36194620e-04 -5.99504456e-05]
I feel that these values should be almost the same, but they differ by more than two orders of magnitude. Where is the mistake?
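For reference, the conventional way to dehomogenize the 4xN output of cv2.triangulatePoints looks like this (a sketch; pts4d is assumed to be that output):

import numpy as np

# pts4d: 4 x N homogeneous points from cv2.triangulatePoints
pts3d = (pts4d[:3] / pts4d[3]).T  # divide X, Y, Z by W -> N x 3 Euclidean points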
I am trying to use osmnx to find distances between an origin point (lat/lon) and the nearest infrastructure, such as railways, water or parks.
1) I get the entire graph of an area with network_type='walk'.
2) Get the needed infrastructure, e.g. railways, for that same area.
3) Compose the two graphs into one.
4) Find the nearest node to the origin point in the original graph.
5) Find the nearest node to the origin point in the infrastructure graph.
6) Find the shortest route length between the two nodes.
If you run the example below, you will see that it is missing 20% of the data because it cannot find a route between the nodes. For infrastructure='way["leisure"~"park"]' or infrastructure='way["natural"~"wood"]' this is even worse, with 80-90% of nodes not being connected.
Minimal reproducible example:
import osmnx as ox
import networkx as nx

bbox = [55.5267243, 55.8467243, 12.4100724, 12.7300724]
g = ox.graph_from_bbox(bbox[0], bbox[1], bbox[2], bbox[3],
                       retain_all=True,
                       truncate_by_edge=True,
                       simplify=False,
                       network_type='walk')
points = [(55.6790884456018, 12.568493971506154),
          (55.6790884456018, 12.568493971506154),
          (55.6867418740291, 12.58232314016353),
          (55.6867418740291, 12.58232314016353),
          (55.6867418740291, 12.58232314016353),
          (55.67119624894504, 12.587201455313153),
          (55.677406927839506, 12.57651997656002),
          (55.6856574907879, 12.590500429002823),
          (55.6856574907879, 12.590500429002823),
          (55.68465359365924, 12.585474365063224),
          (55.68153666806675, 12.582594757267945),
          (55.67796979175, 12.583111746311117),
          (55.68767346629932, 12.610040871066179),
          (55.6830855237578, 12.575431380892427),
          (55.68746749645466, 12.589488615911913),
          (55.67514254640597, 12.574308210656602),
          (55.67812748568291, 12.568454119053886),
          (55.67812748568291, 12.568454119053886),
          (55.6701733527419, 12.58989203029166),
          (55.677700136266616, 12.582800629527789)]
railway = ox.graph_from_bbox(bbox[0], bbox[1], bbox[2], bbox[3],
                             retain_all=True,
                             truncate_by_edge=True,
                             simplify=False,
                             network_type='walk',
                             infrastructure='way["railway"]')
g_rail = nx.compose(g, railway)
l_rail = []
for point in points:
    nearest_node = ox.get_nearest_node(g, point)
    rail_nn = ox.get_nearest_node(railway, point)
    if nx.has_path(g_rail, nearest_node, rail_nn):
        l_rail.append(nx.shortest_path_length(g_rail, nearest_node, rail_nn, weight='length'))
    else:
        l_rail.append(-1)
There are two things that caught my attention.
The osmnx documentation specifies that the ox.graph_from_bbox parameters be given in the order north, south, east, west (https://osmnx.readthedocs.io/en/stable/osmnx.html). I mention this because when I tried to run your code, I was getting empty graphs.
The parameter retain_all=True is the key, as you may already know. When set to True, it retains all nodes in the graph, even those not connected to any other node. This happens primarily because of the incompleteness of OpenStreetMap, which contains voluntarily contributed geographic information. I suggest you set retain_all=False, so that your graph contains only connected nodes. In this way, you get a complete list without any -1.
I hope this helps.
g = ox.graph_from_bbox(bbox[1], bbox[0], bbox[3], bbox[2],
                       retain_all=False,
                       truncate_by_edge=True,
                       simplify=False,
                       network_type='walk')
railway = ox.graph_from_bbox(bbox[1], bbox[0], bbox[3], bbox[2],
                             retain_all=False,
                             truncate_by_edge=True,
                             simplify=False,
                             network_type='walk',
                             infrastructure='way["railway"]')
g_rail = nx.compose(g, railway)
l_rail = []
for point in points:
    nearest_node = ox.get_nearest_node(g, point)
    rail_nn = ox.get_nearest_node(railway, point)
    if nx.has_path(g_rail, nearest_node, rail_nn):
        l_rail.append(nx.shortest_path_length(g_rail, nearest_node, rail_nn, weight='length'))
    else:
        l_rail.append(-1)
print(l_rail)
Out[60]:
[7182.002999999995,
7182.002999999995,
5060.562000000002,
5060.562000000002,
5060.562000000002,
6380.099999999999,
7127.429999999996,
4707.014000000001,
4707.014000000001,
5324.400000000003,
6153.250000000002,
6821.213000000002,
8336.863999999998,
6471.305,
4509.258000000001,
5673.294999999996,
6964.213999999994,
6964.213999999994,
6213.673,
6860.350000000001]
I'm using Lua for the first time, and of course I need to look around to learn how to implement certain things.
To create a vertex in Gideros, there's this code:
mesh:setVertex(index, x, y)
However, I would also like to use the z coordinate.
I've been checking around, but haven't found any help. Does anyone know if Gideros has a method for this, or are there any tips and tricks on setting the z coordinates?
First of all, these functions are not provided by Lua but by the Gideros Lua API.
There are no meshes or anything like that in native Lua.
Referring to the Gideros Lua API reference manual gives you some valuable hints:
http://docs.giderosmobile.com/reference/gideros/Mesh#Mesh
Mesh can be 2D or 3D, the latter expects an additional Z coordinate
in its vertices.
http://docs.giderosmobile.com/reference/gideros/Mesh/new
Mesh.new([is3d])
Parameters:
is3d: (boolean) Specifies that this mesh expects
a Z coordinate in its vertex array and is thus a 3D mesh
So in order to create a 3d mesh you have to do something like:
local myMesh = Mesh.new(true)
Although the manual does not say that you can pass a z coordinate to setVertex
http://docs.giderosmobile.com/reference/gideros/Mesh/setVertex
it is very likely that you can.
So let's have a look at Gideros source code:
https://github.com/gideros/gideros/blob/1d4894fb5d39ef6c2375e7e3819cfc836da7672b/luabinding/meshbinder.cpp#L96-L109
int MeshBinder::setVertex(lua_State *L)
{
    Binder binder(L);
    GMesh *mesh = static_cast<GMesh*>(binder.getInstance("Mesh", 1));
    int i = luaL_checkinteger(L, 2) - 1;
    float x = luaL_checknumber(L, 3);
    float y = luaL_checknumber(L, 4);
    float z = luaL_optnumber(L, 5, 0.0);
    mesh->setVertex(i, x, y, z);
    return 0;
}
Here you can see that you can indeed provide a z coordinate and that it will be used.
So
local myMesh = Mesh.new(true)
myMesh:setVertex(1, 100, 20, 40)
should work just fine.
You could have simply tried that, by the way. It's free, it doesn't hurt, and it's the best way to learn!
I get a data stream with the 3D positions (in a fixed world coordinate system) of the 20 joints of a human skeleton.
I want to use the skeleton data to drive a human model with a fixed bone hierarchy, as in the demo video.
In Kinect SDK v1.8, I could get each bone's local rotation through NUI_SKELETON_BONE_ORIENTATION.hierarchicalRotation.
I want to implement a similar function, but the Kinect SDK isn't open source.
I've found that the OpenNI function xnGetSkeletonJointOrientation can get a joint's rotation in that way, but I haven't found its implementation, so I don't know where I'm going wrong.
Any idea is appreciated. Thanks!
EDIT
I have found a similar question.
Here is the code he finally used.
Point3d Controller::calRelativeToParent(int parentID, Point3d point, int frameID) {
    if (parentID == 0) {
        QUATERNION temp = calChangeAxis(-1, parentID, frameID);
        return getVect(multiplyTwoQuats(multiplyTwoQuats(temp, getQuat(point)), getConj(temp)));
    } else {
        Point3d ref = calRelativeToParent(originalRelativePointMap[parentID].parentID, point, frameID);
        QUATERNION temp = calChangeAxis(originalRelativePointMap[parentID].parentID, parentID, frameID);
        return getVect(multiplyTwoQuats(multiplyTwoQuats(temp, getQuat(ref)), getConj(temp)));
    }
}

QUATERNION Controller::calChangeAxis(int parentID, int qtcId, int frameID) { // currentid = id of the position of the orientation to be changed
    if (parentID == -1) {
        QUATERNION out = multiplyTwoQuats(quatOrigin.toChange, originalRelativePointMap[qtcId].orientation);
        return out;
    } else {
        //QUATERNION temp = calChangeAxis(originalRelativePointMap[parentID].parentID, qtcId, frameID);
        //return multiplyTwoQuats(finalQuatMap[frameID][parentID].toChange, temp);
        return multiplyTwoQuats(finalQuatMap[frameID][parentID].toChange, originalRelativePointMap[qtcId].orientation);
    }
}
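For context, the getVect(multiplyTwoQuats(multiplyTwoQuats(q, p), getConj(q))) pattern above is the standard rotation of a vector by a quaternion, q * p * conj(q). A minimal self-contained Python sketch of that operation (helper names are mine, not the author's API):

import numpy as np

def quat_mul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def rotate_vector(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * conj(q)
    p = np.array([0.0, v[0], v[1], v[2]])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, p), q_conj)[1:]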
But I still have some questions about it.
What do the variables quatOrigin.toChange and originalRelativePointMap stand for?
Also, in my opinion the parameter Point3d point of Controller::calRelativeToParent should be a vector of Euler angles. In that case, how should Controller::calRelativeToParent be called from the main program, given that we only know the root's rotation?
The skeleton class has a "Joints" member that contains the 3D position data for each tracked joint on the skeleton. I would look at the joint position data directly to drive your model rather than at angles: take one point as your base (the head or otherwise), then generate vectors in tree form between pairs of connected skeletal points. Scale those vectors and apply them to your model, as sketched below.
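A minimal sketch of that idea in Python (the joint names and parent map are hypothetical, positions would come from your data stream):

import numpy as np

# Hypothetical parent map: each joint maps to the joint it hangs off
PARENT = {"neck": "head", "torso": "neck", "l_shoulder": "neck"}

def bone_vectors(joints):
    # joints: dict of joint name -> np.array([x, y, z]) world position
    return {name: joints[name] - joints[parent] for name, parent in PARENT.items()}

def apply_to_model(base, vectors, scale=1.0):
    # Re-anchor the scaled bone vectors at the model's base point (the head here)
    positions = {"head": np.asarray(base, dtype=float)}
    for name, parent in PARENT.items():  # assumes parents appear before children
        positions[name] = positions[parent] + scale * vectors[name]
    return positions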
I am trying to find a simple algorithm to find the correspondence between two sets of 2D points (registration). One set contains the template of an object I'd like to find and the second set mostly contains points that belong to the object of interest, but it can be noisy (missing points as well as additional points that do not belong to the object). Both sets contain roughly 40 points in 2D. The second set is a homography of the first set (translation, rotation and perspective transform).
I am interested in finding an algorithm for registration in order to get the point-correspondence. I will be using this information to find the transform between the two sets (all of this in OpenCV).
Can anyone suggest an algorithm, library or small bit of code that could do the job? As I'm dealing with small sets, it does not have to be super optimized. Currently, my approach is a RANSAC-like algorithm (sketched in code after the list):
Choose 4 random points from set 1 and 4 from set 2.
Compute the transform matrix H (using OpenCV's getPerspectiveTransform()).
Warp the 1st set of points using H and test how well they align to the 2nd set of points.
Repeat 1-3 N times and choose the best transform according to some metric (e.g. sum of squares).
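A minimal Python sketch of that loop (names are illustrative; src and dst are (N, 2) arrays, and the score is the sum of squared nearest-neighbour distances):

import numpy as np
import cv2

def ransac_homography(src, dst, iters=2000):
    # src, dst: (N, 2) arrays of 2D points, correspondences unknown
    rng = np.random.default_rng()
    best_H, best_err = None, np.inf
    for _ in range(iters):
        # 1. choose 4 random points from each set
        s = src[rng.choice(len(src), 4, replace=False)].astype(np.float32)
        d = dst[rng.choice(len(dst), 4, replace=False)].astype(np.float32)
        try:
            # 2. compute the perspective transform from the two quads
            H = cv2.getPerspectiveTransform(s, d)
        except cv2.error:
            continue  # degenerate (e.g. collinear) sample
        # 3. warp set 1 and score the alignment against set 2
        warped = cv2.perspectiveTransform(
            src.reshape(-1, 1, 2).astype(np.float32), H).reshape(-1, 2)
        nn = np.linalg.norm(warped[:, None, :] - dst[None, :, :], axis=2).min(axis=1)
        err = float(np.sum(nn ** 2))
        # 4. keep the best transform seen so far
        if err < best_err:
            best_H, best_err = H, err
    return best_H, best_err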
Any ideas? Thanks for your input.
With Python you can use the Open3D library, which is very easy to install in Anaconda. For your purpose ICP should work fine, so we'll use classical ICP, which minimizes point-to-point distances between closest points in every iteration. Here is the code to register two clouds:
import numpy as np
import open3d as o3d

# Parameters:
initial_T = np.identity(4)  # initial transformation for ICP
distance = 0.1  # threshold distance for searching correspondences (closest points between clouds); I'm setting it to 10 cm

# Read your point clouds:
source = o3d.io.read_point_cloud("point_cloud_1.xyz")
target = o3d.io.read_point_cloud("point_cloud_0.xyz")

# Define the type of registration ("False" means rigid transformation, scale = 1):
estimation = o3d.pipelines.registration.TransformationEstimationPointToPoint(False)

# Define the number of iterations (I'll use 100):
iterations = o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=100)

# Do the registration:
result = o3d.pipelines.registration.registration_icp(source, target, distance, initial_T, estimation, iterations)
result is an object holding four things: the transformation T (4x4), two metrics (rmse and fitness), and the set of correspondences.
To access the transformation:
T = result.transformation
I have used this a lot with 3D clouds obtained from Terrestrial Laser Scanners (TLS) and from robots (Velodyne LIDAR).
With MATLAB:
We'll use point-to-point ICP again, since your data is 2D. Here is a minimal example with two point clouds randomly generated inside a triangle shape:
% Triangle vertices:
V1 = [-20, 0; -10, 10; 0, 0];
V2 = [-10, 0; 0, 10; 10, 0];
% Create clouds and show pair:
points = 5000;
N1 = criar_nuvem_triangulo(V1, points);
N2 = criar_nuvem_triangulo(V2, points);
pcshowpair(N1, N2)
% Register pair N1->N2 and show:
[T, N1_transformed, RMSE] = pcregistericp(N1, N2, 'Metric', 'pointToPoint', 'MaxIterations', 100);
pcshowpair(N1_transformed, N2)
"criar_nuvem_triangulo" is a function to generate random point clouds inside a triangle:
function [cloud] = criar_nuvem_triangulo(V, N)
% Function which creates a 2D point cloud in a triangle shape using random
% points
% Parameters: V = triangle vertices (3x2 matrix) | N = number of points
t = sqrt(rand(N, 1));
s = rand(N, 1);
P = (1 - t) * V(1, :) + bsxfun(@times, ((1 - s) * V(2, :) + s * V(3, :)), t);
points = [P, zeros(N, 1)];
cloud = pointCloud(points);
end
Results:
You may just use cv::findHomography. It is a RANSAC-based approach around cv::getPerspectiveTransform.
auto H = cv::findHomography(srcPoints, dstPoints, CV_RANSAC, 3);
Where 3 is the reprojection threshold.
One traditional approach to solving your problem is to use a point-set registration method, which works when you don't have matching-pair information. Point-set registration is similar to the method you are describing. You can find a MATLAB implementation here.
Thanks