I am trying to mark a few thousand geo-locations on the world map with MATLAB. I have the latitudes and longitudes of those locations. Is there any good way to do this? Thanks.
Here is an example that doesn't require any toolbox.
First we create a function that converts longitude/latitude locations to pixel coordinates using the Mercator projection.
function [x,y] = mercatorProjection(lon, lat, width, height)
x = mod((lon+180)*width/360, width);
y = height/2 - log(tan((lat+90)*pi/360))*width/(2*pi);
end
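For reference, this is the standard Mercator mapping of longitude $\lambda$ and latitude $\varphi$ (both in degrees) onto a $W \times H$ image spanning the full longitude range:

$x = \frac{(\lambda + 180)\,W}{360}, \qquad y = \frac{H}{2} - \frac{W}{2\pi}\,\ln\tan\!\left(\frac{\pi}{4} + \frac{\varphi\,\pi}{360}\right)$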
We create some locations:
% GPS positions (latitude,longitude) of some markers
data = [
    -22.976730,  -43.195080 ;
     55.756950,   37.614975 ;
     33.605381,   -7.631940 ;
     35.670479,  139.740921 ;
     51.506325,   -0.127144 ;
     40.714550,  -74.007124 ;
    -33.869629,  151.206955 ;
    -26.204944,   28.040035 ;
     37.777125, -122.419644 ;
     30.083740,   31.255360 ;
      6.439180,    3.423480
];
labels = {
'Rio de Janeiro'
'Moscow'
'Casablanca'
'Tokyo'
'London'
'New York'
'Sydney'
'Johannesburg'
'San Francisco'
'Cairo'
'Lagos'
};
Next we load a map from Wikipedia, apply the projection and overlay the markers:
% world map in Mercator projection
fname = 'https://upload.wikimedia.org/wikipedia/commons/thumb/7/74/Mercator-projection.jpg/773px-Mercator-projection.jpg';
img = imread(fname);
[imgH,imgW,~] = size(img);
% Mercator projection
[x,y] = mercatorProjection(data(:,2), data(:,1), imgW, imgH);
% plot markers on map
imshow(img, 'InitialMag',100, 'Border','tight'), hold on
plot(x,y, 'bo', 'MarkerSize',10, 'LineWidth',3)
text(x, y, labels, 'Color','w', 'VerticalAlign','bottom', 'HorizontalAlign','right')
hold off
Great way to plot the world!
If your copy of the code reads:
imshow(I, 'InitialMag',100, 'Border','tight'), hold on
just change I to the loaded image variable:
imshow(img, 'InitialMag',100, 'Border','tight'), hold on
Amro's answer worked for me, but I had to make some changes.
I'm using Matlab 7.9, and imshow is part of the Image Processing Toolbox. In order to show the map without using the imshow function, I replaced this line:
imshow(img, 'InitialMag',100, 'Border','tight')
With this one:
image(img)
And it worked.
I am making an R Leaflet map and I have 2 legends. How can I combine them? Thanks.
Understanding the structure of your map object in R (str(mapObject)) can be a helpful starting point. This can be useful for making "aftermarket" edits to legends.
I tried this as a solution to your problem. The idea is to concatenate the vectors that define each legend's colors and labels into a single legend, then remove the old one:
require(spData)
require(leaflet)
require(sf)
# loading shapes of countries from the package spData
data(world)
world <- st_read(system.file("shapes/world.gpkg", package="spData"))
africa <- world[world$continent == "Africa",]
asia <- world[world$continent == "Asia", ]
asiaPal <- colorNumeric("Reds", domain = asia$pop)
africaPal <- colorNumeric("Blues", domain = africa$pop)
map <- leaflet() %>%
  addProviderTiles(providers$CartoDB.Positron) %>%
  addPolygons(data = asia,
              color = ~asiaPal(asia$pop)) %>%
  addPolygons(data = africa,
              color = ~africaPal(africa$pop)) %>%
  addLegend("bottomright", pal = asiaPal, values = asia$pop, title = "Asian Population") %>%
  addLegend("bottomright", pal = africaPal, values = africa$pop, title = "African Population")
# Colors
map$x$calls[[5]]$args[[1]]$colors <-
c(map$x$calls[[5]]$args[[1]]$colors, map$x$calls[[4]]$args[[1]]$colors)
# Labels
map$x$calls[[5]]$args[[1]]$labels <-
c(map$x$calls[[5]]$args[[1]]$labels, map$x$calls[[4]]$args[[1]]$labels)
# Get rid of Old Legend:
map$x$calls[[4]] <- NULL
where your legends result from elements 4 & 5 of map$x$calls.
This doesn't work very nicely. I suspect it's because these list elements are not the end result; the elements of the map object are handed to JavaScript/HTML when the map is rendered. That said, I don't know if it's easily possible to do what you are trying to achieve without poking around in the actual HTML that results.
I'm trying to transform a point from one map to another. I've tried to use some OpenCV sample code for getAffineTransform(), getPerspectiveTransform(), warpAffine() and findHomography(), but there are always gaps in my transformation mesh. The feature points are usually detected at very different positions, so I think I need a good interpolation method.
About the maps:
Both maps are images containing human body parts and human skin. I'm using the OpenCV feature detection/matching algorithms to get a set of matching points in both maps. The tricky thing is that they contain arms and feet, too. Feature points on arms/feet can have much bigger offsets than the points on the torso.
The goal:
I want to transform any point on map A as accurately as possible to the equivalent position on map B.
My current approach is to find the three feature points closest to my original point on map A and construct a triangle. Afterwards I transform this triangle to the same three feature points on map B. That works nicely if I have a lot of feature points surrounding my original point, but in larger areas without feature points I have problems with the interpolation.
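(For reference, three point correspondences determine a unique affine map $p' = A p + t$ with $A \in \mathbb{R}^{2 \times 2}$ and $t \in \mathbb{R}^2$: six unknowns, and each correspondence supplies two equations.)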
Is this a good way to do so? Or is there a much better solution?
My favorite would be the construction of a complete transformation map for both images, but I'm not sure how to do this. Is it possible at all?
Thanks a lot for any advice!
Simple sketch of the transformation (I'm trying to find the points X1 to X3 from the left image in the right image):
Sketch of a sample transformation
Sample for homography (OpenCVSharp):
Mat imgA = new Mat(@"d:\Mesh\Left2.jpg", ImreadModes.Color);
Mat imgB = new Mat(@"d:\Mesh\Right2.jpg", ImreadModes.Color);
Cv2.Resize(imgA, imgA, new Size(512, 341));
Cv2.Resize(imgB, imgB, new Size(512, 341));
SURF detector = SURF.Create(500.0);
KeyPoint[] keypointsA = detector.Detect(imgA);
KeyPoint[] keypointsB = detector.Detect(imgB);
SIFT extractor = SIFT.Create();
Mat descriptorsA = new Mat();
Mat descriptorsB = new Mat();
extractor.Compute(imgA, ref keypointsA, descriptorsA);
extractor.Compute(imgB, ref keypointsB, descriptorsB);
BFMatcher matcher = new BFMatcher(NormTypes.L2, true);
DMatch[] matches = matcher.Match(descriptorsA, descriptorsB);
double minDistance = 10000.0;
double maxDistance = 0.0;
for (int i = 0; i < matches.Length; ++i)
{
double distance = matches[i].Distance;
if (distance < minDistance)
{
minDistance = distance;
}
if (distance > maxDistance)
{
maxDistance = distance;
}
}
List<DMatch> goodMatches = new List<DMatch>();
for (int i = 0; i < matches.Length; ++i)
{
if (matches[i].Distance <= 3.0 * minDistance &&
Math.Abs(keypointsA[matches[i].QueryIdx].Pt.Y - keypointsB[matches[i].TrainIdx].Pt.Y) < 30)
{
goodMatches.Add(matches[i]);
}
}
Mat output = new Mat();
Cv2.DrawMatches(imgA, keypointsA, imgB, keypointsB, goodMatches.ToArray(), output);
List<Point2f> goodA = new List<Point2f>();
List<Point2f> goodB = new List<Point2f>();
for (int i = 0; i < goodMatches.Count; i++)
{
goodA.Add(keypointsA[goodMatches[i].QueryIdx].Pt);
goodB.Add(keypointsB[goodMatches[i].TrainIdx].Pt);
}
InputArray goodInputA = InputArray.Create<Point2f>(goodA);
InputArray goodInputB = InputArray.Create<Point2f>(goodB);
Mat h = Cv2.FindHomography(goodInputA, goodInputB);
Point2f centerA = new Point2f(imgA.Cols / 2.0f, imgA.Rows / 2.0f);
output.DrawMarker((int)centerA.X, (int)centerA.Y, Scalar.Red, MarkerStyle.Cross, 50, LineTypes.Link8, 5);
Point2f[] transformedPoints = Cv2.PerspectiveTransform(new Point2f[] { centerA }, h);
output.DrawMarker((int)transformedPoints[0].X + imgA.Cols, (int)transformedPoints[0].Y, Scalar.Red, MarkerStyle.Cross, 50, LineTypes.Link8, 5);
Code snippet for the triangle-based affine transform (different approach, OpenCVSharp):
pointsA[0] = new Point2f(trisA[i].Item0, trisA[i].Item1);
pointsA[1] = new Point2f(trisA[i].Item2, trisA[i].Item3);
pointsA[2] = new Point2f(trisA[i].Item4, trisA[i].Item5);
pointsB[0] = new Point2f(trisB[i].Item0, trisB[i].Item1);
pointsB[1] = new Point2f(trisB[i].Item2, trisB[i].Item3);
pointsB[2] = new Point2f(trisB[i].Item4, trisB[i].Item5);
Mat transformation = Cv2.GetAffineTransform(pointsA, pointsB);
InputArray inputSource = InputArray.Create<Point2f>(new Point2f[] { new Point2f(10f, 50f) });
Mat outputMat = new Mat();
// GetAffineTransform returns a 2x3 matrix, so apply it with Transform;
// PerspectiveTransform expects a 3x3 homography.
Cv2.Transform(inputSource, outputMat, transformation);
Mat.Indexer<Point2f> indexer = outputMat.GetGenericIndexer<Point2f>();
var target = indexer[0, 0];
I'm working with point clouds in PCL. I recently had to convert the color information of the points from RGB to CIELAB.
I saw that this is possible with OpenCV, so I used the following code:
pcl::PointCloud<pcl::PointXYZLAB>::Ptr convert_rgb_to_lab_opencv(pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud) {
pcl::PointCloud <pcl::PointXYZLAB>::Ptr cloud_lab(new pcl::PointCloud <pcl::PointXYZLAB>);
cloud_lab->height = cloud->height;
cloud_lab->width = cloud->width;
for (pcl::PointCloud<pcl::PointXYZRGB>::iterator it = cloud->begin(); it != cloud->end(); it++) {
// Color conversion: the scalar is filled with R,G,B, so use the RGB variant
cv::Mat pixel(1, 1, CV_8UC3, cv::Scalar(it->r, it->g, it->b));
cv::Mat temp;
cv::cvtColor(pixel, temp, CV_RGB2Lab);
pcl::PointXYZLAB point;
point.x = it->x;
point.y = it->y;
point.z = it->z;
// read the three 8-bit channels of the 1x1 result
cv::Vec3b lab = temp.at<cv::Vec3b>(0, 0);
point.L = lab[0];
point.a = lab[1];
point.b = lab[2];
cloud_lab->push_back(point);
}
return cloud_lab;
}
My question is: are the values I got correct? Shouldn't Lab values be floating point, with a and b able to take negative values?
So I tried to do the conversion "manually" with the code available here.
When I visualized the two clouds in CloudCompare, I saw that they produced very similar views, even in the histogram.
Can someone explain to me why?
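For reference, this is consistent with what the OpenCV documentation states for 8-bit images: the Lab result is rescaled to fit the unsigned 0..255 range, i.e.

$L \leftarrow L \cdot 255/100, \qquad a \leftarrow a + 128, \qquad b \leftarrow b + 128$

so integer, non-negative values are expected for CV_8UC3 input; converting the pixel to CV_32FC3 first would yield the usual floating-point ranges.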
I have implemented an FFT on an at32ucb-series microcontroller using the KissFFT library, and I am currently struggling with the output of the FFT.
My intention is to analyse sound coming from a piezo speaker.
Currently the frequency of the sounder is 420 Hz, which I successfully got from the FFT output (cross-checked with an oscilloscope). However, the output frequency is just half of what I expect when I feed a function generator waveform into the system.
I suspect I got the frequency bin calculation formula wrong; I am currently using fft_peak_magnitude_index * sampling_frequency / fft_size.
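In symbols, that is the usual bin-to-frequency relation, with $f_s$ the sampling rate, $N$ the FFT size, and $k$ the peak bin index:

$f = k \cdot \frac{f_s}{N}$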
My input is real and I am doing a real FFT (output samples = N/2).
I am also doing IIR filtering and windowing before the FFT.
Any suggestion would be a great help!
// IIR filter calculation, n = 256 fft points
for (ctr=0; ctr<n; ctr++)
{
// filter calculation
y[ctr] = num_coef[0]*x[ctr];
y[ctr] += (num_coef[1]*x[ctr-1]) - (den_coef[1]*y[ctr-1]);
y[ctr] += (num_coef[2]*x[ctr-2]) - (den_coef[2]*y[ctr-2]);
y1[ctr] = y[ctr] - 510; //eliminate dc offset
// hamming window
hamming[ctr] = (0.54-((0.46) * cos(2*M_PI*ctr/n)));
window[ctr] = hamming[ctr]*y1[ctr];
fft_input[ctr].r = window[ctr];
fft_input[ctr].i = 0;
fft_output[ctr].r = 0;
fft_output[ctr].i = 0;
}
kiss_fftr_cfg fftConfig = kiss_fftr_alloc(n,0,NULL,NULL);
kiss_fftr(fftConfig, (kiss_fft_scalar * )fft_input, fft_output);
peak = 0;
freq_bin = 0;
for (ctr=0; ctr<n1; ctr++)
{
fft_mag[ctr] = 10*(sqrt((fft_output[ctr].r * fft_output[ctr].r) + (fft_output[ctr].i * fft_output[ctr].i)))/(0.5*n);
if(fft_mag[ctr] > peak)
{
peak = fft_mag[ctr];
freq_bin = ctr;
}
frequency = (freq_bin*(10989/n)); // 10989 is the sampling freq
//************************************
//Usart write
char filtResult[10];
//sprintf(filtResult, "%04d %04d %04d\n", (int)peak, (int)freq_bin, (int)frequency);
sprintf(filtResult, "%04d %04d %04d\n", (int)x[ctr], (int)fft_mag[ctr], (int)frequency);
char c;
char *ptr = &filtResult[0];
do
{
c = *ptr;
ptr++;
usart_bw_write_char(&AVR32_USART2, (int)c);
// sendByte(c);
} while (c != '\n');
}
The main problem is likely to be how you declared fft_input.
Based on your previous question, you are allocating fft_input as an array of kiss_fft_cpx. The function kiss_fftr, on the other hand, expects an array of scalars. By casting the input array to a kiss_fft_scalar pointer with:
kiss_fftr(fftConfig, (kiss_fft_scalar * )fft_input, fft_output);
KissFFT essentially sees an array of real-valued data which contains a zero at every second sample (what you filled in as imaginary parts). This is effectively an upsampled version (although without interpolation) of your original signal, i.e. a signal with twice the sampling rate (which is not accounted for in your freq_bin-to-frequency conversion). To fix this, I suggest you pack your data into a kiss_fft_scalar array:
kiss_fft_scalar fft_input[n];
...
for (ctr=0; ctr<n; ctr++)
{
...
fft_input[ctr] = window[ctr];
...
}
kiss_fftr_cfg fftConfig = kiss_fftr_alloc(n,0,NULL,NULL);
kiss_fftr(fftConfig, fft_input, fft_output);
Note also that while looking for the peak magnitude, you are probably only interested in the final largest peak, not the running maximum. As such, you could limit the loop to computing only the peak (using freq_bin instead of ctr as an array index in the subsequent sprintf statements, if needed):
for (ctr=0; ctr<n1; ctr++)
{
fft_mag[ctr] = 10*(sqrt((fft_output[ctr].r * fft_output[ctr].r) + (fft_output[ctr].i * fft_output[ctr].i)))/(0.5*n);
if(fft_mag[ctr] > peak)
{
peak = fft_mag[ctr];
freq_bin = ctr;
}
} // close the loop here before computing "frequency"
Finally, when computing the frequency associated with the bin with the largest magnitude, you need to ensure the computation is done using floating point arithmetic. If, as I suspect, n is an integer, your formula would perform the 10989/n factor using integer arithmetic, resulting in truncation. This can be remedied simply with:
frequency = (freq_bin*(10989.0/n)); // 10989 is the sampling freq
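As a minimal standalone sketch of the difference (n = 256 and the 10989 Hz rate are taken from the question; the peak bin index 10 is just an example value):

#include <stdio.h>

int main(void)
{
    int n = 256;          /* FFT size */
    int freq_bin = 10;    /* example peak bin index */
    double fs = 10989.0;  /* sampling frequency in Hz */

    /* integer arithmetic truncates: 10989 / 256 == 42, losing ~0.93 Hz per bin */
    int truncated = freq_bin * (10989 / n);

    /* floating point keeps the full fs / n ~= 42.93 Hz per bin */
    double correct = freq_bin * (fs / n);

    printf("truncated: %d Hz, correct: %.1f Hz\n", truncated, correct);
    return 0;
}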
How do I make a function for atan? It has to work inside an SQLite query. I need acos, but the formula I have computes acos in terms of atan:
newCos = 2 * atan( sqrt(1-pow(var,2))/(1+var) );
But I need an atan function to run this.
Here is a distance function for SQLite. The acos needed in SQL can be handled using this approach; it may be helpful.
As part of an iPhone SDK project, I have an sqlite database with a table full of geographic locations, each stored as a latitude and longitude value in degrees. I wanted to be able to perform an SQL SELECT on this table and ORDER BY each row’s distance from an arbitrary point. I’ve achieved this by defining a custom sqlite function. This article contains the code for the function, together with instructions on using it.
Here’s the function, together with a convenience macro to convert from degrees to radians. This function is based on an online distance calculator I found which makes use of the spherical law of cosines.
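For reference, the spherical law of cosines gives the great-circle distance between points $(\varphi_1, \lambda_1)$ and $(\varphi_2, \lambda_2)$ on a sphere of radius $R$ as:

$d = R \cdot \arccos\left(\sin\varphi_1 \sin\varphi_2 + \cos\varphi_1 \cos\varphi_2 \cos(\lambda_2 - \lambda_1)\right)$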
#define DEG2RAD(degrees) ((degrees) * 0.0174532925) // degrees * pi over 180
static void distanceFunc(sqlite3_context *context, int argc, sqlite3_value **argv)
{
// check that we have four arguments (lat1, lon1, lat2, lon2)
assert(argc == 4);
// check that all four arguments are non-null
if (sqlite3_value_type(argv[0]) == SQLITE_NULL || sqlite3_value_type(argv[1]) == SQLITE_NULL || sqlite3_value_type(argv[2]) == SQLITE_NULL || sqlite3_value_type(argv[3]) == SQLITE_NULL) {
sqlite3_result_null(context);
return;
}
// get the four argument values
double lat1 = sqlite3_value_double(argv[0]);
double lon1 = sqlite3_value_double(argv[1]);
double lat2 = sqlite3_value_double(argv[2]);
double lon2 = sqlite3_value_double(argv[3]);
// convert lat1 and lat2 into radians now, to avoid doing it twice below
double lat1rad = DEG2RAD(lat1);
double lat2rad = DEG2RAD(lat2);
// apply the spherical law of cosines to our latitudes and longitudes, and set the result appropriately
// 6378.1 is the approximate radius of the earth in kilometres
sqlite3_result_double(context, acos(sin(lat1rad) * sin(lat2rad) + cos(lat1rad) * cos(lat2rad) * cos(DEG2RAD(lon2) - DEG2RAD(lon1))) * 6378.1);
}
This defines an SQL function distance(Latitude1, Longitude1, Latitude2, Longitude2), which returns the distance (in kilometres) between two points.
To use this function, add the code above to your Xcode project, and then add this line immediately after you call sqlite3_open:
sqlite3_create_function(sqliteDatabasePtr, "distance", 4, SQLITE_UTF8, NULL, &distanceFunc, NULL, NULL);
…where sqliteDatabasePtr is the database pointer returned by your call to sqlite3_open.
Assuming you have a table called Locations, with columns called Latitude and Longitude (both of type double) containing values in degrees, you can then use this function in your SQL like this:
SELECT * FROM Locations ORDER BY distance(Latitude, Longitude, 51.503357, -0.1199)
Your question is a bit unclear, but it seems that you know how to write your own custom SQLite functions, meaning that your actual question is about how to write the various trigonometric functions.
You don't need to write them. Simply use the standard math functions.
#import <math.h>
double newCos = cos(someRadianAngle);
See the man page for cos, sin, tan, atan, etc.
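If you need these inside SQL itself (as with the distance() example above), a minimal sketch is to wrap the standard library function with sqlite3_create_function; the names acosFunc and db below are just placeholders:

#include <assert.h>
#include <math.h>
#include <sqlite3.h>

// Expose the C library's acos() to SQL as acos(x)
static void acosFunc(sqlite3_context *context, int argc, sqlite3_value **argv)
{
    assert(argc == 1);
    if (sqlite3_value_type(argv[0]) == SQLITE_NULL) {
        sqlite3_result_null(context);
        return;
    }
    sqlite3_result_double(context, acos(sqlite3_value_double(argv[0])));
}

// Register it right after sqlite3_open, exactly like distance() above:
// sqlite3_create_function(db, "acos", 1, SQLITE_UTF8, NULL, &acosFunc, NULL, NULL);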
Do one thing: pass var1 and var2 to a function and then use getValue in the query. Xcode has built-in atan and acos functions; you just need to pass the value in radians.
float getValue = [self calculatetan:(float)var1 withSecondValue:(float)var2];

-(float)calculatetan:(float)var1 withSecondValue:(float)var2
{
    // acos(var1) expressed via atan (var2 is unused in this formula); split it up if you prefer
    float newCos = 2 * atan( sqrt(1 - pow(var1, 2)) / (1 + var1) );
    return newCos;
}