How to spatially plot an attribute parameter?

My code is below. I want to plot the CO2 distribution over a map of South Africa.
Loading packages
library(tidyverse)
theme_set(theme_bw())
library("sf")
library("rnaturalearth")
library("rnaturalearthdata")
Assigning world to the countries of the world data
world <- ne_countries(scale = "medium", returnclass = "sf")
Reading the local file with CO2 and geographic coordinates (Lon and Lat)
SA_CO2 <- read_csv2("C:/Users/Xolile Ncipha/Documents/SA_CO2_DJF_2004_2009.csv")
Converting the data frame to an sf object and setting the coordinate reference system to WGS84 (EPSG code 4326).
(SA_CO2 <- st_as_sf(SA_CO2, coords = c("Lon", "Lat"), crs = 4326, agr = "constant"))
Plotting the map and overlaying it with the CO2 data:
ggplot(data = world) + geom_sf() + geom_sf(data = SA_CO2, aes(fill = CO2)) +
  # CO2 legend
  scale_fill_gradientn(colors = sf.colors(10)) +
  # Confining the map to the South African domain
  coord_sf(xlim = c(15, 35), ylim = c(-36, -22.3), expand = FALSE) +
  # Axis labels
  xlab("Longitude") + ylab("Latitude")
The result is just the geographic points on the map; I don't get the overlay of the CO2 data and its spatial distribution. I have attached a picture of the resulting map and the spatial data.

This is a good question. Unfortunately, you did not provide a link to the source of your data ("SA_CO2_DJF_2004_2009.csv"), so I created some data that is likely similar to yours, but probably not identical.
I found a spelling error in the following line of your code, but even after correcting it, the line continued to cause an error.
scale_fill_gradientn(colors = sf.colors(10)) +
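One possible reason the overlay never appears (an assumption on my part, since the original data is unavailable): point geometries are drawn with color, not fill, so mapping CO2 to fill has no visible effect. A minimal sketch of that change:
# Sketch: map CO2 to color for point geometries and use the matching scale
ggplot(data = world) +
  geom_sf() +
  geom_sf(data = SA_CO2, aes(color = CO2)) +
  scale_color_gradientn(colors = sf.colors(10)) +
  coord_sf(xlim = c(15, 35), ylim = c(-36, -22.3), expand = FALSE) +
  xlab("Longitude") + ylab("Latitude")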
The data I used included South Africa's annual CO2 emissions, in tons, for the years 2010-2016. I also selected a few SA cities with populations from another source, and then allocated the annual CO2 among them by their shares of the combined city population. Cities with smaller populations were therefore allocated smaller proportions of the annual CO2, and larger cities larger proportions.
https://howsouthafrica.com/major-cities-international-airports-south-africa/
https://data.worldbank.org/indicator/EN.ATM.CO2E.KT?locations=ZA&view=map
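A sketch of that allocation step with made-up numbers (the populations and emissions below are placeholders, not the figures from the sources above):
# Hypothetical allocation: each city's tons = annual CO2 * its population share
library(dplyr)
cities <- tibble(
  city = c("Johannesburg", "Cape Town", "Durban"),
  pop  = c(4.4e6, 3.4e6, 2.9e6)   # placeholder populations
)
annual_co2 <- 450000               # placeholder tons for one year
cities <- cities %>% mutate(tons = annual_co2 * pop / sum(pop))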
The code used to create the plot shown at the link below is:
ggplot(data = world) +
  geom_sf() +
  geom_sf(data = coor.sf, aes(size = tons),
          fill = "blue", color = "blue", alpha = .3) +
  coord_sf(xlim = c(15, 35), ylim = c(-36, -22.3),
           expand = FALSE) +
  xlab("Longitude") + ylab("Latitude")
Please email me if you have any questions.
[Image: SA CO2 plot]

Related

Calculate area covered by a polygon on earth

I am trying to calculate the area covered by a polygon on a map in square kilometers.
Based on the code from [1] and the corresponding paper [2] I have this code:
double area = 0;
auto coords = QList<QGeoCoordinate>{(
    QGeoCoordinate(50.542908183, 6.2521438908),
    QGeoCoordinate(50.250550175, 6.2521438908),
    QGeoCoordinate(50.250550175, 6.4901310043),
    QGeoCoordinate(50.542908183, 6.4901310043))};
for (int i = 0; i < coords.size() - 1; i++)
{
    const auto &p1 = coords[i];
    const auto &p2 = coords[i + 1];
    area += qDegreesToRadians(p2.longitude() - p1.longitude()) *
            (2 + qSin(qDegreesToRadians(p2.latitude())) +
             qSin(qDegreesToRadians(p1.latitude())));
}
area = area * 6378137.0 * 6378137.0 / 2.0;
qDebug() << "Area:" << (area / 1000000);
qDebug() << coords;
But the calculated area is completely wrong. Also, moving the polygon's vertices around produces strange results: depending on the vertex, the calculated area gets smaller although the polygon's area is increased, and vice versa. The calculated area also seems to depend on which vertex is used as the start vertex.
Interestingly, the signed area of a ring algorithm (getArea from [1]) returns correct results, meaning that the calculated area increases/decreases when the polygon's size is changed.
The code for calculating the area on a sphere has also been used elsewhere, so I am pretty sure that something is wrong with my implementation.
[1] https://github.com/openlayers/openlayers/blob/v2.13.1/lib/OpenLayers/Geometry/LinearRing.js#L251
[2] https://trs.jpl.nasa.gov/bitstream/handle/2014/40409/JPL%20Pub%2007-3%20%20w%20Errata.pdf?sequence=3&isAllowed=y
[3] Polygon area calculation using Latitude and Longitude generated from Cartesian space and a world file
I still could not find the error in my code but switching to the ringArea method from https://github.com/mapbox/geojson-area/blob/master/index.js works.
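For the record, two things in the snippet above look suspect to me (an observation, not confirmed by the poster): the inner parentheses inside the brace initializer form a C++ comma expression, so the QList ends up holding only the last coordinate, and the loop never adds the closing edge of the ring, which the spherical-excess formula needs. A sketch with both points addressed, using the same formula from [1]/[2]:
// Sketch: plain brace initialization (no inner parentheses) and a
// wrap-around edge so the ring is closed
const auto coords = QList<QGeoCoordinate>{
    QGeoCoordinate(50.542908183, 6.2521438908),
    QGeoCoordinate(50.250550175, 6.2521438908),
    QGeoCoordinate(50.250550175, 6.4901310043),
    QGeoCoordinate(50.542908183, 6.4901310043)
};
double area = 0;
for (int i = 0; i < coords.size(); i++) {
    const auto &p1 = coords[i];
    const auto &p2 = coords[(i + 1) % coords.size()];  // closing edge included
    area += qDegreesToRadians(p2.longitude() - p1.longitude()) *
            (2 + qSin(qDegreesToRadians(p2.latitude())) +
             qSin(qDegreesToRadians(p1.latitude())));
}
area = qFabs(area * 6378137.0 * 6378137.0 / 2.0);  // square meters
qDebug() << "Area (km^2):" << (area / 1000000.0);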

How do I create a buffered LineString with a width defined in meters when using GEOSwift?

I'm hoping to find a way, using GEOSwift, to take a series of a user's LatLngs, construct a linestring, and buffer it to always be 30 meters wide regardless of the user's location. I feel like there must be an easier way than the path I'm going down, and any help would be appreciated.
Background:
From what I can tell, the buffer function's width parameter is defined in decimal degrees, since my coordinate system is EPSG:4326, which makes specifying the width in meters difficult. I can get a rough estimate of meters per decimal degree for both longitude and latitude with the Haversine formula.
The problem is that the series of points can move both latitudinally and longitudinally, so the buffer width I need lies somewhere between ThirtyMetersInLatDegrees and ThirtyMetersInLngDegrees. The width to supply to the buffer function then becomes an awkward approximation, an average over the user's overall longitudinal and latitudinal movement along the linestring, bounded by ThirtyMetersInLatDegrees and ThirtyMetersInLngDegrees.
i.e. assuming ThirtyMetersInLngDegrees is the max:
ThirtyMetersInLatDegrees <= bufferWidth <= ThirtyMetersInLngDegrees
How can I better accomplish this?
Here's how I'm calculating meters per decimal degree:
// Earth's radius (WGS84 equatorial, in meters)
let R = 6378137.0
let deviceLatitude = 37.535997
let OneMeterInLatDegrees = 1/R * (180/Double.pi)
let OneMeterInLngDegrees = 1/(R * cos(Double.pi * deviceLatitude/180)) * (180/Double.pi)
let ThirtyMetersInLatDegrees = 30 * OneMeterInLatDegrees
let ThirtyMetersInLngDegrees = 30 * OneMeterInLngDegrees
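One workable approach is sketched below (an assumption on my part, not a verified GEOSwift recipe): derive a single width in degrees from the mean latitude of the track, splitting the difference between the latitude and longitude scales, then pass it to the buffer function mentioned above.
import Foundation

// Sketch: one buffer width in degrees for a ~30 m corridor; exact only for
// purely N-S or E-W movement, but bounded by the two extremes above.
func bufferWidthInDegrees(meters: Double, meanLatitude: Double) -> Double {
    let R = 6378137.0
    let metersPerLatDegree = R * Double.pi / 180
    let metersPerLngDegree = metersPerLatDegree * cos(meanLatitude * Double.pi / 180)
    return meters / ((metersPerLatDegree + metersPerLngDegree) / 2)
}

// Assuming an existing `lineString` and the buffer(width:) API from the question:
// let corridor = try lineString.buffer(width:
//     bufferWidthInDegrees(meters: 30, meanLatitude: deviceLatitude))
A more robust alternative is to reproject the coordinates into a meter-based CRS before buffering, but GEOS itself does not reproject, so that step would need another library.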

How to estimate a "simple" nonlinear regression with parameter constraints and AR residuals?

I am new to this site, so please bear with me. I want to estimate the nonlinear model shown in the link https://i.stack.imgur.com/cNpWt.png while imposing the constraints a > 0, b > 0, and gamma1 in [0,1].
In the nonlinear model the response is x(t), the predictors are R(t) and F(t), and ξ(t) is the error term.
An example of the dataset (68 rows of time series) is shown here: https://i.stack.imgur.com/2Vf0j.png
To estimate the nonlinear regression I use the nls() function with no problem, as shown below:
NLM1 = nls(Xt ~ (a*Rt - b*Ft)/(1 - gamma1*Rt), start = list(a = 10, b = 10, gamma1 = 0.5),
           algorithm = "port", lower = c(0, 0, 0), upper = c(Inf, Inf, 1), data = temp2)
I now want to estimate NLM1 while also allowing for an AR(1) structure on the residuals.
Basically, I want the same step up as going from lm() to gls(). My problem is that in the gnls() function I don't know how to impose constraints on the model parameters a, b, and gamma1, so the model estimates wrong values for them.
nls() has options for lower and upper bounds; I can't do the same in gnls(). In gnls() I need to add constraints like nls()'s lower = c(0, 0, 0), upper = c(Inf, Inf, 1):
NLM1_AR1 = gnls(model = Xt ~ (a*Rt - b*Ft)/(1 - gamma1*Rt), data = temp2,
                start = list(a = 13, b = 10, gamma1 = 0.5), correlation = corARMA(p = 1))
Does anyone know how to do this?
Thank you.
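One approach that may help (a sketch, not a tested answer): gnls() has no lower/upper arguments, but the constraints can be baked into the model by reparameterizing, estimating unconstrained parameters and transforming them inside the formula. Variable names (Xt, Rt, Ft, temp2) follow the question.
# exp() keeps a and b positive; plogis() keeps gamma1 in [0,1]
library(nlme)
NLM1_AR1 <- gnls(Xt ~ (exp(ta)*Rt - exp(tb)*Ft)/(1 - plogis(tg)*Rt),
                 data = temp2,
                 start = list(ta = log(13), tb = log(10), tg = qlogis(0.5)),
                 correlation = corAR1())
# recover the constrained parameters afterwards
a      <- exp(coef(NLM1_AR1)["ta"])
b      <- exp(coef(NLM1_AR1)["tb"])
gamma1 <- plogis(coef(NLM1_AR1)["tg"])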

How to convert TangoXyzIjData into a matrix of z-values

I am currently using a Project Tango tablet for robotic obstacle avoidance. I want to create a matrix of z-values as they would appear on the Tango screen, so that I can use OpenCV to process the matrix. When I say z-values, I mean the distance each point is from the Tango. However, I don't know how to extract the z-values from the TangoXyzIjData and organize the values into a matrix. This is the code I have so far:
public void action(TangoPoseData poseData, TangoXyzIjData depthData) {
    byte[] buffer = new byte[depthData.xyzCount * 3 * 4];
    FileInputStream fileStream = new FileInputStream(
            depthData.xyzParcelFileDescriptor.getFileDescriptor());
    try {
        fileStream.read(buffer, depthData.xyzParcelFileDescriptorOffset, buffer.length);
        fileStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    Mat m = new Mat(depthData.ijRows, depthData.ijCols, CvType.CV_8UC1);
    m.put(0, 0, buffer);
}
Does anyone know how to do this? I would really appreciate help.
The short answer is it can't be done, at least not simply. The XYZij struct in the Tango API does not work completely yet. There is no "ij" data. Your retrieval of buffer will work as you have it coded. The contents are a set of X, Y, Z values for measured depth points, roughly 10000+ each callback. Each X, Y, and Z value is of type float, so not CV_8UC1. The problem is that the points are not ordered in any way, so they do not correspond to an "image" or xy raster. They are a random list of depth points. There are ways to get them into some xy order, but it is not straightforward. I have done both of these:
render them to an image, with the depth encoded as color, and pull out the image as pixels
use the model/view/perspective from OpenGL and multiply out the locations of each point and then figure out their screen space location (like OpenGL would during rendering). Sort the points by their xy screen space. Instead of the calculated screen-space depth just keep the Z value from the original buffer.
or
wait until (if) the XYZij struct is fixed so that it returns ij values.
I too wish to use Tango for object avoidance for robotics. I've had some success by simplifying the use case to be only interested in the distance of any object located at the center view of the Tango device.
In Java:
private Double centerCoordinateMax = 0.020;
private TangoXyzIjData xyzIjData;

final FloatBuffer xyz = xyzIjData.xyz;
double cumulativeZ = 0.0;
int numberOfPoints = 0;
// xyzCount counts points, so the buffer holds xyzCount * 3 floats
for (int i = 0; i < xyzIjData.xyzCount * 3; i += 3) {
    float x = xyz.get(i);
    float y = xyz.get(i + 1);
    if (Math.abs(x) < centerCoordinateMax &&
        Math.abs(y) < centerCoordinateMax) {
        float z = xyz.get(i + 2);
        cumulativeZ += z;
        numberOfPoints++;
    }
}
Double distanceInMeters;
if (numberOfPoints > 0) {
    distanceInMeters = cumulativeZ / numberOfPoints;
} else {
    distanceInMeters = null;
}
Said simply, this code takes the average distance over a small square centered on the origin of the x and y axes.
centerCoordinateMax = 0.020 was determined to work based on observation and testing. The square typically contains 50 points in ideal conditions and fewer when held close to the floor.
I've tested this using version 2 of my tango-caminada application, and the depth measuring seems quite accurate. Standing 1/2 meter from a doorway, I slid toward the open door and the distance changed from 0.5 meters to 2.5 meters, which is the wall at the end of the hallway.
Simulating a robot being navigated, I moved the device toward a trash can in the path until 0.5 meters of separation, then rotated left until the distance was more than 0.5 meters and proceeded forward. An oversimplified simulation, but the basis for obstacle avoidance using Tango depth perception.
You can do this by using the camera intrinsics to convert XY coordinates to normalized values -- see this post: Google Tango: Aligning Depth and Color Frames - it's talking about texture coordinates, but it's exactly the same problem.
Once normalized, map to screen space (1280x720); the Z coordinate can then be used to generate a pixel value for OpenCV to chew on. You'll need to decide on your own how to color the pixels that don't correspond to depth points, and advisedly, before you use the depth information to further colorize pixels.
The main thing to remember is that the raw coordinates returned already use the basis vectors you want, i.e. you do not want to apply the pose attitude or location.
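A rough sketch of that projection (assumptions on my part: fx, fy, cx, cy come from TangoCameraIntrinsics for the depth camera, and the target is 1280x720; pixels with no depth point are left at 0):
Mat depth = new Mat(720, 1280, CvType.CV_32FC1, new Scalar(0));
FloatBuffer xyz = xyzIjData.xyz;
for (int i = 0; i < xyzIjData.xyzCount * 3; i += 3) {
    float x = xyz.get(i);
    float y = xyz.get(i + 1);
    float z = xyz.get(i + 2);
    if (z <= 0) continue;                 // skip invalid points
    int u = (int) (fx * x / z + cx);      // column in screen space
    int v = (int) (fy * y / z + cy);      // row in screen space
    if (u >= 0 && u < 1280 && v >= 0 && v < 720) {
        depth.put(v, u, z);               // keep the original Z value, in meters
    }
}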

Improving detection of the orange colour in MATLAB

One of my tasks is to detect some colours in ant colonies across 16,000 images. I've already done this well for blue, pink, and green, but now I need to improve detection of the orange colour. It's a bit tricky for me, since I am new to the field of image processing. Here are some examples of what I have done and what my problem is.
Raw image: http://img705.imageshack.us/img705/2257/img4263u.jpg
Detection of the orange colour: http://img72.imageshack.us/img72/8197/orangedetection.jpg
Detection of the green colour: http://img585.imageshack.us/img585/1347/greendetection.jpg
I used selectPixelsAndGetHSV.m to get the HSV value, and after that I used colorDetectHSV.m to detect pixels with the same HSV value.
Could you give me any suggestions on how to improve detection of the orange colour without detecting whole ants and the broods around them?
Thank you in advance!
function [K] = colorDetectHSV(RGB, hsvVal, tol)
HSV = rgb2hsv(RGB);

% find the difference between required and real H value:
diffH = abs(HSV(:,:,1) - hsvVal(1));

[M,N,t] = size(RGB);
I1 = zeros(M,N); I2 = zeros(M,N); I3 = zeros(M,N);

T1 = tol(1);
I1( diffH < T1 ) = 1;

if (length(tol) > 1)
    % find the difference between required and real S value:
    diffS = abs(HSV(:,:,2) - hsvVal(2));
    T2 = tol(2);
    I2( diffS < T2 ) = 1;

    if (length(tol) > 2)
        % find the difference between required and real V value:
        diffV = abs(HSV(:,:,3) - hsvVal(3));
        T3 = tol(3);
        I3( diffV < T3 ) = 1;   % was testing diffS here; the V difference is what matters
        I = I1.*I2.*I3;
    else
        I = I1.*I2;
    end
else
    I = I1;
end

K = ~I;

subplot(2,1,1), imshow(RGB); title('Original Image');
subplot(2,1,2), imshow(~I,[]); title('Detected Areas');
You don't show what you are using as target HSV values. These may be the problem.
In the example you provided, a lot of areas whose hue ranges from 30 to 40 are wrongly selected. These areas correspond to the ants' body parts. The orange parts you want to select actually have a hue ranging from approximately 7 to 15, and it shouldn't be difficult to differentiate them from the ants.
Try adjusting your target values (especially hue) and you should get better results. You can probably also disregard brightness and saturation; hue seems to be sufficient in this case.
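To make the suggestion concrete, here is a minimal sketch of hue-band thresholding, assuming the quoted hue values (7-15 for orange) are on a 0-360 degree scale (the scale isn't stated above) and using a hypothetical file name:
% rgb2hsv returns H in [0,1], so rescale to degrees before thresholding
RGB = imread('img4263u.jpg');        % hypothetical file name
HSV = rgb2hsv(RGB);
H = HSV(:,:,1) * 360;                % hue in degrees
mask = (H >= 7) & (H <= 15);         % keep only the orange hue band
figure; imshow(mask); title('Orange regions');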
