Use xarray with custom function and resample - dask

I'm trying to take an array and resample it with a custom function. An example of such a function, taken from this post: Apply function along time dimension of XArray, is special_mean:

import numpy as np

def special_mean(x, drop_min=False):
    s = np.sum(x)
    n = len(x)
    if drop_min:
        s = s - x.min()
        n -= 1
    return s / n
I have a dataset that is:
<xarray.Dataset>
Dimensions: (lat: 100, lon: 130, time: 7305)
Coordinates:
* lon (lon) float32 -99.375 -99.291664 -99.208336 ... -88.708336 -88.625
* lat (lat) float32 49.78038 49.696426 49.61247 ... 41.552795 41.46884
lev float32 1.0
* time (time) datetime64[ns] 2040-01-01 2040-01-02 ... 2059-12-31
Data variables:
tmin (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
tmax (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
prec (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
relh (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
wspd (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
rads (time, lat, lon) float32 dask.array<chunksize=(366, 100, 130), meta=np.ndarray>
Attributes:
history: Fri Jun 14 10:32:22 2019: ncatted -a _FillValue,,o,d,9e+20 IBIS...
Then I apply a resample:
data.resample(time='1MS').map(special_mean)
<xarray.Dataset>
Dimensions: (time: 240)
Coordinates:
* time (time) datetime64[ns] 2040-01-01 2040-02-01 ... 2059-12-01
lev float32 1.0
Data variables:
tmin (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
tmax (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
prec (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
relh (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
wspd (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
rads (time) float32 dask.array<chunksize=(1,), meta=np.ndarray>
How do I apply this function so that I retain the 'lon' and 'lat' coordinates, as happens when doing
data.resample(time='1MS').mean()

Here's one example of how you can use xr.apply_ufunc().
import numpy as np
import xarray as xr

data = xr.tutorial.open_dataset('air_temperature')

def special_mean(x, drop_min=False):
    s = np.sum(x)
    n = len(x)
    if drop_min:
        s = s - x.min()
        n -= 1
    return s / n

def special_func(data):
    return xr.apply_ufunc(special_mean, data, input_core_dims=[["time"]],
                          kwargs={'drop_min': True}, dask='allowed', vectorize=True)

data.resample(time='1MS').apply(special_func)
<xarray.Dataset>
Dimensions: (lat: 25, lon: 53, time: 24)
Coordinates:
* time (time) datetime64[ns] 2013-01-01 2013-02-01 ... 2014-12-01
* lat (lat) float32 75.0 72.5 70.0 67.5 65.0 ... 25.0 22.5 20.0 17.5 15.0
* lon (lon) float32 200.0 202.5 205.0 207.5 ... 322.5 325.0 327.5 330.0
Data variables:
air (time, lat, lon) float64 244.6 244.7 244.7 ... 297.7 297.7 297.7

I suspect you can do what you want with the apply_ufunc method (although, as a disclaimer, I do not know the xarray API well).

Related

chunksize for time dimension is not working with xr.open_mfdataset

Using xr.open_mfdataset, I wish to chunk in the time dimension; however, the chunksize for 'time' remains equal to 1 despite setting it to 12. Chunking for 'lon' and 'lat' works, though. How can I chunk in the 'time' dimension? Thanks - PG
VARS_USED = ["LANDFRAC", "PSL", "PRECC", "PRECL", "TREFHT", "ICEFRAC", "PRECSL_H218OS", "PRECSC_H218Os", "PRECRL_H218OR", "PRECRC_H218Or", "PRECSL_H2OS", "PRECSC_H2Os", "PRECRL_H2OR", "PRECRC_H2Or"]
WSOL_DATA_PATH = '/glade/p/ncgd0030/Steig/90_WSOL/b.ie12.BG1850C5CN.f19_g16.90_WSOL.001_daily.cam.h0.000[2-4]-??_avg_fc.nc'
WSOL_DATA = xr.open_mfdataset(WSOL_DATA_PATH, decode_cf=False, chunks={'time':12, 'lon':72,'lat':48})[VARS_USED]
OUTPUT:
<xarray.Dataset>
Dimensions: (lat: 96, lon: 144, time: 36)
Coordinates:
* lon (lon) float64 0.0 2.5 5.0 7.5 10.0 12.5 15.0 17.5 20.0 ...
* time (time) float64 380.0 409.5 439.0 469.5 500.0 530.5 561.0 ...
* lat (lat) float64 -90.0 -88.11 -86.21 -84.32 -82.42 -80.53 ...
Data variables:
LANDFRAC (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PSL (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECC (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECL (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
TREFHT (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
ICEFRAC (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECSL_H218OS (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECSC_H218Os (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECRL_H218OR (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECRC_H218Or (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECSL_H2OS (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECSC_H2Os (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECRL_H2OR (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
PRECRC_H2Or (time, lat, lon) float32 dask.array<shape=(36, 96, 144), chunksize=(1, 48, 72)>
Attributes:
Conventions: CF-1.0
source: CAM
The chunks argument in open_mfdataset currently only chunks within each input file. If each file contains only one time point, then setting a larger chunk size won't have any effect.
If you want larger chunks along the time dimension, you need to add a separate .chunk() call afterwards, e.g.:
WSOL_DATA = WSOL_DATA.chunk({'time': 12})

translating CLLocation coordinates into SCNVector3 matrics

Goal: Have an AR item fixed to a location in the real world based on CLLocation coordinates (LLA), and have it in the same place regardless of my device's orientation when the view loads.
I've been trying to convert geo LLA coordinates into an SCNVector3 matrix. The issues I'm having are as follows:
1) Lat1 and Long1 are based on my current coordinates from a CLLocationManager on my device. I'm looking for a way to get the relative SCNVector3 [x, y, z] needed by my device to place an item at lat2 and long2's location in the real world.
2) How do I get this targeted Lat2, Long2 to be in the same place every time? I feel like the .gravityAndHeading alignment isn't working to always orient the matrix facing true north, so though I may sometimes get the correct distance from my device location, the orientation is off.
3) I would like the values to be accurate enough to recognize something at least 2-3 meters away. Even when I manually program the SCNVector3 from the same position, it seems a bit off.
Any guidance would be greatly appreciated. I've already looked at http://www.mathworks.com/help/aeroblks/llatoecefposition.html and http://www.jaimerios.com/?p=39, but they seem to return the distance rather than the relative position to the phone in SCNVector3 form.
import SceneKit
import ARKit
import CoreLocation
...
func latLonToECEF(_ lat1: Double, _ long1: Double, _ alt: Double?) -> [String: Double] {
    // Spherical approximation: 6371 is the Earth radius in kilometres,
    // and lat/long are fed to the trig calls as-is (in degrees). `alt` is currently unused.
    let x = 6371 * cos(lat1) * cos(long1)
    let y = 6371 * cos(lat1) * sin(long1)
    let z = 6371 * sin(lat1)
    // static_lat / static_long hold the target (lat2, long2) coordinates.
    let lat2 = static_lat
    let long2 = static_long
    let x2 = 6371 * cos(lat2) * cos(long2)
    let y2 = 6371 * cos(lat2) * sin(long2)
    let z2 = 6371 * sin(lat2)
    return ["x": (x2 - x), "y": (y2 - y), "z": (z2 - z)]
}
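For what it's worth, here is a minimal sketch of one way to get a position the device can use, assuming .gravityAndHeading so that (as I understand it) ARKit's +x axis points roughly east, +y up and -z true north. Rather than differencing raw ECEF vectors, it works in a local east/north metre offset around the device over short distances; the function name sceneKitPosition and the flat-earth approximation are my own choices, not from the question:

import CoreLocation
import SceneKit

func sceneKitPosition(from device: CLLocationCoordinate2D,
                      to target: CLLocationCoordinate2D,
                      altitudeDelta: Double = 0) -> SCNVector3 {
    let earthRadius = 6_371_000.0                        // metres, spherical approximation
    let dLat = (target.latitude - device.latitude) * .pi / 180
    let dLon = (target.longitude - device.longitude) * .pi / 180
    let meanLat = (target.latitude + device.latitude) / 2 * .pi / 180

    let north = dLat * earthRadius                       // metres north of the device
    let east = dLon * earthRadius * cos(meanLat)         // metres east of the device

    // With .gravityAndHeading: +x ~ east, +y ~ up, -z ~ north.
    return SCNVector3Make(Float(east), Float(altitudeDelta), Float(-north))
}

Setting node.position to the result should then place the node relative to the session origin, assuming the session really is heading-aligned; whether .gravityAndHeading holds true north tightly enough for 2-3 m accuracy is a separate question, since compass error alone can introduce offsets of that size.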

Problems upsampling data using accelerate

After I downsample a vector with a constant decimating factor, I want to upsample the vector back to the original sample rate (after performing some analyses). However, I am struggling with the upsampling.
For the downsampling I apply vDSP_desamp from the Accelerate framework, and for the upsampling I tried to apply vDSP_vlint:
// Create some test data for input vector
float inputData[10] = {0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9};
int inputLength = 10;
int decimationFactor = 2; // Downsample by factor 2
int downSampledLength = inputLength / decimationFactor;

// Allocate downsampled output vector
float *downSampledData = malloc(downSampledLength * sizeof(float));

// Create filter (average samples)
float *filter = malloc(decimationFactor * sizeof(float));
for (int i = 0; i < decimationFactor; ++i) {
    filter[i] = 1.0 / decimationFactor;
}

// Downsample and average
vDSP_desamp(inputData,
            (vDSP_Stride)decimationFactor,
            filter,
            downSampledData,
            (vDSP_Length)downSampledLength, // Downsample to 5 samples
            (vDSP_Length)decimationFactor);
free(filter);
The output of downSampledData using this code is:
0.05, 0.25, 0.45, 0.65, 0.85
To upsample the (processed) data vector back to the original sample rate I use the following code:
// For this example downSampledData is just copied to processedData ...
float *processedData = malloc(downSampledLength * sizeof(float));
memcpy(processedData, downSampledData, downSampledLength * sizeof(float));

// Create vector used by vDSP_vlint to indicate interpolation constants.
float *b = malloc(downSampledLength * sizeof(float));
for (int i = 0; i < downSampledLength; i++) {
    b[i] = i + 0.5;
}

// Allocate data vector for upsampled data
float *upSampledData = malloc(inputLength * sizeof(float));

// Upsample and interpolate
vDSP_vlint(processedData,
           b,
           1,
           upSampledData,
           1,
           (vDSP_Length)inputLength, // Resample back to 10 samples
           (vDSP_Length)downSampledLength);
However, the output of upSampledData is
0.15, 0.35, 0.55, 0.75, 0.43, 0.05, 0.05, 0.05, 0.08, 0.12
which is not correct, apparently. How should I apply vDSP_vlint? Or should I use other functions for upsampling the data?
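For what it's worth, here is a minimal sketch (in Swift, for brevity) of a control vector that vDSP_vlint accepts, assuming the intent is plain linear interpolation back to 10 samples. The key assumption is that each control value is a fractional index into the downsampled signal (i / decimationFactor) rather than an offset of i + 0.5; the padding of the last sample and the names padded, control and upsampled are my own choices:

import Accelerate

let processed: [Float] = [0.05, 0.25, 0.45, 0.65, 0.85]   // the downsampled data from above
let decimationFactor = 2
let outputLength = processed.count * decimationFactor      // back to 10 samples

// Pad with a copy of the last sample so interpolating the final fractional
// index never reads past the end of the buffer.
let padded = processed + [processed[processed.count - 1]]

// control[i] = i / decimationFactor -> 0.0, 0.5, 1.0, 1.5, ..., 4.5
let control = (0..<outputLength).map { Float($0) / Float(decimationFactor) }

var upsampled = [Float](repeating: 0, count: outputLength)
vDSP_vlint(padded,
           control, 1,
           &upsampled, 1,
           vDSP_Length(outputLength),
           vDSP_Length(padded.count))
// upsampled should come out as roughly
// [0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85, 0.85]

Whether the flat tail produced by the padding is acceptable depends on the application; another common choice is to scale the control values by (inputCount - 1) / (outputCount - 1) so the last output lands exactly on the last downsampled value.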

Calculating bounding box given center lat /long and distance on iOS?

In my iOS app I have a lat and long that gives me my location. I want to be able to calculate the bounding box that satisfies a certain distance d from my location. How can I do this?
TRY 1:
So I tried the solution given by @Neeku. I can see how it's supposed to return the right information but unfortunately it's off. So I don't think I can use it.
The code I wrote is this and I pass in 1000 meters:
MKCoordinateRegion startRegion = MKCoordinateRegionMakeWithDistance(center, meters, meters);
CLLocationCoordinate2D northWestCorner, southEastCorner;
northWestCorner.latitude = startRegion.center.latitude + .5 * startRegion.span.latitudeDelta;
northWestCorner.longitude = startRegion.center.longitude - .5 * startRegion.span.longitudeDelta;
southEastCorner.latitude = startRegion.center.latitude - .5 * startRegion.span.latitudeDelta;
southEastCorner.longitude = startRegion.center.longitude - .5 * startRegion.span.longitudeDelta;
NSLog(#"CENTER <%#,%#>", #(center.latitude),#(center.longitude));
NSLog(#"NW <%#,%#>, SE <%#,%#>",#(northWestCorner.latitude),#(northWestCorner.longitude),#(southEastCorner.latitude),#(southEastCorner.longitude));
So then the result is:
CENTER <38.0826682,46.3028721>
NW <38.08717278501047,46.29717303828632>, SE <38.07816361498953,46.29717303828632>
I then put that in Google Maps and get this (see screenshot):
To my understanding, the 1000 meters should go from the center to the sides of the box. The map is measuring to the corner, which should be OVER 1000 meters, yet it's actually just over 800 meters. This is the problem I am trying to solve.
I tried this method before and the distances simply aren't accurate, so this solution has not worked for me. If you have other suggestions, or want to point out what is done wrong here, please let me know.
Thank you
Let's say that your desired distance is 111 meters. Then you use the following code:
// 1 degree of latitude = ~111 kilometers.
// So an offset of 1/1000 of a degree moves the coordinate by ~111 meters.
float offset = 1.0 / 1000.0;
float latMax = location.latitude + offset;
float latMin = location.latitude - offset;
// With longitude, things are a bit more complex.
// 1 degree of longitude = 111 km only at the equator (it gradually shrinks to zero at the poles),
// so latitude has to be taken into account: divide the offset by cos(lat)
// to keep the east-west distance at ~111 meters.
float lngOffset = offset / cos(location.latitude * M_PI / 180.0);
float lngMax = location.longitude + lngOffset;
float lngMin = location.longitude - lngOffset;
latMax, latMin, lngMax, lngMin will give you your bounding box coordinates.
(You can change this code pretty easily if you need a distance other than 111 meters; just update the offset variable accordingly.)
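Generalizing that idea, here is a small Swift sketch (the helper name boundingBox and its parameters are mine) that computes the offsets for an arbitrary distance in metres, using the same ~111 km per degree of latitude rule and dividing by cos(latitude) for the longitude side:

import CoreLocation

func boundingBox(around location: CLLocationCoordinate2D,
                 distanceMeters: Double) -> (latMin: Double, latMax: Double, lngMin: Double, lngMax: Double) {
    let latOffset = distanceMeters / 111_000.0                       // ~111 km per degree of latitude
    let lngOffset = latOffset / cos(location.latitude * .pi / 180.0) // longitude degrees shrink with latitude
    return (location.latitude - latOffset,
            location.latitude + latOffset,
            location.longitude - lngOffset,
            location.longitude + lngOffset)
}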
You can add/subtract half of the span from the latitude and longitude respectively and you get the values that you need:
CLLocationCoordinate2D centerCoord = CLLocationCoordinate2DMake(38.0826682, 46.3028721);
MKCoordinateRegion region = MKCoordinateRegionMakeWithDistance(centerCoord, 1000, 1000);
double latMin = region.center.latitude - .5 * region.span.latitudeDelta;
double latMax = region.center.latitude + .5 * region.span.latitudeDelta;
double lonMin = region.center.longitude - .5 * region.span.longitudeDelta;
double lonMax = region.center.longitude + .5 * region.span.longitudeDelta;
Just remember that:
latitudeDelta: The amount of north-to-south distance (measured in degrees) to display on the map. Unlike longitudinal distances, which vary based on the latitude, one degree of latitude is always approximately 111 kilometers (69 miles).
longitudeDelta: The amount of east-to-west distance (measured in degrees) to display for the map region. The number of kilometers spanned by a longitude range varies based on the current latitude. For example, one degree of longitude spans a distance of approximately 111 kilometers (69 miles) at the equator but shrinks to 0 kilometers at the poles.
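For completeness, here is the same computation as a Swift sketch, using the MKCoordinateRegion(center:latitudinalMeters:longitudinalMeters:) initializer that replaces MKCoordinateRegionMakeWithDistance. Note that those meter arguments are the total span of the region, so a 1000 m region only reaches about 500 m from the center to each edge, which may be part of why the distances in TRY 1 above look shorter than expected:

import MapKit

let center = CLLocationCoordinate2D(latitude: 38.0826682, longitude: 46.3028721)
// 2000 m total span -> roughly 1000 m from the center to each edge.
let region = MKCoordinateRegion(center: center,
                                latitudinalMeters: 2000,
                                longitudinalMeters: 2000)
let latMin = region.center.latitude - 0.5 * region.span.latitudeDelta
let latMax = region.center.latitude + 0.5 * region.span.latitudeDelta
let lonMin = region.center.longitude - 0.5 * region.span.longitudeDelta
let lonMax = region.center.longitude + 0.5 * region.span.longitudeDelta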

How do I calculate a Rectangle around a Geographic Point?

CLLocationManager will hand off to my delegate a new CLLocation whenever the location changes. The coordinates of that location are expressed as a CLLocationCoordinate2D object, which simply contains a latitude and a longitude. I'd like to take this location and determine the latitude and longitude 1000 m south and 1000 m west, and the latitude and longitude 1000 m north and 1000 m east. This way I end up with one coordinate southwest of the location and one northeast of it.
I have no clue how to do this, and my GoogleFoo seems quite poor tonight. What information I have found has offered up impenetrable mathematics. Anybody help a brotha hacker out? I'm fine to use an iOS API if there is one, but an equation that just operates on double values for the lat and long would be even better. It doesn't have to be accurate within centimeters, though within meters would be nice. Ideally, it would look something like this:
NSArray *rect = CalculateRectangleFromLocation(
clLocationCoordinate2D,
1000.0
);
And then *rect would have four values: the lat and long of the southwest corner and the lat and long of the northeast corner.
Here is the code to get the top/right/bottom/left coordinates of the bounding rectangle.
LatLon.h
#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>
extern double radians(double degrees);
extern double degrees(double radians);
extern CLLocationCoordinate2D LatLonDestPoint(CLLocationCoordinate2D origin, double bearing, CLLocationDistance distance);
LatLon.m
// Earth radius in kilometres; `distance` is therefore expected in kilometres as well.
const CLLocationDegrees kLatLonEarthRadius = 6371.0;

double radians(double degrees) {
    return degrees * M_PI / 180.0;
}

double degrees(double radians) {
    return radians * 180.0 / M_PI;
}

CLLocationCoordinate2D LatLonDestPoint(CLLocationCoordinate2D origin, double bearing, CLLocationDistance distance) {
    double brng = radians(bearing);
    double lat1 = radians(origin.latitude);
    double lon1 = radians(origin.longitude);
    CLLocationDegrees lat2 = asin(sin(lat1) * cos(distance / kLatLonEarthRadius) +
                                  cos(lat1) * sin(distance / kLatLonEarthRadius) * cos(brng));
    CLLocationDegrees lon2 = lon1 + atan2(sin(brng) * sin(distance / kLatLonEarthRadius) * cos(lat1),
                                          cos(distance / kLatLonEarthRadius) - sin(lat1) * sin(lat2));
    lon2 = fmod(lon2 + M_PI, 2.0 * M_PI) - M_PI;
    CLLocationCoordinate2D coordinate = origin; // fall back to the origin if the math produced NaN
    if (!(isnan(lat2) || isnan(lon2))) {
        coordinate.latitude = degrees(lat2);
        coordinate.longitude = degrees(lon2);
    }
    return coordinate;
}
Usage
CLLocationCoordinate2D location = ...;
double distance = ...;
CLLocationCoordinate2D right = LatLonDestPoint(location, 90.0, distance);
CLLocationDegrees rectRight = right.longitude;
CLLocationCoordinate2D top = LatLonDestPoint(location, 0.0, distance);
CLLocationDegrees rectTop = top.latitude;
CLLocationCoordinate2D left = LatLonDestPoint(location, 270.0, distance);
CLLocationDegrees rectLeft = left.longitude;
CLLocationCoordinate2D bottom = LatLonDestPoint(location, 180.0, distance);
CLLocationDegrees rectBottom = bottom.latitude;
Swift
extension CLLocationCoordinate2D {
    fileprivate func radians(degrees: Double) -> Double { return degrees * .pi / 180.0 }
    fileprivate func degrees(radians: Double) -> Double { return radians * 180.0 / .pi }

    func coordinate(bearing: Double, distanceInMeter distance: CLLocationDistance) -> CLLocationCoordinate2D {
        // Earth radius in metres, so `distance` can be passed in metres as the label says.
        let kLatLonEarthRadius: CLLocationDistance = 6_371_000.0
        let brng: Double = radians(degrees: bearing)
        let lat1: Double = radians(degrees: self.latitude)
        let lon1: Double = radians(degrees: self.longitude)
        let lat2: CLLocationDegrees = asin(
            sin(lat1) * cos(distance / kLatLonEarthRadius) +
            cos(lat1) * sin(distance / kLatLonEarthRadius) * cos(brng)
        )
        var lon2: CLLocationDegrees = lon1 + atan2(
            sin(brng) * sin(distance / kLatLonEarthRadius) * cos(lat1),
            cos(distance / kLatLonEarthRadius) - sin(lat1) * sin(lat2)
        )
        lon2 = fmod(lon2 + .pi, 2.0 * .pi) - .pi
        var coordinate = CLLocationCoordinate2D()
        if !lat2.isNaN && !lon2.isNaN {
            coordinate.latitude = degrees(radians: lat2)
            coordinate.longitude = degrees(radians: lon2)
        }
        return coordinate
    }

    func rect(distanceInMeter meter: CLLocationDistance) -> (north: Double, west: Double, south: Double, east: Double) {
        let north = coordinate(bearing: 0, distanceInMeter: meter).latitude
        let south = coordinate(bearing: 180, distanceInMeter: meter).latitude
        let east = coordinate(bearing: 90, distanceInMeter: meter).longitude
        let west = coordinate(bearing: 270, distanceInMeter: meter).longitude
        return (north: north, west: west, south: south, east: east)
    }
}
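A quick usage sketch of that extension (the sample coordinate is taken from the earlier question; the variable names are mine):

let center = CLLocationCoordinate2D(latitude: 38.0826682, longitude: 46.3028721)
let box = center.rect(distanceInMeter: 1000)
// box.north and box.south are latitudes, box.east and box.west are longitudes;
// (box.south, box.west) is the southwest corner, (box.north, box.east) the northeast corner.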
I usually do this by using the PROJ4 library to convert latitude and longitude to a projection in meters that's useful for my region (UTM works well if you don't have more information; I'm in Northern California, so surveyors in my region all work in EPSG:2226), adding the appropriate offset in meters to that, and then using PROJ4 to convert back.
Later edit: The answer given by Jayant below is fine, depending on how accurate your meters rectangle needs to be. The earth isn't a sphere, it isn't even an oblate spheroid, so the projection in which you add your distance to your latitude and longitude may matter. Even using PROJ4, those meters are at sea level. Geography is harder than you'd think.
