Mapping UGC or FIPS6 geocodes to polygons? - geolocation

I am looking to convert UGC or FIPS6 geocodes to polygons (or even rough lat/lng coordinates + radius). An example of the geocodes can be found here: http://alerts.weather.gov/cap/us.php?x=0
Does anybody know where I could find a mapping for these geocodes?

The data used by the NWS can be found here:
http://www.nws.noaa.gov/geodata/
In order to actually get the coordinates from the data, I used the program OpenJump to save the data in the CSV format.

Updated Answer for June 2019
The NWS Public Forecast Zones can be downloaded as a shapefile from https://www.weather.gov/gis/PublicZones
I used QGIS to convert the shapefile to WGS84 (EPSG:4326) and exported to CSV using WKT geometry. That resulted in a 122MB CSV file.
Instructions for Windows QGIS 3.4.3
Download and extract z_02ap19.zip
In QGIS, Layer -> Add Layer -> Add Vector Layer... (or press Ctrl+Shift+V)
Source Type = File, Encoding = System, Vector Dataset(s) = z_02ap19.shp extracted earlier. Then click Add.
[Optional] Right click the layer, Set CRS -> Set Layer CRS... and set the CRS to EPSG:4326.
Right click the layer, Export -> Save Features As...
Format = Comma Separated Value [CSV]
Choose a file location.
Choose an encoding, usually System or UTF-8.
Uncheck "Add saved file to map"
Make sure all fields are selected
Geometry type should be Automatic (They all end up as Polygons)
Layer Options:
CREATE_CSVT = YES (Creates a single file that describes the field types, useful for re-importing the file back into other GIS programs)
GEOMETRY = AS_WKT
LINEFORMAT = CRLF (Windows) or LF (Unix); this mattered historically, but most programs can now handle both
SEPARATOR = COMMA (Up to you)
STRING_QUOTING = ALWAYS (Likely doesn't matter, as the data won't contain quotes anyway)
WRITE_BOM = NO (Byte-order mark, up to you)
Click OK, and QGIS will generate the file, which takes several seconds.
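If you prefer to script the conversion instead of clicking through QGIS, here is a minimal sketch using geopandas (assuming the same z_02ap19.shp extracted above; the output file name and the WKT column name are my own choices):

import geopandas as gpd

# read the NWS public forecast zone shapefile extracted from z_02ap19.zip
zones = gpd.read_file("z_02ap19.shp")

# reproject to WGS84 lat/lng (EPSG:4326) if it isn't already
zones = zones.to_crs(epsg=4326)

# write all attribute columns plus the polygon as WKT, mirroring GEOMETRY=AS_WKT
zones["WKT"] = zones.geometry.to_wkt()
zones.drop(columns="geometry").to_csv("zones_wgs84.csv", index=False)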

Related

Reading byte by byte HEIF/HEIC images XMP metadata

I am trying to build a native byte parser that given an HEIF image it returns back its metadata (mainly width and height of the image).
I am struggling a lot at the moment finding the right documentation and specs to use for parsing such info. I have to do such thing for both XMP and EXIF metadata, but let's focus only on XMP for now.
What I need is the exact byte structure of where to find what. According to the HEIF international standard doc (here):
For image items, XMP metadata shall be stored as an item of item_type value 'mime' and content type 'application/rdf+xml'. The body of the item shall be a valid XMP document, in XML form.
Perfect, if I analyse a sample image I can find such a marker:
From now on I can't find anywhere how to get the info I need. I would expect something saying "the first 2 bytes are the header, with marker 0xFF 0xCE (just an example), the next 2 bytes are the width, and following 2 bytes the height...etc".
In my case I am going by intuition. My sample image is of dimensions 8736x5856. If in the tool I look for Big-Endian 2 byte integer 8736, I can find it:
And hey, 2 bytes later there is the 5856 height as well:
But again, I arrived here by luck and intuition. I need a proper schema that tells me where to find what, in such a way that I can translate it into code.
What I think you're seeing is a "mime" and an "ispe" mp4 box, as HEIF is ISOBMFF-based. I would recommend looking at the file using an mp4-capable tool like mp4dump, HexFiend or fq (note: my tool). The "ispe" (Image Spatial Extents) box is probably what you want to read.
fq does not support the ispe box yet, but you could read it like this:
$ fq 'grep_by(.type=="ispe").data | tobytes | [.[-8:-4], .[-4:] | tonumber]' file.heif
[
8736,
5856
]
So what you need is probably a basic ISOBMFF reader that looks for the "ispe" box and decodes it. If you're only looking for the first occurrence of a specific box, you can probably ignore that ISOBMFF is a tree structure.
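If you end up writing that basic reader in Python, a minimal sketch could look like the following. It leans on the shortcut mentioned above: it does not walk the box tree, it just searches the raw bytes for the first 'ispe' type tag and reads the two big-endian uint32 fields that follow the 4-byte version/flags of the FullBox (the file name is a placeholder):

import struct

def find_ispe_dimensions(path):
    with open(path, "rb") as f:
        data = f.read()
    # locate the first 'ispe' (Image Spatial Extents) box type tag
    idx = data.find(b"ispe")
    if idx == -1:
        raise ValueError("no ispe box found")
    # after the 4-byte type tag: 1 byte version + 3 bytes flags,
    # then image_width and image_height as big-endian uint32
    width, height = struct.unpack(">II", data[idx + 8:idx + 16])
    return width, height

print(find_ispe_dimensions("file.heif"))  # e.g. (8736, 5856)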

How to Combine Two HDF5 Datasets without intermediate buffer

I have several HDF5 files, all of which have a /dataset that contains vectors. I would like to combine all these vectors into one dataset in one file (that is, repeatedly append from one file to another). The combined dataset would have chunked storage and be resizable.
Every option I've seen for doing this seems to require reading all the data into a buffer and then writing it back out. Is there a way to more simply pass a dataset/dataspace from one file to another in order to append the data?
Have you investigated h5py Group .copy() method? Although documented as a group action, it works with any h5py object (groups, datasets, links and references). By default it copies object attributes, and supports recursive copying of group members. If you prefer a command line tool, the HDF Group has one to do this. Take a look at h5copy here: HDF5 Group h5 copy doc
Here is an example that demonstrates a simple h5py .copy() implementation. It creates a set of 3 files, each with 1 dataset (named /dataset, dtype=float, shape=(10,10)). It then creates a NEW HDF5 file, and a second loop opens the previous files and copies the dataset from each "read" file (h5r) to the new "write" file (h5w).
import h5py
import numpy as np

for i in range(1, 4):
    with h5py.File('SO_68025342_' + str(i) + '.h5', mode='w') as h5f:
        arr = np.random.random(100).reshape(10, 10)
        h5f.create_dataset('dataset', data=arr)

with h5py.File('SO_68025342_all.h5', mode='w') as h5w:
    for i in range(1, 4):
        with h5py.File('SO_68025342_' + str(i) + '.h5', mode='r') as h5r:
            h5r.copy('dataset', h5w, name='dataset_' + str(i))
Here is a method to copy data from multiple files to a single dataset in the merged file. It comes with caveats: 1) all datasets must have the same shape, and 2) you know the number of datasets in advance to size the new dataset. (If not, you can create a resizable dataset by adding maxshape=(None,a0,a1) and then use .resize() as needed; see the sketch after the code below.) I have another post with 2 examples here: How can I combine multiple .h5 file? Look at Methods 3a and 3b.
with h5py.File('SO_68025342_merge.h5', mode='w') as h5w:
    for i in range(1, 4):
        with h5py.File('SO_68025342_' + str(i) + '.h5', mode='r') as h5r:
            if 'dataset' not in h5w.keys():
                a0, a1 = h5r['dataset'].shape
                h5w.create_dataset('dataset', shape=(3, a0, a1))
            h5w['dataset'][i-1, :] = h5r['dataset']
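As noted above, if you don't know the number of files in advance, here is a sketch of the resizable variant using maxshape and .resize() (file names follow the earlier examples; the chunk shape is an assumption):

import h5py

with h5py.File('SO_68025342_resizable.h5', mode='w') as h5w:
    for i in range(1, 4):
        with h5py.File('SO_68025342_' + str(i) + '.h5', mode='r') as h5r:
            arr = h5r['dataset'][()]
            if 'dataset' not in h5w:
                a0, a1 = arr.shape
                # start empty along axis 0 and leave that axis unlimited
                h5w.create_dataset('dataset', shape=(0, a0, a1),
                                   maxshape=(None, a0, a1), chunks=(1, a0, a1))
            dset = h5w['dataset']
            # grow by one slice and write the new data at the end
            dset.resize(dset.shape[0] + 1, axis=0)
            dset[-1, :, :] = arr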
Assuming your files aren't so conveniently named, you can use glob.iglob() to loop on the file names to read. Then use .keys() to get the dataset names in each file. Also, if all of your datasets really are named /dataset, you need to come up with a naming convention for the new datasets.
Here is a link to the h5py docs with more details: h5py Group .copy() method
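For arbitrarily named files, a small sketch combining glob.iglob() with .keys() could look like this (the '*.h5' pattern, the output name and the dataset_<n> suffix convention are assumptions):

import glob
import h5py

with h5py.File('merged.h5', mode='w') as h5w:
    for n, fname in enumerate(glob.iglob('*.h5'), start=1):
        if fname == 'merged.h5':
            continue  # don't try to copy the output file into itself
        with h5py.File(fname, mode='r') as h5r:
            for name in h5r.keys():
                # every source dataset is named /dataset, so rename on copy
                h5r.copy(name, h5w, name=name + '_' + str(n))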
If you are not bound to a particular library and programming language, one way to solve your issue could be with the usage of HDFql (in C, C++, Java, Python, C#, Fortran or R).
Given that your posts seem to mention C# quite often, find below a solution in C#. It assumes that 1) the dataset name is dset, 2) each dataset is of data type float, and 3) each dataset is a vector of one dimension (size 100) - feel free to adapt the code to your concrete use-case:
// declare variable
float []data = new float[100];
// retrieve all file names (from current directory) that end with '.h5'
HDFql.Execute("SHOW FILE LIKE \\.h5$");
// create an HDF5 file named 'output.h5' and use (i.e. open) it
HDFql.Execute("CREATE AND USE FILE output.h5");
// create a chunked and extendible HDF5 dataset named 'dset' in file 'output.h5'
HDFql.Execute("CREATE CHUNKED(100) DATASET dset AS FLOAT(0 TO UNLIMITED)");
// register variable 'data' for subsequent usage (by HDFql)
HDFql.VariableRegister(data);
// loop cursor and process each file found
while(HDFql.CursorNext() == HDFql.Success)
{
// alter (i.e. extend) dataset 'dset' (from file 'output.h5') with 100 more floats
HDFql.Execute("ALTER DIMENSION dset TO +100");
// select (i.e. read) dataset 'dset' (from file found) and populate variable 'data'
HDFql.Execute("SELECT FROM \"" + HDFql.CursorGetChar() + "\" dset INTO MEMORY " + HDFql.VariableGetNumber(data));
// insert (i.e. write) values stored in variable 'data' into dataset 'dset' (from file 'output.h5') at the end of it (using a hyperslab)
HDFql.Execute("INSERT INTO dset(-1:::) VALUES FROM MEMORY " + HDFql.VariableGetNumber(data));
}

QGIS with CSV file, incorrect coordinates

I'm working on a QGIS project and adding layers from CSV files. I can Add Delimited Text Layer, then save the layer as a shapefile, making sure I've selected Latitude and Longitude correctly as y and x. But whether I specify Lat as y and Long as x or vice versa, the point shows up in the same place, in the Galapagos Islands, not Chicago as it should be. I'm using the correct Geometry CRS for the project.
Resolved: The problem resolved itself when I closed and re-opened QGIS.

What kind of coordinates are these numbers?

I have geo data that contains an X field like 1012532,749 and a Y field like 178774,7655. The data comes from a shapefile, but I don't know which geographic standard it uses.
Maybe someone knows, or can show me a way to find out, how to translate these coordinates to GPS (WGS84) coordinates.
From your comment the situation is clear:
You have an invalid shapefile:
The .prj file states WGS84 coordinates in decimal degrees,
which have the ranges longitude (x): [-180.0, 180.0], latitude (y): [-90.0, 90.0].
But the coordinates you posted are not in that valid range.
So the .prj definition does not fit the rest of the shapefile (your posted coordinates).
(This happens because the .prj file is optional, and it probably carries the default settings of some other project.)
There is little chance of knowing without further information.
Simply ask the data provider which geographical datum (name) the coordinates are related to. Further, you should report that the file is erroneous and that they should provide a correct .prj file (or remove the .prj file if the coordinates are not related to world coordinates).
IF you know the source, e.g. the Swiss "Landesvermessung", then you could guess that it is, for example, a "Swiss Grid CH1903+ / LV95" system. (This is just an example; the coordinates are not in that Swiss grid.)
But it does not make sense to reverse-engineer that; just ask the data provider, or if applicable read the information supplied with the data.
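If you want to sanity-check such a file yourself, here is a small sketch with geopandas (the file name is a placeholder) that compares the declared CRS against the actual coordinate range:

import geopandas as gpd

gdf = gpd.read_file("mystery.shp")
print(gdf.crs)  # what the .prj claims, e.g. EPSG:4326 (WGS84 decimal degrees)

minx, miny, maxx, maxy = gdf.total_bounds
# valid WGS84 degrees: -180 <= x <= 180 and -90 <= y <= 90;
# values like 1012532.749 / 178774.7655 fail this, so the .prj is wrong
if not (-180 <= minx <= maxx <= 180 and -90 <= miny <= maxy <= 90):
    print("coordinates are not WGS84 degrees; ask the provider for the real datum")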

British National Grid Shapefile - convert to WGS84 Lat/Lon

I have a set of ESRI shapefiles which, I'm told, have been written using the British National Grid coordinate system. I need to convert these files to WGS84 lat/lons, for onward conversion to KML files. I'm having trouble doing this as follows:
If I open each of the original files in MapInfo Professional telling it that my file has a projection which is British National Grid then I can't see any geographic objects in the file; the map window is completely empty.
If I use MapInfo Professional's Universal Translator to convert the files to a WGS84 MapInfo TAB file then, just as before, the converted file won't display any geographic objects; the MapInfo window is empty.
Can I verify the coordinate system of these files? Am I missing anything here? Should I be able to convert the shapefiles in the way I'm expecting to be able to and view them using MapInfo Professional? Will another tool do a better job for me?
Thanks.
More Info:
My shapefile has coordinates which don't seem to translate to lat/lon properly, and I'm now wondering if the coordinates aren't actually British National Grid. I'm seeing coordinates such as 383702523, 399081141, which appear to be approximately lat/lon 53.488182, -2.247153. Have you any idea what projection system my input file is in?
OS grid doesn't use WGS84 - it uses the Airy 1830 spheroid (OSGB36 datum).
So you need to go from OS grid -> lat/lon (OSGB36), then OSGB36 -> WGS84.
See http://www.ordnancesurvey.co.uk/oswebsite/gps/docs/convertingcoordinatesEN.pdf
To do OS grid to lat-lon see http://www.movable-type.co.uk/scripts/latlong-gridref.html
Then to go from OSGB36 -> WGS84 see http://www.movable-type.co.uk/scripts/latlong-convert-coords.html
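If you end up doing the conversion in code instead, here is a minimal sketch with pyproj (EPSG:27700 is the British National Grid on the OSGB36 datum, EPSG:4326 is WGS84; the sample easting/northing is the asker's value scaled down by 1000, which is only a guess about the units):

from pyproj import Transformer

# one transformer handles both steps: grid -> OSGB36 lat/lon -> WGS84
transformer = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

easting, northing = 383702.523, 399081.141
lon, lat = transformer.transform(easting, northing)
print(lat, lon)  # roughly 53.488, -2.247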
http://gothos.info/2009/04/14/transform-projections-with-gdal-ogr/
ogr2ogr is a great tool for doing these sorts of conversions. You would run it with a command like
ogr2ogr -t_srs EPSG:4326 map_wgs84.shp map_original.shp
-t_srs is the option to transform co-ordinate systems. 4326 is the EPSG SRID for WGS84.
