Folium Choropleth issues with key_on: the choropleth layer does not overlay the map - geojson

I have a strange issue with the following piece of code:
m10 = folium.Map(location=[41.9027835, 12.4963655], tiles='openstreetmap', zoom_start=5)
df.reset_index(inplace=True)
folium.Choropleth(
    geo_data=df.to_json(),
    data=df,
    columns=['TERRITORIO', var],
    key_on='feature.properties.TERRITORIO',
    fill_color='YlGnBu',
    fill_opacity=0.6,
    line_opacity=1,
    nan_fill_color='black',
    legend_name=get_title_(file_name),
    smooth_factor=0).add_to(m10)
folium.features.GeoJson(
    df,
    name='Labels',
    style_function=lambda x: {'color': 'transparent', 'fillColor': 'transparent', 'weight': 0},
    tooltip=folium.features.GeoJsonTooltip(
        fields=[var],
        aliases=[indicator],
        labels=True,
        sticky=False)
).add_to(m10)
I use the same piece of code on two different geodataframes. With the first (smaller) dataframe I have no issues.
However, when I try to do the same with the other one, I do not see the choropleth map layer.
This is the first dataset (after the reset of the index):
TERRITORIO ... geometry
0 Nord ... MULTIPOLYGON (((9.85086 44.02340, 9.85063 44.0...
1 Centro ... MULTIPOLYGON (((10.31417 42.35043, 10.31424 42...
2 Mezzogiorno ... MULTIPOLYGON (((8.41112 38.86296, 8.41127 38.8...
This is the second dataset (after the reset of the index)
TERRITORIO ... geometry
0 Abruzzo ... MULTIPOLYGON (((930273.425 4714737.743, 930147...
1 Basilicata ... MULTIPOLYGON (((1073851.435 4445828.604, 10738...
2 Calabria ... MULTIPOLYGON (((1083350.847 4416684.239, 10833...
3 Campania ... MULTIPOLYGON (((1037266.901 4449456.848, 10372...
4 Emilia-Romagna ... MULTIPOLYGON (((618335.211 4893983.160, 618329...
These are instead the JSON files:
first:
{"type": "FeatureCollection", "features": [{"id": "0", "type": "Feature", "properties": {"INDICATORE": "Densit\u00e0 di verde storico", "NOTE": null, "Shape_Area": 57926800546.7, "Shape_Leng": 2670893.51269, "TERRITORIO": "Nord", "UNITA_MISURA": "per 100 m2", "V_2004": null, "V_2005": null, "V_2006": null, "V_2007": null, "V_2008": null, "V_2009": null, "V_2010": null, "V_2011": 2.4, "V_2012": 2.4, "V_2013": 2.4, "V_2014": 2.4, "V_2015": 2.4, "V_2016": 2.4, "V_2017": 2.4, "V_2018": 2.4, "V_2019": null, "index": 0}, "geometry": {"type": "MultiPolygon", "coordinates": ...
second:
{"type": "FeatureCollection", "features": [{"id": "0", "type": "Feature", "properties": {"INDICATORE": "Densit\u00e0 e rilevanza del patrimonio museale", "NOTE": null, "Shape_Area": 10831496151.0, "Shape_Leng": 664538.009079, "TERRITORIO": "Abruzzo", "UNITA_MISURA": "per 100 km2", "V_2004": null, "V_2005": null, "V_2006": null, "V_2007": null, "V_2008": null, "V_2009": null, "V_2010": null, "V_2011": null, "V_2012": null, "V_2013": null, "V_2014": null, "V_2015": 0.22, "V_2016": null, "V_2017": 0.13, "V_2018": 0.11, "V_2019": null}, "geometry": {"type": "MultiPolygon", "coordinates":...
I really do not understand why one works and the other does not.
Do you have any suggestion?
Thank you in advance for your time!

I can't find any geometry for Nord, Centro, Mezzogiorno (Italy), so I have synthesized it by dissolving the regions' geometry.
I have set up the functions and variables used by your code to make this a MWE.
You can switch between the two geometries with the if True: flag below (regions == False, north/central/south == True); both generate appropriate folium maps.
It's clear from your question that your two sets of geometry use different CRS. The first data set looks like EPSG:4326 (hence it works). The second looks like a UTM CRS (points in meters, not degrees) that would need to be projected to EPSG:4326.
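A quick way to sanity-check which situation you are in, without any GIS dependency, is to test whether the coordinate values fit within geographic bounds. This is only a heuristic sketch (the function name is hypothetical); with geopandas the robust check is df.crs:

```python
def looks_like_lonlat(coords):
    """Heuristic: geographic (EPSG:4326) coordinates fit within
    [-180, 180] for x/longitude and [-90, 90] for y/latitude;
    projected coordinates (e.g. UTM) are in meters and fall far
    outside that range."""
    return all(-180 <= x <= 180 and -90 <= y <= 90 for x, y in coords)

# First dataset: degrees -> Leaflet/folium can place it on the basemap
print(looks_like_lonlat([(9.85086, 44.02340), (10.31417, 42.35043)]))  # True

# Second dataset: meters -> the layer silently lands far outside the view
print(looks_like_lonlat([(930273.425, 4714737.743)]))                  # False
```

When the check fails, reprojecting with df = df.to_crs(epsg=4326) before calling df.to_json() is the usual fix, as shown later in this thread.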
import folium
import geopandas as gpd
import numpy as np

# make SO code runnable, get some geometry and set columns / variables used by code
df = gpd.read_file("https://github.com/openpolis/geojson-italy/raw/master/geojson/limits_IT_regions.geojson").sort_values("reg_name")
df["var"] = np.random.randint(1, 10, len(df))
df["TERRITORIO"] = df["reg_istat_code_num"]
df["NCM"] = np.where(df["reg_istat_code_num"] < 9, "Nord", np.where(df["reg_istat_code_num"] < 15, "Centro", "Mezzogiorno"))
var = "var"
file_name = "regions"
indicator = "some number"

def get_title_(file_name):
    return file_name

# regions==False, north/central/south==True
if True:
    df = df.dissolve("NCM")
    file_name = "ncm"

# unchanged code
m10 = folium.Map(location=[41.9027835, 12.4963655], tiles="openstreetmap", zoom_start=5)
df.reset_index(inplace=True)
folium.Choropleth(
    geo_data=df.to_json(),
    data=df,
    columns=["TERRITORIO", var],
    key_on="feature.properties.TERRITORIO",
    fill_color="YlGnBu",
    fill_opacity=0.6,
    line_opacity=1,
    nan_fill_color="black",
    legend_name=get_title_(file_name),
    smooth_factor=0,
).add_to(m10)
folium.features.GeoJson(
    df,
    name="Labels",
    style_function=lambda x: {
        "color": "transparent",
        "fillColor": "transparent",
        "weight": 0,
    },
    tooltip=folium.features.GeoJsonTooltip(
        fields=[var], aliases=[indicator], labels=True, sticky=False
    ),
).add_to(m10)
m10

Modifying the CRS, I was able to overcome the above issue.
# Create the folium map
m10 = folium.Map(location=[41.9027835, 12.4963655], tiles='openstreetmap', zoom_start=5)
# Reproject the data to WGS84 before serializing to GeoJSON
df.to_crs(crs=4326, inplace=True)
df.reset_index(inplace=True)
folium.Choropleth(
    geo_data=df.to_json(),
    data=df,
    columns=['TERRITORIO', var],
    key_on='feature.properties.TERRITORIO',
    fill_color='YlGnBu',
    fill_opacity=0.6,
    line_opacity=1,
    nan_fill_color='black',
    legend_name=get_title_(file_name),
    smooth_factor=0).add_to(m10)

Telegraf splits input data into different outputs

I want to write a Telegraf config file which will:
Use the openweathermap input, or a custom http request result:
{
"fields": {
...
"humidity": 97,
"temperature": -11.34,
...
},
"name": "weather",
"tags": {...},
"timestamp": 1675786146
}
Split the result into two similar JSONs:
{
"sensorID": "owm",
"timestamp": 1675786146,
"value": 97,
"type": "humidity"
}
and
{
"sensorID": "owm",
"timestamp": 1675786146,
"value": -11.34,
"type": "temperature"
}
Send these JSONs into an MQTT queue.
Is this possible, or must I create two different configs and make two API calls?
I found the following configuration, which solves my problem:
[[outputs.mqtt]]
  servers = ["${MQTT_URL}", ]
  topic_prefix = "owm/data"
  data_format = "json"
  json_transformation = '{"sensorID":"owm","type":"temperature","value":fields.main_temp,"timestamp":timestamp}'

[[outputs.mqtt]]
  servers = ["${MQTT_URL}", ]
  topic_prefix = "owm/data"
  data_format = "json"
  json_transformation = '{"sensorID":"owm","type":"humidity","value":fields.main_humidity,"timestamp":timestamp}'

[[inputs.http]]
  urls = [
    "https://api.openweathermap.org/data/2.5/weather?lat={$LAT}&lon={$LON}2&appid=${API_KEY}&units=metric"
  ]
  data_format = "json"
Here we:
Retrieve data from OWM in the input plugin.
Transform the received data structure into the needed structure in two different output plugins, using the JSONata language (https://jsonata.org/) for this aim.
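The two json_transformation expressions above each carve one metric out of the flat Telegraf record. As a rough Python sketch of what each output plugin emits (the split_metric helper is illustrative, and the main_temp/main_humidity field names follow the config above):

```python
def split_metric(record, field, metric_type):
    """Reshape one Telegraf-style record into the per-sensor JSON the
    MQTT consumers expect, mirroring the JSONata expression."""
    return {
        "sensorID": "owm",
        "type": metric_type,
        "value": record["fields"][field],
        "timestamp": record["timestamp"],
    }

record = {
    "fields": {"main_humidity": 97, "main_temp": -11.34},
    "name": "weather",
    "timestamp": 1675786146,
}

print(split_metric(record, "main_temp", "temperature"))
print(split_metric(record, "main_humidity", "humidity"))
```

Because both output plugins see every input metric, a single API call feeds both MQTT messages; no second config or request is needed.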

Bug? Using a data transform with date/timeunit

For a while now I've been using a flatten transform to move data from a columnar data frame to a Vega Lite spec, and wonder if it's the source of a recent problem I've encountered with a time series. The time series spec is in a gist on the vega editor, along with a similar pattern plotting sin and cos that uses flatten without a problem.
As near as I can tell, the spx time series should work with this transform, and it does work if I flatten the data myself. I'm out of ideas; maybe there's something odd with flatten and date/times? I've tried every combination of type specification, timeUnit, etc., I can think of but nothing seems to work.
Can anyone spot an error in the spx time series gist?
For some reason, your dates are not getting parsed properly. It may be a bug, but you can easily work around it by using a calculate transform to convert your date to a proper date.
Editor
{
"transform": [
{"flatten": ["spxdate", "open", "high", "low", "close"]},
{"calculate": "toDate(datum.spxdate)", "as": "new"}
],
"data": {
"values": {
"spxdate": [
"1990-01-02",
"1990-01-03",
"1990-01-04",
"1990-01-05",
"1990-01-08",
"1990-01-09",
"1990-01-10",
"1990-01-11",
"1990-01-12",
"1990-01-15",
"1990-01-16",
"1990-01-17",
"1990-01-18",
"1990-01-19",
"1990-01-22",
"1990-01-23",
"1990-01-24",
"1990-01-25",
"1990-01-26",
"1990-01-29",
"1990-01-30",
"1990-01-31",
"1990-02-01",
"1990-02-02",
"1990-02-05",
"1990-02-06",
"1990-02-07",
"1990-02-08",
"1990-02-09",
"1990-02-12",
"1990-02-13",
"1990-02-14",
"1990-02-15",
"1990-02-16",
"1990-02-20",
"1990-02-21",
"1990-02-22",
"1990-02-23",
"1990-02-26",
"1990-02-27",
"1990-02-28",
"1990-03-01",
"1990-03-02",
"1990-03-05",
"1990-03-06",
"1990-03-07",
"1990-03-08",
"1990-03-09",
"1990-03-12",
"1990-03-13",
"1990-03-14",
"1990-03-15",
"1990-03-16",
"1990-03-19",
"1990-03-20",
"1990-03-21",
"1990-03-22",
"1990-03-23",
"1990-03-26",
"1990-03-27",
"1990-03-28",
"1990-03-29",
"1990-03-30",
"1990-04-02",
"1990-04-03",
"1990-04-04",
"1990-04-05",
"1990-04-06",
"1990-04-09",
"1990-04-10",
"1990-04-11",
"1990-04-12",
"1990-04-16",
"1990-04-17",
"1990-04-18",
"1990-04-19",
"1990-04-20",
"1990-04-23",
"1990-04-24",
"1990-04-25",
"1990-04-26",
"1990-04-27",
"1990-04-30",
"1990-05-01",
"1990-05-02",
"1990-05-03",
"1990-05-04",
"1990-05-07",
"1990-05-08",
"1990-05-09",
"1990-05-10",
"1990-05-11",
"1990-05-14",
"1990-05-15",
"1990-05-16",
"1990-05-17",
"1990-05-18",
"1990-05-21",
"1990-05-22",
"1990-05-23"
],
"open": [
353.3999938964844,
359.69000244140625,
358.760009765625,
355.6700134277344,
352.20001220703125,
353.8299865722656,
349.6199951171875,
347.30999755859375,
348.5299987792969,
339.92999267578125,
337,
340.7699890136719,
337.3999938964844,
338.19000244140625,
339.1400146484375,
330.3800048828125,
331.6099853515625,
330.260009765625,
326.0899963378906,
325.79998779296875,
325.20001220703125,
322.9800109863281,
329.0799865722656,
328.7900085449219,
330.9200134277344,
331.8500061035156,
329.6600036621094,
333.75,
333.0199890136719,
333.6199951171875,
330.0799865722656,
331.0199890136719,
332.010009765625,
334.8900146484375,
332.7200012207031,
327.9100036621094,
327.6700134277344,
325.70001220703125,
324.1600036621094,
328.67999267578125,
330.260009765625,
331.8900146484375,
332.739990234375,
335.5400085449219,
333.739990234375,
337.92999267578125,
336.95001220703125,
340.1199951171875,
337.92999267578125,
338.6700134277344,
336,
336.8699951171875,
338.07000732421875,
341.9100036621094,
343.5299987792969,
341.57000732421875,
339.739990234375,
335.69000244140625,
337.2200012207031,
337.6300048828125,
341.5,
342,
340.7900085449219,
339.94000244140625,
338.70001220703125,
343.6400146484375,
341.0899963378906,
340.7300109863281,
340.0799865722656,
341.3699951171875,
342.07000732421875,
341.9200134277344,
344.3399963378906,
344.739990234375,
344.67999267578125,
340.7200012207031,
338.0899963378906,
335.1199951171875,
331.04998779296875,
330.3599853515625,
332.0299987792969,
332.9200134277344,
329.1099853515625,
330.79998779296875,
332.25,
334.4800109863281,
335.5799865722656,
338.3900146484375,
340.5299987792969,
342.010009765625,
342.8699951171875,
343.82000732421875,
352,
354.75,
354.2699890136719,
354,
354.4700012207031,
354.6400146484375,
358,
358.42999267578125
],
"high": [
359.69000244140625,
360.5899963378906,
358.760009765625,
355.6700134277344,
354.239990234375,
354.1700134277344,
349.6199951171875,
350.1400146484375,
348.5299987792969,
339.94000244140625,
340.75,
342.010009765625,
338.3800048828125,
340.4800109863281,
339.9599914550781,
332.760009765625,
331.7099914550781,
332.3299865722656,
328.5799865722656,
327.30999755859375,
325.7300109863281,
329.0799865722656,
329.8599853515625,
332.1000061035156,
332.1600036621094,
331.8599853515625,
333.760009765625,
336.0899963378906,
334.6000061035156,
333.6199951171875,
331.6099853515625,
333.20001220703125,
335.2099914550781,
335.6400146484375,
332.7200012207031,
328.1700134277344,
330.9800109863281,
326.1499938964844,
328.6700134277344,
331.94000244140625,
333.4800109863281,
334.3999938964844,
335.5400085449219,
336.3800048828125,
337.92999267578125,
338.8399963378906,
340.6600036621094,
340.2699890136719,
339.0799865722656,
338.6700134277344,
337.6300048828125,
338.9100036621094,
341.9100036621094,
343.760009765625,
344.489990234375,
342.3399963378906,
339.7699890136719,
337.5799865722656,
339.739990234375,
341.5,
342.5799865722656,
342.07000732421875,
341.4100036621094,
339.94000244140625,
343.760009765625,
344.1199951171875,
342.8500061035156,
341.7300109863281,
341.8299865722656,
342.4100036621094,
343,
344.7900085449219,
347.29998779296875,
345.19000244140625,
345.3299865722656,
340.7200012207031,
338.5199890136719,
335.1199951171875,
332.9700012207031,
332.739990234375,
333.760009765625,
333.57000732421875,
331.30999755859375,
332.8299865722656,
334.4800109863281,
337.0199890136719,
338.4599914550781,
341.07000732421875,
342.0299987792969,
343.0799865722656,
344.9800109863281,
352.30999755859375,
358.4100036621094,
355.0899963378906,
354.67999267578125,
356.9200134277344,
354.6400146484375,
359.07000732421875,
360.5,
359.2900085449219
],
"low": [
351.9800109863281,
357.8900146484375,
352.8900146484375,
351.3500061035156,
350.5400085449219,
349.6099853515625,
344.32000732421875,
347.30999755859375,
339.489990234375,
336.57000732421875,
333.3699951171875,
336.260009765625,
333.9800109863281,
338.19000244140625,
330.2799987792969,
328.6700134277344,
324.1700134277344,
325.3299865722656,
321.44000244140625,
321.7900085449219,
319.8299865722656,
322.9800109863281,
327.760009765625,
328.0899963378906,
330.45001220703125,
328.20001220703125,
326.54998779296875,
332,
332.4100036621094,
329.9700012207031,
327.9200134277344,
330.6400146484375,
331.6099853515625,
332.4200134277344,
326.260009765625,
324.4700012207031,
325.70001220703125,
322.1000061035156,
323.9800109863281,
328.4700012207031,
330.1600036621094,
331.0799865722656,
332.7200012207031,
333.489990234375,
333.57000732421875,
336.3299865722656,
336.95001220703125,
336.8399963378906,
336.1400146484375,
335.3599853515625,
334.92999267578125,
336.8699951171875,
338.07000732421875,
339.1199951171875,
340.8699951171875,
339.55999755859375,
333.6199951171875,
335.69000244140625,
337.2200012207031,
337.0299987792969,
340.6000061035156,
339.7699890136719,
338.2099914550781,
336.3299865722656,
338.70001220703125,
340.3999938964844,
340.6300048828125,
338.94000244140625,
339.8800048828125,
340.6199951171875,
341.260009765625,
341.9100036621094,
344.1000061035156,
342.05999755859375,
340.1099853515625,
337.5899963378906,
333.4100036621094,
330.0899963378906,
329.7099914550781,
330.3599853515625,
330.6700134277344,
328.7099914550781,
327.760009765625,
330.79998779296875,
332.1499938964844,
334.4700012207031,
335.1700134277344,
338.1099853515625,
340.1700134277344,
340.8999938964844,
342.7699890136719,
343.82000732421875,
351.95001220703125,
352.8399963378906,
351.95001220703125,
354,
352.5199890136719,
353.7799987792969,
356.0899963378906,
356.989990234375
],
"close": [
359.69000244140625,
358.760009765625,
355.6700134277344,
352.20001220703125,
353.7900085449219,
349.6199951171875,
347.30999755859375,
348.5299987792969,
339.92999267578125,
337,
340.75,
337.3999938964844,
338.19000244140625,
339.1499938964844,
330.3800048828125,
331.6099853515625,
330.260009765625,
326.0799865722656,
325.79998779296875,
325.20001220703125,
322.9800109863281,
329.0799865722656,
328.7900085449219,
330.9200134277344,
331.8500061035156,
329.6600036621094,
333.75,
332.9599914550781,
333.6199951171875,
330.0799865722656,
331.0199890136719,
332.010009765625,
334.8900146484375,
332.7200012207031,
327.989990234375,
327.6700134277344,
325.70001220703125,
324.1499938964844,
328.6700134277344,
330.260009765625,
331.8900146484375,
332.739990234375,
335.5400085449219,
333.739990234375,
337.92999267578125,
336.95001220703125,
340.2699890136719,
337.92999267578125,
338.6700134277344,
336,
336.8699951171875,
338.07000732421875,
341.9100036621094,
343.5299987792969,
341.57000732421875,
339.739990234375,
335.69000244140625,
337.2200012207031,
337.6300048828125,
341.5,
342,
340.7900085449219,
339.94000244140625,
338.70001220703125,
343.6400146484375,
341.0899963378906,
340.7300109863281,
340.0799865722656,
341.3699951171875,
342.07000732421875,
341.9200134277344,
344.3399963378906,
344.739990234375,
344.67999267578125,
340.7200012207031,
338.0899963378906,
335.1199951171875,
331.04998779296875,
330.3599853515625,
332.0299987792969,
332.9200134277344,
329.1099853515625,
330.79998779296875,
332.25,
334.4800109863281,
335.57000732421875,
338.3900146484375,
340.5299987792969,
342.010009765625,
342.8599853515625,
343.82000732421875,
352,
354.75,
354.2799987792969,
354,
354.4700012207031,
354.6400146484375,
358,
358.42999267578125,
359.2900085449219
]
}
},
"$schema": "https://vega.github.io/schema/vega-lite/v5.json",
"height": 400,
"width": 600,
"mark": "line",
"encoding": {
"x": {"field": "new", "type": "temporal", "timeUnit": "yearmonthdate"},
"y": {"field": "close", "type": "quantitative"}
}
}

Is there a way to use OCR to extract specific data from a CAD technical drawing?

I'm trying to use OCR to extract only the base dimensions of a CAD model, but there are other associative dimensions that I don't need (like angles, length from baseline to hole, etc). Here is an example of a technical drawing. (The numbers in red circles are the base dimensions, the rest in purple highlights are the ones to ignore.) How can I tell my program to extract only the base dimensions (the height, length, and width of a block before it goes through the CNC)?
The issue is that the drawings I get are not in a specific format, so I can't tell the OCR where the dimensions are. It has to figure out on its own contextually.
Should I train the program through machine learning by running several iterations and correcting it? If so, what methods are there? The only thing I can think of is OpenCV cascade classifiers.
Or are there other methods to solving this problem?
Sorry for the long post. Thanks.
I feel you... it's a very tricky problem, and we spent the last 3 years finding a solution for it. Forgive me for mentioning our own solution, but it will certainly solve your problem: pip install werk24
from werk24 import Hook, W24AskVariantMeasures
from werk24.models.techread import W24TechreadMessage
from werk24.utils import w24_read_sync

from . import get_drawing_bytes  # define your own

def recv_measures(message: W24TechreadMessage) -> None:
    for cur_measure in message.payload_dict.get('measures'):
        print(cur_measure)

if __name__ == "__main__":
    # define what information you want to receive from the API
    # and what shall be done when the info is available.
    hooks = [Hook(ask=W24AskVariantMeasures(), function=recv_measures)]
    # submit the request to the Werk24 API
    w24_read_sync(get_drawing_bytes(), hooks)
For your example, it will return, among others, the following measure:
{
    "position": <STRIPPED>,
    "label": {
        "blurb": "ø30 H7 +0.0210/0",
        "quantity": 1,
        "size": {
            "blurb": "30",
            "size_type": "DIAMETER",
            "nominal_size": "30.0"
        },
        "unit": "MILLIMETER",
        "size_tolerance": {
            "toleration_type": "FIT_SIZE_ISO",
            "blurb": "H7",
            "deviation_lower": "0.0",
            "deviation_upper": "0.0210",
            "fundamental_deviation": "H",
            "tolerance_grade": {
                "grade": 7,
                "warnings": []
            },
            "thread": null,
            "chamfer": null,
            "depth": null,
            "test_dimension": null
        }
    },
    "warnings": [],
    "confidence": 0.98810
}
or for a GD&T
{
    "position": <STRIPPED>,
    "frame": {
        "blurb": "[⟂|0.05|A]",
        "characteristic": "⟂",
        "zone_shape": null,
        "zone_value": {
            "blurb": "0.05",
            "width_min": 0.05,
            "width_max": null,
            "extend_quantity": null,
            "extend_shape": null,
            "extend": null,
            "extend_angle": null
        },
        "zone_combinations": [],
        "zone_offset": null,
        "zone_constraint": null,
        "feature_filter": null,
        "feature_associated": null,
        "feature_derived": null,
        "reference_association": null,
        "reference_parameter": null,
        "material_condition": null,
        "state": null,
        "data": [
            {
                "blurb": "A"
            }
        ]
    }
}
Check the documentation on Werk24 for details.
Although it is a managed offering, Mixpeek is one free option:
pip install mixpeek
from mixpeek import Mixpeek

mix = Mixpeek(
    api_key="my-api-key"
)

mix.upload(file_name="design_spec.dwg", file_path="s3://design_spec_1.dwg")
This /upload endpoint will extract the contents of your DWG file, then when you search for terms it will include the file_path so you can render it in your HTML.
Behind the scenes it uses the open source LibreDWG library to run a number of AutoCAD native commands such as DATAEXTRACTION.
Now you can search for a term and the relevant DWG file (in addition to the context in which it exists) will be returned:
mix.search(query="retainer", include_context=True)
[
    {
        "file_id": "6377c98b3c4f239f17663d79",
        "filename": "design_spec.dwg",
        "context": [
            {
                "texts": [
                    {
                        "type": "text",
                        "value": "DV-34-"
                    },
                    {
                        "type": "hit",
                        "value": "RETAINER"
                    },
                    {
                        "type": "text",
                        "value": "."
                    }
                ]
            }
        ],
        "importance": "100%",
        "static_file_url": "s3://design_spec_1.dwg"
    }
]
More documentation here: https://docs.mixpeek.com/

How to draw waveform from waveformdata object in iOS Swift?

[
{
"id": "48250",
"created_at": "2014-07-06 13:05:10",
"user_id": "7",
"duration": "7376",
"permalink": "shawne-back-to-the-roots-2-05072014",
"description": "Years: 2000 - 2005\r\nSet Time: Warm Up (11 pm - 01 am)\r\n",
"downloadable": "1",
"genre": "Drum & Bass",
"genre_slush": "drumandbass",
"title": "Shawne # Back To The Roots 2 (05.07.2014)",
"uri": "https:\/\/api-v2.hearthis.at\/\/shawne-back-to-the-roots-2-05072014\/",
"permalink_url": "http:\/\/hearthis.at\/\/shawne-back-to-the-roots-2-05072014\/",
"artwork_url": "http:\/\/hearthis.at\/_\/cache\/images\/track\/500\/801982cafc20a06ccf6203f21f10c08d_w500.png",
"background_url": "",
"waveform_data": "http:\/\/hearthis.at\/_\/wave_data\/7\/3000_4382f398c454c47cf171aab674cf00f0.mp3.js",
"waveform_url": "http:\/\/hearthis.at\/_\/wave_image\/7\/4382f398c454c47cf171aab674cf00f0.mp3.png",
"user": {
"id": "7",
"permalink": "shawne",
"username": "Shawne (hearthis.at)",
"uri": "https:\/\/api-v2.hearthis.at\/shawne\/",
"permalink_url": "http:\/\/hearthis.at\/shawne\/",
"avatar_url": "http:\/\/hearthis.at\/_\/cache\/images\/user\/512\/06a8299b0e7d8f2909a22697badd7c09_w512.jpg"
},
"stream_url": "http:\/\/hearthis.at\/shawne\/shawne-back-to-the-roots-2-05072014\/listen\/",
"download_url": "http:\/\/hearthis.at\/shawne\/shawne-back-to-the-roots-2-05072014\/download\/",
"playback_count": "75",
"download_count": "9",
"favoritings_count": "7",
"favorited": false,
"comment_count": "0"
}
]
This API returns a waveform URL and waveform data. How do I convert the waveform data to draw a waveform similar to the image at the waveform URL?
API - https://hearthis.at/api-v2/
It looks like the "data" is merely a succession of bar heights:
[136,132,133,133,138,...]
So just draw a succession of bars at those heights (or at heights proportional to them). You might need to draw just every nth bar, or average each clump of n bars together, in order to get a neater representation (that is what they do at the site you pointed to).
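The "average each clump of n bars" idea can be sketched like this (plain Python rather than Swift, and the helper name is made up; the same loop translates directly to a Swift Array reduction before you draw the bars):

```python
def downsample_bars(heights, n):
    """Average each run of n consecutive bar heights so a long
    waveform fits a narrower view; a short trailing clump is
    averaged over its actual length."""
    return [
        sum(heights[i:i + n]) / len(heights[i:i + n])
        for i in range(0, len(heights), n)
    ]

bars = [136, 132, 133, 133, 138, 140]
print(downsample_bars(bars, 2))  # [134.0, 133.0, 139.0]
```

Each resulting value then becomes the height of one drawn bar, scaled to your view's height.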

How to transform a point with latitude and longitude value to a coordinates in highcharts map?

Highmaps metadata has an option 'hc-transform'; can someone explain how to use it?
I used the default China map geodata in Highmaps, but now I want to add a new point, for example [lat, lon] = [121, 23]. I want to know how I can transform it to a coordinate.
"crs": {
"type": "name",
"properties": {
"name": "urn:ogc:def:crs:EPSG:3415"
}
},
"hc-transform": {
"cn-all": {
"xpan": 350,
"ypan": 25,
"hitZone": {
"type": "Polygon",
"coordinates": [
[
[7902,3046],
[7929,3041],
[7947,3014],
[7915,2972]
]
]
},
"crs": "+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000 +y_0=-100000 +ellps=airy +toWGS84=446.448,-125.157,542.06,0.15,0.247,0.842,-20.489 +units=m +vunits=m +no_defs",
"scale": 0.000129831107685,
"jsonres": 15.5,
"jsonmarginX": -999,
"jsonmarginY": 9851.0,
"xoffset": -3139937.49309,
"yoffset": 4358972.7486
}
}
Here is what I do.
var transform = Highcharts.maps['countries/cn/cn-all']['hc-transform']['default'];
var position = $(".chart").highcharts().transformFromLatLon({ lat: 121, lon: 23 }, transform);
console.log(position); // {x:NaN ,y:NaN}
Am I wrong in the 'hc-transform' setting? How should I handle this case?
The 'hc-transform' setting is right.
Please try switching lat and lon, like this:
$(".chart").highcharts().transformFromLatLon({ lat: 23, lon: 121 }, transform);
Then x and y come out.
