Why is all incoming SMS/phone call data repeated in the Twilio request?

I'm seeing all the pertinent data for an incoming phone call or SMS repeated when I receive requests from Twilio to handle the communication:
{
AccountSid: [ 'xxx', 'xxx' ],
ToZip: [ '30680', '30680' ],
FromState: [ 'NY', 'NY' ],
Called: [ '+111', '+111' ],
FromCountry: [ 'US', 'US' ],
CallerCountry: [ 'US', 'US' ],
CalledZip: [ '30680', '30680' ],
Direction: [ 'inbound', 'inbound' ],
FromCity: [ 'NEW YORK', 'NEW YORK' ],
CalledCountry: [ 'US', 'US' ],
CallerState: [ 'NY', 'NY' ],
CallSid: [ 'xxx', 'xxx' ],
CalledState: [ 'GA', 'GA' ],
From: [ '+222', '+222' ],
CallerZip: [ '10028', '10028' ],
FromZip: [ '10028', '10028' ],
ApplicationSid: [ 'xxx', 'xxx' ],
CallStatus: [ 'ringing', 'ringing' ],
ToCity: [ 'STATHAM', 'STATHAM' ],
ToState: [ 'GA', 'GA' ],
To: [ '+111', '+111' ],
ToCountry: [ 'US', 'US' ],
CallerCity: [ 'NEW YORK', 'NEW YORK' ],
ApiVersion: [ '2010-04-01', '2010-04-01' ],
Caller: [ '+222', '+222' ],
CalledCity: [ 'STATHAM', 'STATHAM' ]
}

This was happening because my web server redirects all HTTP traffic to HTTPS. The redirect causes Twilio to send the information (like the To and From numbers) twice, so each parameter arrives as a two-element array. Configuring the webhook with the HTTPS URL directly avoids the redirect. I hope this saves someone the trouble of figuring this out in the future.

Related

Parsing Quoted Strings and DateTime Offset - GROK and Logstash

With the Grok Debugger I am trying to parse some custom data:
1 1 "Device 1" 1 "Input 1" 0 "On" "Off" "2020-01-01T00:00:00.1124303+00:00"
So far I have:
%{INT:id} %{INT:device} %{QUOTEDSTRING:device_name} %{INT:input}
%{QUOTEDSTRING:input_name} %{INT:state} %{QUOTEDSTRING:on_phrase}
%{QUOTEDSTRING:off_phrase} \"%{TIMESTAMP_ISO8601:when}\"
However, I am getting artifacts such as doubled quotes around the strings from %{QUOTEDSTRING}, and two sets of hours and minutes in the date/time from %{TIMESTAMP_ISO8601:when}:
{
  "id": [ [ "1" ] ],
  "device": [ [ "1" ] ],
  "device_name": [ [ ""Device 1"" ] ],
  "input": [ [ "1" ] ],
  "input_name": [ [ ""Input 1"" ] ],
  "state": [ [ "0" ] ],
  "on_phrase": [ [ ""On"" ] ],
  "off_phrase": [ [ ""Off"" ] ],
  "when": [ [ "2020-01-01T00:00:00.1124303+00:00" ] ],
  "YEAR": [ [ "2020" ] ],
  "MONTHNUM": [ [ "01" ] ],
  "MONTHDAY": [ [ "01" ] ],
  "HOUR": [ [ "00", "00" ] ],
  "MINUTE": [ [ "00", "00" ] ],
  "SECOND": [ [ "00.1124303" ] ],
  "ISO8601_TIMEZONE": [ [ "+00:00" ] ]
}
Also, I am a little stuck when it comes to the logstash.conf, as I am not sure what I should put as the index in the output. The following code is from a previous example on GitHub:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    manage_template => false
    index => "sample-%{+YYYY.MM.dd}"
  }
}
I'm guessing mine would look something like this:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{INT:id} %{INT:device} %{QUOTEDSTRING:device_name} %{INT:input} %{QUOTEDSTRING:input_name} %{INT:state} %{QUOTEDSTRING:on_phrase} %{QUOTEDSTRING:off_phrase} \"%{TIMESTAMP_ISO8601:when}\"" }
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    manage_template => false
    index => "sample-%{????????}"
  }
}
Again, I'm unclear as to what I am supposed to do with "sample-%{????????}".
In regard to the doubled double quotes: just use DATA instead of QUOTEDSTRING:
"%{DATA:device_name}"
The duplicated entries for the hours and minutes come from the timezone: the first entry is the actual hour, the second one is the hour of the timezone offset. The same goes for the minutes.
To get rid of them you would need a custom pattern:
"(?<when>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?(?<ISO8601_TIMEZONE>Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)"
(if you are not interested in parsing the timestamp at all, just use DATA again).
So, your pattern might look like this:
%{INT:id} %{INT:device} "%{DATA:device_name}" %{INT:input} "%{DATA:input_name}" %{INT:state} "%{DATA:on_phrase}" "%{DATA:off_phrase}" "(?<when>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?(?<ISO8601_TIMEZONE>Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)"
Regarding the index:
you can omit it completely, in which case the default logstash-%{+YYYY.MM.dd} is used
you can use sample-%{+YYYY.MM.dd} if you want a separate index for each day
you can use sample- to have just one index
you can use any other combination of the fields in your index pattern

Highcharts not displaying series data

I have time-series data which I am trying to display with Highstock:
Here is the data:
{
  "title": {
    "text": "My Graph"
  },
  "series": [
    [
      {
        "name": "Future Index Longs",
        "data": [
          [ "2019-02-05", 104516 ],
          [ "2019-02-06", 127260 ],
          [ "2019-02-07", 156291 ],
          [ "2019-02-08", 167567 ]
        ]
      }
    ],
    [
      {
        "name": "Future Index Longs",
        "data": [
          [ "2019-02-05", 21 ],
          [ "2019-02-06", 0 ],
          [ "2019-02-07", 1263 ],
          [ "2019-02-08", 12 ]
        ]
      }
    ],
    [
      {
        "name": "Future Index Longs",
        "data": [
          [ "2019-02-05", 33873 ],
          [ "2019-02-06", 61093 ],
          [ "2019-02-07", 43125 ],
          [ "2019-02-08", 41928 ]
        ]
      }
    ],
    [
      {
        "name": "Future Index Longs",
        "data": [
          [ "2019-02-05", 47542 ],
          [ "2019-02-06", 55084 ],
          [ "2019-02-07", 75256 ],
          [ "2019-02-08", 77786 ]
        ]
      }
    ],
    [
      {
        "name": "Future Index Longs",
        "data": [
          [ "2019-02-05", 185952 ],
          [ "2019-02-06", 243437 ],
          [ "2019-02-07", 275935 ],
          [ "2019-02-08", 287293 ]
        ]
      }
    ]
  ]
}
The graph is empty and no data is displayed. What am I doing wrong?
Your series has the wrong format: it is an array of arrays, but it should be a flat array of series objects.
Like this: series: [{ ... }, { ... }]
Check this fiddle: https://jsfiddle.net/wg1vnyzp/1/
To have a chart with datetime axes in Highcharts you have to pass each X value as a timestamp in milliseconds since 1970, not as a date string.
Highstock example:
https://jsfiddle.net/BlackLabel/f0rsz6cd/1/
Note that in Highcharts you have to define xAxis.type as datetime, like this:
xAxis: {
type: 'datetime'
}
Highcharts demo:
https://jsfiddle.net/BlackLabel/kas2oywp/
API reference:
https://api.highcharts.com/highcharts/series.line.data.x
https://api.highcharts.com/highcharts/xAxis.type

Problem using Codable to parse nested JSON data

I'm trying to use Codable to parse complex JSON data, but I have a problem with the "route_polyline" field.
Problem:
CodingKeys(stringValue: "route_polyline", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "Expected to decode Dictionary but found an array instead.
Any help? I want to use Codable and not JSONSerialization.
My model:
struct RouteFareResponse: Decodable {
    let status: String
    let trip_cost: Trip_cost
}

struct Trip_cost: Decodable {
    let start_address: String
    let end_address: String
    let trip_time: Int
    let trip_distance: String
    let transmit_cost: Int
    let cost_in_zone: Int
    let outdoor_zone_cost: Int
    let sub_zones_cost: Int
    let perform_cost: Int
    let services_cost: Int
    let min_cost: Int
    let discount_by_promo: Int
    let result_trip_cost: Int
    let trip_cost: Int
    let trip_cost_with_discount: Int
    let route_polyline: [Polyline]
    let _debug_sub_zones_info: String // Dict
    let _debug: [Debug]
}

struct Polyline: Decodable {
    var lat: Double
    var long: Double

    enum PolylineKeys: String, CodingKey {
        case route_polyline
    }

    init(from decoder: Decoder) throws {
        let values = try decoder.container(keyedBy: PolylineKeys.self)
        var route_polyline = try values.nestedUnkeyedContainer(forKey: .route_polyline)
        var longLat = try route_polyline.nestedUnkeyedContainer()
        long = try longLat.decode(Double.self)
        lat = try longLat.decode(Double.self)
    }
}

struct Debug: Codable {
    let trip_distance: Int
    let trip_time: Int
    let zone_distance: Int
    let zone_time: Int
    let out_zone_distance: Int
    let out_zone_time: Int
    // let transmit_distance: NSNull
    // let transmit_time: NSNull
    let tariff_price_prefix: String
}
JSON Response:
{
"status": "success",
"trip_cost": {
"start_address": "2-й Павловский пер., 18, Москва, Россия, 115093",
"end_address": "Piața Marii Adunări Naționale, Bulevardul Ștefan cel Mare și Sfînt, Chișinău, Молдавия",
"trip_time": 934,
"trip_distance": "1,314.52",
"transmit_cost": 0,
"cost_in_zone": 0,
"outdoor_zone_cost": 0,
"sub_zones_cost": 0,
"perform_cost": 0,
"services_cost": 0,
"min_cost": 0,
"discount_by_promo": 0,
"result_trip_cost": 0,
"trip_cost": 0,
"trip_cost_with_discount": 0,
"route_polyline": [
[
55.71801,
37.62929
],
[
55.71376,
37.63317
],
[
55.71108,
37.62459
],
[
55.70636,
37.62239
],
[
55.70134,
37.608
],
[
55.7115,
37.58071
],
[
55.67312,
37.52316
],
[
55.63334,
37.44638
],
[
55.54088,
37.08855
],
[
55.51861,
36.99843
],
[
55.47076,
36.92839
],
[
55.36682,
36.75983
],
[
55.33846,
36.72214
],
[
55.32199,
36.70649
],
[
55.2975,
36.70001
],
[
55.2369,
36.68447
],
[
55.17533,
36.67373
],
[
55.07732,
36.62831
],
[
55.04849,
36.60693
],
[
55.0127,
36.54831
],
[
54.97539,
36.48667
],
[
54.92141,
36.39975
],
[
54.90194,
36.36941
],
[
54.8747,
36.35157
],
[
54.83675,
36.32096
],
[
54.77415,
36.23814
],
[
54.73767,
36.18605
],
[
54.71318,
36.16446
],
[
54.64494,
36.13463
],
[
54.62446,
36.12338
],
[
54.59645,
36.08795
],
[
54.54708,
36.01808
],
[
54.5191,
35.97127
],
[
54.47491,
35.88614
],
[
54.40394,
35.73519
],
[
54.36474,
35.62843
],
[
54.32196,
35.50525
],
[
54.24807,
35.43855
],
[
54.18797,
35.34998
],
[
54.15873,
35.29848
],
[
54.14077,
35.28455
],
[
54.09443,
35.2272
],
[
54.04353,
35.14278
],
[
53.99988,
35.08885
],
[
53.95291,
35.02331
],
[
53.89699,
34.94324
],
[
53.78786,
34.82704
],
[
53.76976,
34.81073
],
[
53.74931,
34.80781
],
[
53.71795,
34.80157
],
[
53.66801,
34.77262
],
[
53.60077,
34.7352
],
[
53.55531,
34.70988
],
[
53.4788,
34.67168
],
[
53.45724,
34.66608
],
[
53.4438,
34.67235
],
[
53.41763,
34.66852
],
[
53.38279,
34.65954
],
[
53.3233,
34.64507
],
[
53.21967,
34.61469
],
[
53.01306,
34.5253
],
[
52.94232,
34.5251
],
[
52.84383,
34.52932
],
[
52.79696,
34.53675
],
[
52.7599,
34.5311
],
[
52.71971,
34.53223
],
[
52.66035,
34.53044
],
[
52.64336,
34.51429
],
[
52.6047,
34.51525
],
[
52.43118,
34.50557
],
[
52.38021,
34.50342
],
[
52.35699,
34.49416
],
[
52.2817,
34.46901
],
[
52.2076,
34.45348
],
[
52.17831,
34.45381
],
[
52.13737,
34.47217
],
[
52.03141,
34.47033
],
[
51.93017,
34.49023
],
[
51.91541,
34.44451
],
[
51.89897,
34.38614
],
[
51.86806,
34.30569
],
[
51.82,
34.15201
],
[
51.78168,
34.02024
],
[
51.73943,
33.92501
],
[
51.70179,
33.83608
],
[
51.65752,
33.70137
],
[
51.54035,
33.4268
],
[
51.52512,
33.38398
],
[
51.4958,
33.3356
],
[
51.44937,
33.23951
],
[
51.42716,
33.16679
],
[
51.40182,
33.06367
],
[
51.3634,
32.98993
],
[
51.34672,
32.91794
],
[
51.32141,
32.87128
],
[
51.2664,
32.6516
],
[
51.23106,
32.48914
],
[
51.20507,
32.29943
],
[
51.16934,
32.07204
],
[
51.1508,
31.97069
],
[
51.13949,
31.8706
],
[
51.13728,
31.70108
],
[
51.13263,
31.62517
],
[
51.1191,
31.52805
],
[
51.08803,
31.30808
],
[
51.06756,
31.15402
],
[
50.99689,
31.13589
],
[
50.95574,
31.12393
],
[
50.93721,
31.1177
],
[
50.92171,
31.09714
],
[
50.90721,
31.08228
],
[
50.8909,
31.07671
],
[
50.84758,
31.043
],
[
50.7346,
30.95449
],
[
50.52917,
30.79495
],
[
50.4994,
30.77167
],
[
50.48368,
30.71295
],
[
50.46354,
30.63963
],
[
50.46132,
30.6289
],
[
50.45511,
30.63293
],
[
50.43845,
30.6127
],
[
50.42376,
30.5699
],
[
50.42164,
30.55162
],
[
50.40604,
30.51815
],
[
50.39558,
30.50879
],
[
50.38195,
30.47954
],
[
50.32586,
30.39378
],
[
50.29453,
30.35951
],
[
50.26486,
30.33643
],
[
50.25778,
30.31004
],
[
50.24634,
30.28063
],
[
50.22903,
30.26774
],
[
50.18994,
30.2224
],
[
50.18091,
30.21798
],
[
50.16218,
30.2255
],
[
50.14369,
30.23776
],
[
50.13303,
30.23166
],
[
50.11478,
30.23418
],
[
50.08751,
30.23467
],
[
50.05339,
30.21647
],
[
50.03091,
30.21478
],
[
50.00816,
30.2016
],
[
49.95806,
30.17323
],
[
49.9348,
30.18623
],
[
49.92008,
30.19057
],
[
49.89046,
30.17741
],
[
49.83993,
30.15933
],
[
49.80218,
30.19935
],
[
49.75538,
30.19921
],
[
49.69172,
30.19481
],
[
49.56323,
30.17288
],
[
49.50329,
30.16781
],
[
49.459,
30.15574
],
[
49.41405,
30.12468
],
[
49.37424,
30.11152
],
[
49.33214,
30.10614
],
[
49.23068,
30.08139
],
[
49.20557,
30.07636
],
[
49.17632,
30.08126
],
[
49.14088,
30.08948
],
[
49.13227,
30.10026
],
[
49.12034,
30.14063
],
[
49.11067,
30.14782
],
[
49.05454,
30.15435
],
[
49.01042,
30.16103
],
[
48.98754,
30.16486
],
[
48.97725,
30.17264
],
[
48.92938,
30.23371
],
[
48.91232,
30.24705
],
[
48.8892,
30.25526
],
[
48.86788,
30.25988
],
[
48.8302,
30.25662
],
[
48.77143,
30.25854
],
[
48.72721,
30.25507
],
[
48.6695,
30.23987
],
[
48.56247,
30.23094
],
[
48.50009,
30.22887
],
[
48.41243,
30.24079
],
[
48.213,
30.28787
],
[
48.15789,
30.3029
],
[
48.13357,
30.30171
],
[
48.05351,
30.29445
],
[
47.96978,
30.30973
],
[
47.91599,
30.31322
],
[
47.85884,
30.30052
],
[
47.8276,
30.29059
],
[
47.79247,
30.2623
],
[
47.75315,
30.27486
],
[
47.7404,
30.27392
],
[
47.72084,
30.21455
],
[
47.70569,
30.12985
],
[
47.69383,
30.0688
],
[
47.67141,
29.99311
],
[
47.65685,
29.89882
],
[
47.63792,
29.83431
],
[
47.62534,
29.80075
],
[
47.62718,
29.74939
],
[
47.63537,
29.68184
],
[
47.58942,
29.57482
],
[
47.55006,
29.5017
],
[
47.49814,
29.45195
],
[
47.45969,
29.41467
],
[
47.4303,
29.37051
],
[
47.33885,
29.24453
],
[
47.32377,
29.22576
],
[
47.30789,
29.20682
],
[
47.27122,
29.20243
],
[
47.25495,
29.19737
],
[
47.2455,
29.18709
],
[
47.23985,
29.17093
],
[
47.22769,
29.13168
],
[
47.1724,
29.01439
],
[
47.12864,
28.91717
],
[
47.11337,
28.86019
],
[
47.10136,
28.86359
],
[
47.0479,
28.84916
],
[
47.03916,
28.85336
],
[
47.02463,
28.83238
]
],
"_debug_sub_zones_info": {},
"_debug": {
"trip_distance": 1314516,
"trip_time": 56015,
"zone_distance": 10385.483746984,
"zone_time": 560,
"out_zone_distance": 1297603.4600007,
"out_zone_time": 55455,
"transmit_distance": null,
"transmit_time": null,
"tariff_price_prefix": "night_"
}
}
}
I found a similar problem, but the solution in that topic only works for the first couple of coordinates.
The value of route_polyline is a nested array; each inner element is a plain two-element array, so it has to be decoded directly into lat/long with an unkeyed container, not through a keyed container.
Change the Polyline struct to:
struct Polyline: Decodable {
    let lat: Double
    let long: Double

    init(from decoder: Decoder) throws {
        // Each route_polyline element is a bare [Double] pair, so grab the
        // unkeyed container directly. In the sample data the latitude comes
        // first (55.71801 is Moscow's latitude), then the longitude.
        var coordinates = try decoder.unkeyedContainer()
        lat = try coordinates.decode(Double.self)
        long = try coordinates.decode(Double.self)
    }
}
Please avoid snake_case property names. You can decode those keys to lowerCamelCase properties with the .convertFromSnakeCase key decoding strategy.
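A minimal usage sketch (an editorial addition, not from the original thread), assuming jsonData holds the JSON response above. Note that the model's _debug_sub_zones_info (declared String) and _debug (declared [Debug]) don't match the JSON either, where both values are objects, so drop or fix those properties before decoding:

// Decoding sketch. Assumes `jsonData: Data` contains the JSON response
// above, the model structs from the question (with the Polyline above
// swapped in), and that the two _debug properties are dropped or fixed.
let decoder = JSONDecoder()
do {
    let response = try decoder.decode(RouteFareResponse.self, from: jsonData)
    print(response.trip_cost.route_polyline.count)            // number of route points
    print(response.trip_cost.route_polyline.first?.lat ?? 0)  // 55.71801
} catch {
    print("Decoding failed:", error)
}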

How do I create long/big dictionaries in Swift? Error: Expression was too complex to be solved in reasonable time

So I have this piece of code (fieldKey is a String):
var request = [
    "size": 0,
    "aggs": [
        fieldKey: [
            "global": [],
            "aggs": [
                "global": [
                    "aggs": [
                        "facet": [
                            "nested": [
                                "path": "tags"
                            ],
                            "aggs": [
                                "bar": [
                                    "filter": [
                                        "match": [
                                            "tags.name": fieldKey
                                        ]
                                    ]
                                ],
                                "aggs": [
                                    "filtered": [
                                        "terms": [
                                            "field": "tags.name"
                                        ],
                                        "aggs": [
                                            "values": [
                                                "terms": [
                                                    "field": "tags.value.raw",
                                                    "min_doc_count": 1
                                                ]
                                            ]
                                        ]
                                    ]
                                ]
                            ]
                        ]
                    ]
                ]
            ]
        ]
    ]
]
I'm trying to create a JSON request for an Elasticsearch server.
I get the "Expression was too complex to be solved in reasonable time; consider breaking up the expression into distinct sub-expressions" error.
So I tried this instead:
var request = [
    "size": 0,
    "aggs": [String: AnyObject]()
]
request["aggs"]![fieldKey] = [
    fieldKey: [
        "global": [],
        "aggs": [
            "global": [
                "aggs": [
                    "facet": [
                        "nested": [
                            "path": "tags"
                        ],
                        "aggs": [
                            "bar": [
                                "filter": [
                                    "match": [
                                        "tags.name": fieldKey
                                    ]
                                ]
                            ],
                            "aggs": [
                                "filtered": [
                                    "terms": [
                                        "field": "tags.name"
                                    ],
                                    "aggs": [
                                        "values": [
                                            "terms": [
                                                "field": "tags.value.raw",
                                                "min_doc_count": 1
                                            ]
                                        ]
                                    ]
                                ]
                            ]
                        ]
                    ]
                ]
            ]
        ]
    ]
]
But now I get the "Cannot assign to immutable expression of type 'AnyObject?!'" error, even though I clearly used var when creating the request. Does anyone know how to solve this? Is there a better way of creating such long dictionaries/JSON payloads? Thanks.
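No answer was posted for this one, but as a hedged sketch (an editorial addition, not from the thread): annotating each sub-dictionary with an explicit [String: Any] type gives the type checker small, concrete expressions, and building the whole value before assigning it avoids writing through a dictionary subscript, which yields an immutable copy:

// Hedged sketch: break the literal into explicitly typed pieces so the
// type checker never has to infer one giant heterogeneous expression.
let fieldKey = "someField" // assumption: any tag name (a String, per the question)

let terms: [String: Any] = ["field": "tags.value.raw", "min_doc_count": 1]
let filtered: [String: Any] = [
    "terms": ["field": "tags.name"],
    "aggs": ["values": ["terms": terms]]
]
let bar: [String: Any] = [
    "filter": ["match": ["tags.name": fieldKey]]
]
let facet: [String: Any] = [
    "nested": ["path": "tags"],
    "aggs": ["bar": bar, "aggs": ["filtered": filtered]]
]
let fieldAggs: [String: Any] = [
    "global": [String: Any](), // explicit type: a bare [:] won't compile here
    "aggs": ["global": ["aggs": ["facet": facet]]]
]
var request: [String: Any] = ["size": 0]
// Assign a fully built value; subscripting with `request["aggs"]![...]`
// returns a copy, which is why the original assignment failed.
request["aggs"] = [fieldKey: fieldAggs]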

Why is the App Store not accepting my routing app coverage file?

After I upload the Aus.geojson file (routing app coverage file) I get the following error:
"JSON file you uploaded was invalid. The file must contain only one element of type multipolygon." Below is the JSON file I was submitting, in which I see only one MultiPolygon element. Why am I getting the error?
{
"type": "FeatureCollection",
"features": [
{ "type": "Feature", "properties": { "name": "Australia" }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 145.397978, -40.792549 ], [ 146.364121, -41.137695 ], [ 146.908584, -41.000546 ], [ 147.689259, -40.808258 ], [ 148.289068, -40.875438 ], [ 148.359865, -42.062445 ], [ 148.017301, -42.407024 ], [ 147.914052, -43.211522 ], [ 147.564564, -42.937689 ], [ 146.870343, -43.634597 ], [ 146.663327, -43.580854 ], [ 146.048378, -43.549745 ], [ 145.431930, -42.693776 ], [ 145.295090, -42.033610 ], [ 144.718071, -41.162552 ], [ 144.743755, -40.703975 ], [ 145.397978, -40.792549 ] ] ], [ [ [ 143.561811, -13.763656 ], [ 143.922099, -14.548311 ], [ 144.563714, -14.171176 ], [ 144.894908, -14.594458 ], [ 145.374724, -14.984976 ], [ 145.271991, -15.428205 ], [ 145.485260, -16.285672 ], [ 145.637033, -16.784918 ], [ 145.888904, -16.906926 ], [ 146.160309, -17.761655 ], [ 146.063674, -18.280073 ], [ 146.387478, -18.958274 ], [ 147.471082, -19.480723 ], [ 148.177602, -19.955939 ], [ 148.848414, -20.391210 ], [ 148.717465, -20.633469 ], [ 149.289420, -21.260511 ], [ 149.678337, -22.342512 ], [ 150.077382, -22.122784 ], [ 150.482939, -22.556142 ], [ 150.727265, -22.402405 ], [ 150.899554, -23.462237 ], [ 151.609175, -24.076256 ], [ 152.073540, -24.457887 ], [ 152.855197, -25.267501 ], [ 153.136162, -26.071173 ], [ 153.161949, -26.641319 ], [ 153.092909, -27.260300 ], [ 153.569469, -28.110067 ], [ 153.512108, -28.995077 ], [ 153.339095, -29.458202 ], [ 153.069241, -30.350240 ], [ 153.089602, -30.923642 ], [ 152.891578, -31.640446 ], [ 152.450002, -32.550003 ], [ 151.709117, -33.041342 ], [ 151.343972, -33.816023 ], [ 151.010555, -34.310360 ], [ 150.714139, -35.173460 ], [ 150.328220, -35.671879 ], [ 150.075212, -36.420206 ], [ 149.946124, -37.109052 ], [ 149.997284, -37.425261 ], [ 149.423882, -37.772681 ], [ 148.304622, -37.809061 ], [ 147.381733, -38.219217 ], [ 146.922123, -38.606532 ], [ 146.317922, -39.035757 ], [ 145.489652, -38.593768 ], [ 144.876976, -38.417448 ], [ 145.032212, -37.896188 ], [ 144.485682, -38.085324 ], [ 143.609974, -38.809465 ], [ 142.745427, -38.538268 ], [ 142.178330, -38.380034 ], [ 141.606582, -38.308514 ], [ 140.638579, -38.019333 ], [ 139.992158, -37.402936 ], [ 139.806588, -36.643603 ], [ 139.574148, -36.138362 ], [ 139.082808, -35.732754 ], [ 138.120748, -35.612296 ], [ 138.449462, -35.127261 ], [ 138.207564, -34.384723 ], [ 137.719170, -35.076825 ], [ 136.829406, -35.260535 ], [ 137.352371, -34.707339 ], [ 137.503886, -34.130268 ], [ 137.890116, -33.640479 ], [ 137.810328, -32.900007 ], [ 136.996837, -33.752771 ], [ 136.372069, -34.094766 ], [ 135.989043, -34.890118 ], [ 135.208213, -34.478670 ], [ 135.239218, -33.947953 ], [ 134.613417, -33.222778 ], [ 134.085904, -32.848072 ], [ 134.273903, -32.617234 ], [ 132.990777, -32.011224 ], [ 132.288081, -31.982647 ], [ 131.326331, -31.495803 ], [ 129.535794, -31.590423 ], [ 128.240938, -31.948489 ], [ 127.102867, -32.282267 ], [ 126.148714, -32.215966 ], [ 125.088623, -32.728751 ], [ 124.221648, -32.959487 ], [ 124.028947, -33.483847 ], [ 123.659667, -33.890179 ], [ 122.811036, -33.914467 ], [ 122.183064, -34.003402 ], [ 121.299191, -33.821036 ], [ 120.580268, -33.930177 ], [ 119.893695, -33.976065 ], [ 119.298899, -34.509366 ], [ 119.007341, -34.464149 ], [ 118.505718, -34.746819 ], [ 118.024972, -35.064733 ], [ 117.295507, -35.025459 ], [ 116.625109, -35.025097 ], [ 115.564347, -34.386428 ], [ 115.026809, -34.196517 ], [ 115.048616, -33.623425 ], [ 115.545123, -33.487258 ], [ 115.714674, -33.259572 ], [ 
115.679379, -32.900369 ], [ 115.801645, -32.205062 ], [ 115.689611, -31.612437 ], [ 115.160909, -30.601594 ], [ 114.997043, -30.030725 ], [ 115.040038, -29.461095 ], [ 114.641974, -28.810231 ], [ 114.616498, -28.516399 ], [ 114.173579, -28.118077 ], [ 114.048884, -27.334765 ], [ 113.477498, -26.543134 ], [ 113.338953, -26.116545 ], [ 113.778358, -26.549025 ], [ 113.440962, -25.621278 ], [ 113.936901, -25.911235 ], [ 114.232852, -26.298446 ], [ 114.216161, -25.786281 ], [ 113.721255, -24.998939 ], [ 113.625344, -24.683971 ], [ 113.393523, -24.384764 ], [ 113.502044, -23.806350 ], [ 113.706993, -23.560215 ], [ 113.843418, -23.059987 ], [ 113.736552, -22.475475 ], [ 114.149756, -21.755881 ], [ 114.225307, -22.517488 ], [ 114.647762, -21.829520 ], [ 115.460167, -21.495173 ], [ 115.947373, -21.068688 ], [ 116.711615, -20.701682 ], [ 117.166316, -20.623599 ], [ 117.441545, -20.746899 ], [ 118.229559, -20.374208 ], [ 118.836085, -20.263311 ], [ 118.987807, -20.044203 ], [ 119.252494, -19.952942 ], [ 119.805225, -19.976506 ], [ 120.856220, -19.683708 ], [ 121.399856, -19.239756 ], [ 121.655138, -18.705318 ], [ 122.241665, -18.197649 ], [ 122.286624, -17.798603 ], [ 122.312772, -17.254967 ], [ 123.012574, -16.405200 ], [ 123.433789, -17.268558 ], [ 123.859345, -17.069035 ], [ 123.503242, -16.596506 ], [ 123.817073, -16.111316 ], [ 124.258287, -16.327944 ], [ 124.379726, -15.567060 ], [ 124.926153, -15.075100 ], [ 125.167275, -14.680396 ], [ 125.670087, -14.510070 ], [ 125.685796, -14.230656 ], [ 126.125149, -14.347341 ], [ 126.142823, -14.095987 ], [ 126.582589, -13.952791 ], [ 127.065867, -13.817968 ], [ 127.804633, -14.276906 ], [ 128.359690, -14.869170 ], [ 128.985543, -14.875991 ], [ 129.621473, -14.969784 ], [ 129.409600, -14.420670 ], [ 129.888641, -13.618703 ], [ 130.339466, -13.357376 ], [ 130.183506, -13.107520 ], [ 130.617795, -12.536392 ], [ 131.223495, -12.183649 ], [ 131.735091, -12.302453 ], [ 132.575298, -12.114041 ], [ 132.557212, -11.603012 ], [ 131.824698, -11.273782 ], [ 132.357224, -11.128519 ], [ 133.019561, -11.376411 ], [ 133.550846, -11.786515 ], [ 134.393068, -12.042365 ], [ 134.678632, -11.941183 ], [ 135.298491, -12.248606 ], [ 135.882693, -11.962267 ], [ 136.258381, -12.049342 ], [ 136.492475, -11.857209 ], [ 136.951620, -12.351959 ], [ 136.685125, -12.887223 ], [ 136.305407, -13.291230 ], [ 135.961758, -13.324509 ], [ 136.077617, -13.724278 ], [ 135.783836, -14.223989 ], [ 135.428664, -14.715432 ], [ 135.500184, -14.997741 ], [ 136.295175, -15.550265 ], [ 137.065360, -15.870762 ], [ 137.580471, -16.215082 ], [ 138.303217, -16.807604 ], [ 138.585164, -16.806622 ], [ 139.108543, -17.062679 ], [ 139.260575, -17.371601 ], [ 140.215245, -17.710805 ], [ 140.875463, -17.369069 ], [ 141.071110, -16.832047 ], [ 141.274095, -16.388870 ], [ 141.398222, -15.840532 ], [ 141.702183, -15.044921 ], [ 141.563380, -14.561333 ], [ 141.635520, -14.270395 ], [ 141.519869, -13.698078 ], [ 141.650920, -12.944688 ], [ 141.842691, -12.741548 ], [ 141.686990, -12.407614 ], [ 141.928629, -11.877466 ], [ 142.118488, -11.328042 ], [ 142.143706, -11.042737 ], [ 142.515260, -10.668186 ], [ 142.797310, -11.157355 ], [ 142.866763, -11.784707 ], [ 143.115947, -11.905630 ], [ 143.158632, -12.325656 ], [ 143.522124, -12.834358 ], [ 143.597158, -13.400422 ], [ 143.561811, -13.763656 ] ] ] ] } }
]
}
I believe that the start and end coordinates have to match:
https://stackoverflow.com/a/12573711/785716
For anyone who is struggling to make this work: use a GeoJSON editor to generate the .geojson file, change the type to MultiPolygon, and add polygon arrays to the coordinates object. After adding those polygons, your .geojson file should look as follows.
Note the extra [] nesting inside coordinates, and that your start and end coordinates must match.
Here is a working .geojson snippet:
{
  "type": "MultiPolygon",
  "coordinates": [
    [
      [
        [ 3.33984375, 46.49839225859763 ],
        [ 15.732421875, 46.49839225859763 ],
        [ 15.732421875, 55.7765730186677 ],
        [ 3.33984375, 55.7765730186677 ],
        [ 3.33984375, 46.49839225859763 ]
      ]
    ]
  ]
}
Please refer to Listing 7.2 of the Apple docs for more info.
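As a quick sanity check (a hedged sketch, an editorial addition rather than part of the original answers), you can verify that every ring in the MultiPolygon is closed, i.e. its first and last coordinate pairs match:

// Hedged sketch: check that each ring of a MultiPolygon is "closed"
// (first coordinate pair equals the last). The nesting mirrors GeoJSON
// MultiPolygon coordinates: [polygon][ring][point][lon/lat].
func ringsAreClosed(_ multiPolygon: [[[[Double]]]]) -> Bool {
    for polygon in multiPolygon {
        for ring in polygon {
            guard let first = ring.first, let last = ring.last, first == last else {
                return false
            }
        }
    }
    return true
}

// Usage with the snippet above:
let coordinates: [[[[Double]]]] = [[[
    [3.33984375, 46.49839225859763],
    [15.732421875, 46.49839225859763],
    [15.732421875, 55.7765730186677],
    [3.33984375, 55.7765730186677],
    [3.33984375, 46.49839225859763]
]]]
print(ringsAreClosed(coordinates)) // true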
