Is there a way to do that using open source map / API (Google Map API, Microsoft Live Map API)?
Or is there a way to traverse roads and junctions from map files (if I purchase commercial map data)?
Thanks.
Edit: Either open source or API
A possible solution based on OpenStreetMap, depending on the quality of OSM data in the US.
As described here, "Exit ramps/slip roads should be tagged as highway=motorway_link or highway=primary_link". If you import the map data for the US (using osm2pgsql), you can then select all motorway links.
E.g. (selects based on my own data, i.e. very much outside US)
openmapdb=# select osm_id, name, ref from planet_osm_roads where highway='motorway_link';
osm_id | name | ref
----------+------+-----
23683997 | |
26436348 | |
[..]
23683997 and 26436348 are the OSM ids - if you click on the links you'll see they are indeed entry/exit ramps.
Once you've identified a way, you can access its latitude and longitude:
openmapdb=# select astext(st_transform(way, 4326)) from planet_osm_roads where osm_id = '23683997';
LINESTRING(24.8757131412186 44.8730730514894,[..]
(1 row)
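Once you have the WKT back, extracting the coordinate pairs is simple string handling. A minimal Python sketch for illustration (the second point is made up, since the sample output above is truncated; PostGIS emits lon lat order):

```python
def parse_linestring(wkt):
    """Parse a WKT LINESTRING into a list of (lon, lat) float pairs."""
    inner = wkt.strip()[len("LINESTRING("):-1]  # drop "LINESTRING(" and ")"
    points = []
    for pair in inner.split(","):
        lon, lat = pair.split()
        points.append((float(lon), float(lat)))
    return points

wkt = "LINESTRING(24.8757131412186 44.8730730514894,24.8761 44.8734)"
print(parse_linestring(wkt))
```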
SendBird treats every channel as a GroupChannel. A 1:1 chat too is technically a GroupChannel with only two users (with isDistinct = true, so the existing personal chat is returned when you attempt to create it again).
My question is: how do I search GroupChannels by name, covering both group AND 1:1 chats? A group chat has a common name that is shown to all the users in the group. But a 1:1 chat's GroupChannel won't have a name, and even if it does, that name isn't shown to the users: for a 1:1 chat we always show the other person's name (as almost all chat systems do).
Typically the main UI list contains a mixture of group chats and 1:1 chats (all of them GroupChannels).
--------------------------------
| Search Chat TextField |
|--------------------------------|
|1 John (1:1) |
|2 John's Birthday Plan (group) |
|3 Johnney Eve (1:1) |
|4 Johansson Fans (group) |
| ... |
--------------------------------
All the items are technically GroupChannels. Note that the 1:1 chats don't have an actual name; the name shown in the list is the other person's nickname.
Expectation:
Now, if the user searches something like "joh", then it should return all the group chats whose name contains "joh" OR all the 1:1 chats where the other person's name contains "joh". (Basically all the items shown in the above example.)
My Attempt:
My initial solution to achieve this is to keep the 1:1 channel name as <user1 nickname> & <user2 nickname>, so when the user searches for the other user by their name, the 1:1 channel would appear just like a group channel.
Example Code:
query = SBDGroupChannel.createMyGroupChannelListQuery()
query?.order = .latestLastMessage
query?.limit = 30
query?.channelNameContainsFilter = "joh"
query?.loadNextPage(...)
The Problem:
The problems with this are:
If the user searches for their own name (or just the separator character & or just whitespace), all the personal chats become visible too, which is irrelevant.
My system allows users to change their nickname, so every time a user changes their nickname, all their 1:1 channel names have to be updated (which is painful).
Sunil,
Typically when you retrieve a list of group channels for a user, it retrieves all channels that the user is potentially a part of (depending on the memberStateFilter).
If you were explicitly looking to search, rather than providing an ongoing list of channels the user is part of, you may be able to filter channels by userIds. You'd have to filter for a channel that consists of the searching user, and the desired user.
Let's look at an example, assuming your userId is John and you're looking for your chat with Jay:
let listQuery = SBDGroupChannel.createMyGroupChannelListQuery()
listQuery?.userIdsExactFilter = ["John", "Jay"]
listQuery?.loadNextPage(completionHandler: { (groupChannels, error) in
guard error == nil else {
// Handle error.
return
}
// Only the channel with exactly those members ("John" and "Jay") is returned in the result list via the callback parameter.
...
})
If you wanted to explicitly use nicknames:
let listQuery = SBDGroupChannel.createMyGroupChannelListQuery()
listQuery?.nicknameContainsFilter = ["John", "Jay"]
listQuery?.loadNextPage(completionHandler: { (groupChannels, error) in
guard error == nil else {
// Handle error.
return
}
// Channels whose members' nicknames match the filter values are returned in the result list via the callback parameter.
...
})
You mention that you allow users to change their nicknames, and thus rooms have to be updated. It may be worth giving your group channels (even 1:1) generic names, and then dynamically generate the display name of each chat.
Since each channel returns its list of members, you could look at the member array, filter out the logged-in user, and then pull the nickname of the remaining user. This ensures that no matter what a user changes their nickname to, it's always accurate, and you don't have to update every channel when the user updates their nickname.
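A minimal sketch of that idea, in Python for illustration (the dict shape here is an assumption standing in for the SDK's member objects, which expose equivalent userId/nickname fields):

```python
def display_name(members, current_user_id):
    """Build a 1:1 chat's display name from the channel's member list,
    excluding the logged-in user, so renames never require channel updates."""
    others = [m["nickname"] for m in members if m["user_id"] != current_user_id]
    return ", ".join(others)

members = [{"user_id": "u1", "nickname": "John"},
           {"user_id": "u2", "nickname": "Jay"}]
print(display_name(members, "u1"))  # shows the other user's nickname
```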
************ Updated 02/10 ************
Thanks for providing an example of what you're looking to achieve. It looks like you're essentially trying to search both channelNameContainsFilter and nicknameContainsFilter combined with an OR operator. This is not something we (Sendbird) currently support within the iOS SDK. So the question is: what could you do to achieve this?
One option would be to utilize the Platform API to obtain this information. The List my group channels endpoint has the search_query and search_fields parameters, which would allow you to use that OR operator to find both channel names and nicknames that match your value.
Alternatively, since the SDK does return all of the necessary data that would be required to filter for these results, you could create a front-end filter that would only display the items that match your filter results. So the SDK returns the complete channel list, you store that list, and then when the user searches, you filter through the list to find channels that match your needs and display only those.
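A sketch of that front-end filter, in Python for illustration (the channel shape is an assumption standing in for the SDK's GroupChannel objects):

```python
def matches(channel, query, current_user_id):
    """True if the channel name OR any other member's nickname contains the query."""
    q = query.lower()
    if q in channel["name"].lower():
        return True
    return any(q in m["nickname"].lower()
               for m in channel["members"]
               if m["user_id"] != current_user_id)

def search(channels, query, current_user_id):
    """Filter the locally stored channel list down to the matching items."""
    return [c for c in channels if matches(c, query, current_user_id)]
```

With the example list from the question, searching "joh" would match both the group named "John's Birthday Plan" and the nameless 1:1 channel whose other member is nicknamed "John".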
As a side note, Stackoverflow may not be the best place for this type of discussion as there is a lot of back and forth. Please feel free to join us in our community for more support.
I am working with Bing Maps, but I am fairly new to spatial data types. I have managed to get the GeoJSON data for a shape from Bing Maps, for example:
{"type":"MultiPolygon","coordinates":[[[[30.86202,-17.85882],[30.93311,-17.89084],[30.90701,-17.92073],[30.87112,-17.90048],[30.86202,-17.85882],[30.86202,-17.85882],[30.86202,-17.85882]]]]}
However, I need to save this as DbGeometry in SQL. How can I convert GeoJSON to DbGeometry?
Option 1.
Convert the GeoJSON to WKT and then use STGeomFromText to create the Db object.
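For the MultiPolygon in the question, the WKT conversion can be sketched by hand; a minimal Python example handling only this one geometry type (a real converter, or a library such as Shapely, should cover all GeoJSON types):

```python
def multipolygon_to_wkt(geojson):
    """Convert a GeoJSON MultiPolygon dict to a WKT string."""
    assert geojson["type"] == "MultiPolygon"
    polys = []
    for polygon in geojson["coordinates"]:
        # each polygon is a list of rings; each ring a list of [x, y] pairs
        rings = ["(" + ", ".join(f"{x} {y}" for x, y in ring) + ")"
                 for ring in polygon]
        polys.append("(" + ", ".join(rings) + ")")
    return "MULTIPOLYGON (" + ", ".join(polys) + ")"
```

The resulting string can be passed straight to STGeomFromText with an SRID.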
Option 2.
Deserialize the GeoJSON using GeoJSON.Net and then use the NuGet package GeoJSON.Net.Contrib.MsSqlSpatial to convert to a Db object. E.g.
DbGeography dbGeographyPoint = point.ToDbGeography();
Option 3.
For some types of GeoJSON data, a modification of the following approach can be used:
drop table if exists BikeShare
create table BikeShare(
id int identity primary key,
position Geography,
ObjectId int,
Address nvarchar(200),
Bikes int,
Docks int )
declare @bikeShares nvarchar(max) =
'{"type":"FeatureCollection",
"features":[{"type":"Feature",
"id":"56679924",
"geometry":{"type":"Point",
"coordinates":[-77.0592213018017,38.90222845310455]},
"properties":{"OBJECTID":56679924,"ID":72,
"ADDRESS":"Georgetown Harbor / 30th St NW",
"TERMINAL_NUMBER":"31215",
"LATITUDE":38.902221,"LONGITUDE":-77.059219,
"INSTALLED":"YES","LOCKED":"NO",
"INSTALL_DATE":"2010-10-05T13:43:00.000Z",
"REMOVAL_DATE":null,
"TEMPORARY_INSTALL":"NO",
"NUMBER_OF_BIKES":15,
"NUMBER_OF_EMPTY_DOCKS":4,
"X":394863.27537199,"Y":137153.4794371,
"SE_ANNO_CAD_DATA":null}
},
......'
-- NOTE: This GeoJSON is truncated.
-- Copy full example from https://github.com/Esri/geojson-layer-js/blob/master/data/dc-bike-share.json
INSERT INTO BikeShare(position, ObjectId, Address, Bikes, Docks)
SELECT geography::STGeomFromText('POINT ('+long + ' ' + lat + ')', 4326),
ObjectId, Address, Bikes, Docks
from OPENJSON(@bikeShares, '$.features')
WITH (
long varchar(100) '$.geometry.coordinates[0]',
lat varchar(100) '$.geometry.coordinates[1]',
ObjectId int '$.properties.OBJECTID',
Address nvarchar(200) '$.properties.ADDRESS',
Bikes int '$.properties.NUMBER_OF_BIKES',
Docks int '$.properties.NUMBER_OF_EMPTY_DOCKS' )
I suggest trying Option 2 first.
Note: Consider Geography instead of Geometry if you are using the GCS_WGS_1984 coordinate system, as Bing Maps does.
Instead of retrieving GeoJSON from Bing Maps, retrieve Well Known Text: https://www.bing.com/api/maps/sdk/mapcontrol/isdk/wktwritetowkt
Send this to your backend and then use geometry::STGeomFromText https://learn.microsoft.com/en-us/sql/t-sql/spatial-geometry/stgeomfromtext-geometry-data-type?view=sql-server-ver15
Note that DbGeometry will not support spatially accurate calculations. Consider storing the data as a DbGeography instead, using geography::STGeomFromText https://learn.microsoft.com/en-us/sql/t-sql/spatial-geography/stgeomfromtext-geography-data-type?view=sql-server-ver15 and pass in 4326 as the SRID.
I store data as XML files in Data Lake Store, organized into folders where one folder corresponds to one source system.
At the end of every day, I would like to run some kind of log analytics to find out how many new XML files were stored in Data Lake Store under each folder. I have enabled Diagnostic Logs and also added the OMS Log Analytics suite.
I would like to know the best way to produce this report.
It is possible to produce such an aggregate report (and even create an alert/notification). Using Log Analytics, you can create a query that searches for any instance when a file is written to your Azure Data Lake Store, based on either a common root path or a file naming convention:
AzureDiagnostics
| where ( ResourceProvider == "MICROSOFT.DATALAKESTORE" )
| where ( OperationName == "create" )
| where ( Path_s contains "/webhdfs/v1/##YOUR PATH##")
Alternatively, the last line could also be:
| where ( Path_s contains ".xml")
...or a combination of both.
You can then use this query to create an alert that will notify you during a given interval (e.g. every 24 hours) the number of files that were created.
Depending on what you need, you can shape the query in these ways:
If you use a common file naming convention, you can match where the path contains that file naming convention.
If you use a common path, you can match where the path matches the common path.
If you want to be notified of all the instances (not just specific ones), you can use an aggregating query, and an alert when a threshold is reached/exceeded (i.e. 1 or more events):
AzureDiagnostics
| where ( ResourceProvider == "MICROSOFT.DATALAKESTORE" )
| where ( OperationName == "create" )
| where ( Path_s contains ".xml")
| summarize AggregatedValue = count(OperationName) by bin(TimeGenerated, 24h), OperationName
With the query, you can create the alert by following the steps in this blog post: https://azure.microsoft.com/en-gb/blog/control-azure-data-lake-costs-using-log-analytics-to-create-service-alerts/.
Let us know if you have more questions or need additional details.
I created a simple graph model for hospitals. I am playing around with hierarchical trees in Neo4j, so I created a multilevel location tree in the graph.
Now I want to get GPS coordinates using the apoc.spatial functions. Let's say that the first 3 levels of locations are good enough for retrieving latitude and longitude. My query looks like this:
MATCH (h:Hospital)-[:IS_IN*..3]->(location)
CALL apoc.spatial.geocodeOnce(toString(collect(location.name))) YIELD location
set h += location
But this returns an error, since toString() does not support collections, I guess:
Expected a String, Number or Boolean, got: Vector(550 OSBORNE ROAD,
55432, FRIDLEY)
What is the simplest way to get this to work?
This should work:
RETURN substring(reduce(s="", name in collect(location.name) | s + "," + name),1)
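For clarity: the reduce builds ",name1,name2,…" and the substring(…, 1) drops the leading comma, so the whole expression is just a comma-join of the collected names. In Python terms:

```python
# equivalent of: substring(reduce(s="", name in names | s + "," + name), 1)
names = ["550 OSBORNE ROAD", "55432", "FRIDLEY"]
joined = ",".join(names)
print(joined)  # "550 OSBORNE ROAD,55432,FRIDLEY"
```

That joined string is a plain string, so it can be passed to apoc.spatial.geocodeOnce.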
I'm developing an iOS app, using Swift, and I need to be able to get posts based on how close they are to a certain location, and sort them based on when they were posted.
I know how to check how close one post's location is to another location, but my problem is that I need to get all of the posts within x miles of the user.
Would it be better to scan the table, which as I understand, selects every single value from the database, and then check if every item returned is within x miles of the user? This seems resource-intensive, as I'm expecting for there to be a lot of posts.
Or, would it be better to create another table that has a static hash key, and set two local secondary indexes, one for the latitude, and one for the longitude, and just query for that one static hash key, and query where the latitude is between x and y, and the longitude is between a and b?
The AWS DynamoDB documentation warns against using a hash key that is accessed too frequently.
But is it really as bad as they make it seem to use the same hash key?
This would be in a table with the following values for a static hash key, where Post ID is the ID of the actual post.
**static hash key** | **latitude (local secondary index)** | **longitude (local secondary index)** | **dateCreated (local secondary index)** | **Post ID**
and for the scan option:
**ID** | **latitude** | **longitude** | **date created** | **poster** | **comments** | **flags** | **upvotes** | **downvotes** | **popularity**
Would having a static key be better than scanning a table performance-wise? How about cost-wise?
Follow the guidelines. We are suffering from two problems in production because of bad hash keys, so my advice is to drop the static hash key concept. It won't scale at a reasonable price, and it will be a pain to monitor.
Building on top of the scan approach, you can consider using Global Secondary Indexes on the lat/lng attributes.
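Whichever storage you choose (index query or scan), the geometry is the same: use a coarse lat/lng bounding box as the index/range condition, then refine with an exact great-circle distance check. A Python sketch of that logic (the post shape is an assumption):

```python
import math

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def bounding_box(lat, lon, radius_miles):
    """Coarse lat/lng ranges to use as the index key conditions."""
    dlat = math.degrees(radius_miles / EARTH_RADIUS_MILES)
    dlon = dlat / math.cos(math.radians(lat))
    return (lat - dlat, lat + dlat), (lon - dlon, lon + dlon)

def nearby(posts, lat, lon, radius_miles):
    """Coarse box filter (what the index query would do), then exact refine."""
    (lat_lo, lat_hi), (lon_lo, lon_hi) = bounding_box(lat, lon, radius_miles)
    coarse = [p for p in posts
              if lat_lo <= p["lat"] <= lat_hi and lon_lo <= p["lon"] <= lon_hi]
    return [p for p in coarse
            if haversine_miles(lat, lon, p["lat"], p["lon"]) <= radius_miles]
```

The bounding box occasionally admits points slightly outside the radius, which is exactly why the haversine refine step is needed after the index query returns.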