Save and process HTML5 geolocation in Rails - ruby-on-rails

How can I store the geolocation (longitude and latitude) of a website user in Rails? (Ruby 1.9.2 + Rails 3)
Ideally I'd like to hook into the HTML5 geolocation feature instead of using an IP-based lookup, as I'd like to give the user control over sharing their location using the built-in prompt/notification. But as this runs client-side in the browser, I am not sure how I can connect/process the data on the server side in Rails. Any ideas/best practices for getting and saving the geolocation in Rails?

Convert the lat and long decimal degrees to JSON and send them to the server via AJAX (or you could start by just using hidden fields), and store them in latitude and longitude columns in the database.
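A minimal sketch of the receiving side, assuming a users table and a hypothetical LocationsController that the browser POSTs to after navigator.geolocation.getCurrentPosition succeeds; the column, route and controller names (and current_user for identifying the user) are illustrative, not prescriptive:

# db/migrate/xxx_add_coordinates_to_users.rb
class AddCoordinatesToUsers < ActiveRecord::Migration
  def self.up
    add_column :users, :latitude,  :decimal, :precision => 10, :scale => 6
    add_column :users, :longitude, :decimal, :precision => 10, :scale => 6
  end

  def self.down
    remove_column :users, :latitude
    remove_column :users, :longitude
  end
end

# app/controllers/locations_controller.rb -- receives the AJAX POST
class LocationsController < ApplicationController
  def create
    current_user.update_attributes(:latitude  => params[:latitude],
                                   :longitude => params[:longitude])
    head :ok
  end
end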
If you want to do more than just store the data, use a spatially-enabled database such as Postgres with PostGIS to store the geolocations, allowing complex spatial functions and queries on the gathered data. To facilitate doing this in Rails, look at the GeoRuby and spatial_adapter gems.
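If you do go the spatial route, a radius query can then be pushed into the database. A rough sketch using raw PostGIS SQL through ActiveRecord; the locations table and its lonlat geography column are assumptions, not something GeoRuby sets up for you:

class Location < ActiveRecord::Base
  # Locations within `metres` of the given point, using PostGIS ST_DWithin.
  # Note that PostGIS point order is (longitude, latitude).
  def self.near(lat, lng, metres)
    where("ST_DWithin(lonlat, ST_MakePoint(?, ?)::geography, ?)", lng, lat, metres)
  end
end

Location.near(52.52, 13.405, 5000)   # everything within 5 km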

Related

Rails lightweight/raw WebSockets

I have a Rails app that I want to use for some WebSocket work, without ActionCable imposing a Rails-specific client-side structure on a separate JS app, and without its per-message inter-process/server overhead (Redis).
I'd still like to use Rails for the server instances so I can directly use ActiveRecord and other Rails components, but is it possible to strip out the extra stuff (channels etc.) from ActionCable and just get plain messages?
And ideally have some control over which instance the socket connects to (e.g., to use the chat room example, make everyone that joins a given room connect to the same Rails process, say via a URL query parameter, and not a tonne of separate subdomains/"separate apps")?

How to store the browser's network information in a database?

I want to store the browser's network information in a database. How can I do this in Ruby on Rails?
By network information I mean every individual request that shows up in the browser's network activity; I want to store those in a database.
Are there any gems or APIs with which I can get this information and store it in a database?
Thanks

Database geolocation - which one is better

I am looking into getting a geolocation database. I am trying to understand the difference between a paid and a free service besides how accurate the results are.
I want to display data on the page based on the user's location. Should I use the server side or the client side to check the location and display the data accordingly? I can imagine how to do it server side, but not client side.
If I want to get the user's IP, country, region and city, and show the cities within x km of that city, I would also need the latitude and longitude, correct?
I was looking at
http://freegeoip.net/
and
http://www.ip2location.com/databases#comparison
option: DB5
I suggest the MaxMind database for geolocation; I have used it for my website.
The link to the MaxMind database is http://maxmind.com/geoip/legacy/geolite/
freegeoip uses MaxMind's GeoLite2 database. You could also download it yourself (~25 MB) and do the lookup on your own server, which would be faster.
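If you do host the lookup yourself, here is a minimal sketch with the geoip gem against the legacy GeoLiteCity.dat linked above (a GeoLite2 .mmdb works much the same way with the maxminddb gem); the exact field names depend on the gem version:

require 'geoip'

# Assumes GeoLiteCity.dat has been downloaded and unpacked locally.
db = GeoIP.new('GeoLiteCity.dat')

# Use request.remote_ip inside a controller instead of a literal address.
if (result = db.city('8.8.8.8'))
  result.country_name   # e.g. "United States"
  result.city_name
  result.latitude
  result.longitude
end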

Using EventMachine to write to a database and then using Rails to read and display it

I have written an EventMachine server script which receives location data from a remote GPS tracker. Now, I would like to ask the following:
1) How do I write the location data into a MySQL database, say one named Position, using Ruby?
2) Then, how do I use Rails to read from this SAME database, Position, and display the location on a Google map?
3) If I run the EventMachine server within the same Rails framework, how do I specify the MySQL database in the Rails framework?
Thanks
Non-blocking ActiveRecord & Rails is a good starting point for doing async MySQL. I recommend against receiving real-time data in the same process as your Rails process; in my experience it just leads to more headaches. Instead, I would have an EventMachine process whose sole job is to listen for incoming GPS data and write it to your datastore, and then have Rails serve pages based on the datastore. You may also want to check out Firehose.io or Faye as a way to push changes to your frontend.
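A rough sketch of that split, assuming a line-based "tracker_id,lat,lng" protocol from the tracker and a positions table shared with the Rails app (all names are illustrative). The EventMachine process reuses config/database.yml so both processes point at the same MySQL database, which also answers question 3:

require 'eventmachine'
require 'active_record'
require 'yaml'

# Reuse the Rails database config so EM and Rails hit the same MySQL database.
db_config = YAML.load_file('config/database.yml')['production']
ActiveRecord::Base.establish_connection(db_config)

# Assumed model over a `positions` table with tracker_id/latitude/longitude columns.
class Position < ActiveRecord::Base
end

module GpsHandler
  # Assumes the tracker sends one complete "tracker_id,lat,lng" line per report.
  def receive_data(data)
    tracker_id, lat, lng = data.strip.split(',')
    Position.create(:tracker_id => tracker_id,
                    :latitude   => lat.to_f,
                    :longitude  => lng.to_f)
  end
end

EventMachine.run do
  EventMachine.start_server('0.0.0.0', 9000, GpsHandler)
end

Note the ActiveRecord call here is blocking; for a low message rate that is usually fine, but the non-blocking ActiveRecord approach mentioned above is the way to go if throughput matters.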

Best Way to log API Calls, per minute / per hour

We are using reverse geocoding in a Rails web service, and have run into quota problems when using the Google reverse geocoder through geokit. We are also implementing the SimpleGeo service, and I want to be able to track how many requests per minute/hour we are making.
Any suggestions for tracking our reverse-geocoding calls?
Our code will look something like the following. Would you do any of these?
Add a custom logger and process in the background daily
Use a super-fantastic gem that I don't know about that does quotas and rating easily
Insert a record into the database for each call and run queries there.
Note: I don't need the data in real-time, just want to be able to know in an hourly period, what's our usual and max requests per hour. (and total monthly requests)
def use_simplegeo(lat, lng)
  SimpleGeo::Client.set_credentials(SIMPLE_GEO_OAUTHTOKEN, SIMPLE_GEO_OAUTHSECRET)
  # maybe do logging/tracking here?
  nearby_address = SimpleGeo::Client.get_nearby_address(lat, lng)
  located_location = LocatedLocation.new
  located_location.city = nearby_address[:place_name]
  located_location.county = nearby_address[:county_name]
  located_location.state = nearby_address[:state_code]
  located_location.country = nearby_address[:country]
  return located_location
end
Thanks!
The first part here does not answer the question you are asking, but it may be helpful if you haven't considered it before.
Have you looked at not doing your reverse geocoding on your server (i.e. through Geokit) but instead having it done by the client? In other words, some JavaScript loaded into the user's browser making Google geocoder API calls on behalf of your service.
If your application could support this approach then it has a number of advantages:
You get around the quota problem because your distributed users each have their own daily quota and don't consume yours
You don't expend server resources of your own doing this
If you would still like to log your geocoder queries and you are concerned about the performance impact on your primary application database, then you might consider one of the following options:
Just create a separate database (or databases) for logging (which is write-intensive) and write to it synchronously. It could be relational, but perhaps MongoDB or Redis would work as well.
Log to the file system (with a custom logger) and then cron these in batches into structured, queryable storage later (see the sketch after this list). The storage could be external, such as Amazon's S3, if that works better.
Just write a record into SimpleGeo each time you do a geocode and add custom metadata to those records to tie them back to your own model(s).
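For the log-to-file option, here is a minimal sketch using Ruby's standard Logger writing to a dedicated file, plus the kind of roll-up a cron or rake task could run later; the log path, line format and provider label are just illustrations:

require 'logger'

# Dedicated log so geocoder traffic stays out of the main Rails log.
GEOCODE_LOG = Logger.new(Rails.root.join('log', 'geocode_calls.log'))

def log_geocode_call(provider, lat, lng)
  # One line per call, timestamp first, so it can be bucketed by hour later.
  GEOCODE_LOG.info("#{provider} #{Time.now.utc.strftime('%Y-%m-%dT%H:%M:%SZ')} #{lat},#{lng}")
end

# In use_simplegeo, where the "maybe do logging/tracking here?" comment sits:
#   log_geocode_call('simplegeo', lat, lng)

# A nightly cron/rake task can then roll the file up into hourly counts:
counts = Hash.new(0)
File.foreach(Rails.root.join('log', 'geocode_calls.log')) do |line|
  hour = line[/\d{4}-\d{2}-\d{2}T\d{2}/]   # truncate the timestamp to the hour
  counts[hour] += 1 if hour
end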
