Tracking search query parameters - ruby-on-rails

My Rails application revolves around searching for one specific record given various filters. I'm trying to figure out the best way to keep track of search queries so that in the future we can do reporting or build a dashboard. Basically I need insight into what users are looking for.
This is what a search query would look like:
http://localhost:3000/s?utf8=%E2%9C%93&location=New+York%2C+NY%2C+United+States&activity=Figure+Skating&start_time=Nov+13th+2015&end_time=Dec+25th+2015&period=morning&geolocation=%5B40.7127837%2C-74.00594130000002%5D
cleaner:
http://localhost:3000/s?utf8=%E2%9C%93&location=&activity=&start_time=&end_time=&geolocation=

You can add a hook in your search controller action that stores the params used in every search. Perhaps you can create a new model called Analytics, where every time someone searches, a record stores all the values of the search. That way all the information is stored in your db and you can analyze it later.
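A rough sketch of that idea, calling the model SearchQuery here (a stand-in name for the "Analytics" model described above), borrowing the filter names from the query string in the question, and assuming a Devise-style current_user helper:

class SearchesController < ApplicationController
  # Hypothetical hook: persist the filters from every search request
  after_action :record_search, only: :show

  def show
    # ... perform the actual search ...
  end

  private

  def record_search
    SearchQuery.create(
      location:    params[:location],
      activity:    params[:activity],
      start_time:  params[:start_time],
      end_time:    params[:end_time],
      period:      params[:period],
      geolocation: params[:geolocation],
      user_id:     current_user.try(:id)
    )
  end
end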

The easiest way would be to track this via Google Analytics:
https://support.google.com/analytics/answer/1012264?hl=en

Related

Better Realtime filter/search option to integrate in ASP.NET MVC application

Our SQL Server database has a huge amount of data in different tables (millions of records), and we populate the data using paging in the UI. Data is dynamically added and updated. Now I am planning to implement realtime filtering/searching of the data. The UI will have a text box where the user types a search keyword, and the rows in the grid should refresh based on the search string, as typeahead functionality. I am considering Azure Search as one of the candidates for implementing this, but we are not on the cloud yet and I would have to justify to my team why we should go this route.
Are there any other products or methods that can help achieve this? I am still doing research on this, but if anyone has already implemented a solution like this, I would like to know more about your recommendations.
I recently started using https://datatables.net, and it's been a good experience.
Otherwise, you'll probably have to roll your own solution that keeps track of page and filter criteria on the client side and makes Ajax calls whenever they're altered. Your endpoint would need to take the page and filter as parameters, perform the necessary query of the data, and return the desired records.
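Rolling your own boils down to a single JSON endpoint that accepts the page and the filter text, runs the query, and returns the matching slice of rows. A rough Rails-flavored sketch of that pattern (model, parameter names, and page size are all made up here):

class RecordsController < ApplicationController
  PER_PAGE = 50

  # GET /records?q=smith&page=2
  def index
    scope = Record.all
    scope = scope.where("name LIKE ?", "%#{params[:q]}%") if params[:q].present?
    page  = params.fetch(:page, 1).to_i
    render json: scope.order(:id).offset((page - 1) * PER_PAGE).limit(PER_PAGE)
  end
end

The client-side typeahead then just re-issues the Ajax call (ideally debounced) whenever the search box or page changes.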

Filtering Posts in Feed - Rails 4

I'm working on a Rails app based on the railstutorial.org book/tutorial. I've added some attributes to my post so that a post has an offered pay and a date (in the future).
I'd like to filter the feed based on user selected filters such as posts where the pay is between x and y, and the date is before xyz, etc.
After searching online I considered scopes to accomplish this, but don't fully understand them and didn't find good documentation for them.
I need a way to temporarily store the filters the user selects and use them to query the database and return the correct posts.
I've successfully gotten the information from the view to the controller, but am struggling to find a good way to store the filter settings in my application to use between controller actions.
Does anyone have any suggestions on how to attack this issue? I'm not sure what code would be useful, but let me know and I can upload the files. Thanks!
You should be able to save the filters either in the session or in the cookies.
Of the two, session is preferred, as it can hold a lot more data than cookies, which have a lower size limit.
Refer to session and cookies for more info.
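A minimal sketch of the session approach, assuming hypothetical filter fields named min_pay, max_pay, and before_date:

class PostsController < ApplicationController
  def index
    # Save the submitted filters, then reuse them on later requests
    session[:feed_filters] = filter_params.to_h if params[:filters].present?
    filters = session[:feed_filters] || {}

    @posts = Post.all
    @posts = @posts.where("pay >= ?", filters["min_pay"])     if filters["min_pay"].present?
    @posts = @posts.where("pay <= ?", filters["max_pay"])     if filters["max_pay"].present?
    @posts = @posts.where("date < ?", filters["before_date"]) if filters["before_date"].present?
  end

  private

  def filter_params
    params.require(:filters).permit(:min_pay, :max_pay, :before_date)
  end
end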

Search Gems for Rails

I was browsing reddit for the answer to this and came across this conversation which lists out a bunch of search gems for rails, which is cool. But what I wanted was something where I could:
Enter: OMG Happy Cats
It searches the whole database looking for anything that has OMG Happy Cats and returns an array of model objects containing that value, which I can then run through Active Model Serializers (very important to be able to use this) to return a JSON object of search results, so you can display whatever you want to the user.
So that JSON object, if this were a blog, would have a post object, maybe a category object and even a comment object.
Everything I have seen is very specific to one controller, one model. Which is nice and all, but I am more after a "search for what you want, we will return what you want" approach, maybe one that grows smarter over time like the searchkick gem, which can also offer spelling suggestions.
I am building this as an API, so the search would be limited to everything that belongs to a blog object (so as to keep the search from getting huge). It would search things like posts, tags, categories, comments and pages looking for your term, return a JSON object (as described) and boom, done.
Any ideas?
You'd be best off considering the underlying technology for this.
--
Third Party
As far as I know (I'm not super experienced in this area), the best way to search an entire Rails database is to use a third party system to "index" the various elements of data you require, allowing you to search them as required.
Some examples of this include:
Sunspot / Solr
ElasticSearch
Essentially, having one of these "third party" search systems gives you the ability to index the various records you want in a separate database, which you can then search with your application.
--
Notes
There are several advantages to handling "search" with a third party stack.
Firstly, it takes the load off your main web server - which means it'll be more reliable & able to handle more traffic.
Secondly, it will ensure you're able to search all the data of your application, instead of being tied to a particular model / data set.
Thirdly, because many of these third party solutions index the content you're looking for, it will free up your database connectivity for your actual application, making it more efficient & scalable.
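Since searchkick was mentioned: assuming you go the Elasticsearch route with that gem, a cross-model search could look roughly like this sketch (model names and the multi-model search option are assumptions based on the gem's documentation):

class Post < ActiveRecord::Base
  searchkick
end

class Comment < ActiveRecord::Base
  searchkick
end

# Search several indexed models at once, then group for serialization
results = Searchkick.search("OMG Happy Cats", models: [Post, Comment, Category])
grouped = results.group_by { |r| r.class.name.downcase.pluralize }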
For PostgreSQL you should be able to use pg_search.
I've never used it myself but going by the documentation on GitHub, it should allow you to do:
# Search across every model that has opted in to multisearch
documents = PgSearch.multisearch('OMG Happy Cats').to_a
# Resolve each PgSearch::Document back to the original record
objects = documents.map(&:searchable)
# Group the records by type, e.g. "posts", "comments", "categories"
groups = objects.group_by { |o| o.class.name.pluralize.downcase }
# Serialize each group and build { "posts" => [...], "comments" => [...] }
json = Hash[groups.map { |k, v| [k, ActiveModel::ArraySerializer.new(v).as_json] }].as_json
puts json.to_json
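Note that each model has to opt in to multisearch before that will return anything, roughly like this (column names are just examples):

class Post < ActiveRecord::Base
  include PgSearch   # PgSearch::Model in newer versions of the gem
  multisearchable against: [:title, :body]
end

class Comment < ActiveRecord::Base
  include PgSearch
  multisearchable against: [:content]
end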

Should I use Rails' find methods vs using Thinking Sphinx or some other search framework?

I would like to add a search/browse feature to my website.
Users will be able to search for people by email address or first and last name. They'll also be able to browse through the site's users with the option of using the following search filters:
age range
country
location (city)
gender
Should this be done without the use of something like Thinking Sphinx?
I have used Sphinx in my real projects.
I would recommend using the thinking-sphinx gem with Sphinx Search for the following reasons:
1: Performance
All your searchable data becomes a search index file, so when users perform a search there is no SQL query against the database.
2: It is easier to do complex searches. If you have a look at the Thinking Sphinx wiki you will see how many different kinds of searches you can do with it, plus geo-location search. It would be very hard, or a lot more work, to do the same with Rails where methods or raw SQL queries.
By the way, use delta indexes and scheduled rake tasks to keep your search index up to date. I normally rebuild my search index early every morning.
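For reference, a Thinking Sphinx v3-style index definition plus a filtered search look roughly like this (field and attribute names are placeholders for your own schema):

# app/indices/user_index.rb
ThinkingSphinx::Index.define :user, with: :active_record do
  indexes first_name, last_name, email, city, country
  has age, created_at
end

# Field conditions and attribute filters in one query
User.search "jane",
  conditions: { country: "Canada" },
  with:       { age: 18..30 }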

Store data in Ruby on Rails without Database

I have a few data values that I need to store in my Rails app and wanted to know if there are any alternatives to creating a database table just for this simple task.
Background: I'm writing some analytics and dashboard tools for my Ruby on Rails app and I'm hoping to speed up the dashboard by caching results that will never change. Right now I pull all users from the last 30 days and re-arrange them so I can see the number of new users per day. It works great but takes quite a long time; in reality I should only need to calculate the most recent day and store the rest of the array somewhere else.
What is the best place to store this array?
Creating a database table seems a bit overkill, and I'm not sure that global variables are the correct answer. Is there a best practice for persisting data like this?
If anyone has done anything like this before let me know what you did and how it turned out.
Ruby has a built-in Hash-based key-value store named PStore. This provides simple file-based, transactional persistence.
PStore documentation
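A minimal sketch of caching the per-day counts with PStore (file name and keys are arbitrary):

require "pstore"

store = PStore.new(Rails.root.join("tmp", "signup_counts.pstore").to_s)

# Write: cache the historical per-day counts that will never change
store.transaction do
  store[:new_users_by_day] = { "2015-11-12" => 42, "2015-11-13" => 37 }
end

# Read: passing true opens a read-only transaction
cached = store.transaction(true) { store[:new_users_by_day] }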
If you've got a database already, it's really not a big deal to create a separate table for tracking this sort of thing. When doing reporting, it's often to your advantage to create derivative summary tables exactly like what you're describing. You can update these as required using a simple SQL statement and there's no worry that your temporary store will somehow go away.
That being said, the type of report you're trying to generate is actually something that can be done in real-time except on extravagantly large data sets. The key is to have indexes that describe the exact grouping operation you're trying to do. For instance, if you're grouping by calendar date, you can create a "date" field and sync it to the "created_at" time as required. An index on this date field will make doing a GROUP BY created_date very quick:
SELECT created_date AS on_date, COUNT(id) AS new_users FROM users GROUP BY created_date
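A hypothetical migration for that created_date column and its index:

class AddCreatedDateToUsers < ActiveRecord::Migration
  def change
    add_column :users, :created_date, :date
    add_index  :users, :created_date
  end
end

# Backfill existing rows once, e.g. from the console:
# User.update_all("created_date = DATE(created_at)")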
Using a lightweight database like SQLite shouldn't feel like overkill. Alternatively, you can use key-value store solutions like Tokyo Cabinet, or even store the array in a flat file manually, but I really don't see any overkill in using SQLite.
