ruby on rails sunspot solr search: show some specific results on top

I'm using Sunspot Solr for indexing and searching in our Rails application:
search = Product.search do
  with(:category, [1, 2, 3])
  order_by(:priority)
  # and some other filters
end
But for some specific users I want to display certain products, say with IDs [8, 9], at the top of the search results, followed by the other products (I don't want to discard the other results). Is this possible in the same search query?

Have a look at the Query Elevation Component:
The Query Elevation Component lets you configure the top results for a
given query regardless of the normal Lucene scoring. This is sometimes
called "sponsored search," "editorial boosting," or "best bets." This
component matches the user query text to a configured map of top
results.
The text can be any string or non-string IDs, as long as it's indexed.
Although this component will work with any QueryParser, it makes the
most sense to use with DisMax or eDisMax.
The Query Elevation Component is supported by distributed searching.
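For reference, here is a rough sketch of how per-request elevation could be wired up from the Sunspot side. It assumes the Query Elevation Component is enabled in your solrconfig.xml and that your Solr version supports the elevateIds and forceElevation request parameters; Sunspot stores documents under a composite unique key (typically class name plus record ID), so the ID values below are only illustrative:
search = Product.search do
  with(:category, [1, 2, 3])
  order_by(:priority)

  # Pass elevation parameters straight through to Solr.
  adjust_solr_params do |params|
    params[:enableElevation] = true
    params[:forceElevation]  = true                  # keep elevated docs on top even when sorting
    params[:elevateIds]      = "Product 8,Product 9" # Sunspot-style unique keys (assumed format)
  end
end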

Related

How do I do a general search across string properties in my nodes?

Working with Neo4j in a Rails app.
I have nodes with several string properties containing long strings of user-generated content. For example, my nodes of type "Book" might have properties "review" and "summary", which would contain long-form string values.
I was trying to design queries that return nodes whose properties match general-language search terms provided by a user in a search box. As my query got increasingly complicated, it occurred to me that I was trying to solve natural-language search.
I looked into some of the popular search gems in Rails, but they all seem to depend on ActiveRecord. What search solutions exist for Neo4j.rb?
There are a few ways that you could go about this!
As FrobberOfBits said, Neo4j has what are called "legacy indexes", which use Lucene in the background to provide indexing of generic things. Neo4j.rb also supports the newer schema indexes, but unfortunately those are based on exact matches (though I'm pretty sure that will change somewhat in Neo4j 2.3.x).
Neo4j does support pattern matching on strings via the =~ operator, but those queries aren't indexed. So the performance depends on the size of your database.
We often recommend a gem called searchkick, which lets you define Elasticsearch indexes in your models. Then you can just call a Model.search method to do your searches: it will first query Elasticsearch to get the node IDs and then load those nodes via Neo4j.rb. You can use it via the neo4j-searchkick gem: https://github.com/neo4jrb/neo4j-searchkick
Lastly, if you're doing NLP and are trying to extract important words from your text, you could create a Tag/Word label and create relationships from your nodes to these NLP extracted nodes so that you can search based on those nodes in the future. You could even build recommendations from one text node to another based on the number/type of common tag nodes.
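If it helps, here is a hedged sketch of what the searchkick route could look like; the exact setup for the neo4j-searchkick gem may differ from this, so treat the class and property names as assumptions:
class Book
  include Neo4j::ActiveNode
  searchkick                      # index this model's properties in Elasticsearch

  property :title
  property :review
  property :summary
end

# Elasticsearch returns the matching IDs, which are then loaded as nodes via Neo4j.rb.
Book.search("dystopian coming-of-age novel")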
I don't know if anything specific exists for Neo4j.rb comparable to those ActiveRecord-based gems. What I can say is that generally this stuff is handled through the use of legacy indexes, which are implemented by Lucene.
The premise is that you create a Lucene-managed index on certain properties, which then lets you use the Lucene query language via Cypher to get data from those indexes. From Neo4j.rb it doesn't look any different from running Cypher queries, like this:
START item=node:node_auto_index("(title:'foo bar' AND body:baz*) OR title:'bat'")
RETURN item
Note that Lucene indexes and that query language can only be used in a START clause, not a MATCH clause. Refer to the Lucene query syntax documentation to discover more about what you can do with it (fuzzy matching, wildcards, etc. -- quite a bit more extensive than what a regex would give you).

rails search all columns of all rows for value

I have a Search record that stores an all_words column, which is a list of words separated by spaces. I have another model named Lead. I want to search all the columns of all the rows of the leads table for the values in all_words, so that any record with a match in any of its columns is retrieved. Kind of like this:
possible_values = search.all_words.split
Lead.where(first_name: possible_values)
    .where(last_name: possible_values)
    .where(status: possible_values)
    ...
But this doesn't look clean. How can I go about this?
Indexing
You'll be much better suited to an index-based search solution.
I wrote about this the other day: if you're going to "search" your database (especially across multiple attributes), you really need a third-party solution to provide access to the data you require.
The "common" way to search databases, across all flavours of SQL, is full-text search (which basically looks up information within a set of different attributes/columns, rather than requiring specific matches).
The following solutions are popular for Rails based projects:
Thinking Sphinx
Sunspot Solr
ElasticSearch
--
References
The magic of these is that they index whatever data you wish to search, storing it in a semi-persistent dataset.
This is vitally important: one of the main reasons full-text searching your database directly is a bad idea is the performance hit it causes. You'll be best served by using one of the aforementioned gems to get it working correctly.
There's a good Railscast covering this topic.
What you are looking for is full-text search. Depending on the type of database you have, you will use different strategies.
You will be able to create a search index on as many columns as you like.
For PostgreSQL
The good thing is that PostgreSQL already has built-in full-text search capabilities. You can use these gems to take advantage of it:
PG Search
Textacular
For MySQL
Dusen (uses FULLTEXT index capabilities in MySQL and LIKE queries)
Thinking Sphinx (uses the Sphinx search server)
Sunspot (uses the Solr search server)
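To make the PostgreSQL route concrete, here is a minimal pg_search sketch based on the models from the question; the scope name and the exact module to include (PgSearch vs. PgSearch::Model, depending on the gem version) are assumptions:
class Lead < ApplicationRecord
  include PgSearch::Model

  # Match rows where any of the listed columns contains any of the given words.
  pg_search_scope :search_any_word,
                  against: [:first_name, :last_name, :status],
                  using: { tsearch: { any_word: true } }
end

Lead.search_any_word(search.all_words)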

Caching results of query when filters are unchanged

I am building a sort of contact-management system. It has a list page with several filters to narrow the results, such as "area", "category", etc., and also search fields for name, address and contact info.
Suppose I set area to "Chicago" and category to "Family" and then press "apply filters" (the filters and search fields are submitted); I get the result. If I had also entered something in the name field, I'd attach a where clause to the resulting ActiveRecord relation.
Suppose I've got a result with the above filters in one request. If I then want to search for a different name, I'll have to query the database with the area and category filters again, which isn't necessary. Is there a way to cache the results from the previous search?
I recommend not worrying about this until you can show you have a problem.
If you did have a problem you could:
Return all results and do the filtering in JavaScript
Cache all results on the server and do the filtering in Ruby there
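If it did become a problem, here is a hedged sketch of the second option using Rails.cache; the Contact model, the filter column names, and the expiry time are assumptions:
# Cache the IDs of the area/category result set, keyed by the filter values,
# so a new name search can reuse the same base set without re-running the filters.
def base_results(area, category)
  ids = Rails.cache.fetch(["contacts", area, category], expires_in: 10.minutes) do
    Contact.where(area: area, category: category).pluck(:id)
  end
  Contact.where(id: ids)
end

# A subsequent name search only narrows the cached base set:
base_results("Chicago", "Family").where("name LIKE ?", "%Smith%")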

Order Solr/sunspot search results by geo location

I'd like to be able to order my search results by score and location. Each user in the DB has lat/lot and I am currently indexing:
location :coordinates do
  Sunspot::Util::Coordinates.new latlon[0], latlon[1]
end
The model I would be performing the search against is indexed in the same manner. Essentially, I want the results ordered by score and then by location, so if I search for Walmart, I'd like to see all Walmarts ordered by their geographic proximity to my location.
I remember reading something about Solr's new geo-sort, but I'm not sure whether it's out of alpha and/or whether Sunspot has implemented a wrapper for it.
What would you recommend?
Because of the way that Sunspot calculates location types, you'll need to do some extra legwork to have it sort by distance from your target as well. The way it works is that it creates a geo-hash for each point and then searches using regular fulltext search on that geo-hash. The result is that you probably won't be able to determine if a point 10km away is further than a point that is 5km away, but you'll be able to tell if a point 50km away is further than a point 1-2km away. The exact distances are arbitrary, but the upshot is that you probably won't have as fine-grained a result as you would like, and the search acts more as a way to filter points down to an acceptable proximity.
After you have filtered your points using the built-in location search, there are three ways to accomplish what you want:
Upgrade to Solr 3.1 or later and upgrade your schema.xml to use the new spatial search columns. You'll then need to make custom modifications to Sunspot to create fields and orderings that work with these new data types. As far as I know these aren't available in Sunspot yet, so you'll have to make those connections on your own and you'll have to dig around in Solr to do some manual configurations.
Leverage the Spatial Solr Plugin. You'll have to install a new JAR into your Solr directory and make some modifications to Sunspot, but they are relatively painless and the full instructions are in the plugin's documentation.
Leverage your DB: if your DB is also indexed on the location columns, you can use the Sunspot built-in location search to filter your results down to a reasonably sized set, then query the DB for those results and order them by proximity to your location using your own distance function.
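As a rough illustration of the third option, here is a sketch that filters with Sunspot's geohash-based near search and then orders the loaded records in Ruby with a haversine helper; the Store model, its lat/lng attributes, and the precision value are assumptions:
EARTH_RADIUS_KM = 6371.0

# Great-circle (haversine) distance between two lat/lng pairs, in kilometres.
def distance_km(lat1, lng1, lat2, lng2)
  to_rad = Math::PI / 180
  dlat = (lat2 - lat1) * to_rad
  dlng = (lng2 - lng1) * to_rad
  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * to_rad) * Math.cos(lat2 * to_rad) * Math.sin(dlng / 2)**2
  EARTH_RADIUS_KM * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
end

my_lat, my_lng = 41.88, -87.63   # the searching user's location (example values)

search = Store.search do
  fulltext "Walmart"
  with(:coordinates).near(my_lat, my_lng, precision: 5)  # coarse geohash filter
end

nearest_first = search.results.sort_by { |s| distance_km(my_lat, my_lng, s.lat, s.lng) }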

Lucene term boosting with sunspot-rails

I'm having an issue with Lucene's term boosting [1] query syntax, specifically in Ruby on Rails via the sunspot_rails gem. This is where you specify the weight of a specific term within a query; it is not related to the weighting of a particular field.
The HTTP query generated by Sunspot uses the qf parameter to specify the fields to be searched, as configured, and the q parameter for the query itself. When a caret is added to a search term to specify a boost (i.e. q=searchterm^5), no results are returned, even though results would be returned without the boost.
If, on the other hand, I create an HTTP query manually and specify the field to search explicitly (q=title_texts:searchterm^5), results are returned and the scores seem affected by the boost.
In short, it appears as though query term boosting doesn't work in conjunction with fields specified with qf.
My application calls for search across several fields, using the respective boosts associated to those fields, conditionally in turn with boosting on individual terms of a query.
Any insight?
[1]: http://lucene.apache.org/java/2_9_1/queryparsersyntax.html#Boosting a Term
Sunspot uses the dismax parser for fulltext search, which eschews the usual Lucene query syntax in favor of a limited (but user-input-friendly) query syntax combined with a set of additional parameters (such as qf) that can be constructed by the client application to tune how search works. Sunspot provides support for per-field boost using the boost_fields method in the fulltext DSL:
http://outoftime.github.com/sunspot/docs/classes/Sunspot/DSL/Fulltext.html#M000129
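A quick sketch of what per-field boosting with boost_fields could look like; the field names and boost values here are purely illustrative:
Product.search do
  fulltext "searchterm" do
    boost_fields title: 5.0, description: 2.0   # weight matches in title over description
  end
end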
The solution I found was to stick with DisMax but add the bq parameter, containing a boolean query string with the boosted terms in it.
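For reference, a hedged sketch of passing bq through from Sunspot; the field name and boost value are assumptions:
Product.search do
  fulltext "searchterm"

  # Pass a boost query straight through to Solr; documents matching it score higher.
  adjust_solr_params do |params|
    params[:bq] = "title_texts:searchterm^5"
  end
end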
