My Rails application uses Sunspot/Solr to search books and authors. If I give the search query
"Jeffrey Archer"
it returns the set of books whose author is "jeffrey archer". But if I give the misspelling
"Jefrey Archer"
Sunspot doesn't return any results. In that case I want to show
did you mean "jeffrey archer"?
How can I implement this with Sunspot/Solr?
Spellcheck integration into Sunspot is experimental: https://github.com/sunspot/sunspot/tree/spellcheck
I have not been very satisfied with the results that I've been able to achieve with spellcheck in Solr 1.4 and 3.x. However, with the release of Solr 4.0 and its improved spellcheck functionality, it's something that's due to be revisited.
You can try the branch I've linked; alternatively, you may want to use RSolr directly to execute your own spellcheck requests.
Pull requests welcome!
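If you go the RSolr route, a minimal sketch might look like this. The handler path, core URL, and parameter choices below are assumptions; they depend entirely on how your solrconfig.xml wires up the SpellCheckComponent:

```ruby
# Build the query parameters for a Solr spellcheck request.
# These parameter names assume a fairly standard SpellCheckComponent setup.
def spellcheck_params(query)
  {
    'q'                  => query,
    'spellcheck'         => true,
    'spellcheck.collate' => true,  # ask Solr for a "did you mean" collation
    'spellcheck.count'   => 5
  }
end

params = spellcheck_params('Jefrey Archer')

# Sending the request (hypothetical core URL and handler path):
# require 'rsolr'
# solr       = RSolr.connect(url: 'http://localhost:8983/solr/default')
# response   = solr.get('select', params: params)
# suggestion = response.dig('spellcheck', 'suggestions')
```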
Solr provides a SpellCheckComponent with which you can offer spelling corrections based on the terms in your index.
Sunspot is just a client, so you only need to work out how to configure the SpellCheckComponent.
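For reference, a typical SpellCheckComponent setup in solrconfig.xml might look like the following sketch; the field name and spellchecker class are assumptions you would adapt to your own schema and Solr version (DirectSolrSpellChecker is the Solr 4.x option):

```xml
<!-- solrconfig.xml: hedged example, field name "author_text" is an assumption -->
<searchComponent name="spellcheck" class="solr.SpellCheckComponent">
  <lst name="spellchecker">
    <str name="name">default</str>
    <str name="field">author_text</str>
    <str name="classname">solr.DirectSolrSpellChecker</str>
  </lst>
</searchComponent>

<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="spellcheck">true</str>
    <str name="spellcheck.collate">true</str>
  </lst>
  <arr name="last-components">
    <str>spellcheck</str>
  </arr>
</requestHandler>
```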
I use Sunspot for searching and filtering on my website, which also includes a price slider. Sunspot gives me stats for the price column, but only with the filters applied. I want to exclude the price filter from the stats; otherwise the stats are generated on the already-filtered results.
I have run into the same issue described in the JIRA ticket below:
https://issues.apache.org/jira/browse/SOLR-3177
It would be useful to exclude the effects of some "fq" params from the set of documents used to compute stats – similar to
how you can exclude tagged filters when generating facet counts...
https://wiki.apache.org/solr/SimpleFacetParameters#Tagging_and_excluding_Filters
So this has already been implemented in Solr, but I cannot work out how to do the same with Sunspot. Can anyone help?
Thanks
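One way to reach the raw Solr parameters from Sunspot is its `adjust_solr_params` hook: tag the price `fq` there and exclude that tag in `stats.field`, exactly as SOLR-3177 describes. A rough sketch follows; the field name `price_f`, the tag name, and the exact `fq` string Sunspot generates are assumptions you should verify against your own index:

```ruby
# Rewrites the raw Solr params: tags the price filter query and
# requests stats on the price field with that tag excluded.
def exclude_price_filter_from_stats(params)
  fqs = Array(params[:fq]).map do |fq|
    fq.start_with?('price_f:') ? "{!tag=price_tag}#{fq}" : fq
  end
  params[:fq] = fqs
  params['stats'] = true
  params['stats.field'] = '{!ex=price_tag}price_f'
  params
end

# Inside a Sunspot search block the hook exposes the raw params:
# Sunspot.search(Product) do
#   with(:price).between(10..100)
#   adjust_solr_params { |params| exclude_price_filter_from_stats(params) }
# end

params_out = exclude_price_filter_from_stats(fq: ['price_f:[10 TO 100]', 'type:Product'])
```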
I'm using Grails 3.1.5. I need to use Solr to extract keywords from users' uploaded documents to check whether they're valid, and, if possible, to check each keyword's position in the picture. I couldn't find a working Solr plugin for Grails 3+, so I think I will have to do it manually with Groovy/Java. Is there a way to do what I'm asking for? Solr wasn't my choice but the client's; is there another way to achieve this? (I'd still prefer Solr.)
Thank you
I've tried Thinking Sphinx after being pointed in that direction, and simple filtering seems impossible. I've googled and asked questions for two days now, and it seems it can't be done, which is shocking because it's something commonly done when searching on websites.
All I would like to do is add filtering options to my search form, such as filtering by one or a combination of the options below.
When a user hits the browse page, all of the site's users are returned, showing 20 results per page.
Filtering options
in: location
who are: sexual preference
between the ages: age range
and located in: country
My search page works fine because all I require is one text field that a user uses to find users by email, username or full name. My browse page is a different story because I'm using one form with multiple text fields and one or two select fields.
Example
Is there a gem that does this easily and performs well at the same time?
Or would doing this manually via find methods be the only way?
Kind regards
Apart from Sphinx and Thinking Sphinx, you could consider these gems: meta_where and meta_search.
However, after reading your description I think Sphinx is indeed the best choice here.
You wrote that it seems impossible to apply simple filtering using Thinking Sphinx. Let me explain a bit of Thinking Sphinx in the context of the post you mentioned under the link: Example
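Filtering itself is supported in Thinking Sphinx via attribute conditions (the `:with` option). A hedged sketch follows, assuming the `User` index defines `country`, `sexual_preference` and `age` as Sphinx attributes; those names, and the shape of the form params, are assumptions:

```ruby
# Build Thinking Sphinx search options from the browse form's params,
# skipping any filter the user left blank.
def browse_options(form)
  with = {}
  with[:country]           = form[:country]           if form[:country]
  with[:sexual_preference] = form[:sexual_preference] if form[:sexual_preference]
  with[:age]               = Range.new(*form[:age])   if form[:age]  # [min, max] -> min..max
  { with: with, page: form[:page] || 1, per_page: 20 }
end

# In the controller (hypothetical):
# @users = User.search(params[:query], browse_options(params))

opts = browse_options(country: 'NZ', age: [18, 30])
```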
You can go for Elasticsearch. Ruby has the Tire gem, which is a client for Elasticsearch: http://www.elasticsearch.org/
I have a MediaWiki installation that I've customized with some of my own extensions. Here is the basic platform, a pretty standard LAMP install.
Ubuntu Server
Apache 2
Mediawiki 1.15
PHP 5.2.6
MySQL 5.0.67
For the actual MW search I use Lucene (EzMwLucene). I also have a custom extension that displays tabular data from a separate database within a MW page. Lucene doesn't index this info (which, in my case, is actually good because it would clutter the expected search results). For this installation I didn't do anything to Lucene other than install it; I wouldn't know how to customize it for my needs, and it may be "too powerful".
At any rate, I need to create a search for the data in my other database. I have a master table that is updated daily based on data stored in other (normalized) tables. At the moment it is one of those searches that builds a SQL query from whatever criteria you enter, which is a lot of work. I would like it to be more of a "type and submit" search.
I don't think I need a comprehensive "cut & paste" answer, but if anybody has something I can google I would be very appreciative. I don't want to reinvent the wheel, which is what I would be doing if I followed what I currently find on Google.
If you would like to see my master database, let me know; I would want to sanitize it first to keep myself more anonymous (whatever that means). Also, if you're familiar with MW and would like to see any of my extension code, again, let me know.
TL;DR: I need to build a custom search feature with LAMP (displayed in MediaWiki). Any guidance appreciated.
Thanks SO!
Why do you need to add a custom search? The answer will determine the best solution.
For simplicity, you could use the Google Search Engine - http://www.mediawiki.org/wiki/Extension:Google_Custom_Search_Engine
Otherwise it sounds like you need to write a full-text query for the database.
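As a sketch of that full-text route, something like the following would do; the table and column names here are hypothetical, and note that MySQL 5.0 only supports FULLTEXT indexes on MyISAM tables:

```sql
-- Hypothetical master table; a FULLTEXT index enables a "type and submit" search.
ALTER TABLE master_data ADD FULLTEXT INDEX ft_master (title, description);

SELECT id, title
FROM master_data
WHERE MATCH(title, description) AGAINST ('search terms' IN NATURAL LANGUAGE MODE)
LIMIT 20;
```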
I want to add a Twitter feed for a specific keyword search to my Rails application. Where should I start?
You might start with one of the Twitter API libraries written for Ruby.
You may want to consider grabbing the RSS feed for the search and parsing that. I show that in Railscasts episode 168. If you need something fancier, the API is the way to go, as Dav mentioned.
But whichever solution you choose, it's important to cache the search results locally on your end so that your site isn't hitting Twitter every time someone visits the page. This improves performance and makes your site more stable (it won't break when Twitter breaks). You can have the cache auto-update every 10 minutes (or whatever fits) using a cron task.
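In Rails, that caching can be as simple as `Rails.cache.fetch` with an expiry. The framework-free sketch below shows the same idea; the block passed to `fetch` stands in for whatever Twitter client or feed-parsing call you end up using:

```ruby
# A tiny TTL cache: re-runs the fetcher block only when the cached
# entry is missing or older than the time-to-live.
class TweetCache
  def initialize(ttl_seconds)
    @ttl   = ttl_seconds
    @store = {}
  end

  def fetch(keyword, &fetcher)
    entry = @store[keyword]
    if entry.nil? || Time.now - entry[:at] > @ttl
      entry = { at: Time.now, value: fetcher.call }
      @store[keyword] = entry
    end
    entry[:value]
  end
end

cache = TweetCache.new(600) # 10 minutes, matching the advice above
calls = 0
2.times { cache.fetch('rails') { calls += 1; ['tweet 1', 'tweet 2'] } }

# The Rails equivalent (client call is hypothetical):
# Rails.cache.fetch("tweets/rails", expires_in: 10.minutes) { client.search("rails") }
```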
We download and store the tweets in a local database. I recently wrote a blog post about how I achieved this:
http://www.arctickiwi.com/blog/16-download-you-twitter-feed-using-ruby-on-rails-with-oauth
You can then use will_paginate to handle your pagination and go back as far as you want.