Search in mongoid/mongo console does not work - ruby-on-rails

I have some fields in my model (category, tags) which contain lots of text.
Take the following document for example; it should be found by the search keywords Iron, Pig, and Academic Data:
{
  "_id" : "M0130AUSM561NNBR",
  "name" : "Pig Iron Production for United StatesMonthly, Not Seasonally Adjusted, ",
  "categories" : [
    "Production of Commodities",
    "NBER Macrohistory Database",
    "Academic Data"
  ],
  "tags" : "[\"iron\", \"metals\", \"nber\", \"production\", \"monthly\", \"nation\", \"usa\", \"nsa\"]",
  "updated_at" : ISODate("2014-12-30T03:38:13.954Z"),
  "created_at" : ISODate("2014-12-30T03:38:13.954Z")
}
I tried to query with Indicator.text_search("Pig"), but I got nothing:
irb(main):005:0> Indicator.text_search("Pig")
=> #<Mongoid::Contextual::TextSearch
selector: {}
class: Indicator
search: Pig
filter: {}
project: N/A
limit: N/A
language: default>
I tried to search with Indicator.any_of({ :text => /.*Production.*/ }) and still got nothing:
=> #<Mongoid::Criteria
selector: {"$or"=>[{"text"=>/.*Production.*/}]}
options: {}
class: Indicator
embedded: false>
I tried to search in the mongo console as well, and it still doesn't work (nothing in the result):
> db.indicators.find({"title": /.*Production.*/})
>
Model : Indicator.rb
class Indicator
  include Mongoid::Document
  include Mongoid::Timestamps
  include Mongoid::Attributes::Dynamic
  # include Mongoid::Search

  field :id, type: String
  field :name, type: String
  field :category, type: String
  field :tags, type: String

  # search_in :tags, :category
end

The shell query
db.indicators.find({ "title" : /.*Production.*/ })
won't find any documents because there is no title field, according to your sample documents. Is a text index defined?
db.indicators.ensureIndex({ "name" : "text", "categories" : "text", "tags" : "text" })
Additionally, tags is a string that resembles an array of strings - is that what you want, or do you want an array of strings? I'd think you want
"tags" : ["iron", "metals", "nber", "production", "monthly", "nation", "usa", "nsa"]
rather than
"tags" : "[\"iron\", \"metals\", \"nber\", \"production\", \"monthly\", \"nation\", \"usa\", \"nsa\"]"

Related

How to query more fields on mongo DB aggregation query?

I would like to know how to add an extra field to the response of collection.aggregate.
The query below groups activities by user_id, and I would like to also include the user_name in the response.
DB Models
class Activity
  include Mongoid::Document

  field :hd_race_id, type: Float
  field :metric_elapsed_time, type: Float
  field :metric_distance, type: Float
  field :user_name, type: String

  belongs_to :user
  ...

class User
  include Mongoid::Document

  field :user_name, type: String

  has_many :activities
  ...
Query
Activity.collection.aggregate([
  {
    "$group" => {
      "_id" => "$user_id",
      "distance" => { "$sum" => "$metric_distance" },
      "time" => { "$sum" => "$metric_elapsed_time" },
    },
  },
  { "$sort" => { "distance" => -1 } },
])
Thank you in advance
Use the operator $first (aggregation accumulator) inside the $group stage.
For example:
"user_name": {"$first": "$user_name"}
or, in the Ruby hash syntax your query uses, try something like:
"user_name" => {"$first" => "$user_name"},
For an example, see the "Group & Total" chapter in my Practical MongoDB Aggregations book

"Field 'posts' is missing required arguments: id",

I am trying to play around with Rails hooked up to GraphQL, and I get the following error when trying to display a series of posts by a user:
"Field 'posts' is missing required arguments: id"
Here is my query:
query {
  posts(user_id: 10, type: "Video") {
    title
    file
  }
}
And in my query_type.rb file I have the following defined:
field :posts, [Types::PostType], null: false do
  argument :id, ID, required: true, as: :user_id
  argument :type, String, required: true
end

def posts(user_id:, type:)
  posts = Post.where("user_id = ? AND type = ?", user_id, type)
end
It is a simple query. I'm new to this technology (GraphQL) and I don't see what the problem is. Can someone pinpoint what is wrong? Thank you.
You need to use the exact argument names from the schema definition when running the query.
In your schema definition you have two required arguments: id of type ID and type of type String. So you have two options:
Update your query to send in the correct name id:
query {
  posts(id: "10", type: "Video") {
    title
    file
  }
}
Or, update your schema definition to receive a user_id:
field :posts, [Types::PostType], null: false do
  argument :user_id, ID, required: true, as: :user_id
  argument :type, String, required: true
end

def posts(user_id:, type:)
  posts = Post.where("user_id = ? AND type = ?", user_id, type)
end

ElasticSearch + Tire how to force term to return same value as in field

I have a list of countries and I want users to be able to sort results by country. So I have this helper:
def facets_for model, field
  ul = ""
  links = ""
  model.facets[field]['terms'].each do |facet|
    links << content_tag('li') do
      link_to("#{facet['term']} #{facet['count']}", params.merge(field => facet['term']))
    end
  end
  ul << content_tag("ul", class: field) do
    links.html_safe
  end
  ul.html_safe
end
and in model:
class model
  ....
  mapping do
    indexes :country do
      indexes :name, :type => :string, index: "not_analyzed"
    end
  end

  def self.search params
    ...
    filter :term, destination: params[:destination] if params[:destination].present?
    facet("destination") { terms 'country.name' }
    ...
  end
but facet['term'] always returns the country name in lowercase. I could work around it with Country.find(facet).name, but I think that is unnecessary. Is there any way to get the facet to return the same string value as is stored in the field?
Updated
my mapping:
{
  "wishes" : {
    "wish" : {
      "properties" : {
        "body" : {
          "type" : "string"
        },
        "country" : {
          "properties" : {
            "name" : {
              "type" : "string"
            }
          }
        },
        "country_id" : {
          "type" : "string"
        },
        "created_at" : {
          "type" : "date",
          "format" : "dateOptionalTime"
        }
      } ... }}}
Your mapping was not created correctly; you can try to recreate the index and reindex your data.
Model.index.delete # to delete index with bad mapping
Model.create_elasticsearch_index # this should create index with mapping defined in Model
And after that you can try to run Model.import again.
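Putting those steps together, a rough sketch ("Wish" here stands in for whatever your Tire-backed model is actually called, going by the "wishes"/"wish" names in the mapping above):
# Drop the index that was built with the old (analyzed) mapping,
# recreate it from the mapping block defined in the model, then reimport.
Wish.index.delete
Wish.create_elasticsearch_index
Wish.import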

A single row in my .csv fails to import due to TypeError: can't convert String into Integer, though others pass

Similar to this question...
When importing a CSV, I get the following Ruby 1.9.3 error: "TypeError: can't convert String into Integer"
It reads in full:
pry(main)> Lightbulb.import
TypeError: can't convert String into Integer
from /Users/user/.rvm/gems/ruby-1.9.3-p448/bundler/gems/rails-1c2717d3f5a3/activesupport/lib/active_support/core_ext/object/try.rb:36:in `[]'
I believe this is because we're passing a string (Amazon ASIN) as an index to the array in lightbulb.rb, as well as importing other data from the Amazon Product Inventory.
It works for all but one of my 35 rows. I think it is because the data for this particular Amazon product cannot be converted to an integer? I don't really know...
This is the import process which fails with one row in the CSV (listed below) where the string cannot be converted to an integer:
# asin organization_name lumens_per_watt light_output watts life_hours light_color
def self.import
  destroy_all
  table = CSV.read("#{Rails.root}/vendor/data/bulb.csv", :headers => true)
  table.each do |row|
    ap = Product.find_or_save(row['asin'])
    create(
      :price => ap.lowest_new_price,
      :small_image => ap.small_image,
      :medium_image => ap.medium_image,
      :large_image => ap.large_image,
      :detail_page_url => ap.detail_page_url,
      :asin => ap.asin,
      :brand => row['organization_name'],
      :feature => ap.feature,
      :organization_name => row['organization_name'],
      :label => ap.label,
      :manufacturer => row['organization_name'],
      :product_model => ap.product_model,
      :sku => ap.sku,
      :title => ap.title,
      :total_new => ap.total_new,
      :editorial_reviews => ap.editorial_reviews,
      :efficiency => row['lumens_per_watt'],
      :brightness => row['light_output'],
      :actual_watts => row['watts'],
      :expected_life => row['life_hours'],
      :light_color => row['light_color'],
      :perceived_watts => calculate_perceived_watts(row['light_output'])
    )
  end
end
Here is the full code for Product.find_or_save:
class Product
  include Mongoid::Document
  include Mongoid::Timestamps

  field :asin, type: String
  field :parent_asin, type: String
  field :detail_page_url, type: String
  field :item_links, type: Array, default: []
  field :sales_rank, type: String
  field :small_image, type: String
  field :medium_image, type: String
  field :large_image, type: String
  field :image_sets, type: String, default: []
  field :brand, type: String
  field :ean, type: String
  field :ean_list, type: Hash, default: {}
  field :feature, type: Array, default: []
  field :item_dimensions, type: Hash, default: {}
  field :label, type: String
  field :manufacturer, type: String
  field :product_model, type: String
  field :mpn, type: String
  field :package_dimensions, type: Hash, default: {}
  field :part_number, type: String
  field :product_group, type: String
  field :product_type_name, type: String
  field :publisher, type: String
  field :sku, type: String
  field :studio, type: String
  field :title, type: String
  field :lowest_new_price, type: String
  field :lowest_used_price, type: String
  field :total_new, type: String
  field :total_used, type: String
  field :offers, type: Hash, default: {}
  field :customer_reviews, type: String
  field :editorial_reviews, type: String
  field :similar_products, type: Array, default: []
  field :nodes, type: Array, default: []

  def self.find_or_save(asin)
    if item = where(asin: asin).first
      item
    else
      amazon_product_hash = AmazonApi.product(asin)
      attrs = ProductWrapper.parse(amazon_product_hash)
      create(attrs)
    end
  end
end
Here is the one row that fails - just one! :)
B00B4CPKT4,Philips,56,730,13,25000,2700
If I remove that row, just 1 of 35, it runs perfectly. It always fails at that one row in the .csv. I have placed it first, in the middle, and last - even rewritten it to find any hidden gremlin characters.
Here is the header and first row, for example, which work flawlessly:
asin,organization_name,lumens_per_watt,light_output,watts,life_hours,light_color
B00BXG7UZ8,Cree,75,450,6,25000,2700
I tried adding .to_i to make :asin => ap.asin.to_i, but no luck!
This is the last line I see in development.log, and I see 13 instances of this line corresponding to the 13 frustrating times I tried to run Lightbulb.import with this faulty product in the bulb.csv file:
MOPED: 54.234.253.6:33037 QUERY database=heroku_app14604604 collection=products selector={"$query"=>{"asin"=>"B00B4CPKT4"}, "$orderby"=>{:_id=>1}} flags=[] limit=-1 skip=0 batch_size=nil fields=nil (5231.9772ms)
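Since the log shows the lookup for that one ASIN does run, one way to narrow this down is to inspect what the Amazon lookup actually returns for it, using the same calls find_or_save makes (a debugging sketch only, not a fix):
# In a rails console / pry session:
hash = AmazonApi.product('B00B4CPKT4')  # raw response for the failing ASIN
puts hash.class                          # a Hash as expected, or an Array/nil?
attrs = ProductWrapper.parse(hash)       # does the TypeError get raised here?
puts attrs.inspect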

Date in Mongoid queries

I have a model:
class SimpleAction
  include Mongoid::Document

  field :set_date, :type => Date
end
and I have some data in the collection:
{ "_id" : ObjectId("4f6dd2e83a698b2518000006"), "name" : "lost",
"notes" : "", "set_date(1i)" : "2012", "set_date(2i)" : "3",
"set_date(3i)" : "25", "set_date(4i)" : "13", "set_date(5i)" : "57",
"duration" : 15, "todo" : "4" }
You can see that Mongoid stores the date in five fields - set_date(ni).
I have two questions:
How can I filter data by the set_date field in the mongo console client? Something like this:
db.simple_actions.find({ set_date : { "$lte" : new Date() } })
My query didn't return any data.
How can I filter data by the set_date field in my Rails controller? Something like this:
#simple_actions = SimpleAction.where(:set_date => { '$lte' => Date.today })
I would recommend not using Date, but instead DateTime:
field :set_date, :type => DateTime
Now not only will it be stored in 1 field, like so:
"set_date" : ISODate("2012-03-14T17:42:27Z")
But Mongoid will correctly handle various conversions for queries like you want:
SimpleAction.where( :set_date => { :$lte => Date.today } )
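For completeness, here is a minimal sketch tying this back to both questions (it assumes the field type is changed to DateTime and existing documents are re-saved, so the date actually lives in a single set_date field):
class SimpleAction
  include Mongoid::Document

  field :set_date, :type => DateTime
end

# Question 2: filter in the Rails controller; Mongoid converts Date.today for the query
@simple_actions = SimpleAction.where(:set_date => { '$lte' => Date.today })

# Question 1: the equivalent mongo console query then matches as well:
#   db.simple_actions.find({ "set_date" : { "$lte" : new Date() } })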
You can also use the Time class.
Rails works well as an API backend for mobile apps (such as Android): when the mobile client sends a date to the API as a Unix timestamp, e.g. last_created=1482723520, you can convert it like this:
time = Time.at(params[:last_created].to_i)
And then in Rails you can query like this:
Number.where(created_at: { :$gte => time })
