I am using Mongoid 3. I have a Video model. Should dates be an embedded document or an Array type?
If I have this structure:
{
  :id => 2,
  :dates => [
    {
      :date => Time.now.strftime('%Y%m%d').to_i,
      :views => {
        :non_uniques => 1,
        :uniques => 1,
        :countries => {
          :us => 1,
          :uk => 1
        }
      },
      :likes => 1,
      :comments => 1
    }
  ]
}
Moreover, should views and countries be embedded documents?
Since you are planning to capture additional information along with each date, I think your current schema is correct. One more aspect to consider is how you are going to use or query the data. If you want to see the total views and likes for a video on a particular date, your approach is right; but if you are going to show overall likes and views rather than daily figures, a simple array would be better. What you are doing is reasonable from a NoSQL, embedded-document point of view, but in the end it all depends on what you want to query. With this structure, reading the daily statistics will be very fast.
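For reference, here is a minimal sketch of how that structure could be modelled with embedded documents in Mongoid 3. The class and field names (DailyStat, daily_stats, views kept as a Hash) are my own assumptions, not something from your schema:

class Video
  include Mongoid::Document

  embeds_many :daily_stats   # one embedded document per day
end

class DailyStat
  include Mongoid::Document

  field :date,     :type => Integer                  # e.g. 20130415
  field :likes,    :type => Integer, :default => 0
  field :comments, :type => Integer, :default => 0
  field :views,    :type => Hash,    :default => {}  # { :non_uniques, :uniques, :countries }

  embedded_in :video
end

# Appending one day's statistics:
video = Video.find(2)
video.daily_stats.create(
  :date     => Time.now.strftime('%Y%m%d').to_i,
  :views    => { :non_uniques => 1, :uniques => 1, :countries => { :us => 1, :uk => 1 } },
  :likes    => 1,
  :comments => 1
)

Whether views and countries get their own embedded classes or stay as a plain Hash field mostly depends on whether you need to query inside them; a Hash is enough if you only ever read them back together with the day.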
I'm working on a Rails application that uses Searchkick as an interface to Elasticsearch. Site search is working just fine, but I'm running into an unexpected issue on a page where I'm attempting to retrieve the most recent records from Searchkick across a couple of different models. The goal is a reverse-chronological list of this recent activity, with the two object types intermingled.
I'm using the following code:
models = [ Post, Project ]
includes = {
  Post    => [ :account => [ :profile ] ],
  Project => [ :account => [ :profile ] ],
}
@results = Searchkick.search('*',
  :models => models,
  :model_includes => includes,
  :order => { :id => :desc },
  :limit => 27,
)
For the purposes of getting the backend working, the page in development is currently just displaying the title, record type (class name), and ID, like this:
<%= "#{result.title} (#{result.class} #{result.id})" %>
Which will output this:
Greetings from Tennessee! (Post 999)
This generally seems to be working fine, except that ES is returning the results sorted by ID as strings, not integers. I tested by setting the results limit to 1000 and found that with tables containing ~7,000 records, 999 is considered highest, while 6905 comes after 691 in the list.
Looking through the Elasticsearch documentation, I do see mention of sorting numeric fields, but I'm unable to figure out how to translate that to the Searchkick DSL. Is this possible and supported?
I'm running Searchkick 4.4 and Elasticsearch 7.
Because Elasticsearch stores IDs as strings rather than integers, I solved this problem by adding a new obj_id field in ES and ordering results based on that.
In my Post and Project models:
def search_data
  {
    :obj_id => id,
    :title => title,
    :content => ActionController::Base.helpers.strip_tags(content),
  }
end
And in the controller I changed the order value to:
:order => { :obj_id => :desc }
The records are sorting correctly now.
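One caveat worth adding (my assumption about how Searchkick behaves, not part of the original fix): search_data is only applied when records are indexed, so existing documents need a reindex before the new obj_id field is available for sorting:

Post.reindex
Project.reindex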
I know it's bad to search inside a serialized column, but I've run into this problem.
I have a model called Question that contains a serialized column assignments:
id  question_set_id  assignments
1   1                {"12982": true, "12332": true}
2   2                {"12222": true, "12332": true}
3   3                {"11111": true}
And I have an array called group_ids:
group_ids = ["12982","12332"]
I need to find the records that contain at least one of the group_ids in assignments.
So the result for this example should look like:
[
  {
    :id => 1,
    :question_set_id => 1,
    :assignments => {"12982": true, "12332": true}
  },
  {
    :id => 2,
    :question_set_id => 2,
    :assignments => {"12222": true, "12332": true}
  }
]
I've tried
Question.where("assignments IS NOT NULL").where("assignments LIKE '%?%'", 12982)
It seems to work, but how do I apply an array?
And according to this answer, I tried
Question.where("assignments IS NOT NULL").where("assignments = ?", group_ids.to_yaml)
However, it returns an empty array.
With MySQL 5.7+ you can query JSON data directly. I haven't tried it, but following the docs, something like
Question.where(%q{JSON_CONTAINS_PATH(assignments, 'one', '$."12982"', '$."12332"') = 1})
should work. You can furthermore convert your column from text to the native JSON data type and get validation of the JSON document plus optimized storage.
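If the group IDs come in as an array, the paths can be built dynamically. A sketch (untested, assuming MySQL 5.7+ and that assignments really holds a JSON object keyed by group ID; note that keys starting with a digit have to be quoted inside the path):

group_ids = ["12982", "12332"]

# one JSON path per group id, e.g. $."12982", passed as bind parameters
paths        = group_ids.map { |id| %($."#{id}") }
placeholders = (['?'] * paths.size).join(', ')

Question.where("assignments IS NOT NULL")
        .where("JSON_CONTAINS_PATH(assignments, 'one', #{placeholders}) = 1", *paths)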
I have this code:
@pre_articles = Article.find(:all, :conditions => { :ART_ID => @linkla.map(&:LA_ART_ID) })
@articles = Kaminari.paginate_array(@pre_articles).page(params[:page]).per(15)
It selects an array of data for me, but it's huge, so I decided to add pagination. It shows 15 entries per page, but in the log I can see that every page still runs the SQL that selects the whole array, just like the first @pre_articles line. For better performance, how can I select only entries 0-15, 15-30, 30-45, etc. for each page and send those to the view? Right now it selects all the data but displays it the way I need.
Oh sorry, one important detail:
@linkla = LinkArt.where(:LA_ID => @la_typs.map(&:LAT_LA_ID), :LA_GA_ID => @genart.map(&:GA_ID))
@articles = Article.where(:ART_ID => @linkla.map(&:LA_ART_ID)).page(params[:page]).per(15)
That's what my query looks like. As you can see, I select articles depending on the @linkla results, and that LinkArt query still selects as many rows as before... How can I make it select only the rows needed for the current page?
Solution to the stated problem:
class LinkType < ActiveRecord::Base
  has_many :links
  # let us assume there is a column called name
  # I wasn't sure about this model from your description
end

class GenArt < ActiveRecord::Base
  has_many :links
  # let us assume there is a column called name
end

class Link < ActiveRecord::Base
  belongs_to :article
  belongs_to :link_type
  belongs_to :gen_art
  # columns: id, article_id, link_type_id, gen_art_id
end

class Article < ActiveRecord::Base
  has_many :links
end
Assuming the params hash contains link_type_names and gen_art_names:
Article.joins(:links => [ :link_type, :gen_art ]).where(
  :link_types => { :name => params[:link_type_names] },
  :gen_arts   => { :name => params[:gen_art_names] }
).page(params[:page]).per(50)
What about using a where clause instead of a conditional find?
@articles = Article.where(:ART_ID => @linkla.map(&:LA_ART_ID)).page(params[:page]).per(15)
The SQL query generated will include a LIMIT clause to avoid loading unnecessary data.
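One more thing to watch (my observation, not part of the original answer): @linkla.map(&:LA_ART_ID) still instantiates every LinkArt record just to collect ids. If you are on Rails 3.2 or newer, pluck keeps that step in the database too:

article_ids = LinkArt.where(:LA_ID => @la_typs.map(&:LAT_LA_ID),
                            :LA_GA_ID => @genart.map(&:GA_ID)).pluck(:LA_ART_ID)

@articles = Article.where(:ART_ID => article_ids).page(params[:page]).per(15)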
How does one sum all of the "total" columns in an association?
My SQL-fu sucks, so I would like to learn how to do this using Active Record for my Rails 2.3.5 app (so no fancy syntax just yet, please ;-). And I'm on MySQL.
Let's say I have:
Shop
has_many :customers
has_many :transactions, :through => :customers
So normal stuff.
shop = Shop.first
shop.transactions.count
=> 100
Ok, all that background for the question:
I want to SUM the total column of the transactions from the past year (Jan 1st 2010..Dec 31st 2010) and display the sums by customer.
Whilst I know how to group transactions and find with conditions, it's the sum part where my lack of SQL lets me down.
first = Date.new(2010, 01, 01)
last = Date.new(2010, 12, 31)
shop.transactions(:conditions => {:created_at => first..last}, :group => :customer_id, :include => sum(:total))
I just took a stab; am I on the right track?
shop.transactions.sum(:total, :conditions => {:created_at => first..last}, :group => :customer_id)
This looks like an easier way. I now know that sum can take AR attributes too. Cool.
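For what it's worth, a grouped sum like this comes back in Rails 2.3 as an ordered hash keyed by the grouped column, so the per-customer totals can be displayed directly (the values below are made up):

totals = shop.transactions.sum(:total,
                               :conditions => { :created_at => first..last },
                               :group => :customer_id)
# => { 3 => 150.0, 7 => 420.0 }   customer_id => summed total

totals.each do |customer_id, total|
  puts "Customer #{customer_id}: #{total}"
end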
Look into the collect method.
You can do things like:
transactions = shop.transactions
total = 0
transactions.collect { |t| total += t.amount }
# total now holds the summed amount
Replace amount with whichever property holds the transaction amount (total in your case).
You could also use .sum
sum = transactions.to_a.sum(&:amount)
I have somewhat of a unique situation: if I had a form with a checkbox for every state (as in US states, so say 50 of them), I don't really want to add 50 columns to my db. How can I store them in an array in a single column?
I feel like I've seen this done but I'm having a hard time putting my finger on the implementation.
ActiveRecord::Base.serialize. Straight from the Rails docs:
class User < ActiveRecord::Base
  serialize :preferences
end

user = User.create(:preferences => { "background" => "black", "display" => "large" })
User.find(user.id).preferences # => { "background" => "black", "display" => "large" }
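Applied to your states use case, a minimal sketch (the states column name is my assumption) could look like this, with the checked state codes stored as an array:

class User < ActiveRecord::Base
  serialize :states, Array   # text column holding e.g. ["CA", "NY", "TX"]
end

user = User.create(:states => ["CA", "NY"])
User.find(user.id).states # => ["CA", "NY"]

The form can submit the checked boxes as user[states][] so they arrive as an array in params.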
You could also set up a States table with a many-to-many relationship between User and State. That would make queries more efficient.