I'm developing a booking system in Rails 5.1 for hotels, where the price per night depends on the duration of a stay. So basically this is a question related to database design, and I would like your opinion on which option to go with:
Option A: I just go ahead and save the prices in my Room table in an array, so that:
price_increments = [1_day,2_days,...,n_days] = [80,65,...,x]
I could then access this array by passing the duration of stay, so that:
def booking
  days = (end_date - start_date).to_i
  price = price_increments.at(days - 1) # zero-indexed: index 0 holds the 1-day price
  total = price * days
end
Option B: I create an additional table for prices, but then I wouldn't be quite sure how to access the respective price with regards to the duration, especially since the application is supposed to be a platform with multiple hotels?
What do you think? Is it safe to go with Option A, or shall I try to go with Option B? What would be considered best practice? Any advice appreciated :)
There is a good reason to keep your price_increments logic in its own record: you need to know whether it is a regular price or one with a surcharge.
In any case, I think you can do this calculation before saving the record in a single table with a before_save callback:
class Booking < ActiveRecord::Base
  before_save :init

  # Per-night prices indexed by stay length (index 0 = 1 night), e.g. [80, 65, ...]
  PRICE_INCREMENT = [one, two, three, four]

  def init
    self.duration ||= (end_date - start_date).to_i
    self.price    ||= duration * PRICE_INCREMENT[duration - 1]
  end
end
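If you prefer Option B instead, one possible sketch (the model and column names here are assumptions, not an established schema) is a prices table keyed by room and number of nights:

class RoomPrice < ActiveRecord::Base
  belongs_to :room  # columns: room_id, nights, price_per_night
end

class Room < ActiveRecord::Base
  has_many :room_prices

  # Per-night price for a stay of `nights` nights, for this room only
  def price_for(nights)
    room_prices.find_by(nights: nights).try(:price_per_night)
  end
end

# total = room.price_for(days) * days

Because every price row carries its own room_id, this also scales to a platform with many hotels and rooms.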
I'm looking at using the new Rails 5 attributes API for a custom data type, ideally storing the data in two database columns, one for the data value and one for some extra type information.
The Attributes API seems to be designed to work with just one database column and I'm wondering if I'm missing a way to use two columns.
Example
Imagine a Money object, with one decimal or integer column for the value and one string column for the currency code. I'd pass in my custom Money object, store it in two columns, and then reading it back would combine the two columns into a Money object.
I've considered serializing the value and currency into a single Postgres JSON column, but I want to be able to do fast SQL SUM queries and sorting on just the value columns, so this doesn't seem ideal.
Thanks in advance for any insight.
I guess you're thinking about creating a ValueObject within your model.
There is ActiveRecord::Aggregations for that. Example:
class Customer < ActiveRecord::Base
  composed_of :balance, class_name: "Money", mapping: %w(balance amount)
end

class Money
  include Comparable
  attr_reader :amount, :currency
  EXCHANGE_RATES = { "USD_TO_DKK" => 6 }

  def initialize(amount, currency = "USD")
    @amount, @currency = amount, currency
  end

  def exchange_to(other_currency)
    exchanged_amount = (amount * EXCHANGE_RATES["#{currency}_TO_#{other_currency}"]).floor
    Money.new(exchanged_amount, other_currency)
  end

  def ==(other_money)
    amount == other_money.amount && currency == other_money.currency
  end

  def <=>(other_money)
    if currency == other_money.currency
      amount <=> other_money.amount
    else
      amount <=> other_money.exchange_to(currency).amount
    end
  end
end
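For the two-column case in the question (a value column plus a currency column), composed_of also accepts several column pairs in its mapping. A sketch, assuming price_amount and price_currency columns on a products table:

class Product < ActiveRecord::Base
  # price_amount maps to Money#amount, price_currency to Money#currency
  composed_of :price,
              class_name: "Money",
              mapping: [%w(price_amount amount), %w(price_currency currency)]
end

Reading product.price returns a Money object, assigning product.price = Money.new(100, "DKK") writes both columns, and SQL aggregates such as Product.sum(:price_amount) still run directly against the value column.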
Can't answer your question directly unfortunately, but your example got me thinking. The money-rails gem allows use of a separate currency column. Perhaps it would be worth digging through that gem to see what they are doing behind the scenes.
I need to get the previous and next active record objects with Rails. I did it, but don't know if it's the right way to do that.
What I've got:
Controller:
@product = Product.friendly.find(params[:id])
order_list = Product.select(:id).all.map(&:id)
current_position = order_list.index(@product.id)
@previous_product = @collection.products.find(order_list[current_position - 1]) if order_list[current_position - 1]
@next_product = @collection.products.find(order_list[current_position + 1]) if order_list[current_position + 1]
@previous_product ||= Product.last
@next_product ||= Product.first
product_model.rb
default_scope -> {order(:product_sub_group_id => :asc, :id => :asc)}
So, the problem here is that I need to go to my database and get all these ids just to know which records are the previous and the next.
I tried to use the order_query gem, but it did not work for me, and I noticed that it goes to the database and fetches all the records in that order; that's why I did the same but fetched only the ids.
All the solutions I found were for simple order queries: ordering by id or by something like a priority field.
Write these methods in your Product model:
class Product
  def next
    self.class.where("id > ?", id).first
  end

  def previous
    self.class.where("id < ?", id).last
  end
end
Now you can do this in your controller:
@product = Product.friendly.find(params[:id])
@previous_product = @product.previous
@next_product = @product.next
Please try it, but it's not tested.
Thanks
I think it would be faster to do it with only two SQL requests that select just two rows in total (and not the entire table). Assuming your default order is sorted by id (otherwise, force the sorting by id):
@previous_product = Product.where('id < ?', @product.id).last
@next_product = Product.where('id > ?', @product.id).first
If the product is the last one, then @next_product will be nil, and if it is the first, then @previous_product will be nil.
There's no easy out-of-the-box solution.
A little dirty, but a working way is to carefully sort out the conditions for finding the next and previous items. With id it's quite easy, since all ids are different, and Rails Guy's answer describes just that: for next, given a known id, pick the first entry with a larger id (if results are ordered by id, as per the defaults). More than that, his answer hints to place next and previous into the model class. Do so.
If there are multiple order criteria, things get complicated. Say we have a set of rows sorted first by a group parameter (which can have equal values on different rows) and then by id (which is guaranteed to be different everywhere). Results are ordered by group and then by id (both ascending), so the next element is the first one from the list of rows that:
have the same group and a larger id
have a larger group
Same with the previous element: you need the last one from the list of rows that:
have the same group and a smaller id
have a smaller group
Those fetch all next and previous entries respectively. If you need only one, use Rails' first and last (as suggested by Rails Guy) or limit(1) (and be wary of the asc/desc ordering).
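A sketch of those two conditions in ActiveRecord, using the question's product_sub_group_id/id ordering (the method names here are made up):

class Product < ActiveRecord::Base
  # Next row in (product_sub_group_id ASC, id ASC) order: same group with a
  # larger id, or any row from a larger group; take the first match.
  def next_in_order
    self.class
        .where("(product_sub_group_id = ? AND id > ?) OR product_sub_group_id > ?",
               product_sub_group_id, id, product_sub_group_id)
        .reorder(:product_sub_group_id, :id)
        .first
  end

  # Previous row: same group with a smaller id, or any row from a smaller group;
  # take the last match.
  def previous_in_order
    self.class
        .where("(product_sub_group_id = ? AND id < ?) OR product_sub_group_id < ?",
               product_sub_group_id, id, product_sub_group_id)
        .reorder(:product_sub_group_id, :id)
        .last
  end
end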
This is what order_query does. Please try the latest version, I can help if it doesn't work for you:
class Product < ActiveRecord::Base
  order_query :my_order,
    [:product_sub_group_id, :asc],
    [:id, :asc]
  default_scope -> { my_order }
end
@product.my_order(@collection.products).next
@collection.products.my_order_at(@product).next
This runs one query loading only the next record. Read more on Github.
Say I have an ActiveRecord object that contains a quantity and a price stored in the database.
I have defined an accessor for the total_price:
def total_price
  quantity * price
end
Now what if I want to use this dynamic "attribute" in multiple ActiveRecord query contexts? I might want to sum over it, compute an average, use it in multiple scopes, etc.
What would be the best practice so that I don't have to repeat this quantity * price in ActiveRecord, given that I don't want to denormalize by writing it to the DB?
Thanks!
Well, we wanted to get caption (from the join model) to appear on our associated image model, i.e. if you called @user.images, you'd be able to call image.caption (even though caption lives in the join model).
So we looked at this RailsCast (you'll benefit from around 6:40), which gave us some information about how you can use a join to create more dynamic queries. We ended up using this:
has_many :images, -> { select("#{Image.table_name}.*, #{ImageMessage.table_name}.caption AS caption") }
I'm thinking you could use something similar for your request (include some SQL to create the pseudo-column in the object). Since it's the origin model, I'm thinking about a scope like this:
default_scope { select("table.*, (table.quantity * table.price) AS total_price") }
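Another option, as a rough sketch, is to keep the SQL fragment in one place and reuse it for selects and aggregates (the Order model, orders table name, and constant/scope names below are assumptions):

class Order < ActiveRecord::Base
  TOTAL_PRICE_SQL = "(orders.quantity * orders.price)".freeze

  scope :with_total_price, -> { select("orders.*, #{TOTAL_PRICE_SQL} AS total_price") }

  def self.total_price_sum
    sum(TOTAL_PRICE_SQL)
  end

  def self.total_price_average
    average(TOTAL_PRICE_SQL)
  end
end

# Usage:
# Order.total_price_sum
# Order.with_total_price.order("total_price DESC")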
I assume price is stored in the database. Is quantity stored in the database? If both are stored, why not make total_price a database column as well? You can update total_price whenever you update the record.
class Order < AR::Base
  # before_save so the column is set on create as well as on update
  before_save :update_total_price

  def update_total_price
    self[:total_price] = quantity * price
  end
end
Obviously you can do anything you would with an ordinary column, like Order.where("total_price > 1.0") and what-not.
I am creating a Rails app that has a User and a Post model and implements the acts_as_votable gem.
I want users to be able to upvote and downvote posts, but also want to rank and sort posts by a weighted_score algorithm that takes into account the number of upvotes, downvotes, and time the post was created.
My weighted_score algorithm is taken from Reddit and better described here.
My Post Model:
class Post < ActiveRecord::Base
  belongs_to :user
  acts_as_votable

  # Raw scores are = upvotes - downvotes
  def raw_score
    return self.upvotes.size - self.downvotes.size
  end

  def weighted_score
    raw_score = self.raw_score
    order = Math.log([raw_score.abs, 1].max, 10)
    if raw_score > 0
      sign = 1
    elsif raw_score < 0
      sign = -1
    else
      sign = 0
    end
    seconds = self.created_at.to_i - 1134028003
    return ((order + sign * seconds / 45000) * 7).ceil / 7.0
  end
end
I want to use the acts_as_votable gem because it supports caching, which may decrease the number of hard disk writes and save time. Currently the weighted_score of a post can be calculated on the fly but is not saved in the database, meaning I cannot do database sorts on posts by the highest weighted_score.
If I created a column in the Post model I would have to update the posts table every time a user voted on a post, which defeats the purpose of using the acts_as_votable gem (as I wouldn't take advantage of its caching ability).
So I want to add a column to the votes table to store the weighted_score (which would then be recalculated every time the post is voted on), as well as a method on the Vote model to calculate this score. However, the gem does not provide a model when I run its generator; it only creates a votes table, which I do not know how to access without a model.
Any help on how I can add such a weighted_score column and method to the votes model, or on how to efficiently store the weighted score of a post in some other manner, is appreciated.
acts_as_voteable adds methods to your model to access the votes
http://juixe.com/techknow/index.php/2006/06/24/acts-as-voteable-rails-plugin/
positiveVoteCount = post.votes_for
negativeVoteCount = post.votes_against
totalVoteCount = post.votes_count
If you want to add a column, you can run a migration as normal on the table it creates. It also does appear to create a Vote model http://juixe.com/svn/acts_as_voteable/lib/vote.rb
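For example, a minimal migration sketch (the weighted_score column name is an assumption; votes is the table the plugin creates):

class AddWeightedScoreToVotes < ActiveRecord::Migration
  def change
    add_column :votes, :weighted_score, :float, default: 0.0
  end
end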
I would add the weighted_score column to your Post model and handle updating via callback. For instance:
class Post < ActiveRecord::Base
  #...
  before_save :update_weighted_score
  #...
  def update_weighted_score
    # check if some relevant variables have changed first, for example
    if cached_votes_total_changed?
      # do maths; assign via self. so the new value is written with this save
      self.weighted_score = blah
    end
  end
end
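Once that column is populated by the callback, sorting can happen in the database, for example:

# Posts with the highest weighted_score first (column added as above)
Post.order(weighted_score: :desc).limit(20)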
You can do this with MySQL out of the box with decent results; I used a multi-line string for easier readability.
Post.order("
LOG10( ABS( some_score ) + 1 ) * SIGN( some_score ) +
( UNIX_TIMESTAMP( created_at ) / 450000 ) DESC
")
450000 is the number to tweak to balance the weighting of the score vs. the created_at.
Closer to zero gives more weight to newness.
45000 will roughly return scoring for the day
450000 will roughly return scoring for the week
4500000 will roughly return scoring for the month
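If you keep acts_as_votable's cache columns, one way to package that ordering is a scope; a sketch, assuming a cached_votes_score column maintained by the gem's caching support and the MySQL functions used above:

class Post < ActiveRecord::Base
  acts_as_votable

  # Hot-first ordering computed in the database; 450000 is the same tunable
  # weighting constant described above.
  scope :hot, -> {
    order("LOG10(ABS(cached_votes_score) + 1) * SIGN(cached_votes_score) + " \
          "(UNIX_TIMESTAMP(created_at) / 450000) DESC")
  }
end

# Usage:
# Post.hot.limit(25)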
So I have a Vendor model, and a Sale model. An entry is made in my Sale model whenever an order is placed via a vendor.
On my vendor model, I have 3 cache columns. sales_today, sales_this_week, and sales_lifetime.
For the first two, I calculated it something like this:
def update_sales_today
  today = Date.today.beginning_of_day
  sales_today = Sale.where("created_at >= ?", today).find_all_by_vendor_id(self.id)
  self.sales_today = 0
  sales_today.each do |s|
    self.sales_today = self.sales_today + s.amount
  end
  self.save
end
So that resets the value every time it is accessed and recalculates it based on the most current records.
The weekly one is similar but I use a range of dates instead of today.
But...I am not quite sure how to do Lifetime data.
I don't want to clear out my value and have to sum all the Sale.amount for all the sales records for my vendor, every single time I update this record. That's why I am even implementing a cache in the first place.
What's the best way to approach this, from a performance perspective?
I might use ActiveRecord's sum method in this case (docs). All in one:
today = Date.today
vendor_sales = Sale.where(:vendor_id => self.id)

self.sales_today = vendor_sales.
  where("created_at >= ?", today.beginning_of_day).
  sum("amount")

self.sales_this_week = vendor_sales.
  where("created_at >= ?", today.beginning_of_week).
  sum("amount")

self.sales_lifetime = vendor_sales.sum("amount")
This would mean you wouldn't have to load lots of sales objects in memory to add the amounts.
You can use callbacks on the create and destroy events for your Sales model:
class Sale < ActiveRecord::Base
  belongs_to :vendor

  # Callbacks belong on the model; count a sale once on create and back it out on destroy
  after_create :increment_vendor_lifetime_sales
  before_destroy :decrement_vendor_lifetime_sales

  def increment_vendor_lifetime_sales
    vendor.update_attribute :sales_lifetime, vendor.sales_lifetime + amount
  end

  def decrement_vendor_lifetime_sales
    vendor.update_attribute :sales_lifetime, vendor.sales_lifetime - amount
  end
end