Static lookup on a Rails model

I'm trying to figure out the best way to model a simple lookup in my rails app.
I have a model, Coupon, which can be of two "types", either Percent or Fixed Amount. I don't want to build a database table around the "type" lookup, so what I'd like to do is just have a coupon_type(integer) field on the Coupon table which can be either 1 (for Percent) or 2 (for Fixed).
What is the best way to handle this?
I found this: Ruby on Rails Static List Options for Drop Down Box, but I'm not sure how that would work when I want each value to have two fields, both ID and Description.
I'd like this to populate a select list as well.
Thank you for the feedback!

If this is really unlikely to change, or if changing it would be an event significant enough to require redeployment anyway, the easiest approach is to define the mapping in a constant:
COUPON_TYPES = {
:percent => 1,
:fixed => 2
}
COUPON_TYPES_FOR_SELECT = COUPON_TYPES.to_a
The first constant defines the forward mapping; the second presents it in a format suitable for options_for_select.
It's important to note that this sort of thing would take, at most, a millisecond to read from a database. Unless you're rendering hundreds of forms per second it would hardly impact performance.
Don't worry about optimizing things that aren't perceptible problems.
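As a sketch of how those constants might be used, in plain Ruby (the reverse lookup and the label formatting are my own additions, not part of the original answer):

```ruby
# Forward mapping from symbolic type to the integer stored in coupon_type.
COUPON_TYPES = {
  :percent => 1,
  :fixed   => 2
}

# [label, value] pairs in the shape options_for_select expects, e.g.
#   f.select :coupon_type, options_for_select(COUPON_TYPES_FOR_SELECT)
COUPON_TYPES_FOR_SELECT = COUPON_TYPES.map { |name, id| [name.to_s.capitalize, id] }

# Reverse mapping, handy for displaying a stored integer back to a human.
COUPON_TYPE_NAMES = COUPON_TYPES.invert

puts COUPON_TYPES_FOR_SELECT.inspect # => [["Percent", 1], ["Fixed", 2]]
puts COUPON_TYPE_NAMES[2]            # => fixed
```

Inverting the hash for display means the integer never needs to be interpreted anywhere else in the code.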

Related

Lua - search for key inside of multiple tables

For my program, I have multiple tables inside of each other for organization and readability. These tables can look something like this
local buttons = {
  loadingScreen = {
    bLoadingEnter = function()
    end
  }, ...
}
What I want to do is find the first element named bLoadingEnter in the table. I don't know that the element named bLoadingEnter will be in loadingScreen. I've thought of getting all the keys in the table and then checking them, but I couldn't get that to work. Any help would be appreciated!
If execution speed is in any way relevant, the answer is:
you don't
Looking up a key in a nested table takes a lot longer than looking it up in just a normal table. The next problem is that of uniqueness. Two or more nested tables can have the same key with different values, which may lead to odd bugs down the road. You'd either have to check for that when inserting (making your code even slower) or just hope things miraculously go well and nothing explodes later on.
I'd say just use a flat table. If your keys are well named (like bLoadingEnter) you'll be able to infer meaning from the name either way, no nesting required.
That being said, nested tables can be a good option, if most of the time you know what path to take, or when you have some sort of ordered structure like a binary search tree. Or if speed really isn't an important factor to consider.
Okay, try:
local function findLoadingEnter(buttons)
  for _, item in pairs(buttons) do
    if item.bLoadingEnter ~= nil then
      return item.bLoadingEnter
    end
  end
end
(Note the use of pairs rather than ipairs: buttons is keyed by names like loadingScreen, so ipairs would iterate over nothing.)

What is one way that I can reduce .includes association query?

I have an extremely slow query that looks like this:
people = includes({ project: [{ best_analysis: :main_language }, :logo] }, :name, name_fact: :primary_language)
  .where(name_id: limit(limit).unclaimed_people(opts))
Look at the includes method call and notice that it is loading a huge number of associations. In the RailsSpeed book, there is the following quote:
“For example, consider this:
Car.all.includes(:drivers, { parts: :vendors }, :log_messages)
How many ActiveRecord objects might get instantiated here?
The answer is:
# Cars * ( avg # drivers/car + avg # log messages/car + avg # parts/car * ( avg # vendors/part ) )
Each eager load increases the number of instantiated objects, and in turn slows down the query. If these objects aren't used, you're potentially slowing down the query unnecessarily. Note how nested eager loads (parts and vendors in the example above) can really increase the number of objects instantiated.
Be careful with nesting in your eager loads, and always test with production-like data to see if includes is really speeding up your overall performance.”
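To make that multiplication concrete, here is a small worked example with entirely made-up averages (the numbers are hypothetical, purely to show how the nested eager load dominates the count):

```ruby
cars        = 100 # rows returned by Car.all
drivers_avg = 2   # avg drivers per car
logs_avg    = 10  # avg log messages per car
parts_avg   = 20  # avg parts per car
vendors_avg = 3   # avg vendors per part (the nested eager load)

# Plugging into the book's formula:
objects = cars * (drivers_avg + logs_avg + parts_avg * vendors_avg)
puts objects # => 7200
```

Of the 7,200 instantiated objects, 6,000 come from the nested parts/vendors load alone, which is the book's point about nesting.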
The book fails to mention what could be a good substitute for this though. So my question is what sort of technique could I substitute for includes?
Before I jump to the answer: I don't see you using any pagination or limit on the query; that alone may help quite a lot.
Unfortunately, there isn't really a direct substitute. And if you use all of the objects in a view, that's okay. There is one possible alternative to includes, though. It is quite complex, but still helpful sometimes: you join all the needed tables, select only the fields from them that you actually use, alias them, and access them as a flat structure.
Something like
(NOTE: it uses arel helpers. You need to include ArelHelpers::ArelTable in models where you use syntax like NameFact[:id])
relation.joins(name_fact: :primary_language).select(
  NameFact[:id].as('name_fact_id'),
  PrimaryLanguage[:language].as('primary_language')
)
I'm not sure it will work for your case, but that's the only alternative I know.
I have an extremely slow query that looks like this
There are a couple of potential causes:
Too many unnecessary objects fetched and created. From your comment, it looks like that is not the case and you need all the data being fetched.
DB indexes not optimised. Check the time taken by the query. EXPLAIN the generated query (check the logs, or use .to_sql to get it) and make sure it is not doing table scans or other costly operations.

Rails: To normalize or not to normalize for few values

Assuming in a Rails app connected to a Postgres database, you have a table called 'Party', which can have less than 5 well-defined party_types such as 'Person' or 'Organization'.
Would you store the party_type in the Party table (e.g. party.party_type = 'Person') or normalize it (e.g. party.party_type = 1 and party_type.id = 1 / party_type.name = 'Person')? And why?
If the party types can be defined in code, I'd definitely go with the names ("Person" etc.).
If you expect such types to be dynamically added by an admin/user, and have a GUI for it, then model it as its own table and reference it like party.party_type = 1.
Of course there is a db storage/performance trade-off between "1" and "Person", but that's too minor to consider when the app is not that big.
There are two issues here:
Are you treating these types generically or not? [1]
Do you display the type to the user?
If the answer to (1) is "yes", then just adding a row in the table is clearly preferable to changing a constraint and/or your application code.
If the answer to (2) is "yes", then storing a human-readable label in the database may be preferable to translating to human-readable text in the application code.
So in a nutshell, you'd probably want to have a separate table. On the other hand, if all types are known in advance and you just use them to drive specific paths of your application logic without directly displaying to user, then separate table may be superfluous - just define the appropriate CHECK to restrict the field to valid values and clearly document each value.
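A sketch of that CHECK approach as a Rails migration (the table and constraint names are hypothetical, and the exact migration syntax varies by Rails version, so treat this as illustrative):

```ruby
class AddPartyTypeCheckToParties < ActiveRecord::Migration
  def up
    execute <<-SQL
      ALTER TABLE parties
        ADD CONSTRAINT valid_party_type
        CHECK (party_type IN ('Person', 'Organization'));
    SQL
  end

  def down
    execute "ALTER TABLE parties DROP CONSTRAINT valid_party_type;"
  end
end
```

The database then rejects any row with an unknown type, regardless of which application code wrote it.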
[1] In other words, can you add a new type and the logic of your application will continue to work, or do you also need to change the application logic?

What is the 'Rails Way' to implement a dynamic reporting system on data

Intro
I'm doing a system where I have a very simple layout only consisting of transactions (with basic CRUD). Each transaction has a date, a type, a debit amount (minus) and a credit amount (plus). Think of an online banking statement and that's pretty much it.
The issue I'm having is keeping my controller skinny and worrying about possibly over-querying the database.
A Simple Report Example
The total debit over the chosen period e.g. SUM(debit) as total_debit
The total credit over the chosen period e.g. SUM(credit) as total_credit
The overall total e.g. total_credit - total_debit
The report must allow a dynamic date range e.g. where(date BETWEEN 'x' and 'y')
The date range would never be more than a year and will only be a max of say 1000 transactions/rows at a time
So in the controller I create:
def report
  @d = Transaction.select("SUM(debit) as total_debit").where("date BETWEEN 'x' AND 'y'")
  @c = Transaction.select("SUM(credit) as total_credit").where("date BETWEEN 'x' AND 'y'")
  @t = @c.total_credit - @d.total_debit
end
Additional Question Info
My actual report has closer to 6 or 7 database queries (e.g. pulling out the total credit/debit per type == 1 or type == 2 etc.) and many more calculations, e.g. totalling up certain credit/debit types and then adding and removing those totals from other totals.
I'm trying my best to adhere to 'fat model, skinny controller' but am having issues with the number of variables my controller needs to pass to the view. Rails has seemed very straightforward up until the point where you create variables to pass to the view. I don't see how else you do it apart from putting the variable-creating line into the controller and making it 'skinnier' by putting some query bits and pieces into the model.
Is there something I'm missing where you create variables in the model and then have the controller pass those to the view?
A more idiomatic way of writing your query in Activerecord would probably be something like:
class Transaction < ActiveRecord::Base
  def self.within(start_date, end_date)
    where(:date => start_date..end_date)
  end

  def self.total_credit
    sum(:credit)
  end

  def self.total_debit
    sum(:debit)
  end
end
This would mean issuing 3 queries in your controller, which should not be a big deal if you create database indices, and limit the number of transactions as well as the time range to a sensible amount:
@transactions = Transaction.within(start_date, end_date)
@total = @transactions.total_credit - @transactions.total_debit
Finally, you could also use Ruby's Enumerable#reduce method to compute your total by directly traversing the list of transactions retrieved from the database.
@total = @transactions.reduce(0) { |memo, t| memo + (t.credit - t.debit) }
For very small datasets this might result in faster performance, as you would hit the database only once. However, I reckon the first approach is preferable, and it will certainly deliver better performance as the number of records in your db starts to increase.
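With plain Ruby objects standing in for fetched ActiveRecord rows (the values below are hypothetical), the reduce approach can be tried in isolation:

```ruby
# Stand-in for transaction records loaded from the database.
Transaction = Struct.new(:credit, :debit)

transactions = [
  Transaction.new(100, 30),
  Transaction.new(50, 20),
  Transaction.new(0, 25)
]

# Single pass over the in-memory list: accumulate credit minus debit.
total = transactions.reduce(0) { |memo, t| memo + (t.credit - t.debit) }
puts total # => 75
```

This is the trade described above: one database round trip, with the arithmetic moved into Ruby.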
I'm putting in params[:year_start]/params[:year_end] for x and y, is that safe to do?
You should never embed params[:anything] directly in a query string. Instead use this form:
where("date BETWEEN ? AND ?", params[:year_start], params[:year_end])
My actual report probably has closer to 5 database calls and then 6 or 7 calculations on those variables, should I just be querying the date range once and then doing all the work on the array/hash etc?
This is a little subjective but I'll give you my opinion. Typically it's easier to scale the application layer than the database layer. Are you currently having performance issues with the database? If so, consider moving the logic to Ruby and adding more resources to your application server. If not, maybe it's too soon to worry about this.
I'm really not seeing how I would get the majority of the work/calculations into the model, I understand scopes but how would you put the date range into a scope and still utilise GET params?
Have you seen has_scope? This is a great gem that lets you define scopes in your models and have them automatically get applied to controller actions. I generally use this for filtering/searching, but it seems like you might have a good use case for it.
If you could give an example on creating an array via a broad database call and then doing various calculations on that array and then passing those variables to the template that would be awesome.
This is not a great fit for Stack Overflow and it's really not far from what you would be doing in a standard Rails application. I would read the Rails guide and a Ruby book and it won't be too hard to figure out.

Questions about implementing surrogate key in Ruby on Rails

For an upcoming project we need to have unique real world identifiers that are exposed to users for things like Account Numbers or Case Numbers (like a bug tracking ID). These will always be system generated and unchangeable. Right now we plan to run strictly on Heroku.
While (as my name would suggest) I am new to the wonderfulness that is Ruby on Rails, I have a long background in enterprise application development. I'm trying to bridge between what I have done in the past and doing things the "RoR way".
Obviously RoR has wonderful primary key support. I have read dozens of posts here recommending to adapt business requirements to just use the out of the box id/key methodology.
So let me describe what I am trying to accomplish and please let me know if you have faced similar objectives and what approach you took.
1) Would like to have a human readable key with a consistent length. There is value in always having an Account ID or Transaction ID that is the same length (for form validation, training sales staff, etc.) Using Ruby's innate key generation one could just add buffer characters (e.g. 100000 instead of 1).
2) Compactness: My initial plan was to go with a base 36 unique key (e.g. 36 values [0..9],[a..z]). As part of our API/interface we plan on exposing certain non-confidential objects based on a shortform URL (e.g. xx.co/000001). I like the idea of being able to have a five character identifier in base 36 vs. 7+ in decimal.
So I can think of two possible approaches:
a) add my own field and develop my own unique key generator (or maybe someone will point me to one).
b) Pad leading digits (and I assume I can force the unique key generation to start at 1xxxxxxx rather than 0000001). Then use the to_s(36) method to convert it to and from base 36 for all interactions with humans. Maybe even store the actual ID value in the database in the base 36 format to avoid ongoing conversions, but always do the conversion before a query to avoid the need to have another index.
I'm leaning towards approach B, as it seems like it would be optimal from a DB performance standpoint and that it would require the least investment in non-value added overhead. Once again, any real world experience with these topics and thoughts on the best approach would be greatly appreciated.
Thanks in advance!
I would never use the primary key in a Rails table for anything of business importance. There will come a day when someone on the business end will want to change it, and it'll end up being an enormous pain in the butt and will invalidate a bunch of URLs you and your users thought were canonical and will mess up all your foreign keys and blah blah blah. It's just a really bad idea and I would encourage you not to do it.
The Rails way to do this is to have a new column, called something like number or bug_tracking_number or whatever strikes your fancy, and to implement a before_validation callback that gives it a value. This is where you can let your creativity shine; something like this sounds like what you want:
before_validation( :on => :create ) do
  self.number = CaseNumber.count + 1
end
You can pad the number there, ensure its uniqueness, or do whatever else you want.
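Approach (b) from the question, padded base-36 conversion, can be sketched in plain Ruby (the key width and method names here are arbitrary examples, not an established API):

```ruby
# Fixed width of the human-facing identifier.
KEY_WIDTH = 5

# Encode a database id as a fixed-width, lowercase base-36 string.
def to_public_key(id)
  id.to_s(36).rjust(KEY_WIDTH, '0')
end

# Decode a public key back to the integer id for querying,
# so only the numeric primary key ever needs an index.
def from_public_key(key)
  key.to_i(36)
end

puts to_public_key(1)         # => 00001
puts to_public_key(1_000_000) # => 0lfls
puts from_public_key("0lfls") # => 1000000
```

Doing the conversion at the boundary (URL generation and parsing) keeps the stored key a plain integer, which sidesteps the extra-index concern mentioned in the question.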
