I have an app where the client will provide a list of pre-generated codes. When someone purchases a license, one of these codes will be served up.
There is a Product model that has_many :codes, and a Code model that belongs_to :product. The code model has a state which is either "assigned" or "unassigned".
How do I ensure that each code gets used only once, even if multiple processes are trying to fetch it? In the bad old days, I would lock the record, and if I couldn't lock it, move to the next one, but I'm not even sure I can do something like that in Rails.
The "bad old days" of ACID are still with us: relational databases haven't stopped supporting row locks. Read more about Rails/ActiveRecord locking in the Rails Guides.
Code.transaction do
  # lock(true) issues SELECT ... FOR UPDATE, an exclusive row lock; a shared
  # lock ("LOCK IN SHARE MODE") would still let two processes read the same row
  code = Code.where(:state => "unassigned").lock(true).first
  code.update_attributes!(:state => "assigned")
end
A second process blocks on the locked row until the first commits, then re-checks the row and moves on. With a LIMIT involved, the retry behaviour varies by database, so re-run the lookup if you get nothing back.
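The reason the lock matters can be seen in plain Ruby: without mutual exclusion, two workers could both read the same "unassigned" row before either writes. In this illustrative sketch (not the app's real models) a Mutex plays the role the database row lock plays above:

```ruby
require "thread"

# Each "code" is claimed exactly once even with many competing threads,
# because find-and-mark happens inside the critical section.
codes = [{ value: "AAA", state: "unassigned" },
         { value: "BBB", state: "unassigned" },
         { value: "CCC", state: "unassigned" }]
lock = Mutex.new
claimed = []

threads = 10.times.map do
  Thread.new do
    lock.synchronize do
      code = codes.find { |c| c[:state] == "unassigned" }
      if code
        code[:state] = "assigned"
        claimed << code[:value]
      end
    end
  end
end
threads.each(&:join)

claimed.sort  # every code claimed at most once
```

With the database, SELECT ... FOR UPDATE gives you the same serialization across processes, not just threads.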
I have a scenario in which I need to dump/transfer data of one user from my Rails App to another database of same configuration and tables. For instance, The application is built as
class Company < ActiveRecord::Base
  has_many :depots
  has_many :users
end

class Depot < ActiveRecord::Base
  belongs_to :company
  has_many :products
end

class User < ActiveRecord::Base
  belongs_to :company
end

class Product < ActiveRecord::Base
  belongs_to :depot
end
My requirement is: if companyA stops paying, I want to dump their data into another DB (database2, for instance) to keep my actual DB clean, and once they come back and start paying, I want this data back.
The second requirement is that database2 can already have some data in it. So I need to retain all existing records, and I want to change the IDs of companyA's rows (as there can already be a company with the same ID) while saving into database2, keeping associations intact. It might seem silly to do this, but that is my requirement.
I am using Postgres as my application DB.
Any help?
You have a few options here worth investigating:
Output everything in a single JSON file that encodes everything the client had in the database, complete with ID fields.
Dump out a series of CSV files that can be imported on the destination server.
Dump out a single .sql file that will properly restore the data simply by running it.
The first option is the most elegant, but probably requires the most work. It gives you the ability to archive your data in a neat, tidy file that's easily parsed.
The second option might be fine or could be severely ugly depending on the sorts of data you have. If there's any binary data involved that's probably not going to work, but for clean, text-only columns and tabular data it's usually fairly efficient. The advantage here is you can selectively load in parts of your data without having to commit to parsing all of it.
The third option isn't easily parsed; you need to restore it before you can use it. But it does make insertion really, really simple: you only have to write an archiver, since no specific restoration tool is required.
Whatever approach you take you'll need to be absolutely certain that ID numbers are never, ever reissued. Do not reset your sequence generators to fill in holes, and when moving databases port these over as well and test that they're set correctly. The last thing you need is ID conflicts.
If you're really worried about ID conflicts you might want to switch to non-numeric IDs, such as UUIDs, for everything, so that conflicts are basically a non-issue, though this does not come without a cost.
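As a sketch of the first option, here is how an import might hand out fresh IDs while keeping associations intact. This is plain Ruby over illustrative data (the schema follows the models above, but the IDs and the counter are assumptions; against Postgres you would draw new IDs from the destination sequences):

```ruby
require "json"

# A nested JSON archive of companyA, children referencing parents by ID:
archive = JSON.generate(
  "company"  => { "id" => 1, "name" => "companyA" },
  "depots"   => [{ "id" => 5, "company_id" => 1 }],
  "products" => [{ "id" => 9, "depot_id" => 5 }]
)

# On import, assign fresh IDs and rewrite every foreign key through an
# old-ID => new-ID map so the association tree stays intact.
data        = JSON.parse(archive)
next_id     = 100   # pretend the destination DB has already used IDs up to 100
new_company = {}
new_depot   = {}

company = data["company"]
new_company[company["id"]] = (next_id += 1)
company["id"] = new_company[company["id"]]

data["depots"].each do |d|
  new_depot[d["id"]] = (next_id += 1)
  d["id"]            = new_depot[d["id"]]
  d["company_id"]    = new_company[d["company_id"]]   # follow the map, not the old ID
end

data["products"].each do |p|
  p["id"]       = (next_id += 1)
  p["depot_id"] = new_depot[p["depot_id"]]
end
```

The same remapping idea applies whichever dump format you pick; JSON just makes the tree explicit.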
I'm creating a customized "Buy Now" page that is a combination of the User, Address, Sale, SaleLine, and Payment models. To initiate the payment process, I need to specify the Sale ID in a callback. So my #new method looks something like this...
# new
def bitcoin
  require 'cgi'
  @payment_to_tender = get_cost
  @sale = Sale.create
  begin
    payment = create_new_payment_wallet(ENV["BITCOIN"], "http://www.example.com/a/b", @sale.id)
  end
end
So the key line in there is the middle one, where a new Sale record is created. This page doesn't require any kind of login or anything (because it's technically the signup page). Will this be a problem? I think any time anyone, even a bot, navigates to the page, it will spawn yet another Sale record. Will that eventually catch up with me? Should I run a script nightly that deletes all orphan Sale records older than a day, or should I try a different algo?
Rails can handle as many models as required
Models are just .rb files that are loaded when you use ActiveRecord; they're not applications or anything super resource-intensive
However, what they represent is much more than just opening a file. The question you're really asking is "is my schema set up correctly?", which is a different ballgame
ActiveRecord Associations
Rails is "object-oriented", which means everything you do has to work around an object. This is typically an ActiveRecord object, which is made up of a database query & associated data
One of the biggest problems with Rails apps is an inefficient use of the ActiveRecord Association structure. ActiveRecord Associations work by defining "relations" in your models, allowing you to call one piece of data & automatically have its related data attached in the object
The problem for most people is that ActiveRecord Associations pick up data they don't need, causing unnecessarily expensive database calls. This is where the problems arise, and it's what you're trying to address
Creating Independent Records
If you want to create one record along with another, you can use the after_create callback, like this:
#app/models/bitcoin.rb
class Bitcoin < ActiveRecord::Base
  has_one :sale
  after_create :create_sale  # create_sale is provided by has_one :sale
end
This will actually create a Sale record for you, provided the association is defined correctly
I imported a db schema and its content from a legacy site into a Rails project.
I have DVDs and votes. Votes are simply an integer field in Dvd model like Dvd.votes:integer=10.
Since I can also associate new votes to users easily in Rails, I created a new Vote model that belongs to Dvd and User models.
My situation is that now when calling Dvd.first.votes I get an empty array [] since there's no data yet on the Vote model but the votes_count still has the imported counts.
(I also renamed the Dvd.votes field to Dvd.votes_count after adding the Vote model and added belongs_to :dvd, :counter_cache => true on Vote model.)
I'm not sure what would be the ideal solution to this problem. Should I call votes by simply querying Dvd.votes_count or there's a better way to do it?
Sorry, this might not be an answer to your question, but instead of creating your own solution I recommend using a gem: https://github.com/bouchard/thumbs_up
which is already optimized. In your situation it may save a lot of time.
So you need to install the gem and then import your existing votes into it.
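Once real Vote rows exist (however you import them), the cached votes_count is just a per-DVD tally that Rails then maintains incrementally. A plain-Ruby illustration with made-up data (in Rails itself you could recompute it with the grouping query or a counter reset):

```ruby
# Stand-ins for imported Vote rows, one hash per vote:
votes = [
  { dvd_id: 1, user_id: 10 },
  { dvd_id: 1, user_id: 11 },
  { dvd_id: 2, user_id: 10 },
]

# The counter cache is just this tally, keyed by dvd_id:
votes_count = votes.group_by { |v| v[:dvd_id] }.transform_values(&:size)
# => {1 => 2, 2 => 1}
```

So reading Dvd#votes_count is fine for display; the Vote rows are what you need when you care about *who* voted.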
I have a requirement that certain attribute changes to records are not reflected in the user interface until those changes are approved. Further, if a change is made to an approved record, the user will be presented with the record as it exists before approval.
My first try...
was to go to a versioning plugin such as paper_trail, acts_as_audited, etc. and add an approved attribute to their version model. Doing so would not only give me the ability to 'rollback' through versions of the record, but also SHOULD allow me to differentiate between whether a version has been approved or not.
I have been working down this train of thought for a while now, and the problem I keep running into is on the user side. That is, how do I query for a collection of approved records? I could (and did) write some helper methods that fetch a collection of records and then loop over them to find an "approved" version of each. My primary gripe with this is how quickly the number of database hits can grow. My next attempt was to do something like the following:
Version.
  where(:item_type => MyModel.name, :approved => true).
  group(:item_type).collect do |v|
    # like the 'reify' method of paper_trail
    v.some_method_that_converts_the_version_to_a_record
  end
So assuming the some_method... call doesn't hit the database, we more or less end up with the data we're interested in. The main problem I ran into with this method is that I can't use this "finder" as a scope; that is, I can't append additional scopes to narrow my results further. For example, my records may also have a cool scope that only shows records where :cool => true. Ideally, I would want to look up my records as MyModel.approved.cool, but here I would have to get my collection of approved models and then loop over them for cool ones, which would at the very least leave a bunch of records initialized in memory for no reason.
My next try...
involved creating a special type of "pending record" that basically holds "potential" changes to a record. On the user end you would look up whatever you wanted as you normally would. Whenever a pending record is apply!(ed), it simply makes those changes to the actual record, and all's well... Except about 30 minutes into it I realized that it all breaks down if an "admin" wishes to go back and contribute more to his change before approving it. I guess my only options would be either to:
Force the admin to approve all changes before making additional ones (that won't go over well... nor should it).
Try to read the changes out of the "pending record" model and apply them to the existing record without saving. Something about this idea just doesn't quite sound "right".
I would love someone's input on this issue. I have been wrestling with it for some time, and I just can't seem to find the way that feels right. I like to live by the "if its hard to get your head around it, you're probably doing it wrong" mantra.
And this is kicking my tail...
How about, create an association:
class MyModel < AR::Base
  belongs_to :my_model
  has_one :new_version, :class_name => "MyModel"
  # ...
end
When an edit is made, you basically clone the existing object into a new one. Associate the existing object with the new one, set a has_edits attribute on the existing object, and set a pending_approval attribute on the new one.
How you treat the objects once the admin approves it depends on whether you have other associations that depend on the id of the original model.
In any case, you can reduce your queries to:
objects_pending_edits = MyModel.where(:has_edits => true).all
then with any given one, you can access the new edits with obj.new_version. If you're really wanting to reduce database traffic, eager-load that association.
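A plain-Ruby illustration of that clone-on-edit flow, with a hash standing in for an ActiveRecord row (the attribute names follow the suggestion above; the helper methods are hypothetical, not a real API):

```ruby
# One record, currently approved and visible to users:
original = { id: 1, title: "Old title", has_edits: false, new_version: nil }

# Editing clones the record instead of mutating it in place:
def edit(record, changes)
  record[:new_version] = record.merge(changes).merge(id: nil, pending_approval: true)
  record[:has_edits]   = true
end

# Approval copies the pending attributes back and clears the flags:
def approve!(record)
  edits = record[:new_version].reject { |k, _| [:id, :pending_approval, :new_version].include?(k) }
  record.merge!(edits)
  record[:has_edits]   = false
  record[:new_version] = nil
end

edit(original, title: "New title")
original[:title]                 # => "Old title"  (users still see approved data)
original[:new_version][:title]   # => "New title"  (what the admin is working on)

approve!(original)
original[:title]                 # => "New title"
```

The admin can keep calling edit against the clone before approving, which was the sticking point with the pending-record idea.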
Let's say I had an app that was an address book. I'd like to have a page dedicated to a "dashboard". On this page, I'd like to have a running list of the events that happen within the app itself.
Event examples could be:
A user adds a contact.
A user deletes a contact.
A user updates a contact.
What would be the best way to create this type of functionality? Originally I felt that I could do some creative database calls with existing data, but I wouldn't be able to deal with events that deleted data, like when a contact is deleted.
So now I'm thinking it would have to be a separate table that simply stored the events as they occurred. Would this be how most sites accomplish this?
I could go through my app and, each time a CRUD operation is performed, create a new item in the table detailing what happened, but that doesn't seem very DRY.
I suppose my question would be: what's the best way to create the dashboard functionality within an already existing application such as an address book?
Any guidance would be greatly appreciated.
The easiest way to do this is to use Observers in addition to a "logger" table in your database.
Logger
  id
  model_name
  model_id
  message
This way you can set up an Observer for all models that you want to log, and do something like this:
def after_destroy(contact)
  # note: the observer callback for deletion is after_destroy, not after_delete
  Logger.create({:model_name => contact.class.to_s,
                 :model_id => contact.id,
                 :message => "Contact was deleted at #{Time.now}"})
end
Now you can log any event in a way you deem fit. Another great addition to this kind of structure is to implement "logical deletes", which means you never really delete a record from the table; you simply give it a flag so that it no longer shows up in regular result sets. There's a plugin that does this called acts_as_paranoid.
If you implement both things above, the dashboard can log all important actions, and if you ever need to see what happened or view the data of those events, it's all in the system and can be accessed via the Console (or controllers, if you set them up).
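Stripped of Rails, the combination of logical deletes and an event log amounts to the following plain-Ruby sketch (EVENT_LOG stands in for the Logger table; the data and helper are illustrative, not a real API):

```ruby
EVENT_LOG = []  # stands in for the Logger table

# Logical delete: flag the record instead of removing it, and log the event.
def destroy_contact(contact)
  contact[:deleted_at] = Time.now
  EVENT_LOG << { model_name: "Contact",
                 model_id:   contact[:id],
                 message:    "Contact was deleted at #{contact[:deleted_at]}" }
end

contacts = [{ id: 1, name: "Ada" }, { id: 2, name: "Grace" }]
destroy_contact(contacts.first)

# Regular result sets exclude flagged rows; the dashboard reads the log.
visible = contacts.reject { |c| c[:deleted_at] }
visible.map { |c| c[:name] }   # => ["Grace"]
EVENT_LOG.last[:model_id]      # => 1
```

The dashboard then just renders EVENT_LOG-style rows newest-first, and the underlying record data is still there if you ever need to inspect it.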
You may want to check out Timeline Fu: http://github.com/jamesgolick/timeline_fu:
class Post < ActiveRecord::Base
  belongs_to :author, :class_name => 'Person'
  fires :new_post, :on => :create,
        :actor => :author
end
I've created similar functionality in the past using acts_as_audited.
This helps you track changes to your models, which you can then present to the user.
It basically just tracks the events in a separate table, as you suggested.
You can use Observers in order to handle the events.
Then just store the event, with the information needed, in the database from those Observers.
Here is a quick link to get you started.
Use the paper_trail plugin, it is awesome! We modified it, though; we use it for our entire audit system, which has a complicated release process.