I have a UserReport model that connects a User model and a Report model (a has_many :through association).
I have another model called Comment which belongs to UserReport (a has_many association).
When a report is created I need to create a UserReport for every user, each with one default comment.
My question is how to do that in a way that will roll back the report creation if any one of the child records fails to save.
My goal is to ensure that the DB will not be left in an inconsistent state.
Any suggestions?
You want something called a transaction. The code would look something like
begin
  Report.transaction do
    # create the report, e.g. Report.create!
    # create the user reports and comments, e.g. Comment.create!
  end
rescue ActiveRecord::RecordInvalid
  # there was an error; the transaction has already been rolled back
end
Inside the transaction, if an exception is raised the database is reverted to the state it was in before the transaction began. In the rescue, you can handle whatever error was raised.
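Applied to your models, a sketch might look like this (assuming UserReport belongs_to :user and :report and has_many :comments, that Comment has a body column, and that report_params is a placeholder for your report attributes; adjust to your schema):

def create_report_for_all_users
  Report.transaction do
    report = Report.create!(report_params) # bang methods raise on failure
    User.find_each do |user|
      user_report = report.user_reports.create!(user: user)
      user_report.comments.create!(body: "Default comment") # the default comment text is illustrative
    end
  end
rescue ActiveRecord::RecordInvalid => e
  # nothing was persisted; surface e.message to the caller
end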
When you save a model, the entire process is wrapped in a transaction that will be rolled back if the save fails (due to validations, callbacks, etc). So if you build your whole object tree in memory first, then attempt to save the report, none of your objects will be saved if there are any failures.
Here's an example of how you might do this:
# in report.rb
class Report < ActiveRecord::Base
  has_many :user_reports
  validates_associated :user_reports
end

# in user_report.rb
class UserReport < ActiveRecord::Base
  belongs_to :report
  has_many :comments
  validates_associated :comments
end

# in your controller or wherever you're doing this
report = Report.new
User.pluck(:id).each { |user_id| report.user_reports.build(user_id: user_id) }
report.user_reports.each { |user_report| user_report.comments.build }
report.save # saves either everything or nothing, so no inconsistencies
Note the use of #new and #build to avoid committing anything until the final line. The validates_associated lines in the models cause any validation errors on the child objects to propagate to the parent object, preventing it from saving even if the parent object itself passes validation.
Related
I have two ActiveRecord models having a HABTM relationship with each other.
When I add an AccessUnit through a form that lets zones be added via checkboxes, I get an exception saying the AccessUnitUpdaterJob can't be enqueued because the access unit passed to it can't be serialized (its id is missing). Manually calling save on the primary object resolves the issue, but that is of course a workaround and not a proper fix.
TLDR: it seems the after_add callback is triggered before the main object is saved. I'm actually unsure whether this is a bug in Rails or expected behavior. I'm using Rails 5.
The exact error I encounter is:
ActiveJob::SerializationError in AccessUnitsController#create
Unable to serialize AccessUnit without an id. (Maybe you forgot to call save?)
Here's some code so you can see the context of the issue:
class AccessUnit < ApplicationRecord
  has_and_belongs_to_many :zones,
    after_add: :schedule_access_unit_update_after_zone_added_or_removed,
    after_remove: :schedule_access_unit_update_after_zone_added_or_removed

  def schedule_access_unit_update_after_zone_added_or_removed(zone)
    # adding self.save here "solves" it but isn't a proper solution
    puts "Access unit #{name} added or removed to zone #{zone.name}"
    # the error is raised on this line
    AccessUnitUpdaterJob.perform_later self
  end
end

class Zone < ApplicationRecord
  has_and_belongs_to_many :access_units
end
From my point of view it is not a bug; everything works as expected. You can create a complex graph of objects before you save that graph. During this creation phase you can add objects to an association, and that is exactly the point in time when this callback should fire, because it is called after_add and not after_save.
For instance:
@post.tags.build name: "ruby"  # <= now you add the objects
@post.tags.build name: "rails" # <= now you add the objects
@post.save! # <= now it is too late for this callback; you have already added multiple objects
Maybe with a before_add callback it makes more sense:
class Post < ApplicationRecord
  has_many :tags, before_add: :check_state

  def check_state(_tag)
    if published?
      raise CantAddFurtherTags, "Can't add tags to a published Post"
    end
  end
end
@post = Post.new
@post.tags.build name: "ruby"
@post.published = true
@post.tags.build name: "rails" # <= you want the before_add callback to fire now, to know that you can't add this new object
@post.save! # <= and not here, where you can't determine which object caused the error
You can read a little more about these callbacks in the book "The Rails 4 Way".
In your case you have to rethink your logic. Maybe you can use an after_save callback.
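For instance, one rough sketch of that idea, using after_commit so the job is only enqueued once the record actually has an id (note that changes touching only the join table will not fire it, so check whether that matters for your case):

class AccessUnit < ApplicationRecord
  has_and_belongs_to_many :zones
  after_commit :schedule_access_unit_update, on: [:create, :update]

  def schedule_access_unit_update
    # the record is persisted and serializable at this point
    AccessUnitUpdaterJob.perform_later self
  end
end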
My 2 cents: consider switching from callbacks to a service object.
Callbacks don't come without a cost. They are not always easy to debug and test.
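A minimal sketch of that idea for this case (the class name, flow and access_unit_params helper are illustrative, not from your codebase):

class AccessUnitCreation
  def initialize(params)
    @access_unit = AccessUnit.new(params)
  end

  def call
    if @access_unit.save
      # enqueue only after the record is persisted and has an id
      AccessUnitUpdaterJob.perform_later(@access_unit)
    end
    @access_unit
  end
end

# in the controller
AccessUnitCreation.new(access_unit_params).call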
I want to preview what the model will look like when saved, without actually saving it to the database.
I am using @event.attributes = because that assigns attributes for @event without saving them to the database.
However, when I also try to assign the audiences association, Rails inserts new records into the audiences_events join table. Not cool. Is there a way to preview what these new associations will look like without inserting into the join table?
Model
class Event < ActiveRecord::Base
  has_and_belongs_to_many :audiences # and vice versa for the Audience model
end
Controller
class EventsController < ApplicationController
  def preview
    @event = Event.find(params[:id])
    @event.attributes = event_params
  end

  private

  def event_params
    params[:event].permit(:name, :start_time, :audiences => [:id, :name])
  end
end
Possible Solutions?
Possible solutions that I thought of, but don't know how to do:
Using some sort of method that assigns associations, but does not persist them.
Disabling all database writes for this one action (I don't know how to do that).
Rolling back all database changes at the end of this action
Any help with these would be great!
UPDATE:
After reading the great answers below, I ended up writing a service class that assigns the non-nested attributes to the Event model, then calls collection.build for each of the nested params. I made a little gist. Happy to receive comments/suggestions.
https://gist.github.com/jameskerr/69cedb2f30c95342f64a
In these docs you have:
When are Objects Saved?
When you assign an object to a has_and_belongs_to_many association, that object is automatically saved (in order to update the join table). If you assign multiple objects in one statement, then they are all saved.
If you want to assign an object to a has_and_belongs_to_many association without saving the object, use the collection.build method.
Here is a good answer for Rails 3 that goes over some of the same issues
Rails 3 has_and_belongs_to_many association: how to assign related objects without saving them to the database
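For the preview action specifically, a rough sketch combining attribute assignment with collection.build might look like this (assuming permitted params shaped like the ones in the question; the built audiences live only in memory, so nothing is written to the join table until save):

def preview
  @event = Event.find(params[:id])
  @event.attributes = event_params.except(:audiences)
  Array(event_params[:audiences]).each do |audience_params|
    @event.audiences.build(name: audience_params[:name])
  end
end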
Transactions
Creating transactions is pretty straightforward:
Event.transaction do
  @event.audiences.create!
  @event.audiences.first.destroy!
end
Or:
@event.transaction do
  @event.audiences.create!
  @event.audiences.first.destroy!
end
Notice the use of the "bang" methods create! and destroy!: unlike create, which returns false on failure, create! will raise an exception if it fails and cause the transaction to roll back.
You can also manually trigger a rollback anywhere in a transaction by raising ActiveRecord::Rollback.
Build
build instantiates a new related object without saving.
event = Event.new(name: 'Party')
event.audiences.build(name: 'Party People')
event.save # saves both the event and the built audience
I know that this is a pretty old question, but I found a solution that works perfectly for me and hope it could save someone else some time:
class A < ActiveRecord::Base
  has_many :bs, class_name: 'B'
end

class B < ActiveRecord::Base
  belongs_to :a, class_name: 'A'
end
a.bs.target.clear
new_bs.each { |new_b| a.bs.build(new_b.attributes.except('created_at', 'updated_at', 'id')) }
This avoids the autosave that Rails performs when you do a.bs = new_bs.
Rails 3.1.3 - ruby 1.9.3p194
I have 2 objects: Patient & Bill.
When a Patient gets destroyed the corresponding Bills get destroyed. Which is fine; however, these objects are also held remotely in Quickbooks and are updated through my application.
If a Bill is destroyed locally, my application deletes the object in Quickbooks. I do not destroy Patients in Quickbooks because there is most likely associated billing history stored there.
The problem arises when someone destroys a Patient locally: it calls destroy for all associated Bills, which fires the destroy method for the Quickbooks Bills.
Is there a way to tell, from the Bill model, whether patient.destroy has been called?
I am assuming you have something like this
class Patient < ActiveRecord::Base
  has_many :bills, dependent: :destroy
end

class Bill < ActiveRecord::Base
  belongs_to :patient
end
You could change this to
class Patient < ActiveRecord::Base
  has_many :bills, dependent: :delete_all
end
destroy runs callbacks, which is what you are using to remove bills from Quickbooks.
delete has no callbacks; it is just a straight deletion from the database.
Thus if you destroy a patient it will delete all of its Bills locally but will not run the callbacks that remove them from Quickbooks.
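This assumes the Quickbooks removal lives in a destroy callback on Bill, something like the sketch below (method name illustrative); destroy runs it, while delete/delete_all skips it:

class Bill < ActiveRecord::Base
  belongs_to :patient
  before_destroy :remove_from_quickbooks # fired by bill.destroy, skipped by delete / delete_all

  def remove_from_quickbooks
    # call your Quickbooks client here
  end
end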
Another approach to this issue would be to remove the callbacks and to perform the deletion with a separate coordinating object, which would get called from your controller:
class PatientAndBillRemoteCleanup
  def initialize(patient)
    @patient = patient
  end

  def delete
    Patient.transaction do
      # fill out these methods;
      # the transaction rolls back if any of them fail
      delete_patients_bills_in_quickbooks
      delete_patients_bills_locally
      delete_patient_locally
    end
  end
end
The advantages of this approach are:
It's much easier to test.
You can isolate remote deletion to a known process (so you don't get unintended effects at the Rails console).
Any place where this code is used (such as the controller where web users trigger deletion) is easy to understand, and someone else won't accidentally write patient-deleting code elsewhere that unexpectedly manipulates Quickbooks.
Callbacks are great for situations where you want to do something every time (such as set some default data that relies on another attribute), but you have a situation here where sometimes you want the callback behavior, and sometimes you don't. Move it out to a coordinating object and then you can easily control the behavior.
I have a model:
class A < ActiveRecord::Base
  has_many :bs
end
And I want to reset or update A's B association, but only save it later:
a = A.find(...)
# a.bs == [B<...>, B<...>]
a.bs = []
#or
a.bs = [B.new, B.new]
# do some validation stuff on `a` and `a.bs`
So there might be some case where I will call a.save later, or maybe not. In the case where I don't call a.save, I would like a.bs to keep its original value; but as soon as I call a.bs = [], the old associations are destroyed and now A.find(...).bs == []. Is there any simple way to set a record's association without persisting it to the database right away? I looked at the Rails source and didn't find anything that could help me there.
Thanks!
Edit:
I should add that this is for an existing application and there are some architecture constraints that don't allow us to use the regular ActiveRecord updating and validation tools. The way it works, we have a set of Updater classes that take params and assign the checkout object the values from params. There is then a set of Validator classes that validate the checkout object for each given param. Finally, if everything is good, we save the model.
In this case, I'm looking to update the association in an Updater, validate it in a Validator and finally persist it if everything checks out.
In summary, this would look like:
def update
  apply_updaters(object, params)
  # do some stuff with the updated object
  if validate(object)
    object.save(validate: false)
  end
end
Since there is a lot of stuff going on between apply_updaters and object.save, transactions are not really an option. This is why I'm really looking to update the association without persisting it right away, just like I would do with any other attribute.
So far, the closest I've got to a solution is rewriting the association cache (target). This looks something like:
# In the updater
a.bs.target.clear
params[:bs].each { |b| a.bs.build(b) }
# a.bs now contains the new objects without any update happening in the database
When the time comes to save, we need to persist the cache:
new_objects = a.bs.target
a.bs(true).replace(new_objects)
This works, but it feels kind of hackish and could easily break or have undesired side effects. An alternative I'm thinking about is to add a method A#new_bs= that caches the assigned objects and an A#bs that returns the cached objects if available.
Good question.
I would advise using attribute assignment instead of collection manipulation. All validations will be performed as usual, after save or another 'persisting' method. You can write your own method (in the model or in a separate validator) which validates the collection.
You can delete and add elements to the collection through attributes: deletion is performed via the additional attribute _destroy, which may be true or false (http://api.rubyonrails.org/classes/ActiveRecord/NestedAttributes/ClassMethods.html), and addition by setting up the parent model to accept nested attributes.
As an example, set up model A:
class A < ActiveRecord::Base
  has_many :bs
  accepts_nested_attributes_for :bs, :allow_destroy => true
  validates_associated :bs # to validate each element
  validate :b_is_correct   # to validate the whole collection

  def b_is_correct
    self.bs.each { |b| ... } # validate the collection
  end
end
In the controller use plain attribute updating (e.g. update(a_attributes)). These methods will behave like flat attribute updates. And don't forget to permit the attributes for the nested collection.
class AController < ApplicationController
  def update
    @a = A.find(...)
    @a.update(a_attributes) # triggers validation; if an error occurs, no changes are persisted and @a.errors is populated
  end

  def a_attributes
    params.require(:a).permit(:attr_of_a, :bs_attributes => [:attr_of_b, :_destroy])
  end
end
On the form we used the gem nested_form (https://github.com/ryanb/nested_form), which I recommend. On the server side this approach uses the _destroy attribute as mentioned before.
I finally found out about the mark_for_destruction method. My final solution therefore looks like:
a.bs.each(&:mark_for_destruction)
params[:bs].each{|b| a.bs.build(b)}
And then I can filter out the marked_for_destruction? entries in the subsequent processing and validation.
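For example, the later validation and processing could skip the doomed records with something like:

a.bs.reject(&:marked_for_destruction?).each do |b|
  # validate / process only the records that will survive a.save
end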
Thanks @AlkH, that made me look into how accepts_nested_attributes_for works and handles delayed destruction of associations.
I know that before_create is called before the object gets committed to the database and after_create gets called after.
The only time when before_create will get called and after_create will not is if the object fails to meet database constraints (unique key, etc.). Other than that, I could place all the logic from after_create in before_create.
Am I missing something?
In order to understand these two callbacks, first you need to know when they are invoked. Below is the ActiveRecord callback ordering:
(-) save
(-) valid
(1) before_validation
(-) validate
(2) after_validation
(3) before_save
(4) before_create
(-) create
(5) after_create
(6) after_save
(7) after_commit
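To see this order in practice, you can add logging callbacks to a throwaway model (a sketch, assuming an Article model with a title column):

class Article < ActiveRecord::Base
  before_validation { puts "before_validation" }
  after_validation  { puts "after_validation" }
  before_save       { puts "before_save" }
  before_create     { puts "before_create" }
  after_create      { puts "after_create" }
  after_save        { puts "after_save" }
  after_commit      { puts "after_commit" }
end

Article.create!(title: "Hello") # prints the callbacks in the order listed above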
You can see that before_create is called after after_validation; to put it simply, this callback runs once your record has passed validation. before_create is normally used to set some extra attributes after validation.
Now move on to after_create: you can see it is invoked after the record has been stored persistently in the DB. People normally use it for things like sending notifications or logging.
And for the question of when you should use them, my answer is 'you should not use them at all'. ActiveRecord callbacks are an anti-pattern and seasoned Rails developers consider them a code smell; you can achieve all of this by wrapping the logic in a service object. Here is one simple example:
class Car < ActiveRecord::Base
  before_create :set_mileage_to_zero
  after_create :send_quality_report_to_qa_team
end
can be rewritten as:
# app/services/car_creation.rb
class CarCreation
  attr_reader :car

  def initialize(params = {})
    @car = Car.new(params)
    @car.mileage = 0
  end

  def create_car
    if car.save
      send_report_to_qa_team
    end
  end

  private

  def send_report_to_qa_team
  end
end
If you have a simple app, then callbacks are okay, but as your app grows you will be scratching your head wondering what set this or that attribute, and testing will be very hard.
On second thought, I still think you should use callbacks extensively and experience the pain of refactoring them; then you'll learn to avoid them ;) good luck
The before_create callback can be used to set attributes on the object before it is saved to the database. For example, generating a unique identifier for a record. Putting this in an after_create would require another database call.
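A sketch of that, assuming a hypothetical Order model with a reference column:

class Order < ActiveRecord::Base
  before_create :generate_reference

  private

  def generate_reference
    self.reference ||= SecureRandom.hex(8) # set before the INSERT, so no second query is needed
  end
end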
before_create:
This is called before saving a new object to the DB. If this method returns false, it prevents the creation by rolling back.
So when you need to check something before saving that isn't appropriate as a validation, you can do it in before_create.
For example: before creating a new Worker, ask the Master for permission.
before_create :notify_master

def notify_master
  # notify the master via IPC;
  # if the response is true, return true and the creation goes through,
  # else return false and roll back
end
Another use, as Trung Lê suggested, is to format some attribute before saving, like capitalizing a name, etc.
after_create:
This is called after the object has been saved in the database for the first time. It is useful when you don't want to interrupt creation and just want to take note of it or trigger something afterwards.
For example: after creating a new user with the mod role, we want to notify the other mods.
after_create :notify_mod, if: :is_mod?

def notify_mod
  # send a notification to all other mods
end
EDIT: for the comment below
Q: What's the advantage of putting notify_mod in after_create instead of before_create?
A: Sometimes, while saving the object to the database, the save can roll back due to database-side constraints or other issues.
If you had written notify_mod in before_create, it would be processed even when the creation does not go through. The transaction will still roll back, but the callback work has already been done, which is wasted overhead and time.
If you place it in after_create, notify_mod will only execute if the record was created successfully, reducing the overhead when a rollback takes place.
Another reason is that it's logical for the notification to be sent after the user is created, not before.