I'm trying to prevent deletion of models from the db and pretty much follow this guide (see 9.2.5.3 Exercise Your Paranoia with before_destroy) from a Rails 4 book.
I have a simple model:
class User < ActiveRecord::Base
  before_destroy do
    update_attribute(:deleted_at, Time.current)
    false
  end
end
and in the controller:
def destroy
  @user = User.find(params[:id])
  # @user.update!(deleted_at: Time.zone.now) # if I do it here it works
  @user.destroy # if I also comment this line...
  render :show
end
The callback gets called and the attribute gets set, but then the database transaction always gets rolled back. If I leave out returning false, the model gets deleted because the execution of destroy is not halted.
As you can see in the comments I can get it to work but what I really want to do is use a Service Object and put the logic out of the controller.
If your callback returns false, the transaction will always be rolled back.
For what you want, you should not call the destroy method on your ActiveRecord object.
Instead, write your own method like soft_destroy or something like that and update your attribute there.
And to prevent others from calling destroy on the model, add a callback that raises an exception, for instance.
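For illustration, a minimal sketch of that idea might look like this (soft_destroy and the error message are only placeholder names, not anything from the original post):

class User < ActiveRecord::Base
  before_destroy do
    # Guard against accidental hard deletes from anywhere in the app.
    raise ActiveRecord::ReadOnlyRecord, 'Users must be soft-destroyed'
  end

  # Call this instead of destroy.
  def soft_destroy
    update_attribute(:deleted_at, Time.current)
  end
end

A service object could then simply call user.soft_destroy instead of user.destroy.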
Your model is just an object. If you really want to change the concept of destroy, change it:
def destroy
  condition ? alt_action : super
end
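For example (just a sketch, assuming a deleted_at column as in the question above):

def destroy
  # Soft-delete by stamping deleted_at; only fall through to the real
  # destroy for records that have already been soft-deleted.
  deleted_at.nil? ? update_attribute(:deleted_at, Time.current) : super
end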
Related
I'm using Rails 4 with Oracle 12c and I need to update the status of a User, and then use the new status in a validation for another model I also need to update:
class User < ActiveRecord::Base
  has_many :posts

  def custom_update!(new_status)
    relevant_posts = posts.active_or_something
    ActiveRecord::Base.transaction do
      update!(status: new_status)
      relevant_posts.each { |post| post.update_stuff! }
    end
  end
end
class Post < ActiveRecord::Base
  belongs_to :user

  validate :pesky_validation

  def update_stuff!
    # I can call this from other places, so I also need a transaction here
    ActiveRecord::Base.transaction do
      update!(some_stuff: 'Some Value')
    end
  end

  def pesky_validation
    if user.status == OLD_STATUS
      errors.add(:base, 'Nope')
    end
  end
end
However, this is failing and I receive the validation error from pesky_validation, because the user inside Post doesn't have the updated status.
The problem is that when I first update the user, the already instantiated users inside the relevant_posts variable are not yet updated. Normally all I'd need to do to fix this would be to call reload, but, maybe because I'm inside a transaction, this is not working and pesky_validation is failing.
relevant_posts.first.user.reload, for example, reloads the user to the same old status it had before the update, and I'm assuming that's because the transaction is not yet committed. How can I solve this and update all references to the new status?
I have a model Country and therefore a table countries. The countries table acts as a collection of ISO country and currency codes and should never lose any of its content (after I have filled it with seed data). Because Country is a subclass of ActiveRecord::Base, it inherits class methods like destroy, delete_all and so forth which delete records. I'm looking for a solution to prevent the deletion of records at the model level.
Of course I know that I can use an object-oriented approach to solve this problem by overriding these methods (and raising an error when they are called, for instance), but that assumes I know all the inherited methods of the base class. I would be glad if someone could offer a more elegant solution.
Taking inspiration from Mark Swardstrom's answer, I propose the following, which also works on Rails >= 5.0:
Within your model:
before_destroy :stop_destroy

def stop_destroy
  errors.add(:base, :undestroyable)
  throw :abort
end
With this in place, all calls to model.destroy will return false and your model will not be deleted.
You can argue that calls to model.delete will still work and delete your record, but since those are lower-level calls this makes perfect sense to me.
You could also delete the records directly from the database if you wanted, but the above prevents deletion at the application level, which is the right place to check it.
Rubocop checks your calls to delete or delete_all and raises a warning, so you can be 100% sure that if you call model.delete it's because you really want to.
My solution works on latest Rails versions where you need to throw :abort instead of returning false.
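Assuming a Country model with that callback, usage looks roughly like this:

country = Country.first
country.destroy       # => false, the row stays in the database
country.errors[:base] # => contains the :undestroyable message (exact text depends on your translations)
country.destroy!      # => raises ActiveRecord::RecordNotDestroyed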
There's a before_destroy callback, maybe you could take advantage of that.
before_destroy :stop_destroy

def stop_destroy
  self.errors[:base] << "Countries cannot be deleted"
  return false
end
Rails 4+ (including Rails 6)
In our case, we wanted to prevent an object from being destroyed if it had any associated records. This would be unexpected behaviour, so we wanted an exception raised with a helpful error message.
We used the native ActiveRecord::RecordNotDestroyed class with a custom message as described below.
class MyClass < ApplicationRecord
  has_many :associated_records

  before_destroy :check_if_associated_records

  private

  def check_if_associated_records
    # set a guard clause to check whether the record is safe to destroy
    return unless associated_records.exists?

    raise ActiveRecord::RecordNotDestroyed, 'Cannot destroy because...'
  end
end
Behaviour with destroy and destroy!
my_class = MyClass.first.destroy
# => false
my_class = MyClass.first.destroy!
# => ActiveRecord::RecordNotDestroyed (Cannot destroy because...)
Note: If you have a belongs_to or has_one instead of a has_many association, your method will look like:
def check_if_associated_records
  # set a guard clause to check whether the record is safe to destroy
  return unless associated_record

  raise ActiveRecord::RecordNotDestroyed, 'Cannot destroy because...'
end
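In a controller you might then handle the failed destroy roughly like this (a sketch; the paths and messages are placeholders):

def destroy
  @my_class = MyClass.find(params[:id])
  @my_class.destroy!
  redirect_to my_classes_url, notice: 'Record was destroyed.'
rescue ActiveRecord::RecordNotDestroyed => e
  redirect_to my_classes_url, alert: e.message
end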
In Rails 6 this is how I prevent records from being deleted.
The exception raised will roll back the transaction that is being used by ActiveRecord, therefore preventing the record from being deleted.
class MenuItem < ApplicationRecord
  after_destroy :ensure_home_page_remains

  class Error < StandardError
  end

  protected # or private, whatever you need

  # Raise an error that you trap in your controller to prevent your record from being deleted.
  def ensure_home_page_remains
    if menu_text == "Home"
      raise Error.new "Can't delete home page"
    end
  end
end
So the ensure_home_page_remains method raises a MenuItem::Error, which causes the transaction to be rolled back. You can trap it in your controller and take whatever action you feel is appropriate, normally just showing the error message to the user after redirecting somewhere. e.g.
# DELETE /menu_items/1
# DELETE /menu_items/1.json
def destroy
  @menu_item.destroy
  respond_to do |format|
    format.html { redirect_to admin_menu_items_url, notice: 'Menu item was successfully destroyed.' }
    format.json { head :no_content }
  end
end
# Note: the rescue_from block is outside the destroy method
rescue_from 'MenuItem::Error' do |exception|
  redirect_to menu_items_url, notice: exception.message
end

private
# etc...
I have an update method inside my DailyOrdersController:
def update
  if @daily_order.update( daily_order_params.merge({default_order: false}) )
    respond_or_redirect(@daily_order)
  else
    render :edit
  end
end
My DailyOrder model:
before_save :refresh_total

def refresh_total
  # I do something here
end
What I'm trying to do now is skip the refresh_total callback when the update request comes from current_admin.
I have 2 user models generated with the Devise gem:
User (has current_user)
Admin (has current_admin)
I tried to make it work like this:
def update
  if current_admin
    DailyOrder.skip_callback :update, :before, :refresh_total
  end

  if @daily_order.update( daily_order_params.merge({default_order: false}) )
    respond_or_redirect(@daily_order)
  else
    render :edit
  end
end
But it's not working: the refresh_total callback still gets called when the update request comes from current_admin (i.e. when the logged-in user is an admin).
What should I do now?
I think this is all you need:
http://guides.rubyonrails.org/active_record_callbacks.html#conditional-callbacks
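For example, one way to make the callback conditional is a plain flag on the model (a sketch; skip_refresh_total is just a made-up accessor name, not part of your code):

# in the model
class DailyOrder < ActiveRecord::Base
  attr_accessor :skip_refresh_total
  before_save :refresh_total, unless: :skip_refresh_total

  def refresh_total
    # i do something here
  end
end

# in the controller
@daily_order.skip_refresh_total = true if current_admin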
If you skip a callback, you should enable it again later. Anyway, this does not look like the best solution. Perhaps you could avoid the callbacks some other way.
One way would be to use update_all:
DailyOrder.where(id: @daily_order.id).update_all( daily_order_params.merge({default_order: false}) )
Or you could do something like this:
# in the model:
before_validation :refresh_total

# in the controller:
@daily_order.assign_attributes( daily_order_params.merge({default_order: false}) )
@daily_order.save(validate: current_admin.nil?)
Or maybe it would be best to add a new column to the model, refresh_needed: you would conditionally set that column in before_validation, and in before_save you would still call the same callback, but conditional on the state of refresh_needed. In that callback you should reset the column. A rough sketch of this idea follows.
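A rough sketch of that refresh_needed idea (assuming you add a boolean refresh_needed column; relevant_attributes_changed? is only a placeholder for whatever check you need):

class DailyOrder < ActiveRecord::Base
  before_validation :mark_refresh_needed
  before_save :refresh_total, if: :refresh_needed?

  private

  def mark_refresh_needed
    # Decide here whether this change warrants recalculating the total.
    self.refresh_needed = true if relevant_attributes_changed?
  end

  def refresh_total
    # recalculate the total...
    self.refresh_needed = false # reset the flag, as described above
  end
end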
This may come in handy:
http://www.davidverhasselt.com/set-attributes-in-activerecord/
UPDATE
Even better, you can call update_columns.
Here is what it says in the documentation of the method:
Updates the attributes directly in the database issuing an UPDATE SQL
statement and sets them in the receiver:
user.update_columns(last_request_at: Time.current)
This is the fastest way to update attributes because it goes straight to
the database, but take into account that in consequence the regular update
procedures are totally bypassed. In particular:
Validations are skipped.
Callbacks are skipped.
updated_at / updated_on are not updated.
This method raises an ActiveRecord::ActiveRecordError when called on new
objects, or when at least one of the attributes is marked as readonly.
I have a basic authentication system just like in Michael Hartl's Ruby on Rails Tutorial. Basically, a remember token is stored in a cookie. I implemented Ryan Bates's Beta Invitations from Railscast #124, where you can send a limited number of invitations. While doing that, I ran into the problem that the current user got logged out after sending an invitation. This was caused by this code in the invitation model:
invitation.rb
belongs_to :sender, :class_name => 'User'
[...]
before_create :decrement_sender_count, :if => :sender
[...]
def decrement_sender_count
  sender.decrement! :invitation_limit
end
In the logs I saw that sender.decrement! not only updated the invitation_limit but the remember_token as well:
UPDATE "users" SET "invitation_limit" = 9982, "remember_token" = 'PYEWo_om0iaMjwltU4iRBg', "updated_at" = '2012-07-06 09:57:43.354922' WHERE "users"."id" = 1
I found an ugly workaround but I would love to know what the problem really is. Since I don't know where to start, I'll show you the update method from the users controller. What else could be relevant?
users_controller.rb
def update
  @user = User.find(params[:id])
  if @user.update_attributes(params[:user])
    flash[:success] = t('success.profile_save')
    sign_in @user
    redirect_to @user
  else
    flash.now[:error] = t('error.profile_save')
    render 'edit'
  end
end
decrement! calls save which of course fires save callbacks. It looks like the book directs you to do
before_save :create_remember_token

def create_remember_token
  self.remember_token = SecureRandom.urlsafe_base64
end
which means that saving a user will always invalidate the remember token. I assume this is so that when a user changes their password the remember token changes too, but it means that there is obviously some collateral damage.
You could use decrement_counter, which in essence does
update users set counter_name = counter_name - 1 where id = 12345
without running any callbacks. This also avoids some race condition scenarios. However, changing the token whenever the user changes is bound to change it at times when you don't expect it - you might want to only change it when relevant (perhaps when credentials have changed).
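In code that would be roughly (using the User model and invitation_limit column from the question):

# Issues a single UPDATE and skips callbacks and validations entirely.
User.decrement_counter(:invitation_limit, sender.id)

And if you only want the token regenerated when it matters, you could make the callback conditional, e.g. before_save :create_remember_token, if: :password_digest_changed? (the exact column to check depends on your schema).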
In my opinion you're experiencing a common pitfall in ActiveRecord update methods:
Having a look at the ActiveRecord documentation, you can see the actual implementation of the decrement! method:
def decrement!(attribute, by = 1)
  decrement(attribute, by).update_attribute(attribute, self[attribute])
end
The interesting part is the update_attribute, which is called on the self object - although the name implies ActiveRecord will only update the specified attribute, it really updates all dirty attributes of the self object.
That means that if, at any point, you changed the remember_token attribute of the object, it will be saved to the DB during that update_attribute call.
If I'm right and that's the problem - just make sure no changes are made to the remember_token attribute at runtime.
Other than that, you may consider using ActiveRecord's update_column method, which will update a single column in the DB without running the save method on the object.
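For example (a sketch, reusing the invitation_limit column from the question):

# Writes only invitation_limit; no validations, no callbacks, no updated_at bump.
sender.update_column(:invitation_limit, sender.invitation_limit - 1)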
Is it possible to send variables in the transition? e.g.
@car.crash!(:crashed_by => current_user)
I have callbacks in my model but I need to send them the user who instigated the transition
after_crash do |car, transition|
  # Log the car crasher's name
end
I can't access current_user because I'm in the Model and not the Controller/View.
And before you say it... I know I know.
Don't try to access session variables in the model
I get it.
However, whenever you wish to create a callback that logs or audits something, it's quite likely you're going to want to know who caused it. Ordinarily I'd have something in my controller that did something like...
@foo.some_method(current_user)
and my Foo model would be expecting some user to instigate some_method, but how do I do this with a transition with the StateMachine gem?
If you are referring to the state_machine gem - https://github.com/pluginaweek/state_machine - then it supports arguments to events
after_crash do |car, transition|
  Log.crash(car: car, crashed_by: transition.args.first)
end
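The args are simply whatever you pass when firing the event, so the controller call from the question becomes something like:

@car.crash!(current_user)  # available in the callback as transition.args.first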
I was having trouble with all of the other answers, and then I found that you can simply override the event in the class.
class Car
  state_machine do
    ...
    event :crash do
      transition any => :crashed
    end
  end

  def crash(current_driver)
    logger.debug(current_driver)
    super
  end
end
Just make sure to call "super" in your custom method
I don't think you can pass params to events with that gem, so maybe you could try storing the current_user on @car (temporarily) so that your audit callback can access it.
In controller
@car.driver = current_user
In callback
after_crash do |car, transition|
  create_audit_log car.driver, transition
end
Or something along those lines.. :)
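A sketch of that idea (assuming driver is not already a column or association on Car, a plain attr_accessor is enough):

class Car < ActiveRecord::Base
  # Transient, non-persisted holder for "who did this";
  # set it from the controller right before firing the event.
  attr_accessor :driver
end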
I used transactions instead of updating the object and changing the state in one call. For example, in the update action:
ActiveRecord::Base.transaction do
  if @car.update(:crashed_by => current_user)
    if @car.crash!
      format.html { redirect_to @car }
    else
      raise ActiveRecord::Rollback
    end
  else
    raise ActiveRecord::Rollback
  end
end
Another common pattern (see the state_machine docs) that saves you from having to pass variables between the controller and model is to dynamically define a state-checking method within the callback method. This wouldn't be very elegant in the example given above, but might be preferable in cases where the model needs to handle the same variable(s) in different states. For example, if you have 'crashed', 'stolen', and 'borrowed' states in your Car model, all of which can be associated with a responsible Person, you could have:
state :crashed, :stolen, :borrowed do
  def blameable?
    true
  end
end

state all - [:crashed, :stolen, :borrowed] do
  def blameable?
    false
  end
end
Then in the controller, you can do something like:
car.blame_person(person) if car.blameable?