I find myself in this situation very often. When I'm in a rush I sometimes just take it for granted that the record will be saved correctly, but I feel that is not good practice. I also sometimes see code placing the save inside an if condition. So the question arises: what are the situations where a record cannot be saved?
what are the situations where a record cannot be saved?
If any of your validations fail (or, of course, if a hardware failure, database connection loss, etc. occurs).
Should I throw an exception if an item cannot be saved?
If you want an invalid record to result in an exception being thrown, you don't need to do it yourself. Rails can already do it:
If you have a User model with a couple of validations (email and name must be present), you could:
user.save!
With save!, validations always run. If any of them fail, ActiveRecord::RecordInvalid is raised.
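As a minimal sketch (the model and attribute names follow the example above; this assumes a plain Active Record model):

class User < ActiveRecord::Base
  validates :email, presence: true
  validates :name, presence: true
end

user = User.new              # email and name are both blank
begin
  user.save!                 # validations run; failure raises
rescue ActiveRecord::RecordInvalid => e
  e.record.errors.full_messages  # => ["Email can't be blank", "Name can't be blank"]
end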
But you probably don't want an exception to be raised in such a case, because it is rather "common" for a user to not enter a valid password, for example. You should still handle the errors, and the way this is commonly done is:
if user.save
  # success: the record was saved
else
  # handle error
end
By default, save always runs validations. If any of them fail, the action is cancelled and save returns false.
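In the error branch, the reasons why the record was not saved are available on the errors object. A small sketch, assuming the User model above:

user.save                   # => false
user.errors.full_messages   # => ["Email can't be blank", "Name can't be blank"]
user.errors[:email]         # => ["can't be blank"]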
As a general guideline for choosing between conditionals and exceptions I like this statement from DHH:
Why would the delivery of the emails fail? Because your SMTP server is down? That's an exceptional state, handle it with exceptions -- not with conditions.
Related
I'm trying to manage a HABTM relationship with a uniqueness constraint.
i.e. I want my User to
has_and_belongs_to_many :tokens
But I don't want the same token to be associated with a given user more than once.
I put a unique index on the join table
add_index :users_tokens, [:user_id, :token_id], unique: true
which correctly results in a ActiveRecord::RecordNotUnique exception being thrown if the code tries to add the same token to a given user more than once.
In my code I was hoping to just silently catch/swallow this exception, something like this:
begin
  user.tokens << token
rescue ActiveRecord::RecordNotUnique
  # nothing to do here since the user already has the token
end
However, I'm running into a problem where the RecordNotUnique exception gets thrown much later in my code, when my user object gets modified for something else.
So some code calls something like
...
# The following line throws ActiveRecord::RecordNotUnique
# for user_tokens, even though
# we are not doing anything with tokens here:
user.update_counters
It's as if the association remembers that it's 'dirty' or unsaved, and then tries to save the record that didn't get saved earlier, and ends up throwing the exception.
Any ideas where to look to see if the association actually thinks it's dirty, and/or how to reset its 'dirty' state when I catch the exception?
ActiveRecord maintains, in the application layer, an object representation of the records in the database, including relationships to other objects, and endeavours to keep that application-layer representation in sync with the database. When you assign the token to the user like this:
user.tokens << token
ActiveRecord first looks for any application-level validations that would prevent the assignment. Finding none, it links the token to the user in the application layer, then issues the DB request necessary to make the same connection in the DB layer. The DB has a constraint that prevents it, so an error is raised. You rescue from the error and continue, but the application-level connection between the two objects is still in place. The next time you make any edit to that same user object through ActiveRecord, it will again try to bring the DB into sync with how the object is represented in the application, and since the connection to the token is still there, it will make another attempt to insert it in the DB; this time there is no rescue for the error that arises.
So when you rescue from the database error, you must also undo the application-level change, like this:
begin
  user.tokens << token
rescue ActiveRecord::RecordNotUnique
  # undo the in-memory link so it isn't re-saved later
  user.tokens.delete(token)
end
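An alternative, not from the answer above but a common pattern, is to avoid raising at all by checking for the token first, at the cost of an extra query (the unique index still protects you against races):

user.tokens << token unless user.tokens.exists?(token.id)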
I have a copy method that duplicates an object and then changes some of its attributes. When saving this, it gives me an ActiveRecord::RecordInvalid error on Name. However, the name attribute does not have a uniqueness constraint, so this should not be failing.
Furthermore, the name HAS been changed so it is unique, and debugging the method indicates this is the case. How can I be getting this error on a field that doesn't have a uniqueness constraint, and IS unique?
I've seen a bunch of questions about this related to RSpec, but this is not in a testing environment, so it's not a DB problem.
I realize I haven't posted code - I'm looking for general answers on what could possibly cause something like this.
It would be much easier to pinpoint the actual problem if you could show your code and your Rails version. But if you are looking for a general answer, it is that RecordInvalid is raised by the bang methods, mainly save! and validate!, and by other methods that call those two underneath, like create! and update!. This exception is raised by those methods when validation fails. And validation can fail for a million reasons, depending on your validation setup.
This exception can also be raised when those methods are called on invalid associated records.
I also think that validation may fail when you have defined a custom validation of your own that adds to errors.
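For instance, a custom validation that adds to errors will make the bang methods raise RecordInvalid (the model and attribute names here are made up for illustration):

class Copy < ActiveRecord::Base
  validate :name_is_not_reserved   # hypothetical custom validation

  def name_is_not_reserved
    errors.add(:name, "is reserved") if name == "admin"
  end
end

Copy.new(name: "admin").save!   # raises ActiveRecord::RecordInvalid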
I am not sure I totally understand the role of Active Record validations.
Of course, if a user inputs data (like an email or a country), I can and should validate its existence, its uniqueness or its inclusion in a list of countries
But, for example, if I have methods in the backend that change an attribute like page_clicked or click_date, or even the column updated_at, that I "control" (i.e. that is not generated by a user's input), should I use Active Record validations?
I'm asking this because, on a very 'hot' database (one that needs speed for millions of frequent updates), I wonder whether checking on each update that updated_at is a datetime and that a clicked column is true/false (and nothing else) is really necessary, since the user is not the one inputting/controlling these data; I am, through custom Rails methods I wrote.
Thanks
I don't think there is a general satisfying answer to your question. It's up to you to enforce validation or not.
Remember that you don't have to use ActiveRecord for validation; you can also use your DBMS to ensure that:
a value will never be NULL (one of the most annoying errors)
a value has the correct TYPE
a FOREIGN KEY always points to an existing row in another table
and depending on your DBMS, a lot more is possible
If you need high INSERT speed and want to go with raw SQL INSERTS, putting some validation in your database can prevent nasty application errors later.
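As a sketch of what that can look like in a migration (the table, column names, and migration version are made up for this example):

class CreateClicks < ActiveRecord::Migration[5.2]
  def change
    create_table :clicks do |t|
      t.boolean  :clicked, null: false, default: false    # correct type, never NULL
      t.datetime :click_date, null: false                 # never NULL
      t.references :page, null: false, foreign_key: true  # FK must point to an existing page
      t.timestamps
    end
  end
end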
Validations should guard your database; their job is to stop records that your application considers invalid from being saved.
There is no hard rule on what a valid record is; you have to decide that yourself by adding the validations. If the record doesn't pass the validation step, it simply won't be saved to the database.
From Active Record Callbacks:
3.1 Creating an Object
before_validation
after_validation
before_save
around_save
before_create
around_create
after_create
after_save
after_commit/after_rollback
3.2 Updating an Object
before_validation
after_validation
before_save
around_save
before_update
around_update
after_update
after_save
after_commit/after_rollback
You can see that validation hooks run at the beginning of the object life cycle.
So in your case, instead of asking yourself:
Should I use active record validations if the record is not generated by a user's input.
You should ask yourself:
Is this record invalid without page_clicked or click_date (i.e. with them being nil)?
UPDATE
If you consider the record to be invalid but are worried about the speed cost of running validations, I would still do the validations, to make sure that all the records in the database are valid, and try to find a way to optimise the speed somewhere else. Plus, I'm not 100% sure, but the time spent saving invalid records and filtering them out later will probably be much longer than validating in the first place.
When performance is really a priority and I am sure that we developers / the server are the only ones who can manipulate specific attributes of a model, I will:
Make sure that I create a separate method / wrapper method for this specific action.
In this specific method, I call .save(validate: false) instead of the usual .save
I still write validations for the said attributes for the developers' reference, to prevent future development errors, and in case a new developer comes in and accidentally saves an invalid record precisely because there's no validation to safeguard against it.
Or, I will use .update_column instead of .save(validate: false) to perform a direct DB call, skipping model validations and callbacks (if you also do not want callbacks to be called).
Note that .update_column is different from .update.
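A sketch of that wrapper approach (the model, attribute, and method names are assumptions, not from the question; update_columns is the multi-column form of update_column):

class Page < ActiveRecord::Base
  # Validations kept for reference and to guard normal saves
  validates :click_date, presence: true

  # Only the server calls this, so we deliberately skip validations here
  def register_click
    self.page_clicked = true
    self.click_date = Time.current
    save(validate: false)   # skips validations, still runs callbacks
  end

  # Direct DB write: skips validations AND callbacks
  def register_click_fast
    update_columns(page_clicked: true, click_date: Time.current)
  end
end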
What's the correct way to rescue an exception and simply continue processing? I have an app that has Folders and Items, with a habtm relationship through a join table called folders_items. That table has a unique constraint ensuring that there are no duplicate item/folder combinations. If the user tries to add an item to the same folder several times, I obviously don't want the additional rows added; but I don't want to stop processing, either.
Postgres automatically throws an exception when the unique constraint is violated, so I tried to ignore it in the controller as follows:
rescue_from PG::Error, :with => :do_nothing
def do_nothing
end
This works fine on single insertions. The controller executes the render with a status code of 200. However, I have another method that does bulk inserts in a loop. In that method, the controller exits the loop when it encounters the first duplicate row, which is not what I want. At first, I thought that the loop must be getting wrapped in a transaction that's getting rolled back, but it isn't -- all the rows prior to the duplicate get inserted. I want it to simply ignore the constraint exception and move to the next item. How do I prevent the PG::Error exception from interrupting this?
In general, your exception handling should be at the closest point to the error that you can do something sensible with the exception. In your case, you'd want your rescue inside your loop, for example:
stuff.each do |h|
  begin
    Model.create(h)
  rescue ActiveRecord::RecordNotUnique => e
    next if(e.message =~ /unique.*constraint.*INDEX_NAME_GOES_HERE/)
    raise
  end
end
A couple points of interest:
A constraint violation inside the database will give you an ActiveRecord::RecordNotUnique error rather than the underlying PG::Error. AFAIK, you'd get a PG::Error if you were talking directly to the database rather than going through ActiveRecord.
Replace INDEX_NAME_GOES_HERE with the real name of the unique index.
You only want to ignore the specific constraint violation that you're expecting, hence the next if(...) bit followed by the argumentless raise (i.e. re-raise the exception if it isn't what you're expecting to see).
If you put a Rails validator on your model, then you can control your flow without throwing an exception.
class FolderItem < ActiveRecord::Base
  belongs_to :item
  belongs_to :folder
  validates_uniqueness_of :item_id, scope: :folder_id, on: :create
end
Then you can use
FolderItem.create(folder: folder, item: item)
It will not throw an exception: create returns the record itself, which will be persisted if the association was created and unsaved (with errors populated) if validation failed. Using FolderItem.create! would throw an exception if the association is not created.
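A short usage sketch: since create hands back the record, you can branch on persisted?:

record = FolderItem.create(folder: folder, item: item)
if record.persisted?
  # the association was saved
else
  # validation failed; record.errors explains why
end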
The reason you are seeing PG errors is that Rails itself thinks the model is valid on save, because the model class does not declare a uniqueness validation. Of course, you have a unique constraint in the DB, which surprises Rails and causes it to blow up at the last minute.
If performance is critical then perhaps ignore this advice. Having a uniqueness validation on a Rails model causes it to perform a SELECT before every INSERT in order to do the uniqueness check at the Rails level, potentially doubling the number of queries your loop is performing. Just catching the errors at the database level like you are doing might be a reasonable trade of elegance for performance.
(edit) TL;DR: Always have the unique constraint in the DB. Also having a model validation will allow ActiveRecord/ActiveModel validation before the DB throws an error.
What does this do in Rails?
create! do |user|
  # initialise user
end
I figured it creates a user object and saves it to the database. How is it different from just saying User.new(...) and then user.save?
In a nutshell:
create! raises an exception while create returns the object (unsaved object if it does not pass validations).
save! raises an error while save returns true/false.
save does not take attributes, create does.
new does not save. new is similar to build in ActiveRecord context.
create saves to the database and returns true or false depending on model validations.
create! saves to the database but raises an exception if there are errors in model validations (or any other error).
When it fails to create a record, create! throws an exception, while new followed by save (or just create, without the exclamation mark) fails silently.
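A quick sketch of those differences, assuming a User model that validates the presence of name:

user = User.new(name: nil)   # builds the object, nothing is saved yet
user.save                    # => false, user.errors is populated
user.save!                   # raises ActiveRecord::RecordInvalid

User.create(name: nil)       # returns an unsaved User with errors
User.create!(name: nil)      # raises ActiveRecord::RecordInvalid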
create takes attributes, so using a block here is somewhat unusual.
The code you mention is doing the initialization in a block that is passed to create!
It is in principle the same as new followed by the initialisation and then a save!
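In other words, something like this (attribute values are made up):

User.create!(email: "user@example.com") do |user|
  user.name = "Example"      # the block runs before the save
end

# is roughly equivalent to:
user = User.new(email: "user@example.com")
user.name = "Example"
user.save!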
There are many variations: save, save!, create, create!, update, update!, etc.
There are also variations in terms of validations and callbacks.
For details, please check the API (it is discussed in the first link):
http://api.rubyonrails.org/classes/ActiveRecord/Base.html
http://apidock.com/rails/ActiveRecord/Base
http://m.onkey.org/active-record-query-interface