I'm trying to use ActiveRecord's find_or_create_by_*column*, but I'm getting errors from Postgres telling me that it occasionally fails to find the model and tries to insert one anyway. It's really important that this table stays unique, so I added :unique => true to the index in its migration, so that Postgres would know I was serious about it.
And, fail:
ActiveRecord::StatementInvalid: PGError: ERROR: duplicate key value violates unique constraint "index_marketo_leads_on_person_id" DETAIL: Key (person_id)=(9968932) already exists. : INSERT INTO "marketo_leads" ("mkt_person_id", "synced_at", "person_updated_at", "person_id") VALUES(NULL, NULL, '2011-05-06 12:57:02.447018', 9968932) RETURNING "id"
I have models like so:
class User < AR::Base
  has_one :marketo_lead
  before_save :update_marketo_lead

  def update_marketo_lead
    if marketo_lead
      if (User.marketo_columns & self.changes.keys).any?
        marketo_lead.touch(:person_updated_at)
      end
    elsif self.id
      marketo_lead = MarketoLead.find_or_create_by_person_id(:person_updated_at => Time.now, :person_id => self.id)
    end
  end
end
class MarketoLead < AR::Base
  belongs_to :user, :foreign_key => 'person_id'
end
The second model links our users' accounts to the Marketo email server and keeps a record of the last time certain fields of the user were modified, so that we can push changed records in batched background tasks.
I can't think of any reason for this callback, update_marketo_lead, to fail, other than some kind of race condition that I can't quite imagine.
(please ignore the horribleness of 'user' sharing a primary key with 'person')
(using Rails 2.3.11, Postgres 9.0.3)
It's quite possible that when find_or_create was executed, no matching person_id was found, so the create logic was used. However, between the find_or_create and the actual user.save, another request may have managed to complete its save transaction, at which point your database constraint caused this exception.
What I would recommend is to catch the StatementInvalid exception and retry the save (up to a finite number of times):
begin
  user.save!
rescue ActiveRecord::StatementInvalid => error
  @save_retry_count = (@save_retry_count || 5)
  retry if (@save_retry_count -= 1) > 0
  raise error
end
Note that this should be executed wherever you try to save the user; all callbacks and validations happen within the save! transaction.
P.S. I'm assuming your version of Rails supports transactions :) In Rails 3 it's unnecessary to wrap save! in a transaction because it already uses one internally.
I'm hitting this inside a Sidekiq job that retries, gets the error repeatedly, and eventually clears itself. I'm not convinced it's a race condition from another request, or it would be really rare and happen once or twice, not 11 consecutive times like I'm seeing. The best explanation I've found is in a blog post here. The gist is that Postgres keeps an internally stored value for incrementing the primary key that gets out of sync somehow. This rings true for me because I'm setting the primary key rather than just using an incremented value, so maybe that's how this cropped up. The solution from the comments in the link above appears to be to call ActiveRecord::Base.connection.reset_pk_sequence!(table_name).
I can't verify this yet because I couldn't repro the issue, but my attempted fix, modified from Vladimir's fix above, is:
begin
  user.save!
rescue ActiveRecord::StatementInvalid => error
  @save_retry_count = (@save_retry_count || 1)
  ActiveRecord::Base.connection.reset_pk_sequence!(:users) # takes the table name
  retry if (@save_retry_count -= 1) >= 0
  raise error
end
So if this doesn't fix it on the first try, I'll see an error raised.
Related
Currently there is a unique index on my column, named "index_unique_devices_on_dsn". When I save a duplicate record I get a MySQL exception, which I handle via the ActiveRecord::RecordNotUnique class. If in the future I add multiple columns, each with its own unique index, how can I programmatically identify which column's uniqueness exception was raised? I don't want to use Rails' .valid? method, as that would run the validations again.
Normally you'd use the non-exception versions like save or create instead of save! and create!. Then check which columns are invalid on the model's validation errors.
user = User.create(name: "Duplicate")
if user.errors.include?(:name)
  ...
end
However, the ActiveRecord::RecordNotUnique exception comes from the database. There's no validation error.
You can fix this by adding your own uniqueness validation to the model which runs before Rails tries to save the model.
class User < ApplicationRecord
  validates :name, uniqueness: true
end
Now you'll get a validation error.
However, this requires a query to check uniqueness. That can affect performance, and also lead to race conditions. Imagine you try to create two Users with the same name at the same time. They both check if the name is taken, they both see that it is not, and they both try to insert. One succeeds, one fails with an ActiveRecord::RecordNotUnique.
Instead, use the excellent database_validations gem which turns database errors into validation errors. This is safer, faster, and you only need one way to check for validation errors.
class User < ApplicationRecord
  validates :name, db_uniqueness: true
end
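If you'd rather not pull in a gem, another hedged option is to inspect the exception message and map the violated index name back to its column. Everything here is an assumption built from the question: the second index name, the message format, and the helper itself are illustrative, not an existing API.

```ruby
# Map each unique index name to the column it covers.
# "index_unique_devices_on_dsn" comes from the question; the second
# entry is a hypothetical future index.
UNIQUE_INDEX_COLUMNS = {
  "index_unique_devices_on_dsn" => :dsn,
  "index_devices_on_serial"     => :serial,
}

# Given the message from an ActiveRecord::RecordNotUnique, return the
# symbol of the column whose unique index was violated (or nil).
def violated_column(message)
  name = UNIQUE_INDEX_COLUMNS.keys.find { |index| message.include?(index) }
  UNIQUE_INDEX_COLUMNS[name]
end
```

In the rescue block you'd call violated_column(error.message) and branch on the result. The obvious caveat is that adapter message formats differ between MySQL and Postgres and can change between versions, so keep the mapping close to the migrations that define the indexes.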
I have two Rails models / tables that I want to insert as part of a transaction.
For example, here are roughly my tables:
Posts: (id)
Comments: (id, post_id, comment_text, user_id)
Commenters: (id, post_id, user_id), unique constraint on (post_id, user_id)
Right now I'm trying something approximately equivalent to:
ActiveRecord::Base.transaction do
  Comment.create!(post: post, user: user, comment_text: '...')
  begin
    Commenters.find_or_create_by!(post: post, user: user)
  rescue PG::UniqueViolation
  end
end
This works 99.9% of time, but sometimes two concurrent comments will trigger a PG::UniqueViolation.
Even though I'm catching and suppressing the PG::UniqueViolation, the entire transaction fails due to:
ERROR: current transaction is aborted, commands ignored until end of transaction block
I realize I could already achieve this by joining the Post and Comment tables, but this is a simplified example.
Is there a simpler way to ensure both inserts happen as part of a transaction while still ignoring the unique violation since we can assume that the record already exists?
An exception raised inside the transaction does two things:
Rolls back the transaction so none of the changes made inside the transaction will persist in the database.
The exception propagates outside the transaction block.
So you can move your exception handler outside the transaction call:
begin
  ActiveRecord::Base.transaction do
    Comment.create!(post: post, user: user, comment_text: '...')
    Commenters.find_or_create_by!(post: post, user: user)
  end
rescue PG::UniqueViolation
  retry
end
You could include a counter to only retry a few times if you wanted more safety.
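That counter can be sketched in plain Ruby. Here `work` is a stand-in for the transaction block above, rigged to fail twice the way a racing concurrent insert might; the constant name and the failure simulation are illustrative assumptions, not part of the original answer.

```ruby
MAX_ATTEMPTS = 3

# Stand-in for the ActiveRecord::Base.transaction block: fails twice,
# then succeeds, simulating two racing inserts.
failures = 2
work = lambda do
  if failures > 0
    failures -= 1
    raise "unique violation" # stand-in for ActiveRecord::RecordNotUnique
  end
  :created
end

attempts = 0
result =
  begin
    work.call
  rescue RuntimeError
    attempts += 1
    retry if attempts < MAX_ATTEMPTS
    raise
  end
# result is :created after two retried failures
```

The important detail is that the retry sits outside the transaction, so each attempt gets a fresh transaction rather than replaying commands into an aborted one.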
You should have the associations properly set up inside your models, so Rails does the validation for you. Then you can simply rescue the possible ActiveRecord::RecordInvalid (with the message "Validation failed: Attribute has already been taken").
If you want to read more about uniqueness validation, this should come in handy: http://guides.rubyonrails.org/active_record_validations.html#uniqueness
To me it actually looks like your Commenter model is not really necessary. It consists solely of derived information that can also be drawn directly from the Comment model (where you already store post_id and user_id), so you could drop the Commenter class entirely. Then take care to validate Comment on creation, for example by setting:
class Comment < ActiveRecord::Base
  belongs_to :post
  belongs_to :user

  validates :user_id, uniqueness: {scope: :post_id}
end
But this way you would allow a user to comment only once.
I propose you drop the uniqueness constraint and construct the information stored in Commenter by making distinct selects on the Comment model.
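That distinct select can be mimicked without a database: the unique (post_id, user_id) pairs drawn from the comments are exactly the rows the Commenters table would have held. The Struct below is a stand-in for the real model.

```ruby
# Stand-in for the Comment model.
Comment = Struct.new(:post_id, :user_id, :text)

comments = [
  Comment.new(1, 10, "first"),
  Comment.new(1, 10, "second"), # same user commenting on the same post again
  Comment.new(1, 11, "hello"),
]

# The distinct (post_id, user_id) pairs play the role of the Commenters rows.
commenters = comments.map { |c| [c.post_id, c.user_id] }.uniq
```

In ActiveRecord terms this would be roughly Comment.select(:post_id, :user_id).distinct, computed on demand instead of maintained as a separate table.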
PS: Models in Rails are written in UpperCamelCase and singular, while tables are referred to (by symbols) in lowercase and plural.
The error itself happens because of the concurrent, multi-threaded behaviour of your app.
You need to rescue ActiveRecord::RecordNotUnique instead of the PG-specific error.
Also, perhaps put the transaction inside a begin/rescue/end block, and retry to continue with another transaction within the rescue block, something like the other answer suggested.
How do you write validations for a number of associations that is externally defined? I've so far written something like this:
class Document
  validate :publication_count

  private

  def publication_count
    if publications.count > template.component_count
      errors.add(:articles, 'too many')
    elsif publications.count < template.component_count
      errors.add(:articles, 'not enough')
    end
  end
end
Both publications and template are associations. I just get a rollback error with this code, even though the record should be valid.
Your code appears correct, so it seems likely that the associations aren't being set or saved correctly.
Did you check that:
publications and template are both assigned to the Document instance before you save?
the rollback error isn't for a different reason, like uniqueness failure?
this is the actual validation that's failing rather than another one?
I have a fairly typical Order model that has_many Lines:
class Order < ActiveRecord::Base
  has_many :lines
  validates_associated :lines
Once the order is completed, it should not be possible to change any attributes, or related lines (though you can change the status to not completed).
validate do
  if completed_at.nil? == false && completed_at_was.nil? == false
    errors.add(:base, "You can't change once complete")
  end
end
This works fine, but if you add, remove, or change the associated Lines, that isn't prevented.
In my Line model, I have the following validation:
validate do
  if order && order.completed_at.nil? == false
    errors.add(:base, "Cannot change once order completed.")
  end
end
This successfully stops lines in a completed order being modified, and prevents a line being added to a completed order.
So I need to also prevent lines being taken out of a completed order. I tried this in the Line model:
validate do
  if order_id_was.nil? == false
    if Order.find(order_id_was).completed_at.nil? == false
      errors.add(:base, "Cannot change once order completed.")
    end
  end
end
This works fine to prevent a Line being taken out of an Order when modifying the Line directly. However when you are editing the Order and remove a Line, the validation never runs, as it has already been removed from the Order.
So... in short, how can I validate that the Lines associated with an Order do not change, and are not added to or removed?
I'm thinking I'm missing something obvious.
From the "Association Callbacks" section of ActiveRecord::Associations, you'll see that there are several callbacks that you can add to your has_many definition:
before_add
after_add
before_remove
after_remove
Also from the same docs:
Should any of the before_add callbacks throw an exception, the object does not get added to the collection. Same with the before_remove callbacks; if an exception is thrown the object doesn't get removed.
Perhaps you can add a callback method to before_add and before_remove that makes sure the order isn't frozen and throws an exception if it's not allowed.
has_many :lines,
  before_add: :validate_editable_lines!,
  before_remove: :validate_editable_lines!

private

def validate_editable_lines!(line)
  # Define the logic of how `editable?` works based on your requirements
  raise ActiveRecord::RecordNotSaved unless editable?(line)
end
Another thing worth trying would be to add a validation error and return false within validate_editable_lines! if your validation test fails. If that works, I'd recommend changing the method name to validate_editable_lines (sans ! bang), of course. :)
This is an interesting problem, and to the best of my knowledge slightly tricky to solve.
Here is one approach: http://anti-pattern.com/dirty-associations-with-activerecord
Another approach which I think is slightly cleaner would be to simply check at the controller level before you add/remove a Line, and not to use validations.
Yet another approach is you can add before_create and before_destroy callbacks to Line, and check if the Order instance has been completed.
Maybe add a locked attribute to the model and, after the order is completed, set the value of locked to true.
Then, in the controller, add a before_filter that is triggered before the update action and checks the value of the locked flag. If it is set to true, raise an error/notification/whatever to tell the user that the line item cannot be changed.
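Stripped of the Rails machinery, the locked-flag idea is just a guard that raises once the flag is set. This sketch is plain Ruby with invented names (complete!, assert_editable!) purely for illustration:

```ruby
class Order
  attr_reader :locked

  def initialize
    @locked = false
  end

  # Completing the order flips the flag; nothing may change afterwards.
  def complete!
    @locked = true
  end

  # The before_filter would call something like this before `update`.
  def assert_editable!
    raise "Order is locked and cannot be changed" if locked
  end
end
```

So order.assert_editable! passes silently until order.complete! is called, after which it raises; the controller translates that into whatever user-facing message fits.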
One of the models in a Rails 3.1 application I'm working on has a "code" attribute that is generated automatically when the record is created and that must be unique. The application should check the database to see if the generated code exists and, if it does, it should generate a new code and repeat the process.
I can ensure the field's uniqueness at the database level with add_index :credits, :code, :unique => true (which I am doing) and also in the model with validates_uniqueness_of, but both of these will simply return an error if the generated code exists. I need to just try again in the case of a duplicate. The generated codes are sufficiently long that duplicates are unlikely but I need to be 100% certain.
This code generation is handled transparently to the end user and so they should never see an error. Once the code is generated, what's the best way to check if it exists and to repeat the process until a unique value is found?
Here's a quick example. There is still technically a race condition here, though unless you're seeing hundreds or thousands of creates per second it really shouldn't be a worry. The worst case is that your user gets a uniqueness error if two creates run in such a way that they both execute the find and both return nil for the same code.
class Credit < ActiveRecord::Base
  before_validation :create_code, :if => 'self.new_record?'
  validates :code, :uniqueness => true

  def create_code
    self.code = code_generator
    self.code = code_generator until Credit.find_by_code(code).nil?
  end
end
If you absolutely needed to remove the race condition, where two creates run in tandem, both trigger the find with the same code, and both return nil, you could wrap the find with a table lock (which requires DB-specific SQL), or you could create a table with a row used for locking via pessimistic locking. But I wouldn't go that far unless you're expecting hundreds of creates per second and you absolutely require that the user never, ever sees an error. It's doable, just kind of overkill in most cases.
I am not sure if there is a built-in way. I have always used a before_create.
Here is an example in the context of a UrlShortener.
class UrlShortener < ActiveRecord::Base
  before_create :create_short_url

  def create_short_url
    self.short_url = RandomString.generate(6)
    until UrlShortener.find_by_short_url(self.short_url).nil?
      self.short_url = RandomString.generate(6)
    end
  end
end
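The same generate-until-unique loop can be exercised without a database by letting a Set stand in for the find_by_short_url lookup; SecureRandom replaces the hypothetical RandomString helper, and the injectable generator is an assumption added here to make the loop easy to test deterministically.

```ruby
require 'securerandom'
require 'set'

# Regenerate until the code is absent from `existing`, which plays the
# role of the database uniqueness check. The generator is injectable so
# the collision path can be exercised deterministically.
def unique_code(existing, generator = -> { SecureRandom.hex(6) })
  code = generator.call
  code = generator.call while existing.include?(code)
  code
end

existing = Set.new
code = unique_code(existing)
existing << code
```

With the default generator, SecureRandom.hex(6) yields 12 hex characters, so accidental collisions are vanishingly rare and the loop body almost never runs more than once.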