I have the following validator in my model:
validates_uniqueness_of :item_id, conditions: -> { where.not(status: "published") }
What does the condition target here? Does it prevent the validator from running at all when the row has status: "published", or does it tell the uniqueness validator to exclude rows with status: "published" from the uniqueness check (yes, there is a difference)?
Is there also a difference between the above validator and the following, assuming that status_published? is a method that checks whether the status is "published"?
validates_uniqueness_of :item_id, :unless => lambda { status_published? }
And finally, if there is no difference, how can I accomplish the second case, where the uniqueness validator checks that the value is unique only among the rows that match the condition?
According to the documentation, conditions scope the lookup used to check for duplicates: the validation itself still runs, but rows excluded by the condition are not counted when uniqueness is checked. So it is the second behaviour, not a skip of the validation.
On a side-note, if you're doing this in Rails 4, you may want to look at the new validates syntax:
validates :item_id, uniqueness: true, unless: :status_published?
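To make the difference concrete, here is a minimal pure-Ruby sketch (no Rails involved; the row hashes and helper names are made up for illustration) of the two behaviours:

```ruby
# Pure-Ruby sketch (no Rails) contrasting the two behaviours.
rows = [
  { item_id: 1, status: "published" },
  { item_id: 2, status: "draft" }
]

# conditions: the set of existing rows compared against excludes
# published rows, but the validation itself still runs.
def unique_with_conditions?(rows, record)
  comparison_set = rows.reject { |r| r[:status] == "published" }
  comparison_set.none? { |r| r[:item_id] == record[:item_id] }
end

# unless: the validation is skipped entirely when the record being
# saved is published; otherwise it compares against ALL rows.
def unique_with_unless?(rows, record)
  return true if record[:status] == "published" # validation skipped
  rows.none? { |r| r[:item_id] == record[:item_id] }
end

new_record = { item_id: 1, status: "draft" }
unique_with_conditions?(rows, new_record) # => true  (published duplicate ignored)
unique_with_unless?(rows, new_record)     # => false (published duplicate still counts)
```

Saving a new draft with an item_id that only exists on a published row passes under conditions: but fails under unless:, which is exactly the difference the question is asking about.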
I have a database in which email is validated as unique, scoped to a status column. status is an integer which is 1 when a particular account is active.
So if a user switches country, his account's status is switched to 0 and a new account is created with the same email and status set to 1.
To enforce this I have used a composite uniqueness constraint on email and status:
validates :email, uniqueness: { scope: :status }
So, let's do a small dry run.
Currently the user is in country X and his email is abc#xyz.com, so the database may look like:
email: abc#xyz.com
status: 1
Now if he moves to country Y, the database updates to:
[for country X]
email: abc#xyz.com
status: 0
[for country Y]
email: abc#xyz.com
status: 1
Works fine, right?
But now if our traveler moves on to country Z, the database gets stuck: the country Y record's status must be updated to 0, but that (email, status) combination already exists, so it returns an error.
A further constraint is that I can only use email and status as the keys.
What I want to achieve is that for a particular email status:1 must be unique while status:0 can be any number of times.
So I was trying to condition the validation like
validates :email, uniqueness: { scope: :status }, if: :status == 1
but no luck, as I cannot access a particular user's status value in the model that way.
Thanks in advance!
PS: I am new to Rails, so do provide relevant links so I can learn more xD.
You might want to rethink your database structure.
But one of the solutions is to do:
validates_uniqueness_of :email, scope: :status, if: :active?

def active?
  status == 1
end
Check the API documentation for validates_uniqueness_of for the available options.
But I strongly suggest you redo the database structure to prevent this from happening on the database-level. Something like:
inactive_emails table
email | company
--------------------------------
one#example.com | A
one#example.com | B
one#example.com | C
two#example.com | A
active_emails table
email | company
--------------------------------
one#example.com | D
two#example.com | E
For the inactive emails model, you could have:
class InactiveEmail < ActiveRecord::Base
  validates_uniqueness_of :email, scope: :company
end
And for the active emails model:
class ActiveEmail < ActiveRecord::Base
  validates_uniqueness_of :email, scope: :company
end
This way, you ensure that each company has unique active emails and each email can only be active for one company at a time.
Now it's up to you to switch emails between the two tables. You could use a callback for example:
before_save :check_if_email_is_inactive_for_company

def check_if_email_is_inactive_for_company
  if InactiveEmail.where(email: email, company: company).exists?
    # remove from inactive? or inform user this email was deactivated before?
  end
end
Rails API page is your friend.
You can solve your problem in many ways, but I will show two.
You can apply a scope as follows:
validates :email, uniqueness: { scope: [:status, :country] }
Or, if you want to handle it with a scope on a single field:
validates :email, uniqueness: { scope: :status }, unless: lambda { |user| user.status == 0 }
The above validation allows a given email with status: 0 any number of times, but only one combination of that email with status: 1.
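For what it's worth, if the database is PostgreSQL, the same rule can also be enforced at the DB level with a partial unique index. This is a sketch; the users table name, migration version, and index name are assumptions:

```ruby
class AddActiveEmailIndexToUsers < ActiveRecord::Migration[5.2]
  def change
    # Only rows with status = 1 participate in the index, so an email
    # may appear any number of times with status 0, but at most once
    # with status 1.
    add_index :users, :email,
              unique: true,
              where: "status = 1",
              name: "index_users_on_email_where_active"
  end
end
```

This guards against the race conditions that a model-level validation alone cannot prevent.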
I have two tables, Assets and Relationships. They look like this (plus other columns I'm omitting for brevity):
# Table name: relationships
#
# id :uuid not null, primary key
# asset1_id :uuid not null
# asset2_id :uuid not null
# type :string not null
# Table name: assets
#
# id :uuid not null, primary key
# type :string not null
# name :string not null
I want relationships of a type between two assets to be unique. So for example, let's say I have a relationship of type membership.
Relationship.create!(type: 'membership', asset1_id: '61d58a49-86a9-4d7f-b069-2ed1fa27b387', asset2_id: '1856df48-3193-45de-bef0-122cd9f58d7b')
If I try to create that record again, I can easily prevent it with validates :type, uniqueness: { scope: [:asset1_id, :asset2_id] } and add_index :relationships, [:type, :asset1_id, :asset2_id], unique: true. However, with these in place, the following case is not prevented:
Relationship.create!(type: 'membership', asset1_id: '1856df48-3193-45de-bef0-122cd9f58d7b', asset2_id: '61d58a49-86a9-4d7f-b069-2ed1fa27b387')
Note that this is the same as the previous record, only with the order of the asset ids reversed.
How can I prevent this (preferably at the DB level)?
You can do it at the application level by adding a custom validation:
validates :type, uniqueness: { scope: [:asset1_id, :asset2_id] }
validate :reverse_type_uniqueness

def reverse_type_uniqueness
  duplicate_present = self.class.where(type: type, asset1_id: asset2_id, asset2_id: asset1_id).exists?
  errors.add(:base, "Duplicate present") if duplicate_present
end
To implement a two-sided unique index at the DB level, here is an example, though it is not very straightforward:
https://dba.stackexchange.com/questions/14109/two-sided-unique-index-for-two-columns
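As a concrete sketch of a two-sided index, assuming PostgreSQL (where LEAST/GREATEST are defined for uuid columns), an expression index can normalise the pair so both orderings map to the same entry. The migration class name and version are assumptions:

```ruby
class AddSymmetricUniqueIndexToRelationships < ActiveRecord::Migration[5.2]
  def up
    # LEAST/GREATEST put the two asset ids in a canonical order, so
    # (a, b) and (b, a) collide on the same unique index entry.
    execute <<~SQL
      CREATE UNIQUE INDEX index_relationships_on_type_and_asset_pair
      ON relationships (type, LEAST(asset1_id, asset2_id), GREATEST(asset1_id, asset2_id));
    SQL
  end

  def down
    execute "DROP INDEX index_relationships_on_type_and_asset_pair;"
  end
end
```

With this in place the second create! in the question fails at the database level regardless of the order the ids are supplied in.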
If you want to enforce it at the DB level, you need to set up a compound index on both fields of your join table. More info can be found here: How to implement a unique index on two columns in rails
Assuming you call your join table memberships, try the following migration:
add_index :memberships, [:relationship_id, :asset_id], unique: true
Alternatively, to let rails handle the validation:
class Membership < ActiveRecord::Base
  validates_uniqueness_of :relationship_id, scope: :asset_id
  ...
end
More reading on the rails validation:
https://apidock.com/rails/ActiveRecord/Validations/ClassMethods/validates_uniqueness_of
I'd like to create a unique index for a table, consisting of 3 columns, but
to check only if one of them has a specific value:
something like
add_index :table, [:col1, :col2, :col3], unique: true
but only when col3 = true;
otherwise (when col3 = false) I don't care about the uniqueness of col1 and col2.
is there a way to do it in a migration to keep it at the DB level, or can I only
validate this case in the model?
I don't believe conditional uniqueness constraints at the database layer (via migrations) are portable across databases, though some, such as PostgreSQL, support partial indexes. You can add this as a conditional validation at the AR layer, which should be sufficient for your purposes (though it should be noted this can introduce some race conditions). i.e.
validates :col1, uniqueness: { scope: :col2 }, if: :col3
Hope that helps.
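For completeness: if the database happens to be PostgreSQL, Rails (4.0 and later) does expose partial unique indexes through the where: option on add_index, which matches the original ask. A sketch, with the table and index names assumed:

```ruby
# Inside a migration's change method: the unique constraint only
# applies to rows where col3 is true.
add_index :table, [:col1, :col2],
          unique: true,
          where: "col3 = true",
          name: "index_table_on_col1_and_col2_where_col3"
```

On other databases without partial index support, the conditional validation above remains the fallback.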
I have the following model
class Person
  include Mongoid::Document
  embeds_many :tasks
end

class Task
  include Mongoid::Document
  embedded_in :person, :inverse_of => :tasks
  field :name
end
How can I ensure the following?
person.tasks.create :name => "create facebook killer"
person.tasks.create :name => "create facebook killer"
person.tasks.count == 1
different_person.tasks.create :name => "create facebook killer"
person.tasks.count == 1
different_person.tasks.count == 1
i.e. task names are unique within a particular person
Having checked out the docs on indexes I thought the following might work:
class Person
  include Mongoid::Document
  embeds_many :tasks

  index [
    ["tasks.name", Mongo::ASCENDING],
    ["_id", Mongo::ASCENDING]
  ], :unique => true
end
but
person.tasks.create :name => "create facebook killer"
person.tasks.create :name => "create facebook killer"
still produces a duplicate.
The index config shown above in Person would translate into the following for MongoDB:
db.people.ensureIndex({'tasks.name' : 1, '_id' : 1}, {unique : true})
Can't you just put a validator on the Task?
validates :name, :uniqueness => true
That should ensure uniqueness within the parent document.
Indexes are not unique by default. If you look at the Mongo Docs on this, uniqueness is an extra flag.
I don't know the exact Mongoid translation, but you're looking for something like this:
db.things.ensureIndex({firstname : 1}, {unique : true, dropDups : true})
I don't believe this is possible with embedded documents. I ran into the same issue as you and the only workaround I found was to use a referenced document, instead of an embedded document and then create a compound index on the referenced document.
Obviously, a uniqueness validation isn't enough as it doesn't guard against race conditions. Another problem I faced with unique indexes was that mongoid's default behavior is to not raise any errors if validation passes and the database refuses to accept the document. I had to change the following configuration option in mongoid.yml:
persist_in_safe_mode: true
This is documented at http://mongoid.org/docs/installation/configuration.html
Finally, after making this change, the save/create methods will start throwing an error if the database refuses to store the document. So, you'll need something like this to be able to tell users about what happened:
alias_method :explosive_save, :save

def save
  begin
    explosive_save
  rescue Exception => e
    logger.warn("Unable to save record: #{self.to_yaml}. Error: #{e}")
    errors[:base] << "Please correct the errors in your form"
    false
  end
end
Even this isn't really a great option because you're left guessing as to which fields really caused the error (and why). A better solution would be to look inside MongoidError and create a proper error message accordingly. The above suited my application, so I didn't go that far.
Add a validation check comparing the count of the embedded tasks' names with the count of the same array with duplicates removed:

validates_each :tasks do |record, attr, tasks|
  names = tasks.map { |t| t.name }
  record.errors.add :tasks, "Cannot have the same task more than once." unless names.count == names.uniq.count
end
Worked for me.
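The duplicate check at the heart of that validator is plain Ruby and easy to verify in isolation (the task names and the duplicates? helper here are made up for illustration):

```ruby
# Detect duplicates by comparing an array's size with that of its
# de-duplicated copy.
def duplicates?(values)
  values.count != values.uniq.count
end

names = ["create facebook killer", "create facebook killer", "write tests"]
duplicates?(names)      # => true  (one name appears twice)
duplicates?(names.uniq) # => false
```

Note this runs entirely in memory on the embedded array, so unlike a unique index it cannot race with concurrent writers.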
You can define a validates_uniqueness_of on your Task model to ensure this, according to the Mongoid documentation at http://mongoid.org/docs/validation.html this validation applies to the scope of the parent document and should do what you want.
Your index technique should work too, but you have to generate the indexes before they take effect. With Rails you can do this with a rake task (in the current version of Mongoid it is called db:mongoid:create_indexes). Note that you won't get errors when saving something that violates the index constraint, because Mongoid does not use safe mode by default (see http://mongoid.org/docs/persistence/safe_mode.html for more information).
You can also specify the index in your model class:
index({ 'tasks.name' => 1, '_id' => 1 }, { unique: true, drop_dups: true })
and use the rake task
rake db:mongoid:create_indexes
You have to run:
db.people.ensureIndex({'tasks.name' : 1, '_id' : 1}, {unique : true})
directly on the database. You appear to be including a "create index" command inside your model class (i.e. class Person).
I have a string column in a table that can take a range of predefined values. It can also be nil. For example: Dog, Cat, Bird, nil.
I want to write a validates_inclusion_of that checks that all values entered fall within that predefined range. If, for example, "Nasal Spray" is entered, it should throw an error.
What's the best way to do so?
Use the following validation within your model class:
validates_inclusion_of :animal, :in => %w(Dog Cat Bird), :allow_nil => true
where :animal is the name of the column you want to validate. Note that :allow_nil permits nil specifically, while :allow_blank would additionally permit empty strings.
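The rule the validator enforces can be sketched in plain Ruby (the valid_animal? helper is made up for illustration; nil is allowed, anything else must be on the list):

```ruby
ALLOWED_ANIMALS = %w(Dog Cat Bird).freeze

# nil passes; any other value must be one of the allowed strings.
def valid_animal?(value)
  value.nil? || ALLOWED_ANIMALS.include?(value)
end

valid_animal?("Dog")         # => true
valid_animal?(nil)           # => true
valid_animal?("Nasal Spray") # => false
```

This mirrors what the validation does on each save: values outside the list make the record invalid, and nil is explicitly exempted.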