How can I validate that there can only be one enter with the same doctor_id and patient_id? (a patient can only recommend a doctor once)
class DoctorRecommendation < ActiveRecord::Base
  belongs_to :patient, :class_name => "User"
  belongs_to :doctor, :class_name => "User"

  validates :patient, presence: true
  validates :doctor, presence: true

  # does not work
  validates_uniqueness_of :recommend, scope: [:doctor_id, :patient_id]
end
Rails makes this kind of validation fairly easy.
The first step is to define the validation on your model.
The uniqueness validation supports a scope option, which should contain the name of the other column you want to limit the uniqueness to (or an array of columns, if the scope spans three or more columns in total).
Your mistake is declaring the validation under a different name (:recommend) instead of on one of the actual columns.
This is what you want:
class DoctorRecommendation < ActiveRecord::Base
  belongs_to :patient, class_name: "User"
  belongs_to :doctor, class_name: "User"

  validates :patient, presence: true
  validates :doctor, presence: true
  validates :patient_id, uniqueness: { scope: :doctor_id }
end
Considering that you already have :presence validations for the associated models, the :uniqueness validation can be limited to the IDs.
This will enable the validation on the application layer, that is, it will be verified in your Ruby process.
Unfortunately this is not enough in a real world scenario, where you can have multiple processes/threads modify the same table at the same time.
Imagine, for example that two requests reach your servers at the same time, both to create the same DoctorRecommendation. If the requests are served by two server processes/threads in parallel (or close enough), there is a chance that the Rails validation will pass in both cases.
In detail:
1. both servers instantiate a new, unsaved model in memory and populate its fields
2. both read from the DB, to see if the uniqueness validation passes
3. there is no record yet with that patient_id and doctor_id pair, so the validation passes in both processes
4. both servers save the record, and the data is written to the DB
Bang: your uniqueness constraint has been violated.
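The interleaving above can be reproduced with plain Ruby (no Rails involved; the array and the names here are only stand-ins for the table and the two server processes):

```ruby
# recommendations stands in for the DB table; each "request" first runs the
# application-layer uniqueness check, then writes.
recommendations = []
pair = { patient_id: 1, doctor_id: 2 }

# Steps 1-3: both requests run the uniqueness check before either has saved.
request_a_valid = !recommendations.include?(pair)
request_b_valid = !recommendations.include?(pair)

# Step 4: both checks passed, so both requests write the row.
recommendations << pair if request_a_valid
recommendations << pair if request_b_valid

recommendations.count(pair) # => 2, the constraint is violated
```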
For this reason you need to enforce uniqueness on the DB layer as well, with a unique multi-column index.
More precisely, with:
class AddMultiIndexToDoctorRecommendations < ActiveRecord::Migration
  def change
    # using a custom name to make sure it's below the length limit
    add_index :doctor_recommendations,
              [:patient_id, :doctor_id],
              unique: true,
              name: 'index_docrecomm_on_patient_id_and_doctor_id'
  end
end
This will define a [:patient_id, :doctor_id] index, with a unique constraint.
If you read the docs on multi-column indexes (e.g. for PostgreSQL or MySQL), you'll find that the order of the columns matters. The migration above uses the right order for the validation I defined earlier, which means the validation queries will be correctly optimized. Make sure to modify the index if you invert the validation scope.
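For instance, if you scoped the validation the other way around (validates :doctor_id, uniqueness: { scope: :patient_id }), the index columns should be swapped to match. A sketch of the inverted migration body, with a hypothetical index name:

```ruby
# hypothetical inverted index, with doctor_id as the leading column
add_index :doctor_recommendations,
          [:doctor_id, :patient_id],
          unique: true,
          name: 'index_docrecomm_on_doctor_id_and_patient_id'
```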
Now, back to the example above: at point 4 both server processes will try to save the record at the same time, but one will finish a few milliseconds before the other. The second will raise an ActiveRecord::RecordNotUnique exception, which you can rescue in your code.
For example:
begin
  DoctorRecommendation.create(attributes)
rescue ActiveRecord::RecordNotUnique
  # oops, let's retry, to see if we get a validation error
end
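The rescue branch above can be fleshed out in plain Ruby. This is only a sketch: FakeStore and the local RecordNotUnique class are stand-ins for the database's unique index and the real ActiveRecord exception, and create_recommendation is a hypothetical helper:

```ruby
# Stand-in for ActiveRecord::RecordNotUnique.
class RecordNotUnique < StandardError; end

# Stand-in for a table with a unique index: inserting a duplicate raises.
class FakeStore
  def initialize
    @rows = []
  end

  def insert(row)
    raise RecordNotUnique if @rows.include?(row)
    @rows << row
  end
end

def create_recommendation(store, attrs)
  store.insert(attrs)
  :created
rescue RecordNotUnique
  # The DB constraint fired; report the duplicate gracefully instead
  # of letting the exception bubble up.
  :duplicate
end

store = FakeStore.new
create_recommendation(store, patient_id: 1, doctor_id: 2) # => :created
create_recommendation(store, patient_id: 1, doctor_id: 2) # => :duplicate
```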
I would use a migration and add a unique index for that combination.
add_index :doctor_recommendations, [:doctor_id, :patient_id], :unique => true
Related
I need to check presence validation for the associated attributes every time when the parent model gets updated.
In my user.rb
has_many :histories
accepts_nested_attributes_for :histories
I need to add validation for the histories when the user model gets updated. I know accepts_nested_attributes_for will take care of the validations while adding the user through forms, but I need to check the validation every time the user model gets updated, even in the console.
If I add
validates :histories, presence: true
It will check for records in the histories table. If any record is available for the user, it will skip the validation for histories, but I need to validate every time the object gets updated. Is there any way to validate whether a new record is being created when updating the parent model?
From your description, I think what you may be looking for is validates_associated:
class User < ApplicationRecord
  has_many :histories

  # validates that the association exists
  validates :histories, presence: true

  # validates that the objects in the associated collection are themselves valid
  validates_associated :histories
end
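The check that validates_associated performs can be sketched in plain Ruby (no Rails here; History and its valid? method are illustrative stand-ins): the parent is invalid unless every object in the collection is itself valid.

```ruby
# A minimal stand-in for a child model with its own validation.
History = Struct.new(:title) do
  def valid?
    !title.nil? && !title.empty?
  end
end

histories_ok  = [History.new("2019"), History.new("2020")]
histories_bad = [History.new("2019"), History.new("")]

# validates_associated amounts to "all children valid?"
histories_ok.all?(&:valid?)  # => true
histories_bad.all?(&:valid?) # => false
```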
The validates :attribute, presence: true validator is meant to validate first-class attributes on the model, not relationships. It works best for things like validates :email, presence: true on a User class.
Is your goal to validate the has_many relationship, or to test it? If it's a test, you should write a spec that runs something along the lines of it { should have_many(:histories) }, depending on your testing framework.
If your goal is to validate the has_many relationship, you may need to write a custom validate method. However, can you share a little more about what exactly you're trying to accomplish, i.e. what about the has_many relationship you are trying to validate?
I created a uniqueness validation in my model to guarantee that user_id in the table is unique, but I am not sure whether the has_one association already does that.
User model
class User < ActiveRecord::Base
  # association macros
  has_one :balance
end
Balance Model
class Balance < ActiveRecord::Base
  # association macros
  belongs_to :user

  # validation macros
  validates :user_id, presence: true, uniqueness: true # is uniqueness necessary?
end
It is not strictly necessary to have a validates_presence_of for that, since it is handled in your database. However, to avoid having to handle a raw database error, it is better to also do it in your model as you have; Rails' built-in error handling for validations will then work.
If your table schema declares the column NOT NULL, the check happens on the database itself and returns an error that is much harder to handle: you will get a system error and the Rails 'better errors' page, which basically breaks your app.
If you do the model validation as you have in your Model using the...
validates :user_id, presence: true, uniqueness: true
then Rails will allow you to control these error messages within your app. You can choose to ignore them (bad) and have data entry fail almost silently, or you can surface label errors along with flash messages in your controller, so users can see what is wrong with the data they are trying to enter on a form.
I have a 1-to-1 association between 2 Mongoid models and I keep getting duplicates, that is, more than one child record (card) with the same parent_id (that is, user). I have tried validating uniqueness of the belongs_to association as shown below, but it doesn't work.
class User
  include Mongoid::Document

  field :name, type: String

  has_one :card
end
The second model:
class Card
  include Mongoid::Document

  field :name, type: String

  belongs_to :user

  validates :user, :uniqueness => { :scope => :user_has_child }

  def user_has_child
    q = Segment.where(drop_id: { '$ne' => nil })
    s = q.map(&:drop_id)
    errors.add(:drop_id, "this user already has a card") if s.include?(:drop_id)
  end
end
The syntax is simpler than that. You just want to make sure there are no two documents with the same user_id:
class Card
  belongs_to :user

  validates_uniqueness_of :user
end
You need to use scope if you want the uniqueness of a tuple of n fields. For example, if a User can have at most one card per year, you can write
class Card
  field :year

  belongs_to :user

  validates_uniqueness_of :user, scope: [:year] # A user can have one card per year
end
Note that validations apply when you save the model, i.e. when you try to persist the changes. Calling .save will return true or false depending on whether validations fail, but the object in memory is still modified! This is so that, for example, you can display the previous values in the HTML input fields, so the user knows what they wrote and can fix it (otherwise they'd have to re-type all their information after a single mistake).
Also, Mongoid by default handles dirty tracking (this is now the doc for v5.0 but it was the same for Mongoid 4). That is to say, you can call .changed? .changes, etc on the object in memory to see what are the changes compared to the object in the DB.
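The idea behind dirty tracking can be sketched in plain Ruby (this is an illustrative stand-in, not the Mongoid API): keep the persisted values and diff the in-memory ones against them.

```ruby
# Minimal dirty-tracking sketch: @persisted mirrors the DB copy,
# @current mirrors the in-memory object.
class TrackedCard
  def initialize(name)
    @persisted = { name: name }
    @current   = @persisted.dup
  end

  def name=(value)
    @current[:name] = value
  end

  def changed?
    @current != @persisted
  end

  # Maps each dirty attribute to [old_value, new_value], like .changes.
  def changes
    @current.each_with_object({}) do |(key, value), diff|
      diff[key] = [@persisted[key], value] unless @persisted[key] == value
    end
  end
end

card = TrackedCard.new("gold")
card.changed? # => false
card.name = "platinum"
card.changed? # => true
card.changes  # => { :name => ["gold", "platinum"] }
```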
I have a newbie rails question. I'm trying to make sure a model has at least one association via a HABTM relationship. Basically I have created the following validation:
validate :has_tags?

def has_tags?
  errors.add(:base, 'Must have at least one tag.') if self.tags.blank?
end
This works fine when I create a new record. The problem is when I take the model and try to remove the association, doing something like this:
tag = Tag.find(params[:tag_id])
@command.tags.delete(tag)
It is permitted, i.e. the association will be deleted. Based on my reading on HABTM associations (http://guides.rubyonrails.org/association_basics.html#the-has-and-belongs-to-many-association), I should "use has_many :through if you need validations, callbacks, or extra attributes on the join model."
I guess my question is how to perform validation on the .delete method for an association. Should I do this manually when I call delete (i.e. run a separate join to count the number of associations before executing a delete), or is there a way to use a validation model when deleting? Here is my model:
class Command < ActiveRecord::Base
  has_many :tagmapsorters
  has_many :tags, through: :tagmapsorters

  validates :text, presence: true
  validates :description, presence: true
  validates :text, uniqueness: true
  validate :has_tags?

  def has_tags?
    errors.add(:base, 'Must have at least one tag.') if self.tags.blank?
  end
end
I appreciate you taking the time to help me.
Dan
Any callbacks that you need should be registered as before_destroy (for validations) or after_destroy (for cleanup) on the join model Tagmapsorter, as that is the record that is actually being destroyed.
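The guard a before_destroy callback on the join model would enforce can be sketched in plain Ruby (no Rails; these classes are illustrative stand-ins, with destroy refusing to remove a command's last tag mapping):

```ruby
# Stand-in for the parent model, holding its join records.
Command = Struct.new(:tagmapsorters)

# Stand-in for the join model; destroy mirrors a before_destroy guard.
class Tagmapsorter
  attr_reader :command

  def initialize(command)
    @command = command
    command.tagmapsorters << self
  end

  # Refuse to remove the mapping if it is the command's last tag.
  def destroy
    return false if command.tagmapsorters.size <= 1
    command.tagmapsorters.delete(self)
    true
  end
end

command = Command.new([])
first  = Tagmapsorter.new(command)
second = Tagmapsorter.new(command)

second.destroy # => true, another mapping remains
first.destroy  # => false, the last tag stays
```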
I have a model with a foreign key. Even though there's a database constraint preventing a duplicate user_id from entering SomeClass's table, I'm repeating (pre-empting, really) that validation in the model so that it's handled more gracefully. So my model looks something like this:
class SomeClass < ActiveRecord::Base
  belongs_to :user

  validates_presence_of :user
  validates_uniqueness_of :user_id
  ...
end
It took me a while to realize this is how it needs to be done. It seems to me that ActiveRecord should expect you to use either user or user_id consistently on both presence and uniqueness validation. That would make:
validates :user, :presence => true, :uniqueness => true
or:
validates :user_id, :presence => true, :uniqueness => true
possible, which is optimal for code maintainability since it groups all user constraints together. So why instead is it inconsistent?
I believe the reason lies in the difference between user_id and user.
Presence of user ensures not only the existence of a user_id, but also the presence of the actual user record.
As for uniqueness, checking user_id and user may have the same effect; AFAIK checking user_id is enough to make sure it is unique.
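The distinction can be shown with a plain-Ruby stand-in (no Rails; the hash plays the role of the users table): an orphaned foreign key has a user_id but no user, so only the presence check on :user catches it.

```ruby
USERS = { 1 => "alice" } # stands in for the users table

record_ok       = { user_id: 1 }  # foreign key resolves to a user
record_orphaned = { user_id: 99 } # id present, but no such user

# presence of :user_id passes for both records
id_present_ok       = !record_ok[:user_id].nil?       # true
id_present_orphaned = !record_orphaned[:user_id].nil? # true

# presence of :user only passes when the id resolves to an actual user
user_present_ok       = USERS.key?(record_ok[:user_id])       # true
user_present_orphaned = USERS.key?(record_orphaned[:user_id]) # false
```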