In my User model I have the following method:
def confirmation_token
  self.confirmation = loop do
    random_token = SecureRandom.urlsafe_base64(16, false)
    break random_token unless User.exists?(confirmation: random_token)
  end
end
This method just creates a random token to confirm the user's email.
As you can see, it loops while User.exists?(confirmation: random_token) returns true, i.e. it verifies that no identical token already exists in the users table.
My question is: if I have, for example, a lot of rows in the users table, do I need to add an index on this (confirmation) column for better performance?
Note: this method is executed just once per user, the first time the user signs up.
Yes. If you're doing many searches on any particular column (in this case confirmation), you should index that column.
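For example, a minimal migration along these lines would add the index (the migration class name here is just illustrative):

class AddIndexToUsersConfirmation < ActiveRecord::Migration
  def change
    # A unique index both speeds up the exists? lookup in the loop above
    # and enforces uniqueness at the database level
    add_index :users, :confirmation, unique: true
  end
end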
If you are just searching by that value, then the short answer was already given to you.
There are some things to consider, though. First of all, you probably want that index to be unique; this improves performance quite a lot as the table grows.
add_index :users, :confirmation, unique: true
Also, you probably want a unique value rather than just a random one. Though it's unlikely to generate a duplicate, a random value is still random, not guaranteed unique. One option is to generate a SHA digest, using the Digest class, from some user column that you know is unique in this table, like this:
Digest::SHA1.hexdigest(user.email)
UPD: The asker is concerned about the case where someone knows that the email is used as the key and could use it to forge a token.
This is usually solved by appending a secret key to the email before hashing. You can generate such a key with SecureRandom and store it in an environment variable.
In your .bashrc/.profile/.bash_profile or whichever you use, do this:
export EMAIL_TOKEN_SECRET="M9SyIuuOPhakX0b6gjvcRnsRHY="
Then do this:
Digest::SHA1.hexdigest("#{user.email}-#{ENV['EMAIL_TOKEN_SECRET']}")
In our Rails app, the user (or we, on their behalf) loads some data or even inserts it manually using a CRUD interface.
After this step the user must validate the whole configuration (the data) and "accept and agree" that it is all correct.
On a given day, the application will execute some tasks according to that configuration.
Today we already have a "freeze" flag with which we can prevent changes to the data, so the user cannot mess things up...
But we would also like to do something like hash the data and say "your config is frozen and the hash is 34FE00...".
This would give the user certainty that the system is running with the configuration they approved.
How can we do that? There are 7 or 8 tables, and the total number of records involved would be around 2k or 3k.
How should we hash the data to detect changes after the approval? How would you do it?
I'm thinking about doing a find_by_user on each table, looping over all its records, using some fields (or all of them) to build a string, and hashing it at the end of that loop.
After looping over all tables, I would have 8 hash strings, which I would concatenate and hash into one final hash.
Does that sound reasonable? Any ideas?
Here's a possible implementation. Just define object as an Array of all the stuff you'd like to hash:

require 'digest/md5'

def validation_hash(object, len = 16)
  Digest::MD5.hexdigest(object.to_json)[0, len]
end

puts validation_hash([Actor.first, Movie.first(5)])
# => 94eba93c0a8e92f8
# After changing a single character in the first Actor's biography:
# => 35f342d915d6be4e
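If you would rather follow the per-table scheme described in the question, a sketch could look like this (the model list and the user_id column are assumptions for illustration):

require 'digest/md5'

# Hypothetical list of the 7-8 configuration models
CONFIG_MODELS = [Setting, Schedule, Task]

def per_table_hash(user)
  # Hash each table's records for this user; order by id so the
  # hash is stable across runs
  partials = CONFIG_MODELS.map do |model|
    Digest::MD5.hexdigest(model.where(user_id: user.id).order(:id).to_json)
  end
  # Concatenate the per-table hashes and hash the result
  Digest::MD5.hexdigest(partials.join)
end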
I have records in the DB table that differ only by their ids and creation/update times. How can I get only the unique records?
I tried this, but it didn't work:
msg_to_user = user.messages.new_messages.uniq
I'll explain: a user can follow a post manually, but the same post can also be followed automatically on the user's behalf. So I want to send only one message when a post has been commented on by someone. Here are two rows that differ only by id and timestamps:
1747 test message TamadaTours 12 new 2016-01-29 06:14:04.736869 2016-01-29 06:48:55.948529 32964382
1748 test message TamadaTours 12 new 2016-01-29 06:14:04.741184 2016-01-29 06:48:55.951371 32964382
All records in the database are unique (at least because of the id column, which by default has a unique constraint).
You would want to use DISTINCT:
Model.select('DISTINCT column_name1, column_name2')
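In Rails 4 and later, the same query can be written without raw SQL:

Model.select(:column_name1, :column_name2).distinct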
Your question is flawed...
The point of having an id (otherwise known as a primary key) in a relational database is so that you can positively identify the unique records you want:
A primary key uniquely specifies a tuple within a table. In order for an attribute to be a good primary key it must not repeat
When you write "How can I get only the unique records", the answer is to pull the records based on their id.
If you refine your question to what you really want...
I want to send only one message if post have been commented by someone
--
In other words, you want to pull a collection of unique user_ids (no duplicates), to which you can send new messages?
To do this, you can use...
@recipients = Message.select(:user_id).distinct #-> all unique user_ids
If you're trying to pull the "new" messages for a user, but only show the first of each (if they're the same), you'll want to use something like the following:
@msg_to_user = user.messages.new_messages.to_a.uniq(&:title)
A better pattern to implement would be to validate the uniqueness of new messages:
#app/models/message.rb
class Message < ActiveRecord::Base
  validates :user_id, uniqueness: { scope: :message_id } #-> replace message_id with another unique identifier
end
This would ensure only one new message is present for a user.
You can also use GROUP BY in your query, like this:
Model.all.group("column_name")
One way is to extract the data but drop the ORM context:
# Map every record to its attributes hash, remove the id entry, then remove duplicates
msg_to_user = user.messages.new_messages.map(&:attributes).map { |e| e.except('id') }.uniq
I have two tables, one for members and one for employees; both have an attribute called id_number, which is not required and can be null.
Is it possible to run a validation to ensure the uniqueness of the id_number, so that adding an employee with the same id_number as a member, or vice versa, gives an error?
I am thinking of writing my own validation, but hitting the DB for each instance would be very slow, as some companies upload tens of thousands of employees at a time.
Yes, that's possible with your own validation. I think you have to hit the database; otherwise you could never check whether the value already exists.
def your_validation
  employee_ids = Employee.all.map(&:id_number)
  member_ids = Member.all.map(&:id_number)
  id = self.id_number
  if employee_ids.include?(id) || member_ids.include?(id)
    errors.add(:id_number, "is already taken")
  end
end
I think adding an index to your id_number will be good.
UPDATE: The above method could be changed as follows to improve performance:

def your_validation
  employee_ids = Employee.all.map(&:id_number)
  if employee_ids.include?(self.id_number)
    errors.add(:id_number, "is already taken")
  else
    member_ids = Member.all.map(&:id_number)
    if member_ids.include?(self.id_number)
      errors.add(:id_number, "is already taken")
    end
  end
end
The first one is cleaner; the second should be faster. But verify this with a lot of DB entries and a benchmark tool.
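For instance, a rough comparison with Ruby's built-in Benchmark module could look like this (numbers will depend entirely on your data set):

require 'benchmark'

record = Employee.new(id_number: '12345')

Benchmark.bm(14) do |x|
  # Loads every row into Ruby, then searches in memory
  x.report('map/include?') { Employee.all.map(&:id_number).include?(record.id_number) }
  # Lets the database do the search (see the next answer)
  x.report('exists?') { Employee.exists?(id_number: record.id_number) }
end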
I think you'll want something like this:
def your_validation
  if self.id_number.present?
    if Employee.exists?(id_number: self.id_number) || Member.exists?(id_number: self.id_number)
      errors.add(:id_number, "is already taken")
    end
  end
end
If you have indexes on the id_number columns, this check should run very quickly, and it is the same check that validates_uniqueness_of performs within a single table. Solutions that involve fetching all ids into Rails will start running into problems as the tables get large.
Another thing to note: if your app runs multiple web server instances at a time, these Rails-side checks can't 100% guarantee uniqueness, because they are subject to races between threads. The only way to ensure uniqueness in such situations is to use facilities built into your database, or to generate the id_numbers yourself from a source that precludes duplicates (such as a database sequence).
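As a sketch of the database-sequence idea (PostgreSQL syntax; this only applies if you generate id_numbers yourself rather than accept them from uploads, and the sequence name is made up):

class ShareIdNumberSequence < ActiveRecord::Migration
  def up
    # Both tables draw their default id_number from one sequence,
    # so generated values can never collide across the two tables
    execute "CREATE SEQUENCE id_numbers_seq"
    execute "ALTER TABLE employees ALTER COLUMN id_number SET DEFAULT nextval('id_numbers_seq')"
    execute "ALTER TABLE members ALTER COLUMN id_number SET DEFAULT nextval('id_numbers_seq')"
  end

  def down
    execute "ALTER TABLE employees ALTER COLUMN id_number DROP DEFAULT"
    execute "ALTER TABLE members ALTER COLUMN id_number DROP DEFAULT"
    execute "DROP SEQUENCE id_numbers_seq"
  end
end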
I want to create a simple check of a value from the database. Here is my code:
def check_user_name(name, email)
  db_name = Customers.find_by_name(name).to_s
  db_email = Customers.find_by_email(email).to_s
  if name == db_name && email == db_email
    return 'yes'
  else
    return 'no'
  end
end
But I always get the 'no' branch... why?
Because you are calling to_s on your Customers record rather than actually getting the name. The two fetch lines you have should be:
Customers.find_by_name(name).name.to_s # to_s probably not necessary if you know this field is a string
Customers.find_by_email(email).email
But you're making two separate requests to the database. I don't know what the purpose of this is (you could be selecting two different Customers), but you could do:
if Customers.where(name: name, email: email).exists?
  'yes'
else
  'no'
end
Since you are selecting by name and email, I would highly recommend making sure those fields are indexed, because with large tables those queries will bog down the server and make that route rather slow (I would actually recommend pursuing other, more viable routes, but I wanted to mention this).
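For example, a migration along these lines would add the index (the customers table name is inferred from the model):

class AddNameEmailIndexToCustomers < ActiveRecord::Migration
  def change
    # A composite index covering the where(name: ..., email: ...) lookup above
    add_index :customers, [:name, :email]
  end
end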
When you call Customers.find_by_name(name), you will not get the name of a customer. It actually returns an ActiveRecord object, so from this object you need to get the customer's name and email, like below:
name = Customers.find_by_name(name).name
email = Customers.find_by_email(email).email
Now you will get the exact name and email of the matching record from the DB.
In my User model, I have:
validates_uniqueness_of :fb_uid (I'm using facebook connect).
However, at times I'm getting duplicate rows upon user sign-up. This is Very Bad.
The creation times of the two records are within 100ms of each other. I haven't been able to determine whether it happens in two separate requests or not (Heroku logging sucks, only goes back so far, and it has only happened twice).
Two things:
Sometimes the request takes some time, because I query the FB API for name info, friends, and picture.
I'm using bigint to store fb_uid (the backend is Postgres).
I haven't been able to replicate it in dev.
Any ideas would be extremely appreciated.
The sign-in function:
def self.create_from_cookie(fb_cookie, remote_ip = nil)
  return nil unless fb_cookie
  return nil unless fb_hash = authenticate_cookie(fb_cookie)
  uid = fb_hash["uid"].join.to_i

  # Make user and set data
  fb_user = FacebookUser.new
  fb_user.fb_uid = uid
  fb_user.fb_authorized = true
  fb_user.email_confirmed = true
  fb_user.creation_ip = remote_ip

  fb_name_data, fb_friends_data, fb_photo_data, fb_photo_ext = fb_user.query_data(fb_hash)
  return nil unless fb_name_data
  fb_user.set_name(fb_name_data)
  fb_user.set_photo(fb_photo_data, fb_photo_ext)

  # Save user and friends to the db
  return nil unless fb_user.save
  fb_user.set_friends(fb_friends_data)
  return fb_user
end
I'm not terribly familiar with Facebook Connect, but is it possible to get two of the same uid if two separate users from two separate accounts post requests in very quick succession, before either request has completed? (Otherwise known as a race condition.) validates_uniqueness_of can still suffer from this sort of race condition; details can be found here:
http://apidock.com/rails/ActiveModel/Validations/ClassMethods/validates_uniqueness_of
Because this check is performed outside the database there is still a chance that duplicate values will be inserted in two parallel transactions. To guarantee against this you should create a unique index on the field. See add_index for more information.
You can really make sure this will never happen by adding a database constraint. Add this to a database migration and then run it:
add_index :users, :fb_uid, :unique => true
Now a user would get an error instead of being able to complete the request, which is usually preferable to generating invalid data in your database that you then have to debug and clean out manually.
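If you would rather handle that error gracefully than let it bubble up, one possible pattern (assuming the unique index above is in place) is:

begin
  fb_user.save
rescue ActiveRecord::RecordNotUnique
  # Another request won the race: reuse the row it created
  fb_user = FacebookUser.find_by_fb_uid(fb_user.fb_uid)
end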
From Ruby on Rails v3.0.5 Module ActiveRecord::Validations::ClassMethods
http://s831.us/dK6mFQ
Concurrency and integrity

Using this [validates_uniqueness_of] validation method in conjunction with ActiveRecord::Base#save does not guarantee the absence of duplicate record insertions, because uniqueness checks on the application level are inherently prone to race conditions. For example, suppose that two users try to post a Comment at the same time, and a Comment's title must be unique. At the database level, the actions performed by these users could be interleaved in the following manner: ...
It seems like there is some sort of race condition in your code. To check this, I would first change the code so that the Facebook values are extracted first and only then is the new user object created.
Then I would highly suggest writing a test to check whether your function gets executed only once; it seems it is being executed twice.
Beyond that, there appears to be a race condition while waiting for the Facebook results.
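A rough sketch of such a test (it assumes RSpec, that authenticate_cookie and query_data are stubbed to return fixed values, and that the method lives on FacebookUser as the snippet suggests):

it "does not create duplicate users for the same fb_uid" do
  expect {
    2.times { FacebookUser.create_from_cookie(fb_cookie) }
  }.to change(FacebookUser, :count).by(1)
end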