How do I get the last updated date in an array?

This code just displays the values inside the array model.request_reports. To get the most recent, I have to loop through and compare the current report.updated_at with the last saved report.updated_at value. One thing to find out is what class the updated_at field is and how to compare two of them against each other; the class is ActiveSupport::TimeWithZone.
I need to keep track of the array index of the report that has the most recent updated_at as I loop, so that I can access it after the loop.
The problem is, I don't know how to do this:
msg = ""
reports_arr = model.request_reports
reports_arr.each do |report|
updated_at = report.updated_at
if updated_at
msg = msg + "#{updated_at} --- "
msg = msg + "#{updated_at.class}---"
end
end
msg

To add to @meagar's comment: you should let the database do the sorting.
With that said, we need to know which database you are using, as the exact command differs for each.
MongoDB with Mongoid would be Model.order_by(:updated_at => 'desc').first
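For comparison, a sketch of the equivalent in ActiveRecord, assuming a relational database and the standard updated_at column:

# Let the database sort and return only the newest row.
Model.order(updated_at: :desc).first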

My loop had to go through the array and check for the greatest date value because the system I'm using automatically sorts the reports array by the field "due_at", which does not correspond to the most recently updated record. The code below works for me.
msg = ""
reports_arr = model.request_reports
last_modified_report = model.last_modified_report
recent = nil
recent_report = nil
reports_arr.each_with_index do |report,index|
updated_at = report.updated_at
if index == 0
recent = updated_at
recent_report = report
end
if updated_at > recent
recent = updated_at
recent_report = report
end
last_modified_report = recent_report
end
msg = msg + "#{recent}---"
msg = msg + "#{recent_report}---"
msg = msg + "#{last_modified_report}"
model.last_modified_report = last_modified_report
model.save(validate: false)
msg
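For what it's worth, the same in-memory search can be written more compactly with Enumerable#max_by; a minimal sketch, assuming every report has a non-nil updated_at:

# Returns the report whose updated_at is the latest.
last_modified_report = reports_arr.max_by(&:updated_at)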

The OP's answer is only good if you absolutely cannot query the database directly for the information you want. I assume you only want the index so you can find the most recent report?
Even if the automatic sorting is on one column, your query for the data can sort on a different column.
model.request_reports.order_by(:updated_at => 'desc').first
If you have a default scope that's interfering with your query, you can ask for an unscoped list, although I doubt a default ordering would cause any trouble.
model.unscoped.order_by(:updated_at => 'desc').first
You can also chain queries that are already written; that can be useful even if request_reports is a query or scope you have somewhere.
It will be far less expensive than fetching everything and looping through it: you are always better off finding a way to get just the information you need in a database query if you can.

Related

Use gem 'postgres-copy' to import a CSV file

Currently, I want to import over 55,000 records into my database from a CSV file. This is the code I am using:
CSV.foreach(Rails.root.join('db/seeds/locations.csv'), headers: true) do |row|
  val = Location.find_or_initialize_by(code: row[0])
  val.name = row[1]
  val.ecc = row[2] || 'MISSING'
  val.created_by = User.find_by(name: 'anh')
  val.updated_by = User.find_by(name: 'anh')
  val.save!
end
However, it is too slow, so I have just installed the gem 'postgres-copy'. I read the official documentation, and I believe I can use the class method copy_from to do the job. But as you can see in my current code, I am referencing data in another table (an association), and the documentation doesn't mention anything about associations or validations. Therefore, I am wondering if there is any way to solve this. This is the first time I have used this gem. Thanks for reading.
I don't know that gem, but I would be very surprised if it supported a multi-table copy, since PostgreSQL's COPY works on a single table. 50K rows isn't all that many. You might try wrapping your insertions in transactions to avoid one commit per row. You probably don't want to wrap all 50K in a single transaction, but something like this:
User.connection.begin_transaction
i = 0
CSV.foreach(...) do |row|
  ... # your original code here
  i += 1
  # Commit and start a fresh transaction every 500 rows
  if i % 500 == 0
    User.connection.commit_transaction
    User.connection.begin_transaction
  end
end
User.connection.commit_transaction
This will insert your rows 500 records at a time and you should see a noticeable speed up. Play around with the value of 500 to find the sweet spot.
So now I understand that I cannot take advantage of PostgreSQL's COPY command, since it can't copy to multiple tables. I therefore switched to the gem activerecord-import. Compared with the method Philip Hallstrom mentioned above, activerecord-import gives a faster result: 1m20s vs 1m54s to import over 8,000 records.
This is my code after installing the gem activerecord-import. Hopefully it can help other people.
locations = []
columns = [:code, :name, :ecc]
CSV.foreach(Rails.root.join('db/seeds/locations.csv'), headers: true) do |row|
  val = Location.find_or_initialize_by(code: row[0])
  val.name = row[1]
  val.ecc = row[2] || 'MISSING'
  val.created_by = User.find_by(name: 'anh')
  val.updated_by = User.find_by(name: 'anh')
  locations << val
end
# One multi-row INSERT for all accumulated records
Location.import columns, locations, validate: false
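One caveat: since find_or_initialize_by can return records that already exist, a plain import will INSERT those rows again. If there is a unique index on code, activerecord-import's documented on_duplicate_key_update option turns the import into an upsert; a sketch, assuming PostgreSQL:

# Update name and ecc when a row with the same code already exists.
Location.import columns, locations,
                validate: false,
                on_duplicate_key_update: { conflict_target: [:code], columns: [:name, :ecc] }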

How to apply a lock on a particular column in Ruby on Rails?

How can I apply a lock on a particular field so that the same number is not generated twice?
I have created an algorithm which builds a string from the year plus a zero-padded integer number,
for example: "20150001", "20150002", "20150003", etc.
The problem is that when multiple users request a number at the same time, the same number is generated.
This is the function I call:
def get_algo_number(model_name, prefix)
  year = get_year
  if model_name.count > 0
    last_number = model_name.last.number
    if last_number[2..5].to_i > year.to_i
      return create_number(year, prefix)
    else
      # if the latest generated number already exists, generate the next number
      return last_number.next
    end
  else
    return create_number(year, prefix)
  end
end
Please help if you have any solution for applying such a lock.
Thanks
Yes, I resolved this problem by using multi-threading.
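For reference, a common way to serialize number generation like this is database-level pessimistic locking. A minimal sketch, assuming a hypothetical Counter model with a last_number column (not from the original post):

def next_number(prefix)
  Counter.transaction do
    # SELECT ... FOR UPDATE: a concurrent caller blocks here until we commit.
    counter = Counter.lock.first
    counter.last_number += 1
    counter.save!
    "#{Time.current.year}#{prefix}#{format('%04d', counter.last_number)}"
  end
end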

Rails 3: object.save is writing the old values to the database

I have code which updates a model's property and then calls save!. A Rails.logger.info call shows that the model thinks it has the new values, but the SQL write performed by the save! call writes the old value to the database.
At first it wasn't writing anything to the database at all when I called save!. I thought the object didn't consider its value changed for some reason: changed? returned false, so I used a _will_change! notification to force a write. Now it does perform a write, but with the old values.
This doesn't happen from the "rails console" command line: there I'm able to update the property, changed? returns true, and the save succeeds.
Excerpt from the server log follows. Note that the object thinks it has log_ids of '1234,5678,1137', but writes to the database '1234,5678'.
current log ids are [1234, 5678]
new log ids are [1234, 5678, 1137]; writing log_ids of '1234,5678,1137' to NewsList 13 with dirty true
SQL (2.0ms) UPDATE "news_lists" SET "log_ids" = '1234,5678', "updated_at" = '2012-01-02 02:12:17.612283' WHERE ("news_lists"."id" = 13)
The object property in question is log_ids, which is a string containing several IDs of another kind of object.
The source code that produced the output above:
def add_log(new_log)
  new_ids = get_log_ids
  Rails.logger.info("current log ids are #{new_ids}")
  if new_ids.length >= NewsList.MAX_LENGTH
    new_ids.shift
  end
  log_ids_will_change!
  new_ids.push new_log.id
  log_ids = new_ids.join ","
  Rails.logger.info("new log ids are #{new_ids}; writing log_ids of '#{log_ids}' to NewsList #{id} with dirty #{changed?}")
  save!
end

def get_log_ids
  if log_ids
    log_ids.split(",").map(&:to_i)
  else
    []
  end
end
Can anyone suggest what might be going on here?
Add self to the assignment, i.e. self.log_ids = new_ids.join ",". Otherwise you are just assigning to a local variable of the same name instead of the db-persisted attribute (column).
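That is, the one-line fix inside add_log:

# Without "self", Ruby creates a new local variable named log_ids;
# the explicit receiver calls the ActiveRecord attribute writer instead.
self.log_ids = new_ids.join(",")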

find_or_create and race-condition in rails, theory and production

Hi, I have this piece of code:
class Place < ActiveRecord::Base
  def self.find_or_create_by_latlon(lat, lon)
    place_id = call_external_webapi
    result = Place.where(:place_id => place_id).limit(1)
    result = Place.create(:place_id => place_id, ... ) if result.empty? #!
    result
  end
end
Then I'd like to do this in another model or controller:
p = Post.new
p.place = Place.find_or_create_by_latlon(XXXXX, YYYYY) # race-condition
p.save
But Place.find_or_create_by_latlon takes too long to return the data when the create path is executed, and sometimes in production p.place is nil.
How can I force it to wait for the response before executing p.save?
Thanks for your advice.
You're right that this is a race condition, and it can often be triggered by people who double-click submit buttons on forms. What you might do is loop back and look the record up again if you encounter an error:
result = Place.find_by_place_id(...) ||
         Place.create(...) ||
         Place.find_by_place_id(...)
There are more elegant ways of doing this, but the basic method is here.
I had to deal with a similar problem. In our backend, a user is created from a token if the user doesn't exist. AFTER the user record is created, a slow API call is sent to update the user's information.
def self.find_or_create_by_facebook_id(facebook_id)
  User.find_by_facebook_id(facebook_id) || User.create(facebook_id: facebook_id)
rescue ActiveRecord::RecordNotUnique => e
  User.find_by_facebook_id(facebook_id)
end

def self.find_by_token(token)
  facebook_id = get_facebook_id_from_token(token)
  user = User.find_or_create_by_facebook_id(facebook_id)
  if user.unregistered?
    user.update_profile_from_facebook
    user.mark_as_registered
    user.save
  end
  return user
end
The first step of the strategy is to remove the slow API call (in my case update_profile_from_facebook) from the create method. Because the operation takes so long, including it in the call to create significantly increases the window for duplicate insert operations.
The second step is to add a unique constraint to your database column to ensure duplicates aren't created.
The final step is to create a function that will catch the RecordNotUnique exception in the rare case where duplicate insert operations are sent to the database.
This may not be the most elegant solution but it worked for us.
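For the unique constraint in the second step, a sketch of the migration, assuming the users table and facebook_id column from the code above:

class AddUniqueIndexToUsersFacebookId < ActiveRecord::Migration
  def change
    # The database now rejects a second row with the same facebook_id,
    # raising ActiveRecord::RecordNotUnique in the losing process.
    add_index :users, :facebook_id, unique: true
  end
end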
I hit this inside a Sidekiq job that retries, gets the error repeatedly, and eventually clears itself. The best explanation I've found is in a blog post here. The gist is that Postgres keeps an internally stored sequence value for incrementing the primary key, and that value can get out of sync. This rings true for me because I'm setting the primary key myself rather than using an incremented value, so that's likely how this cropped up. The solution from the comments in the link above is to call ActiveRecord::Base.connection.reset_pk_sequence!(table_name). This cleared up the issue for me.
begin
  result = Place.where(:place_id => place_id).limit(1)
  result = Place.create(:place_id => place_id, ... ) if result.empty? #!
rescue ActiveRecord::StatementInvalid => error
  @save_retry_count = (@save_retry_count || 1)
  # Re-sync Postgres' primary-key sequence for the places table, then retry once
  ActiveRecord::Base.connection.reset_pk_sequence!(:places)
  retry if (@save_retry_count -= 1) >= 0
  raise error
end

Use a function in a conditions hash

I'm building a conditions hash to run a query, but I'm having a problem with one specific case:
conditions2 = ['extract(year from signature_date) = ?', params[:year].to_i] unless params[:year].blank?
conditions[:country_id] = COUNTRIES.select{|c| c.geography_id == params[:geographies]} unless params[:geographies].blank?
conditions[:category_id] = CATEGORY_CHILDREN[params[:categories].to_i] unless params[:categories].blank?
conditions[:country_id] = params[:countries] unless params[:countries].blank?
conditions['extract(year from signature_date)'] = params[:year].to_i unless params[:year].blank?
But the last line breaks everything, as it gets interpreted as follows:
AND ("negotiations"."extract(year from signature_date)" = 2010
Is there a way to avoid having "negotiations". prepended to my condition?
thank you,
P.
For something like this, you'll probably have to write your own SQL with find_by_sql. Still, wrap it in a method in your model so your model's friends can access it nicely.
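For illustration, a sketch of both routes, assuming the model is Negotiation (per the "negotiations" table in the generated SQL):

# Option 1: raw SQL via find_by_sql, wrapped in a class method on the model.
def self.signed_in_year(year)
  find_by_sql(['SELECT * FROM negotiations WHERE extract(year from signature_date) = ?', year.to_i])
end

# Option 2: keep the hash conditions and chain the SQL fragment as a string condition.
scope = Negotiation.where(conditions)
scope = scope.where('extract(year from signature_date) = ?', params[:year].to_i) unless params[:year].blank?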
