Resque can't update objects

I am trying to make a simple background job to import some CSV information using Resque.
The job runs, but once I edit an object it seems like it's ruined or something.
Once this job runs, it finds the user from the first line, but then @user.name never gets set, and the record can't save. I can run this code in my console and it works great; it only breaks in Resque. Is there some limitation that you can't work with objects or write to the database from Resque? I've wasted 6 hours so far trying all sorts of things, please help.
def self.perform(chunk)
  chunk.each do |i|
    @user = User.where(:email => i[:email].to_s).first_or_initialize
    @user.name = i[:name].to_s
    if @user.save
      puts @user.email
      entity = Entity.find_by_email_domain(@user.email)
      eur = EntityUser.where(:entity_id => entity.id, :user_id => @user.id).first_or_initialize
      if eur.save
        puts "Start: BLARGITY End"
        topic = Topic.where(:number => i[:course_number].to_s).first_or_initialize
        eu = TopicUser.where(:topic_id => topic.id, :user_id => @user.id, :role_i => 1).first_or_initialize
      else
        eu = TopicUser.where(:topic_id => topic.id, :user_id => @user.id).first_or_initialize
      end
      eu.save
    end
  end
end
I just tried doing a find instead, like this, and now you can see the error:
NoMethodError: undefined method `name=' for nil:NilClass
@user = User.find_by_email(i[:email])
puts "sdfdsf"
@user.name = i[:name]
puts @user.name

I was using smarter_csv to send in 'chunk', which is an array of hashes. When Resque serializes the job arguments through Redis (as JSON), the hash keys come back as strings, so you need to symbolize them before you start using them again.
chunk.each do |i|
  i.symbolize_keys!
  @user = User.find_by_email(i[:email])
  @user.name = i[:name]
end
It works after symbolizing the keys!
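For completeness, a minimal sketch of the same fix inside the worker; the class and queue names here are made up, and with_indifferent_access is just an alternative to symbolize_keys!:
class ImportJob
  @queue = :imports

  def self.perform(chunk)
    chunk.each do |row|
      row = row.symbolize_keys # the Redis/JSON round trip turns symbol keys into strings
      user = User.where(:email => row[:email].to_s).first_or_initialize
      user.name = row[:name].to_s
      user.save
    end
  end
end
Alternatively, row = row.with_indifferent_access lets you keep writing row[:email] without mutating the original hash.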

Related

Delayed Job and S3 upload can't finish the process, but also doesn't give any error

I am working on a Rails project, generating Excel files from the User model and then uploading them to Amazon S3. Everything so far works perfectly, but I also want to use Delayed Job, and that is where the problem comes in:
when I call the method with delay, the job can't complete, but I also don't get any error. If I look at the delayed job log I get this:
[Worker(host:pure pid:11063)] Starting job worker
[Worker(host:pure pid:11063)] Job Delayed::PerformableMethod (id=77) RUNNING
[Worker(host:pure pid:11063)] Job Delayed::PerformableMethod (id=78) RUNNING
[Worker(host:pure pid:11063)] Job Delayed::PerformableMethod (id=79) RUNNING
[Worker(host:pure pid:11063)] Job Delayed::PerformableMethod (id=80) RUNNING
It seems like the job is stuck and can't finish the process!
Here are all of my methods:
User's Controller:
def export
  if params[:search]
    @users = User.all.order('created_at DESC')
  else
    @users = User.search(params[:search]).order('created_at DESC')
  end
  User.delay.export_users_to_xlsx(@users, current_user)
  redirect_to action: :index
end
User's Model:
def self.export_users_to_xlsx(users, user)
  file = User.create_excel(users)
  s3_upload = FileManager.upload_to_s3(file)
  link = s3_upload[:file].presigned_url(:get, expires_in: 10.days) if s3_upload
end
def self.create_excel(users)
  Spreadsheet.client_encoding = 'UTF-8'
  filename = "#{Rails.root}/tmp/#{ DateTime.now.strftime("%m%d%Y%H%M%S") }_users.xlsx"
  array = [["Name", "Balance", "State", "Address"]]
  users.each do |user|
    array.push([user.name, user.balance, user.state, user.address])
  end
  ExportXls.create_spreadsheet(array, filename)
end
and here is the lib for creating the spreadsheet:
class ExportXls
  def self.create_spreadsheet(content, filename)
    Spreadsheet.client_encoding = 'UTF-8'
    book = Spreadsheet::Workbook.new
    sheet1 = book.create_worksheet
    content.each_with_index do |row, index|
      row.each do |column|
        sheet1.row(index).push column
      end
    end
    book.write filename
    return filename
  end
end
And this is the method for uploading the file to S3, which is also in lib:
def self.upload_to_s3(temp_file)
  s3 = Aws::S3::Resource.new
  begin
    obj = s3.bucket(ENV['S3_BUCKET']).object(File.basename(temp_file))
    obj.upload_file(temp_file)
    File.delete(temp_file) if File.exists?(temp_file)
    { :result => 'success', :file => obj }
  rescue Aws::S3::Errors::ServiceError => error
    { :result => 'failed', :message => error.message }
  end
end
Any suggestions why I can't get the delayed job to work?
I am using the delayed_job_active_record gem!
Sorry for my bad English!
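For what it's worth, delayed_job_active_record keeps job state in the delayed_jobs table, so a console check can show whether the jobs are failing silently or simply hanging (column names are the gem's standard ones):
Delayed::Job.where.not(last_error: nil).pluck(:last_error) # stack traces from failed runs
Delayed::Job.where.not(locked_by: nil).count               # jobs a worker has picked up but not finished
If last_error stays empty while the jobs remain locked, the worker is most likely blocked inside the job (for example waiting on the S3 upload) rather than raising.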

Rails 4 - How to send newsletter-like emails with delayed_job?

I want to send a summary of our new lists to our users every morning. What's the best approach to do that with Ruby On Rails 4, ActiveRecord (using SendGrid) and Delayed Job?
I am currently doing it this way:
In controller:
def yesterday_listings_for_users
  yesterday_listings = Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1)
  if yesterday_listings.count > 0
    NotificationMailer.delay.yesterday_listings_for_users_notification
  end
  render :nothing => true
end
And then in the mailer:
def yesterday_listings_for_users_notification
  @listings = Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1)
  mail(to: 'myemail@gmail.com', subject: "Latest Listings", from: 'no-reply@mywebsite.com')
end
Using a cron job, this sends the report to my email address every morning. I have a few hundred users in the database and I would like to send this email to them as well.
How can I do that? I am thinking about something like this:
def yesterday_listings_for_users_notification
  @listings = Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1)
  User.all.each do |user|
    mail(to: user.email, subject: "Latest Listings", from: 'no-reply@mywebsite.com')
  end
end
However, is looping through hundreds of records in the database and sending hundreds of emails inside a delayed mailer method recommended (or appropriate)?
Is there a better way to do that?
Thank you in advance!
I usually prefer to use Sidekiq along with Sidetiq, but if you want to use delayed_job I would advise you to use the whenever gem for simplicity.
Whenever is a Ruby gem that provides a clear syntax for writing and deploying cron jobs.
Add gem 'whenever' to your Gemfile.
Run the command wheneverize . which will generate a file config/schedule.rb.
In your config/schedule.rb, do the following:
every 1.day, :at => '11:30 am' do
  runner "User.delay.send_daily_newsletter"
end
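One thing to note, assuming you are managing cron by hand rather than through the gem's Capistrano integration: the schedule only takes effect once it is written to the crontab with whenever's own command:
whenever --update-crontab   # writes config/schedule.rb into your crontab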
In your user.rb, define the method send_daily_newsletter and use find_each instead of all.each (it loads users in batches):
def self.send_daily_newsletter
  listings = Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1).select(:title).to_json
  User.select(:id, :email).find_each do |u|
    NotificationMailer.delay.send_daily_newsletter(u.email, listings)
  end
end
In your notification_mailer.rb, define send_daily_newsletter:
def send_daily_newsletter(user_email, listings)
  @listings = listings
  mail(to: user_email, subject: "Latest Listings", from: 'no-reply@mywebsite.com')
end
This way you will have one delayed job that collects all users and then sends each email from a separate worker, which is an efficient way to handle this task.
Note: Do not forget to change the methods for listings in your view from, for example, listing.title to listing[:title], since we are passing the listings as JSON.
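Since to_json hands the mailer a string, it needs to be parsed back before the view can call listing[:title]. A minimal sketch of the mailer with that step added (same signature as above):
def send_daily_newsletter(user_email, listings)
  @listings = JSON.parse(listings, symbolize_names: true) # back to an array of hashes
  mail(to: user_email, subject: "Latest Listings", from: 'no-reply@mywebsite.com')
end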
If you do not want to pass the listings as JSON to every delayed task, cache them in Rails.cache and clear the cache after you finish sending.
EDIT:
If you would like to use the cache approach because you ran into a problem with the delayed_job gem, edit the send_daily_newsletter method in your mailer. (That is why I would go with Redis-based Sidekiq rather than MySQL-based delayed_job.)
def send_daily_newsletter(user_email)
  @listings = Rails.cache.fetch('today_listings') { Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1) }
  mail(to: user_email, subject: "Latest Listings", from: 'no-reply@mywebsite.com')
end
And in your user.rb:
def self.send_daily_newsletter
  User.select(:id, :email).find_each do |u|
    NotificationMailer.delay.send_daily_newsletter(u.email)
  end
  Rails.cache.delete('today_listings') # remove just this key; Rails.cache.clear would wipe the whole cache
end
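A small hedge on the cache approach: Rails.cache.fetch also accepts an :expires_in option, so the cached listings expire on their own even if the delete at the end never runs (say, because the job raises):
Rails.cache.fetch('today_listings', expires_in: 12.hours) { Listing.where('status = "0" AND (DATE(created_at) = ?)', Date.today - 1) }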
Good luck. I have been doing these email newsletters for a while now and they are truly a pain :D

Strong parameters in test issue

Ruby 2.1.1p76 on Rails 4.1.1.
Please check out my controller:
def update
  begin
    current_user.update_settings user_settings_params unless params[:user_setting].blank?
    current_user.update_attribute :district_id, params[:user][:district_id] unless params[:user].blank? || params[:user][:district_id].blank?
    flash[:success] = "Preferencje zostały zaktualizowane"
    redirect_to subscription_index_path
  rescue UserLevelException => exception
    flash[:alert] = "Sprytnie, za karę zostałeś wylogowany ;)"
    session[:user_id] = nil
    redirect_to root_path
    return
  end
end
private

def user_settings_params
  params.require(:user_setting).permit(
    :inquiry_subject, :inquiry_body,
    :offer_subject, :offer_body,
    :only_companies_with_email,
    {:district_ids => []},
    # {:district_ids => params[:user_setting][:district_ids].try(:keys)},
    :delivery_address,
  )
end
See the commented-out line? With the form above, user_settings_params will not return the :district_ids array of IDs, and this is fine, since I can use the commented line instead to get them (I got it from the guides).
The problem I have is when running this test:
test 'should set user level10 districts' do
  user = login_user :paid10
  post :update, :user_setting => {:district_ids => [districts(:zachodniopomorskie).id, districts(:slaskie).id]}
  assert_equal nil, flash[:alert]
  assert_equal 'Preferencje zostały zaktualizowane', flash[:success]
  db_user_districts = User.find(user.id).settings.districts.all
  assert db_user_districts.include? districts(:zachodniopomorskie)
  assert db_user_districts.include? districts(:slaskie)
  assert_equal 2, db_user_districts.count
end
It passes. When debugging, user_settings_params has :district_ids available, as if strong parameters were disabled or something. I wanted to submit an issue to Rails, but most probably I'm doing something wrong and can't figure it out.
I've found it: it was because of the quirky way I was creating checkboxes for the HABTM association.
= check_box_tag "user_setting[district_ids][#{district.id}]", district.id, user.settings.district_ids.include?(district.id)
= label_tag "user_setting[district_ids][#{district.id}]", district.name
For no particular reason I had inserted the IDs into the params keys AND values. Because of that, they were passed to the params object as a hash, while in the test they were sent as an array. So the view was to blame.
= check_box_tag "user_setting[district_ids][]", district.id, user.settings.district_ids.include?(district.id)
= label_tag "user_setting[district_ids][]", district.name
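A minimal sketch of the difference in the incoming params (the IDs are illustrative). The permitted shape {:district_ids => []} only lets the array form through, which is why the test, which sends an array, passed while the form did not:
# old checkbox name "user_setting[district_ids][42]" arrives as a hash:
#   { "user_setting" => { "district_ids" => { "42" => "42", "43" => "43" } } }
# fixed checkbox name "user_setting[district_ids][]" arrives as an array:
#   { "user_setting" => { "district_ids" => ["42", "43"] } }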

Speeding up Ruby code to make faster/more API calls

I have the following code:
list_entities = [{:phone => '0000000000', :name => 'Test', :"@i:type" => '1'}, {:phone => '1111111111', :name => 'Demo', :"@i:type" => '1'}]
list_entities.each do |list_entity|
  phone_contact = PhoneContact.create(list_entity.except(:"@i:type"))
  add_record_response = api.add_record_to_list(phone_contact, "API Test")
  if add_record_response[:add_record_to_list_response][:return][:list_records_inserted] != '0'
    phone_contact.update(:loaded_at => Time.now)
  end
end
This code takes an array of hashes and creates a new phone_contact for each one. It then makes an API call (add_record_response) to do something with that phone_contact. If that API call is successful, it updates the loaded_at attribute for that specific phone_contact. Then it starts the loop over.
I am allowed something like 7,200 API calls per hour with this service; however, I'm only able to make about one API call every 4 seconds right now.
Any thoughts on how I could speed this code block up to make faster API calls?
I would suggest using a thread pool. You can define a unit of work to be done and the number of threads you want to process the work on. This way you can get around the bottleneck of waiting for the server to respond to each request. Maybe try something like the following (disclaimer: this was adapted from http://burgestrand.se/code/ruby-thread-pool/):
require 'thread'

class Pool
  def initialize(size)
    @size = size
    @jobs = Queue.new
    @pool = Array.new(@size) do |i|
      Thread.new do
        Thread.current[:id] = i
        catch(:exit) do
          loop do
            job, args = @jobs.pop
            job.call(*args)
          end
        end
      end
    end
  end

  def schedule(*args, &block)
    @jobs << [block, args]
  end

  def shutdown
    @size.times do
      schedule { throw :exit }
    end
    @pool.map(&:join)
  end
end
p = Pool.new(4)

list_entities.each do |list_entity|
  p.schedule do
    phone_contact = PhoneContact.create(list_entity.except(:"@i:type"))
    add_record_response = api.add_record_to_list(phone_contact, "API Test")
    if add_record_response[:add_record_to_list_response][:return][:list_records_inserted] != '0'
      phone_contact.update(:loaded_at => Time.now)
    end
    puts "Job for #{list_entity[:phone]} finished by thread #{Thread.current[:id]}"
  end
end

at_exit { p.shutdown } # waits for every scheduled job to finish before the process exits
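As a follow-up on the 7,200-calls-per-hour limit: that is 2 calls per second in total, so a small pool is enough and each thread can pace itself. A rough sketch, assuming the pool of 4 above and the same api object and phone_contact from the scheduled block; the constant is just arithmetic, not something the service requires:
MIN_SECONDS_PER_CALL = (3600.0 / 7200) * 4 # 2.0 seconds per call per thread with 4 threads

p.schedule do
  started = Time.now
  # ... create the PhoneContact and call the API as in the block above ...
  elapsed = Time.now - started
  sleep(MIN_SECONDS_PER_CALL - elapsed) if elapsed < MIN_SECONDS_PER_CALL # keeps total throughput under 7,200/hour
end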

Rails validations running against original record during update

I'm trying to figure out an inconsistency between what's happening in a functional test and what is happening in my development environment. I have a custom validation method unique_entry that is essentially a specialized version of validates_uniqueness_of. It looks like this:
def unique_entry
  matched_entry = Entry.first(:conditions => ['LOWER(field_one) = LOWER(?) AND LOWER(field_two) = LOWER(?)', self.field_one, self.field_two])
  errors.add_to_base('Duplicate detected') if matched_entry && (matched_entry.id != self.id)
end
The update action in the controller is very basic:
def update
  if @entry.update_attributes(params[:entry])
    flash.now[:success] = 'Success'
    render :action => 'show'
  else
    flash.now[:error] = 'Error'
    render :action => 'edit'
  end
end
This works just fine when I'm creating a new record. When I update a record, however, I get inconsistent behavior. If I test it from a browser in my development environment, it correctly renders the edit action with an error message, but in my functional test, it accepts the update as successful. Here is the test:
test "should not update entry and should render edit view if invalid update" do
put :update, { :id => 1, :field_one => 'new_value', :field_two => 'new_value' } # 'new values' are the same as another existing record to trigger the duplication check
assert_template :edit
assert_not_nil flash[:error]
end
I looked at the test log and discovered that the values unique_entry is using are the record's original values instead of the values it should be attempting to update with. That is, the first line of unique_entry generates an SQL query like this:
SELECT * FROM "entries" WHERE (LOWER(field_one) = LOWER('original_value_of_field_one') AND LOWER(field_two) = LOWER('original_value_of_field_two')) LIMIT 1
What am I missing here? Why do my validations seem to be running against the original record instead of the new values only in the test environment?
In your test, shouldn't there be some reference to :entry, since that is what the controller looks for in params[:entry]?
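A minimal sketch of the corrected call, assuming the same fixture; without the :entry wrapper, update_attributes(params[:entry]) most likely receives nil, nothing changes, and the validation query runs against the record's original values, which matches the SQL seen in the test log:
put :update, :id => 1, :entry => { :field_one => 'new_value', :field_two => 'new_value' }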
