How to exclude associations built with model.associated.build? - ruby-on-rails

I'm currently upgrading an existing application from Rails 3 to 4. I've encountered changed behaviour, but I'm not sure how to fix it.
Say the following exists:
class Program
  has_many :pauses
end

class Pause
  belongs_to :program

  def dates
    (starts_at...ends_at)
  end

  validate :validate_without_overlap

  def validate_without_overlap
    return if (program.pauses.map(&:dates).flatten & [starts_at, ends_at]).blank?
    # set errors...
  end
end
program = Program.create
program.pauses.build starts_at: 1.week.ago.to_date, ends_at: Date.today
# ...
program.save
To verify that the pause does not overlap with existing pauses, the following happens inside the validation method:
program.pauses.map(&:dates)
This already includes the pause record that was just built, which triggers a validation error because it overlaps itself. How do I kill groundhog day?

Just exclude new records; use something like this:
program.pauses.select { |o| !o.new_record? }.map(&:dates)
or
program.pauses.reject(&:new_record?).map(&:dates)
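Applied to the validation above, that might look something like the following (a sketch, keeping the rest of the Pause model as shown):

def validate_without_overlap
  # Compare only against pauses that are already persisted, so the
  # record currently being validated doesn't collide with itself.
  persisted_dates = program.pauses.reject(&:new_record?).map(&:dates).flatten
  return if (persisted_dates & [starts_at, ends_at]).blank?
  # set errors...
end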

Related

Rails methods not initialized in time for worker

Earlier, I had posted this question – and thought it was resolved:
Rails background worker always fails first time, works second
However, after continuing with tests and development, the error is back again, but in a slightly different way.
I'm using Sidekiq (with Rails 3.2.8, Ruby 1.9.3) to run background processes, after_save. Below is the code for my model, worker, and controller.
Model:
class Post < ActiveRecord::Base
  attr_accessible :description,
                  :name,
                  :key

  after_save :process

  def process
    ProcessWorker.perform_async(id, key) if key.present?
    true
  end

  def secure_url
    key.match(/(.*\/)+(.*$)/)[1]
  end

  def nonsecure_url
    key.gsub('https', 'http')
  end
end
Worker:
class ProcessWorker
  include Sidekiq::Worker

  def perform(id, key)
    post = Post.find(id)
    puts post.nonsecure_url
  end
end
(Updated) Controller:
def create
  @user = current_user
  @post = @user.posts.create(params[:post])
  render nothing: true
end
Whenever jobs are first dispatched, no matter the method, they fail initially:
undefined method `gsub' for nil:NilClass
Then, they always succeed on the first retry.
I've come across the following GitHub issue, which appears to be resolved and relates to this same problem:
https://github.com/mperham/sidekiq/issues/331
Here, people are saying that creating an initializer that touches the model (so its ActiveRecord methods are loaded) resolved their issue.
To accomplish this, I've tried creating an initializer in lib/initializers called sidekiq.rb, with the following, simply to initialize the methods on the Post model:
Post.first
Now, the first job created completes successfully the first time. This is good. However, a second job created fails the first time – and completes upon retry... putting me right back to where I started.
This is really blowing my mind – has anyone had the same issue? Any help is appreciated.
Change your model callback from after_save to after_commit for the create action. Sidekiq can pick up the job before the model's save transaction has actually been committed to the database, which is why the first attempt fails and the retry succeeds.
after_commit :process, :on => :create
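In context, the relevant part of the Post model above would then look something like this (a sketch of the suggested change):

class Post < ActiveRecord::Base
  attr_accessible :description, :name, :key

  # Enqueue only after the INSERT has been committed, so the worker
  # can find the fully saved record on its first attempt.
  after_commit :process, :on => :create

  def process
    ProcessWorker.perform_async(id, key) if key.present?
    true
  end
end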

Rails 3: Should I explicitly save an object in an after_create callback?

Relevant Code: http://pastebin.com/EnLJUJ8G
class Task < ActiveRecord::Base
  after_create :check_room_schedule
  ...

  scope :for_date, lambda { |date| where(day: date) }
  scope :for_room, lambda { |room| where(room: room) }
  scope :room_stats, lambda { |room| where(room: room) }
  scope :gear_stats, lambda { |gear| where(gear: gear) }

  def check_room_schedule
    @tasks = Task.for_date(self.day).for_room(self.room).list_in_asc_order
    @self_position = @tasks.index(self)
    if @tasks.length <= 2
      if @self_position == 0
        self.notes = "There is another meeting in this room beginning at #{@tasks[1].begin.strftime("%I:%M%P")}."
        self.save
      end
    end
  end

  private

  def self.list_in_asc_order
    order('begin asc')
  end
end
I'm making a small task app. Each task is assigned to a room. Once I add a task, I want to use a callback to check whether there are tasks in the same room before and/or after the task I just added (although my code only handles one edge case right now).
So I decided to use after_create (since the user will manually check for this if they edit it, hence not after_save) so I could use two scopes and a class method to query the tasks on that day, in that room, ordered by time. I then find the object in the array and start using if statements.
I have to explicitly save the object. It works, but it feels weird that I'm doing that. I'm not too experienced (this is my first app), so I'm not sure if this is frowned upon or if it is convention. I've searched a bunch and looked through a reference book, but I haven't seen anything this specific.
Thanks.
This looks like a task for before_create to me. If you have to save in your after_* callback, you probably meant to use a before_* callback instead.
In before_create you wouldn't have to call save, as the save happens after the callback code runs for you.
And rather than saving and then querying to see if you get two or more objects back, you should query for one object that will clash before you save.
In pseudocode, what you have now:

after creation
  now that I'm saved, find all tasks in my room and at my time
  did I find more than one?
    am I the first one?
      yes: add a note about the other task, then save again
      no: everything is fine, no need to re-save any edits

What you should have:

before creation
  is there at least one task in this room at the same time?
    yes: add a note about the other task
    no: everything is fine, allow saving without modification
Something more like this:
before_create :check_room_schedule

def check_room_schedule
  conflicting_task = Task.for_date(self.day)
                         .for_room(self.room)
                         .where(begin: self.begin) # unsure what logic you need here...
                         .first

  if conflicting_task
    self.notes = "There is another meeting in this room beginning at #{conflicting_task.begin.strftime("%I:%M%P")}."
  end
end

Custom Model Method, setting scope for automatic sending of mail

There are several stages to this, and as I am relatively new to Rails I am unsure if I am approaching this in the best way.
Users follow Firms, and Firms' applications open and close on certain days. If a user follows a firm, I would like them to automatically get an email when a) the firm's applications open, b) a week before the firm's applications close, and c) on the day that the firm's applications close.
I have tried using named scopes. I have the following model method (I presume this will need a little work) setting each firm's status flags, depending on the date.
Model firm.rb
def application_status
  if open_date == Date.today
    self.opening = true
  else
    self.opening = false
  end

  if (close_day - Date.today) == 7
    self.warning = true
  else
    self.warning = false
  end

  if close_day == Date.today
    self.closing = true
  else
    self.closing = false
  end
end
I would like this method to be called on each firm once a day, so that each firm has the appropriate scope. I have tried using the whenever gem (cron) with the following code, which runs the above model method on each firm.
Schedule.rb
every 1.day do
  runner "Firm.all.each { |firm| firm.application_status }"
end
Then, for each of the scopes opening, warning and closing, I have an entry in the whenever schedule file. For simplicity I shall show just the opening one. The following queries for all firms that fall under the opening scope and runs the application_open_notification method on them.
Schedule.rb
every 1.day do
  runner "Firm.opening.each { |firm| firm.application_open_notification }"
end
This calls the following method in the Firm.rb model
def application_open_notification
  self.users.each do |user|
    FirmMailer.application_open(user, self).deliver
  end
end
This in turn calls the final piece of the puzzle, which should send the user an email including the name of the firm.
class FirmMailer < ActionMailer::Base
  def application_open(user, firm)
    @firm = firm
    @user = user
    mail to: @user.email, subject: "#{@firm.name} is now accepting applications"
  end
end
Is this a viable way to approach this problem? In particular I am not very familiar with coding in the model.
Many thanks for any help that you can offer.
I'll guess that opening, warning and closing are database fields, and you have scopes like:
class Firm < ActiveRecord::Base
  scope :opening, where(:opening => true)
  # etc
end
There is a general rule for databases (and, well, all storage): don't store things you can calculate, if you don't have to.
Since an application's status can be determined from today's date and the open_date and close_day fields, you could calculate them as needed instead of creating extra fields for them. You can do this with SQL and Active Record:
# Wrapped in lambdas so Date.today is evaluated when the scope is used,
# not when the class is loaded.
scope :opening, lambda { where(:open_date => (Date.today .. Date.today + 1)) }
scope :warning, lambda { where(:close_day => (Date.today + 7 .. Date.today + 8)) }
scope :closing, lambda { where(:close_day => (Date.today .. Date.today + 1)) }
(Note that these select time ranges. They may have to be changed depending on whether you are using date or time fields.)
But there is another issue: what happens if, for some reason (computer crash, code bug, etc.), your scheduled program doesn't run on a particular day? You need a way of making sure notices are eventually sent even if something breaks. There are two solutions:
Write your scheduled program to optionally accept a date other than today (via ARGV).
Keep flags on each firm recording whether each kind of notice has been sent. These will have to be stored in the database.
Note that scopes aren't necessary. You are able to do this:
Firm.where(:open_date => (Date.today .. Date.today + 1)).each do |firm|
  # ...
end
but the scope at least encapsulates the details of identifying the various sets of records.
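Putting that together with the mailer methods from the question, the whenever schedule might then look something like this (a sketch that assumes the date-based scopes above and the existing notification methods on Firm):

# config/schedule.rb
every 1.day do
  runner "Firm.opening.find_each { |firm| firm.application_open_notification }"
  # Similar runner lines would handle the warning and closing scopes,
  # each calling the corresponding notification method on Firm.
end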

Use before_destroy to delete another model's entry?

So I have a model, let's call it Note. On a note, you can put several posts, so the Note model has the line:
has_many :posts
and the Post model has the line
belongs_to :note
Now, when a user destroys a post, I want the note to be destroyed if it no longer has any other posts.
I thought I would write this code into the Post model with before_destroy:
def delete_note_if_last_post
  if self.note.posts.count == 1
    self.note.destroy
  end
end
This doesn't work. It shuts down the server with an "Illegal Instruction". Is there some way to accomplish what I am trying to do?
EDIT: changed the code, as I noticed an error, and now the problem is slightly different.
You can return false from a before_destroy filter to prevent the model from being destroyed:
before_destroy :has_no_post
Then, in has_no_post:
def has_no_post
  # You can prevent the deletion in a few ways:
  # Option 1: return false on a certain condition
  return false if posts.any?
  # Option 2: add an error instead
  # errors.add(:base, "Can not delete note if it has posts") if posts.any?
  # Option 3: raise an exception
  # raise "Can't delete ..." if blah blah
end
I would suggest putting this kind of logic into an observer. Something like
class PostObserver < ActiveRecord::Observer
  def after_destroy(post)
    note = Note.find(post.note_id)
    note.destroy if note.posts.count == 0
  end
end
You'd have to register the observer in your config/application.rb file as well. One thing to note is that if your callback returns any value that can be evaluated as false (e.g. nil or false) the rest of your callbacks will not run.
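For reference, registering the observer in a Rails 3 app looks like this:

# config/application.rb (inside the Application class)
config.active_record.observers = :post_observer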

How to save something to the database after failed ActiveRecord validations?

Basically what I want to do is log an action on MyModel in the MyModelLog table. Here's some pseudo code:
class MyModel < ActiveRecord::Base
  validate :something

  def something
    # test
    errors.add(:data, "bug!!")
  end
end
I also have a model looking like this:
class MyModelLog < ActiveRecord::Base
  def self.log_something
    self.create(:log => "something happened")
  end
end
In order to log I tried to:
Add MyModelLog.log_something in the something method of MyModel
Call MyModelLog.log_something in the after_validation callback of MyModel
In both cases the creation is rolled back when the validation fails, because it happens inside the validation's transaction. Of course, I especially want to log when validations fail. I don't really want to log to a file or anywhere other than the database, because I need the relationships between log entries and other models, and the ability to query them.
What are my options?
Nested transactions do seem to work in MySQL.
Here is what I tried on a freshly generated rails (with MySQL) project:
./script/generate model Event title:string --skip-timestamps --skip-fixture
./script/generate model EventLog error_message:text --skip-fixture
class Event < ActiveRecord::Base
  validates_presence_of :title
  after_validation_on_create :log_errors

  def log_errors
    EventLog.log_error(self) if errors.on(:title).present?
  end
end

class EventLog < ActiveRecord::Base
  def self.log_error(event)
    connection.execute('BEGIN') # If I do "transaction do" then it doesn't work.
    create :error_message => event.errors.on(:title)
    connection.execute('COMMIT')
  end
end
# And then in script/console:
>> Event.new.save
=> false
>> EventLog.all
=> [#<EventLog id: 1, error_message: "can't be blank", created_at: "2010-10-22 13:17:41", updated_at: "2010-10-22 13:17:41">]
>> Event.all
=> []
Maybe I have oversimplified it, or am missing some point.
Would this be a good fit for an Observer? I'm not sure, but I'm hoping that it exists outside of the transaction... I have a similar need where I might want to delete a record on update...
I've solved a problem like this by taking advantage of Ruby's variable scoping. Basically, I declare an error variable outside of the transaction block, then catch the error, store its message, and raise the error again.
It looks something like this:
def something
  error = nil

  ActiveRecord::Base.transaction do
    begin
      # place codez here
    rescue ActiveRecord::Rollback => e
      error = e.message
      raise ActiveRecord::Rollback
    end
  end

  MyModelLog.log_something(error) unless error.nil?
end
By declaring the error variable outside of the transaction scope the contents of the variable persist even after the transaction has exited.
I am not sure if it applies to you, but I assume you are trying to save/create a model from your controller. In the controller it is easy to check the outcome of that action, and you most likely already do so to provide the user with a useful flash message; you could easily log an appropriate message there as well.
I am also assuming you do not use any explicit transactions, so if you handle it in the controller, it is outside of the transaction (every save and destroy runs in its own transaction).
What do you think?
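A rough sketch of that controller-side approach (the controller action, instance variable, and parameter names here are assumed, not taken from the question):

def create
  @my_model = MyModel.new(params[:my_model])
  if @my_model.save
    redirect_to @my_model
  else
    # By the time we get here, the failed save's transaction has already
    # been rolled back, so this log row is created outside of it and sticks.
    MyModelLog.log_something
    render :new
  end
end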
MyModelLog.log_something should be done using a different connection.
You can make the MyModelLog model always use a different connection by using establish_connection.
class MyModelLog < ActiveRecord::Base
  establish_connection Rails.env # Use a different connection

  def self.log_something
    self.create(:log => "something happened")
  end
end
Not sure if this is the right way to do logging!!
You could use a nested transaction. That way the code in your callback executes in a different transaction from the failing validation. The Rails documentation for ActiveRecord::Transactions::ClassMethods discusses how this is done.
