Rails database changes aren't persisting through tests - ruby-on-rails

I'm writing tests for my current rails project and I'm running into an issue whereby changes made to the test database through a post call aren't persisting long enough to test the changes.
Basically, I have objects which have a barcode assigned to them. I have put in a form whereby a user can scan in several barcodes to change multiple objects at a time. This is the code.
objecta_controller.rb:
def change_many
  @barcodes = params[:barcodes].split
  @objects = ObjectA.order("barcode").where(barcode: @barcodes)
  @objects.each do |b|
    if can? :change, b
      b.state_b()
    end
  end
end
(Note: params[:barcodes] is a string of barcodes separated by whitespace.)
objecta_controller_test.rb:
test "change object" do
sign_in_as(:admin_staff)
b = ObjectA.new(
barcode: "PL123456",
current_status: "state_a")
post :change_many, { barcodes: b.barcode }
assert_equal("state_b", b.current_status, "Current status incorrect: #{b.to_s}")
end
Using byebug, I've ascertained that the objects do change state in the change_many method, but once it gets back to the test the object's state reverts back to its old one and the test fails.

First off, you are holding an in-memory object that has not yet been saved to the database, so first:
Add b.save before your post
Second, your in-memory object will not automatically reflect changes in the database. You have to tell it to refresh its state, so:
Add b.reload before your assert
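Putting both suggestions together, the test could look roughly like this (a sketch; it also assumes state_b() itself saves the new status):
test "change object" do
  sign_in_as(:admin_staff)
  b = ObjectA.new(
    barcode: "PL123456",
    current_status: "state_a")
  b.save! # persist the record so the controller's where(barcode: ...) query can find it
  post :change_many, { barcodes: b.barcode }
  b.reload # re-read the row so the assertion sees the state written by the controller
  assert_equal("state_b", b.current_status, "Current status incorrect: #{b.to_s}")
end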

Related

RSpec: factory not persisting after attribute changes?

I'm fairly new to RSpec but I am running into an issue (this is on Rails 4, FWIW). I have a simple model/factory test:
context "Scopes" do
let (:widget) {create(:core_widget)}
it ":active" do
puts "Last Created widget:"
puts pp(Core::Widget.last)
widget.type = "active"
widget.object_status_id = 15
puts "After attribute change:"
puts pp(Core::Widget.last)
#puts pp(widget.attributes)
expect(Core::Widget.active.ids).to include(widget.id)
end
end
This is testing out a very simple scope:
scope :active, -> { where(type: 'active', object_status_id: [25, 15]) }
Pretty basic. However, I noticed that the puts of the factory object does NOT show the attribute changes (changing .type to "active" and .object_status_id to 15) when I re-print it.
I was told that let is lazily evaluated, and I understand that... but when I look at the object_ids when printing, they are completely different. let should still refer to the same object within the same it block, right?
Initially I thought it was a problem because I was doing build_stubbed on the factory creation. I also tried let! because I thought maybe that was the problem. Neither worked.
I think what is happening here is that you are updating the attributes of the model in memory without saving the changes to your database. Then when your active scope is called, a query is made to the database but since your changes haven't been saved to the database yet, the expected record is not found.
I would recommend checking out the various update* functions as a way to persist your changes to the database, or make sure you call save to save your changes.
For example, you can update your test to:
it ":active" do
puts "Last Created widget:"
puts pp(Core::Widget.last)
widget.type = "active"
widget.object_status_id = 15
widget.save! # Add this line here to explicitly save the record to the DB
puts "After attribute change:"
puts pp(Core::Widget.last) # Now this should find the changed record in the DB as expected
expect(Core::Widget.active.ids).to include(widget.id)
end
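Alternatively, following the pointer to the update* family above, the two assignments and the save can be collapsed into a single call (a sketch; update! persists the attributes and raises if validations fail):
it ":active" do
  # update! assigns both attributes and saves them to the DB in one step,
  # raising if the record is invalid
  widget.update!(type: "active", object_status_id: 15)
  expect(Core::Widget.active.ids).to include(widget.id)
end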

Database not updating correctly in Rails

I was hoping you could help me with a problem I've been stuck on for quite a while now. I have a database with tickets. These tickets contain information, like a status. My application uses the Zendesk API to get info from support tickets and store them into my database.
What I want to do is store the previous and current status of a ticket into my database. I am trying to accomplish this by storing the old values before updating my database. At first this seemed to work great. Whenever I change the status in Zendesk, my app changes the previous_state to the old state value and the actual state to the one it gathers from Zendesk.
However, it goes wrong whenever I refresh my page. When that happens (and the method gets called again), for some reason it sets both the previous_state and state to the same value. I must be doing something wrong in one of my update or store lines, but I can't figure out what. I hope someone can help me out.
Ticket is the Ticket model and client is the Zendesk connection. The last loop checks whether status and previous_status are the same and, if so, tries to put previous_status back to the value it had before the big Zendesk update. The idea is that the previous state remains unchanged until the actual state changes.
previousTickets = Ticket.all
Ticket.all.each do |old|
  old.update(:previous_status => old.status)
end
client.tickets.each do |zt|
  Ticket.find_by(:ticket_id => zt.id).update(
    subject: zt.subject,
    description: zt.description,
    ticket_type: zt.type,
    status: zt.status,
    created: zt.created_at,
    organization_id: zt.organization_id,
  )
end
Ticket.all.each do |newTicket|
  if (newTicket.status == newTicket.previous_status)
    b = previousTickets.find_by(:ticket_id => newTicket.ticket_id)
    c = b.previous_status
    newTicket.update(:previous_status => c)
  end
end
Your last loop isn't working because previousTickets does not contain the previous tickets, but the current ones. This is because Ticket.all returns only an ActiveRecord relation. Such a relation loads data lazily: until you actually use its contents, nothing is read from the database, so by the time your last loop runs, previousTickets reflects the already-updated rows.
You could explicitly load all tickets by converting the relation to an array:
previousTickets = Ticket.all.to_a
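Note that a plain Array does not respond to find_by, so with that change the lookup in the last loop would need Enumerable#find instead, for example:
b = previousTickets.find { |t| t.ticket_id == newTicket.ticket_id }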
But I think you could achieve everything in one single loop: instead of populating all previous_status in the first loop and reverting it in the last, you should simply change the previous_status when you change the current one:
client.tickets.each do |zt|
  ticket = Ticket.find_by(:ticket_id => zt.id)
  previous_status = ticket.previous_status
  previous_status = ticket.status if zt.status != ticket.status
  ticket.update(
    subject: zt.subject,
    description: zt.description,
    ticket_type: zt.type,
    previous_status: previous_status,
    status: zt.status,
    created: zt.created_at,
    organization_id: zt.organization_id,
  )
end

Database Record not saving/persisting within a Rails test function?

I have an update-type method that I am trying to test in Rails using MiniTest and FactoryGirl. My problem is that although I can see that the update is happening correctly within the update function, it doesn't seem to carry over back into the test-function properly.
These are the objects we are working with (obj is given a default location to start off with):
location1 = create :location
location2 = create :location
obj = create :object, location: location1
And then we call the update function, which takes id's:
obj.update_location(obj.id, location2.id)
The update function:
def update_location(obj_id, loc_id)
  @obj = Object.find(obj_id)
  @obj.location_id = loc_id
  @obj.save
end
But when, back in the test file, I try to assert the change…
assert_equal obj.location_id, location2.id
...I get a failure. The console tells me that obj.location_id still equals location1.id! Why is this?
It seems that @obj.save is working properly, because I inserted puts @obj.inspect in the update function and it outputs the correctly updated location_id.
I don't think it has to do with transaction-rollbacks because this is all taking place within a single test. (And I am under the impression that transaction rollbacks only happen in between tests).
In summary, my question is: Why doesn't my update persist back into the rails MiniTest test function?
EDIT: Here is the whole test:
test "update_location" do
location1 = create :location
location2 = create :location
#give the obj a starting `type`
obj = create :obj, location: location1
obj.update_location(obj.id, location2.id)
assert_equal obj.location_id, location2.id
end
Issue is resolved, and here is my understanding:
The issue was in how Rails' ORM works. The original object obj is loaded up with data from the DB when it is created, but later changes to the DB are not automatically reflected in obj. I had thought that obj.some_attribute looks at the database record for obj in order to find the attribute value, but in actuality the attributes are simply loaded into the variable upon its creation and the database is NOT accessed if I call obj.some_attribute at a later point in time.
Normally, when an update is done through the variable itself, Rails knows to update the variable for you. So when you write obj.some_attribute = 5, and then later on write obj.some_attribute, Rails knows the value is supposed to be 5. But that Rails magic doesn't happen in this situation because it's not updating the record through the variable.
So what needs to be done is simply to reload the object from the database.
So assert_equal obj.reload.location_id, location2.id works!
Two questions that might help you find an answer:
Why are you using an instance variable @obj instead of a local variable obj?
Why do you have a special method just to update one column? Would update_column not work for you?
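For reference, a minimal sketch of what that second suggestion would look like in the test (update_column writes straight to the database, skipping validations and callbacks, and also keeps the in-memory attribute in sync):
obj = create :object, location: location1
# no custom update_location method and no reload needed:
# update_column issues a direct UPDATE and updates obj's attribute as well
obj.update_column(:location_id, location2.id)
assert_equal obj.location_id, location2.id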

Run rails code after an update to the database has commited, without after_commit

I'm trying to battle some race conditions with my background task manager. Essentially, I have a Thing object (it already exists), I assign it some properties, and then save it. After it is saved with the new properties, I queue it in Resque, passing in the ID.
thing = Thing.find(1)
puts thing.foo # outputs "old value"
thing.foo = "new value"
thing.save
ThingProcessor.queue_job(thing.id)
The background job will load the object from the database using Thing.find(thing_id).
The problem is that we've found Resque is so fast at picking up the job and loading the Thing object from the ID that it loads a stale object. So within the job, calling thing.foo will still return "old value" roughly 1 in 100 times (not a real measurement, but it does not happen often).
We know this is a race condition, because Rails returns from thing.save before the data has actually been committed to the database (PostgreSQL in this case).
Is there a way in Rails to only execute code AFTER a database action has committed? Essentially I want to make sure that by the time Resque loads the object, it is getting the freshest version. I know this can be achieved using an after_commit hook on the Thing model, but I don't want it there. I only need this to happen in this one specific context, not every time the model commits changes to the DB.
You can wrap it in a transaction as well, just like the example below:
thing = nil
Thing.transaction do
  thing = Thing.find(1)
  puts thing.foo # outputs "old value"
  thing.foo = "new value"
  thing.save
end
# queue the job only after the transaction block has committed
ThingProcessor.queue_job(thing.id)
Update: there is a gem that provides an after-transaction callback, which may solve your problem. Here is the link:
http://xtargets.com/2012/03/08/understanding-and-solving-race-conditions-with-ruby-rails-and-background-workers/
What about wrapping a begin/rescue around the transaction so that the job is queued up only upon success of the transaction?
I had a similar issue, whereby I needed to ensure that a transaction had committed before running a series of actions. I ended up using this gem:
https://github.com/Envek/after_commit_everywhere
It meant that I could do the following:
def finalize!
  Order.transaction do
    payment.charge!
    # ...
    # Ensure that we only send out items and perform after-actions
    # once the order has definitely been completed
    after_commit { OrderAfterFinalizerJob.perform_later(self) }
  end
end
One gem to allow that is https://github.com/Ragnarson/after_commit_queue
It is a little different from the other answer's after_commit_everywhere gem: the after_commit_everywhere callback seems decoupled from whether the current model actually gets saved.
So it might or might not be what you expect, depending on your use case.
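Applied to the original snippet, the after_commit_everywhere approach could look roughly like this (a sketch assuming the gem's include-and-call usage; ThingUpdater is a hypothetical wrapper, not code from the question):
require "after_commit_everywhere"

class ThingUpdater
  include AfterCommitEverywhere

  def call(thing_id)
    Thing.transaction do
      thing = Thing.find(thing_id)
      thing.foo = "new value"
      thing.save!
      # this block only runs once the surrounding transaction has committed,
      # so Resque cannot pick up a stale row
      after_commit { ThingProcessor.queue_job(thing.id) }
    end
  end
end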

Delayed Job object not properly deserialized

I'm having a hard time believing what I'm seeing, but it sure looks like DJ is failing to deserialize an object properly. I look at the DJ record in mongo and I see in the YAML that the object has its text field set, but when the code runs, the text field is not set. Here is some minimal repro code:
class Board
  include Mongoid::Document
  field :text, type: String

  def process_text_field
    if not self.text
      raise "Text field is blank"
    end
    # Text field gets processed
  end
end
# in a controller
def start_doing_something_slow
  board = Board.find(params[:id])
  board.text = "Text field is set"
  board.save!
  raise "Text disappeared!" unless board.text
  board.delay.process_text_field
  render json: { :result => 'ok' }
end
I invoke the controller method with the browser, and check the DJ record directly in mongo. I see in the YAML that the Board object has the text field correctly set. But when it executes in DJ, it raises the Text field is blank exception.
Somehow it's not deserializing the object properly.
Well this took me about a week to figure out, so I'm posting it here to help others who fall into this trap. Turns out this is a known bug in delayed_job_mongoid. And it's had a simple fix listed right there in the bug report for 10 months.
The problem arises if you use the identity map in mongoid, which acts as an in-process caching layer to the database. For normal web requests, the cache gets cleared between each request, so your controller methods don't use stale versions of the objects. But delayed_job_mongoid doesn't clear the cache between jobs without this patch (which I just put together): https://github.com/collectiveidea/delayed_job_mongoid/pull/38
The result is your delayed jobs are sometimes using old versions of the objects, depending on what ran before them, which creates truly bizarre, mysterious failures that are extremely difficult to track down until you understand what's happening.
