How to use meck in different ExUnit test files - erlang

I would like to use meck in different ExUnit test files.
For example,
[x_test.exs]
def setup do
  :meck.new(Hoge, [:passthrough])
  on_exit(fn -> :meck.unload end)
  :ok
end

def teardown do
  :meck.unload
end

test "foo" do
  :meck.expect(Hoge, :foo, fn -> 1 end)
  assert Hoge.foo == 1
end

[y_test.exs]
def setup do
  :meck.new(Hoge, [:passthrough])
  on_exit(fn -> :meck.unload end)
  :ok
end

def teardown do
  :meck.unload
end

test "foo" do
  :meck.expect(Hoge, :foo, fn -> 2 end)
  assert Hoge.foo == 2
end
Sometimes x_test.exs fails, and sometimes it passes... (the same goes for y_test.exs.)
Can I mock the same function in more than one test file?

meck compiles and loads a new version of the module you specify with your expectations. Since only one current version of a module can be loaded in the BEAM at a time, all tests that race for the same mocked module must be executed sequentially.
As the ExUnit documentation states that test cases can be executed in parallel, you probably have to merge all the tests that should run serially into a single test case (i.e. a single test module).
Alternatively, you could set the number of test cases that may be executed in parallel to 1. However, that could slow down your test run.
ExUnit supports the following options:
:max_cases - maximum number of cases to run in parallel; defaults to :erlang.system_info(:schedulers_online)
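For example, here is a minimal sketch (the Hoge module comes from the question; the file layout and module names are only illustrative) of keeping the mocking modules out of the parallel pool while also capping ExUnit at one concurrent case:

# test/test_helper.exs
ExUnit.configure(max_cases: 1)   # never run two test cases at the same time
ExUnit.start()

# test/x_test.exs
defmodule XTest do
  # async: false (the default) keeps this module out of the parallel pool
  use ExUnit.Case, async: false

  setup do
    :meck.new(Hoge, [:passthrough])
    on_exit(fn -> :meck.unload(Hoge) end)
    :ok
  end

  test "foo" do
    :meck.expect(Hoge, :foo, fn -> 1 end)
    assert Hoge.foo() == 1
  end
end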

Related

Trouble mocking `Resolv::DNS.open`

I'm trying to mock the code below using MiniTest/Mocks, but I keep getting this error when running my test:
Minitest::Assertion: unexpected invocation: #<Mock:0x7fa76b53d5d0>.size()
unsatisfied expectations:
- expected exactly once, not yet invoked: #<Mock:0x7fa76b53d5d0>.getresources("_F5DC2A7B3840CF8DD20E021B6C4E5FE0.corwin.co", Resolv::DNS::Resource::IN::CNAME)
satisfied expectations:
- expected exactly once, invoked once: Resolv::DNS.open(any_parameters)
code being tested
txt = Resolv::DNS.open do |dns|
  records = dns.getresources(options[:cname_origin], Resolv::DNS::Resource::IN::CNAME)
end
binding.pry
return (txt.size > 0) ? (options[:cname_destination].downcase == txt.last.name.to_s.downcase) : false
my test
::Resolv::DNS.expects(:open).returns(dns = mock)
dns.expects(:getresources)
   .with(subject.cname_origin(true), Resolv::DNS::Resource::IN::CNAME)
   .returns([Resolv::DNS::Resource::IN::CNAME.new(subject.cname_destination)])
   .once
Right now you are only testing that Resolv::DNS receives open and returns your mock. Since what you actually want to test is that the dns mock receives messages, you need to stub the method and provide it with the object to be yielded to the block.
Try this instead:
dns = mock
dns.expects(:getresources)
   .with(subject.cname_origin(true), Resolv::DNS::Resource::IN::CNAME)
   .once

::Resolv::DNS.stub :open, [Resolv::DNS::Resource::IN::CNAME.new(subject.cname_destination)], dns do
  # whatever code actually calls the "code being tested"
end

dns.verify
The second argument to stub is the stubbed return value, and the third argument is what will be yielded to the block in place of the object that would originally have been yielded.
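To illustrate those two arguments in isolation, here is a standalone sketch (my own, assuming a Minitest version that supports block arguments to stub; the names are only illustrative):

require "minitest/autorun"
require "resolv"

class StubYieldDemo < Minitest::Test
  def test_stub_returns_value_and_yields_block_args
    fake_dns = :fake_dns_connection
    # The second argument is returned by the stubbed method;
    # the third is yielded to any block the caller passes.
    Resolv::DNS.stub :open, "stubbed return", fake_dns do
      result = Resolv::DNS.open { |dns| assert_equal :fake_dns_connection, dns }
      assert_equal "stubbed return", result
    end
  end
end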
In RSpec the syntax is a bit simpler (and more semantic) such that:
dns = double
allow(::Resolv::DNS).to receive(:open).and_yield(dns)
expect(dns).to receive(:getresources).once
  .with(subject.cname_origin(true), Resolv::DNS::Resource::IN::CNAME)
  .and_return([Resolv::DNS::Resource::IN::CNAME.new(subject.cname_destination)])

# whatever code actually calls the "code being tested"
You can write more readable integration tests with DnsMock instead of stubbing/mocking parts of your code: https://github.com/mocktools/ruby-dns-mock

Rails / RSpec: 'allow_any_instance_of' doesn't return multiple values

I have this code in one of my tests:
it 'returns ids when successful' do
  allow_any_instance_of(Importer).to receive(:import).and_return('12589', '12590', '12591', '12592', '12593', '12594')
  expect(@dispatcher.run).to eq(['12589', '12590', '12591', '12592', '12593', '12594'])
end
The test fails because it only returns the first value:
expected: ["12589", "12590", "12591", "12592", "12593", "12594"]
got: ["12589", "12589", "12589", "12589", "12589", "12589"]
I just saw that #and_return's ability to return multiple values in sequence only works when used with #allow.
What can I do to get this behaviour with #allow_any_instance_of?
EDIT:
The class I am testing is called Dispatcher. It takes an XML file and splits it into parts, each concerning exactly one object. Each of those split parts is taken by the Importer, which returns exactly one ID. The Dispatcher then creates an array from those IDs. So, no, I am not expecting an array to be returned by the Importer.
The class I am testing, Dispatcher, calls Importer for every file it finds in an input directory.
Here's what should work (intercept the importer creation):
class Dispatcher
  def run
    files.each do |file|
      create_importer(file).import
    end
  end

  def create_importer(file)
    ::Importer.new(file)
  end
end

# spec
let(:fake_importer) { ::Importer.new }

before do
  allow(@dispatcher).to receive(:create_importer).and_return(fake_importer)
  allow(fake_importer).to receive(:import).and_return(your, multiple, values, here)
end
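As a side note (a small sketch of my own, not part of the original answer): #and_return with several arguments hands the values out one per call when used with a plain #allow, which is why routing every file through the same intercepted importer restores the behaviour you expected:

fake_importer = double('Importer')
allow(fake_importer).to receive(:import).and_return('12589', '12590', '12591')

fake_importer.import # => '12589'
fake_importer.import # => '12590'
fake_importer.import # => '12591' (the last value repeats on any further calls)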

How can my block/yield pass a changing variable?

I'm writing the following module to capture the SIGTERM that occasionally gets sent to my Delayed Job workers; it sets a variable called term_now that lets my job gracefully terminate itself before it's complete.
The code in my module below works perfectly if I put it inline in my job, but I need it for several jobs, and when I put it in a module it doesn't work.
I assume it's not working because it only passes term_now once (when it's false); even when it later becomes true it doesn't pass it again, so the job is never stopped.
module StopJobGracefully
  def self.execute(&block)
    begin
      term_now = false
      old_term_handler = trap('TERM') do
        term_now = true
        old_term_handler.call
      end
      yield(term_now)
    ensure
      trap('TERM', old_term_handler)
    end
  end
end
Here's the working inline code as it's normally used (this is the code I'm trying to convert to a module):
class SMSRentDueSoonJob
  def perform
    begin
      term_now = false
      old_term_handler = trap('TERM') do
        term_now = true
        old_term_handler.call
      end

      User.find_in_batches(batch_size: 1000) do
        if term_now
          raise 'Gracefully terminating job early...'
        end
        # do lots of complicated work here
      end
    ensure
      trap('TERM', old_term_handler)
    end
  end
end
You basically answered it yourself: in the example code you provided, term_now will only be true if the trap fired before yield was called.
What you need to do is provide a mechanism that fetches the information on demand, so that you can check it within each iteration of, e.g., find_in_batches.
So instead of yielding the result once, your module should have a term_now method that returns an instance variable @term_now.
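A minimal sketch of that idea (my own illustration, not the answerer's code; the guard around old_term_handler is an extra precaution for non-callable previous handlers such as 'DEFAULT'):

module StopJobGracefully
  def self.execute
    @term_now = false
    old_term_handler = trap('TERM') do
      @term_now = true
      old_term_handler.call if old_term_handler.respond_to?(:call)
    end
    yield
  ensure
    trap('TERM', old_term_handler)
  end

  # Checked repeatedly from inside the job's loop.
  def self.term_now?
    @term_now
  end
end

# Usage inside a job:
# StopJobGracefully.execute do
#   User.find_in_batches(batch_size: 1000) do
#     raise 'Gracefully terminating job early...' if StopJobGracefully.term_now?
#     # do lots of complicated work here
#   end
# end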

TestDataConfig.groovy not found, build-test-data plugin proceeding without config file

I am getting the following error when including the Build mixin in unit tests:
TestDataConfig.groovy not found, build-test-data plugin proceeding without config file
It works like a charm in the integration tests but not as part of the unit tests. I mean, the 'build' plugin itself works in unit tests, but 'TestDataConfig' is not populating default values.
Thank You
First, you should verify the version of build-test-data in your BuildConfig.groovy:
test ":build-test-data:2.0.3"
Second, check your test. If you want to build objects you need:
import grails.buildtestdata.mixin.Build
...

@TestFor(TestingClass)
@Build([TestingClass, SupportClass, AnotherClass])
class TestingClassTest {

    @Test
    void testMethod() {
        def tc1 = TestingClass.build()
        def sc1 = SupportClass.build()
        def ac1 = AnotherClass.build()
    }
}
Third, check the domain constraints; you could have property validations, like unique, that fail when you build two instances. You need to set those properties in code:
def tc1 = TestingClass.build(uniqueProperty: 'unique')
def tc2 = TestingClass.build(uniqueProperty: 'special')
I guess the dependency should be:
test ":build-test-data:2.0.3"
since it is just used for testing, right?

Grails Quartz Job Integration test - Not autowired Job

I'm writing an integration test for a Quartz job in a Grails application.
I have the job in the grails-app/jobs folder, and if I start the application it works. The problem is that I want to get hold of it in an integration test, but the autowiring doesn't work. The test looks like:
class MyJobTest {
    MyJob myJob

    def setUp() {
        assert myJob != null
    }

    def testExecute() {
        // test logic
    }
}
but it fails because myJob is null... Any help?
Quartz jobs are not autowired like services are under the test environment. The documentation for the Quartz plugin also explicitly states that, by default, jobs will not execute on schedule under the test environment (you could change that if you want to, but I wouldn't). I would just instantiate myJob = new MyJob() in your setUp and call the execute() method to test it. If you're trying to test the triggers, you may want to find a way to look at what is inside the triggers {} block, maybe by inspecting the metaClass?
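A minimal sketch of that suggestion (my own illustration; the collaborator wiring is a placeholder):

class MyJobTest {
    MyJob myJob

    void setUp() {
        // Instantiate the job directly instead of relying on autowiring.
        myJob = new MyJob()
        // Wire any collaborators by hand, e.g. myJob.myServiceA = new MyServiceA()
    }

    void testExecute() {
        myJob.execute()
        // assert on the side effects of execute()
    }
}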
EDIT IN RESPONSE TO COMMENT:
I've never gotten the services out of the application context so that might work. The way I would probably test it is as follows:
Assuming your class looks something like this:
class MyJob {
    def myServiceA
    def myServiceB

    def execute() {
        if (myJobLogicToDetermineWhatToDo) {
            myServiceA.doStuff(parameter)
        } else {
            myServiceB.doStuff(parameter)
        }
    }
}
What you really want to test here is myJobLogicToDetermineWhatToDo. I would assume that you have (or can easily write) integration and/or unit tests against your services myServiceA and myServiceB to ensure that they are working correctly. I would then write unit tests to verify the logic/wiring of your job to the appropriate service.
@Test
void routeOne() {
    def job = new MyJob()
    def myServiceA = new Object()
    def expectedParameter = "Name"
    def wasCalled = false

    myServiceA.metaClass.doStuff = { someParameter ->
        assert expectedParameter == someParameter
        wasCalled = true
    }
    job.myServiceA = myServiceA

    // Setup data to cause myServiceA to be invoked
    job.execute()

    assert wasCalled
}
Then repeat this process for all of the routes you have through your job. This way you can isolate your tests down to the smallest part possible and test the logic of the object you're invoking, not the services it is using. I would assume you're using a service because the logic in there is used by another part of the system. If you tested the service through this job and the job went away for some reason, you would have to rewrite your tests to invoke the service directly. With the approach I've proposed, you have tests that test the service directly and tests that mock out those service calls; if the job goes away you simply delete the tests associated with it and you won't lose any test coverage. Kinda long winded, but that's how I would approach testing it.
