How to run the same tsqlt tests on multiple stored procedures? - stored-procedures

I have lots of stored procedures, and each stored procedure has its own tsqlt test class. In each test class, roughly 5 of the tests are exactly the same for all the stored procedures, and roughly 5 of the tests are unique to the stored procedure.
Occasionally I want to change one of the "common" tests, and I have to change it in 10 or more files, which is a nuisance.
Is there some way that I can define a group of tests in a single file, and then call these tests from another test class, so that the tests are run on the stored procedure that is being tested by the calling test class?

One solution might be to create a TestHelpers class and add your common test code to it, but without the "test" prefix so tSQLt doesn't run those procedures automatically. These procedures would need input parameters, such as the name of the procedure to test, and would also include the standard tSQLt assertions.
Then, within your procedure-specific test classes, you would just call the TestHelpers version from within the test.
I'm not sure of your exact use case but let us assume that one of the common factors between these procedures is that they all return the same result set (but with different contents) and you want a common test to assert that the result set structure is as it should be.
You might create a helper procedure like this:
create procedure [TestHelpers].[ValidateResultSet]
(
    @ProcedureToTest nvarchar(200)
)
as
begin
    --! common set up omitted for brevity
    exec tSQLt.AssertResultSetsHaveSameMetaData...
end
Then in your procedure test classes you might create tests that look like this:
create procedure [MyFirstProcedureTests].[test result set]
as
begin
    --! MyFirstProcedure-specific set up omitted
    exec TestHelpers.ValidateResultSet 'MyFirstProcedure';
end
or
create procedure [MySecondProcedureTests].[test result set]
as
begin
    --! MySecondProcedure-specific set up omitted
    exec TestHelpers.ValidateResultSet 'MySecondProcedure';
end
I don't have access to a database right now to prove this but it should work as I've done similar things in the past.

Related

Way to ensure maintenance plan attempts every stored procedure?

I have a stored procedure that is run based on plant name.
I have a maintenance plan that calls the stored procedures for all plants in a timezone like
execute dbo.runPlantSP 'plant1';
execute dbo.runPlantSP 'plant3';
execute dbo.runPlantSP 'plant55';
The issue I am facing is that if an error occurs while dbo.runPlantSP runs for 'plant1', it never runs for 'plant3' or 'plant55'. Is there a setting in the maintenance plan I can change to make it still attempt the next line? Or do I need to change the internals of my stored procedure to handle errors, so that if something happens in the stored procedure for 'plant1', a catch handles it and the error does not bubble up and stop the maintenance plan?
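One common pattern (a sketch, not tested against this setup) is to wrap each call in its own TRY/CATCH block inside the maintenance plan's T-SQL step, so a failure for one plant is logged and the next call still executes:

```sql
BEGIN TRY
    EXECUTE dbo.runPlantSP 'plant1';
END TRY
BEGIN CATCH
    -- Log and continue rather than letting the error abort the batch
    PRINT 'runPlantSP failed for plant1: ' + ERROR_MESSAGE();
END CATCH;

BEGIN TRY
    EXECUTE dbo.runPlantSP 'plant3';
END TRY
BEGIN CATCH
    PRINT 'runPlantSP failed for plant3: ' + ERROR_MESSAGE();
END CATCH;
```

Note that TRY/CATCH cannot trap every failure: errors with severity 20 or higher, or anything that terminates the connection, will still stop the batch.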

Why does Rubocop prefer `have_received` to `receive`?

I have tests of the form:
expect(ClassA).to receive(:method)
ClassB.perform
Rubocop would prefer if I refactored this to use have_received, which requires ClassA to be mocked. In other words, I need to set up:
allow(ClassA).to receive(:method)
ClassB.perform
expect(ClassA).to have_received(:method)
What's the point? Just following the Arrange Act Assert format?
Refactoring to use have_received allowed me to move a lot of the set-up into a before block, and put the assertions after the action, following the Arrange Act Assert format.
The code noticeably reads better.
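For illustration, the refactored spec with the set-up moved into a before block might look like this (the class and method names are the hypothetical ones from the question):

```ruby
RSpec.describe ClassB do
  describe ".perform" do
    before do
      # Arrange: stub the collaborator so calls can be verified afterwards
      allow(ClassA).to receive(:method)
      # Act
      ClassB.perform
    end

    it "calls ClassA.method" do
      # Assert: verify the call happened, after the action
      expect(ClassA).to have_received(:method)
    end
  end
end
```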

informix 12.10 create multiple stored procedures with single file

I have two stored procedures: sub_Proc and Proc.
But when I try to create them in a single file, I get into trouble:
--file1.sql
CREATE PROCEDURE sub_Proc()
--do Something;
End PROCEDURE
CREATE PROCEDURE Proc() --Syntax Error!
CALL sub_Proc();
End PROCEDURE
I get a syntax error.
Of course they can be created in two separate files,
but can they be created in a single file?
Can anybody help?
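For what it's worth, dbaccess uses the semicolon as its statement separator, so terminating each statement, including END PROCEDURE, with a semicolon usually allows several procedures in one file. A sketch (untested here, using the names from the question):

```sql
--file1.sql
CREATE PROCEDURE sub_Proc()
    -- do something
END PROCEDURE;

CREATE PROCEDURE Proc()
    CALL sub_Proc();
END PROCEDURE;
```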

Possible with multiple database connections

New to the tSQLt world (great tool set) and encountered a minor issue with a stored procedure I am setting up a test for.
If I for some reason have a stored procedure which connects to multiple databases, or even multiple SQL Server instances (linked servers),
is it possible to do unit tests with tSQLt in such a scenario?
I commented already, but I would like to add some more. As I said, you can do anything that fits into a single transaction.
For your case, I would suggest creating synonyms for every cross-database/instance object and then using those synonyms everywhere.
I've created the following procedure to mock view/table synonyms. It has some limitations, but at least it can handle simple use cases.
CREATE PROCEDURE [tSQLt].[FakeSynonymTable] @SynonymTable VARCHAR(MAX)
AS
BEGIN
    DECLARE @NewName VARCHAR(MAX) = @SynonymTable + REPLACE(CAST(NEWID() AS VARCHAR(100)), '-', '');
    DECLARE @RenameCmd VARCHAR(MAX) = 'EXEC sp_rename ''' + @SynonymTable + ''', ''' + @NewName + ''';';
    EXEC tSQLt.SuppressOutput @RenameCmd;
    DECLARE @sql VARCHAR(MAX) = 'SELECT * INTO ' + @SynonymTable + ' FROM ' + @NewName + ' WHERE 1=2;';
    EXEC (@sql);
    EXEC tSQLt.FakeTable @TableName = @SynonymTable;
END;
Without you providing sample code I am not certain of your exact use case, but this information may help.
The alternative approach for cross-database testing (assuming both databases are on the same instance) is to install tSQLt in both databases. Then you can mock the objects in the remote database in the same way that you would if they were local.
E.g. If you had a stored procedure in LocalDb that referenced a table in RemoteDb, you could do something like this:
Imagine you have a procedure that selects a row from a table called localTable in the local database and inserts that row in to a table called remoteTable in the remote database (on the same instance)
create procedure [myTests].[test mySproc inserts remoteTable from local table]
as
begin
    -- Mock the local table in the local database
    exec tSQLt.FakeTable 'dbo.localTable' ;
    -- Mock the remote table (note the three-part object reference to RemoteDb)
    exec RemoteDb.tSQLt.FakeTable 'dbo.remoteTable' ;
    --! Data setup omitted
    exec dbo.mySproc @param = 'some value' ;
    --! Get the data from the remote table into a temp table so we can test it
    select * into #actual from RemoteDb.dbo.remoteTable;
    --! Assume we have already populated #expected with our expected results
    exec tSQLt.AssertEqualsTable '#expected', '#actual' ;
end
The above code demonstrates the basics but I blogged about this in more detail some years ago here.
Unfortunately, this approach will not work across linked servers.

Why can't bcp execute procedures having temp table(#tempTable)?

Recently I was tasked with creating a SQL Server Job to automate the creation of a CSV file. There was existing code, which was using an assortment of #temp tables.
When I set up the job to execute using BCP calling the existing code (converted into a procedure), I kept getting errors:
SQLState = S0002, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name #xyz
As described in other posts, to resolve the problem many people recommend converting all the #temp tables to @table variables.
However, I would like to understand WHY BCP doesn't seem to be able to use #temp tables.
When I execute the same procedure from within SSMS, it works. Why?
I did a quick and simple test using global temp tables within a procedure, and that succeeded via a job using BCP, so I am assuming it is related to the scope of the #temp tables.
Thanks in advance for your responses/clarifications.
DTML
You are correct in guessing that it's a scope issue for the #temp tables.
BCP is spawned as a separate process with its own connection, so the #temp tables are no longer in scope for that new session. When you run the same procedure from SSMS, everything happens on a single connection, so the #temp tables are still in scope.
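This also explains the questioner's observation about global temp tables: a ##global table is visible to all sessions until the last session referencing it closes, so a separate BCP process can still resolve it. A sketch of that workaround (the procedure and table names are hypothetical, and it assumes concurrent runs will not collide on the shared ##table):

```sql
CREATE PROCEDURE dbo.usp_ExportCsv
AS
BEGIN
    -- ## makes this a global temp table, visible across sessions,
    -- where a local #table would not be visible to the BCP process.
    IF OBJECT_ID('tempdb..##ExportStage') IS NOT NULL
        DROP TABLE ##ExportStage;

    SELECT name, object_id
    INTO ##ExportStage
    FROM sys.objects;

    SELECT name, object_id
    FROM ##ExportStage;
END;
```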
