I'm new to the tSQLt world (great tool set) and have encountered a minor issue with a stored procedure I am setting up a test for.
Suppose I have a stored procedure which connects to multiple databases or even multiple SQL Servers (linked servers). Is it possible to do unit tests with tSQLt in such a scenario?
I commented already, but I would like to add some more. As I said, you can do anything that fits into a single transaction.
For your case, though, I would suggest creating a synonym for every cross-database/instance object and then using the synonyms everywhere.
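Creating a synonym is a one-liner. For example (the linked server, database and object names here are all illustrative):
CREATE SYNONYM dbo.RemoteOrders FOR LinkedSrv.SalesDb.dbo.Orders;
The code under test then references dbo.RemoteOrders instead of the four-part name.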
I've created the following procedure to mock view/table synonyms. It has some limitations, but at least it can handle simple use cases.
CREATE PROCEDURE [tSQLt].[FakeSynonymTable] @SynonymTable VARCHAR(MAX)
AS
BEGIN
    -- Rename the synonym out of the way, using a unique suffix
    DECLARE @NewName VARCHAR(MAX) = @SynonymTable + REPLACE(CAST(NEWID() AS VARCHAR(100)), '-', '');
    DECLARE @RenameCmd VARCHAR(MAX) = 'EXEC sp_rename ''' + @SynonymTable + ''', ''' + @NewName + ''';';
    EXEC tSQLt.SuppressOutput @command = @RenameCmd;
    -- Create an empty local table with the structure of the object behind the synonym
    DECLARE @sql VARCHAR(MAX) = 'SELECT * INTO ' + @SynonymTable + ' FROM ' + @NewName + ' WHERE 1=2;';
    EXEC (@sql);
    -- Fake the newly created table as usual
    EXEC tSQLt.FakeTable @TableName = @SynonymTable;
END;
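A test can then replace the synonym with a fake local table and seed it. A minimal sketch, reusing the hypothetical dbo.RemoteOrders synonym from above (the column names are illustrative too):
EXEC tSQLt.FakeSynonymTable 'dbo.RemoteOrders';
-- dbo.RemoteOrders is now an empty local table mirroring the remote object's columns
INSERT INTO dbo.RemoteOrders (OrderId, Amount) VALUES (1, 9.99);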
Without you providing sample code I can't be certain of your exact use case, but this information may help.
The alternative approach for cross-database testing (assuming both databases are on the same instance) is to install tSQLt in both databases. Then you can mock the objects in the remote database in the same way that you would if they were local.
E.g. if you had a stored procedure in LocalDb that referenced a table in RemoteDb, you could do something like this.
Imagine you have a procedure that selects a row from a table called localTable in the local database and inserts that row into a table called remoteTable in the remote database (on the same instance):
create procedure [myTests].[test mySproc inserts remoteTable from local table]
as
begin
-- Mock the local table in the local database
exec tSQLt.FakeTable 'dbo.localTable';
-- Mock the remote table (note the three-part object reference to RemoteDb)
exec RemoteDb.tSQLt.FakeTable 'dbo.remoteTable';
--! Data setup omitted
exec dbo.mySproc @param = 'some value';
--! Get the data from the remote table into a temp table so we can test it
select * into #actual from RemoteDb.dbo.remoteTable;
--! Assume we have already populated #expected with our expected results
exec tSQLt.AssertEqualsTable '#expected', '#actual';
end
The above code demonstrates the basics, but I blogged about this in more detail some years ago here.
Unfortunately this approach will not work across linked servers.
I have created a Babelfish-enabled Postgres database in RDS.
I connected with SSMS and created a Database named 'demo'.
Within 'demo' I created a Schema named 'biz'.
I created my tables and stored procedures in the 'biz' schema.
The stored procedures used unqualified table names.
Finally, I wrote a .Net program to do some testing.
I use the System.Data.SqlClient Connection and Command classes and I can connect to the database.
When I execute a stored procedure I get the 'relation "X" does not exist.' error.
If I alter my stored procedure and qualify the table names with the 'biz' schema the error goes away.
How do I avoid having to qualify the table names with the schema?
For example:
After creating a Babelfish-enabled Postgres cluster, I executed these statements in SSMS:
create database demo
go
use demo
go
create schema biz
go
create table [biz].[cities](
    [city] varchar(128),
    [state] varchar(128)
)
go
create procedure [biz].[p_getcities] as
begin
    select * from cities
end
go
insert into [biz].[cities](city, state) values ('Portland', 'OR')
insert into [biz].[cities](city, state) values ('Richmond', 'VA')
go
exec [biz].p_getcities
And I get this error message after running p_getcities:
Msg 33557097, Level 16, State 1, Line 21
relation "cities" does not exist
When I switch to pgAdmin and try to run the stored procedure like this:
CALL biz.p_getcities()
I get a similar error:
ERROR: relation "cities" does not exist
LINE 1: select * from cities
^
QUERY: select * from cities
CONTEXT: PL/tsql function biz.p_getcities() line 2 at SQL statement
SQL state: 42P01
However, when I set the search_path like this:
set search_path to biz
And then execute the stored procedure, I get the expected results:
Portland OR
Richmond VA
Is there an equivalent to search_path in Babelfish?
This explanation was provided by Rob Verschoor of rcv-aws.
What is happening here is that the name resolution inside the procedure biz.p_getcities does not correctly resolve the table name: it resolves it to the 'dbo' schema when it should resolve it to the 'biz' schema. As you noted, this is related to the search_path setting, which is not set correctly in this case.
This is a known bug and we hope to fix it soon.
Until then, the workaround is to qualify the table name with the schema name, i.e. select * from biz.cities
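Applied to the example above, the workaround means altering the procedure so the query is schema-qualified:
alter procedure [biz].[p_getcities] as
begin
    select * from biz.cities
end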
I have lots of stored procedures, and each stored procedure has its own tSQLt test class. In each test class, roughly 5 of the tests are exactly the same for all the stored procedures, and roughly 5 are unique to the stored procedure.
Occasionally I want to change one of the "common" tests, and I have to change it in 10 or more files, which is a nuisance.
Is there some way that I can define a group of tests in a single file, and then call these tests from another test class, so that the tests are run on the stored procedure that is being tested by the calling test class?
One solution might be to create a "TestHelpers" class and add your common test code to it, but without the "test" prefix so that tSQLt doesn't run those procedures automatically. These procedures would need input parameters, such as the name of the procedure to test, and would also include the standard tSQLt assertions.
Then, within your procedure-specific test classes, you would just call the TestHelpers version from within each test.
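Setting up the helper class might look like this (a sketch; because tSQLt only auto-runs procedures whose names begin with "test", the helper procedures themselves will never be executed directly by the runner):
exec tSQLt.NewTestClass 'TestHelpers';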
I'm not sure of your exact use case but let us assume that one of the common factors between these procedures is that they all return the same result set (but with different contents) and you want a common test to assert that the result set structure is as it should be.
You might create a helper procedure like this:
create procedure [TestHelpers].[ValidateResultSet]
(
    @ProcedureToTest nvarchar(200)
)
as
begin
    --! common set up omitted for brevity
    --! (build the @expectedCommand/@actualCommand strings from @ProcedureToTest)
    exec tSQLt.AssertResultSetsHaveSameMetaData...
end
Then in your procedure test classes you might create tests that look like this:
create procedure [MyFirstProcedureTests].[test result set]
as
begin
--! MyFirstProcedure-specific set up omitted
exec TestHelpers.ValidateResultSet 'MyFirstProcedure';
end
or
create procedure [MySecondProcedureTests].[test result set]
as
begin
--! MySecondProcedure-specific set up omitted
exec TestHelpers.ValidateResultSet 'MySecondProcedure';
end
I don't have access to a database right now to prove this, but it should work, as I've done similar things in the past.
Recently I was tasked with creating a SQL Server Job to automate the creation of a CSV file. There was existing code, which was using an assortment of #temp tables.
When I set up the job to execute using BCP calling the existing code (converted into a procedure), I kept getting errors:
SQLState = S0002, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name #xyz
As described in other posts, lots of people recommend resolving the problem by converting all the #temp tables to @table variables.
However, I would like to understand WHY BCP doesn't seem to be able to use #temp tables.
When I execute the same procedure from within SSMS it works though!? Why?
I did a quick and simple test using global temp tables (##) within a procedure, and that seemed to succeed via a job using BCP, so I am assuming it is related to the scope of the #temp tables!?
Thanks in advance for your responses/clarifications.
DTML
You are correct in guessing that it's a scope issue for the #temp tables.
BCP opens its own connection to SQL Server, i.e. a separate session, and a local #temp table is only visible to the session that created it. When you run the procedure from SSMS, everything happens on the same connection, so the #temp tables stay in scope. Global ##temp tables are visible to all sessions, which is why your test with them succeeded.
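You can see the scope difference with a quick sketch (run the two halves on two separate connections):
-- Connection 1: create both kinds of temp table and keep this session open
CREATE TABLE #xyz (id int);    -- local temp table: visible only to this session
CREATE TABLE ##xyz (id int);   -- global temp table: visible to all sessions
-- Connection 2 (equivalent to the separate connection BCP opens):
SELECT * FROM ##xyz;           -- succeeds while connection 1 stays open
SELECT * FROM #xyz;            -- fails: Invalid object name '#xyz'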
I have created a stored procedure PROCA in database A with user USERA and granted execute permission to USERB, and I can execute this stored proc in database A when logged in as USERB.
Now I have logged in to database X and created a dblink ALINK, which connects to database A as user USERB. When I execute the stored proc using the syntax below, it runs without any error, but whatever DML operations the stored proc performed are not committed.
Code to invoke the stored proc from database X:
begin
    -- call the remote procedure over the database link
    USERA.PROCA@ALINK();
    COMMIT;
end;
Please suggest what could be the issue.
It seems there are no good solutions for such situations.
But here is a suggestion for you; try using this:
Exec dbms_utility.exec_ddl_statement@db_link('some ddl sql statement');
For example:
Exec dbms_utility.exec_ddl_statement@db_link('truncate table test_tab');
I'm considering converting an app from php/MySQL to web2py (and MySQL or Postgres). The only SQL code in the php codebase for this app are calls to stored procedures...no SELECTs, no INSERTs, etc., in the php codebase. All SQL source in the php codebase is on the order of "CALL proc_Fubar(args...);"
How do I tell web2py, "Here's my INSERT stored procedure; here's my SELECT..."? I know I can use executesql, but what about the rowset returned by a SELECT? I'd like that data returned as if it were the result of a web2py query on a table.
Yes, I know. I'm trying to get all the neat stuff that web2py does without keeping up my end of the bargain (by defining my SQL as web2py wants to see it).
You might try the following. First, define a model that matches the fields returned by your stored procedure (set migrate=False so web2py doesn't try to create that table in the db).
db.define_table('myfaketable', ..., migrate=False)
Then do:
raw_rows = db.executesql('[SQL code to execute stored procedure]')
# Convert the raw DB-API rows into a web2py Rows object
rows = db._adapter.parse(raw_rows,
                         fields=[field for field in db.myfaketable],
                         colnames=db.myfaketable.fields)
The resulting rows object behaves like the result of a standard web2py select, so you can access fields by name.