I have two stored procedures: sub_Proc and Proc.
But when I try to create them in a single file, I get into trouble:
--file1.sql
CREATE PROCEDURE sub_Proc()
--do Something;
End PROCEDURE
CREATE PROCEDURE Proc() --Syntax Error!
CALL sub_Proc();
End PROCEDURE
I get a syntax error.
Of course they can be created in two separate files,
but can they be created in a single file?
Can anybody help?
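If this is Informix SPL (the END PROCEDURE keyword suggests it), one common cause of this error is a missing statement terminator between the two definitions. A minimal sketch of the corrected file, assuming that dialect:

```sql
--file1.sql
-- Terminating each CREATE PROCEDURE statement with a semicolon lets the
-- parser see where one definition ends and the next begins.
CREATE PROCEDURE sub_Proc()
    --do Something;
END PROCEDURE;

CREATE PROCEDURE Proc()
    CALL sub_Proc();
END PROCEDURE;
```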
I have lots of stored procedures, and each stored procedure has its own tSQLt test class. In each test class, roughly 5 of the tests are exactly the same for all the stored procedures, and roughly 5 of the tests are unique to that stored procedure.
Occasionally I want to change one of the "common" tests, and I have to change it in 10 or more files, which is a nuisance.
Is there some way that I can define a group of tests in a single file, and then call these tests from another test class, so that the tests are run on the stored procedure that is being tested by the calling test class?
One solution might be to create a TestHelpers class and add your common test code to this class, but without the "test" prefix so tSQLt doesn't run them automatically. These procedures would need input parameters such as the name of the procedure under test, and would also include the standard tSQLt assertions.
Then within your procedure-specific test classes you would just call the TestHelper version from within the test.
I'm not sure of your exact use case but let us assume that one of the common factors between these procedures is that they all return the same result set (but with different contents) and you want a common test to assert that the result set structure is as it should be.
You might create a helper procedure like this:
create procedure [TestHelpers].[ValidateResultSet]
(
    @ProcedureToTest nvarchar(200)
)
as
begin
    --! common set up omitted for brevity
    exec tSQLt.AssertResultSetsHaveSameMetaData...
end
Then in your procedure test classes you might create tests that look like this:
create procedure [MyFirstProcedureTests].[test result set]
as
begin
    --! MyFirstProcedure-specific set up omitted
    exec TestHelpers.ValidateResultSet 'MyFirstProcedure';
end
or
create procedure [MySecondProcedureTests].[test result set]
as
begin
    --! MySecondProcedure-specific set up omitted
    exec TestHelpers.ValidateResultSet 'MySecondProcedure';
end
I don't have access to a database right now to prove this, but it should work as I've done similar things in the past.
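For completeness, each class's tests (the shared helper calls included) would then be run in the usual tSQLt way, per class or all at once:

```sql
-- Run every test in one test class:
EXEC tSQLt.Run 'MyFirstProcedureTests';

-- Or run the whole suite:
EXEC tSQLt.RunAll;
```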
New to the tSQLt world (great tool set), and I've encountered a minor issue with a stored procedure I am setting up a test for.
Suppose I have a stored procedure which connects to multiple databases or even multiple SQL Servers (linked servers).
Is it possible to do unit tests with tSQLt in such a scenario?
I commented already, but I would like to add some more. As I said, you can do anything that fits into a single transaction.
For your case, though, I would suggest creating synonyms for every cross-database/instance object and then using those synonyms everywhere.
I've created the following procedure to mock view/table synonyms. It has some limitations, but it can at least handle simple use cases.
CREATE PROCEDURE [tSQLt].[FakeSynonymTable] @SynonymTable VARCHAR(MAX)
AS
BEGIN
    DECLARE @NewName VARCHAR(MAX) = @SynonymTable + REPLACE(CAST(NEWID() AS VARCHAR(100)), '-', '');
    DECLARE @RenameCmd VARCHAR(MAX) = 'EXEC sp_rename ''' + @SynonymTable + ''', ''' + @NewName + ''';';
    EXEC tSQLt.SuppressOutput @RenameCmd;
    DECLARE @sql VARCHAR(MAX) = 'SELECT * INTO ' + @SynonymTable + ' FROM ' + @NewName + ' WHERE 1=2;';
    EXEC (@sql);
    EXEC tSQLt.FakeTable @TableName = @SynonymTable;
END;
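A hypothetical usage sketch (the synonym name dbo.RemoteCustomer, the target RemoteDb.dbo.Customer, and the CustomerId column are assumptions for illustration): production code reads through the synonym, and the test fakes it before inserting test data.

```sql
-- One-time set-up: production code references the synonym instead of
-- the three-part remote name.
CREATE SYNONYM dbo.RemoteCustomer FOR RemoteDb.dbo.Customer;

-- In a test, replace the synonym with an empty, identically-shaped table:
EXEC tSQLt.FakeSynonymTable 'dbo.RemoteCustomer';

-- The test can now populate dbo.RemoteCustomer without touching the
-- remote database.
INSERT INTO dbo.RemoteCustomer (CustomerId) VALUES (1);
```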
Without you providing sample code I am not certain of your exact use case, but this information may help.
The alternative approach for cross-database testing (assuming both databases are on the same instance) is to install tSQLt in both databases. Then you can mock the objects in the remote database in the same way that you would if they were local.
E.g. if you had a stored procedure in LocalDb that referenced a table in RemoteDb, you could do something like this.
Imagine you have a procedure that selects a row from a table called localTable in the local database and inserts that row into a table called remoteTable in the remote database (on the same instance):
create procedure [myTests].[test mySproc inserts remoteTable from local table]
as
begin
    -- Mock the local table in the local database
    exec tSQLt.FakeTable 'dbo.localTable' ;
    -- Mock the remote table (note the three-part object reference to RemoteDb)
    exec RemoteDb.tSQLt.FakeTable 'dbo.remoteTable' ;
    --! Data set-up omitted
    exec dbo.mySproc @param = 'some value' ;
    --! Get the data from the remote table into a temp table so we can test it
    select * into #actual from RemoteDb.dbo.remoteTable;
    --! Assume we have already populated #expected with our expected results
    exec tSQLt.AssertEqualsTable '#expected', '#actual' ;
end
The above code demonstrates the basics but I blogged about this in more detail some years ago here.
Unfortunately, this approach will not work across linked servers.
Trying to create an xsodata service via a database procedure:
"X"."SHOPLIST/Header" as "Header"
navigates ("ToItem" as "ItemRef")
create using "X"."SHOPLIST.shoplist::create";
it says: Syntax error at line: 3, column: 15.
With
create using "SHOPLIST.shoplist::create";
it says: Unknown object "SHOPLIST.shoplist::create".
The procedure name is "X"."SHOPLIST.shoplist::create" (it works fine in the console).
So it seems that xsodata doesn't work with catalog DB artifacts. I had to create an .hdbprocedure, and then it was accepted without the schema prefix...
Recently I was tasked with creating a SQL Server Job to automate the creation of a CSV file. There was existing code, which was using an assortment of #temp tables.
When I set up the job to execute using BCP calling the existing code (converted into a procedure), I kept getting errors:
SQLState = S0002, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name #xyz
As described in other post(s), lots of people recommend converting all the #temp tables to @table variables to resolve the problem.
However, I would like to understand WHY BCP doesn't seem to be able to use #tempTables?
When I execute the same procedure from within SSMS, it works. Why?
I did a quick and simple test using global temp tables (##) within a procedure, and that seemed to succeed via a job using BCP, so I am assuming it is related to the scope of the #temp tables!?
Thanks in advance for your responses/clarifications.
DTML
You are correct in guessing that it's a scope issue for the #temp tables.
BCP is spawned as a separate process, so the tables are no longer in scope for the new process. SSMS likely uses sub-processes, so they would still have access to the #temp tables.
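A minimal sketch of the global-temp-table workaround mentioned in the question (the procedure, table, and column names are illustrative): a ## table lives in tempdb and is visible to any session, so the separate process that BCP spawns can still read it.

```sql
-- Hypothetical export procedure: materialise the result into a GLOBAL
-- temp table (##) instead of a session-local one (#), so the separate
-- process that BCP spawns can still see it.
CREATE PROCEDURE dbo.PrepareCsvExport
AS
BEGIN
    IF OBJECT_ID('tempdb..##CsvExport') IS NOT NULL
        DROP TABLE ##CsvExport;

    SELECT OrderId, OrderDate, Amount
    INTO ##CsvExport
    FROM dbo.Orders;   -- source table is an assumption
END;
```

BCP would then export with something like `bcp "SELECT * FROM ##CsvExport" queryout ...`. One caveat: a global temp table is dropped once the session that created it ends and no other session references it, so the job step must keep the creating session alive until BCP has finished.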
I have created a stored procedure PROCA in database A with user USERA and given
execute permission to USERB, and I could execute this stored proc in database A when logged in as USERB.
Then I logged in to database X and created a dblink Alink; this dblink connects to
database A as user USERB. When I execute the stored proc using the syntax below,
it runs without any error, but whatever DML operations the stored proc has done
are not committed.
Code to invoke the stored proc from database X:
declare
begin
    USERA.PROCA@Alink();
    COMMIT;
end;
Please suggest what could be the issue.
It seems there are no good solutions for such situations.
But here is a suggestion for you; try using this:
Exec dbms_utility.exec_ddl_statement@db_link('some ddl sql statement');
For example:
Exec dbms_utility.exec_ddl_statement@db_link('truncate table test_tab');