Execute stored procedure created by a different user over dblink

I have created a stored procedure PROCA in Database A with user USERA and granted
execute permission to USERB, and I can execute this stored procedure in Database A when logged in as USERB.
I then logged in to Database X and created a dblink Alink that connects to
Database A as user USERB. When I execute the stored procedure using the syntax below,
it runs without any error, but the DML operations the stored procedure performed
are not committed.
Code to invoke the stored procedure from Database X:
declare
begin
    USERA.PROCA@Alink();
    COMMIT;
end;
Please suggest what could be the issue.

It seems there are no good solutions for such situations.
But here is a suggestion: try using this:
EXEC dbms_utility.exec_ddl_statement@db_link('some ddl sql statement');
For example:
EXEC dbms_utility.exec_ddl_statement@db_link('truncate table test_tab');
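If the remote procedure's DML must be persisted on Database A regardless of what the caller does, another option (my own suggestion, not part of the answer above) is an autonomous-transaction wrapper on Database A that commits its own work:
-- Hypothetical wrapper on Database A, owned by USERA; callable from
-- Database X as USERA.PROCA_AUTO@Alink(). The pragma gives the wrapper
-- its own transaction, so its COMMIT affects only its own DML.
CREATE OR REPLACE PROCEDURE PROCA_AUTO AS
    PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
    PROCA();  -- run the original procedure's DML
    COMMIT;   -- commit the autonomous transaction's work
END;
/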

Related

Way to ensure maintenance plan attempts every stored procedure?

I have a stored procedure that is run based on plant name.
I have a maintenance plan that calls the stored procedures for all plants in a timezone like
EXECUTE dbo.runPlantSP 'plant1'
EXECUTE dbo.runPlantSP 'plant3'
EXECUTE dbo.runPlantSP 'plant55'
The issue I am facing is that if an error occurs while the call for 'plant1' runs, the plan never runs the calls for 'plant3' or 'plant55'. Is there a setting in the maintenance plan I can change so it still attempts the next line? Or do I need to change the internals of my stored procedure to handle errors, so that if something happens in the procedure for 'plant1', a catch handles it and the error does not bubble up and stop the maintenance plan?
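As a sketch of that second option (the logging here is hypothetical; dbo.runPlantSP and the plant names come from the question), each call can be wrapped in its own TRY/CATCH so a failure is recorded and the next call still runs:
BEGIN TRY
    EXEC dbo.runPlantSP 'plant1';
END TRY
BEGIN CATCH
    PRINT 'plant1 failed: ' + ERROR_MESSAGE();  -- log and carry on
END CATCH;

BEGIN TRY
    EXEC dbo.runPlantSP 'plant3';
END TRY
BEGIN CATCH
    PRINT 'plant3 failed: ' + ERROR_MESSAGE();
END CATCH;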

Stored procedures with unqualified table names not working with Babelfish

I have created a Babelfish-enabled Postgres database in RDS.
I connected with SSMS and created a Database named 'demo'.
Within 'demo' I created a Schema named 'biz'.
I created my tables and stored procedures in the 'biz' schema.
The stored procedures used unqualified table names.
Finally, I wrote a .Net program to do some testing.
I use the System.Data.SqlClient Connection and Command classes and I can connect to the database.
When I execute a stored procedure I get the 'relation "X" does not exist.' error.
If I alter my stored procedure and qualify the table names with the 'biz' schema the error goes away.
How do I avoid having to qualify the table names with the schema?
For example:
After creating a Babelfish enabled Postgres cluster I executed these statements in SSMS:
create database demo
use demo
create schema biz
create table [biz].[cities](
    [city] varchar(128),
    [state] varchar(128)
)
create procedure [biz].[p_getcities] as
begin
    select * from cities
end
insert into [biz].[cities](city, state) values ('Portland', 'OR')
insert into [biz].[cities](city, state) values ('Richmond', 'VA')
exec [biz].p_getcities
And I get this error message after running p_getcities:
Msg 33557097, Level 16, State 1, Line 21
relation "cities" does not exist
When I switch to pgAdmin and try to run the stored procedure like this:
CALL biz.p_getcities()
I get a similar error:
ERROR: relation "cities" does not exist
LINE 1: select * from cities
^
QUERY: select * from cities
CONTEXT: PL/tsql function biz.p_getcities() line 2 at SQL statement
SQL state: 42P01
However, when I set the search_path like this:
set search_path to biz
And then execute the stored procedure, I get the expected results:
Portland OR
Richmond VA
Is there an equivalent to search_path in Babelfish?
This explanation has been provided by Rob Verschoor of rcv-aws:
What is happening here is that the name resolution inside the procedure biz.p_getcities does not correctly resolve the table name. It resolves it to 'dbo' schema while it should resolve it to 'biz' schema. As you noted, this is related to the search_path setting, and this is not set correctly in this case.
This is a known bug and we hope to fix it soon.
Until then, the workaround is to qualify the table name with the schema name, i.e. select * from biz.cities
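Applied to the example above, that workaround means re-creating the procedure with the schema-qualified table name:
drop procedure [biz].[p_getcities]
go
create procedure [biz].[p_getcities] as
begin
    select * from biz.cities
end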

Possible with multiple database connections

New to the tSQLt world (great tool set) and encountered a minor issue with a stored procedure I am setting up a test for.
If I for some reason have a stored procedure which connects to multiple databases, or even multiple SQL servers (Linked Servers), is it possible to do unit tests with tSQLt in such a scenario?
I commented already, but I would like to add some more: as I said, you can do anything that fits into a single transaction.
But for your case I would suggest creating synonyms for every cross-database/instance object and then using those synonyms everywhere.
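For example (hypothetical names), a synonym pointing at a table on a linked server could be created once, and the procedures under test would then reference only dbo.remoteTable:
CREATE SYNONYM dbo.remoteTable FOR RemoteServer.RemoteDb.dbo.remoteTable;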
I've created the following procedure to mock synonyms for views/tables. It has some limitations, but at least it can handle simple use cases.
CREATE PROCEDURE [tSQLt].[FakeSynonymTable] @SynonymTable VARCHAR(MAX)
AS
BEGIN
    -- Move the synonym out of the way under a unique, random name
    DECLARE @NewName VARCHAR(MAX) = @SynonymTable + REPLACE(CAST(NEWID() AS VARCHAR(100)), '-', '');
    DECLARE @RenameCmd VARCHAR(MAX) = 'EXEC sp_rename ''' + @SynonymTable + ''', ''' + @NewName + ''';';
    EXEC tSQLt.SuppressOutput @RenameCmd;
    -- Create an empty real table with the same shape as the synonym's target
    DECLARE @sql VARCHAR(MAX) = 'SELECT * INTO ' + @SynonymTable + ' FROM ' + @NewName + ' WHERE 1=2;';
    EXEC (@sql);
    -- Now the standard tSQLt mock works against the stand-in table
    EXEC tSQLt.FakeTable @TableName = @SynonymTable;
END;
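A test would then call it like any other tSQLt fake, with dbo.remoteTable being the hypothetical synonym from above:
EXEC tSQLt.FakeSynonymTable @SynonymTable = 'dbo.remoteTable';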
Without you providing sample code I am not certain of your exact use case, but this information may help.
The alternative approach for cross-database testing (assuming both databases are on the same instance) is to install tSQLt in both databases. Then you can mock the objects in the remote database in the same way that you would if they were local.
E.g. If you had a stored procedure in LocalDb that referenced a table in RemoteDb, you could do something like this:
Imagine you have a procedure that selects a row from a table called localTable in the local database and inserts that row into a table called remoteTable in the remote database (on the same instance):
create procedure [myTests].[test mySproc inserts remoteTable from local table]
as
begin
    -- Mock the local table in the local database
    exec tSQLt.FakeTable 'dbo.localTable' ;

    -- Mock the remote table (note the three-part object reference to RemoteDb)
    exec RemoteDb.tSQLt.FakeTable 'dbo.remoteTable' ;

    --! Data setup omitted
    --! exec dbo.mySproc @param = 'some value' ;

    --! Get the data from the remote table into a temp table so we can test it
    select * into #actual from RemoteDb.dbo.remoteTable;

    --! Assume we have already populated #expected with our expected results
    exec tSQLt.AssertEqualsTable '#expected', '#actual' ;
end
The above code demonstrates the basics but I blogged about this in more detail some years ago here.
Unfortunately, this approach will not work across linked servers.

How to make Teradata DDL and DCL statements transactional?

How do I group Teradata DDL/DCL statements (in a stored procedure) in a transaction so that either all of them take effect or none do? For example:
Begin Transaction
Create User aUser ...;
Grant some permissions to aUser;
End Transaction;
In the above example, either both the CREATE USER and GRANT statements are executed, or both roll back in case of an error.
Thanks
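For reference, Teradata's explicit transaction syntax (BTET session mode) for this intent would look like the sketch below, with hypothetical user details filled in. Be aware, though, that Teradata generally requires a DDL statement to be the last statement in its transaction, so grouping DDL and DCL like this may simply be rejected:
BT;
CREATE USER aUser AS PASSWORD = tempPwd1, PERM = 0;  -- hypothetical details
GRANT SELECT ON someDatabase TO aUser;               -- hypothetical grant
ET;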

stored procedures in web2py

I'm considering converting an app from PHP/MySQL to web2py (and MySQL or Postgres). The only SQL code in the PHP codebase for this app is calls to stored procedures: no SELECTs, no INSERTs, etc. All of its SQL source is on the order of "CALL proc_Fubar(args...);"
How do I tell web2py, "Here's my INSERT stored procedure; here's my SELECT..."? I know I can use executesql, but what about the rowset returned by a SELECT? I'd like to have that data returned as if it were the result of a web2py query on a table.
Yes, I know. I'm trying to get all the neat stuff that web2py does without keeping up my end of the bargain (by defining my SQL as web2py wants to see it).
You might try the following. First, define a model that matches the fields returned by your stored procedure (set migrate=False so web2py doesn't try to create that table in the db).
db.define_table('myfaketable', ..., migrate=False)
Then do:
raw_rows = db.executesql('[SQL code to execute stored procedure]')
rows = db._adapter.parse(raw_rows,
                         fields=[field for field in db.myfaketable],
                         colnames=db.myfaketable.fields)
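Putting it together, a complete sketch might look like this (the field list and the proc_Fubar call are hypothetical; match them to your procedure's actual rowset):
# Fake table describing the columns proc_Fubar returns; never migrated
db.define_table('myfaketable',
    Field('id', 'integer'),
    Field('name', 'string'),
    migrate=False)

raw_rows = db.executesql("CALL proc_Fubar('some arg');")
rows = db._adapter.parse(raw_rows,
                         fields=[field for field in db.myfaketable],
                         colnames=db.myfaketable.fields)

# rows now behaves like the result of an ordinary web2py select
names = [row.name for row in rows]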
