Why can't BCP execute procedures that use temp tables (#tempTable)? - stored-procedures

Recently I was tasked with creating a SQL Server job to automate the creation of a CSV file. There was existing code, which used an assortment of #temp tables.
When I set up the job to execute using BCP calling the existing code (converted into a procedure), I kept getting errors:
SQLState = S0002, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name #xyz
As described in other posts, the common recommendation for resolving this problem is to convert all the #temp tables to @table variables.
However, I would like to understand WHY BCP doesn't seem to be able to use #temp tables.
When I execute the same procedure from within SSMS, it works. Why?
I did a quick and simple test using global temp tables (##) within a procedure, and that succeeded via a job using BCP, so I am assuming it is related to the scope of the #temp tables.
Thanks in advance for your responses/clarifications.
DTML

You are correct in guessing that it's a scope issue for the #temp tables.
BCP is spawned as a separate process with its own connection to SQL Server, so session-scoped #temp tables are no longer visible to it. When you run the same procedure from within SSMS, it executes on your existing session, so the #temp tables remain in scope. Global ##temp tables are visible across sessions, which is consistent with the result of your test.
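For illustration, here is a minimal sketch of the ##global-temp approach the question reports as working; the procedure, table, column, database, and file names are all hypothetical:

CREATE PROCEDURE dbo.ExportRows
AS
BEGIN
    -- ## makes the table global: visible to every session, not just its creator
    SELECT col1, col2
    INTO ##xyz
    FROM dbo.SourceTable;

    SELECT col1, col2 FROM ##xyz;

    DROP TABLE ##xyz;
END

The job step would then call it through BCP, e.g.:

bcp "EXEC MyDb.dbo.ExportRows" queryout "C:\out\file.csv" -c -t, -T -S MyServer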

Related

Use other database connection and execute query

In our app, we need to switch to read replica database and read from it for some read-only APIs.
We decided to use the around_action filter for that:
- Switch the DB to the read replica before the action
- Yield
- Switch back to master
We used establish_connection for the switching, which did the job, but we later noticed that it is not thread-safe, i.e. it causes our other threads to fail with "#<ActiveRecord::ConnectionNotEstablished: No connection pool with 'primary' found.>". So that solution would only have worked on single-threaded servers.
Later we tried creating a new connection pool, as below, which is thread-safe:
databases = Rails.configuration.database_configuration
resolver = ActiveRecord::ConnectionAdapters::ConnectionSpecification::Resolver.new(databases)
spec = resolver.spec(:read_replica)
pool = ActiveRecord::ConnectionAdapters::ConnectionPool.new(spec)
pool.with_connection do |conn|
  # execute the SQL query here, e.g. conn.execute(sql_query)
end
The only problem with the above approach is that we can only execute queries using the execute method, e.g. conn.execute(sql_query); any ActiveRecord ORM query we run inside the with_connection block runs against the original DB, not the read replica.
It seems ActiveRecord keeps its own default connection and uses it whenever we run ORM queries.
We are not sure how to execute an ORM query such as User.where(id: 1..10) inside the with_connection block.
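For context, this is the kind of workaround we are experimenting with (a sketch only; hydrating models by hand via User.instantiate, an internal-ish ActiveRecord API, is an assumption on our side):

pool.with_connection do |conn|
  # Build the SQL with the ORM, but execute it on the replica connection
  sql = User.where(id: 1..10).to_sql
  result = conn.exec_query(sql) # => ActiveRecord::Result

  # Hydrate model instances by hand from the raw rows
  users = result.map { |row| User.instantiate(row) }
end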
Please note:
- I am aware that we can do this natively in Rails 6; we need to skip that for now.
- I am also aware of the Octopus gem; again, we need to skip that.
Appreciate any help. Thanks.

Sybase database - Issue running stored proc

I am trying to run a stored proc "dbname..rr_deck_type_wrap_proc" through a SQL Developer worksheet using the following code:
declare @return int, @message varchar(255)
exec @return = DBname..rr_deck_type_wrap_proc
    @new_id = '123456789',
    @remarks = 'Test',
    @error_cd = @message output
select @return "@return", @message "@message"
However, I always get the message:
Error report - Invalid JDBC escape syntax at line position 22 '=' character expected.
This code works perfectly fine when run through other software/interfaces, so it is probably a syntax issue with SQL Developer. I have tried different things, but nothing worked. Can someone suggest anything? Thank you!
You're using 'Oracle SQL Developer' - the keyword there being 'Oracle.'
You're connected to a SAP Sybase database, which is perfectly fine. We provide this connectivity support for one reason, and one reason only:
To allow you to migrate it to Oracle Database.
It is NOT a T-SQL IDE. It does not really provide any support for running, debugging, or playing with T-SQL.
If you have ANSI SQL, then our parser will probably be OK with it.
If you need/want a hack and are desperate to make your code go, you can provide a hint to our parser to let the code pass straight through to the driver, and hope it works.
An example:
/*sqldev:query*/sp_help;
You could also try adding this:
/*sqldev:stmt*/
This advice applies to MySQL, DB2, SQL Server, Teradata, or any other 3rd party JDBC driver we allow you to use in SQL Developer. It's there to help you move your schema and application code over into an Oracle Database.
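For example, applied to the code from the question it might look like this (an untested sketch; whether the hint lets a whole multi-statement batch through to the Sybase driver is an assumption):

/*sqldev:stmt*/
declare @return int, @message varchar(255)
exec @return = DBname..rr_deck_type_wrap_proc
    @new_id = '123456789',
    @remarks = 'Test',
    @error_cd = @message output
select @return "@return", @message "@message"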

Possible with multiple database connections

New to the tSQLt world (great tool set) and I've encountered a minor issue with a stored procedure I am setting up a test for.
Suppose I have a stored procedure that connects to multiple databases or even multiple SQL Servers (linked servers).
Is it possible to do unit tests with tSQLt in such a scenario?
I commented already, but I would like to add some more. As I said, you can do anything that fits into a single transaction.
For your case, though, I would suggest creating a synonym for every cross-database/cross-instance object and then using the synonyms everywhere.
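For example (names hypothetical), a synonym that stands in for a table in another database, so the code under test never references the three-part name directly:

-- Created once in the database under test; tests can then redirect or fake it
CREATE SYNONYM dbo.remoteTable FOR RemoteDb.dbo.remoteTable;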
I've created the following procedure to mock table/view synonyms. It has some limitations, but at least it can handle simple use cases.
CREATE PROCEDURE [tSQLt].[FakeSynonymTable] @SynonymTable VARCHAR(MAX)
AS
BEGIN
    -- Rename the synonym out of the way under a unique name
    DECLARE @NewName VARCHAR(MAX) = @SynonymTable + REPLACE(CAST(NEWID() AS VARCHAR(100)), '-', '');
    DECLARE @RenameCmd VARCHAR(MAX) = 'EXEC sp_rename ''' + @SynonymTable + ''', ''' + @NewName + ''';';
    EXEC tSQLt.SuppressOutput @RenameCmd;

    -- Create an empty table with the same shape under the original name
    DECLARE @sql VARCHAR(MAX) = 'SELECT * INTO ' + @SynonymTable + ' FROM ' + @NewName + ' WHERE 1=2;';
    EXEC (@sql);

    -- The stand-in table can now be faked as usual
    EXEC tSQLt.FakeTable @TableName = @SynonymTable;
END;
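Usage inside a test would then look something like this (the synonym and column names are hypothetical):

-- Swap the synonym for an empty, fakeable table, then seed test rows
EXEC tSQLt.FakeSynonymTable @SynonymTable = 'dbo.remoteTable';
INSERT INTO dbo.remoteTable (id, name) VALUES (1, 'test row');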
Without sample code I am not certain of your exact use case, but this information may help.
The alternative approach for cross-database testing (assuming both databases are on the same instance) is to install tSQLt in both databases. Then you can mock the objects in the remote database in the same way that you would if they were local.
E.g. if you had a stored procedure in LocalDb that referenced a table in RemoteDb, you could do something like this.
Imagine you have a procedure that selects a row from a table called localTable in the local database and inserts that row into a table called remoteTable in the remote database (on the same instance):
create procedure [myTests].[test mySproc inserts remoteTable from local table]
as
begin
-- Mock the local table in the local database
exec tSQLt.FakeTable 'dbo.localTable' ;
-- Mock the remote table (note the three-part object reference to RemoteDb)
exec RemoteDb.tSQLt.FakeTable 'dbo.remoteTable' ;
--! Data setup omitted
--! exec dbo.mySproc @param = 'some value' ;
--! Get the actual data from the remote table into a temp table so we can test it
select * into #actual from RemoteDb.dbo.remoteTable;
--! Assume #expected has already been populated with the rows we expect to see
exec tSQLt.AssertEqualsTable '#expected', '#actual' ;
end
The above code demonstrates the basics, but I blogged about this in more detail some years ago here.
Unfortunately, this approach will not work across linked servers.

Rails Brakeman SQL injection warning while accessing an oracle view/function

I have Rails code that consumes an Oracle view/function.
This is my code:
def run_query
  connection.exec_query(
    "SELECT * FROM TABLE(FN_REQ(#{demo_type_param},#{demo_tid_param}))")
end
When I run the Brakeman analyzer, it warns of a possible "SQL injection attack".
I need to understand whether this is a valid warning and, if so, how to remediate it.
Since this is a function and not an actual table, I am not sure what the right way is.
If it were a normal model, I would have just followed this pattern:
Model.where("mycolumn1 = ? AND mycolumn2 = ?", demo_type_param, demo_tid_param).first
Yes, the warning is valid. Almost every time you build an SQL query by simply concatenating variables, you are vulnerable to SQL injection. Generally, an SQL injection happens whenever data inserted into the query can look like valid SQL and can result in additional queries being executed.
The only solutions are to manually enforce appropriate escaping or to use prepared statements, with the latter being the preferred approach.
With ActiveRecord / Rails, you can use exec_query with binds directly:
sql = 'SELECT * FROM TABLE(FN_REQ(?,?))'
connection.exec_query(sql, 'my query', [demo_type_param, demo_tid_param])
Here, Rails will prepare the statement on the database and add the parameters to it on execution, ensuring that everything is correctly escaped and safe from SQL injection.
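If passing raw binds turns out to be awkward with your adapter, an alternative sketch is to let Rails quote the values first with sanitize_sql_array (public on ActiveRecord::Base since Rails 5.2; treat that version note as an assumption):

def run_query
  # Each ? placeholder is replaced with a properly quoted value
  sql = ActiveRecord::Base.sanitize_sql_array(
    ['SELECT * FROM TABLE(FN_REQ(?,?))', demo_type_param, demo_tid_param]
  )
  connection.exec_query(sql)
end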

stored procedures in web2py

I'm considering converting an app from PHP/MySQL to web2py (and MySQL or Postgres). The only SQL code in the PHP codebase for this app consists of calls to stored procedures: no SELECTs, no INSERTs, etc. All SQL in the PHP codebase is of the form "CALL proc_Fubar(args...);"
How do I tell web2py, "Here's my INSERT stored procedure; here's my SELECT..."? I know I can use executesql, but what about the rowset returned from a SELECT? I'd like to have that data returned as if it were the result of a web2py query on a table.
Yes, I know: I'm trying to get all the neat stuff web2py does without keeping up my end of the bargain (by defining my SQL as web2py wants to see it).
You might try the following. First, define a model that matches the fields returned by your stored procedure (set migrate=False so web2py doesn't try to create that table in the db):
db.define_table('myfaketable', ..., migrate=False)
Then do:
raw_rows = db.executesql('[SQL code to execute stored procedure]')
rows = db._adapter.parse(raw_rows,
                         fields=[field for field in db.myfaketable],
                         colnames=db.myfaketable.fields)
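For illustration, a sketch of the whole flow; proc_Fubar, its argument, and the field names are all hypothetical:

# Hypothetical model matching the columns the procedure returns
db.define_table('myfaketable',
                Field('id', 'integer'),
                Field('name'),
                migrate=False)

raw_rows = db.executesql('CALL proc_Fubar(%s);', placeholders=(42,))
rows = db._adapter.parse(raw_rows,
                         fields=[field for field in db.myfaketable],
                         colnames=db.myfaketable.fields)

# rows now behaves like the result of a regular web2py query
for row in rows:
    print(row.id, row.name)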
