SQL Server stack trace for a stored procedure

I would like to see all the input values passed into a particular user-defined stored procedure in SQL Server. I am using SQL Server 2008 Management Studio. Sometimes I don't have access to the client code that calls this procedure, and the bug is likely inside the procedure itself, so I want to know when my procedure executes and what input values it is called with. With this information, I can debug the procedure without running the whole complicated process.

SQL Profiler will give you this information. You need the ALTER TRACE permission on the server to run Profiler.
http://msdn.microsoft.com/en-us/library/ms181091%28v=sql.105%29.aspx
In addition to the Start Menu link, you can run it from the "Tools" menu in SQL Server Management Studio.
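If you prefer not to leave the Profiler GUI running, the same events can be captured with a lightweight server-side trace. Here is a minimal sketch; the file path and the procedure name usp_MyProcedure are placeholders for your own values:

DECLARE @traceid int, @maxsize bigint, @on bit;
SET @maxsize = 50;  -- max trace file size in MB
SET @on = 1;

-- Create the trace; SQL Server appends .trc to the file name
EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\Traces\proc_inputs', @maxsize;

-- Capture RPC:Completed (event 10): TextData (1), SPID (12), StartTime (14)
EXEC sp_trace_setevent @traceid, 10, 1, @on;
EXEC sp_trace_setevent @traceid, 10, 12, @on;
EXEC sp_trace_setevent @traceid, 10, 14, @on;

-- Keep only calls to the procedure of interest (comparison operator 6 = LIKE)
EXEC sp_trace_setfilter @traceid, 1, 0, 6, N'%usp_MyProcedure%';

-- Start the trace
EXEC sp_trace_setstatus @traceid, 1;

-- Later: read the captured calls; the parameter values appear in TextData
SELECT TextData, SPID, StartTime
FROM fn_trace_gettable(N'C:\Traces\proc_inputs.trc', DEFAULT);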

Related

Stored procedure hangs on statement.execute()

Why would a Snowflake stored procedure hang on a statement that works when executed outside the stored procedure? Further info: if I remove that statement from the stored procedure, the SP runs properly. How can this sort of thing be debugged?
(One more piece of info: running as a different user on a different schema, the SP works as intended.)
Update: running the SP on a different warehouse worked, so it might be a problem with the warehouse, not the schema.
Why would a Snowflake stored procedure hang on a statement that, when executed outside the stored procedure, works?
There can be multiple reasons: the query gets queued due to lack of resources, it is awaiting a lock to free (if it's a transactional query), etc.
How can this sort of thing be debugged?
Check the Query History page in the Snowflake UI. If the statement executed by your procedure shows a queued status, you're likely running into a warehouse size limit or a maximum concurrency limit, which can be resolved by reconfiguring your warehouse (via auto-scaling and/or a larger warehouse size).
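The same history can be queried in SQL. A minimal sketch, assuming you have access to the INFORMATION_SCHEMA.QUERY_HISTORY table function (the RESULT_LIMIT value is arbitrary):

SELECT query_id, query_text, execution_status, total_elapsed_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 100))
WHERE execution_status IN ('QUEUED', 'BLOCKED')
ORDER BY start_time DESC;

-- If the status is BLOCKED rather than QUEUED, look for the competing lock:
SHOW LOCKS IN ACCOUNT;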

Receiving a SQL stored procedure error about changing schema but no changes have been made

We have several data feeds that run every evening using SSIS packages with SQL table data sources. Part of this standard process is a data engine we've built using stored procedures that run for each data feed and return that customer's data based on their specific parameters. The engine dumps the data into a SQL table, where it is retrieved by the package and then remains until the next evening's run.
Maybe two weeks ago we started to intermittently get the following error running these stored procedures (which are executed using SQL Agent):
"INSERT EXEC failed because the stored procedure altered the schema of the target table. [SQLSTATE 42000] (Error 556). The step failed."
These stored procedures have been running for months, some even years, without being changed; these errors just started occurring intermittently. Inside the stored procedure we use a temporary table that receives the engine data, and a table that is dropped and re-created using a statement like this:
SELECT field1, field2 INTO sqlTable FROM #tempTable
I looked for a SQL Server update or anything else that may have changed and caused these errors all of a sudden, but can't find anything. It has occurred in several different stored procedures, intermittently, all with this same kind of structure, but I can't identify any particular reason. It will happen one night and not another, to one stored procedure and not another just like it. Any idea what could cause this?
We are running Microsoft SQL Server 2016 Standard 64-bit (13.0.4604.0) on Windows Server 2016 Datacenter 10.0 (Build 14393: ) (Hypervisor). This is all on a VM in the Azure environment.
If you are using INSERT ... EXEC and have Query Store enabled, that might be the reason.
Query Store periodically runs auto-cleanup, and this has turned out to cause problems when a stored procedure calls another stored procedure using the INSERT ... EXEC syntax.
This is only an issue with SQL Server 2016.
For more details and possible solution, see: https://support.microsoft.com/en-us/help/4465511/error-556-insert-exec-failed-stored-procedure-altered-table-schema
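A quick way to check whether this applies to you, as a sketch (the database name is a placeholder):

-- Run in the database the jobs write to: is Query Store enabled?
SELECT actual_state_desc
FROM sys.database_query_store_options;

-- Workaround discussed in the KB article until the fixed CU is applied:
ALTER DATABASE [YourDatabase] SET QUERY_STORE = OFF;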

AS400 / SQL Server 2008 R2 Data Transfer Performance Improvement

We have recently converted our JD Edwards EnterpriseOne system from an AS400/DB2 platform to Windows & SQL Server. In the old system we had an RPG/CL program that would transfer data from an AS400 library to the accounting system for further processing. The end users needed to initiate this process, so it was executed via a menu command.
To replicate this behavior after the conversion, I created a stored procedure in SQL Server 2008 R2 that inserts records into the SQL Server database from the AS400 via a linked server and then updates the records on the AS400 to indicate that they have been processed. To give the end users the ability to execute this process, I created an SSRS (2005) report that executes the stored procedure.
When the SSRS report is executed interactively, we intermittently get the error 'For security reasons DTD is prohibited in this XML document', which from my research is caused by SSRS running out of memory.
Does anyone know of another/better way to transfer the data?
The transfer/update part of the stored procedure is essentially:
INSERT INTO [SQL DEST TABLE]
SELECT *
FROM [AS400 Linked Server/Table]
UPDATE OPENQUERY (AS400_LINK, 'MY SELECT QUERY')
SET FLAG = PROCESSED;
You should get better database performance by letting a server work with its own data, where possible, rather than having to transmit it back and forth.
I will make a few guesses about the circumstances, and if you correct me, I will gladly adjust the answer to fit your situation. This sounds like you are extracting data from an accounting transaction table in DB2, and that when done, you want to update the flag in those same records. That might indicate that the records could stay in that table essentially forever, or perhaps that some other process clears them out. There is no WHERE on your SELECT, so I will assume they do get removed. I will assume that we don't know if more records might get added to the transaction table at any time, including any period between extracting the info and updating the flag.
I wonder if you could update the flag immediately upon extraction, before the records have actually been processed by SQL Server? Would this be allowed logistically, and within your business requirements?
Suppose you...
1. extract the DB2 unprocessed transaction data into a workfile,
2. transfer the workfile to SQL Server,
3. perform whatever processing you want to do in SQL Server,
4. tell DB2 to update the transaction table based on the workfile.
So in DB2, #1 might look like
INSERT INTO workfile
SELECT *
FROM transactions
WHERE flag = unprocessed
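Step 2 could then be a plain linked-server insert on the SQL Server side. A sketch reusing the linked-server and destination-table names from your question (the workfile name is a placeholder); going through OPENQUERY lets DB2 resolve the source query itself:

INSERT INTO [SQL DEST TABLE]
SELECT *
FROM OPENQUERY(AS400_LINK, 'SELECT * FROM workfile');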
During step 3, your SQL Server job could update the flag in the workfile to an error status, for any records that SQL Server cannot process properly.
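That per-record error update might look like this (transid 12345 stands in for a record that failed processing; the flag values follow the pseudocode style used above):

UPDATE OPENQUERY(AS400_LINK, 'SELECT transid, flag FROM workfile')
SET flag = error
WHERE transid = 12345;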
Step 4 on DB2 could be
UPDATE transactions
SET flag = processed
WHERE transid IN (SELECT transid
FROM workfile
WHERE flag <> error
)
Hopefully, processing errors on SQL Server would be fairly rare. If that process updates the workfile only for errors, this should be faster than transmitting an update for each success. The UPDATE statement above should also run faster on DB2, since it is driven by the workfile on the same server, rather than by data being transmitted back up to DB2.

Backup Stored Procedures

Currently, if I want to make a backup of a stored procedure using Microsoft SQL Server Management Studio 2008 R2, I right-click on my stored procedure, choose Modify, change the ALTER PROC part to CREATE PROC, and add the word "backup" to the end of the procedure's name. Is there a better way to do this? In a perfect world, I would like to be able to back up all the stored procedures in a database and keep them somewhere locally. I don't like how my list of stored procedures is getting sloppy (for lack of a better word) with all these backups I have made. If you can't tell, I am extremely new to writing stored procedures and want to safeguard the existing ones from any mistakes I might make.
Thanks in advance for all your help!
There are multiple ways to keep backups of your stored procedures apart from your live database. Here are just a few:
When you backup your database, all the stored procedures are included in that backup. If you need to revert to an older version, you can restore to another database and script the procedure to a new editor tab or file or whatever. Hopefully, you have a live and test database anyway, so you could just go to your live database and script the stored procedure there rather than having to restore from backup.
You can script each version of your stored procedure to a separate file as you create it, appending a date to the file name. You can script all existing stored procedures by looking at the answer to this question, or with a catalog query like the one sketched after this list.
You can use a version control product. I'm not sure if I'm allowed to point you to one here, but just do a search on "SQL source control" and you will find a very good one in the search results.
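For the scripting route, the definition of every procedure in a database can be pulled from the catalog views. A minimal sketch:

SELECT o.name,
       m.definition   -- full CREATE PROCEDURE text, ready to save to a file
FROM sys.sql_modules AS m
JOIN sys.objects AS o
  ON o.object_id = m.object_id
WHERE o.type = 'P'    -- stored procedures only
ORDER BY o.name;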

BDE, Delphi, ODBC, SQL Native Client & Dead lock

We have some Delphi code that uses the BDE to access SQL Server 2008 through the SQL Server Native Client ODBC driver (2005 version). Our issue is that we're experiencing deadlocks in a loop doing inserts into multiple tables.
The whole loop is done within a [TDatabase].StartTransaction. Looking at SQL Server Profiler we clearly see that at one point during the loop the SPID (session ID) changes, and then we naturally end up with a deadlock (both SPIDs doing inserts into the same table).
It seems like the BDE at some point does a second connection to the DB...
(Although I would love to skip the BDE, it's currently not possible. )
Anyone with experiences to share?
In case your app is multithreaded: BDE is not threadsafe. You have to use a separate BDE session (explicitly created instance of TSession) for each thread; the global Session created automatically for the main thread is not sufficient. Also, all database access components (TDatabase, TQuery, etc.) can only be used in the context of the thread where their corresponding instance of TSession has been created.
Check in the ODBC configuration whether the SQL Server driver is set up to do connection pooling.
It appears that the Native Client installation activates it by default... (at least, my installation had connection pooling active and I never enabled it).
This probably comes too late for the asker, but maybe it helps others.
Every time there is a cursor that doesn't get closed, the BDE/ODBC combination will establish a new connection for successive queries. The "SPID change" is probably the result of a non-closed cursor.
To solve this problem you have to find the BDE component that caused this still-open cursor. Then you call a method that will eventually close the cursor (TTable.Close, TTable.Last, ...).
After that, the "SPID change" should be gone, and with it the deadlock.
Some tips to find that component:
During the lock, execute the following statement (for example using Management Studio):
EXEC sp_who2
Look in the BlkBy column. The blocked connection has a number in it.
This number is the SPID (Server Process ID) of the blocking connection.
Then execute DBCC INPUTBUFFER(spid).
In the EventInfo column you will find the SQL statement that was issued by your program.
With that information you should be able to find the BDE-component that causes your trouble.
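Put together, the diagnostic sequence is just two statements (123 stands in for whatever SPID appears in the BlkBy column):

EXEC sp_who2;            -- find the blocked row; BlkBy holds the blocker's SPID
DBCC INPUTBUFFER(123);   -- shows the last statement sent by that blocking SPID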
