ATOMIC INSERT in STORED PROCEDURE

I'm fairly new to stored procedures. I have to design a stored procedure for an ATOMIC (mass) insert. I'm using a COBOL program to call the stored procedure in DB2. I will store values in an array and have to insert them all in one shot. Below is the query we are using in the COBOL program, which I have to convert to a stored procedure.
EXEC SQL
    INSERT INTO TABLE_NAME
           (COLUMN1
           ,COLUMN2
           ,COLUMN3
           ,COLUMN4
           ,COLUMN5)
    VALUES (:VALUE1
           ,:VALUE2
           ,:VALUE3
           ,:VALUE4
           ,:VALUE5)
    FOR :WS-SUB ROWS
    ATOMIC
END-EXEC.
VALUE1, VALUE2, VALUE3, VALUE4, and VALUE5 are COBOL table (array) host variables, and WS-SUB is the number of occurrences.
I want to know whether I can handle an array in a stored procedure, or whether it is possible to do an ATOMIC insert in a DB2 stored procedure at all.
Thanks in advance.

According to the documentation for DB2 for z/OS 12.0.0:
A DB2 stored procedure may be defined with arrays as parameter types; see "Example of using arrays in an SQL procedure" in the documentation.
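For illustration, a minimal sketch of that array route, assuming a DB2 for z/OS level that supports array types and UNNEST inside native SQL procedures; the type, procedure, and table/column names here are only placeholders:

CREATE TYPE VALUE_ARRAY AS VARCHAR(50) ARRAY[1000];

CREATE PROCEDURE INSERT_MANY
   (IN P_COL1 VALUE_ARRAY
   ,IN P_COL2 VALUE_ARRAY)
LANGUAGE SQL
BEGIN
   -- UNNEST turns the arrays into a two-column table, so one INSERT
   -- handles the whole batch; add more array parameters for the
   -- remaining columns
   INSERT INTO TABLE_NAME (COLUMN1, COLUMN2)
      SELECT U.C1, U.C2
      FROM UNNEST(P_COL1, P_COL2) AS U(C1, C2);
END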
However, if your intention is to call this from COBOL, you may run into issues, as the documentation for "Supported SQL data types in COBOL embedded SQL applications" indicates:
Arrays are not supported by the COBOL precompiler
Another approach would be to pass your data as something like a CLOB or VARCHAR delimited by a character and then parse it within your stored procedure.
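For example, a rough sketch of that idea in SQL PL, assuming rows separated by ';' and columns by ',', only two columns, and no validation of the input; all names are placeholders:

CREATE PROCEDURE INSERT_DELIMITED (IN P_DATA VARCHAR(32000))
LANGUAGE SQL
BEGIN
   DECLARE V_REST VARCHAR(32000);
   DECLARE V_ROW  VARCHAR(200);
   DECLARE V_SEP  INTEGER;

   SET V_REST = P_DATA;

   -- peel one ';'-terminated row off the front on each pass
   WHILE V_REST IS NOT NULL AND LENGTH(V_REST) > 0 DO
      SET V_SEP = LOCATE(';', V_REST);
      IF V_SEP = 0 THEN
         SET V_ROW  = V_REST;
         SET V_REST = '';
      ELSE
         SET V_ROW  = SUBSTR(V_REST, 1, V_SEP - 1);
         SET V_REST = SUBSTR(V_REST, V_SEP + 1);
      END IF;

      -- split the row on ',' and insert it (extend for five columns)
      INSERT INTO TABLE_NAME (COLUMN1, COLUMN2)
         VALUES (SUBSTR(V_ROW, 1, LOCATE(',', V_ROW) - 1)
                ,SUBSTR(V_ROW, LOCATE(',', V_ROW) + 1));
   END WHILE;
END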
By default, DB2 stored procedures do not commit on return, so another option would be to iterate over your COBOL tables and call the stored procedure repeatedly; the inserts stay in the caller's unit of work either way.

Related

Can I have table variable in stored procedure in redshift?

We are planning to create a procedure for our logic in Redshift (using SQL Workbench), the way we would in PL/SQL.
Can we use a table variable to traverse the rows of a table, like a dataframe in Python?
Sort of. Redshift implements a RECORD data type for stored procedures. A variable of this type takes on the structure of whatever row is assigned to it, so you can step through an arbitrary set of rows with a loop (see the sketch below).
"Overview of stored procedures in Amazon Redshift" > "Record types"
However, note that you cannot currently SELECT from a RECORD-typed variable; you can only loop over the content.
There are several examples of using a RECORD variable in our GitHub repository: "amazon-redshift-utils/src/StoredProcedures/"
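For instance, a minimal sketch of that looping pattern; my_table and its columns are made up:

CREATE OR REPLACE PROCEDURE walk_rows()
AS $$
DECLARE
    rec RECORD;   -- takes its structure from whatever row is assigned to it
BEGIN
    FOR rec IN SELECT id, name FROM my_table LOOP
        RAISE INFO 'id = %, name = %', rec.id, rec.name;
    END LOOP;
END;
$$ LANGUAGE plpgsql;

CALL walk_rows();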

How to use temp Tables in a DB2 Function

I have to convert some MS SQL procedures and functions into DB2 procedures and functions.
Now I have the problem that I cannot use a temp table in a function with BEGIN ATOMIC.
Is there another way to use a temp table in DB2 functions?
CREATE OR REPLACE FUNCTION abc( )
RETURNS TABLE (test INTEGER)
LANGUAGE SQL
MODIFIES SQL DATA
NO EXTERNAL ACTION
NOT DETERMINISTIC
BEGIN atomic
DECLARE GLOBAL TEMPORARY TABLE SESSION.StringParts (indexNumber int, stringPart nvarchar(4000)) ON COMMIT DELETE ROWS NOT LOGGED WITH REPLACE;
END
Db2-LUW up to and including version 11.5 has limitations on the use of DGTTs (Declared Global Temporary Tables) in RETURNS TABLE functions, according to the documentation.
You cannot use BEGIN ATOMIC together with DECLARE GLOBAL TEMPORARY TABLE (that statement is not currently supported in compound SQL (inlined) blocks).
If you use MODIFIES SQL DATA, then compound SQL (compiled) has limitations for table functions; it does work for scalar functions.
It may be better to use SQL PL stored procedures instead of table functions in this case (see the sketch below).
You can also circumvent some of the limitations by using pipelined functions, and created global temporary tables (CGTTs) can help in some cases.
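For example, a rough sketch of the stored-procedure route; the names are illustrative, and it relies on the usual Db2-LUW pattern of declaring the cursor in a nested block after the DGTT declaration:

CREATE OR REPLACE PROCEDURE abc_proc ()
LANGUAGE SQL
DYNAMIC RESULT SETS 1
BEGIN
   -- allowed here because the procedure body is compound SQL (compiled),
   -- not BEGIN ATOMIC
   DECLARE GLOBAL TEMPORARY TABLE SESSION.StringParts
      (indexNumber INT, stringPart VARCHAR(4000))
      ON COMMIT DELETE ROWS NOT LOGGED WITH REPLACE;

   -- ... populate SESSION.StringParts here ...

   BEGIN
      -- the cursor lives in a nested block so it can refer to the
      -- temporary table declared above
      DECLARE c1 CURSOR WITH RETURN TO CALLER FOR
         SELECT indexNumber AS test FROM SESSION.StringParts;
      OPEN c1;
   END;
END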

Insert array of JSON objects in db2 stored procedure

I'm trying to pass an array of JSON objects to the stored procedure, so that each JSON object is stored in a single row of the table.
For example, to insert a single entry:
INSERT INTO SCHEMA_NAME.TABLE_NAME (id, name) VALUES (1, 'my_name')
Now I have an array of such data that I want to pass to the stored procedure, so that all of it is inserted into the table without hitting the SP again and again.
Array example: [{id:1,name:'name1'},{id:2,name:'name2'}]
The number of objects to be stored can vary.
So I could either hit the SP again and again to push each object individually, or pass the whole array and run a loop inside the stored procedure to get it done.
I have DB2 10.5 LUW with fix pack 7 installed and am using Node.js.
DB2 10.5 does not support JSON as a native data type, so you won't be able to do this without passing the JSON as a VARCHAR and having the stored procedure parse/deconstruct the JSON. The obvious way to do this would be with a C or Java stored procedure.
That said, there are some JSON capabilities (see this series of articles for more details), but this is almost certainly not what you want.
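As a rough sketch of that parse-it-yourself route using only SQL PL string functions (rather than C or Java): it assumes the caller sends well-formed JSON such as [{"id":1,"name":"name1"},{"id":2,"name":"name2"}], does no validation or error handling, and all names are placeholders:

CREATE OR REPLACE PROCEDURE MYSCHEMA.INSERT_FROM_JSON (IN P_JSON CLOB(1M))
LANGUAGE SQL
BEGIN
   DECLARE V_POS   INTEGER DEFAULT 1;
   DECLARE V_START INTEGER;
   DECLARE V_END   INTEGER;
   DECLARE V_FLD   INTEGER;
   DECLARE V_OBJ   VARCHAR(4000);
   DECLARE V_ID    INTEGER;
   DECLARE V_NAME  VARCHAR(255);

   -- walk the document one {...} object at a time
   obj_loop: LOOP
      SET V_START = LOCATE('{', P_JSON, V_POS);
      IF V_START = 0 THEN
         LEAVE obj_loop;
      END IF;
      SET V_END = LOCATE('}', P_JSON, V_START);
      SET V_OBJ = VARCHAR(SUBSTR(P_JSON, V_START, V_END - V_START + 1), 4000);

      -- naive extraction of "id":<digits> and "name":"<text>"
      SET V_FLD  = LOCATE('"id":', V_OBJ) + 5;
      SET V_ID   = INTEGER(SUBSTR(V_OBJ, V_FLD, LOCATE(',', V_OBJ, V_FLD) - V_FLD));
      SET V_FLD  = LOCATE('"name":"', V_OBJ) + 8;
      SET V_NAME = SUBSTR(V_OBJ, V_FLD, LOCATE('"', V_OBJ, V_FLD) - V_FLD);

      INSERT INTO SCHEMA_NAME.TABLE_NAME (id, name) VALUES (V_ID, V_NAME);

      SET V_POS = V_END + 1;
   END LOOP obj_loop;
END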

What order are result sets returned when using JDBC callable stmt to stored proc?

(1) When you open multiple cursors in a stored procedure and then use a JDBC callable statement to iterate through the result sets, each in turn, is the order in which they are returned the same as the order in which the cursors are opened in the stored procedure? Or the reverse? Or something else?
(2) Is there a way to specify by sequence number or name which result set to process first?
For 1: The order of returned result sets is undefined for JDBC, so it will depend on your actual database system. That said, it would be highly illogical for a stored procedure to return result sets in a different order than the order in which they are produced by the stored procedure.
For 2: Once again, this is not defined by JDBC. However, I haven't heard of a database system that would allow you to control the order of returned result sets by any means other than their order in the stored procedure.
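For illustration, here is the stored-procedure side in Db2 SQL PL (table names are hypothetical): the cursors are opened in a fixed sequence, which is the order a driver will normally hand the result sets back as the client steps through them with getMoreResults().

CREATE OR REPLACE PROCEDURE TWO_RESULT_SETS ()
LANGUAGE SQL
DYNAMIC RESULT SETS 2
BEGIN
   -- both cursors are left open and flow back to the client
   DECLARE C_FIRST CURSOR WITH RETURN TO CLIENT FOR
      SELECT * FROM ORDERS_SUMMARY;      -- hypothetical table
   DECLARE C_SECOND CURSOR WITH RETURN TO CLIENT FOR
      SELECT * FROM ORDERS_DETAIL;       -- hypothetical table

   OPEN C_FIRST;    -- opened first: normally the first result set seen
   OPEN C_SECOND;   -- opened second: reached via getMoreResults()
END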

Interbase 6 and stored procedure array parameters

Does the InterBase 6 database support passing an array parameter to a stored procedure? If yes, what is the syntax for declaring such a variable and using it in the body of the procedure?
It looks like even the current version of IB does not support array parameters.
The documentation includes an example of processing array data within the stored procedure to flatten it out, then passing several flat records out of the stored procedure. Of course you could do all this in IB 6 as well.
This example is in the Data Definition manual for XE. See Using stored procedures.
The Interbase documentation on this topic may be useful.
