Sending JSON and parsing it in DB2 - stored-procedures

I have researched this and couldn't find a solution. Is it possible to send JSON as input to a DB2 stored procedure? If so, can you point me to an example?

Here is an example that I found. It says that
DB2 JSON is a driver-based solution available with DB2 for Linux, UNIX, and Windows 10.5 and with the IBM DB2 Accessories Suite V3.1 to work with DB2 on z/OS V10. It facilitates using DB2 as a JSON document store
and gives steps on how to prepare the environment:
http://www.ibm.com/developerworks/data/library/techarticle/dm-1306nosqlforjson4/
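To answer the question directly: yes. On DB2 LUW 10.5 you can pass the JSON document to a stored procedure as an ordinary CLOB parameter and pick it apart with the SYSTOOLS JSON functions that ship with that release. A minimal sketch, assuming SYSTOOLS.JSON2BSON and SYSTOOLS.JSON_VAL are available; the procedure, table, and field names are made up:
CREATE OR REPLACE PROCEDURE SAVE_CUSTOMER (IN P_JSON CLOB(64K))
LANGUAGE SQL
BEGIN
  -- Convert the JSON text to BSON, then pull out the "name" field;
  -- 's:64' asks JSON_VAL for a string of at most 64 characters.
  INSERT INTO CUSTOMERS (NAME)
    VALUES (SYSTOOLS.JSON_VAL(SYSTOOLS.JSON2BSON(P_JSON), 'name', 's:64'));
END
You would then call it like CALL SAVE_CUSTOMER('{"name": "Jane"}') from JDBC or the CLP.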

Related

Importing a SQL Server 2017 database to Neo4j

I am trying to find a way to import a database from SQL Server to Neo4j. I searched for an existing way but didn't find what I want. My case is: the database contains many tables, and I want to bring them all into Neo4j in one click (if that's possible), either using a tool or by writing a .NET program.
There is no 'one-click' solution, but you said you're happy to write some code, so here we go!
Neo4j can import via JDBC, via a set of procedures called APOC (https://neo4j-contrib.github.io/neo4j-apoc-procedures/#_load_jdbc). To do this, you'll need the SQL Server JDBC driver (https://learn.microsoft.com/en-us/sql/connect/jdbc/microsoft-jdbc-driver-for-sql-server?view=sql-server-2017).
And here (https://tbgraph.wordpress.com/2017/07/24/neo4j-apoc-load-jdbc/) is a blog post describing the process, so that should get you up and running!
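To make that concrete, here is a hedged sketch of an apoc.load.jdbc call, assuming APOC is installed and the SQL Server JDBC driver jar is in Neo4j's plugins directory; the connection string, table, and labels are all hypothetical:
CALL apoc.load.jdbc(
  'jdbc:sqlserver://localhost:1433;databaseName=MyDb;user=me;password=secret',
  'SELECT Id, Name FROM dbo.Customers'
) YIELD row
// Each SQL row arrives as a map; turn it into a node.
MERGE (c:Customer {id: row.Id})
SET c.name = row.Name;
You would run one statement like this per table (or generate them from your .NET program), since each call imports a single query's result set.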

I have a parameterized stored procedure on an AS400 and cannot get results in SSRS

I have a working stored procedure that returns results in Microsoft Query and iSeries Navigator. When I call the same stored procedure from Microsoft Report Builder 3.0, I either get no results or an error saying one of the temporary files used in the procedure is in use.
Is there something special that needs to be done using Report Builder?
I am using an ODBC connection to the AS400, if that's relevant.
Thanks.
Recommendation: take a different approach. Host your stored procedure on your SSRS Reports SQL Server, and debug it there with SSMS. Once that is working, you will have no issue getting the data into SSRS.
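One hedged way to put that advice into practice, assuming a linked server to the AS400 (here called AS400LINK, with RPC Out enabled; the library, procedure, and parameter names are hypothetical): wrap the remote call in a local SQL Server procedure and point the Report Builder dataset at that instead.
CREATE PROCEDURE dbo.rpt_GetAs400Data
    @AccountId INT
AS
BEGIN
    SET NOCOUNT ON;
    -- Pass-through call to the iSeries over the linked server;
    -- the remote procedure returns the report's result set.
    EXEC ('CALL MYLIB.MYPROC(?)', @AccountId) AT AS400LINK;
END
Debugging then happens entirely in SSMS, and SSRS only ever sees a plain SQL Server stored procedure.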

SSIS Data flow failing - calling a Sybase ASE stored procedure using temp tables

I have been battling with an SSIS package (2012), which has a series of Data Flow Tasks that run in sequence.
In particular, one Data Flow Task is failing at the data source stage. I am calling a Sybase ASE (15) stored procedure, using the OLE DB driver, via a "Command Via Variable" call.
What is happening is that the stored procedure fails at the end of the pre-execute phase, with an "Unable To Retrieve Column Information From The Data Source" error message.
The failure only occurs when I run this from the SQL Server (2014) that I have deployed the process to. It appears to run fine in my dev environment (Visual Studio 2012).
The only clue I have at the moment points to the ASE stored proc. I commented out a whole bunch of updates and other internal temp-table populates and it all works fine. When I start adding them back, the process fails again; there appears to be no rhyme or reason to it.
Has anyone come across this issue before? Is there any setting in SSIS, SQL Server, or ASE that might assist?
Thanks
As per my comment above, the issue came down to a driver incompatibility between my dev environment and the SQL Server that I deployed the SSIS project to.
In short, my dev environment was using the ASE 15.7.0.1260 suite of drivers, while the SQL Server I was deploying to had the ASE 15.7.0.501 suite installed.
Once the SQL Server's drivers were updated, everything has worked faultlessly.

Get version of Service in WMI

I installed a Windows service using an InstallShield-generated MSI file.
How can I get the version of the service using a WMI query? I already checked Win32_Service, but there's no version number there. Is this information queryable via WMI, or do I have to use the registry for this scenario?
Okay, got it. The information is stored in Win32_Product.
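For illustration, a WQL query along these lines should return it (the product name is hypothetical, and the usual caveats apply: Win32_Product only covers MSI-installed products, and enumerating it can be slow because it triggers a Windows Installer consistency check):
SELECT Name, Version FROM Win32_Product WHERE Name LIKE '%MyService%'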

Problems with the character set when using DB2 SYSPROC.ADMIN_CMD for database import

I am running a Java application that transfers the files I need to import to the server DB2 runs on. The Java application then opens a JDBC connection to the database and runs:
CALL SYSPROC.ADMIN_CMD('import from <filename> of del modified by decpt, coldel; messages on server insert into <view>')
The problem I have seems somehow connected to the character set of either the database or of the user DB2 uses to import the files (via the ADMIN_CMD stored procedure). That problem is:
Umlauts (ä, ö, ü) get mangled by this import. I had this sort of problem in the past, and the solution was always to set the LC_CTYPE of the user importing the data to de_DE.iso88591.
What I already ruled out as the source of the problem:
- The file transfer to the database server (umlauts are still OK after that)
- The JDBC connection (I simply inserted a line through an SQL command instead of reading from a file)
The thing is, I don't know what user DB2 uses to import files through ADMIN_CMD. I also don't believe it could somehow be connected to the DB2 settings, since every other way of inserting or loading data into it works fine.
And yes, I need to use ADMIN_CMD. The DB2 command line tool is a performance nightmare.
The best approach (for sanity):
- Create all databases as UTF-8
- Make sure all operating system locales are UTF-8
- Get rid of all applications that don't handle their data as UTF-8
- Slaughter every developer and vendor not adhering to UTF-8. Rinse and repeat until 100% completed.
DB2 indeed attempts to be smart and convert your input data for you (the import command basically pipes your data into insert clauses, which always get handled like that). The link I gave will outline the basic principle and give you a few commands to try out. There is also an official explanation of a similar case. According to it, you could try setting the registry variable db2codepage to match your delimited data files, and that should help. The IXF export format might also work better, since IXF files carry encoding information in every file.
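As a sketch of that suggestion (1252 is just an example value for Windows-1252 delimited files; use whatever code page your files are actually in), the variable is set with db2set and picked up by new sessions:
db2set db2codepage=1252
db2 terminate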
Thanks for your response.
I finally fixed the issue by adding
MODIFIED BY CODEPAGE=1252
to my JDBC ADMIN_CMD import command. This seems to override whatever codepage settings DB2 was using before. It also appears the default codepage of the database didn't matter here, since it is already set to 1252. The only explanation I can think of right now is some Linux setting DB2 uses when importing through ADMIN_CMD.
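For reference, the full command with the extra modifier would look like this (placeholders kept from the original):
CALL SYSPROC.ADMIN_CMD('import from <filename> of del modified by decpt, coldel; codepage=1252 messages on server insert into <view>')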
