I am working with WebSphere MQ FTE. Is it possible to specify z/OS properties for the data set? Consider a transfer from UNIX to the mainframe.
Absolutely! Please see Transferring files and data sets between z/OS and distributed systems in the WMQ FTE Infocenter for details.
I have previous experience in using Ethereum and Solidity, but now I want to try writing smart contracts for Hyperledger.
I have a few considerations:
First one is regarding supported databases. According to their documentation (http://hyperledger-fabric.readthedocs.io/en/latest/ledger.html) they use LevelDB for storing contract data and CouchDB support is still in beta. Does anyone have any experience using CouchDB in Hyperledger?
Second, I see that Go is mostly used for the specification of smart contracts, but they have support for Java too. Is Java still in beta too, and is there support for any other programming language?
Also, what operating system do you suggest for production server running Hyperledger?
Thank you for the answers.
Is Java still in beta too, and is there support for any other programming language?
Hyperledger V1.0 doesn't support Java Chaincode.
There will be support for it in the future.
You can ask around in https://chat.hyperledger.org/channel/fabric for ETAs.
There is also work in progress on Node.js support.
Also, what operating system do you suggest for a production server running Hyperledger?
Ubuntu 16.04 LTS works well
Does anyone have any experience using CouchDB in Hyperledger?
Yes, CouchDB works well if your data is modeled as JSON and you would like to query the content of the data. The default goleveldb state database only supports key-based queries.
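To illustrate the difference: with CouchDB as the state database, chaincode can issue rich content-based queries expressed as CouchDB (Mango) JSON selectors, which a key-based store like goleveldb cannot answer. The sketch below only shows the shape of such a selector in Python; in real Fabric chaincode the selector string would be passed to the chaincode query API, and the field names `assetType` and `owner` are hypothetical example fields.

```python
# A minimal sketch of the kind of rich query CouchDB enables in Fabric.
# LevelDB supports only key-based lookups; CouchDB accepts a Mango-style
# JSON selector over the content of the stored JSON documents.
# "assetType" and "owner" are hypothetical example fields.
import json

def owner_query(owner):
    """Build a CouchDB (Mango) selector for all assets owned by `owner`."""
    selector = {"selector": {"assetType": "marble", "owner": owner}}
    return json.dumps(selector)

print(owner_query("tom"))
# → {"selector": {"assetType": "marble", "owner": "tom"}}
```

The resulting string is what a content query looks like on the wire; with LevelDB the only equivalent would be iterating a key range and filtering client-side.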
You should take a look at Hyperledger Composer, which helps you create blockchain applications on Hyperledger Fabric quite easily. (It works better with Ubuntu.)
It has its own modeling language.
Hyperledger was initially built using the Go language.
The aim of the Hyperledger team is to support as many languages as possible. Currently Hyperledger Composer (a tool for developing blockchain applications) supports JavaScript for defining assets, transactions, and chaincode.
The transaction log and state data are backed by LevelDB and CouchDB.
Note: LevelDB and CouchDB are fully integrated into the Fabric framework; currently you cannot replace them with another database.
Chaincode runs in a secured Docker container. Chaincode (aka a smart contract) can be programmed in Go, Node, or Java; currently Go is the stable and fully supported language.
Regarding the operating system: I have tested a Fabric network running on the Microsoft Azure platform, where I created an Ubuntu 16.04 image and installed the Fabric framework. So far I have had no issues with it.
I have a distributed monitoring system that collects monitoring data such as CPU utilization, database performance metrics, and network performance into a backend store. Other applications need to consume this data: real-time calculation (for a resource scheduler), system monitoring (a dashboard for system administrators), and historical analytics (operations and analysis programs that model resource usage patterns for future capacity planning and business-activity analysis).
The dataset size is about 1.2 billion entries in the data store over 9 months (all in an OpenTSDB-like format).
Previously I used an Elasticsearch cluster as the backend data store and have decided to find a better solution.
I am looking at a Couchbase or VoltDB cluster, but I am still in the investigation stage, so I need input from anyone with similar experience.
Major questions are as below:
Which backend store solution is good for my scenario: Couchbase or VoltDB?
I would have to rewrite my data aggregator code (which is in Go). Couchbase provides a good Go SDK client, but VoltDB's Go driver is only a community project with limited functionality. Are there any better ways to communicate with VoltDB from Go?
Any suggestion or best practice on it?
There isn't too much in the way of usage patterns here, but it sounds like the kind of app people use VoltDB for.
As for the Golang client, we'd love some feedback as to how to make it more suitable if it's specifically missing something you need. You can also use the HTTP/JSON query interface from any language, including Golang. More info on that here:
http://docs.voltdb.com/UsingVoltDB/ProgLangJson.php
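As a hedged sketch of what a call through that HTTP/JSON interface looks like (shown here in Python for brevity, but the same request works from Go or any HTTP client): the `/api/1.0/` endpoint takes a `Procedure` name and a JSON-encoded `Parameters` array, per the VoltDB documentation linked above. The host and procedure name below are placeholders.

```python
# Sketch of a request to VoltDB's HTTP/JSON interface. Any language with
# an HTTP client can use this, which sidesteps the limited Go driver.
# "db.example.com" and "MyProc" are placeholder values.
import json
from urllib.parse import urlencode

def build_voltdb_query(host, procedure, parameters, port=8080):
    """Build the URL for invoking a stored procedure via the JSON API."""
    params = {
        "Procedure": procedure,
        "Parameters": json.dumps(parameters),  # sent as a JSON array string
    }
    return f"http://{host}:{port}/api/1.0/?{urlencode(params)}"

url = build_voltdb_query("db.example.com", "MyProc", ["cpu.util", 1500000000])
# The URL can then be fetched with urllib, requests, or Go's net/http.
```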
If you would like to leverage your existing model, take a look at Axibase Time-Series Database. It supports both the tcollector network protocol and HTTP. A rule engine and visualization are built in.
The fact that ATSD is based on HBase may be an asset or a liability depending on your prior experience with it :)
URL to tcollector integration: http://axibase.com/products/axibase-time-series-database/writing-data/tcollector/
Disclosure: I work for the company developing ATSD.
I'm trying to work out how these two pieces of the jigsaw interact and fit together when connecting to an MS-SQL server on linux.
As I understand it, FreeTDS is a protocol (i.e. a set of rules) for talking to MS-SQL, and it is the thing that actually does the talking. unixODBC is a driver that implements the ODBC API, i.e. implements a set of functions, I guess.
Why are both things necessary? Can anyone elaborate on my sketchy understanding of what these two things actually do?
unixODBC is a 'DriverManager' for ODBC. You can use unixODBC when on a Linux or *nix system to connect to any ODBC-capable database. Doing so means that you can write one lot of database queries which you should be able to use between different databases. If you were not on Unix, you would use a different Driver Manager, for example the built-in MS Office one.
To make all the components clear: if you're using a language, let's say Python, to connect to SQL Server, your connection might pass from Python's pyodbc (translates python objects to and from unixODBC), to unixODBC (manages drivers, such as FreeTDS), to FreeTDS (translates unixODBC objects to and from the TDS protocol, which Microsoft embraces) to SQL Server.
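That chain can be sketched in code. Building the ODBC connection string is shown below; actually connecting requires a running SQL Server and a FreeTDS driver entry registered with unixODBC, so the server, database, and credentials are placeholders. The driver name `FreeTDS` is an assumption that must match whatever name is configured in `odbcinst.ini` on your system.

```python
# Sketch of the chain: Python -> pyodbc -> unixODBC -> FreeTDS -> SQL Server.
# All values below are placeholders; "FreeTDS" must match the driver name
# registered with unixODBC in odbcinst.ini.
def build_conn_str(server, database, user, password, port=1433):
    """Assemble an ODBC connection string for the FreeTDS driver."""
    parts = {
        "DRIVER": "{FreeTDS}",   # tells unixODBC which driver to load
        "SERVER": server,
        "PORT": str(port),
        "DATABASE": database,
        "UID": user,
        "PWD": password,
        "TDS_Version": "7.3",    # TDS protocol version FreeTDS should speak
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn_str = build_conn_str("sqlhost", "mydb", "sa", "secret")
# With a configured system, the actual connection would be:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)  # unixODBC loads FreeTDS, which talks TDS
```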
The unixODBC website http://www.unixodbc.org/ says:
An ODBC application makes ODBC calls to the DriverManager. The DriverManager carries out a number of tasks for the application such as:
ensuring the proper driver is loaded/unloaded
validation tasks
3.5 to 3.0 to 2.0 call and data mapping
Most calls to the DriverManager get passed onto the loaded Driver to be further processed but that is of little concern to the application.
Some advantages to using an ODBC DriverManager include:
portable data access code
runtime binding to a Data Source
ability to easily change the Data Source
Briefly, it is the Driver Manager which reads your DSN, looks at the configured data sources, and decides where and how to connect.
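As a concrete (hypothetical) illustration of what the Driver Manager reads: a DSN could be defined in `odbc.ini`, with `odbcinst.ini` mapping the driver name to the FreeTDS shared library. All names, hosts, and paths below are examples and vary by system and distribution.

```ini
; /etc/odbc.ini -- the DSN the application asks for (values are examples)
[mydsn]
Driver   = FreeTDS
Server   = sqlhost
Port     = 1433
Database = mydb

; /etc/odbcinst.ini -- maps the driver name to the FreeTDS library
; (the library path varies by distribution)
[FreeTDS]
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
```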
Depending on which database you use, you will need a different driver. This piece of code 'translates' your requests made using ODBC to the right protocol for the relevant database management system. This is the component that would need to be different for different data sources. In your case, TDS is the protocol used by MS SQL Server. FreeTDS is a free software implementation of this protocol.
See also Wikipedia https://en.wikipedia.org/wiki/Open_Database_Connectivity (emphasis kept):
ODBC accomplishes DBMS independence by using an ODBC driver as a translation layer between the application and the DBMS. The application uses ODBC functions through an ODBC driver manager with which it is linked, and the driver passes the query to the DBMS. An ODBC driver can be thought of as analogous to a printer driver or other driver, providing a standard set of functions for the application to use, and implementing DBMS-specific functionality. An application that can use ODBC is referred to as "ODBC-compliant". Any ODBC-compliant application can access any DBMS for which a driver is installed. Drivers exist for all major DBMSs, many other data sources like address book systems and Microsoft Excel, and even for text or CSV files.
FreeTDS is an implementation of the TDS protocol; it handles everything and is fully capable of operating without unixODBC.
ODBC is a wrapper around FreeTDS and provides a common API that is better documented than FreeTDS' internal API.
As FreeTDS.org/faq.html indicates:
FreeTDS offers three client libraries and one internal one (libtds). We generally encourage people to use one of the client libraries, and discourage writing to libtds, because the latter is evolving, more subject to change, less well documented, and harder to use.
Both pieces aren't necessary, it's just that some people prefer to use wrappers rather than learn a new API to do something similar to what they already know in a different API. As FreeTDS' FAQ indicates, they support other open APIs other than ODBC and their internal library is fully capable of handling the connection all on its own.
I recently started using the IBM DOORS program, and I also started writing scripts for it in DXL. However, when I checked the Eclipse main page, I realized that a tool called MDAccess for DOORS exists. My question is: is it possible to write code in Java for DOORS, and if so, what are the disadvantages compared to DXL?
Yes, it is possible to write Java code for DOORS. You already found the solution: MDAccess is a commercial product provided by Sodius. According to the product specs and some marketing presentation it provides access to a DOORS server using the Java programming language.
Sodius sent me this information on personal request, indicating a disadvantage which might concern you:
Our Java layer is done to manipulate DOORS data, meaning read/write DOORS data. You will not find Java wrappers of DXL functions that interact with the DOORS UI, for example.
Note we are able to execute DXL code through the Java layer, so you can always use this means to achieve DXL-based operations.
However, it is not exactly cheap.
This might interest you, as it is using available APIs and does not rely on additional commercial products:
https://www.ibm.com/developerworks/rational/library/oslc-services-rational-doors/index.html
Disclaimer:
I work for IBM DOORS product support, however all information posted here is done so in a personal, private capacity and does not necessarily reflect the position of my employer. Therefore what I post does not constitute an official statement by IBM nor is any endorsement of the information by IBM to be assumed, implied or otherwise.
For your own benefit, please do not contact me in a business capacity on this platform, but use the official IBM Support community web portal. Thank you! :-)
In the sample Delphi web service tutorials I've read, they tend to build a web service that returns a simple string or integer, e.g.
http://blogs.codegear.com/pawelglowacki/2008/12/18/38624
However, I read it's possible in .NET to build a web service that returns a DataSet or even an object. Is this possible in Delphi 2009, and if so where can I find more info on this?
Also, what's your opinion on tools for building a web service, between Delphi and .NET?
Delphi 2009 has full support for returning classes and datasets from web services.
DataSnap is the framework that supports datasets. You can find more information in this video by Nick Hodges, Delphi Product Manager, this whitepaper by Marco Cantù (extracted from his book “Delphi 2009 Handbook”), and this code by Bruno Lichot.
Yes, it is possible to do so; you can do it by using Delphi's DataSnap with SoapConnection.
DataSnap is Delphi's multi-tier solution. In Delphi 2009 it saw a major update and was renamed DataSnap 2009, but DataSnap 2009 doesn't yet support some features of the old DataSnap, such as using web services for communication. The good news is that the older DataSnap is still available in Delphi 2009, so you are not forced to use DataSnap 2009.
OK, please let me know if the following is correct:
DataSnap 2009 can transmit datasets, but does not support communication via web services, i.e. both the client and server must be written in Delphi 2009?
Plain old DataSnap can transmit datasets using web services, but requires COM?
So the next question would be: what would be used to consume a dataset returned by a web service?
As far as I know, .NET and Java clients can access a DataSnap 2009 server too.
Old DataSnap doesn't use COM for SoapConnection, but COM is used for other kinds of connections (e.g. socket connections).
So the next question would be: what would be used to consume a dataset returned by a web service?
On the server side, you can put DatasetProviders which connect to the dataset controls, and provide datasets for clients.
On the client side, ClientDataset is used to receive data provided by DatasetProviders on the server.