I'm just starting out on Bluemix. I've created a small Rails 4 app and tested it locally using SQLite. As DB2 is the default on Bluemix, I opted for it when I set up the app.
When I added the ibm_db gem to my Gemfile, ready to deploy to Bluemix, I got the following error from Bundler:
Environment variable IBM_DB_HOME is not set. Set it to your DB2/IBM_Data_Server_Driver installation directory and retry gem install.
I don't have DB2 installed on my dev machine, as I won't use it for anything else. I normally use Postgres, but that's not natively supported on Bluemix; it's offered through a third-party provider, which I don't want to get into. I'm not willing to install DB2 just to deploy to Bluemix, so I'm hoping there's another way?
Thanks.
In order to make a connection to DB2 from your local machine, you'll also need an IBM DB2 driver installed and your IBM_DB_HOME environment variable set to the path where you installed the driver package (e.g. /home/db2inst1/sqllib).
The ibm_db gem should then be able to find the necessary binaries in order to make a connection to the database.
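For example, on Linux with the driver installed under /home/db2inst1/sqllib (the path above; adjust it to wherever your driver actually lives), something like this should let the gem build:
export IBM_DB_HOME=/home/db2inst1/sqllib
gem install ibm_db
At run time you may also need the driver's lib directory on LD_LIBRARY_PATH (e.g. export LD_LIBRARY_PATH=$IBM_DB_HOME/lib:$LD_LIBRARY_PATH), depending on how the driver was packaged.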
You can obtain a driver here: https://www-304.ibm.com/support/docview.wss?uid=swg21418043
This developerWorks article will also be of help, as you could use DB2 Express-C to test things out locally: http://www.ibm.com/developerworks/data/library/techarticle/dm-0705chun/
DB2 is not the default on Bluemix. You can select from a range of database services offered on Bluemix, and you need to bind your app to the database service you choose. For DB2, you can opt for the SQLDB service; on Bluemix, DB2 is offered as SQLDB.
You get the "Environment variable IBM_DB_HOME is not set" error while pushing an app to Bluemix if the DB2 client driver is missing during deployment. Bluemix automatically downloads the DB2 client driver if your gem version is 2.5.18 or later. It seems you are using an older version of the gem, hence the error. Update to the latest version of the gem and try again; it should work.
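For example, you could pin the gem in your Gemfile to the minimum version mentioned above (the constraint is illustrative; any later release should also work) and re-run bundle install before pushing:
gem 'ibm_db', '>= 2.5.18'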
You can choose from a wide range of database service offerings (MySQL, SQLDB, MongoDB, ElephantSQL) from the Bluemix dashboard. There is no default database or service defined in Bluemix.
I think you are likely on a Mac, Darwin 14 more specifically, for which there isn't yet a prebuilt ibm_db binary. You can follow this developerWorks thread for updates.
DB2 is offered as SQLDB in the Bluemix dashboard, and it is not the default. In fact, Bluemix doesn't have any boilerplate that is the default.
Now, looking at the error snippet you provided, "Environment variable IBM_DB_HOME is not set" means that the DB2 client driver for your app is missing.
A possible reason is that you are using an older version of the gem; try updating to the latest version, which should fix this error.
Hope it helps!
Related
Can I have two versions of the Neo4j Community Server database (V3.5 and V4.0.1) installed in macOS? I'd like to continue working with the V3.5 database for my current project, begin testing V4.0.1, migrate the current database to V4.0.1, and use V4.0.1 for my new project. Has anyone tried this?
Perhaps a good strategy would be to use separate Docker containers.
If you want to use the same data set, you might try grabbing the Neo4j 3.5 container from Docker Hub and following the migration path (https://neo4j.com/docs/operations-manual/current/upgrade/) to test out the actual migration and continue working with your data from 3.5.
You could also, of course, grab the Neo4j 4.0 container from Docker Hub and start playing around with new data if you just want to test out features.
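For example (the image tags and container ports are the standard ones for the official Neo4j image; the host ports and volume paths are just placeholders), you could run both versions side by side like this:
docker run -d --name neo4j35 -p 7474:7474 -p 7687:7687 -v $HOME/neo4j35/data:/data neo4j:3.5
docker run -d --name neo4j40 -p 7475:7474 -p 7688:7687 -v $HOME/neo4j40/data:/data neo4j:4.0
Each container keeps its data in its own volume, so the two installations can't interfere with each other.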
I am trying to get the Paradise Papers dataset running on an AWS cloud server. I installed Neo4j Desktop on my Mac and commissioned a server by deploying the AMI from https://aws.amazon.com/marketplace/pp/B071P26C9D.
I then copied the data from the Desktop install to the cloud install, updated the neo4j.template file to point to the new directory, and restarted the service.
The problem is that I can no longer connect to the server. In the log files I can see the following lines (along with a lot of diagnostic information).
2018-09-30 07:41:59.920+0000 INFO [o.n.k.i.f.CommunityEditionModule] No locking implementation specified, defaulting to 'community'
2018-09-30 07:42:00.104+0000 INFO [o.n.k.AvailabilityGuard] Requirement makes database unavailable: Database available
I suspect that there may be some kind of licensing or version constraint that is preventing the database from running. Could this be possible? Or is it something else?
Your issue is not related to a licensing or version problem.
In your debug.log, I don't see any message that can explain your issue.
Is there something useful in the neo4j.log file?
Can you try to start Neo4j after removing all your plugins (APOC)?
Can you try to start Neo4j on an empty database?
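For the empty-database test, something along these lines should work, assuming the AMI uses the standard package layout (the service name and paths are assumptions and may differ on your image):
sudo systemctl stop neo4j
sudo mv /var/lib/neo4j/data/databases/graph.db /var/lib/neo4j/data/databases/graph.db.bak
sudo systemctl start neo4j
If Neo4j comes up cleanly with an empty store, the copied data is the likely culprit.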
Cheers
When I do:
sudo docker version
I get this error:
Error response from daemon: client is newer than server (client API version: 1.24, server API version: 1.21)
Can anyone help me understand what I have to do?
Try setting the version using the command:
export DOCKER_API_VERSION=1.23
It worked perfectly fine for me and resolved the issue.
Docker runs on a client/server model, and each Docker Engine release has a specific API version.
The mapping between Docker Engine releases and API versions is documented here:
https://docs.docker.com/engine/api/v1.26/#section/Versioning
According to that table, Docker API v1.24 is used by Docker Engine 1.12.x and Docker API v1.21 by Docker Engine 1.9.x. The server needs an API version equal to or later than the client's.
You have the following three options.
Upgrade the server side to Docker Engine 1.12.x or later.
Downgrade the client side to Engine 1.9.x or lower.
Downgrade the API version used at run time by exporting DOCKER_API_VERSION=1.21 as an environment variable on the client side (see the example below).
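For the third option, assuming a bash shell, a session-only override plus an optional persistent one would look like this:
export DOCKER_API_VERSION=1.21
echo 'export DOCKER_API_VERSION=1.21' >> ~/.bashrc
Note that if you invoke Docker through sudo, the variable may not be passed along by default; sudo -E docker version preserves your environment.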
The other answers don't really explain how to do this on a Windows machine. I had no access to the GUI, so I had to get it done from the CLI.
I know this is old, but I fumbled with this for a while until I finally figured it out, so I hope this helps someone.
Windows Users
For people who are on Windows, you can set your environment variables by going to Advanced System Settings.
If you need to do it via the command line, this is what worked for me:
setx /M DOCKER_API_VERSION "1.23"
Additionally, you can set a permanent host location and then run your commands without the -H option by using the following:
setx /M DOCKER_HOST "192.168.207.131:2375"
NOTE: after you set the variables you must close the command line and open a new one for the changes to take effect.
NOTE 2: if changes are being made to a remote system, you need to log out and log back in for the changes to take effect.
I have successfully installed WebLogic Server 12.2.1 and created a domain using Docker Toolbox, but when I tried to create a generic data source for MS SQL Server, I got the following error:
Cannot load driver: weblogic.jdbc.sqlserver.SQLServerDriver
Please advise how to add this driver to WebLogic Server.
Thanks in advance.
I encountered the same problem when I installed the WebLogic Quick Installer for Developers, which is about 209 MB.
After some research, I found the reason here: https://docs.oracle.com/cd/E15523_01/web.1111/e13753/usedriver.htm#JDBCD111
The WebLogic Type 4 JDBC drivers are installed by default when you perform a complete installation of WebLogic Server. If you choose a custom installation, ensure that the WebLogic JDBC Drivers option is selected (checked). If this option is unchecked, the drivers are not installed.
Granted, the above documentation is for WebLogic 11, but it seems to still apply to 12. I resolved the issue by downloading a separate wlsqlserver jar file (wlsqlserver-12.2.1-0-0.jar), renaming it to wlsqlserver.jar (the name referenced by the Manifest.MF), and putting it in the oracle_common/modules/datadirect folder.
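For reference, the copy step was essentially this (the ORACLE_HOME variable and jar version are whatever matches your installation):
cp wlsqlserver-12.2.1-0-0.jar $ORACLE_HOME/oracle_common/modules/datadirect/wlsqlserver.jar
Restart the server afterwards so WebLogic picks up the driver.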
I also believe that if you choose the Generic Installer, which is about 791 MB, it installs the drivers as well, so you wouldn't have to go through the process I did.
My application works fine on the computer where I made it, but on other machines it displays the error: "Transaction not connected".
I installed the package created by the Runtime Packager and added a couple of DLLs to the application path on the target machine. Still the same problem.
My question is whether I should use a DSN-less connection to make it work, or maybe inform the target PC about the required ODBC configuration? If so, how do I do that?
I've read a lot about odbc.ini, system variables, registry entries, etc., but now I've got it all mixed up and have no clue what to do.
I'll be very grateful for your help.
Kris.
Personally, I use a DSN, so the only thing PowerBuilder knows is the DSN name; on that basis, I establish the connection.
On the target PC, I configure ODBC with this very same DSN name and the necessary drivers. This way you have some flexibility in the deployment. For instance, I can develop using the DSN 'db', which on my developer's machine refers to the 'dev' database on the server 'server_dev', and deploy on a target machine where 'db' refers to the 'prod' database on the server 'productionServ'.
Compatibility issues aside, the first one could even be MySQL and the other one Oracle.
It is in any case crucial to install, on the target machine, the drivers that let you access the desired database and, if you use a DSN, to configure it.
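For illustration, a minimal PowerScript connection through a DSN might look like this (the DSN name, user and password are placeholders):
// Connect through the ODBC DSN configured on the machine
SQLCA.DBMS = "ODBC"
SQLCA.AutoCommit = False
SQLCA.DBParm = "ConnectString='DSN=db;UID=myuser;PWD=mypassword'"
CONNECT USING SQLCA;
IF SQLCA.SQLCode <> 0 THEN
    MessageBox("Connect failed", SQLCA.SQLErrText)
END IF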
Since you said you got confused with odbc.ini and the registry, let's start from there.
Assuming the required ODBC connection is configured on your computer (as you said):
Open the Registry Editor (Win+R, regedit) on your computer.
Go to the path "HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI". Note: "HKEY_LOCAL_MACHINE" can be used for multi-user computers.
Right-click on your ODBC connection and export. Save the file (*.reg). (Opening this file in Notepad will give you some idea of how it is configured.)
Also do the same for "HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources\". Merge both files into one (open the *.reg files in Notepad and merge the text); the result will look roughly like the example below.
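As a sketch, a merged export for a SQL Server DSN might look something like this (the DSN name, driver path and values are purely illustrative; your export will contain whatever your own DSN uses):
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\MyAppDSN]
"Driver"="C:\\Windows\\System32\\sqlsrv32.dll"
"Server"="dbserver"
"Database"="prod"

[HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources]
"MyAppDSN"="SQL Server"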
Now you have the following options to set up the ODBC connection on your target machine(s):
This file (*.reg) can be executed on the target machine (beware: the driver path might be different on the target machines).
If you have an installer, try to include code that writes the registry values from the installer.
Use the PB function RegistrySet() to create your own ODBC entries if they do not already exist (see the sketch after the summary below).
In brief, what we are doing: export the registry entries (as a *.reg file) from the development machine and create similar registry entries on the target machines using any of the three methods listed above.
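As a sketch of option 3, PowerScript along these lines would create the DSN entries at first run (the key layout mirrors the exported example above; the DSN name and values are placeholders):
// Register the DSN under the current user's ODBC.INI (values are illustrative)
RegistrySet("HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\ODBC Data Sources", "MyAppDSN", RegString!, "SQL Server")
RegistrySet("HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\MyAppDSN", "Driver", RegString!, "C:\Windows\System32\sqlsrv32.dll")
RegistrySet("HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\MyAppDSN", "Server", RegString!, "dbserver")
RegistrySet("HKEY_CURRENT_USER\SOFTWARE\ODBC\ODBC.INI\MyAppDSN", "Database", RegString!, "prod")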