Why is the Postgres database connection lost while dumping data to Amazon RDS? - ruby-on-rails

I have a 150 GB Postgres database dump file on an Amazon EC2 instance. While loading the data into RDS from the EC2 server I am getting errors.
The output of the command is given below. While the dump's COPY commands are running against RDS, it reports:
1. psql connection not open
2. connection to server was lost
Command Output:
SET
SET
SET
SET
SET
SET
SET
SET
ALTER TABLE
ALTER TABLE
ALTER SEQUENCE
ALTER TABLE
psql:filename.sql:1396266: connection not open
psql:filename.sql:1396266: connection to server was lost
Application Configuration:
Ruby 1.9.3
Rails 3
PostgreSQL 9.3
Please help me understand why it breaks while copying the data. The connection was established and the command was running; then, suddenly, it breaks while executing a COPY command.
Update: Findings
The command/script that I am using to load the data is below:
psql -h instance.id.region-2.rds.amazonaws.com -p 5432 -U username -W -d database_name -f filename.sql
Workaround used to narrow down the issue:
I took the first 100 rows and the last 200 rows of the big 150 GB file and combined them into one file of about 56 KB (roughly as shown below). When I run the same command with this file, it loads successfully.
So the file size seems to be causing the problem: the same command works when loading a small file into RDS.
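Roughly, the small test file was built like this (sample.sql is just an illustrative name):
head -n 100 filename.sql > sample.sql
tail -n 200 filename.sql >> sample.sql
psql -h instance.id.region-2.rds.amazonaws.com -p 5432 -U username -W -d database_name -f sample.sql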
How can I resolve this issue?

I think your connection is being dropped at the TCP level.
Is there any limit in your environment between your client and RDS?
For example network flow control, idle-session killing, and so on...
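If something between the client and RDS is killing idle or long-lived TCP connections, one thing worth trying is enabling client-side TCP keepalives via libpq connection parameters. This is only a sketch; the keepalive values below are examples and may need tuning:
psql "host=instance.id.region-2.rds.amazonaws.com port=5432 user=username dbname=database_name keepalives=1 keepalives_idle=30 keepalives_interval=10 keepalives_count=3" -W -f filename.sql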

Related

How to take a backup of a deployed Rails application's database from the server?

Hi, I have a Ruby on Rails app hosted on AWS EC2 and it is using MySQL as the database. Now I have to take a backup of the database to my local machine.
There are two ways to take a backup:
1. Use the MySQL Workbench UI tool and connect to your database via an SSH tunnel.
2. Connect to the AWS EC2 instance, take the backup there, and copy the backup file using the scp command (as sketched below).
Hope this might help you.
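For the second option, a minimal sketch (the user names, database name, host, and paths are placeholders):
# on the EC2 instance
mysqldump -u db_user -p app_production > backup.sql
# from your local machine
scp ubuntu@ec2-host:/home/ubuntu/backup.sql .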
I did the same for a DigitalOcean application with PostgreSQL. This is what I did; you will need an SSH connection, and everywhere (DigitalOcean... and probably Amazon) explains how to set one up.
On the server (AWS in your case):
create a cron job that runs a daily script to back up the database:
crontab -e
and add the following line to perform a copy every day at 23:00:
0 23 * * * sh /home/rails/backup/backup_dump.sh
create /home/rails/backup/backup_dump.sh:
NOW=$(date +"%d")
FILE="app_production_$NOW.sql"
pg_dump -U rails -w app_production > /home/rails/backup/$FILE
Of course, pg_dump is what I use to back up my PostgreSQL database; in your case, with MySQL, you will need another tool (e.g. mysqldump).
On your local machine:
Add a script file in the /etc/cron.daily directory that retrieves the backup file from AWS, containing something like:
NOW=$(date +"%d") # should be date - 1, i.e. the day before; I don't remember the exact syntax
scp -r root@ip_server:/home/rails/backup/app_production_$NOW.sql /local_machine/user/local_backups
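Since the exact "day before" syntax is left open above, here is one way to do it with GNU date (assuming a Linux local machine):
NOW=$(date -d "yesterday" +"%d")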
And that's all, I hope it helps you.

Can't connect to local MySQL server through socket in Ubuntu while running Rails

I am trying to follow the steps here to run Ruby on Rails on Linux. Everything was fine except when I try to execute this:
rake db:create
I got this error:
#<Mysql2::Error: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)>
Couldn't create database for {"adapter"=>"mysql2", "encoding"=>"utf8", "pool"=>5, "username"=>"root", "password"=>"secretpassword", "host"=>"localhost", "database"=>"apps_development"}, {:charset=>"utf8"}
(If you set the charset manually, make sure you have a matching collation)
Created database 'apps_development'
#<Mysql2::Error: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)>
Couldn't create database for {"adapter"=>"mysql2", "encoding"=>"utf8", "pool"=>5, "username"=>"root", "password"=>"secretpassword", "host"=>"localhost", "database"=>"apps_test"}, {:charset=>"utf8"}
(If you set the charset manually, make sure you have a matching collation)
Created database 'apps_test'
What does this mean?
The error is quite explicit:
#<Mysql2::Error: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)>
Your application can't connect to your MySQL database.
You are trying to reach it through a socket (/var/run/mysqld/mysqld.sock).
You have to find out where the MySQL socket is and adapt your code OR the MySQL config so they match each other.
Before that, you may want to check whether MySQL is running; obviously you cannot access the socket if MySQL is down. Check it with: sudo service mysql status.
If MySQL is up, check the following.
Find the socket and check its permissions
Find the socket path while MySQL is running: mysql -e '\s;' | grep 'UNIX socket:'. You may need to add -u<USERNAME> -p<PASSWORD> depending on your client configuration. It should output something like: UNIX socket: /var/run/mysqld/mysqld.sock. Here /var/run/mysqld/mysqld.sock is the socket location.
Check the socket file: ls -l <SOCKET-PATH> (here: ls -l /var/run/mysqld/mysqld.sock).
If you get ls: cannot access /var/run/mysqld/mysqld.sock: No such file or directory then your socket file does not exist. Restarting MySQL may fix it (sudo service mysql restart).
If you get srwxrwxrwx 1 mysql mysql 0 nov. 25 10:07 /var/run/mysqld/mysqld.sock, check the permissions and owners of this file (by default it should be owned by the mysql user & group and be accessible at least to the mysql user).
Configure your app
Now that you know where the socket file is and you are sure your app can access it, you may need to adapt your application to what you learned in the first part.
According to the Rails docs you must edit your config/database.yml file and set the socket field to the socket location.
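For example, using the settings from the question and the socket path found above, config/database.yml might look like this (a sketch; adjust the values to your setup):
development:
  adapter: mysql2
  encoding: utf8
  pool: 5
  username: root
  password: secretpassword
  socket: /var/run/mysqld/mysqld.sock
  database: apps_development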

How to deploy a local PostgreSQL database of a Ruby on Rails app to Heroku?

I have been trying to deploy a local database which has some data into Heroku. My app's name is myFirstBlog. But after deploying, when I refresh my app, it says something went wrong. After some troubleshooting, I found out this:
Before exporting data into the Heroku database, I ran the command
heroku pg:info
And the output was:
Plan:Hobby-dev
Status:Available
Connections:1/20
PG Version: 9.4.1
Created: 2015-07-09 08:20 UTC
Data Size:6.6 MB
Tables: 3
Rows: 2/10000 (In compliance)
Fork/Follow: Unsupported
Rollback: Unsupported
And after I exported data to heroku and ran the same command, output was:
Plan: Hobby-dev
Status: Available
Connections:1/20
PG Version:9.4.1
Created:2015-07-09 08:20 UTC
Data Size:6.5 MB
Tables:0
Rows: 0/10000 (In compliance)
Fork/Follow:Unsupported
Rollback:Unsupported
The number of tables becomes 0 after the export. Why is it happening?
This is how I am exporting my local database's data to Heroku:
PGPASSWORD="password" pg_dump -Fc --no-acl --no-owner -h localhost -U aditya9509 myFirstBlog_development > backup.dump # this command dumps the data into backup.dump
Then I saved backup.dump in my GitHub account because the tutorials stated that in order to export data from a local database to Heroku, it must be at a location which can be retrieved over HTTP. I did not understand why, but I did what it said.
Then finally I ran this command:
heroku pg:backups restore "http://github.com/aditya9509/rubyOnRails/blob/master/backup.dump" DATABASE -a stark-beach-9626
The "stark-beach-9626 is the name of the app given by heroku.
After running this command, when I access the app, it shows "something went wrong". What am I missing here?
P.S. I am new to Ruby on Rails so please be as simple as you can when you answer. I have been busting my head over this problem for hours now. Also, let me know if you need some additional info; I gave all the info I thought was relevant.
You are trying to restore the Git blob page and not the raw file; use https://github.com/aditya9509/rubyOnRails/raw/master/backup.dump for the URL and it will work.
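In other words, with the raw URL the restore command from the question becomes:
heroku pg:backups restore "https://github.com/aditya9509/rubyOnRails/raw/master/backup.dump" DATABASE -a stark-beach-9626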

Grabbing mysql dump file from remote server

I'm working as an intern and am new to Rails and its production environment. I was wondering how I could grab a database dump from a remote server and import it into my local database so that my local environment mirrors the live version of the site. I have access to the database, and I have the current version of the code in my environment. I am missing the pictures and files attached to the site, and need them to make changes locally.
On the production server, execute the following command:
mysqldump -u username -ppassword db_name > production_dump.sql
Then scp the production_dump.sql file to your local machine, for example as shown below.
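A sketch of the copy step (the user, host, and path are placeholders):
scp user@production_server:/path/to/production_dump.sql .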
On your local machine, execute the following command:
mysql -u username -ppassword db_name < production_dump.sql

Putting Rails app into production - rake db:schema:load not working

I'm trying to deploy a new Rails app to a Bitnami/Ubuntu/Virtual Server. I am remote and using the SSH terminal.
I have successfully used Capistrano to run cap deploy:update. My source goes to GitHub and Capistrano then puts it on the server.
So, I have this directory on the server:
/opt/bitnami/projects/ndeavor/releases/20130306180756
The server also has a PostgreSQL stack running. I have created my PostgreSQL user and an empty database. I believe my next step is to run this command using the SSH console:
bitnami#linux:/opt/bitnami/projects/ndeavor/releases/20130306180756$ rake RAILS_ENV=production db:schema:load
Question 1 = Is that the correct next step?
When I run that command, I get this:
could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
Question 2 = How can I get Rake to find the PostgreSQL socket?
I could put something like this in the database.yml file:
socket: /var/pgsql_socket
But, I don't know what the correct entry should be
Thanks for your help!!
UPDATE1
I also tried having the database.yml file like this:
production:
  adapter: postgresql
  encoding: unicode
  database: ndeavor_production
  pool: 5
  username: (user name)
  password: (password)
  socket: /var/run/postgresql
But, I get the same error:
Is the server running locally and accepting
connections on Unix domain socket "/tmp/.s.PGSQL.5432"?
Why isn't it at least asking me for Unix domain socket "/var/run/postgresql" ??
UPDATE2
I found this:
"I have solved my problem by declaring the unix_socket_directory in postgresql.conf file to be /var/run/postgresql. It does seem for a standard build they should have a common location?
If you build from unmodified PG sources, the default socket location is
indeed /tmp. However, Debian and related distros feel that this
violates some distro standard or other, so they modify the source code
to make the default location /var/run/postgresql. So it depends on
whose build you're using."
But, I'm not sure if I should be changing the postgresql.conf file or the Rails database.yml file
UPDATE3
I looked in the /var/run/postgresql directory and it's empty.
I can't find where .s.PGSQL.5432 is located.
As Bob noted, specifying the host and port can fix the problem. Since he hasn't explained this in more detail, I want to elaborate.
The default port is 5432, and the default "host" is a path to where it expects the socket. Port is always numeric, but host can be set for any libpq connection either to the network host or to the directory containing the socket. For example, connect using psql with
psql -h /tmp -p 5432 -U chris mydb
This will connect over the /tmp/.s.PGSQL.5432 socket. Now Ruby is somewhat different but the pg gem uses libpq so it should behave the same.
If you don't know where the socket is, the obvious next thing to try is the network address, and particularly localhost.
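Applied to the database.yml from the question, that means replacing the socket entry with host (and optionally port). A sketch, assuming the server is reachable on localhost:5432:
production:
  adapter: postgresql
  encoding: unicode
  database: ndeavor_production
  pool: 5
  username: (user name)
  password: (password)
  host: localhost
  port: 5432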
