I'm working as an intern and am new to Rails and its production environment. I was wondering how I could grab a database dump from a remote server and import it into my local database, so that my local environment mirrors the live version of a site. I have access to the database, and I have the current version of the code in my environment. I am missing the pictures and files attached to the site, and need them to make changes locally.
On the production server, execute the following command:
mysqldump -u username -ppassword db_name > production_dump.sql
scp the production_dump.sql file to your local machine.
On your local machine, execute the following command:
mysql -u username -ppassword db_name < production_dump.sql
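The three steps above can also be combined into a single pipeline, so the dump never has to be stored on the server. A minimal sketch, assuming SSH access to the server; the hostname and credentials are placeholders:

```shell
# Dump on the server, compress in transit, and load straight into the
# local database. All names and credentials here are placeholders.
ssh deploy@example.com \
  "mysqldump -u username -ppassword db_name | gzip -c" \
  | gunzip \
  | mysql -u username -ppassword db_name
```

Note that the pictures and uploaded files mentioned in the question live on the filesystem, not in MySQL, so they need a separate `scp`/`rsync` of the uploads directory.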
I am going through Heroku's getting started tutorial. As far as I know, Postgres is fully updated and everything is installed appropriately. Interestingly, the app works online when I deploy it to Heroku, but when I try to run the app locally, not so much! I am getting a:
PG::ConnectionBad
could not translate host name "myname" to address: Temporary failure in name resolution
### Convenience alias for PG::Connection.new.
def self::connect( *args )
    return PG::Connection.new( *args )
end
I have not touched any information in the files created by Heroku for this tutorial - so what could be wrong?
The following commands were run:
heroku login
git clone https://github.com/heroku/ruby-getting-started.git
cd ruby-getting-started
heroku create
git push heroku main
heroku open (this gets to the app okay)
heroku logs --tail
which psql
bundle install
export DATABASE_URL=postgres://$(whoami)
bundle exec rake db:create db:migrate (Here is where the error is)
heroku local web
(Error here) http://localhost:5000/
Debug Postgres connection
Confirm that Postgres exists on your local machine, is running, and has a user that matches your machine username.
Try to run this command in your local machine:
psql -U $(whoami) -d postgres
The output should be something like this:
psql (9.2.24, server 10.14)
WARNING: psql version 9.2, server version 10.0.
Some psql features might not work.
Type "help" for help.
postgres=#
Then, confirm that a database matching your username exists with "\l".
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
----------------+----------+----------+-------------+-------------+-----------------------
myuser | odoo | UTF8 | en_US.UTF-8 | en_US.UTF-8 |
Tip: you can quit with "\q".
postgres=# \q
If there is no error, try replacing the value of DATABASE_URL with one of the following...
postgresql://
postgresql://localhost
postgresql://localhost:5433
postgresql://localhost/mydb
postgresql://user@localhost
If this is not successful, then you must review the way that postgres is installed.
...Or manually create a superadmin user with a password inside Postgres.
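A minimal sketch of that manual step, assuming a Debian/Ubuntu-style install where the `postgres` OS user can administer the cluster (an assumption; adjust for your setup):

```shell
# Create a superuser role matching your OS username; --pwprompt asks
# for the new role's password interactively.
sudo -u postgres createuser --superuser --pwprompt "$(whoami)"
# Create a database owned by that role, so plain `psql` connects by default.
sudo -u postgres createdb -O "$(whoami)" "$(whoami)"
```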
The connection string pattern is
postgres://user:password@host:port
example...
export DATABASE_URL=postgres://db_user:123456@localhost:5432
DATABASE_URL=postgres://$(whoami)
This is trying to connect to a computer which happens to have the same name as your local user account. Unsurprisingly, it doesn't exist, hence the error.
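The fix is to put a real host (and, if needed, port and database name) into the URL. A sketch, assuming a local Postgres role matching your username and a database called `myapp_development` (both names are assumptions):

```shell
# Include an explicit host, so the URL no longer treats your username as one.
export DATABASE_URL="postgres://$(whoami)@localhost:5432/myapp_development"
bundle exec rake db:create db:migrate
```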
Here's the command to use if you are trying to make a backup or create a dump file from your local Postgres DB.
The main change from the remote version is using --port instead of --host, and that's it:
pg_dump --format t --no-owner --no-privileges --verbose --username postgres --password --dbname <DATABASE_NAME> --port <PORT, usually 5432> --file <THE FILE NAME YOU WANT>.dump
*If you are on Windows, use CMD instead of Git Bash; I'm not sure why, but it doesn't work there.
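Since the command above produces a tar-format archive (`--format t`), the matching restore goes through `pg_restore` rather than `psql`. A sketch, where `mydb` and `backup.dump` are placeholder names:

```shell
# Restore the tar-format dump into an existing (empty) database.
# "mydb" and "backup.dump" are placeholders.
pg_restore --no-owner --no-privileges --verbose \
  --username postgres --password \
  --dbname mydb --port 5432 \
  backup.dump
```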
Hi, I have a Ruby on Rails app hosted on AWS EC2 and it is using MySQL as the database. Now I have to take a backup of the database to my local machine.
There are two ways to take backup.
Using the MySQL Workbench UI tool, connect to your database via an SSH tunnel.
Connect to the AWS EC2 instance, take the backup there, and copy the backup file down using the scp command.
Hope this might help you.
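For option 1, the SSH tunnel can also be opened from the command line instead of Workbench. A sketch, where the host, key path, and credentials are all placeholders:

```shell
# Forward local port 3307 to MySQL (3306) on the EC2 instance, then
# dump through the tunnel. All names here are placeholders.
ssh -f -N -L 3307:127.0.0.1:3306 -i ~/.ssh/mykey.pem ubuntu@ec2-host
mysqldump -h 127.0.0.1 -P 3307 -u db_user -p db_name > backup.sql
```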
I did the same with a DigitalOcean application with PostgreSQL. For this you will need an SSH connection; DigitalOcean (and probably Amazon) explains how to set that up.
On the server (AWS in your case):
make a cron to execute daily a script to make a backup of the database
crontab -e
and add the following, to perform a copy every day at 23:00:
0 23 * * * sh /home/rails/backup/backup_dump.sh
create /home/rails/backup/backup_dump.sh:
#!/bin/sh
NOW=$(date +"%d")
FILE="app_production_$NOW.sql"
pg_dump -U rails -w app_production > /home/rails/backup/$FILE
Of course, pg_dump is what I use to back up my PostgreSQL database; in your case, with MySQL, you will need another tool (mysqldump).
In your local machine:
Add in /etc/cron.daily directory the script file that contains the recover from AWS backup file, and populate:
NOW=$(date -d "yesterday" +"%d") # the day before (GNU date syntax)
scp -r root@ip_server:/home/rails/backup/app_production_$NOW.sql /local_machine/user/local_backups
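Once the file has been copied down, the local side of the restore might look like this sketch (the database and path names mirror the script above):

```shell
# Load the fetched dump into the local PostgreSQL database.
NOW=$(date +"%d")
psql -U rails -d app_production < /local_machine/user/local_backups/app_production_$NOW.sql
```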
And that's all. I hope this helps.
I am working with a Rails 4 app, using PSQL in both development and production. For some reason I have to work on a new computer/laptop, so I set up a Rails environment on it and cloned my app into it. But what I really need is my existing database with the data in it. How do I do that?
To do this you need to create a dump on system 1 and restore it on system 2. Here are the steps:
sudo -u postgres pg_dump <DB_NAME> > dump - create a dump file
Copy this file via dropbox or whatever to another system.
sudo -u postgres psql <DB_NAME> < dump - restore the dump on the new system
Note:
You should have an empty database created on your new system, or you can use a data-only dump by passing --data-only to the pg_dump command.
Also you can read documentation for pg_dump to find any other options which you might need.
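The two variants from that note, sketched out; the database name is a placeholder:

```shell
# Variant 1: create an empty target database before restoring the full dump.
sudo -u postgres createdb my_app_production
# Variant 2: dump only the data, for restoring into an already-created schema.
sudo -u postgres pg_dump --data-only my_app_production > dump
```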
I would like to create a script that would make a dump of a PostgreSQL DB on Heroku and download it to my local server.
I am using Windows Server 2008 R2 and assume this would be triggered with Task Scheduler.
On the local server, Ruby 1.9.3 and Chocolatey are installed (to run curl on a PC).
I am assuming that the script would be a Ruby file containing commands to both create a backup and then use a curl command to download it. Only the latest backup would be downloaded.
The commands would be something like
heroku pgbackups:capture --expire -a appname
curl -o latest.dump "$(heroku pgbackups:url)"
thanks in advance
The easiest way would be to get curl for your Windows machine at http://curl.haxx.se
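Putting the question's two commands into one script might look like the sketch below. Note these come from the legacy `pgbackups` plugin, which has since been replaced by `heroku pg:backups`; `appname` is a placeholder:

```shell
#!/bin/sh
# Capture a fresh backup (expiring the oldest) and download the latest dump.
heroku pgbackups:capture --expire -a appname
curl -o latest.dump "$(heroku pgbackups:url -a appname)"
```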
I have a Ruby on Rails application and I want to get the data from a Heroku server to a local machine.
I have tried the steps mentioned in
https://devcenter.heroku.com/articles/heroku-postgres-import-export, but it just copies the data definitions, not the actual data on the server.
Is there any way to get the data from Heroku to local databases?
Try using the steps detailed here.
Basically, grab the information from your Heroku database using heroku config:get. Look for your Heroku Postgres URL. It will look something like this: HEROKU_POSTGRESQL_RED_URL: postgres://user3123:passkja83kd8@ec2-117-21-174-214.compute-1.amazonaws.com:6212/db982398. Note that the string is structured as database://username:password@host:port/databasename. Thus, in this example, the username is user3123 and the database name is db982398. You'll need this for the next part.
Then, you'll use the information in the above postgres string to make a local copy of your Heroku database using pg_dump. Enter the following into your terminal:
pg_dump --host=<host_name> --port=<port> --username=<username> --password --dbname=<dbname> > output.sql
In each place where <....> appears in the above code, enter your specific information. After you run this, the terminal will ask for your password. Enter it.
Finally, you'll load that dumpfile into your local database using psql.
psql -d <local_database_name> -f output.sql
Be sure to put the name of your local database.
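An alternative sketch using the current Heroku CLI, which can fetch a binary dump directly; `your-app` and `myapp_development` are placeholders:

```shell
# Capture and download a backup, then restore it locally with pg_restore.
heroku pg:backups:capture -a your-app
heroku pg:backups:download -a your-app   # writes latest.dump
pg_restore --clean --no-acl --no-owner -d myapp_development latest.dump
```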