The production server that hosts my Rails app is being wiped and rebuilt, so I will need to transfer the app onto the new system. The source isn't a problem, since I can just pull it down from git again, but the database is another matter. I could install phpMyAdmin or something similar to access the database, but I was wondering if there is something in Rails (possibly a rake task) that would let me dump the current database and then import it onto the new server.
You don't need Rails or phpMyAdmin for this. Assuming you're using MySQL, simply SSH to your server:
mysqldump -u root -p databasename > database.sql
Then on the other system:
mysql -u root -p newdatabasename < database.sql
Easy, huh?
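If the old and new machines are both up at the same time, you can copy the dump across with scp (the user and hostname here are placeholders):
scp database.sql deploy@newserver:/tmp/database.sql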
If it is a recurring task, you could also put that into a rake task under lib/tasks:
namespace :db do
  desc "Dump database"
  task :dump => :environment do
    # system (rather than exec) returns control to rake after the shell command finishes
    system "mysqldump -u root -p databasename > database.sql"
  end

  desc "Restore database"
  task :restore => :environment do
    system "mysql -u root -p newdatabasename < database.sql"
  end
end
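With that file in lib/tasks, the tasks run like any other rake task, and each will prompt for the MySQL password because of the -p flag:
rake db:dump
rake db:restore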
I am trying to take a pg_dump from a staging server to my local machine. I am logged into the server as:
root@myproject-staging-development1:~/myproject/current#
Which command should I run here to get the pg_dump onto my local machine?
Dump Your PostgreSQL Database
Step 1
SSH to the staging/production server.
Step 2
Dump the desired database:
pg_dump database_name > database_name_20160527.sql
You can name your dump as you wish; I'm using dates to distinguish multiple dumps.
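For large databases it can be worth compressing the dump in the same step (assuming gzip is available on the server):
pg_dump database_name | gzip > database_name_20160527.sql.gz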
Step 3
Leave SSH and download your new SQL file using SCP.
scp login@host:path_to_dir_with/database_name_20160527.sql database_name_20160527.sql
This command logs into your remote server over SSH and downloads the given file to the local directory you specify. If you give no local path, the dump is saved in your current working directory.
Example:
scp marcin@8.8.8.8:/home/my_app/backups/my_app_database_20160527.sql my_app_database_20160527.sql
Restore Your PostgreSQL Dump
Step 1
If you want to use the current localhost database, you must drop it first:
psql template1 -c 'drop database database_name;'
Step 2
Create a new database on the localhost:
psql template1 -c 'create database database_name with owner your_user_name;'
Step 3
And write your dump into the database:
psql database_name < database_name_20160527.sql
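To sanity-check the restore, you can list the tables that were loaded:
psql database_name -c '\dt'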
You can run pg_dump via the ssh command so you have a one-liner:
filename="tmp/backup_$(date +%Y-%m-%d_%H-%M-%S).sql"
ssh user@IP \
  "pg_dump --no-owner postgresql://user:pass@127.0.0.1/dbname" \
  >"$filename"
I'm trying to connect to a server's Postgres db through a script. I've ssh-ed into the box and tried
$ psql postgres -U my_username -W #And then entered the password
and get the error:
psql: FATAL: Peer authentication failed for user "my_username"
However, the Rails application running on this server is using these same credentials. I've used ActiveRecord::Base.connection_config in the rails console and used the credentials exactly as they are when trying to use psql. In the rails console, I'm able to query on the database, so I do know the connection is working there.
Is it possible there's a restriction somewhere that's preventing me from connecting through psql? Or is there something else I'm doing wrong?
Try connecting like this:
psql postgres -U my_username -h 127.0.0.1 -W # and then enter the password
Passing -h makes psql connect over TCP instead of the local Unix socket; peer authentication applies only to socket connections, where Postgres requires your operating-system username to match the database user.
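The rules live in pg_hba.conf; on a Debian-style install the relevant defaults look roughly like this (illustrative, your file may differ):
# TYPE  DATABASE  USER  ADDRESS       METHOD
local   all       all                 peer
host    all       all   127.0.0.1/32  md5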
Have you tried it like this:
psql --username="postgres"
Another possibility is to pass more parameters:
psql --username="postgres" --host="<ip or host name>" --port="<port, default 5432>" -W <db name>
Good luck!
I am new to Rails.
I want to back up a Postgres database from DigitalOcean to my local machine. How do I take a dump of it and migrate it to my local machine?
To use pg_dump:
First, on the target machine (the remote machine with the database you want to dump), two steps are needed to make it accept pg_dump requests:
1. Add or edit the following line in your postgresql.conf (in my experience the location is something like /etc/postgresql/9.3/main/postgresql.conf, with 9.3 replaced by your Postgres version; if nobody has changed the file before, add the line at the end):
listen_addresses = '*'
2. Add the following as the first rule in pg_hba.conf (in my experience the location is something like /etc/postgresql/9.3/main/pg_hba.conf). It allows access to all databases for all users with an encrypted password:
# TYPE  DATABASE  USER  CIDR-ADDRESS  METHOD
host    all       all   all           md5
After those two steps, restart Postgres in the terminal so the changes take effect:
/etc/init.d/postgresql restart
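You can confirm it is now listening for remote connections (5432 is the Postgres default port):
ss -lnt | grep 5432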
Last, you should figure out which database user (or owner) can read the target database. You can do this by SSHing to the remote machine and stepping into the psql console:
sudo -u postgres psql
and type
\l
to see the db owner.
Finally, you can run pg_dump on your local machine to dump the remote database, like so:
pg_dump -f dump_name -h host_ip -d database_name -U database_user -p 5432 -W
Then enter the user's password and wait; dumping a large database can take a while.
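To load the dump into a local database afterwards (names are the placeholders from above):
createdb database_name
psql database_name < dump_name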
Hope you make it~
First you need to create the backup, then download the dump from DigitalOcean and run these commands in the console.
Download the dump using SCP.
1. pg_dump -Fc dbname > outfile.dump
2. pg_restore --verbose --clean --jobs=4 --disable-triggers --no-acl --no-owner -h localhost -U user_name -d database_name outfile.dump
Note that pg_restore only reads pg_dump's custom format, hence -Fc in step 1; a plain SQL dump would instead be loaded with psql.
I'm working on a Rails project that uses Sidekiq. Our Sidekiq setup has two workers (WorkerA, which reads queue_a, and WorkerB, which reads queue_b). One of them has to run on the same server as the Rails app, and the other on a different server (or servers). How can I prevent WorkerB from being executed on the first server, and vice versa? Can a Sidekiq process be configured to run only specific workers?
EDIT:
The Redis server is on the same machine as the Rails app.
Use a hostname-specific queue. config/sidekiq.yml:
---
:verbose: false
:concurrency: 25
:queues:
- default
- <%= `hostname`.strip %>
In your worker:
class ImageUploadProcessor
  include Sidekiq::Worker
  sidekiq_options queue: `hostname`.strip

  def perform(filename)
    # process image
  end
end
More detail on my blog:
http://www.mikeperham.com/2013/11/13/advanced-sidekiq-host-specific-queues/
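With that config, each machine's Sidekiq process works its own hostname queue when started normally; equivalently, you can pin the queues on the command line (a sketch using the standard Sidekiq CLI flags):
bundle exec sidekiq -q default -q "$(hostname)"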
Well, here is the way to start Sidekiq with options (note that each queue needs its own -q flag):
nohup bundle exec sidekiq -q queue_a -q queue_b -c 5 -e #{Rails.env} -P #{pidfile} 2>&1 &
You can start Sidekiq with specific workers; for example, run
nohup bundle exec sidekiq -q queue_a -c 5 -e #{Rails.env} -P #{pidfile} 2>&1 &
to execute only WorkerA.
To run different workers on different servers, do something like this:
system "nohup bundle exec sidekiq -q #{workers_string} -c 5 -e #{Rails.env} -P #{pidfile} 2>&1 &"
def workers_string
  if <on server A> # use ENV or the server IP to distinguish
    "queue_a"
  elsif <on server B>
    "queue_b"
  end
end
# or read the queue name from a per-server config file
I have a Rails app which requires a Mongo dump of a test database, which I restore using something like
mongorestore -d test_database dump/test_databse
When I run this command from the terminal, everything works fine:
$ mongo test_database
MongoDB shell version: 2.4.12
connecting to: test_database
> db.users_user.count()
50
> db.users_posts.count()
100
but when I run the same command using Ruby
system "mongorestore -d test_database dump/test_databse"
one of the collections, users_posts, is not inserted:
$ mongo test_database
MongoDB shell version: 2.4.12
connecting to: test_database
> db.users_user.count()
50
> db.users_posts.count()
0
What's going on here? Is it a permissions issue? I am stumped.
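A first debugging step would be to stop discarding mongorestore's output and check it for errors, e.g. by redirecting it to a log inside the same call (a sketch; the command is otherwise unchanged from the question):
system "mongorestore --verbose -d test_database dump/test_databse > /tmp/restore.log 2>&1"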