Pushing a single table to Heroku - ruby-on-rails

I am aware of the heroku pg:push command, which pushes an entire database up to Heroku.
Now that I am launching my product, I would like to be able to push up only a specific table containing information collected locally, without overwriting existing tables (such as users).
Is there a command that enables me to push only specific tables to Heroku?

My suggestion is to use PostgreSQL's dump/restore capabilities directly, via the pg_dump and psql commands.
With pg_dump you can dump a specific table from your local database:
$ pg_dump --data-only --table=products sourcedb > products.sql
Then grab the Heroku PostgreSQL connection string from the app's config vars:
$ heroku config | grep HEROKU_POSTGRESQL
# example
# postgres://user3123:passkja83kd8@ec2-117-21-174-214.compute-1.amazonaws.com:6212/db982398
and restore the table in the remote database, using the information retrieved from Heroku.
$ psql -h ec2-117-21-174-214.compute-1.amazonaws.com -p 6212 -U user3123 db982398 < products.sql
You will need to customize the -h, -p and -U parameters, as well as the database name; psql will prompt you for the password.
You can also use pg_restore to filter a custom-format dump and restore only the table you need, but I personally prefer psql.
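For example (a minimal sketch reusing the connection details above; products.dump is just a placeholder filename for a custom-format dump):
$ pg_dump -Fc --data-only --table=products sourcedb > products.dump
$ pg_restore --data-only --table=products -h ec2-117-21-174-214.compute-1.amazonaws.com -p 6212 -U user3123 -d db982398 products.dump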
Note that Heroku itself recommends the PostgreSQL tools in several places in its documentation, such as Importing and Exporting for large datasets, or whenever the provided CLI commands don't cover a specific case like the one in this question.

I wrote a script that extracts the DB URL from Heroku, then dumps single tables from production and restores them on development/localhost. Run it like this:
rake production_to_development:run\['users;news;third_table',my-sushi-app\]
Code:
namespace :production_to_development do
  task :run, [:tables, :app] => [:environment] do |t, args|
    tables = args["tables"].split(';')

    database_url = nil
    Bundler.with_clean_env { database_url = `heroku config:get DATABASE_URL --app=#{args["app"]}` }

    require 'addressable/uri'
    uri = Addressable::URI.parse(database_url.strip) # strip the trailing newline from the heroku output
    remote_database = uri.path[1..-1]                # drop the leading "/" from the path

    tables.each do |table|
      backup_file = "tmp/#{table}.backup"
      # Point this at your Postgres binaries (with a trailing "/") if they are not on the PATH,
      # e.g. "/Applications/Postgres.app/Contents/Versions/latest/bin/"
      bin_dir = ""
      dump_command = "PGPASSWORD=#{uri.password} #{bin_dir}pg_dump --file \"#{backup_file}\" --host \"#{uri.host}\" --port \"#{uri.port}\" --username \"#{uri.user}\" --no-password --verbose --format=c --blobs --table \"public.#{table}\" \"#{remote_database}\""
      `#{dump_command}`

      # Replace "root" and "my_table" with your local database user and database name
      `psql -U 'root' -d my_table -c 'drop table if exists #{table}'`
      `pg_restore -d my_table --no-owner #{backup_file}`
    end
  end
end

If I understand correctly, you just need a single database table, with its locally created data, pushed to your Rails production app. Maybe this is a simplistic approach, but you could create a migration for your table and then populate it using db/seeds.rb.
After you've populated the seeds.rb file and pushed your repo to Heroku:
heroku run rake db:migrate
heroku run rake db:seed
Also, if your local table has a ton of data and you're using Rails 4, check out the seed_dump gem: https://github.com/rroblak/seed_dump. It takes your existing database data and maps it to the seeds format.
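As a rough sketch (Product here is a hypothetical model name; see the seed_dump README for the exact options), dumping a single model into db/seeds.rb might look like:
rake db:seed:dump MODELS=Product FILE=db/seeds.rb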

Related

How to deal with foreign keys when moving a postgres database between machines

I'm trying to move a postgres database between machines as I move from one development platform to another. I have the yaml_db gem installed on both machines.
On my old platform I do:
rake db:schema:dump
rake db:data:dump
When I go to reload the database on my new machine, I've discovered that my two dozen foreign keys are preventing me from loading my data. What are my options?
You're copying a database; Rails really shouldn't have anything to do with the process (and, as you're seeing, it just gets in the way).
Instead, put on your DBA hat and copy the database without bothering with Rails. Dump the data using pg_dump and then restore the data with pg_restore. The database's backup/restore tools know all about foreign keys, triggers, extensions, and anything else that Railsy tools don't understand.
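A minimal sketch of that approach, with placeholder host, user and database names, and using a custom-format dump:
pg_dump -Fc --no-acl --no-owner -h old-host -U old_user old_db > old_db.dump
pg_restore --verbose --clean --no-acl --no-owner -h new-host -U new_user -d new_db old_db.dump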
You can use the pg_dump command to dump your database:
pg_dump -U <user-name> -h <host> <database> > <file-name>.sql
For example:
pg_dump -U postgres -h 127.0.0.1 database1 > database1.sql
Then copy the file to the other machine (see the scp example below) and run the following command to restore the database:
psql <database-name> < path/to/sql_dump_file
For example:
psql database1 < database1.sql
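The copy itself can be done however you like; for example with scp (the user, host and path are placeholders):
scp database1.sql user@new-machine:/tmp/database1.sql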

Heroku Rails Rake Task to Sync Production & Local DB

I'm trying to create a rake task so that I can simply type "rake db:sync" in order to update my local DB to match production.
This solution leverages code provided by the Heroku team here:
Importing and Exporting Heroku Postgres Databases with PG Backups
When I use curl --output /tmp/latest.dump #{url} I'm getting the following error in my latest.dump file:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AuthorizationQueryParametersError</Code><Message>Query-string authentication version 4 requires the X-Amz-Algorithm, X-Amz-Credential, X-Amz-Signature, X-Amz-Date, X-Amz-SignedHeaders, and X-Amz-Expires parameters.</Message><RequestId>421FEFF763870123</RequestId><HostId>vlVr/ihmQiDgYIpdFFkuCgEP8Smvr2ks0wRkf89fJ8NfHfsBb92EVv40Q0NZuQIC</HostId></Error>
Here is the code I'm using.
# lib/tasks/db_sync.rake
namespace :db do
  desc 'Pull production db to development'
  task :sync => [:backup, :dump, :restore]

  task :backup do
    Bundler.with_clean_env {
      puts 'Backup started...'
      system "heroku pg:backups capture --app YOUR_APP_NAME"
      puts 'Backup complete!'
    }
  end

  task :dump do
    dumpfile = "#{Rails.root}/tmp/latest.dump"
    puts 'Fetching url and file...'
    Bundler.with_clean_env {
      url = `heroku pg:backups public-url --app YOUR_APP_NAME | cat`
      system "curl --output #{dumpfile} #{url}"
    }
    puts 'Fetching complete!'
  end

  task :restore do
    dev = Rails.application.config.database_configuration['development']
    dumpfile = "#{Rails.root}/tmp/latest.dump"
    puts 'PG_RESTORE on development database...'
    system "pg_restore --verbose --clean --no-acl --no-owner -h localhost -U #{dev['username']} -d #{dev['database']} #{dumpfile}"
    puts 'PG_RESTORE Complete!'
  end
end
Check out the Parity gem. It offers several commands to do the following Heroku Rails tasks easily:
Back up DBs
Restore DBs
Run rails console
Tail logs
Run migrations
Deploy
You're of course primarily looking for the first two.
After installation, it expects that you have two git remotes set up, named staging and production; development isn't needed, as it is assumed to be your local machine.
You can get the git URL for the other two environments from your Heroku dashboard -> (your app) -> Settings -> Info
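Setting up those remotes might look something like this (the app names in the URLs are placeholders; use the values from your dashboard):
git remote add staging https://git.heroku.com/your-staging-app.git
git remote add production https://git.heroku.com/your-production-app.git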
After you have that set up, it's as simple as
production backup
development restore production
The code is pretty simple, so I encourage you to read it. But it's essentially doing exactly what your rake code attempts to do by getting a public URL and restoring it.

Rails / postgresql Migrate Data from Database to Newly created database

I have an existing database on my server containing many tables with content. Now I have created a new database, but with some columns added.
Is it possible to migrate all the data from the one database to the other?
Kind regards.
I've used the yaml_db gem to migrate DBs: https://github.com/ludicast/yaml_db - this gem adds some helpful rake tasks.
After installing the gem, you can run rake db:data:dump to save your database to a .yml file.
Then, after changing your database configuration, you can run rake db:data:load to load the data into your new database.
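Roughly, the sequence is as follows (assuming yaml_db's default dump file of db/data.yml):
rake db:data:dump   # run against the old database: writes db/data.yml
rake db:data:load   # run against the new database: loads db/data.yml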
I like your answer! But an easier way is to dump the whole database, as you said, and just transfer it to the other server.
Like this:
To Dump:
pg_dump -U demo02 -h localhost -O demo02 > demo2.sql
To Restore:
psql -U demo3 demo3 < demo2.sql

Heroku - how to pull data from one database and put it to another one?

We have 2 Heroku apps - the first one is production and the second one is staging. I would like to pull data from one table from the production app (it's table users with all user's data) and push it to the staging database.
After a little research I found the addon called pgbackups - I have just a few concerns:
Does this addon also allow getting data from only one table, rather than the whole database?
The second thing is this: let's say production has users with IDs from 1 to 300, and staging has users with IDs from 1 to 10. How can we push those 300 production users to staging so that they are numbered from ID 11 onwards (we would like to keep our existing staging users in the staging database as well)?
Thank you
There are ways to do this in straight SQL. If you're comfortable with that, go for it. This way is for devs more comfortable in Rails: we pull the data out as JSON, and create users with new IDs in the new database from that JSON.
Since you're pulling only 1 table, AND you want to reset the IDs to the new database, I recommend:
bring a copy of the database locally with pgbackups
File.open('yourfile.json', 'wb') {|file| file << User.all.to_json }
Connect to your new database, and move yourfile.json up there
then:
users_json = JSON.parse(File.read('yourfile.json'))
users_json.each do |json|
  json.delete("id") # drop the production id so the new database assigns the next free one (11, 12, ...)
  User.create(json)
end
This script pulls data from your Heroku database into a local postgres database. You need the pgbackups addon installed. Execute it from the root directory of your Heroku app.
#!/bin/bash -ex
# This script asks Heroku to backup the current production data, then downloads that dump and imports it into the local db.
# It is at least 3 times quicker than heroku db:pull, and we can only do it because we are dumping from postgres to postgres.
heroku pgbackups:capture --expire
curl -o tmp/latest.dump `heroku pgbackups:url`
pg_restore --verbose --clean --no-acl --no-owner -d your_db_name tmp/latest.dump
You can check my answer in this thread. You can use a library called forceps to do exactly what you are asking for.

Transfer initial PostgreSQL database from development to Heroku production

I have an initial set of production data, stored locally in the development database, that I'd like to migrate to production as a starting point. What's the best way to transfer this data?
It is not evident whether there is a way to use pgbackups for this, as per the instructions. Perhaps I have to run a manual backup of some sort locally and then push it over with pgbackups; if that is the case, I'd appreciate some specific instructions on accomplishing it.
After some additional digging and an answer from Heroku, the answer for importing initial data is:
1) If using PostgreSQL locally, first dump the data:
pg_dump -U your_username your_database_name -f backup.sql
2) Then follow the instructions found here for importing into Heroku's database:
http://devcenter.heroku.com/articles/pgbackups#importing_from_a_backup
First dump your local database using pg_dump:
pg_dump -Fc --no-acl --no-owner -h ... mydb > mydb.dump
and then use heroku pgbackups:restore:
heroku pgbackups:restore heroku-database-url url-to-your-mydb.dump-file
Note that the mydb.dump file needs to be accessible to the Heroku servers.
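One way to do that (a sketch assuming the AWS CLI and an S3 bucket you own; the bucket name is a placeholder) is to upload the dump and generate a temporary signed URL for it:
aws s3 cp mydb.dump s3://your-bucket/mydb.dump
aws s3 presign s3://your-bucket/mydb.dump --expires-in 3600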
The Heroku Dev Center page has detailed instructions:
https://devcenter.heroku.com/articles/heroku-postgres-import-export
