I am trying to seed from an SQL file that looks like this:
COPY courses (id, name, "courseID", school_id, created_at, updated_at) FROM stdin;
1 Fundamentals of Programming CSCI150 1 2016-04-27 14:04:07.460825 2016-04-27 14:04:07.460825
I tried this code:
connection = ActiveRecord::Base.connection
# migrate all courses
sql = File.read('db/nurate.sql')
statements = sql.split(/;$/)
# statements.pop
ActiveRecord::Base.transaction do
  statements.each do |statement|
    connection.execute(statement)
  end
end
Previously I used this code for INSERT and similar statements, but here I get the following error:
PG::UnableToSend: another command is already in progress
So I believe this has something to do with the fact that my SQL dump file contains COPY statements. What can I do about this? I don't want to convert the whole file to INSERT statements. Or is that the only solution?
I had the same issue and it cost me a couple of days.
First, I'm guessing you used the SQL from Postgres's dump/export feature. Because of that, whenever you call
connection.execute(statement)
it will raise an error: COPY ... FROM stdin is not a complete query on its own, since it expects the row data to be streamed to the server separately, which execute never does. I did some checking on the internet, and this thread seems to provide some helpful information.
OK, so in order to keep using bulk COPY together with the SQL file exported by pg_dump, I came up with some admittedly dirty code (IMO):
unless Rails.env.production?
  connection = ActiveRecord::Base.connection
  connection.tables.each do |table|
    connection.execute("TRUNCATE #{table}") unless table == "schema_migrations"
  end
  # Create an SQL file with purely COPY data, ignoring schema_migrations data
  cmd = %x[awk -v RS='' '/COPY/{if (!match($0,/schema/)) print $0}' #{Rails.root}/db/dump.sql > #{Rails.root}/db/temp.sql]
  puts cmd
  # Restore the SQL data into Postgres. Passing the password via an env variable like this is never smart
  cmd = %x[PGPASSWORD=#{ActiveRecord::Base.connection_config[:password]} psql --host #{ActiveRecord::Base.connection_config[:host]} -U #{ActiveRecord::Base.connection_config[:username]} --dbname #{ActiveRecord::Base.connection_config[:database]} -1 -f #{Rails.root}/db/temp.sql]
  puts cmd
  # Remove the temp file
  cmd = %x[rm #{Rails.root}/db/temp.sql]
  puts cmd
end
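If you'd rather stay in Ruby than shell out to awk and psql, the pg gem can stream COPY data through the adapter's raw connection. A minimal sketch, assuming you have already split the dump into a COPY statement (copy_sql, a hypothetical variable) and the tab-separated data lines that followed it (rows, also hypothetical):
# A sketch using the pg gem's COPY streaming API (PG::Connection#copy_data).
# `copy_sql` holds the COPY ... FROM stdin line and `rows` the data lines
# that followed it in the dump; both names are hypothetical.
raw = ActiveRecord::Base.connection.raw_connection
raw.copy_data(copy_sql) do
  rows.each { |line| raw.put_copy_data(line + "\n") }
end
copy_data sends the end-of-data marker and surfaces any server error when the block exits, so there is no temp file or password juggling involved.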
Related
The new multi-database feature is great; however, it is unclear to me how you can still execute raw MySQL by calling ActiveRecord::Base.connection.execute and have it select a specific database from the database.yml file.
This:
ActiveRecord::Base.connected_to(role: :reading) do
end
doesn't accept a database.
ActiveRecord::Base.connection.execute always seems to USE the first database in the config file.
Thanks.
Tried ActiveRecord::Base.connected_to do ... as well; it doesn't expect a database either.
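For what it's worth, one pattern that seems to work on Rails 6+ is to give the second database its own abstract base class via connects_to, and run the raw SQL through that class's connection. A sketch, assuming a hypothetical database.yml entry named animals:
# Sketch: "animals" is a hypothetical second entry in database.yml.
class AnimalsRecord < ApplicationRecord
  self.abstract_class = true
  connects_to database: { writing: :animals, reading: :animals }
end

# Raw SQL now runs against :animals instead of the first configured database:
AnimalsRecord.connection.execute("SELECT COUNT(*) FROM dogs")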
This question already has answers here:
Rails + Postgres drop error: database is being accessed by other users
(20 answers)
Closed 9 years ago.
Is there a way to force a database reset even if Postgres has other users connected to it? I almost always get this error when I run $ rake db:reset:
Couldn't drop example_database_name :
#<ActiveRecord::StatementInvalid: PG::Error: ERROR: database "example_database_name" is being accessed by other users DETAIL:
There are 2 other session(s) using the database.
Put this in a file lib/database.rake if you find yourself using db:reset a lot in development.
require 'active_record/connection_adapters/postgresql_adapter'

module ActiveRecord
  module ConnectionAdapters
    class PostgreSQLAdapter < AbstractAdapter
      def drop_database(name)
        raise "Nah, I won't drop the production database" if Rails.env.production?
        # Forbid new connections to the doomed database
        execute <<-SQL
          UPDATE pg_catalog.pg_database
          SET datallowconn = false WHERE datname = '#{name}'
        SQL
        # Terminate any sessions still attached to it
        execute <<-SQL
          SELECT pg_terminate_backend(pg_stat_activity.pid)
          FROM pg_stat_activity
          WHERE pg_stat_activity.datname = '#{name}';
        SQL
        execute "DROP DATABASE IF EXISTS #{quote_table_name(name)}"
      end
    end
  end
end
Obviously this is only designed for use on non-production databases. It will cause all existing DB connections to be severed, so the next page load might be an error if unicorn/passenger/pow is pooling DB connections. If this happens, a simple page refresh will cause the server to open a new connection and all will be well.
If the connected sessions are from your Rails process, you definitely want to stop Rails first; strange things can occur if you drop and rebuild the DB underneath the running process.
Anyhow, the following can be run from the command line to drop Postgres connections:
psql -c "SELECT pid, pg_terminate_backend(pid) as terminated FROM pg_stat_activity WHERE pid <> pg_backend_pid();" -d 'example_database_name'
This question already has answers here:
Possible Duplicate: Truncating all tables in a postgres database
Closed 10 years ago.
How can I delete all data from all tables without dropping the database?
You can use the raw connection object to execute SQL statements:
connection = ActiveRecord::Base.connection
connection.tables.each{|t| connection.execute "TRUNCATE #{t}"}
Use the DatabaseCleaner gem.
DatabaseCleaner.strategy = :truncation
DatabaseCleaner.clean
If you absolutely must have this within a rake task, just wrap it up in one yourself.
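A minimal sketch of such a task (the db:truncate name is my own):
# lib/tasks/truncate.rake
namespace :db do
  desc "Truncate all tables via DatabaseCleaner"
  task truncate: :environment do
    DatabaseCleaner.strategy = :truncation
    DatabaseCleaner.clean
  end
end
Then run rake db:truncate.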
For ad-hoc use
Run this statement in the database (Careful! Nukes all your data!):
DO
$func$
BEGIN
   EXECUTE (
      SELECT 'TRUNCATE TABLE '
          || string_agg(quote_ident(t.tablename), ', ')
          || ' CASCADE'
      FROM   pg_tables t
      WHERE  t.schemaname = 'public'  -- assuming default schema
   );
END
$func$;
The DO command was introduced with PostgreSQL 9.0. For older versions, you would create a plpgsql function and execute it.
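A sketch of such a function; it loops instead of aggregating, since string_agg() also only arrived in 9.0:
CREATE OR REPLACE FUNCTION truncate_all_tables() RETURNS void AS
$func$
DECLARE
   _tbl text;
BEGIN
   FOR _tbl IN SELECT tablename FROM pg_tables WHERE schemaname = 'public' LOOP
      EXECUTE 'TRUNCATE TABLE ' || quote_ident(_tbl) || ' CASCADE';
   END LOOP;
END
$func$ LANGUAGE plpgsql;

SELECT truncate_all_tables();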
Compare to the closely related question: Truncating all tables in a Postgres database
For repeated use
It might be simpler (and faster!) to create a "template" database (let's name it my_template) with your vanilla structure and all empty tables. Then go through a DROP / CREATE DATABASE cycle:
DROP DATABASE mydb;
CREATE DATABASE mydb TEMPLATE my_template;
This is extremely fast, because PostgreSQL copies the whole structure on the file level. No concurrency issues or other overhead slowing you down.
You can back up the database with pg_dump and restore only the schema with pg_restore --schema-only; this deletes all data in all tables.
Example:
To back up:
pg_dump --format=c --compress=0 -h localhost mydatabasename > mydump.dmp
To restore only the schema information, without data:
pg_restore -c --schema-only mydump.dmp | psql -h localhost mydatabasename
I want to execute this SQL script inside my db/seeds.rb:
LOAD DATA LOCAL INFILE '/home/list-38.csv'
INTO TABLE list
FIELDS TERMINATED BY ':'
LINES TERMINATED BY '\n'
(email,name,password);
I checked this link, but was unable to figure out the solution: How to seed mysql database by running sql scripts in Ruby Rails platform? I want it so that once we run rake db:seed, it dumps my data into the table.
Any queries, do reply.
Thanks.
Try this in db/seeds.rb to execute raw SQL with rake db:seed:
connection = ActiveRecord::Base.connection
connection.execute("*_YOUR_SQL_HERE_*")
For multiple SQL statements, I ended up iterating over each statement separately:
# db/seeds.rb
connection = ActiveRecord::Base.connection
sql = <<-EOL
  INSERT INTO users values ('Admin');
  INSERT INTO categories values ('Supervisors');
EOL
sql.split(';').each do |s|
  connection.execute(s.strip) unless s.strip.empty?
end
A variation on Fredrik Boström's answer that can load from a file:
# db/seeds.rb
require 'active_support/core_ext/object/blank'

def execute_sql_file(path, connection = ActiveRecord::Base.connection)
  IO.read(path).split(';').reject(&:blank?).each do |statement|
    connection.execute(statement)
  end
end

execute_sql_file('my.sql')
Super late to this answer, but hopefully it helps someone in my position. Instead of iterating over each SQL statement, you can use .execute_batch on a direct connection to your database, made with whichever database driver you are using. Below is an example with SQLite3.
# db/seeds.rb
require 'sqlite3'

DB = { conn: SQLite3::Database.new("./db/development.sqlite") }
sql = <<-EOL
  INSERT INTO users values ('Admin');
  INSERT INTO categories values ('Supervisors');
EOL
DB[:conn].execute_batch(sql)
Is there an easy way to see the actual SQL generated by a Rails migration?
I have a situation where a migration to change a column type worked on my local development machine but partially failed on the production server.
My PostgreSQL versions differ between local and production (7 on production, 8 locally), so I'm hoping that by looking at the SQL generated by the successful local migration I can work out a SQL statement to run on production to fix things.
Look at the log files: log/development.log locally vs log/production.log on your server.
I did some digging and found another way this can be achieved too. (This way only gives you the SQL, so it was a bit easier for me to read.)
PostgreSQL will log every query it executes if you put this line in its config file, postgresql.conf (there's a commented-out example in the "What to log" section):
log_statement = 'all'
Then I rolled back and re-ran my migration locally to find the SQL I was looking for.
This method also gives you the SQL in a format where you can easily paste it into something like pgAdmin's query builder and play around with it.
You could set the logger to STDOUT at the top of your migration's change, up, or down methods. Example:
class SomeMigration < ActiveRecord::Migration
  def change
    ActiveRecord::Base.logger = Logger.new(STDOUT)
    # ...
  end
end
Or see this answer for adding SQL logging to all rake tasks.