How to enable contrib modules on Heroku Postgres database - ruby-on-rails

I'm trying to use contrib modules in the new Postgres 9 shared databases on Heroku, specifically the pg_trgm and fuzzystrmatch modules. The documentation says:
In addition, many complementary extensions are available such as fuzzystrmatch, pg_trgm, and unaccent.
I can't seem to find any documentation on HOW to actually enable these modules on a shared Heroku database. See answer below.
NOTE:
I tried adding them by connecting to the database with
heroku pg:psql HEROKU_POSTGRESQL_BROWN
and running
create extension pg_trgm
create extension fuzzystrmatch
but after trying to use it with
SELECT levenshtein('tests', 'test');
it still said
ERROR:  function levenshtein(unknown, unknown) does not exist
LINE 1: SELECT levenshtein('tests', 'test');
               ^
HINT:  No function matches the given name and argument types. You might need to add explicit type casts.
Anybody know why this happens?

Found the answer here while scouring Stack Overflow. Don't know why it didn't come up in any of my Google searches. Going to leave the question here in case anybody else uses the same wording to search for this.
To enable modules, you need to add them to a migration as follows:
def up
  execute "create extension fuzzystrmatch"
  execute "create extension pg_trgm"
end

In newer versions of Rails it should be sufficient to do:
def change
  enable_extension "fuzzystrmatch"
  enable_extension "pg_trgm"
end
If you need to write up and down methods, the corresponding method to enable_extension is disable_extension.
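For reference, a complete migration using these helpers might look like the sketch below (the class name is just an illustration):
class AddFuzzySearchExtensions < ActiveRecord::Migration
  # Enables the two contrib modules from the question; disable_extension
  # makes the migration reversible.
  def up
    enable_extension "fuzzystrmatch"
    enable_extension "pg_trgm"
  end

  def down
    disable_extension "pg_trgm"
    disable_extension "fuzzystrmatch"
  end
end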

Related

(PG::UndefinedObject: ERROR: type "hstore" does not exist) in rails production

I have a column with type hstore. During the migration for a schema it does not work even though the database has the hstore extension enabled; I get a (PG::UndefinedObject: ERROR: type "hstore" does not exist) error during migration. This works perfectly locally. How do I make it take effect across all schemas?
To create an extension in your database, you have to explicitly connect to that database. So, if your database is my_app_development, you have to do:
sudo -u postgres psql my_app_development
create extension hstore;
Also, you do not say which Rails version you're on. If you're not on Rails 4, you will have to use the postgres hstore gem.
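On Rails 4 or newer, the same thing can also be done from a migration; a minimal sketch (the migration class name is made up):
class EnableHstore < ActiveRecord::Migration
  def change
    # Adds the hstore type to the database so hstore columns can be created.
    enable_extension "hstore"
  end
end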
That extension is most likely located in a schema outside of the search_path for the user running the query that gave you the error.
You can fix it either by recreating the extension in the public schema:
CREATE EXTENSION hstore WITH SCHEMA public;
Note that it is possible to change the default setting and not have public in the search_path.
Or by adding the schema that hstore is located in to the search_path:
ALTER ROLE your_role_name
  SET search_path = public, your_role_name, some_schema_with_hstore_extension;
This requires a new connection to take effect. You can also use SET search_path ... in the session to have an immediate effect for that session only. I do not remember at the moment whether your_role_name needs permissions on the schema some_schema_with_hstore_extension and the hstore objects within it; most likely it does, but they might already be granted.
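For example, from a Rails console or migration you could set it for the current connection only; a sketch reusing the placeholder schema name from above:
# Takes effect immediately, but only for this connection/session.
ActiveRecord::Base.connection.execute(
  "SET search_path TO public, some_schema_with_hstore_extension"
)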
This works for me:
SELECT public.hstore_to_json(hstore_col)
FROM ...;

How do I enable an extension only if it doesn't already exist?

I’m using Rails 4.2.7. I want to create a migration that enables an extension, but only if that extension doesn’t exist in the host environment in which I’m running. I have created
class EnableUuidOsspExtension < ActiveRecord::Migration
  def change
    enable_extension 'uuid-ossp'
  end
end
but I would like to suppress enabling of the extension if it is already enabled. How do I adjust the above migration to achieve this? The motivation is that on my local machine I have to run this to add it to Postgres, but if I migrate to Heroku, this extension may already be in place, and I don’t want things to crash when I run my db migration scripts.
There is an extensions method that returns an array of extension names so you could do something like this:
def up
  enable_extension('uuid-ossp') unless extensions.include?('uuid-ossp')
end

def down
  disable_extension('uuid-ossp') if extensions.include?('uuid-ossp')
end
You could also do it by hand in SQL where you'd have access to create extension if not exists:
def up
  connection.execute('create extension if not exists "uuid-ossp"')
end
Per the Postgres documentation, you can explicitly use the IF NOT EXISTS flag (https://www.postgresql.org/docs/current/static/sql-createextension.html).
This is also what enable_extension in the PostgreSQLAdapter does (https://github.com/rails/rails/blob/master/activerecord/lib/active_record/connection_adapters/postgresql_adapter.rb#L332):
def enable_extension(name)
  exec_query("CREATE EXTENSION IF NOT EXISTS \"#{name}\"").tap {
    reload_type_map
  }
end
Also, if IF NOT EXISTS were not used, Postgres would not re-create the extension or do any magic when it is already installed; it would simply throw an error, which is exactly the case where you don't want your migration to crash and burn :)

How to run a specific script after connecting to Oracle using Rails?

I need to run an Oracle script after connecting to an Oracle database using ActiveRecord.
I know initializers exist, but those run only at application start. I need a place to write code that runs after every new database connection is established.
This is needed to initialize some Oracle environment variables shared with other applications that use the same legacy database.
Any ideas?
Thanks in advance.
I found the solution:
Create the file /config/initializers/oracle.rb and put this code in it:
ActiveRecord::ConnectionAdapters::ConnectionPool.class_eval do
  def new_connection_with_initialization
    result = new_connection_without_initialization
    result.execute('begin Base_Pck.ConfigSession; end;')
    result
  end

  alias_method_chain :new_connection, :initialization
end
alias_method_chain allows you to change a method (new_connection) without overriding it, by extending it instead.
Then we only need to change the script inside the result.execute call.
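Note that alias_method_chain was deprecated in Rails 5; a hedged sketch of the same idea using Module#prepend instead, assuming the same Base_Pck.ConfigSession procedure and that your Rails version still builds connections through ConnectionPool#new_connection:
# config/initializers/oracle.rb
module OracleSessionInitialization
  def new_connection
    conn = super
    # Run the legacy package's session setup on every freshly created connection.
    conn.execute('begin Base_Pck.ConfigSession; end;')
    conn
  end
end

ActiveRecord::ConnectionAdapters::ConnectionPool.prepend(OracleSessionInitialization)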

Rails migrations with database-specific data types

I'm currently running a Rails migration where I am adding a datatype specific to Postgres, the tsvector. It holds search information in the form that Postgres expects for its built-in text searching capabilities.
This is the line from my migration:
t.column "search_vectors", :tsvector
Everything seems to be working fine, and the search works with it. However, when I opened up schema.rb, this is what I got:
Could not dump table "users" because of following StandardError
Unknown type 'tsvector' for column 'search_vectors'
This is preventing me from running unit tests on the user table, and also strikes me as really dangerous looking given that the schema.rb is supposed to be the authoritative definition of my database.
I notice there are a number of Rails plugins that seem to use the same approach of storing the tsvector like I would expect, such as tsearchable. Am I really just stuck without testing and without an authoritative definition of my database?
FYI for anyone who happens across this page, I fixed this by adding this (actually uncommenting it) to my Rails config:
config.active_record.schema_format = :sql
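That setting lives in the application config; a minimal sketch of where it goes (the MyApp module name is a placeholder):
# config/application.rb
module MyApp
  class Application < Rails::Application
    # Dump the schema as db/structure.sql so database-specific types like
    # tsvector survive the dump/load cycle.
    config.active_record.schema_format = :sql
  end
end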
Have you tried specifying the type as a string instead of a symbol?
t.column "search_vectors", "tsvector"
If that doesn't work then you might need to drop down to database-specific SQL:
def self.up
  execute "--Put your PostgreSQL specific SQL statements here"
end
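For instance, a hedged sketch of adding the tsvector column with raw SQL, assuming the users table and search_vectors column from the question:
def self.up
  # tsvector is Postgres-specific, so it is added outside the Rails type system.
  execute "ALTER TABLE users ADD COLUMN search_vectors tsvector"
end

def self.down
  execute "ALTER TABLE users DROP COLUMN search_vectors"
end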

Ruby on Rails Migration - Create New Database Schema

I have a migration that runs an SQL script to create a new Postgres schema. When you create a new database in Postgres, by default it creates a schema called 'public', which is the main schema we use. The migration that creates the new schema seems to work fine. However, after it has run, when Rails tries to update the 'schema_info' table it relies on, it says the table does not exist, as if it were looking for it in the new schema rather than in the default 'public' schema where the table actually is.
Does anybody know how I can tell rails to look at the 'public' schema for this table?
Example of SQL being executed:
CREATE SCHEMA new_schema;
COMMENT ON SCHEMA new_schema IS 'this is the new Postgres database schema to sit along side the "public" schema';
-- various tables, triggers and functions created in new_schema
Error being thrown:
RuntimeError: ERROR C42P01 M relation "schema_info" does not exist
L221 R RangeVarGetRelid: UPDATE schema_info SET version = ??
Thanks for your help
Chris Knight
Well, that depends on what your migration looks like, what your database.yml looks like, and what exactly you are trying to do. Anyway, more information is needed; change the names if you have to, and post an example database.yml and the migration. Does the migration change the search_path for the adapter, for example?
But know that, in general, Rails and PostgreSQL schemas don't work well together (yet?).
There are a few places which have problems. Try to build an app that uses only one pg database with 2 non-default schemas, one for dev and one for test, and tell me about it. (From the following I can already tell you that you will get burned.)
Maybe it was fixed since the last time I played with it, but when I see http://rails.lighthouseapp.com/projects/8994/tickets/390-postgres-adapter-quotes-table-name-breaks-when-non-default-schema-is-used or this http://rails.lighthouseapp.com/projects/8994/tickets/918-postgresql-tables-not-generating-correct-schema-list or this in postgresql_adapter.rb:
# Drops a PostgreSQL database
#
# Example:
#   drop_database 'matt_development'
def drop_database(name) #:nodoc:
  execute "DROP DATABASE IF EXISTS #{name}"
end
(Yes, this is wrong if you use the same database with different schemas for both dev and test; it would drop the database for both each time you run the unit tests!)
I actually started writing patches. The first one was for the indexes method in the adapter, which didn't care about the search_path and ended up with duplicated indexes in some conditions. Then I started getting hurt by the rest and ended up abandoning the idea of using schemas: I wanted to get my app done and didn't have the extra time needed to fix the problems I had using schemas.
I'm not sure I understand what you're asking exactly, but rake will be expecting to update the Rails schema version in the schema_info table. Check your database.yml config file; this is where rake will be looking to find the table to update.
Is it a possibility that you are migrating to a new Postgres schema and rake is still pointing to the old one? I'm not sure then that a standard Rails migration is what you need. It might be best to create your own rake task instead.
Edit: If you're referencing two different databases or Postgres schemas, Rails doesn't support this in standard migrations. Rails assumes one database, so migrations from one database to another are usually not possible. When you run "rake db:migrate" it actually looks at the RAILS_ENV environment variable to find the correct entry in database.yml. If rake starts the migration looking at the "development" environment and database config from database.yml, it will expect to update that environment's database at the end of the migration.
So, you'll probably need to do this from outside the Rails stack as you can't reference two databases at the same time within Rails. There are attempts at plugins to allow this, but they're majorly hacky and don't work properly.
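If the real problem is just the search path, one hedged option is to point the connection back at the public schema as well; the PostgreSQL adapter exposes the search path directly (the schema name here comes from the example above):
# Make Rails look in public (where schema_info lives) as well as the new schema.
ActiveRecord::Base.connection.schema_search_path = "public, new_schema"
The same value can also be set with the schema_search_path key for the postgresql adapter in database.yml.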
You can use pg_power. It provides an additional migration DSL for creating PostgreSQL schemas, and more.
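If you would rather not add a gem, the schema itself can also be created from a plain migration with raw SQL; a minimal sketch using the schema name from the question (the class name is just an illustration):
class CreateNewSchema < ActiveRecord::Migration
  def up
    execute "CREATE SCHEMA new_schema"
  end

  def down
    execute "DROP SCHEMA new_schema"
  end
end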
