Google App Engine + Postgres Cloud SQL connection issues - ruby-on-rails

I have been using GCP for a few days now, but I am struggling to get my Ruby on Rails app to connect to Postgres hosted on Cloud SQL.
I have managed to connect locally via the Cloud SQL Proxy and run migrations, but I have not gotten past that.
Here are my database.yml production settings:
production:
  <<: *default
  adapter: postgresql
  database: databasename
  username: databaseuser
  password: databasepassword
  host: /cloudsql/project-name-172409:us-central1:application-name
Here are my app.yaml settings:
runtime: custom
env: flex
health_check:
  enable_health_check: false
beta_settings:
  cloud_sql_instances: project-name-172409:us-central1:application-name
env_variables:
  SECRET_KEY_BASE: 121212
My custom Dockerfile inherits from the base Ruby image and executes migrations.
The error I get is this:
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting

I had the same issue. I forgot to activate the Cloud SQL API. After activating it and deploying everything again, it worked like a charm.
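Assuming the gcloud CLI is installed and authenticated, enabling the API and redeploying is two commands (the service name below is the Cloud SQL Admin API; verify it against your project's service list):

```shell
# Enable the Cloud SQL Admin API for the current project,
# then redeploy so App Engine can mount the /cloudsql socket.
gcloud services enable sqladmin.googleapis.com
gcloud app deploy
```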

It looks like Cloud SQL is not mounted during the build of the Docker image, which makes sense. I had assumed it was never mounted, because none of my builds finished due to the migrations failing, which led me to presume that the database was not connecting.
Once I removed the migration execution code from my Dockerfile and built the Docker image, I SSH'd into the App Engine instance to check whether the /cloudsql directory was there, which it was. I am currently assuming that Rails can connect to the database, as I am not getting any errors in my web app. I will report back once I have confirmed Rails is connecting to Postgres on Cloud SQL.
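For reference, a minimal sketch of the Dockerfile change described above, with the migration moved from build time to container start (the base image name and port are assumptions; adjust to your setup):

```dockerfile
# Base Ruby image for App Engine flexible (image name is an assumption)
FROM gcr.io/google_appengine/ruby
COPY . /app
RUN bundle install --deployment
# Run migrations when the container starts, after /cloudsql is mounted,
# instead of at image build time where no Cloud SQL socket exists yet.
CMD ["sh", "-c", "bundle exec rake db:migrate && bundle exec rails server -b 0.0.0.0 -p 8080"]
```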

Related

How do you connect rails app with postgres database on app engine?

I'm new to Rails, and I'm trying to deploy an app on Google App Engine Flex with Postgres.
My database.yml contains the following:
production:
  adapter: postgresql
  socket: /cloudsql/[postgres_instance_name]
  database: pia-data
  timeout: 5000
  pool: 5
  username: postgres
  password: test
but I get the following error:
PG::ConnectionBad (could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
):
I tried with a private IP and a VPC, but got the same error.
I don't know what I'm missing exactly to make it run, as it runs locally with 127.0.0.1 and the Cloud SQL Proxy.
Has anybody had the same issue on GCP?
Thank you in advance.

Rails trying to connect to postgresql on port 5432, but it's configured for 5433

I am attempting to get Rails running in an Ubuntu subsystem on Windows 10.
I have installed everything needed, but Rails is unable to access Postgres.
In both /etc/postgresql/9.5/main/postgresql.conf and the Rails config/database.yml, I have port 5433 instead of 5432. See [1] below for why I'm using 5433.
When trying to do a database operation through rails (e.g. rails db:setup), I get this:
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
database.yml:
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5
development:
  <<: *default
  database: some_database
  username: 'postgres'
  password: 'postgres'
I cannot figure out how to make Rails talk to Postgres on port 5433... I'm also not sure if there is a better way to solve the issue here.
Thanks for your time.
Further details:
[1] I am using port 5433 simply because I've been unable to figure out a way to make it use 5432. I have removed all postgres libraries and made sure to reinstall one using a specified version number, but it defaults to port 5433. When changing the port to 5432 in the config, starting it yields the error that some other process is using it, but netstat, lsof, and ps aux disagree. Not sure what else to do there.
For me, I believe this issue was somehow caused by something being cached by Spring or some other service. After doing the following, the problem was resolved, though I'm not sure what all was required:
- killing all Spring processes
- restarting the machine
- deleting and re-creating database.yml
- changing the port from 5433 to 9854 (basically just anything else)
- changing the database user and name
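One more thing worth ruling out: the .s.PGSQL.5432 path in the error suggests the client is still using the default port over the Unix socket, which happens when the port setting isn't actually applied to the environment in use. A database.yml sketch that pins both host and port explicitly under the shared default block (values are illustrative):

```yaml
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5
  host: 127.0.0.1   # forces a TCP connection instead of the default Unix socket
  port: 5433        # must match the port in postgresql.conf
```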

Configure a remote sqlite3 database on rails

I programmed an application using Rails and I can deploy it on a single machine. It uses a SQLite3 database that is created on the local machine.
Now I need to put that database on another machine, but I have no idea how. I installed a Rails environment and SQLite3 on the other machine. I configured the database.yml file this way:
development:
  adapter: sqlite3
  database: db/development.sqlite3
  host: 172.**.**.**
  pool: 5
  timeout: 10000
  username: username
  password: password
However, nothing happens. Do I need to configure something on the other machine? Am I missing something? Sorry if I seem ignorant; it is the first time I have done something like this.
Out of the box, no, you cannot, and it's actually discouraged in SQLite's own documentation:
If you have many client programs accessing a common database over a network, you should consider using a client/server database engine instead of SQLite.
You can take a look at various solutions built around SQLite to solve this problem here.
However, a much better solution would be to switch to another RDBMS such as MySQL or PostgreSQL. It should not impact your app much, as ActiveRecord does a nice job of isolating you from DB-specific instructions.
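For example, switching the development block from SQLite to PostgreSQL is mostly a database.yml change plus the pg gem (host and names below are placeholders):

```yaml
development:
  adapter: postgresql
  encoding: unicode
  host: 172.16.0.10        # placeholder for the remote machine's address
  port: 5432
  database: myapp_development
  username: username
  password: password
  pool: 5
```

The remote PostgreSQL server must also be configured to accept TCP connections from your machine (listen_addresses in postgresql.conf and a matching pg_hba.conf entry).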

Git Deployment to an AWS instance within a VPC

Last week I succeeded in getting a database and a rails app on two instances and pushing from my console my rails app with a working connection to the database. Now I want to do the same but with the two instances (app and db) safely within a VPC on AWS.
I have the two instances launched within the VPC, but I'm struggling to figure out the final steps: a) setting up my database.yml to connect to the DB now that there's no public EC2 host to refer to; it needs to go through the VPC, I assume, but how? b) setting up Git so a simple git push production is linked to this Rails instance on AWS.
The only guides I've found so far about doing this assume I'm creating the instances via Beanstalk, which wasn't the case. Thoughts?
Edit:
In terms of problem a), it is now working if I do RAILS_ENV=production rails s, but only with a public DNS or IP; the private ones aren't getting me in. The server ends up timing out, even though I've made sure that the VPC is open on port 5432 and so are the instances. The error is:
PG::ConnectionBad (could not connect to server: Operation timed out
Is the server running on host "10.0.0.153" and accepting
TCP/IP connections on port 5432?
):
And here's my database.yml file:
production:
  adapter: postgresql
  encoding: unicode
  database: database_name
  host: ip_address
  pool: 10
  username: my_username
  password: my_password
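If it helps, intra-VPC access over the private IP usually comes down to the database instance's security group allowing inbound 5432 from the app instance's security group, rather than from a CIDR. A sketch with the AWS CLI (the group IDs are placeholders for your own):

```shell
# Allow the app instances' security group to reach Postgres on 5432.
# sg-0db... and sg-0app... are placeholders for your real group IDs.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0db0000000000000 \
  --protocol tcp --port 5432 \
  --source-group sg-0app000000000000
```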

Deploy rails app on dotcloud

I'm trying to deploy a Ruby on Rails app to dotCloud. The app is deployed, but when I try to access the URL, I get this error:
could not connect to server: No such file or directory Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"? (PG::Error)
I'm using a PostgreSQL database. What changes do I need to make in the database.yml file?
I've also followed the steps outlined here:
http://docs.dotcloud.com/services/postgresql/
Can anyone please help on this?
It looks like your app is configured to use a local PostgreSQL database (local as in "running on the same machine"). You should make sure that your dotcloud.yml file contains a section for a PostgreSQL database, e.g.:
db:
  type: postgresql
Then use either dotcloud info to retrieve the host, port, and credentials of the database, or parse them from environment.json in your Ruby app.
This last step is explained in the dotCloud PostgreSQL service documentation.
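A minimal sketch of that last step in Ruby, reading the connection settings out of environment.json. The DOTCLOUD_DB_SQL_* key names are assumptions based on the "db" service name above; check the keys in your own file:

```ruby
require "json"

# Build an ActiveRecord-style connection hash from dotCloud's
# environment.json. The default path and the DOTCLOUD_DB_SQL_* key
# names are assumptions; adjust them to what your file actually contains.
def db_config_from_environment_json(path = "/home/dotcloud/environment.json")
  env = JSON.parse(File.read(path))
  {
    adapter:  "postgresql",
    host:     env["DOTCLOUD_DB_SQL_HOST"],
    port:     env["DOTCLOUD_DB_SQL_PORT"].to_i,
    username: env["DOTCLOUD_DB_SQL_LOGIN"],
    password: env["DOTCLOUD_DB_SQL_PASSWORD"],
  }
end
```

You could then merge this hash into the production entry of your database configuration instead of hard-coding credentials.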
