I'm new to Rails, and I'm trying to deploy an app on Google App Engine Flex with Postgres.
My database.yml contains the following:
production:
  adapter: postgresql
  socket: /cloudsql/[postgres_instance_name]
  database: pia-data
  timeout: 5000
  pool: 5
  username: postgres
  password: test
but I get the following error:
PG::ConnectionBad (could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
):
I tried with a private IP and VPC, but I get the same error.
I don't know what I'm missing exactly to make it run, as it works locally with 127.0.0.1 and the Cloud SQL Proxy.
Has anybody had the same issue on GCP?
Thank you in advance
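For what it's worth, the postgresql adapter doesn't read a socket: key (that key belongs to the mysql2 adapter); it passes host straight through to libpq, which treats a value starting with / as a Unix-socket directory. A minimal sketch along those lines, keeping the question's placeholder instance name and credentials, and assuming app.yaml lists the same instance under beta_settings: cloud_sql_instances so that /cloudsql is actually mounted:

# database.yml (production) - Cloud SQL over the Unix socket; a sketch
production:
  adapter: postgresql
  database: pia-data
  username: postgres
  password: test
  pool: 5
  host: "/cloudsql/[postgres_instance_name]"  # socket directory, e.g. /cloudsql/project:region:instance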
Related
In our Ruby on Rails applications, we use TinyTDS to connect to Azure SQL Server databases. A sample configuration would be as below (and in general, it all works fine):
development:
  adapter: sqlserver
  host: mytestsite.database.windows.net
  mode: DBLIB
  port: 1433
  database: mytestdb
  username: myusername
  password: mypassword
  azure: true
At the moment, working remotely, I find this very slow, so I would like to take a local copy (as I would do with my .NET applications) and attach to that.
Connecting to (localdb)\MSSQLLocalDB is proving to be a problem with TinyTDS.
Has anyone done this before?
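One possible workaround, sketched below: TinyTDS is built on FreeTDS, which speaks TDS over TCP and cannot open the (localdb)\MSSQLLocalDB named pipe, so this assumes you restore the local copy into a SQL Server Express (or Developer) instance with the TCP/IP protocol enabled on port 1433. The database name and credentials here are placeholders:

# database.yml (development) - local SQL Server over TCP instead of LocalDB; a sketch
development:
  adapter: sqlserver
  host: localhost
  port: 1433
  mode: DBLIB
  database: mytestdb_local   # placeholder for the restored copy
  username: mylocaluser      # placeholder
  password: mylocalpassword  # placeholder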
I am attempting to get Rails running in an Ubuntu subsystem on Windows 10.
I have installed everything needed, but Rails is unable to access Postgres.
In both /etc/postgresql/9.5/main/postgresql.conf and the rails config/database.yml, I have port: 5433 instead of 5432. See [1] below for why I'm using 5433.
When trying to do a database operation through rails (e.g. rails db:setup), I get this:
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
database.yml:
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5

development:
  <<: *default
  database: some_database
  username: 'postgres'
  password: 'postgres'
I cannot figure out how to make Rails connect to Postgres on port 5433... I'm also not sure if there is a better way to solve the issue here.
Thanks for your time.
Further details:
[1] I am using port 5433 simply because I've been unable to figure out a way to make it use 5432. I have removed all postgres libraries and made sure to reinstall one using a specified version number, but it defaults to port 5433. When changing the port to 5432 in the config, starting it yields the error that some other process is using it, but netstat, lsof, and ps aux disagree. Not sure what else to do there.
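For reference, the error above shows Rails still probing the default socket and port 5432, which suggests the 5433 setting is not reaching the connection (the database.yml shown has no port or host key). A minimal sketch of a default block that forces a TCP connection on 5433; everything except host and port is taken from the question:

# database.yml - pin the default block to TCP port 5433; a sketch
default: &default
  adapter: postgresql
  encoding: unicode
  pool: 5
  host: localhost   # forces TCP instead of the default Unix socket
  port: 5433        # must match the port in postgresql.conf

development:
  <<: *default
  database: some_database
  username: 'postgres'
  password: 'postgres'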
For me, I believe this issue was somehow caused by something being cached by Spring or some other service. After doing the following, the problem was resolved, though I'm not sure which of these was actually required:
Killing all Spring processes
Restarting the machine
Deleting and re-creating database.yml
Changing the port from 5433 to 9854 (basically just anything else)
Changing the database user and name
I have been using GCP for a few days now but I am struggling to get my Ruby on Rails app to connect to Postgres hosted on Cloud SQL.
I have managed to connect locally via the Cloud SQL Proxy and execute migrations, but I have not gotten past that.
Here are my database.yml production settings:
production:
  <<: *default
  adapter: postgresql
  database: databasename
  username: databaseuser
  password: databasepassword
  host: /cloudsql/project-name-172409:us-central1:application-name
Here are my app.yaml settings:
runtime: custom
env: flex
health_check:
  enable_health_check: false
beta_settings:
  cloud_sql_instances: project-name-172409:us-central1:application-name
env_variables:
  SECRET_KEY_BASE: 121212
My custom Dockerfile inherits from the base Ruby image and executes migrations.
The error I get is this:
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting
I had the same issue. I had forgotten to activate the Cloud SQL API. After activating it and deploying everything again, it all worked like a charm.
It looks like Cloud SQL is not mounted during the build of the Docker image, which makes sense. I was assuming it was never mounted, because none of my builds finished (the migrations failed), which led me to presume that the database was not connecting.
Once I removed the migration execution code from my Dockerfile and built the Docker image, I SSH'd into the App Engine instance to check whether the /cloudsql directory was there, which it was. I am currently assuming that Rails can connect to the database, as I am not getting any errors in my web app. I will report back once I have confirmed Rails is connecting to Postgres on Cloud SQL.
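To recap the moving parts: the instance connection name listed under beta_settings in app.yaml has to match the /cloudsql path used as host in database.yml, and that socket directory only exists on the running Flex instance, not during the Docker build, which is why build-time migrations fail. A sketch using the question's placeholder names:

# app.yaml (sketch) - mounts the socket on the running instance only
beta_settings:
  cloud_sql_instances: project-name-172409:us-central1:application-name

# database.yml, production (sketch) - host must match the instance above
production:
  adapter: postgresql
  database: databasename
  username: databaseuser
  password: databasepassword
  host: /cloudsql/project-name-172409:us-central1:application-name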
I have been searching for a solution for a couple of days and have come up with nothing.
Is it really impossible to run a Rails app from a remote database?
My case:
I have a database located in one of my offices on a local server.
I need my Rails app (hosted on some hosting service) to connect to and run from this private database on this private server.
Is there really a solution or workaround?
And, not related to Rails, but is it possible with the Django framework? Just out of curiosity.
Thank you all very much for the answers!
You can set host in your database.yml like this:
production:
  adapter: postgresql
  encoding: utf8
  database: prod_db
  username: prod_user
  password: prod_pwd
  host: 10.10.10.10
  port: 5432
  pool: 3
But you need a static IP, or you can create a VPN between the two servers.
Last week I succeeded in getting a database and a Rails app onto two instances and pushing my Rails app from my console with a working connection to the database. Now I want to do the same, but with the two instances (app and db) safely within a VPC on AWS.
I have the two instances launched within the VPC, but I'm struggling to figure out the final steps of a) setting up my database.yml to connect to the db now that there's no public EC2 host to refer to; it needs to go through the VPC I assume, but how? And b) setting up git so a simple git push production is linked to this Rails instance on AWS.
The only things I've found so far about going about this assume I'm creating the instances via Beanstalk, which wasn't the case. Thoughts?
Edit:
In terms of problem a), it is now working if I do RAILS_ENV=production rails s, but only with a public DNS or IP; the private ones aren't getting me in. The server ends up timing out, even though I've made sure that the VPC is open on port 5432 and so are the instances. The error is:
PG::ConnectionBad (could not connect to server: Operation timed out
Is the server running on host "10.0.0.153" and accepting
TCP/IP connections on port 5432?
):
And here's my database.yml file
production:
  adapter: postgresql
  encoding: unicode
  database: database_name
  host: ip_address
  pool: 10
  username: my_username
  password: my_password
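For completeness, a hedged sketch of a production block pointed at the DB instance's private address inside the VPC. The host below reuses the 10.0.0.153 address from the error; it assumes the DB instance's security group allows inbound TCP 5432 from the app instance's security group, that listen_addresses in postgresql.conf covers that interface, and that pg_hba.conf has a matching host entry for the app's subnet:

# database.yml (production) - connect over the VPC's private network; a sketch
production:
  adapter: postgresql
  encoding: unicode
  database: database_name   # placeholder from the question
  host: 10.0.0.153          # private IP of the DB instance
  port: 5432
  pool: 10
  username: my_username     # placeholder
  password: my_password     # placeholder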