I'm working on a Rails 7 application that connects to an Aurora Serverless v2 Postgres cluster with a primary and two replicas.
The app uses Rails' multiple-database setup in database.yml and generates the IAM authentication token in an ERB block in database.yml.
This works perfectly, except that, as described in the linked guide, the token is only valid for 15 minutes.
My question is: how can I avoid this? My thought was to have a global error handler that regenerates the token and establishes a new connection, but I have no idea how to do that.
The only "solution" I have come up with is restarting the app servers, but that is obviously not viable.
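One approach that might work (purely a sketch with invented names — in a real app the generator block would call Aws::RDS::AuthTokenGenerator#auth_token, and since the ERB in database.yml is only evaluated once at boot, the refresh would have to hook into wherever new connections are actually created, e.g. a custom adapter or connection callback): cache the token and regenerate it shortly before the 15-minute expiry.

```ruby
# Sketch only: TokenCache and its names are hypothetical. The generator
# block stands in for a call to Aws::RDS::AuthTokenGenerator#auth_token.
class TokenCache
  TTL = 15 * 60 # RDS IAM auth tokens are valid for 15 minutes

  # generator: a block returning a fresh token string
  # margin:    regenerate this many seconds before actual expiry
  # clock:     injectable time source (seconds), mainly for testing
  def initialize(margin: 60, clock: -> { Time.now.to_i }, &generator)
    @generator  = generator
    @margin     = margin
    @clock      = clock
    @token      = nil
    @expires_at = 0
  end

  def token
    now = @clock.call
    # Regenerate if we have no token yet, or we are inside the safety margin.
    if @token.nil? || now >= @expires_at - @margin
      @token      = @generator.call
      @expires_at = now + TTL
    end
    @token
  end
end
```

A connection hook could then use something like `password: TOKEN_CACHE.token` when building each new connection, so every checkout gets a token that is guaranteed to be fresh.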
I am building an app for the first time and it requires a back-end connection to a database. I've established a working EC2 and RDS instance in AWS (also a first for me). I don't know exactly how to establish a connection in Xcode with a .php file; there are a couple of holes in my knowledge that I hope someone can clear up. None of the online resources I've found use the exact stack I'm using, so I get confused about where to go from here.
How would I go about connecting to my Amazon RDS (MySQL) instance from inside my Xcode project? Also, would I save a separate .php file with the server connection and insert the URL of that file from my local directory? I know I have an endpoint URL provided by RDS; I'm just confused about how to connect all of this in order to start making POST/GET requests. There isn't a clear tutorial on this, so I hope to get some answers here.
The reason you cannot find a tutorial on this subject is that connecting to backend databases directly from mobile applications is not considered good architectural practice. Not only do you have the problem of providing the mobile application with the database credentials in a secure way, but it would also oblige you to implement the access and authorization logic in your mobile app, and it would create a hard dependency between your mobile app and the database endpoint. What if it crashes? What if you need more CPU/storage on the server side to accommodate client requests, etc.?
The correct architecture is to expose the data required by your mobile app through an API. AWS has several options for creating an API (API Gateway and AppSync). In both cases, the Amplify command line will generate the Xcode Swift code for you for easy integration with your app.
See https://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-generate-sdk-ios-swift.html
See an iOS code sample here https://github.com/sebsto/reinvent2018-mob320 (it does not have RDS integration in the backend; I will leave that as an exercise for you).
Searching the web, you will find other people who have posted similar questions and received similar answers:
https://forums.aws.amazon.com/thread.jspa?threadID=136902
How to connect amazon RDS in iOS
I have a Rails app which I moved to Docker. The process forced me to split the app into two microservices: the main app and an address verification microservice. I encapsulated the address verification microservice in another Rails app which my main app calls. It uses rest-client and blocks until it receives a response.
Requests used to be processed in 300ms. Now they take 1.3s. Looking at the New Relic data, the bulk of the time is spent in the main Rails app calling the address verification Rails app. Is there a preferred way for microservices to communicate between containers? I guess my question is Ruby/Rails specific. Should I look into RabbitMQ? The problem is that I need a verified address very early in the flow, so I'm not sure how much time an asynchronous request to the address verification microservice would buy me.
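As a general point about synchronous inter-service calls like this (a sketch, not the poster's actual code — the URL and method name are invented): it helps to set explicit timeouts and measure each call, so a slow downstream service fails fast instead of silently inflating request times.

```ruby
require "net/http"
require "uri"

# Hypothetical wrapper around a blocking HTTP call to another service.
# Returns the response plus the elapsed wall-clock time, so slow calls
# show up in logs/metrics rather than going unnoticed.
def call_with_timing(url, open_timeout: 1, read_timeout: 2)
  uri  = URI(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.open_timeout = open_timeout # fail fast if the service is unreachable
  http.read_timeout = read_timeout # bound how long we block on a response

  started  = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  response = http.get(uri.request_uri)
  elapsed  = Process.clock_gettime(Process::CLOCK_MONOTONIC) - started

  [response, elapsed]
end
```

With timings recorded per call, a regression like 300ms to 1.3s can be traced to a specific dependency quickly.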
It turns out that the address verification microservice had a problem. I had enabled Devise on the address verification app, and the user find/update actions were taking a lot of time. I'm still not sure why they were so slow, but as soon as I disabled them, I went back to decent numbers. I'll need to find out what on earth was happening with Devise. It's still not what I had with internal calls, but Docker & microservices are not that terrible.
I am using the Devise gem for authentication in my Rails app and it works fine. Until now we had only one server hosting the Rails application.
Now with AWS migration, we have two servers hosting the application. The sign in process has broken and we cannot log in. If we remove one server from the Load Balancer, it starts working again. Adding the server back breaks the login system.
We use ActiveRecord-based authentication in a master-master configuration, i.e. there are two DB servers in master-master mode that stay in sync.
The issue here was not related to the Devise gem. It was that the MySQL database, which is in master-master replication mode, did not replicate the data correctly. The session ID was being saved on one DB server but not the other. Hence removing one app server made it work.
The remedy was to reconfigure the database servers so that master-master replication actually worked.
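The accepted fix here was to repair replication, but a common way to sidestep this class of problem entirely (a sketch, not the poster's actual solution — the key name is a placeholder) is to keep the session out of per-server databases altogether, e.g. with Rails' client-side cookie store, which every app server behind the load balancer can read:

```ruby
# config/initializers/session_store.rb — sketch; "_myapp_session" is a
# placeholder key. With the cookie store the session travels with the
# client, so it does not depend on DB replication between app servers.
Rails.application.config.session_store :cookie_store, key: "_myapp_session"
```

A shared Redis- or memcached-backed store achieves the same thing when sessions must stay server-side.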
I have been using Heroku Postgres as the database for my Rails 4 app deployed to Heroku.
I connect to the DB locally using pgAdmin3 and haven't had any issues.
Now I want to switch my database to an Amazon Redshift instance which has been spun up for me. All I have is a username, password, and the database host name. Where do I store this information within my Rails 4 app so that the app will use this DB instead of the current Postgres DB?
I provided a similar answer here, but I would recommend using this adapter to connect:
https://github.com/fiksu/activerecord-redshift-adapter
This certainly works well for any ActiveRecord query you need to do; I'm using insert statements to update Redshift tables rather than ActiveRecord create. I'm working on a full Redshift adapter, hopefully to be released in the next few weeks.
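For the "where do I store this" part of the question, the credentials would normally go in config/database.yml under the adapter name the gem registers. A sketch — host, database, and user are placeholders, and the exact adapter key should be checked against the gem's README:

```yaml
# config/database.yml (sketch; values are placeholders)
production:
  adapter: redshift          # adapter name registered by the gem
  host: my-cluster.example.us-east-1.redshift.amazonaws.com
  port: 5439                 # Redshift's default port
  database: mydb
  username: myuser
  password: <%= ENV["REDSHIFT_PASSWORD"] %>
```

Keeping the password in an environment variable rather than in the file itself avoids committing credentials to source control.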
Here's the answer I've given in the past with code examples about halfway down:
How can I build a front end for querying a Redshift database (hopefully with Rails)
Heroku will need to support Redshift as a database option for you; otherwise you'll need to spin up your own stack.
It might be worth checking out the AWS EBS service to do this.
So I have a Rails application. It currently runs as a separate front-end plus a back-end + database.
I need to scale it to have several back-end servers.
The backend server has Resque background workers running (spawned by user front-end requests). It also relies heavily on callbacks.
I am planning the following setup:
|front-end| --- |load-balancer (haproxy or AWS ELB)| --- Server 1 --- PostgreSQL database (+ other DBs added via replication later if needed)
                                                     \__ Server 2 __/
++ (other servers added in the same fashion later)
I have concerns about how to deal with putting Database on a separate machine in this case.
1) I intend to create a new, empty Rails app with a schema identical to the initial back-end, have it running and accepting updates/posts via HTTP, and keep it connected via remote SSH (to trigger :after_commit callbacks in the back-end). Is this a good idea?
2) I am using PostgreSQL and intend to switch to an enterprise DB once the need arises. Currently the need is to scale the part of the back-end that does processing, not the database.
3) Does this approach seem scalable?
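The enqueue-and-process split this plan implies can be sketched in plain Ruby. Here an in-memory Queue stands in for Resque + Redis, and the names (enqueue_processing, ProcessRecordJob) are invented; in the real app, enqueue_processing would call Resque.enqueue from an :after_commit callback, and work_one_job is roughly what a Resque worker on any back-end server does:

```ruby
# In-memory stand-in for Resque + Redis, so the shape of the code is
# runnable on its own. Names are invented for illustration.
QUEUE = Queue.new

# What the front-end-triggered code path would do (e.g. from an
# after_commit callback): enqueue a job reference, not do the work inline.
def enqueue_processing(record_id)
  QUEUE << { job: "ProcessRecordJob", args: [record_id] }
end

# What a worker process on any back-end server would do: pop and perform.
def work_one_job
  job = QUEUE.pop
  "#{job[:job]} processed record #{job[:args].first}"
end
```

Because the queue is the only shared coordination point, adding Server 3, 4, etc. is just a matter of running more workers against the same Redis instance.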
I'm not sure I really understand your question. Generally, in production applications the database layer is separate from the application layer. I can't quite tell whether this pertains to you, but it's definitely an interesting watch: http://vimeo.com/33263672 . It talks about using a Redis layer between the Rails and DB layers to facilitate queuing and create a zero-downtime environment. It seems like that would be a better solution than using a second Rails stack. I think it should look something like this:
|ELB| Web Servers |ELB| Application Servers |RRDNS| Redis Servers | PostgreSQL Servers |
That is, if I'm understanding your meaning correctly. If not, that video link is still worth a watch :)