Can I use Docker for production deployment of a Rails application? - ruby-on-rails

I want to use Docker to deploy my Rails application. Has anyone tried this, and what problems might I face?

Deploying Rails apps to production with Docker is not only possible but something you'd want to do, to make sure your app runs consistently on any server you deploy to.
This comes with some challenges. First, it's advisable to run your database server and your Rails app in different containers to keep things isolated. You can also set up your production server's Docker environment with Docker Machine. Machine lets you provision AWS, DigitalOcean, Azure and Compute Engine instances (among many others) and manage your containers from your own computer. I assume you're just getting started with Docker, so I suggest you take a look at this guide to setting up a Rails + Postgres app with Docker.
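For the "separate containers" part, a minimal sketch of a docker-compose.yml for a Rails + Postgres setup might look like this (image tags, credentials and service names are placeholders to adapt):

```yaml
# Sketch only: Rails app and Postgres in separate containers.
version: "2"
services:
  db:
    image: postgres:9.6              # use the version your app actually targets
    environment:
      POSTGRES_PASSWORD: example     # placeholder credential
    volumes:
      - db-data:/var/lib/postgresql/data
  web:
    build: .                         # Dockerfile for the Rails app in the project root
    command: bundle exec rails server -b 0.0.0.0 -p 3000
    ports:
      - "3000:3000"
    depends_on:
      - db                           # point database.yml / DATABASE_URL at host "db"
volumes:
  db-data:
```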

Related

Is it possible to use Docker as a dev environment for a Golang App Engine Standard Environment app?

My understanding is that Docker requires the App Engine Flexible Environment.
But I want to use Docker only to create a dev and local testing environment, so that it is easier and faster to replicate the environment on dev machines. I still want to deploy the Go app to the App Engine Standard Environment. Is there a way to do this?
You can build custom runtimes using Docker; you will only need App Engine Flexible if you want to deploy them. In your case, since you want to deploy to App Engine Standard, I would recommend using the Development Server to simulate the environment correctly.
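If the goal is just a reproducible local dev/test environment, one approach (a sketch; the base image tag and apt package names are assumptions and may differ by SDK version) is to wrap the Development Server in a Docker image while real deploys still go through gcloud app deploy:

```dockerfile
# Sketch: local dev/test image for an App Engine Standard Go app.
# Base image tag and component package names are assumptions; adjust to your SDK version.
FROM google/cloud-sdk:slim

# The dev server and Go support ship as separate Cloud SDK components.
RUN apt-get update && apt-get install -y \
    google-cloud-sdk-app-engine-python \
    google-cloud-sdk-app-engine-go

WORKDIR /app
COPY . /app

# Simulate App Engine Standard locally; real deploys still use `gcloud app deploy`.
EXPOSE 8080
CMD ["dev_appserver.py", "--host=0.0.0.0", "app.yaml"]
```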
You cannot use containers with App Engine Standard; you can use them with App Engine Flexible.

Multiple web apps with Docker architecture

I have multiple web apps, all of them running on Apache, many of them using PHP, MySQL, node, etc.
I'm not currently using Docker, but I would like to, and I would like to know what the best architecture to use would be.
I suppose that on my localhost I should create a container with Apache, and all the applications would use it (am I wrong?). The same with MySQL if the application uses it.
But then, what happens when I want to deploy my projects (or some of them) to a production environment? I'm currently using Microsoft Azure Web Apps, and I don't think my 'localhost' setup will be valid there. I suppose that in production each project should have its own Apache, but this changes my Docker setup, and I don't think that is the Docker philosophy.
So, how should I structure my architecture?
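Purely for illustration, the "one web container per app, one shared database container" layout being weighed above could be sketched like this in docker-compose terms (service names, images and paths are hypothetical):

```yaml
# Illustration only: each app ships its own Apache/PHP container; MySQL is shared.
version: "2"
services:
  app1:
    image: php:7.2-apache            # hypothetical per-app web container
    volumes:
      - ./app1:/var/www/html
  app2:
    image: php:7.2-apache
    volumes:
      - ./app2:/var/www/html
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder credential
```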

Feasibility of choosing EC2 + Docker as a production deployment option

I am trying to deploy my microservice to an EC2 machine. I have already launched my EC2 instance with the Ubuntu 16.04 LTS AMI, found that I can install Docker and run containers on it, and tried a sample service deployment with Docker on that Ubuntu machine. I also successfully ran an image in the background with the -d option.
Can I choose this EC2 + Docker combination to deploy my microservices in an actual production environment? Then I could deploy all my Spring Boot microservices this way.
I know that ECS is another option. To be frank, I'm trying to avoid ECR, the ECS-optimized AMI and their burdens; I'm looking for a machine under my full control that belongs only to me.
But I still need to know about the feasibility of choosing EC2 + Docker on my Ubuntu machine. I am also planning to deploy my Angular 2 app. I don't want to install, deploy and manage an application server for either Spring Boot or Angular, since this will give me something close to a serverless production environment.
What you are describing is a "traditional" single-server environment and does not have much in common with a microservices deployment. However, keep in mind that this may be OK if it is only you, or a small team, working on the whole application. The microservices architectural style was introduced to handle huge, complex applications with large development teams that need to scale out immensely due to fast business growth. Here is an example story from Uber.
Please read this for more information about how and why the microservices architectural style was introduced, as well as its benefits and drawbacks. Now, about your question:
"Can I choose this EC2 + Docker for deployment of my microservice for actual production environment? "
Your question can be answered simply: you can, but it is probably not a good idea, assuming your project is large enough to require a microservices architecture.
You would have to implement all of the following deployment aspects yourself, which are typically covered by an orchestration system such as Kubernetes (a minimal manifest touching several of them is sketched after the list):
Service Discovery and Load Balancing
Horizontal Scaling
Multi-Container Application Deployment
Container Health-Management / Self-Healing
Virtual Networking
Rolling Updates
Storage Orchestration
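As a rough illustration (not a production-ready manifest), a minimal Kubernetes Deployment plus Service for a Spring Boot container could look like the following; the image name, port and the /actuator/health path are placeholders. The replica count, readiness probe, rolling-update behaviour of the Deployment and the Service's stable endpoint cover horizontal scaling, self-healing, rolling updates and service discovery / load balancing from the list above:

```yaml
# Minimal sketch only; image name, port and health path are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service
spec:
  replicas: 3                                  # horizontal scaling
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
        - name: orders-service
          image: myregistry/orders-service:1.0 # hypothetical image
          ports:
            - containerPort: 8080
          readinessProbe:                      # health management / self-healing
            httpGet:
              path: /actuator/health           # assumes Spring Boot Actuator
              port: 8080
---
apiVersion: v1
kind: Service                                  # service discovery + load balancing
metadata:
  name: orders-service
spec:
  selector:
    app: orders-service
  ports:
    - port: 80
      targetPort: 8080
```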
"Since It will gives me about a serverless production environment to
me."
EC2 is by definition not serverless, of course. You will have to maintain your EC2 instances yourself, including OS updates, security patches, etc. And if you only have a single server, you will have service outages because of it.
You can do it; I have had Docker running on standard EC2 instances without problems. By "my microservice" you mean a single microservice, right?
And you don't need service discovery or routing rules?
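If a single service on a single instance really is all that's needed, the plain EC2 + Docker setup can be as simple as the following sketch (the base image, jar name and port are placeholders):

```dockerfile
# Sketch: package a Spring Boot fat jar into an image. Jar name and port are placeholders.
FROM openjdk:8-jre-alpine
COPY target/my-service.jar /app/my-service.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "/app/my-service.jar"]
```

Build it with `docker build -t my-service .` and run it on the instance with `docker run -d -p 8080:8080 --restart=always my-service`.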
Can I choose this EC2 + Docker combination to deploy my microservices in an actual production environment?
Yes, this is totally possible, although I suggest using Kubernetes as the container orchestrator, since it manages the lifecycle of the containers for you:
Running Kubernetes on AWS EC2
Amazon Elastic Container Service for Kubernetes (Amazon EKS)
Manage Kubernetes Clusters on AWS Using Kops

Linking containers together on production deploys

I want to migrate my current deployment to Docker. It relies on a MongoDB service, a Redis service, a Postgres server and a Rails app, and I have already created a Docker container for each, but I have doubts when it comes to starting and linking them. In development I'm using Fig, but I don't think it was meant to be used in production. To take my deployment to production level, what mechanism should I use to auto-start the containers and link them together? My deployment uses a single Docker host that already runs Ubuntu, so I can't use CoreOS.
Linking containers in production is a tricky thing. It hardwires the IP addresses of the dependent containers, so if you ever need to restart a container or launch a replacement (for example when upgrading the version of MongoDB), your Rails app will not work out of the box with the new container and its new IP address.
This other answer explains some available alternatives to linking.
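One such alternative, if your Docker version supports user-defined networks (1.9+), is to put the containers on a shared network and let them resolve each other by container name instead of relying on hard-wired IPs. A rough sketch (container and image names are placeholders):

```sh
# Containers on a user-defined network can reach each other by name (e.g. "mongo", "redis").
docker network create app-net

docker run -d --name mongo --net app-net mongo
docker run -d --name redis --net app-net redis
docker run -d --name pg    --net app-net -e POSTGRES_PASSWORD=example postgres
docker run -d --name rails --net app-net -p 80:3000 my-rails-app   # hypothetical app image
```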
Regarding starting the containers, you can use any deployment tool to run the required docker commands (Capistrano can easily do that). After that, the Docker daemon will bring the running containers back up after a reboot.
You might need a watcher process to restart containers if they die, just as you would have for a normal Rails app.
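For simple crash recovery, a Docker restart policy may already be enough instead of a separate watcher (a sketch; the image name is hypothetical):

```sh
# Restart the container automatically if it exits or the host reboots.
docker run -d --restart=always --name rails my-rails-app
```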
Services like Tutum and Dockerize.it can make this simpler. As far as I know, Tutum will not deploy to your own servers. Dockerize.it will, but it is very rough (disclaimer: I'm part of the team building it).
You can convert your Fig configuration to CoreOS-formatted systemd unit files with fig2coreos. Google Compute Engine supports CoreOS images, or you can run CoreOS on AWS or your cloud provider of choice. fig2coreos also supports deploying to CoreOS in Vagrant for local development.
CenturyLink (the fig2coreos authors) have an example blog post here:
This blog post will show you how to bridge the gap between building complex multi-container apps using Fig and deploying those applications into a production CoreOS system.
EDIT: If you are constrained to an existing host OS, you can use QEMU ("a generic and open source machine emulator and virtualizer") to host a CoreOS instance. Instructions are available from the CoreOS team.

Running and Deploying Rails to Docker Container

I am a total noob to Linux containers and have been spending some time learning about Docker, so forgive any confusion throughout this question. Currently, I have a Rails app in production deployed via Capistrano. My cloud servers are maintained with Opscode Chef on the Debian Wheezy distribution. For development, I have a Vagrant VM preinstalled with the app and services.
If I were to employ Docker, where would my app sit? In the container or on the host? How would I deploy (production) and share directories (development)? Can I run all my additional services, i.e. memcached, Redis, PostgreSQL, etc., on the same server using Docker? I can maybe envision the potential of Docker, but I'm having trouble seeing its practical use.
It seems like containers are part of the future. Any guidance for someone making the switch from virtualization?
If I were to employ Docker, where would my app sit?
It could sit inside the container or on the host (you can use docker build to copy the app into the container).
How would I deploy (production) and share directories (development)?
Deploying your app would mean committing your local container into an image, publishing it, and running a container from the published image on your servers. I have not tried sharing directories between host and container, but you can try this: https://gist.github.com/jpetazzo/5668338 . You can also write a Dockerfile which copies a directory to a target path in the container. Docker's docs on building images will help you there.
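A rough sketch of both points, assuming a fairly standard Rails layout (the Ruby version and image names are placeholders): a Dockerfile that copies the app into the image for production, and a volume mount for development.

```dockerfile
# Sketch: bake the Rails app into the image (production). Versions are placeholders.
FROM ruby:2.3
WORKDIR /app
COPY Gemfile Gemfile.lock ./
RUN bundle install
COPY . .
EXPOSE 3000
CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]
```

For development you could instead mount the working directory into the container, e.g. `docker run -it -v "$PWD":/app -p 3000:3000 my-rails-image`, so edits on the host are visible inside the container immediately.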
Can I run all my additional services, i.e. memcached, Redis, PostgreSQL, etc., on the same server using Docker?
Yes. You will be running multiple containers on the same server.
I'm no expert and I haven't even used Docker myself, but as I understand it, your app sits inside a Docker container. Ideally you would deploy a whole container with your own Ruby version installed and so on.
The big benefit is that you can test exactly the same container in your staging system that you're going to ship to production. So you're able to test the complete system with all installed C extensions, the exact same ls command, and so on.
