Do we end up with multiple Apache/PHP installations when using Docker?

I am trying to understand Docker. The idea is that we can create a complete environment and have it run on any machine. But in my mind there is a problem.
Imagine that I create an Apache/PHP/MySQL application where Apache/PHP is in one container and MySQL is in another container. Then my friend says - "Hey mate, I just created this really cool Apache/PHP/PostgreSQL application, can I run it on your server?". But he also has an Apache/PHP container (with a different configuration) and a PostgreSQL container. He DID NOT KNOW that there is already an Apache/PHP container running on the server. That means we will have 2 Apache and PHP installations running on the server.
If we were not using Docker we could use the same Apache and PHP; we would only need to install some Postgres extensions. Is my understanding correct? Is this how it is done?

Related

Docker without internet

I am currently working on a project which needs to be deployed on customer infrastructure (not cloud), and it will not have internet access.
We currently deploy our application manually and install dependencies from tarballs. Can Docker help us here?
Note:
Application stack:
NodeJs
MySql
Elasticsearch
Redis
MongoDB
We will not have internet.
You can use docker save to export Docker images to TAR format and docker load to import them again. If you package your application files within these images, this can be used to deliver your project to your customers.
Also note that the destination servers must all have Docker Engine installed and running.
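As a rough sketch (the image names and tarball name are made up):
# on a machine with internet access: pull/build the images, then export them
docker save -o myapp-images.tar myapp:1.0 mysql:5.7 redis:3.2
# copy the tarball to the offline server (USB drive, scp over the internal network, ...)
# on the offline server: import the images, then run them as usual
docker load -i myapp-images.tar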
If you have control over your dev environment, you can also use Nexus or Gitlab as your private Docker repository. You can then pull your images from there into production, if it makes sense for your product.
I think the biggest advantage is in your local dev setup. Instead of installing, say, MySQL locally, you can run it as a Docker container. I use docker-compose for all client services in my current project. This helps keep your computer clean, makes it easy to avoid versioning hell (if you use different versions for each release or stage), and you don't have to mess around with configuration on each dev machine.
In my previous job every developer had a local Oracle SQL install, and that was not a happy state of affairs.
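For example, a minimal docker-compose.yml for running MySQL as a local dev service could look something like this (compose version, image tag and password are just examples):
# docker-compose.yml - minimal sketch for a local dev MySQL
version: "2"
services:
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: secret
    ports:
      - "3306:3306"
    volumes:
      - mysql-data:/var/lib/mysql
volumes:
  mysql-data:
docker-compose up -d starts it, docker-compose down stops it, and your host machine stays clean.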

Two separate docker php environments, one shared mysql

I am SUPER new to Docker and have spent 4-5 hrs trying to figure this out with no luck so I am turning to you docker geniuses.
I currently have multiple websites, each with their own docker container. Each container is a full environment I created using the Docker documentation - PHP, Ubuntu, MySQL, and a web server (nginx / apache). Though this works, it isn't what I need / want in the long run.
I have several Laravel sites running PHP 7 and nginx with MySQL. I also have a couple of Phalcon PHP 5.5 containers using Apache and MySQL. For each site I have a container built from a base like this webdevops image, and then used the exec command to go in and add the Laravel or Phalcon stuff.
The problem is that many times I need to reference multiple databases at once. The sites aren't linked, but I need a quick look at a db from another project. I also need to run a new container for EACH site, which is stupid because all the Laravel sites have the EXACT same environment.
What I would love is to have one MySQL container with all my databases, a container with PHP 7 and nginx for ALL my Laravel sites, and a container with PHP 5.5 and Apache for ALL my Phalcon stuff. That would let me look at the code in one environment (without running that environment) AND see the tables in the database while running the other environment - i.e. run environment container A with sites 1, 2, 3 mapped plus the shared MySQL container, and still see the databases for sites 3 and 4 without running environment container B.
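Something like the following docker-compose sketch is roughly what I am picturing (image names, tags and paths are just placeholders I made up):
# docker-compose.yml - rough sketch of the layout described above
version: "2"
services:
  mysql:
    image: mysql:5.6
    environment:
      MYSQL_ROOT_PASSWORD: secret
    volumes:
      - db-data:/var/lib/mysql
  laravel:
    image: webdevops/php-nginx:7.0      # placeholder for a PHP 7 + nginx image
    volumes:
      - ./laravel-sites:/app
    links:
      - mysql
  phalcon:
    image: webdevops/php-apache:5.6     # placeholder - whichever PHP 5.5 variant fits
    volumes:
      - ./phalcon-sites:/app
    links:
      - mysql
volumes:
  db-data:
That way docker-compose up mysql laravel would bring up only environment A plus the shared database.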
I tried creating yaml files in each project and having a shared dir with environment dockerfiles, but that isn't working. I have used the likes of this, and this, and this to try and guide me, but no luck.
Can anyone give me some pointers on where to start or help me with a super simple base example of how to do this?
Thanks in advance.

How can my friend and I share an exact development environment together while on different operating systems?

I use a Mac for development and deployment, and have a need for creating an isolated environment. I've been exploring vagrant and docker and it seems that in order to run Docker, I need to be on a linux environment. I'm running an instance of vagrant with Ubuntu, the same as my partner uses on their desktop.
My question is, can my partner run the docker container off their Ubuntu instance instead of having to setup Vagrant like myself? Does my server and app run inside my Docker instance? (I'm using MEAN).
Trying to build a workflow and piece it all together.
He could probably get docker to run but packaging it all inside of a vagrant VM really is the way to go as that will keep it transportable across the board.
You can skip the vagrant file and just share the Docker images. There should be no detectable host differences from within the container.

What would be a good docker webdev workflow?

I have a hunch that docker could greatly improve my webdev workflow - but I haven't quite managed to wrap my head around how to approach a project adding docker to the stack.
The basic software stack would look like this:
Software
Docker image(s) providing custom LAMP stack
Apache with several modules
MySQL
PHP
Some CMS, e.g. Silverstripe
Git
Workflow
I could imagine the workflow to look somewhat like the following:
Development
Write a Dockerfile that defines a LAMP-container meeting the requirements stated above
REQ: The machine should start apache/mysql right after booting
Build the docker image
Copy the files required to run the CMS into e.g. ~/dev/cmsdir
Put ~/dev/cmsdir/ under version control
Run the docker container, and somehow mount ~/dev/cmsdir to /var/www/ on the container
Populate the database
Do work in ~/dev/cmsdir/
Commit & shut down docker container
Deployment
Set up remote host (e.g. with ansible)
Push container image to remote host
Fetch cmsdir-project via git
Run the docker container, pull in the database and mount cmsdir into /var/www
Now, this looks all quite nice on paper, BUT I am not quite sure whether this would be the right approach at all.
Questions:
While developing locally, how would I get the database to persist between reboots of the container instance? Or would I need to run sql-dump every time before spinning down the container?
Should I have separate container instances for the db and the apache server? Or would it be sufficient to have a single container for the above use case?
If using separate containers for database and server, how could I automate spinning them up and down at the same time?
How would I actually mount ~/dev/cmsdir/ into the container's /var/www/ directory? Should I utilize data-volumes for this?
Did I miss any pitfalls? Anything that could be simplified?
If you need database persistence independent of your CMS container, you can use one container for MySQL and one container for your CMS. In that case, you can keep your MySQL container running and redeploy your CMS as often as you want, independently.
For development, another option is to map the MySQL data directories from your host/development machine using data volumes. This way you can manage the data files for MySQL (in docker) using git (on the host) and "reload" the initial state anytime you want (before starting the MySQL container).
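For example (paths and image tag are just examples):
# map a host directory as MySQL's data directory so the data survives container removal
docker run -d --name dev-mysql \
  -v /home/user/dev/mysql-data:/var/lib/mysql \
  -e MYSQL_ROOT_PASSWORD=secret \
  mysql:5.6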
Yes, I think you should have a separate container for db.
I am using just a basic script:
#!/bin/bash
# run both containers detached so $(...) captures their container IDs
JOB1=$(docker run -d ... /usr/sbin/mysqld)
JOB2=$(docker run -d ... /usr/sbin/apache2)
echo "MySql=$JOB1, Apache=$JOB2"
Yes, you can use data-volumes with the -v switch. I would use this for development. You can use read-only mounting, so no changes will be made to this directory if you don't want them (your app should store its data somewhere else anyway).
docker run -v=/home/user/dev/cmsdir:/var/www/cmsdir:ro image /usr/sbin/apache2
Anyway, for the final deployment I would build an image using a Dockerfile with ADD /home/user/dev/cmsdir /var/www/cmsdir
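A minimal Dockerfile along those lines might look like this (base image and packages are assumptions; note that ADD/COPY source paths are resolved relative to the build context, so this assumes the Dockerfile sits next to cmsdir):
# Dockerfile - sketch of a deployment image that bakes cmsdir into /var/www
FROM ubuntu:14.04
RUN apt-get update && apt-get install -y apache2 php5 libapache2-mod-php5
ADD cmsdir /var/www/cmsdir
EXPOSE 80
CMD ["/usr/sbin/apache2ctl", "-D", "FOREGROUND"]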
I don't know :-)
You want to use docker-compose. Follow the tutorial here. Very simple. Seems to tick all your boxes.
https://docs.docker.com/compose/
I understand this post is over a year old at this time, but I have recently asked myself very similar questions and have several great answers to your questions.
You can set up a MySQL Docker instance and have the data persist in a stateless data container, i.e. the data container does not need to be actively running.
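A sketch of that pattern (names and tags are examples):
# create a stateless data container that only holds the volume, then mount it into MySQL
docker create -v /var/lib/mysql --name mysql-data busybox /bin/true
docker run -d --name mysql --volumes-from mysql-data -e MYSQL_ROOT_PASSWORD=secret mysql:5.6
You can remove and recreate the mysql container as often as you like; as long as mysql-data exists, the data stays.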
Yes I would recommend having a separate instance for your web server and database. This is the power of Docker.
Check out this repo I have been building. Basically it is as simple as make build & make run and you can have a web server and database container running locally.
You use the -v argument when running the container for the first time; this will link a specific folder on the host to a folder inside the container.
I think your ideas are great and it is currently possible to achieve all that you are asking.
Here is a turn key solution achieving all of the needs you have listed.
I've put together an easy to use docker compose setup that should match your development workflow requirements.
https://github.com/ehyland/docker-silverstripe-dev
Main Features
Persistent DB
Your choice of HHVM + NGINX or Apache2 + PHP5
Debug and set breakpoints with xDebug
The README.md should be clear enough to get you started.

Running and Deploying Rails to Docker Container

I am a total noob to linux containers and have been spending some time learning about Docker, so forgive my confusion throughout this question. Currently, I have a Rails app in production deployed via capistrano. My cloud servers are maintained with Opscode Chef on the Debian Wheezy distribution. For development, I have a Vagrant VM preinstalled with the app and services.
If I were to employ Docker, where would my app sit? The container or the host? How would I deploy (production) and share directories (development)? Can I run all my additional services ie memcache, redis, postgresql, etc on the same server using docker? I can maybe envision the potential of Docker but am having trouble seeing its practical use.
Seems like containers are part of the future. Any guidance for someone making the switch from virtualization?
If I were to employ Docker, where would my app sit?
It could sit inside the container or it could sit on the host (you can use docker build to copy the app into the container).
How would I deploy (production) and share directories (development)?
Deploying your app would mean committing your local container into an image, publishing it,
and running a container out of the published image on your servers. I have not tried sharing directories between host and container, but you can try this: https://gist.github.com/jpetazzo/5668338 . You can also write a Dockerfile which copies a directory to a target in the container. Docker's docs on building images will help you there.
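For example, a bare-bones Dockerfile for a Rails app could look like this (the Ruby version and commands are assumptions about your app):
# Dockerfile - copy the Rails app into the image and run it
FROM ruby:2.2
WORKDIR /app
COPY Gemfile Gemfile.lock ./
RUN bundle install
COPY . .
EXPOSE 3000
CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]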
Can I run all my additional services ie memcache, redis, postgresql, etc on the same server using docker?
Yes. You will be running multiple containers on the same server.
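Each service would typically get its own container, e.g. (image tags are examples):
docker run -d --name redis redis:3.0
docker run -d --name memcached memcached:1.4
docker run -d --name postgres -e POSTGRES_PASSWORD=secret postgres:9.4
Your app container can then link to (or share a network with) these by name.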
I'm no expert and I haven't even used docker myself, but as I understand it, your app sits inside a docker container. You would ideally deploy a whole container with your own ruby version installed and so on.
The big benefit is that you can test exactly the same container in your staging system that you're going to ship to production. So you're able to test the complete system with all installed C extensions, the exact same ls command, and so on.
