Development environment setup for Mac and CentOS using Docker

I have searched the history a little bit but failed to find a good answer, so I am asking my question here. If there is a good answer already, please redirect me to it. Thanks.
My company's new-hire doc lists a bunch of software to install to set up the development environment. It usually takes a new hire one or two days to get everything ready on a new Mac. We want to shorten that process, and the first thing I thought of was Docker.
I read through the Docker user guide and followed some blog posts on how to set up a dev environment using Docker, but I am still a little confused about whether Docker applies to our setting. Here are the details of the requirements:
We need to install a bunch of software (much of it customized binaries). Right now we distribute the source code; a new hire needs to build it from source, install it, and set the environment so the binaries are on the PATH. I am wondering whether Docker allows us to install customized binaries into its containers?
The source code should not stay in the container; it is still checked out on one's local machine using git. How, then, can I rely on the Docker container's environment to build my software? From what I have found, you need to mount your folder into the container and then shell into the container to build. Is that how it works?
We usually develop on Mac. Does Docker also support Mac containers, or does it only let you run Linux containers via Boot2Docker?
Thank you so much in advance for your help.

Some answers :)
First, I think it's a really good idea to use Docker to standardise the development configuration (software, custom packages, env variables, ...).
With Docker, getting your customised binaries into the image from the host is not a problem. With RUN instructions in your Dockerfile, you can use bash to install them and add them to your PATH. You can also write a shell script that installs all your stuff and run this script when you build your image (see the sketch below).
Your code will stay on the host, and you can mount a host folder into your Docker container with the -v option. Ex: docker run -v /home/user/code:/tmp/code your_image. I'll detail below how the developer will use your Docker image.
Yep, on a Mac you have to use Boot2Docker; it works well.
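To illustrate the first two points, here is a minimal Dockerfile sketch; the base image, folder names, and install script are assumptions that would need to match your actual tooling:
FROM ubuntu:14.04

# copy your customised binaries/sources and an install script into the image
COPY custom-tools/ /opt/custom-tools/
COPY install-tools.sh /opt/install-tools.sh

# build and install everything at image build time
RUN bash /opt/install-tools.sh

# put the installed binaries on the PATH
ENV PATH /opt/custom-tools/bin:$PATH
Everything installed this way is baked into the image, so each new hire pulls the same toolchain instead of rebuilding it.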
Once your development image is ready, publish it to the official Docker registry (or host a private registry on your network).
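For example (the image name and registry address are placeholders):
docker build -t your_build_image .
docker tag your_build_image registry.example.com:5000/your_build_image
docker push registry.example.com:5000/your_build_image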
Next, the developer will launch the following Docker command:
docker run --rm -ti -v /home/user/code:/tmp/code your_build_image /bin/bash
This will launch a bash shell inside your Docker container, and the developer will be able to compile the code. Ex: cd /tmp/code && mvn clean install
Please have a look at this article to learn about volumes: http://jam.sg/blog/mongodb-docker-part-2/
And this one about Dockerfile: https://www.digitalocean.com/community/tutorials/docker-explained-using-dockerfiles-to-automate-building-of-images
You can also find a lot of Dockerfiles on GitHub (search for "Dockerfile").

If the goal is to reduce the time it takes to get a Mac set up and usable in your environment, you might want to look at Boxen.
From the "About" section:
"Boxen is your team's IT robot. It's a dangerously opinionated framework that automates every piece of your development environment. GitHub, Inc. wrote the first version of Boxen (imaginatively called “The Setup”) to help employees start shipping on day one."

Related

How to get started with development inside a Docker container on Windows

I am a developer who uses Ubuntu 20.04 LTS regularly for development. I never install packages like Node, PHP, or Python in the OS; I use Docker for that purpose. VS Code is the editor I use, and the Remote - Containers extension helps me develop and debug inside the Docker container.
Right now I am in the process of moving my development to a Windows environment, and I want to follow a similar workflow there too. Unfortunately, I am facing a few issues, like file changes not getting detected (when serving Angular and React projects with npm).
https://github.com/microsoft/WSL/issues/4739
https://www.reddit.com/r/bashonubuntuonwindows/comments/c48yej/wsl_2_react_not_reloading_with_file_changes/
I have tried different methods to solve the issue, like:
using WSL 2, then Docker inside it, and serving from the container
using just Docker and serving the code from inside the container
Regardless of the method, the file changes are not getting detected inside the container.
Trust me, I have gone through many bizarre terms like inotify, increasing the number of watchers, etc. Nothing helped.
Is there a developer out there following a similar practice in a Windows environment? (docker + windows)
Any help is highly appreciated.
I suggest moving the files to the WSL 2 file system rather than the Windows one.
WSL 2 'sees' the Windows file system through a mount at /mnt/c.
Move out of it, for example to ~ (cd ~), and I think your files will be watched normally.
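A minimal sketch of that workflow, assuming an Ubuntu distribution under WSL 2 and a Node-based project (the repository URL, image tag, and port are placeholders):
# inside the WSL 2 shell: keep the project on the Linux file system, not under /mnt/c
cd ~
git clone https://github.com/your-org/your-app.git
cd your-app
# bind-mount from the Linux file system so inotify events reach the container
docker run --rm -it -v "$PWD":/app -w /app -p 4200:4200 node:lts bash
With the sources under ~, the watchers inside the container should pick up changes without having to fall back to polling.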

How to set up dockerized binaries in VS Code

I have learned to use Docker as a development server (LAMP and MEAN), and now I feel I should take the next step by removing the PHP and Node binaries from my system and using the binaries from containers. So on a fresh Solus install, I set up containers for PHP, Node, Ruby, etc. Solus already recommends using containers for such tasks. But I got stuck on the first day.
I installed VS Code (Code - OSS) and installed extensions (Prettier, PHPCS, etc.) on it, and they need the path of the installed binaries (path/to/phpcs, path/to/node, etc.).
I initially set the configured path to
docker run -it --rm herloct/phpcs phpcs
based on https://gist.github.com/barraq/e7f85262bc7a0af2d8d8884d27b62d2c but using a more up-to-date container. It didn't work, so I set it up as an alias, thinking it would fool VS Code into treating it as a native command, but that didn't work either. I have confirmed that running those commands directly from the terminal does work, but the VS Code PHP IntelliSense extension does not want to work.
Any suggestion?
P.S. Any tip for keeping a container running in the background to avoid the container boot-up delay every time I use PHPCS or javac from a container? I can keep the LAMP server running, but every time I use the terminal tools, a new container is started to execute the command and then killed, which causes a delay for startup and shutdown.
In case it is still relevant to someone: You might want to create a VS Code development container to use dockerized binaries.
For this to work, a .devcontainer.json file is required, which could be as simple as:
{
"image": "mcr.microsoft.com/vscode/devcontainers/typescript-node:0-12"
}
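For a PHP setup like the one in the question, a slightly fuller (still hypothetical) configuration might look like this; the image tag, extension ID, and postCreateCommand are assumptions you would adapt:
{
  // assumed image/tag; pick one matching your PHP version
  "image": "mcr.microsoft.com/devcontainers/php:8.2",
  "customizations": {
    "vscode": {
      "extensions": ["bmewburn.vscode-intelephense-client"]
    }
  },
  // hypothetical: install phpcs inside the container once it is created
  "postCreateCommand": "composer global require squizlabs/php_codesniffer"
}
This also helps with the P.S.: the dev container keeps running for as long as the VS Code window is open, so the extensions call the binaries inside it directly instead of starting a fresh container per command. Outside VS Code you can get a similar effect by keeping a long-lived container around (for example docker run -d --name tools your_tools_image sleep infinity, where your_tools_image is whatever image holds your tools) and invoking them via docker exec tools phpcs ...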

Using Docker container as a Ruby on Rails development environment

I want to use Docker as a development environment. I am familiar with the basic Docker concepts such as containers, images, volumes, etc. I am also reading this article.
I think that there are already images specifically created for RoR development. Could someone recommend me a couple of images to start with?
Suppose that I create a container and mount my working folder (RoR projects). Besides writing code, there are also command-line jobs such as Linux tasks (update, install) and Rails-specific commands (rake, migrations, ...). I may need to install new binaries or new gems, or change the Ruby version using rbenv. How can I accomplish these tasks under Docker? Do I type commands in a console, or SSH into the container?
I managed to create an Ubuntu container and run it as follows:
docker run -it -v /Users/me/Documents/Projects:/var/source_files ubuntu
This gives me a console inside my container. Next, I guess I can run commands like gem install, apt-get update, etc. Is this how we should configure our environment?
I cannot find information on how to run it, how to maintain it, how to add/remove gems, etc.
It's really up to you and what you're most comfortable with. I'm assuming solo development on some kind of library rather than full-fledged apps [1].
I, for example, tend to use Makefiles when developing specific Golang projects and have some separate images I use for different occasions. For example, if I have to test a Python / Node script, I simply type play and I get into a silly container with a few dependencies pre-installed:
https://github.com/odino/dev#play
https://github.com/odino/dev/blob/master/play/Dockerfile
In my personal experience, though, I have found that shell scripts / aliases work very well across projects, so I tend to have simple aliases that work on most projects. If I were you, for example, I would take a minimalistic approach and alias dev to docker run -ti -v $(pwd):/src $RUBY_IMAGE so that you can then run dev rake test, dev rails server, etc. from any project (see the sketch below). Your $RUBY_IMAGE should have a few utilities installed (htop, curl and so on) and you should be good to go.
Again, I must stress that it really depends on what you're comfortable with -- most of the time I'm extremely productive with just a Makefile.
[1] If you're working on full-fledged apps, docker-compose works well for a lot of people and has a very good DX. minikube is a tool I'd recommend you pick up only if you know how to work with Kubernetes. We used docker-compose for a long time but switched to minikube a few months ago, since it closely mirrors our production environment, and minikube works better (imho) when you have quite a few services talking to each other.
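A sketch of that alias approach for a Rails project, assuming the official ruby image (tag, paths, and port are placeholders to adjust):
# in your ~/.bashrc or ~/.zshrc
alias dev='docker run --rm -ti -v "$(pwd)":/src -w /src -p 3000:3000 ruby:3.2'

# then, from any project directory:
dev bundle install
dev rake test
dev rails server -b 0.0.0.0
Note that with --rm anything installed ad hoc inside the container (gems, apt packages) disappears when the command finishes, so in practice you would either bake the gems into your own image or mount a volume for the gem directory.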

What programs can be installed in a Docker container

I am a Windows user.
I have looked at the official Docker tutorial, "Get Started". The example focuses on a Python app. I don't know Python, and I guess a Docker container can have many programs installed as an environment, not just Python.
Is Docker good for testing a program I download from the internet in an isolated environment (like a sandbox in a firewall or antivirus)?
How, for example, can I make a container whose environment contains installed programs like Visual Studio, VLC player, Office, etc.?
Thanks,
Abe
Yes; you can have an isolated environment with Docker. You can set your desired configuration, download from the internet, install, and do whatever you would do in a virtual machine.
Yes, you can. What your container contains depends on the base image you create it FROM and the packages you install inside it.
Tips
You can build your container from a bare OS image (e.g. ubuntu), configure the OS, and download/install/configure/run whatever you want.
You can create a base image which derives FROM a suitable OS image, then install any basic application (e.g. Firefox) that you may use in a lot of containers. Then you should push it to a registry (e.g. Docker Hub or the GitHub Container Registry). After that, you can use it as the base image for other containers, so your new containers have those applications installed by default; there is no need to install them again. This reduces complexity and repetition in your Dockerfiles.
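A sketch of the second tip, using hypothetical image names:
# Dockerfile for the shared base image
FROM ubuntu:20.04
# avoid interactive prompts during apt installs
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y firefox curl git && rm -rf /var/lib/apt/lists/*
# build and publish it once:
#   docker build -t yourname/dev-base .
#   docker push yourname/dev-base

# Dockerfile for a project image that reuses the base
FROM yourname/dev-base
RUN apt-get update && apt-get install -y vlc && rm -rf /var/lib/apt/lists/*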

Deploy docker image as standalone executable

Are there any tools to install/deploy a Docker image as a standalone/portable installation?
So that you don't have to install Docker manually beforehand: just one installation, and it will run and deploy your Docker image, and perhaps autostart it on boot as well.
Mainly interested in Windows & OSX, but Linux would be nice too.
You can get a standalone Docker image automatically, with preconfigured scaling options, using the already packaged Docker engine. The details of this solution and its installation are described in the instructions.
I don't think that this is even possible. Docker has so many dependencies.
(Linux &/ OSX)
The much easier way would be a bash script which starts the installation and afterwards runs the container. It shouldn't be that time consuming.
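A rough sketch of such a script for Linux (the image name is a placeholder; on Windows and OSX, Docker itself would still have to be installed through Docker's own installers):
#!/usr/bin/env bash
set -euo pipefail

# install Docker if it is not present yet (uses Docker's convenience script, Linux only)
if ! command -v docker >/dev/null 2>&1; then
  curl -fsSL https://get.docker.com | sh
fi

# pull the image and start it, restarting automatically after a reboot
docker pull yourname/yourapp:latest
docker run -d --name yourapp --restart unless-stopped yourname/yourapp:latest
The --restart policy covers the autostart-on-boot requirement, as long as the Docker daemon itself is configured to start at boot.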
