I am working on creating a baseline developer setup that new developers can 'plug and play'. What would be the best option: VM, containers, or something else? - docker

I am trying to find the best way to achieve the following scenario:
I am currently working on a complex enterprise web application that consists of:
DB
BPM Engine
SOA Engine
Reporting Engine
Web Application Server
IDE
The application is currently running in non-prod and prod environments, but each environment is independent (no infrastructure as code), and deployments go from dev -> ... -> prod.
When a new developer comes in, they can't run the system on their local machine as it involves too many components (I will come back to this later). So they do development on their local machine and, to test, they need to publish and deploy to dev. Test, rinse and repeat.
I am currently working on reverse engineering the whole thing so I can get it working on my local machine, provided that I can install and run all the components. I am nearly there after fiddling with a lot of configuration, settings, etc.
I would like others to be able to use this work, so they can also run the project on their local machines. In fact, since we will be migrating soon, I would like to package the whole thing in a way that lets me deploy it anywhere (the app already working and configured) and parametrise it somehow depending on whether it is DEV, SYS, UAT, or PROD. According to my understanding, this is what a Docker image would do for you, correct? You do all the work and then you create an image out of it? Then you can have this image running in a container and, that way, other people can 'reuse' your work?
Is this the correct way of doing it? Any hints/comments would be appreciated.
Apologies for my writing.
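To illustrate the packaging idea described above: a minimal docker-compose sketch (hypothetical service names, not the actual stack) in which the images stay the same and only an env file changes per environment:

    # docker-compose.yml -- illustrative services only
    # Pick the environment at start-up: docker compose --env-file .env.uat up -d
    services:
      db:
        image: postgres:15                 # assumption: stands in for whatever DB engine the stack uses
        environment:
          POSTGRES_PASSWORD: ${DB_PASSWORD}
        volumes:
          - db-data:/var/lib/postgresql/data
      app-server:
        build: ./app-server                # image built once from a Dockerfile, reused everywhere
        environment:
          APP_ENV: ${APP_ENV}              # DEV, SYS, UAT or PROD
          DB_HOST: db
        ports:
          - "8080:8080"
    volumes:
      db-data:

The point of the sketch is that the images themselves are environment-agnostic; the DEV/SYS/UAT/PROD differences live in .env.dev, .env.uat, and so on, so 'reusing your work' means pulling the same images and supplying a different env file.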

Related

Example of a web development local and production environment setup and workflow

I am working on moving our existing websites from shared hosting to a VPS, and then they will be redeveloped and improved using Laravel. My background is not software development; I do, however, have a decent understanding of web development (enough to make a blog, a CMS, etc.), BUT I have never worked in a web dev team, so I don't know how things should be done "properly".
Locally I have always used XAMPP, and remotely it has always been as basic as publishing files via FileZilla.
Now I have been required to do:
Version control - the changes to the website should be reviewed by a second (non-technical) person before going live
Develop a system based on Laravel
What I am struggling to understand is how Ubuntu Server, Git, Docker, Kubernetes, NGINX, etc. all work together. Basically, I don't know what "the big picture" looks like or what a decent workflow should look like.
So far I have manually installed all the necessary software to run Laravel on the VPS (the LAMP stack), but soon after I started to run into problems (libraries that are activated locally are not activated remotely). It has also become clear that software updates and differences between my local environment and the remote (production) environment will make the issues worse over time.
Can someone explain, in VERY general terms, how things should fit together so that my setup is resilient, robust, and scalable? For example:
Install docker on the server and on your computer
Download such and such image
Connect Git in such and such way
Enable unattended-upgrades
The more I read the more I get confused.
What I would like is a simple guide/idea on how things should be done properly.
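As one hedged sketch of how those pieces commonly fit together (repository names, hosts, and paths below are made up, and this is only one reasonable workflow among several):

    # 1. Locally: run the site inside containers so your laptop matches the server
    git clone git@github.com:example/site.git && cd site
    docker compose up -d               # e.g. PHP-FPM + NGINX + MySQL containers defined in docker-compose.yml

    # 2. Version control and review: changes go through a branch and a pull request
    git checkout -b feature/homepage
    git push origin feature/homepage   # the reviewer approves the pull request before it is merged

    # 3. On the Ubuntu VPS: pull the approved code and start the same containers
    ssh deploy@your-vps.example.com
    cd /srv/site && git pull origin main
    docker compose up -d --build       # same images as locally, so "works on my machine" carries over

Kubernetes only really enters the picture when you need to run many containers across many servers; for a single VPS, Git plus Docker Compose plus NGINX already gives you the repeatable setup you are describing.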

Setting up a Docker Compose file that will set up a Rails dev environment for local development

I'm new to Docker. I think I'm getting it down for server setup, however I want to use Docker so that non-engineer staff in my company can more easily set up a Rails dev environment: the database, installing libraries on the Ruby and JavaScript end, that kind of thing.
I can do it using Docker, however it requires getting into the Docker engine and tweaking databases and whatnot, which I feel is asking a lot of a non-engineer and doesn't lessen their burden compared to just setting things up natively.
I've looked around at guides for setting up Rails environments, but they all tend to require a lot of work to get everything up and running. I would like an easier way: just fire up a terminal, go to the project folder, smack in 1 or 2 commands, and have it all done. Is there any guide or example setup for something like this?
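Something along these lines is a common starting point; this is a sketch rather than a drop-in file, and the image tags, service names, and bin/rails commands assume a fairly standard Rails app:

    # docker-compose.yml -- minimal Rails + Postgres sketch (illustrative names)
    services:
      db:
        image: postgres:15
        environment:
          POSTGRES_PASSWORD: password
        volumes:
          - pgdata:/var/lib/postgresql/data
      web:
        build: .                          # Dockerfile handles bundle install / yarn install
        command: bin/rails server -b 0.0.0.0
        volumes:
          - .:/app                        # mount the source so edits show up without rebuilding
        ports:
          - "3000:3000"
        depends_on:
          - db
    volumes:
      pgdata:

With something like this checked into the repo, the non-engineer workflow really is one or two commands: docker compose up, plus docker compose run web bin/rails db:setup on the first run (assuming the standard Rails database tasks).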

Apache ServiceMix Deployment Approaches

Folks,
We have an enterprise application which uses Apache ServiceMix for deployment. The application consists of various services, each created as a separate Maven project (bundle). During development, we build each service separately and, in order to deploy it, we put it in the deploy folder. Also, we have to uninstall the bundle from the container (say, Karaf) and then install it again from the console to bring the new changes into effect. This is fine during the development phase.
Now we want to deploy the code to a UAT environment (Amazon EC2) for the client to do the testing. We are confused about how to deploy the bundles to the remote environment. Is there a standard approach for CI using Jenkins (or some other tool) to automate the build and deploy process, so that someone who has no knowledge about the bundles (SMX) can deploy the code? We are using GitHub for source code management.
We have searched a lot in this regard and couldn't find any resources which provide some leads/hints on this.
Any help/tips is highly appreciated. If you need more info, I can give more details.
~Ragesh
We have a very similar setup: we use Jenkins to build, then let the sysadmin copy the bundles to one server and rsync them to the rest of the servers.
Remember: always deploy the dependent bundles first, and then the remaining ones.
Because of this dependency ordering, we haven't gone as far as automating this process.
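For reference, the copy step described above looks roughly like this when written down (hosts, paths, and bundle names are placeholders); the same commands could run as a Jenkins post-build shell step:

    # Copy bundles to the first server's Karaf deploy folder, in dependency order
    KARAF_DEPLOY=/opt/servicemix/deploy
    PRIMARY=smx-uat-1.example.com

    for bundle in common-api-1.0.jar persistence-1.0.jar orders-service-1.0.jar; do
      scp "target/${bundle}" "deploy@${PRIMARY}:${KARAF_DEPLOY}/"
    done

    # ...then fan the deploy folder out from that server to the remaining ones
    ssh "deploy@${PRIMARY}" "rsync -av ${KARAF_DEPLOY}/ deploy@smx-uat-2.example.com:${KARAF_DEPLOY}/"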

Use Docker rather than native/homebrew on Mac?

I currently have a LAMP stack installed on my Mac running through Homebrew which, to be honest, hardly ever gets used.
Lately I have been working a lot with AngularJS and service-based apps, so I generally run the sites through a gulp/Node.js-based web server.
I am totally frontend-oriented, so very rarely do I play with backend-related technologies other than the odd Drupal site and MySQL.
I am interested in learning more Node.js, perhaps even some Ruby, purely to understand programming more - not really for it to become my new job description.
So, reading up on Node.js a bit last night, I read a lot about Docker, and installed the toolkit and GUI this morning. It looks pretty neat!
My question is: Would it work better for me to just run everything I need through Docker? For example, I can just install the mysql container, and turn it on when I need a db, and just spin up a drupal instance when I need one and connect it to my db instance?
I understand that running Docker on Mac is slower as it doesn't have the native Linux kernel and runs through a VM - but considering my needs from it, this should be okay?
I love the idea of just deploying containers, so will probably want to install Docker on my hosting environment too (VM in the cloud).
Follow-up question: 90% of the sites I work on are AngularJS-based frontends that speak to APIs that our backend guys build separately. Would it be overkill to have a Docker container for each of those sites, would it be better to run them all in one, or should I just bypass Docker entirely for those (as I mentioned, I normally just load them up from within my gulp webserver)?
Thanks a lot. I realise this is a n00b asking questions about big technology, but I'm trying to wrap my head around it and hopefully grow a bit in the process.
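To make the "turn on a db when I need one" idea from the question concrete, this is roughly what it looks like with the official mysql and drupal images (container names and passwords are placeholders):

    # Start a throwaway MySQL instance only when you need a database
    docker run -d --name dev-mysql \
      -e MYSQL_ROOT_PASSWORD=secret -e MYSQL_DATABASE=drupal \
      mysql:8

    # Spin up a Drupal instance and point it at that database
    docker run -d --name dev-drupal --link dev-mysql:mysql -p 8080:80 drupal

    # When you're done for the day
    docker stop dev-mysql dev-drupal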
The interest in deploying Docker containers is reproducibility.
You can easily reproduce:
either a complex development environment requiring the installation of numerous libraries (which you don't want to install directly on, and pollute, your host)
or an execution environment for a given tool to run (like a web server)
If you are not likely to repeat a setup (for dev or exec), a Docker container would bring little value.
But if you want to keep track of the exact specification of an environment (through its Dockerfile) and will deploy it not just on your workstation but in other places as well, then Docker is certainly a good option to consider.
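As an illustration of that "exact specification" point, here is a minimal, hypothetical Dockerfile for the kind of Node.js-based tooling mentioned in the question (image tag and commands are assumptions, not a recommendation):

    # The whole environment spec lives in one small, reviewable file
    FROM node:18
    WORKDIR /app
    COPY package.json package-lock.json ./
    RUN npm ci              # dependencies are installed in the image, not on your Mac
    COPY . .
    CMD ["npm", "start"]

Anyone who builds this image gets the same Node version and the same dependencies, which is exactly the reproducibility being described.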

Ruby development environment (OS X vs. Ubuntu)

I'm a developer who uses RoR, CoffeeScript, Sass, Passenger, and Apache. We use EC2 for our deployment and we have MacBook Airs for development. While the Rails community is very Mac-friendly, because of the deployment stack differences between dev and prod, I'm developing in VirtualBox + Ubuntu while my peers are developing on OS X natively.
Developing natively on OS X adds more problems, as we have more dependencies in the stack (Solr, Beanstalk, MongoDB, and more, which all work well on Ubuntu).
I'm looking for suggestions on how Rails developers using Mac and Amazon EC2 can set up their dev and prod environments.
I would also like feedback on the use of Vagrant for distributing development environments for this use case.
A common practice is to replicate your stack as a "staging" environment. With EC2, you can just create AMIs of your existing machines and duplicate them, turning them on only to test deploys, and run your tests to make sure everything is working properly before deploying to production. Or you may wish to leave the staging environment on permanently so developers can quickly deploy updates or patches to test as need be.
Doing it this way ensures that you have an exact replica of your production system to test against before rolling out, thereby eliminating any (catastrophic) issues from sneaking out into production.
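Concretely, that replication step can be scripted; with the AWS CLI it is roughly the following (instance and image IDs are placeholders):

    # Snapshot an existing production machine into an AMI
    aws ec2 create-image --instance-id i-0abc12345def67890 --name "staging-replica"

    # Launch a staging instance from that AMI only when you need to test a deploy
    aws ec2 run-instances --image-id ami-0123456789abcdef0 --instance-type t3.medium --count 1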
Our team has been developing on Macs and deploying to Ubuntu on EC2 for three years now with very few issues. Several things have helped make this a smooth process:
We can run the entire app stack** on a Mac. Between MacPorts, Homebrew, and building from source when necessary, we have managed to get every piece of technology that we run in prod working on our dev boxes. The way the pieces are configured and fit together is different locally (in prod, for example, we auto-discover our memcached instance, whereas locally it's hard-coded), but every integration can be tested on Macs first before going to prod.
Our continuous build system is on the same setup as our prod boxes. This means if you check in some code that depends on some piece of local magic it's discovered quickly.
We run a soak (some people call this staging or integ) stack that is configured identically to production. This sometimes causes some development overhead but has so many benefits that it's well worth it. All code goes through this stack before being pushed to prod.
This setup has worked well enough that over time we've allowed more parts of the setup to drift apart. We used to run passenger locally (like we do in prod) but now use Pow. We regularly experiment with new ruby versions in development for some time before upgrading the rest of the stack.
I've had to develop using a virtualized environment for other projects (OS X + CentOS in VirtualBox) and definitely found it more painful than all-native. For one, it felt like managing two machines instead of one. Everything also felt slow.
If there's a piece of the stack that is painful to run on the Mac, I would definitely prefer to take the hit of either a) spending the time of getting it working locally or b) abstracting that piece away, rather than pay the tax of dealing with a virtual environment.
** I'm only including the Rails app and direct dependencies in this discussion. For example, we use puppet to configure our EC2 fleet, but don't run it on our dev boxes.

Resources