How can I build and run a .devcontainer folder outside VS Code?

VS Code has this cool feature where you can create dev containers, which leverage Docker and can help your team build software in a containerised and unified way. However, I want to also use the dev containers outside of VS Code. Is there any way to do so?

I have created a script that does this. It takes the existing devcontainer.json and runs the container in Docker. You can find more information here: https://blog.wille-zone.de/post/run-devcontainer-outside-of-visual-studio-code/
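A minimal sketch of that idea (not the script from the blog post) could look like the following; it assumes .devcontainer/devcontainer.json is plain JSON (no comments), references a prebuilt "image", and that jq is installed:

#!/usr/bin/env sh
# Read the image name from devcontainer.json and start it with the workspace mounted in.
# The /workspace path and /bin/sh shell are assumptions; adjust for your image.
IMAGE=$(jq -r '.image' .devcontainer/devcontainer.json)
exec docker run --rm -it -v "$(pwd)":/workspace -w /workspace "$IMAGE" /bin/sh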

Related

Run a gitlab CI pipeline in Docker container

Absolute beginner in DevOps here. I have a GitLab repo that I would like to build and test in the GitLab CI pipeline.
So far, I'm only testing locally on my machine with a specific runner. There's a lot of information out there and I'm starting to get lost with what to use and how to use it.
How would I go about creating a container with the tools that I need (VS compiler, CMake, Git, etc.)?
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
How would I use that container in the yml file in gitlab so that I can build my solution and run my tests?
Any specific documentation links or suggestions are welcomed and appreciated.
How would I go about creating a container with the tools that I need (VS compiler, CMake, Git, etc.)?
You can install those tools before the pipeline script runs. I usually do this in before_script.
If there are large-ish packages that need to be installed on every pipeline run, I'd recommend that you make your own image with all the required build dependencies, push it to the GitLab container registry, and then just use it as your job image.
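For illustration, a hedged .gitlab-ci.yml sketch of both options; the image name and packages are placeholders, and this shows a Linux-flavoured job (a Windows/VS job would install tools differently):

build:
  # Option 1: a job image you built yourself with the heavy dependencies baked in
  image: registry.gitlab.com/mygroup/myproject/build-image:latest
  # Option 2: install small, quick tools right before the job script runs
  before_script:
    - apt-get update && apt-get install -y cmake git
  script:
    - cmake -S . -B build
    - cmake --build build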
My application contains an SDK that only works on Windows, so I'm not sure building on another platform would work at all. How do I select a Windows-based container?
If you're using gitlab.com - Windows runners are currently in beta, but available for use.
SaaS runners on Windows are in beta and shouldn’t be used for production workloads.
During this beta period, the shared runner quota for CI/CD minutes applies for groups and projects in the same manner as Linux runners. This may change when the beta period ends, as discussed in this related issue.
If you're self-hosting - set up your own runner on Windows.
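As a hedged sketch, selecting a Windows runner in .gitlab-ci.yml is done with tags; the tag names below are assumptions and depend on GitLab's current shared-runner tags or on how you registered your own runner:

build-windows:
  tags:
    - shared-windows   # or whatever tags your Windows runner was registered with
    - windows
  script:
    - echo "building on Windows"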
How would I use that container in the yml file in gitlab so that I can build my solution and run my tests?
This really depends on:
the previous parts (whether you're using GitLab.com or self-hosting)
how your application is built
what infrastructure you have access to
What I'm trying to say is that I feel like I can't give you a good answer without quite a bit more information.

How to deploy weblogic application as docker container completely using Dockerfile?

I have a simple REST API in a WebLogic application. I have to deploy the application as a Docker container, but I'm facing a problem in defining the Dockerfile.
Dockerfile
FROM store/oracle/weblogic:12.2.1.4
COPY target/app.war /u01/oracle
Above is my current Dockerfile. With it, I still have to manually deploy the application on the WebLogic server. We would like to automate the application deployment using the Dockerfile, but couldn't find exact examples.
Please advise.
This is a complex task, so it is hard to explain the whole process here.
The high-level steps that you need to execute are the following:
Start a properly configured WebLogic domain in Docker. This task involves the creation of the admin and managed servers, the WL cluster, etc.
Build the application that you want to deploy
Configure the database properly if you have any
Create the WL resources like connection pools, JMS, etc., manually or via a WLST script
Deploy your artifact via the WL web console, with a WLST script, or by copying the file under the autodeploy directory (a Dockerfile sketch for this option follows below)
Be careful because the tasks that you executed manually will be lost if you drop your docker container.
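For the autodeploy option mentioned above, a hedged Dockerfile sketch could look like this. The domain path is an assumption and depends on how and when the domain is created in your base image (with the stock image the domain may only be created at container start), and auto-deployment only works for domains running in development mode:

FROM store/oracle/weblogic:12.2.1.4
# Assumed domain location; adjust to wherever your domain is actually created.
COPY target/app.war /u01/oracle/user_projects/domains/base_domain/autodeploy/app.war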
You can find concrete examples, use cases, automated scripts that you can use and well prepared, ready for use WebLogic Docker images here: https://github.com/zappee/docker-images
If you have a concrete question, rather than a general one like this, then please start a new thread.
Take a look at the GitHub project:
https://github.com/oracle/docker-images/tree/master/OracleWebLogic/dockerfiles

How to start docker containers using shell commands in Jenkins

I'm trying to start two containers (each with a different image) using Jenkins shell commands. I tried installing the Docker extension in Jenkins and/or setting Docker in the global tool configuration. I am also doing all this in a pipeline. After executing docker run... I'm getting a 'docker: not found' error in the Jenkins console output.
I am also having a hard time finding a guide on the internet that describes exactly what I wish to accomplish. If it is of any importance, I'm trying to start a Selenium Grid and a Selenium Chrome node and then use Maven (which is configured and works correctly) to send a test suite to that node.
If you have any experience with something similar to what I wish to accomplish, please share your thoughts on what the best approach is to this situation.
Cheers.
That's because docker images that you probably create within your pipeline cannot also run (become containers) within the pipeline environment, because that environment isn't designed to also host applications.
You need to find a hosting provider for your docker images (e.g. Azure or GCP). Once you set up the hosting part, you need to add a step to your pipeline to upload/push the image to that provider's docker registry or to the free public Docker Hub. Then, finally, add a step to your pipeline to send a command to your hosting to download the image from whichever docker registry you chose, and to launch the image into a container (this last part of download and launch is covered by docker run). Only at that point do you have a running app.
Good luck.
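A hedged sketch of that flow, with placeholder registry and image names:

# In the pipeline: build the image and push it to a registry
docker build -t registry.example.com/myteam/myapp:1.0 .
docker push registry.example.com/myteam/myapp:1.0
# On the host that actually runs the app: pull the image and launch it as a container
docker pull registry.example.com/myteam/myapp:1.0
docker run -d --name myapp registry.example.com/myteam/myapp:1.0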
Somewhat relevant (maybe it'll help you understand how some of those things work):
Command docker build is comparable to the process of producing an installer package such as an MSI.
A Docker image is comparable to an installation package (e.g. an MSI).
Command docker run is comparable to running an installer package with the goal of installing an app. So, using the same analogy, running an MSI installs an app.
A container is comparable to an installed application. Just like an app, a Docker container can run or be in a stopped state. This depends on the environment, which I referred to as "hosting" above.
Just like you can build an MSI package on one machine and run it on other machines, you build Docker images on one machine (the pipeline host, in your case), but you need to host them in environments that support that.

Best practice/way to develop Golang app to be run in Docker container

Basically what the title says... Is there a best practice or an efficient way to develop a Golang app that will be Dockerized? I know you can mount volumes to point to your source code, and it works great for languages like PHP where you don't need to compile your code. But for Go, it seems like it would be a pain to develop alongside Docker since you pretty much only have two options I guess.
First would be to have a Dockerfile that is just onbuild so it starts the go app when a container is run, thus having to build a new image on every change (whether it be small or not). Or, you do mount your source code dir to the container dir, then attach to the container itself and do the manual go build/run yourself as if you would normally.
Those two ways are really the only ways I see it happening, unless you just don't develop your Go app in a Docker container. Just develop it as normal, then use the scratch image method where you pre-build the Go code into a binary and copy that into your container when you are ready to run it. I assume that is probably the way to go, but I wanted to ask more professional people on the subject and maybe get some feedback on the topic.
Not sure it's the best practice, but here is my way.
Makefile is MANDATORY
Use my local machine and my go tools for small iterations
Use a dedicated build container based on golang:{1.X,latest}, mount the code directory into it, and build a release, mainly to ensure that my code will build correctly on the CI. (Tip: here is my standard go build command for a release build: CGO_ENABLED=0 GOGC=off go build -ldflags="-s -w"; see the sketch after this list.)
Test code
Then use a FROM scratch to build a release container (copy the bin + entrypoint)
Push your image to your registry
Steps 3 to 6 are tasks for the CI.
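A hedged sketch of step 3; the mount path and binary name come from the answer below and are placeholders, and it assumes the repository root contains a go.mod:

docker run --rm \
  -v "$(pwd)":/go/src/myrepos/myproject \
  -w /go/src/myrepos/myproject \
  -e CGO_ENABLED=0 -e GOGC=off \
  golang:latest \
  go build -ldflags="-s -w" -o mybin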
Important note: this is changing due to the new multi-stage builds feature: https://docs.docker.com/engine/userguide/eng-image/multistage-build/ - no more separate build vs release containers.
The build container and release container are merged into one multi-stage build, so one Dockerfile, roughly like this:
FROM golang:latest AS build
WORKDIR /go/src/myrepos/myproject
# Copy the sources (and go.mod, if using modules) into the build stage and produce a static binary
COPY . .
RUN CGO_ENABLED=0 go build -o mybin
FROM scratch
COPY --from=build /go/src/myrepos/myproject/mybin /usr/local/bin/mybin
ENTRYPOINT [ "/usr/local/bin/mybin" ]
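With a Dockerfile like that, the whole build runs inside docker build (the image name here is a placeholder):

docker build -t myproject .
docker run --rm myproject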
Lately, I've been using
https://github.com/thockin/go-build-template
As a base for all of my projects. The template comes with a Makefile that will build/test your application in a Docker container.
As far as I understood from your question, you want to have a running container in which to develop a Golang application. The same thing can be done on your host machine as well. But the good thing is that if you can build such an application, it can be considered a cloud Platform-as-a-Service (PaaS).
The basic requirements of the container will be: an Ubuntu image and other packages such as an editor, the Golang compiler, and so on.
I would suggest looking at the Docker development environment.
https://docs.docker.com/opensource/project/set-up-dev-env/
The Docker development environment runs inside a container, and the files are mounted from one of the host directories. The container image is built from an Ubuntu base image, with the packages needed to compile the Docker source code added on top.
I hope you almost got what you are looking for.

DevOps vs Docker

I am wondering how exactly Docker fits into CI/CD.
I understand that with the help of containers, you may focus on code rather than on dependencies/environment. But once you check in your code, you will expect tools like TeamCity, Jenkins or Bamboo to take care of the integration build, integration/unit tests and deployment to target servers (after approvals), where you will expect the same Docker container image to run the built code.
However, in all of the above, Docker is nowhere in the CI/CD cycle, though it comes into play when execution happens on the server. So why do I see articles listing it as one of the things for DevOps?
I could be wrong, as I am not a DevOps guru. Please enlighten me!
Docker is just another tool available to DevOps Engineers, DevOps practitioners, or whatever you want to call them. What Docker does is encapsulate code and code dependencies in a single unit (a container) that can be run anywhere the Docker engine is installed. Why is this useful? For multiple reasons; but in terms of CI/CD it can help Engineers separate configuration from code, decrease the amount of time spent doing dependency management, and it can be used to scale (with the help of some other tools, of course). The list goes on.
For example: If I had a single code repository, in my build script I could pull in environment specific dependencies to create a Container that functionally behaves the same in each environment, as I'm building from the same source repository, but it can contain a set of environment specific certificates and configuration files etc.
Another example: If you have multiple build servers, you can create a bunch of utility Docker containers that can be used in your CI/CD Pipeline to do a certain operation by pulling down a Container to do something during a stage. The only dependency on your build server now becomes Docker Engine. And you can change, add, modify, these utility containers independent of any other operation performed by another utility container.
Having said all of that, there really is a great deal you can do to utilize Docker in your CI/CD pipelines. I think an understanding of what Docker is, and what Docker can do, is more important than a "how to use Docker in your CI/CD" guide. While there are some common patterns out there, it all comes down to the problem(s) you are trying to solve, and certain patterns may not apply to a certain use case.
Docker facilitates the notion of "configuration as code". I can write a Dockerfile that specifies a particular base image that has all the frameworks I need, along with the custom configuration files that are checked into my repository. I can then build that image using the Dockerfile, push it to my docker registry, then tell my target host to pull the latest image, and then run the image. I can do all of this automatically, using target hosts that have nothing but Linux installed on them.
This is a simple scenario that illustrates how Docker can contribute to CI/CD.
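For example, a hedged sketch of such a "configuration as code" Dockerfile; the base image, file names and paths are placeholders, but it shows the idea of pinning a base image that carries the framework and layering the configuration files that live in the repository on top:

FROM nginx:stable
# Configuration checked into the repository, versioned alongside the code
COPY conf/nginx.conf /etc/nginx/nginx.conf
COPY dist/ /usr/share/nginx/html/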
Docker is also useful for building your applications. If you have multiple applications with different dependencies, you can avoid having a lot of dependencies and conflicts on your CI machine by building everything in Docker containers that have the necessary dependencies. If you need to scale in the future, all you need is another machine running your CI tool (like a Jenkins agent) and an installation of Docker.
When using microservices this is very important. One application can depend on an old version of a framework while another needs the new version. With containers, that's not a problem.
Docker is a DevOps Enabler, Not DevOps Itself: Using Docker, developers can support new development, enhancement, and production support tasks easily. Docker containers define the exact versions of software in use, this means we can decouple a developer’s environment from the application that needs to be serviced or enhanced.
Without Pervasive Automation, Docker Won’t Do Much for You : You can’t achieve DevOps with bad code. You must first ensure that the code being delivered is of the highest quality by automating all developer code delivery tasks, such as Unit testing, Integration testing, Automated acceptance testing (AAT), Static code analysis, code review sign offs & pull request workflow, and security analysis.
Leapfrogging to Docker without Virtualization Know-How Won’t Work : Leapfrogging as an IT strategy rarely works. More often than not new technologies bring about abstractions over existing technologies. It is true that such abstractions increase productivity, but they are not an excuse to skip the part where we must understand how a piece of technology works.
Docker is a First-Class Citizen on All Computing Platforms : This is the right time to jump on to the Docker bandwagon. For the first time ever Docker is supported on all major computing platforms in the world. There are two kinds of servers: Linux servers and Windows servers. Native Docker support for Linux existed from Day 1, since then Linux support has been optimized to the point of having access to the pint-sized.
Agile is a Must to Achieve DevOps : DevOps is a must to achieve Agile. The point of Agile is adding and demonstrating value iteratively to all stakeholders without DevOps you likely won’t be able to demonstrate the value you’re adding to stakeholders in a timely manner. So why is Agile also a must to achieve DevOps? It takes a lot of discipline to create a stream of continuous improvement and an Agile framework like Scrum defines fundamental qualities that a team must possess to begin delivering iteratively.
Docker saves your organization's capital and resources by containerizing applications. Containers on a single host are isolated from each other and they share the same OS resources, which frees up RAM, CPU, storage, etc. Docker makes it easy to package our application along with all the required dependencies in an image. For most applications we have readily available base images, and one can create a customized base image as well. We build our own custom image by writing a simple Dockerfile. We can have this image shipped to a central registry, from where we can PULL it to deploy into various environments like QA, STAGE and PROD. All these activities can be automated by CI tools like Jenkins.
In a CI/CD pipeline you can expect Docker to come into the picture when the build is ready. Initially the CI server (Jenkins) will check out the code from SCM into a temporary workspace where the application is built. Once you have the build artifact ready, you can package it as an image with its dependencies. Jenkins does this by executing simple docker build commands.
Docker removes what we all know as the "matrix from hell" problem, making environments independent with its container technology. The open source Docker project changed the game by simplifying container workflows, and this has resulted in a lot of excitement around using containers in all stages of the software delivery lifecycle, from development to production.
It is not just about containers, it involves building Docker images, managing your images and dependencies on any Docker registry, deploying to an orchestration platform, etc. and it all comes under CI/CD process.
DevOps is a culture, methodology or procedure for delivering our development very fast. Docker is one of the tools in our DevOps culture, used to deploy applications as containers (using fewer resources to deploy our applications).
Docker just packages the developer environment to run on other systems, so that developers need not worry about their code working on their own system but not working in production due to differences in environment and operating system.
It just makes the code portable to other environments.
