Steps to run a test framework in Docker and Jenkins

Background:
I am a newbie to docker.
I have two automation frameworks on my local PC - one for mobile and the other for a web application. I have integrated both test frameworks with Jenkins.
Both frameworks declare their open-source JAR dependencies in a Maven pom.xml.
Now I want my tests to run in a Docker container when I trigger the Jenkins job.
Can anyone please give me steps to:
configure Docker in this complete integrated framework,
push my dependencies into Docker,
integrate Jenkins and Docker,
and run the web and mobile app tests in Docker when the Jenkins job is triggered.

I'm not a Jenkins professional, but from my experience, there are many possible setups here:
Assumptions:
By "Automation Framework", I understand that there is some java module (built by maven, I believe for gradle it will be pretty much the same) that has some tests that in turn call various APIs that should exist "remotely". It can be HTTP calls, working with selenium servers and so forth.
Currently, your Jenkins job looks like this (it doesn't really matter whether its an "old-school" job "step-by-step" definition or groovy script (pipelines):
Checkout from GIT
run mvn test
publish test results
If so, you need to prepare a Docker image that will run your test suite (preferably with Maven) to take advantage of Surefire reports.
So you'll need to build this Docker image once (see the docker build command) and make it available in a private registry or on Docker Hub, depending on what your organization prefers. Technically, for this Docker image you can take a Java image as the base image, install Maven (download, unzip and configure it) and then have the container issue the git pull/clone. You might want to pass credentials as environment variables to the docker process itself (see the '-e' flag).
The main point here is that Maven inside the Docker image will run the build, so it will resolve the dependencies automatically (you might want to configure custom repositories in Maven's settings.xml if you have them). This effectively answers the second question.
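As a minimal sketch, assuming a scripted pipeline and placeholder registry/image names, the one-off build-and-publish of such a test-runner image could look like this:
node {
    // Check out the repository that holds the Dockerfile for the test-runner image
    checkout scm
    docker.withRegistry('https://registry.example.com', 'registry-credentials-id') {
        // Build the image once and make it available to later test jobs
        def testImage = docker.build('qa/test-runner:latest')
        testImage.push()
    }
}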
One subtle point is the results, which should somehow be shown in Jenkins:
You might want to share the surefire-reports folder as a volume with the Jenkins "host machine" so that the Jenkins plugins that display test results can read them. The same idea applies if you're using something like Allure reports, Spock reports and so forth.
Now, once the image is ready, the integration with Jenkins might be as simple as running a docker run command and waiting until it's done. So the Jenkins job will look like:
docker run pre-defined image -e <credentials for git>
show reports
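As a sketch, such a job could be a scripted pipeline along these lines, where the image name, credentials id and the report path inside the container are assumptions:
node {
    // Pass the Git credentials into the container and share the surefire-reports folder with the host
    withCredentials([usernamePassword(credentialsId: 'git-credentials-id',
                                      usernameVariable: 'GIT_USER',
                                      passwordVariable: 'GIT_PASS')]) {
        sh 'docker run --rm -e GIT_USER -e GIT_PASS ' +
           '-v "$WORKSPACE/reports:/tests/target/surefire-reports" ' +
           'registry.example.com/qa/test-runner:latest'
    }
    junit 'reports/*.xml'   // show the mounted Surefire results in Jenkins
}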
This is one example of possible integration.
One slightly different option is to run docker build as part of the job itself. This might be beneficial if the image has to differ significantly from build to build, but it will make each build slower.

The following approach can be used to achieve your goal:
Create a Dockerfile with all your setup as well as your dependencies.
Install the Docker plugin on Jenkins to integrate Docker support.
Use a Jenkinsfile to pull the Docker image (or build it from the Dockerfile) and run the tests inside Docker.
Below is sample code, just for reference:
node {
    checkout scm
    docker.withRegistry('https://registry.example.com', 'credentials-id') {
        // Build the image from the Dockerfile in the workspace
        def customImage = docker.build('my-image')
        // Run the tests inside a container created from that image
        customImage.inside {
            sh 'mvn test'   // run your test command here
        }
    }
}

Related

How can I create a temporary instance of a docker image and execute a command on the instance within an Azure Pipeline?

I have multiple build and deploy pipelines for my application (User Interface, Internal APIs, External APIs, etc...). I have another build pipeline for my automated tests (which use Node JS, Nightwatch-API, Cucumber, etc..) that builds a Docker image and pushes it to the Container registry.
I want to be able to pull the testing image into my deployment pipelines and execute the appropriate test script command (i.e. npm run test:InternalAPIs). My test scripts will publish the results to a separate system. I am trying to find the best way to execute the automated testing from within the deployment pipeline.
This seems like it should be an easy task within the pipeline build, but I just cannot find the task that does what I need. Any assistance would be greatly appreciated.
I'd probably write a bash script for it, and then run the script with the Bash@3 task.
Alternatively, you could make use of built-in tasks such as Docker@2 and npm@1.
Refer to Microsoft's documentation for more details.
Edit: You can create a temporary instance of the docker image with the docker run command.
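For example, from a script step you could run something along these lines (the image name is a placeholder; the test command comes from the question):
docker run --rm yourregistry.azurecr.io/automated-tests:latest npm run test:InternalAPIs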

CICD Jenkins with Docker Container, Kubernetes & Gitlab

I have a workflow where
1) Team A & Team B would push changes in an app to a private gitlab (running in docker container) on Server A.
2) Their app should contain Dockerfile or docker-compose.yml
3) Gitlab should trigger jenkins build (jenkins runs in docker container which is also on Server A) and executes the usual build things like test
4) Jenkins should build a new docker image and deploy it.
Question:
If Team A needs packages like Maven and npm for web applications but Team B needs other packages like C++ etc., how do I solve this issue?
Because I don't think it is right for my Jenkins container to have all these packages (mvn, npm, yarn, C++ etc.) installed just to execute the Jenkins build.
I was thinking that Team A should get a container with packages it needs installed. Similarly for Team B.
I want to make use of Docker, Kubernetes, Jenkins and Gitlab. (As much container technology as possible)
Please advise me on the workflow. Thank you
I would like to share a developer's perspective, which differs from the "operations-centric" state of mind presented in the question.
For developers, Jenkins is also a tool that can trigger the build of the application. Yes, of course, the built artifact can be a Docker image as well, but this is not what developers are really concerned about. You've referred to this as "the usual build things like test", but developers have entire ecosystems around this "usual build".
For example, mvn, which you've mentioned, has great dependency resolution capabilities and can resolve the dependencies on its own. Roughly the same holds for other build tools; I'll stick with Maven for this answer.
So you don't need to maintain the dependencies yourself, but as a Jenkins maintainer you should provide the technical ability to actually build the product: running Maven, which in turn resolves/downloads all the dependencies, runs the tests, produces test results, and can even create a Docker image or deploy that image to some image repository if you wish ;)
So developers, who maintain their own build scripts (declarative ones as in the case of Maven, or something like makefiles in the case of C++), should be able to run their own tools.
Now, this picture doesn't contradict containerization:
The Jenkins image can contain maven/make/npm - really just a small number of tools needed to run the build. The actual build scripts can be part of the application's source code base (maintained in Git).
So when Jenkins gets the notification about the build, it should check out the source code, run some script (like mvn package), show the test results and then, as a separate step or from Maven itself, create an image of your application and upload it to some repository or supply it to the Kubernetes cluster, depending on your actual DevOps needs.
Note that during mvn package Maven will download all the dependencies (third-party packages) into the Jenkins workspace and compile everything with the Java compiler, which you obviously also need to make available on the Jenkins machine.
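A rough scripted-pipeline sketch of that flow, where the image name, registry URL and credentials id are placeholders:
node {
    checkout scm                               // the application sources, including the team's own build scripts
    sh 'mvn -B package'                        // Maven resolves the dependencies, compiles and runs the tests
    junit '**/target/surefire-reports/*.xml'   // surface the test results in Jenkins
    docker.withRegistry('https://registry.example.com', 'registry-credentials-id') {
        docker.build('team-a/app:latest').push()   // package the built artifact into an image and publish it
    }
}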
Whoa, that's a big one and there are many ways to solve this challenge.
Here are 2 approaches which you can apply:
Solve it by using different types of Jenkins Slaves
In the long run you should consider running Jenkins workloads on slaves.
This is not only desirable for your use case but also scales much better under higher workloads; in the worst case, dense workloads can kill your Jenkins master.
See https://wiki.jenkins.io/display/JENKINS/Distributed+builds for reference.
When defining slaves in Jenkins (e.g. with the ec2 plugin for AWS integration) you can use different slaves for different workloads. That means you prepare a slave (or slave image, in AWS this would be an AMI) for each specific purpose you've got. One for Java, one for node, you name it.
These defined slaves can then be used within your jobs by checking the "Restrict where this project can be run" option and entering the label of your slave.
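In a pipeline job the same restriction is expressed through the node label; a small sketch with made-up labels:
// 'maven-slave' and 'cpp-slave' are hypothetical labels assigned to slaves prepared with the respective toolchains
node('maven-slave') {
    checkout scm
    sh 'mvn -B verify'   // Team A's Java/Maven build
}
node('cpp-slave') {
    checkout scm
    sh 'make test'       // Team B's C++ build
}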
Solve it by doing everything within the Docker Environment
The simplest solution in your case would be to just use the Docker infrastructure to build Docker images of any kind, which you then push to your private Docker registry (all big cloud providers have such container registry services) and download at deploy time. This would save you the pain of installing new technology stacks every time they change.
I don't really know how familiar you are with Docker or other containerization technologies, so I'll roughly outline the solution for you.
Create a Dockerfile with the desired base image as the starting point. Just google for them or look them up on Docker Hub. Here's a sample for Node.js:
# Base image - pick the Node version your application needs
FROM node:lts
MAINTAINER devops@yourcompany.biz
ARG NODE_ENV=development
ENV NODE_ENV=${NODE_ENV}
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --silent
# I recommend using a .dockerignore file to avoid copying unnecessary stuff into your image
COPY . .
CMD ["npm", "start"]
Adapt the image and create a Jenkinsfile for your CI/CD pipeline. There you should build and push your docker image like this:
stage('build docker image') {
    ...
    steps {
        script {
            docker.build('your/registry/your-image', '--build-arg NODE_ENV=production .')
            docker.withRegistry('https://yourregistryurl.com', 'credentials token to access registry') {
                docker.image('your/registry/your-image').push(env.BRANCH_NAME + '-latest')
            }
        }
    }
}
Do your deployment (in my case it's done via Ansible) by pulling and running the previously pushed image in your deployment scripts.
If you define your questions in a little more detail, you'll get better answers.

Can a Docker image be configured or have data pushed to it at runtime from Jenkins?

I am planning to pull the SonarQube Docker image, push my code into the Docker image, run mvn sonar on that code and generate a report. I read through many Docker-Jenkins integration documents and white papers but didn't come across this scenario. Does that mean it's not possible? :(
One scenario that comes to my mind is that you keep a persistent SonarQube environment:
Create your SonarQube environment and keep it (it can be dockerized; you just need to persist some paths in your database container and SonarQube container).
Set up email notifications, rules and so on.
In your CI, run mvn with the SonarQube goal (see the sketch after this list).
You can also use the various plugins available; you can find an example approach here.
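A minimal sketch of that CI step as a Jenkins scripted pipeline, assuming the sonar-maven-plugin and placeholder values for the server URL and token:
node {
    checkout scm
    // Run the build and send the analysis to the long-lived SonarQube instance.
    // The host URL and SONAR_TOKEN are placeholders for your own server and credentials.
    sh 'mvn clean verify sonar:sonar -Dsonar.host.url=http://your-sonarqube-host:9000 -Dsonar.login=$SONAR_TOKEN'
}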

Run tests inside Docker container with Jenkins

We want to try setting up CI/CD with Jenkins for our project. The project itself has Elasticsearch and PostgreSQL as runtime dependencies and WebDriver for acceptance testing.
In dev environment, everything is set up within one docker-compose.yml file and we have acceptance.sh script to run acceptance tests.
After digging through the documentation I found that it's potentially possible to build the CI with the following steps:
dockerize project
pull project from git repo
somehow pull docker-compose.yml and the project Dockerfile - either:
put them in the project repo
put them in a separate repo (this is how it's done now)
put them somewhere on a server and just copy them over
execute docker-compose up
the project's Dockerfile will have an ONBUILD section to run tests. Unit tests are run through mix test and acceptance tests through scripts/acceptance.sh. It would be cool to run them in parallel.
shut down docker-compose and clean up the containers
Because this is my first experience with Jenkins, a series of questions arises:
Is this a viable strategy?
How to connect tests output with Jenkins?
How to run and shut down docker-compose?
Do we need/want to write a pipeline for that? Will we need/want pipeline when we will get to the CD on the next stage?
Thanks
Is this a viable strategy?
Yes it is. I think it would be better to include the docker-compose.yml and Dockerfile in the project repo. That way any changes are tied to the version of the code that uses them. If they're in an external repo it becomes a lot harder to change (unless you pin the git SHA somehow, like using a submodule).
project's Dockerfile will have ONBUILD section to run tests
I would avoid this. Just set a different command to run the tests in a container, not at build time.
How to connect tests output with Jenkins?
Jenkins just uses the exit status from the build steps, so as long as the test script exits with a non-zero code on failure and a zero code on success, that's all you need. Test output printed to stdout/stderr will be visible in the Jenkins console.
How to run and shut down docker-compose?
I would recommend this to run Compose:
docker-compose pull # if you use images from the hub, pull the latest version
docker-compose up --build -d
In a post-build step to shutdown:
docker-compose down --volumes
Do we need/want to write a pipeline for that?
No, I think just a single job is fine. Get it working with a simple setup first, and then you can figure out what you need to split into different jobs.
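If you do later move to a pipeline, a rough sketch of the same steps as a scripted pipeline (the acceptance script path comes from the question; everything else is an assumption) could be:
node {
    checkout scm
    try {
        sh 'docker-compose pull'            // refresh any images that come from a registry
        sh 'docker-compose up --build -d'   // rebuild the project image and start the runtime dependencies
        sh './scripts/acceptance.sh'        // run the acceptance tests against the running stack
    } finally {
        sh 'docker-compose down --volumes'  // always clean up containers and volumes, even on failure
    }
}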

Jenkins - Docker integration - Use Jenkins to build Docker images and push to the registry

I am currently working on integrating Docker with Jenkins and I am currently trying to figure out the following pipeline:
Whenever a Dockerfile is updated in Git, trigger a Jenkins job to do the following:
Build the Docker image
Test, Verify the Docker image
Version the image - Prod, testing etc.
Push the image to the registry
If the image is not built, have a proper mechanism to get the logs
From my research, I found that there are two different Jenkins plugins for Docker integration - the Docker Build Step plugin and the Docker Build and Publish plugin. As far as I could tell, there is no plugin or workflow to test the image before pushing it to the repository. Since we are doing this from scratch, I would like to know the best tried-and-tested workflow.
Any help appreciated.
We applied the same mindset as "git flow" to the creation of Docker images. In our solution, there was no need to test the image itself. We solved that by splitting the build into a "Source-Build" producing artifacts and a downstream job, e.g. a "Runtime-Build", that only packages the artifacts into the runtime image and pushes it to the registry. At that point the whole stack is delivered to a "Release-Stage" for automatic testing.
To test the image there's a tool called Anchore.
Then, if you want to run other types of tests before building the Docker image, you can, for example, integrate SonarQube with Jenkins and do a static analysis of the source code. Full example at: https://pillsfromtheweb.blogspot.com/2020/05/integrate-sonarqube-static-analysis-in.html
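Putting those pieces together, a rough sketch of a build-test-version-push job as a scripted pipeline; the registry URL, image name, credentials id and the verification command are placeholders, not a tried-and-tested recipe:
node {
    checkout scm
    // Build and tag the image with the Jenkins build number so every version is traceable
    def image = docker.build("registry.example.com/your-app:${env.BUILD_NUMBER}")
    // Basic verification step: run your own checks against the freshly built image
    image.inside {
        sh 'echo "run smoke tests or other image checks here"'   // placeholder, replace with real test commands
    }
    // Push only after the checks have passed
    docker.withRegistry('https://registry.example.com', 'registry-credentials-id') {
        image.push()           // versioned tag taken from the build number
        image.push('latest')   // optional additional moving tag, e.g. for prod/testing
    }
}
If docker build or the verification step fails, the job fails and the full output remains available in the Jenkins console log, which covers the requirement of getting the logs when the image is not built.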
