Where does Jenkins fit in the DevOps pipeline?

I don't know where the Jenkins tool fits in the following DevOps pipeline:
code -> integrate -> test -> release -> deploy -> operate
Maybe it can be used in every step?

Jenkins is a build factory. In other words, its primary use is to run tasks dedicated to building, integrating and delivering applications. It's a typical DevOps tool.
Jenkins can be used to build pipelines (sequences of tasks) or to be called from a pipeline (to execute one of the pipeline's tasks).
The great thing about Jenkins is that it integrates nicely with other devops tools:
SCM: SVN, GitHub, GitLab
Build: Maven, Gradle
Test: Cucumber reports
Quality: SonarQube
Deployment: Octopus Deploy, XL Deploy, Run Deck...
You name it!
However, Jenkins is generally not used to "code" or "operate" applications.
A typical pipeline would be:
Try Pull Request => Build Release Candidate => Deploy RC on Integration => Deploy on Production
This is an oversimplified pipeline, just to give an idea of the tool's scope. A production-grade pipeline should include security checks and integrate nicely with human validation when needed.
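For illustration, a minimal declarative Jenkinsfile for a pipeline of that shape could look like the sketch below; the Maven command, artifact path and deploy scripts are placeholders, not part of the original answer:

    pipeline {
        agent any
        stages {
            stage('Build release candidate') {
                steps {
                    sh 'mvn -B clean verify'                   // build and run the test suite
                    archiveArtifacts artifacts: 'target/*.jar' // keep the release candidate
                }
            }
            stage('Deploy RC on integration') {
                steps {
                    sh './deploy.sh integration'               // hypothetical deploy script
                }
            }
            stage('Deploy on production') {
                steps {
                    // Human validation before anything reaches production
                    input message: 'Promote this release candidate to production?'
                    sh './deploy.sh production'                // hypothetical deploy script
                }
            }
        }
    }

Security checks and more elaborate approval rules would be added as further stages or steps in the same file.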

Jenkins is used for the Build, Test, and Deploy stages of the continuous delivery pipeline.

You can have any number of stages in a pipeline configured with Jenkins.
Example stages:
code -> integrate -> test -> release -> deploy -> operate

Currently, in the business sector, Jenkins is used as follows:
If you are a software developer, you need Jenkins for two reasons:
To build your project and check that it meets all the requirements regarding PMD rules, Checkstyle, FindBugs, etc.
To deploy to a new environment so you can evaluate your own work and see that the changes you made behave exactly as you intended.
If you are a tester or a test automation engineer, you want it for three reasons:
To build your code and check it with FindBugs, PMD and similar quality plugins
To test the software product, whether it is the client's or your own company's
To create dynamic environments in which to test the developers' changes (mostly as regression testing rather than testing each change individually)
If you are on the business side, a project manager or a supervisor, you can do the next two things:
Execute the tests so that you can see for yourself whether the product is working properly
Check the reports that Jenkins gives you after every test execution
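To make the build, static-analysis and reporting points above concrete, here is a hedged sketch of a Jenkins pipeline that runs a Maven build with those quality goals and publishes the test report; the goals and the report path are assumptions based on a standard Maven layout:

    pipeline {
        agent any
        stages {
            stage('Build and analyse') {
                steps {
                    // Compile, run unit tests, and run the static-analysis goals
                    // mentioned above (PMD, Checkstyle, FindBugs).
                    sh 'mvn -B clean verify pmd:pmd checkstyle:checkstyle findbugs:findbugs'
                }
            }
        }
        post {
            always {
                junit 'target/surefire-reports/*.xml'   // test report everyone can check in Jenkins
            }
        }
    }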

Jenkins is an open-source automation server. It used to be a CI server only, but since Jenkins 2.0, Pipeline as Code has made it popular for both CI and CD. It can manage all application lifecycle phases.

Related

CI/CD pipeline and build server

I have seen the following graph representing a Jenkins pipeline:
git push --> Git repo --> Jenkins CI server --> Maven build server --> Test server --> Deliver build artifacts --> Deploy
Assuming that is correct, I am struggling to understand how the different(?) servers above work under the hood, and I need some clarification so that Jenkins procedures are not just a black box.
1. Are the Jenkins CI server, the Maven build server and the Test server in reality one physical server?
2. If the answer to the previous question is yes, are these 3 servers different logical servers?
3. In my understanding and in my case (a Java Spring project), the Maven build server executes mvn install, and since pom.xml contains npm install plus npm run test commands, it is the Maven build server that executes the UI tests and not the Test server. Am I right?
4. Does the Test server execute only the back-end Java acceptance tests?
1. It could be one physical or virtual server.
2. No.
3. To use Maven you just need to install the Maven plugin in Jenkins. Configure which version of Maven you want to use under "Manage Jenkins" -> "Global Tool Configuration" -> add the needed Maven version.
4. It depends on which kind of tests you want to run: you can repeat point 3 for the tool you need, or you can install the needed tools on the Jenkins server manually and add them to the PATH.
In the end, just use the needed tools in your pipeline or any other kind of job in Jenkins.
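As an example of point 3, once a Maven installation has been added under "Global Tool Configuration", a pipeline can refer to it by the name given there; the name 'Maven 3' below is only a placeholder:

    pipeline {
        agent any
        tools {
            maven 'Maven 3'             // must match the name set in Global Tool Configuration
        }
        stages {
            stage('Build') {
                steps {
                    sh 'mvn -B clean install'   // runs with the Maven version selected above
                }
            }
        }
    }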
Without further context, we need to make some points clear:
Jenkins may work on multiple nodes (real machines, virtual machines, or pods).
Jenkins is an orchestrator: it uses different tools in an order given by the pipeline, which is pretty much git > build > test > deliver > deploy.
Testing servers are used to install the software and run a variety of tests, often shared by several teams.
Build servers are used for running the commands that build the software.
With those points clear, here are the answers to your questions:
1. It depends on your infrastructure. Jenkins works with executors that may run on the master or on nodes. If your Jenkins is just one server, then yes, it is one physical server. If you use nodes, it is more likely that one node handles the building and another one the tests. The definitive answer lies within your Jenkins setup and your pipeline.
2. No. Even with a Jenkins master-agent infrastructure, the building and testing occur within Jenkins.
3. It depends on your concept of a test server. If you define it as a server that you use as a target for testing purposes, then the answer is no. If you define a test server as a machine that executes tests, then the answer is yes.
4. It depends on what types of tests you have automated. You can run unit tests, regression tests, smoke tests, etc. For example, you may have some unit tests for your back-end and some Karma tests for your UI. Again, in your case you must check your pipeline and your code to see what kinds of tests you are running.
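As a hedged illustration of points 1 and 2, a pipeline can pin its build and its tests to different agent nodes by label, while everything is still orchestrated by the same Jenkins; the labels 'build-node' and 'test-node' are hypothetical:

    pipeline {
        agent none
        stages {
            stage('Build') {
                agent { label 'build-node' }   // runs on an agent carrying this label
                steps {
                    sh 'mvn -B clean install'
                }
            }
            stage('Test') {
                agent { label 'test-node' }    // a different agent can run the tests
                steps {
                    sh 'mvn -B verify'
                }
            }
        }
    }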

DevOps Continuous Delivery for JavaScript

Can I use Jenkins integration in a DevOps continuous delivery pipeline for JavaScript code builds?
I am trying to build/propose a solution that integrates the Jenkins tool and removes manual code builds and deployments, to cut the manual effort for my team.
Generally in Jenkins, you can use the below:
Build -> Automated Test -> Dev Deploy -> QA Approval -> QA Deploy
The seed job is the one that creates the other Jenkins jobs automatically, but the seed job itself is configured manually. The seed job reads the DSL script, parses it, and creates the appropriate job configurations in Jenkins.
After the seed job runs successfully, we will have a job created for our sample app.
The seed job will create the following set of jobs, which will eventually be part of the pipeline. The seed job will also create a Jenkins pipeline view.
Build: this job includes the configuration for building the project: job triggers, SCM location, JDK version to use, Maven goals, and artifact upload to a repository such as Artifactory.
Test: this job can call the test suites and decide whether or not to call a downstream job.
Dev Deploy: a simple job with a trigger to the promotion job if the deployment was successful.
This job can call a script to perform the deployment, or use tools like Bamboo or UrbanCode.
Usually a dev deploy doesn't need promotion, but we can add that step if required.
QA Promotion: this job sends an email notification to the person/group responsible for approval. The email contains a link for promotion.
The Promotion Email link can look like this: http://localhost:8080/XXX/XXXXX/XXX
The same can be done for UAT and Prod:
We can chain multiple promotion and deploy jobs to cover additional environments, e.g. UAT Promotion -> UAT Deploy -> PreProd Promotion -> PreProd Deploy -> Prod Promotion -> Prod Deploy.
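A hedged sketch of what the seed job's DSL script might look like for the first jobs in such a chain; the job names, repository URL and Maven goals are made up for illustration:

    // seed.groovy - read by the manually configured seed job (Job DSL plugin)
    job('sample-app-build') {
        scm {
            git('https://github.com/example/sample-app.git')   // hypothetical repository
        }
        triggers {
            scm('H/5 * * * *')                                 // poll SCM roughly every 5 minutes
        }
        steps {
            maven('clean install')                             // build and prepare artifacts
        }
        publishers {
            downstream('sample-app-test', 'SUCCESS')           // trigger the test job on success
        }
    }

    job('sample-app-test') {
        steps {
            maven('verify')                                    // run the test suites
        }
        publishers {
            downstream('sample-app-dev-deploy', 'SUCCESS')     // continue down the pipeline
        }
    }

    // The pipeline view the seed job creates for the chained jobs
    buildPipelineView('sample-app-pipeline') {
        selectedJob('sample-app-build')
        displayedBuilds(5)
    }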
All of the processes mentioned above can be done with Jenkins.
So yes, to answer your question: you can definitely use Jenkins integration in a DevOps pipeline.
If you are building your solution in the cloud or on your own domain servers, you might have to integrate Jenkins into that same environment.

How does the Build Pipeline Plugin relate to the Jenkins 2 Pipeline Plugin?

Currently, I use the Build Pipeline Plugin to orchestrate the delivery of my code through the different environments:
Build the code and execute unit tests
Manually deploy to the development environment
Automatically execute tests on the development environment
Manually release the software and set the version number to the released version.
Manually deploy to the integration test environment by downloading the artefact from a repository, based on the version set by the release build.
Manually deploy to ...
With Jenkins 2.0 comes the Pipeline plugin. But how do these two plugins relate to each other?
Should I migrate to the latest plugin? The things I seem to miss from the Jenkins 2 Pipeline plugin:
Manually trigger a stage. I can wait for an input, but that does not seem very elegant (see the sketch after this list).
Restart a stage to retrigger a deployment. This does not seem possible.
Visibility into the parameters that were used to trigger a stage, e.g. the version number of the software that was deployed.
Am I missing the point here? Should the two of them be combined? Or how are you approaching a pipeline like this?
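For clarity, the "wait for an input" approach I am referring to looks roughly like this in a Jenkins 2 pipeline; the stage name, parameter and deploy command are purely illustrative:

    stage('Deploy to integration') {
        // Pause until someone approves, and capture which version to deploy.
        def version = input(
            message: 'Deploy to the integration environment?',
            parameters: [string(name: 'VERSION', defaultValue: '1.0.0',
                                description: 'Artifact version to deploy')]
        )
        echo "Deploying version ${version} to integration"
        // sh "./deploy.sh integration ${version}"   // hypothetical deploy script
    }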
With the current state of Jenkins 2 pipelines, you are correct about all the 'missing features' you listed.
One of the advantages of the Jenkins 2 Pipeline plugin is that rather than chaining together a series of jobs, as with the Build Pipeline Plugin, your entire pipeline is one 'job', which makes user administration much easier IMO.
The other advantage of Jenkins 2 pipelines is 'configuration as code', so you can track changes to your pipeline as you would any other file in version control.
Jenkins 2 pipelines are very much the new 'hotness', and more plugins are adding compatibility day after day.
Once the new UI becomes production ready, I'd imagine that the old build pipeline plugin will begin to be deprecated.
Also you should be aware that the Build Pipeline plugin is not maintained by the Jenkins or CloudBees teams as far as I know, whereas Jenkins 2 pipelines are.
Would I recommend migrating now? No, I personally still don't consider Jenkins 2 pipelines mature enough for deployment to production in an organisation. I'd say stick with what you know for now while you wait for the Jenkins 2 Pipeline ecosystem to mature.
I gave my reasoning in a blog post a few weeks ago (read more here if you want, but I've extracted the 'weaknesses' for you):
There are still plenty of plugins that I and many others would consider 'core to their CI pipeline' that are missing full or partial support for pipelines.
The lack of 'per-project configuration' in pipelines for many plugins. For example Slack: the current implementation 'assumes' that all Jenkins 2 Pipeline projects should report to the same Slack channel/team, whereas you may want to have multiple Slack teams configured. There are multiple other plugins like this.
At present the documentation of Jenkins 2 Pipelines is very limited, though this is improving.

What is the purpose of a build agent in continuous integration and continuous deployment?

What is the purpose of a build agent in continuous integration (CI) and continuous deployment? Is this something that impacts all CI servers (e.g. Jenkins, TeamCity, TFS, etc.)?
On the TeamCity license types page I noted that the professional server license, which is free, only includes three build agents.
https://www.jetbrains.com/teamcity/buy/#license-type=new-license
The expression build agent basically describes an environment in which builds or jobs of the CI pipeline are run. There are multiple synonyms for this part of the CI infrastructure. TeamCity seems to define a build agent as an environment where one build at a time can run.
Jenkins would define the machine which runs builds as a slave, with a (different) master machine that coordinates which builds run where. Multiple builds can run on the same slave in Jenkins, in different executor slots.
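As a minimal sketch of that model, a scripted-pipeline build occupies one executor slot on whatever agent matches the requested label; the label 'linux' is just an example:

    // Runs in one executor slot of any agent labelled 'linux'.
    node('linux') {
        stage('Build') {
            checkout scm              // fetch the source onto the agent, not the master
            sh 'mvn -B clean package'
        }
    }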
Another system using a build agent is Team Foundation Server, which should be structured similarly to TeamCity's solution. There has already been a more detailed answer here.

Continuous Delivery pipeline integrated with TFS

We work on .NET project, using TFS for:
source control
builds: gated check-ins that produce MSI files
deployments to Labs
We want to create a proper continuous delivery pipeline, that is, a dashboard with a pipeline for each check-in, with traffic lights.
Pipeline should show all the stages like TFS build > Deploy to Lab > Smoke test > Integration Tests > Acceptance Tests > Deploy to PreProd > ...
So it has to be tightly integrated with TFS.
We are assessing 2 options:
use a TFS-based tool/plugin/dashboard, if there are any that can show pipelines?
use a CI tool, for example Jenkins, TeamCity or Bamboo, to build this pipeline - ideally with support for fetching the built code from the TFS drop folder, not just the source code
What would you recommend?
If you are using TFS, why don't you leverage the built-in Release Management tooling? You can create a release pipeline that is automated and even include approvals if necessary.
http://nakedalm.com/building-release-pipeline-release-management-visual-studio-2013/
If you want to integrate the lab tools for collecting test results as part of your pipeline, this works as well.
http://nakedalm.com/execute-tests-release-management-visual-studio-2013/
This works pretty well, and the new features announced at Connect() will make it even better.
