How to create a Jenkins Build Pipeline that depends on a test result? - jenkins

I have 3 Jenkins jobs: smoke tests, critical path test (part 1), and critical path test (part 2).
Currently they run one after another. I need to create a build pipeline that depends on test results. I need to take into account the result of a single test (@Test annotation in TestNG), ignoring the overall result of the test suite.
I want to get a configuration like this:
Smoke tests -> if the specified test passed, then run critical path test Part 1 and Part 2 on different nodes
So, please tell me how to make Jenkins depend on only one test's result (not the whole suite)?

You can try one of the build log analysis plugins:
https://wiki.jenkins-ci.org/display/JENKINS/Text-finder+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/Post+build+task
Scan the build output and downgrade the build result to failure when specific text is found.
Then, in the downstream item, check the option "Build after other projects are built" in the build triggers section. Set the proper upstream item name and the desired trigger result.
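If you go this route, the specific test needs to emit something Text-finder can match. A minimal sketch (not from the original answer; the class, method and marker line are assumptions) of a TestNG test, written in Groovy, that prints a marker only when the one test you care about fails:
import org.testng.Assert
import org.testng.annotations.Test

class SmokeTests {
    @Test
    void criticalLoginCheck() {
        boolean loginWorked = false   // placeholder for the real smoke check
        try {
            Assert.assertTrue(loginWorked, 'login smoke check failed')
        } catch (AssertionError e) {
            // Text-finder (or Post build task) can match this marker, e.g. with the
            // regex "CRITICAL_SMOKE_TEST_FAILED", and downgrade the build to failure.
            println 'CRITICAL_SMOKE_TEST_FAILED'
            throw e
        }
    }
}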

I solved that task by using two Jenkins extensions:
https://wiki.jenkins-ci.org/display/JENKINS/EnvInject+Plugin
https://wiki.jenkins-ci.org/display/JENKINS/Build+Flow+Plugin
1) Create a properties file from the test. The file contains a property that indicates the result of the test step (see the sketch after the groovy script below).
2) With the EnvInject Plugin, add a new build step to the Jenkins job (the step must run after the tests) and inject the parameter value from the file created in step 1.
3) Create a build flow with the Build Flow Plugin.
4) Write a groovy script:
smokeTest = build( "Run_Smoke_Test" )
def isTestStepSuccessful = smokeTest.environment.get( "TestStepSuccessful" )
if (isTestStepSuccessful != "false") {
    parallel (
        {
            build("Run_Critical_Path_Part_1_Test")
            build("Run_Critical_Path_Part_3_Test")
        },
        {
            build("Run_Critical_Path_Part_2_Test")
        }
    )
}
build( "Run_Critical_Path_Final_Test" )

Related

PullRequest Build Validation with Jenkins and OnPrem Az-Devops

First off, the setup in question:
A Jenkins instance with several build nodes and an on-prem Azure DevOps server containing the Git repositories.
The repo in question is too large to build on every push for all branches and all devs, so a small workaround was done:
The production branches have polling enabled twice a day (because of the testing duration, which is handled downstream, more builds would not help with quality).
All other branches have their automated building suppressed. Devs can still start builds/deployments/unit tests manually if they so choose.
The Jenkinsfile is parameterized for which platforms to build; on prod* branches all the platforms are true, on all other branches false.
This helps because otherwise the initial build of a feature branch would always build/deploy all platforms locally, which would put too much load on the server infrastructure.
I added a service endpoint for Jenkins in Azure DevOps and added a build validation .yml. This basically works because, when calling the source branch of the pull request with the merge commit ID, I added a parameter isPullRequestBuild which contains the ID of the PR.
Snippet of the .yml:
- task: JenkinsQueueJob@2
  inputs:
    serverEndpoint: 'MyServerEndpoint'
    jobName: 'MyJob'
    isMultibranchJob: true
    captureConsole: true
    capturePipeline: true
    isParameterizedJob: true
    multibranchPipelineBranch: $(System.PullRequest.SourceBranch)
    jobParameters: |
      stepsToPerform=Build
      runUnittest=true
      pullRequestID=$(System.PullRequest.PullRequestId)
Snippet of the Jenkinsfile:
def isPullRequest = false
if ( params.pullRequestID?.trim() )
{
    isPullRequest = true
    // do stuff to change how the pipeline should react.
}
In the Jenkinsfile I check whether the parameter is not empty, and if so I reset the platforms to build (basically all of them) and run the unit tests.
The problem is: if the branch has never been built, Jenkins does not yet know the parameter on the first run, so it is ignored, nothing is built, and the build returns 0 because "nothing had to be done".
Is there any way to only run the Jenkins build if it hasn't run already?
Or is it possible to tell from the remote call whether this was the build with ID 1?
The only other option would be to call Jenkins via the web API and check for the last successful build, but in that case I would have to store the token somewhere in source control.
Am I missing something obvious here? I don't want the feature branch builds to be triggered more than once only to do nothing, because devs could lose useful information about the builds/deployments they started.
Any ideas appreciated.
To whom it may concern with similar problems:
In the end I used the following workaround:
The Jenkins endpoint is called by a user that is only used for automated builds. So, if that user triggered the build, I set everything up to run a pull request validation, even if it is the first build. Along the lines of:
def causes = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
if (causes != null)
{
    def buildCauses = readJSON text: currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause').toString()
    buildCauses.each { buildCause ->
        if (buildCause['userId'] == "theNameOfMyBuildUser")
        {
            triggeredByAzureDevops = true
        }
    }
}
getBuildCauses must be allowed to run by a Jenkins admin for this to work.

Jenkins - Add a dynamic label to a build

I am trying to achieve the following task in Jenkins:
1) Build a maven project
2) When running the test cases I print certain messages to the console output
3) Parse the console output of the build and determine if certain patterns exist in the output
4) If the pattern exists I want to label the build with a specific string
I have achieved steps 1-3. I am not able to create a dynamic label and tie it to a build. I have a Groovy script that parses the console output and determines if the pattern exists in the build's output.
Bamboo provides this feature to label a build based on regular expressions present in the build's console output.
Link - https://confluence.atlassian.com/bamboo0606/using-bamboo/jobs-and-tasks/configuring-jobs/configuring-miscellaneous-settings-for-a-job/configuring-automatic-labeling-of-job-build-results
I have gone through various existing Jenkins plugins but have not been successful in achieving this functionality. Is there a plugin to achieve this, or can I add additional lines to the Groovy script to create a dynamic build label?
Any help is appreciated.
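One hedged sketch of steps 3-4 (not from the question or the answer below; it assumes the Groovy Postbuild plugin, and the log pattern and label text are placeholders):
// Groovy Postbuild script (assumed plugin); runs after the build finishes.
// If the console log matches the pattern, attach a short text "badge" and
// a description to the build.
if (manager.logContains('.*PERFORMANCE_REGRESSION_DETECTED.*')) {
    manager.addShortText('perf-regression')                   // visible label next to the build
    manager.build.setDescription('perf-regression detected')  // searchable build description
}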
You can use an if to set the agent:
def AGENT_LABEL = null
node('master') {
    stage('Checkout and set agent') {
        checkout scm
        // Or just use any other approach to figure out the agent label: read a file, etc.
        if (env.BRANCH_NAME == 'master') {
            AGENT_LABEL = "prod"
        } else {
            AGENT_LABEL = "dev"
        }
    }
}
pipeline {
    agent {
        label "${AGENT_LABEL}"
    }
    // ... stages go here ...
}

Pass large amount of parameters between jobs

I have two Jenkins jobs that run on separate computers. On computer 1 I read a properties file and use it for environment variables. But I need the same file on PC 2, and it only exists on the first one. When the first Jenkins job finishes it starts the second one and can pass a parameters file to it, but with the Parameterized Trigger Plugin I have to create a separate parameter for each property, and I have a lot of them and don't want to do that. Is there a simple solution for this issue?
Forget Jenkins 1 and the Parameterized Trigger Plugin. Using Jenkins 2, here's an example for your need:
node ("pc1") {
stage "step1"
stash name: "app", includes: "properties_dir/*"
}
node ("pc2") {
stage "step2"
dir("dir_to_unstash") {
unstash "app"
}
}

Jenkins, how to check regressions against another job

When you set up a Jenkins job various test result plugins will show regressions if the latest build is worse than the previous one.
We have many jobs for many projects on our Jenkins and we wanted to avoid having a 'job per branch' set up. So currently we are using a parameterized build to build eg different development branches using a single job.
But that means when I build a new branch any regressions are measured against the previous build, which may be for a different branch. What I really want is to measure regressions in a feature branch against the latest build of the master branch.
I thought we should probably set up a separate 'master' build alongside the parameterized 'branches' build. But I still can't see how I would compare results between jobs. Is there any plugin that can help?
UPDATE
I have started experimenting in the Script Console to see if I could write a post-build script... I have managed to get the latest build of master branch in my parameterized job... I can't work out how to get to the test results from the build object though.
The data I need is available in JSON at
http://<jenkins server>/job/<job name>/<build number>/testReport/api/json?pretty=true
...if I could just get at this data structure it would be great!
I tried using JsonSlurper to load the json via HTTP but I get 403, I guess because my script has no auth session.
I guess I could load the xml test results from disk and parse them in my script, it just seems a bit stupid when Jenkins has already done this.
I eventually managed to achieve everything I wanted, using a Groovy script in the Groovy Postbuild Plugin
I did a lot of exploring using the script console http://<jenkins>/script and also the Jenkins API class docs are handy.
Everyone's use is going to be a bit different as you have to dig down into the build plugins to get the info you need, but here's some bits of my code which may help.
First get the build you want:
def getProject(projectName) {
    // in a postbuild action use `manager.hudson`
    // in the script web console use `Jenkins.instance`
    def project = manager.hudson.getItemByFullName(projectName)
    if (!project) {
        throw new RuntimeException("Project not found: $projectName")
    }
    project
}
// CloudBees folder plugin is supported, you can use natural paths:
project = getProject('MyFolder/TestJob')
build = project.getLastCompletedBuild()
The main test results (jUnit etc) seem to be available directly on the build as:
result = build.getTestResultAction()
// eg
failedTestNames = result.getFailedTests().collect{ test ->
    test.getFullName()
}
To get the more specialised results from eg Violations plugin or Cobertura code coverage you have to look for a specific build action.
// have a look what's available:
build.getActions()
You'll see a list of stuff like:
[hudson.plugins.git.GitTagAction@2b4b8a1c,
 hudson.scm.SCMRevisionState$None@40d6dce2,
 hudson.tasks.junit.TestResultAction@39c99826,
 jenkins.plugins.show_build_parameters.ShowParametersBuildAction@4291d1a5]
These are instances; the part in front of the @ sign is the class name, so I used that to make this method for getting a specific action:
def final VIOLATIONS_ACTION = hudson.plugins.violations.ViolationsBuildAction
def final COVERAGE_ACTION = hudson.plugins.cobertura.CoberturaBuildAction

def getAction(build, actionCls) {
    def action = build.getActions().findResult { act ->
        actionCls.isInstance(act) ? act : null
    }
    if (!action) {
        throw new RuntimeException("Action not found in ${build.getFullDisplayName()}: ${actionCls.getSimpleName()}")
    }
    action
}
violations = getAction(build, VIOLATIONS_ACTION)
// you have to explore a bit more to find what you're interested in:
pylint_count = violations?.getReport()?.getViolations()?."pylint"
coverage = getAction(build, COVERAGE_ACTION)?.getResults()
// if you println it looks like a map but it's really an Enum of Ratio objects
// convert to something nicer to work with:
coverage_map = coverage.collectEntries { key, val -> [key.name(), val.getPercentageFloat()] }
With these building blocks I was able to put together a post-build script which compared the results for two 'unrelated' build jobs, then using the Groovy Postbuild plugin's helper methods to set the build status.
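The comparison step itself isn't shown above; as a rough sketch (not from the original answer; the reference job name is a placeholder, and getProject/getAction are the helpers defined earlier), the failed-test sets of the two builds can be diffed in the same postbuild script:
// Compare the current build's failed tests against the latest completed master build.
// 'MyFolder/Master' is a placeholder job name.
masterBuild = getProject('MyFolder/Master').getLastCompletedBuild()
masterFailed = masterBuild.getTestResultAction().getFailedTests()
        .collect { it.getFullName() } as Set
currentFailed = manager.build.getTestResultAction().getFailedTests()
        .collect { it.getFullName() } as Set

newFailures = currentFailed - masterFailed
if (newFailures) {
    manager.listener.logger.println("Tests failing here but not on master: ${newFailures}")
    manager.buildUnstable()   // Groovy Postbuild helper that downgrades the result
}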
Hope this helps someone else.

Running Jenkins job simultaneously on all nodes

TLDR: I want to be able to run a job simultaneously on multiple nodes in a Jenkins pipeline [for example - build application x on dev, test & staging nodes based on AWS].
I have a large group of nodes with the same label. I would like to be able to run a job in Jenkins that executes on all of the nodes with the same label as well as doing so simultaneously.
I saw a suggestion to use the matrix configuration option in Jenkins, but I can only think of one axis (the label group). When I try and run the job, it seems like it only executes once instead of 300 times (1 for each of the nodes in that label group).
What should my other axis be? Or...is there some plugin to do this? I had tried the NodeLabel Parameter Plugin, and choosing "run on all available online nodes", but it does not seem to run the jobs simultaneously.
Install the Parameterized Trigger Plugin and the NodeLabel Parameter Plugin.
For the job you want to run, enable "Execute concurrent builds if necessary".
Create another job besides the job you want to run on all slaves and configure it:
Build > Add build step > Trigger/call builds on other projects
Add ParameterFactories > All Nodes for Label Factory > Label: the label of the nodes
The matrix build will work; use "Slaves" as the axis and expand the "Individual nodes" list to select all of your nodes.
Note that you will need to update the selection every time you add or remove a slave.
For a more maintainable solution, you could use the Job DSL plugin to set up a seed job that has the template for the build, then loops over each slave and creates a new job with the build label set to the name of the slave.
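A rough Job DSL sketch of such a seed job (the generated job names and build step are placeholders, not a tested configuration; iterating over nodes this way may need script approval):
// Job DSL seed script: create one freestyle job per slave, pinned to that slave.
import jenkins.model.Jenkins

Jenkins.instance.nodes.each { slave ->
    job("mybuild-on-${slave.nodeName}") {
        label(slave.nodeName)      // restrict where this generated job runs
        steps {
            shell('make build')    // placeholder for the real build step
        }
    }
}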
There are two plugins that you need: the Parameterized Trigger Plugin, to be able to trigger other jobs as a build step of your main job, and the NodeLabel Parameter Plugin (read the BuildParameterFactory section for a description of what you need) to specify the label.
The best and easiest way to accomplish this is using the Elastic Axis plugin.
1. Install the plugin.
2. Create a Multi Configuration job (install that job type if not present).
3. In the job configuration you will find a new axis type, Elastic Axis. Add the label of the nodes to get the job to run on multiple slaves.
Taking a few of the above answers and adjusting them for the 2.0 series.
You can now launch a job on all nodes.
// The script triggers PayloadJob on every node.
// It uses Node and Label Parameter plugin to pass the job name to the payload job.
// The code will require approval of several Jenkins classes in the Script Security mode
def branches = [:]
def names = nodeNames()
for (int i = 0; i < names.size(); ++i) {
    def nodeName = names[i];
    // Into each branch we put the pipeline code we want to execute
    branches["node_" + nodeName] = {
        node(nodeName) {
            echo "Triggering on " + nodeName
            build job: 'PayloadJob', parameters: [
                new org.jvnet.jenkins.plugins.nodelabelparameter.NodeParameterValue
                    ("TARGET_NODE", "description", nodeName)
            ]
        }
    }
}
// Now we trigger all branches
parallel branches

// This method collects a list of Node names from the current Jenkins instance
@NonCPS
def nodeNames() {
    return jenkins.model.Jenkins.instance.nodes.collect { node -> node.name }
}
Taken from the code
https://jenkins.io/doc/pipeline/examples/#trigger-job-on-all-nodes
Rundeck might be a tool better suited to your needs. It can be set up to run several jobs in parallel and has a plugin for Jenkins: http://rundeck.org/
Rundeck is designed to integrate with larger systems. We generate the resource file from our configuration management database. Very easy to do; see the documentation: http://rundeck.org/docs/administration/node-resource-sources.html.
Additionally plugins available for amazon and/or systems like puppet and chef: http://rundeck.org/plugins
I was looking for a way to run docker system prune on all nodes (with the label docker). I ended up with a pretty simple scripted pipeline, which AFAIK only needs the pipeline plugin to work:
#!/usr/bin/env groovy
def nodes = [:]
nodesByLabel('docker').each {
    nodes[it] = { ->
        node(it) {
            stage("docker-prune@${it}") {
                sh('docker system prune -af --filter "until=1440h"')
            }
        }
    }
}
parallel nodes
Note: requires the Pipeline Utility Steps plugin (for nodesByLabel).
What this does: it looks for all nodes with the label docker, then iterates over them and creates an associative array nodes with one step per node found (to be precise, it cleans up all Docker artifacts older than 60 days). parallel nodes then executes on all found nodes simultaneously.
Hope that this will help someone.
Got it - no need for any special plugin!
I've created a parent job that triggers/calls another build, and when I call it I pass it the label that I want the child job to run on.
So basically the parent job only triggers the job I need, and the child job runs as many times as there are slaves with that label (in my case 4 times).
Enable This project is parameterized, add a parameter of type Label, enter an arbitrary name for the label and select a default value such as a label covering a number of nodes or a conjunction (&&) of such labels. Enable Run on all nodes matching the label, keep Run regardless of result, and keep Node eligibility at All nodes.
Solution: You can succinctly run the same build in parallel across multiple Jenkins nodes.
This can be useful for building the same project on different environments (for example: building node applications on test, dev and staging environments).
Example:
pipeline {
    agent { docker { image 'node:14-alpine' } }
    stages {
        stage('build') {
            steps {
                parallelTasks()
            }
        }
    }
}
def parallelTasks() {
    def labels = ['test', 'dev', 'staging'] // labels for Jenkins node types we will build on
    def builders = [:]
    for (x in labels) {
        def label = x
        builders[label] = {
            node(label) {
                sh """#!/bin/bash -le
                echo "build app on ${label} node"
                cd /home/app
                npm run build
                """
            }
        }
    }
    parallel builders
}
