How to download source code by a specific changeset number in Jenkins? - jenkins

node {
    stage('Source Control Management') {
        checkout([
            $class: 'TeamFoundationServerScm',
            credentialsConfigurer: [
                $class: 'AutomaticCredentialsConfigurer'
            ],
            projectPath: '$/Onprem/Source/Service',
            serverUrl: 'http://abcd/',
            useOverwrite: true,
            useUpdate: true,
            workspaceName: 'Hudson-${JOB_NAME}-${NODE_NAME}'
        ])
    }
}
This pipeline script checks out the latest code, whose changeset number is 921.
I want the pipeline to check out the code at the previous changeset, number 917.
What do I need to do?

The plugin README says this is available and describes how to do it for a Freestyle job: use a versionspec argument to specify the affected versions of items. I confirmed this works in a Freestyle job, but not in a Pipeline. Also, it only works if supplied as a Build Parameter, not as a regular parameter; odd.
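For reference, a TFVC version spec for a changeset is written as C<number>, so changeset 917 is C917. Below is a minimal sketch of the build-parameter approach the README describes; note that the answer above only confirms it for Freestyle jobs, and the parameter name VERSION_SPEC is an assumption, so use whatever name the plugin documentation specifies.
// Sketch only: expose the version spec as a build parameter so the TFS plugin
// can pick it up. Per the README (and the answer above), this is confirmed for
// Freestyle jobs only, and the parameter name VERSION_SPEC is an assumption.
properties([
    parameters([
        string(name: 'VERSION_SPEC', defaultValue: 'C917',
               description: 'TFVC version spec, e.g. C917 for changeset 917')
    ])
])
// ...followed by the node { checkout([$class: 'TeamFoundationServerScm', ...]) } block unchanged.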

Related

How to get a Jenkins Pipeline build to show change history (changes) for a specific branch

I am using a declarative pipeline and groovy scripts to check out my branch. I check out using the checkout step:
checkout([$class: 'GitSCM',
    branches: [[name: "${selectedBranch}"]],
    browser: [$class: 'BitbucketWeb', repoUrl: 'myURL'],
    doGenerateSubmoduleConfigurations: false,
    extensions:
        [[$class: 'CloneOption', noTags: false, reference: "${cloneReference}", shallow: true, timeout: 5]],
    submoduleCfg: [],
    userRemoteConfigs: [[url: "${projectDetails.repositoryAddress}"]]])
And that works great. However, when looking at the change history, it shows the history for my shared library, NOT for the branch that was actually checked out. This means I'm getting all the history for my Jenkins Groovy changes, but no history for the actual solution/source being built. I cannot figure out a way to overcome this.
On my Jenkins job I see this:
Started by user Me
Revision: 53eb41e0c05fd4cb466268947102990b2b14354e
GroovyImplementation
Revision: 825d8201904b000f479ebc91c9d244cfb956dd85
refs/remotes/origin/releases/release-2.18
On the "Changes" page I see changes for "GroovyImplementation" (of which there are frequently none) but I want changes for "refs/remotes/origin/releases/release-2.18" which is where the meaningful changes are.
Similarly, on the Stage View I see the number of commits for "GroovyImplementation" rather than the release branch.
How can I display checkout information for the release branch without using a multi-pipeline build?
Since you're using the branch selectedBranch, what you want is supposed to be supported out of the box by the SCM step plugin (changelog is set to true by default).
However, I found it did not work either, so what you can do is use this extension of the plugin: [$class: 'ChangelogToBranch', options: [compareRemote: 'myURL', compareTarget: "${selectedBranch}"]]
As you can see, extensions: takes an array of arrays, so you just add this extension to your existing ones, e.g.
extensions:
    [
        [$class: 'CloneOption', noTags: false, reference: "${cloneReference}", shallow: true, timeout: 5],
        [$class: 'ChangelogToBranch', options: [compareRemote: 'myURL', compareTarget: "${selectedBranch}"]]
    ],
The changelog is calculated against a previous build by default; you can read about it here: https://github.com/jenkinsci/git-plugin#changelog-extensions. If you want the changelog to be calculated against a specific branch, you can configure the job explicitly by adding the additional behaviour "Calculate changelog against a specific branch".
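Put together with the checkout from the question, the full step might look like this; all values below are taken from the question and the answer above.
checkout([$class: 'GitSCM',
    branches: [[name: "${selectedBranch}"]],
    browser: [$class: 'BitbucketWeb', repoUrl: 'myURL'],
    doGenerateSubmoduleConfigurations: false,
    extensions: [
        // CloneOption as before, plus ChangelogToBranch so the "Changes" page is
        // calculated against the release branch instead of the shared library
        [$class: 'CloneOption', noTags: false, reference: "${cloneReference}", shallow: true, timeout: 5],
        [$class: 'ChangelogToBranch', options: [compareRemote: 'myURL', compareTarget: "${selectedBranch}"]]
    ],
    submoduleCfg: [],
    userRemoteConfigs: [[url: "${projectDetails.repositoryAddress}"]]])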

How do I dynamically load a Jenkins pipeline library from Perforce? [duplicate]

In continuation of jenkins-pipeline-syntax-for-p4sync, I am not able to get the "Poll SCM" option to work for my pipeline job.
Here is my configuration:
"Poll SCM" is checked and set to poll every 10 minutes
Pipeline script contains the following:
node ('some-node') // not actual value
{
stage ('checkout')
{
checkout([
$class: 'PerforceScm',
credential: '11111111-1111-1111-1111-11111111111', // not actual value
populate: [
$class: 'AutoCleanImpl',
delete: true,
modtime: false,
parallel: [
enable: false,
minbytes: '1024',
minfiles: '1',
path: '/usr/local/bin/p4',
threads: '4'
],
pin: '',
quiet: true,
replace: true
],
workspace: [
$class: 'ManualWorkspaceImpl',
charset: 'none',
name: 'jenkins-${NODE_NAME}-${JOB_NAME}',
pinHost: false,
spec: [
allwrite: false,
clobber: false,
compress: false,
line: 'LOCAL',
locked: false,
modtime: false,
rmdir: false,
streamName: '',
view: '//Depot/subfolder... //jenkins-${NODE_NAME}-${JOB_NAME}/...' // not actual value
]
]
]
)
}
stage ('now do something')
{
sh 'ls -la'
}
}
Ran the job manually once
Still, polling does not work, and the job does not have a "Perforce Software Polling Log" link like a non-pipeline job has when the Perforce source and Poll SCM are configured in the GUI.
It's as if the PerforceScm checkout is missing a poll: true setting, or I'm doing something wrong.
Currently I have a workaround in which I poll Perforce in a non-pipeline job that triggers the pipelined job, but then I have to pass the changelists manually, and I would rather the pipeline job did everything.
edit: versions
jenkins - 2.7.4
P4 plugin - 1.4.8
Pipeline plugin - 2.4
Pipeline SCM Step plugin - 2.2
If you go to the Groovy snippet generator and check the "include in polling" checkbox, you'll see that the generated code includes a line item for it:
checkout([
poll: true,
As an aside, you may run into problems at the moment using ${NODE_NAME} in your workspace name. The polling runs on the master, so it might not properly find the change number of your previous build. If that's the case, I know a fix for it should be coming shortly.
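Applied to the checkout from the question, the generated snippet takes roughly this shape (abbreviated here; prefer the exact output of your own Snippet Generator):
// Sketch: poll (and changelog) are parameters of the checkout step itself,
// wrapping the PerforceScm definition from the question (shortened below).
checkout(
    poll: true,
    changelog: true,
    scm: [
        $class: 'PerforceScm',
        credential: '11111111-1111-1111-1111-11111111111', // not actual value
        populate: [$class: 'AutoCleanImpl', delete: true, quiet: true, replace: true],
        workspace: [
            $class: 'ManualWorkspaceImpl',
            charset: 'none',
            name: 'jenkins-${NODE_NAME}-${JOB_NAME}',
            spec: [view: '//Depot/subfolder... //jenkins-${NODE_NAME}-${JOB_NAME}/...'] // not actual value
        ]
    ]
)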
After updating all the plugins to the latest versions (as of this post date) and restarting the Jenkins server, polling appears to be working with the exact same configuration (the job now has the poll log link).
I'm not sure what exactly resolved the issue, but I consider it resolved.

Reusing stages of a Jenkins pipeline in multiple jobs

My team is moving to Jenkins 2 and I am using the pipeline plugin so that our build can live in our repository. Because getting repositories allocated involves a lot of overhead in our company, we have a single repository with many sub-projects & sub-modules in it.
What I want is separate builds and reporting of Junit/checkstyle/etc reports for each sub-module as well as a final "build and deploy" step for each sub-project putting it all together.
My current plan is to create separate jobs for each sub-module so that they get their own junit/checkstyle/etc. report pages, then have a multi-job project to orchestrate the sub-module builds for the sub-projects. Since all of the sub-projects are simple jar builds, I want to put the bulk of the logic in a common file, let's call it JenkinsfileForJars, at the root of the sub-project. So the repo structure is
sub-project
    JenkinsfileForJars.groovy
    sub-moduleA
        Jenkinsfile
    sub-moduleB
        Jenkinsfile
My Jenkinsfile contains
def submoduleName = "submoduleA"
def pipeline
node {
pipeline = load("${env.WORKSPACE}/subproject/JenkinsfileForJars.groovy")
}
pipeline.build()
pipeline.results()
And my JenkinsfileForJars contains
def build() {
stage('Build') {
// Run the Gradle build for the given sub-module
dir("subproject") {
sh "./gradlew ${submoduleName}:build"
}
}
}
def results() {
stage('Results') {
dir("subproject/${submoduleName}") {
junit 'build/test-results/TEST-*.xml'
archive 'build/libs/*.jar'
publishHTML([allowMissing: false, alwaysLinkToLastBuild: false, keepAll: false, reportDir: 'build/reports/cobertura/', reportFiles: 'frame-summary.html', reportName: 'Cobertura Report'])
publishHTML([allowMissing: false, alwaysLinkToLastBuild: false, keepAll: false, reportDir: 'build/reports/findbugs/', reportFiles: 'main.html', reportName: 'Findbugs Report'])
publishHTML([allowMissing: false, alwaysLinkToLastBuild: false, keepAll: false, reportDir: 'build/reports/pmd/', reportFiles: 'main.html', reportName: 'PMD Report'])
step([$class: 'CheckStylePublisher', pattern: 'build/reports/checkstyle/main.xml', unstableTotalAll: '200', usePreviousBuildAsReference: true])
}
}
}
return this;
When I run the Jenkinsfile above I get the following error:
Running on master in /var/lib/jenkins/workspace/jobA
[Pipeline] {
[Pipeline] load
[Pipeline] { (/var/lib/jenkins/workspace/jobA/subproject/JenkinsfileForJars.groovy)
[Pipeline] }
[Pipeline] // load
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
java.lang.NullPointerException: Cannot invoke method build() on null object
As far as I can tell, I am following what is shown in the documents for loading manual scripts and the example given for a loaded script. I do not understand why my script is null after the load command.
How do I get my Jenkinsfile to load JenkinsfileForJars.groovy?
The problem is related to the SCM checkout as mentioned by Blake Mitchell in the comment above.
Since you are loading your Groovy functions from a submodule, you will need to check out the submodule first, preferably on a build agent/slave if you would like to keep only bare repos on the master.
def pipeline
node( 'myAgentLabel' ) {
stage ( 'checkout SCM' ) {
checkout([
$class: 'GitSCM'
,branches: scm.branches
,extensions: scm.extensions
+ [[ $class: 'SubmoduleOption', disableSubmodules: false, parentCredentials: true, recursiveSubmodules: true, reference: '', trackingSubmodules: false]]
,doGenerateSubmoduleConfigurations: false
,userRemoteConfigs: scm.userRemoteConfigs
])
pipeline = load( "${env.WORKSPACE}/path/to/submodule/myGroovyFunctions.groovy" )
}
pipeline.build()
}
Note that in the checkout example, access to the scm.* attributes also needs to be whitelisted by an administrator in Jenkins (In-process Script Approval).
There might be two possible problems:
Why do you put the load inside a node block? This import does not need computational resources, so you do not need it there.
The call to build should be put inside a node block, and the call to results should probably also be inside (the same) node block to make sure that the correct results are archived (if you use more than one (slave) node).
(This should probably be a comment below your question but I do not have enough points to add a comment there.)
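Putting the two answers together, a minimal sketch of the calling Jenkinsfile could look like the following. It assumes the job is configured as "Pipeline script from SCM" (or multibranch), so that checkout scm is available, and keeps the load path from the question.
def submoduleName = "submoduleA"
def pipeline

node {
    stage('Checkout') {
        // make sure JenkinsfileForJars.groovy actually exists in the workspace
        checkout scm
    }
    pipeline = load("${env.WORKSPACE}/subproject/JenkinsfileForJars.groovy")
    // call the loaded functions inside the same node block
    pipeline.build()
    pipeline.results()
}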

Jenkinsfile DSL how to specify target directory

I'm exploring Jenkins 2.0 pipelines. So far my file is pretty simple.
node {
stage "checkout"
git([url:"https://github.com/luxengine/math.git"])
stage "build"
echo "Building from pipeline"
}
I can't seem to find any way to set the directory that git will check out to. I also can't find any documentation on that. I found https://jenkinsci.github.io/job-dsl-plugin/ but it doesn't seem to match what I see in other tutorials.
Clarification
It looks like you are trying to configure a Pipeline job (formerly known as Workflow). This type of job is very distinct from Job DSL.
The purpose of Pipeline job is to:
Orchestrates long-running activities that can span multiple build slaves. Suitable for building pipelines (formerly known as workflows) and/or organizing complex activities that do not easily fit in free-style job type.
Whereas Job DSL:
...allows the programmatic creation of projects using a DSL. Pushing job creation into a script allows you to automate and standardize your Jenkins installation, unlike anything possible before.
Solution
If you want to check out your code into a specific directory, replace the git step with the more general SCM checkout step.
The final Pipeline configuration should look like this:
node {
stage "checkout"
//git([url:"https://github.com/luxengine/math.git"])
checkout([$class: 'GitSCM',
branches: [[name: '*/master']],
doGenerateSubmoduleConfigurations: false,
extensions: [[$class: 'RelativeTargetDirectory',
relativeTargetDir: 'checkout-directory']],
submoduleCfg: [],
userRemoteConfigs: [[url: 'https://github.com/luxengine/math.git']]])
stage "build"
echo "Building from pipeline"
}
As a future reference for Jenkins 2.0 and the Pipeline DSL, please use the built-in Snippet Generator or the documentation.
This can be done by using the dir directive:
def exists = fileExists '<your target dir>'
if (!exists) {
    new File('<your target dir>').mkdir()
}
dir ('<your target dir>') {
    git url: '<your git repo address>'
}
First, make sure that you are actually using Jenkins Job DSL.
You can do it like this:
scm {
    git {
        wipeOutWorkspace(true)
        shallowClone(true)
        remote {
            url("xxxx....")
            relativeTargetDir('checkout-folder')
        }
    }
}
https://jenkinsci.github.io/job-dsl-plugin/
The address above lets you simply type, for example, 'scm' in the upper-left area, and it will then show the contexts in which 'scm' can be used. You can then select 'scm-freestylejob' and click on the '***' to see the details.
The general start point for Jenkins Job DSL is here:
https://github.com/jenkinsci/job-dsl-plugin/wiki
You can of course ask here on SO or in the Google group:
https://groups.google.com/forum/#!forum/job-dsl-plugin
pipeline {
agent any
stages{
stage("Checkout") {
steps {
// create the 'git' directory if it does not already exist (mkdir.sh is shown below)
bat label: '', script: 'sh "mkdir.sh"'
dir ('cm') {
git branch: 'dev',
credentialsId: '<your credential id>',
url: '<yours git url>'
}
}
} //End of Checkout stage
stage("TestShellScript") {
steps {
bat label: '', script: 'sh "PrintNumber.sh"'
}
}
}//End of stages
} // End of pipeline
Note: cat mkdir.sh
#!/bin/bash
#Create a directory
mkdir git
You are using the Pipeline plugin, not the Job DSL plugin. In the Pipeline plugin, if you want to define something where there is not yet a function available in the Pipeline syntax, you can define it yourself.

Updating Jira tickets from Jenkins workflow (jenkinsfile)

How can I update a Jira issue from within a Jenkinsfile (Jenkins workflow/pipeline)?
Is there a way I could use the Jira Issue Updater plugin as a step in the Jenkinsfile?
I know I could use the Jira REST API, but I'm trying to figure out if I can re-use the functionality provided by the jira-updater-issue.
What I'm looking for is something similar to the example below, which calls the JUnit archiver and the artifact archiver, but calling the Jira updater instead.
node {
git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
def mvnHome = tool 'M3'
sh "${mvnHome}/bin/mvn -B -Dmaven.test.failure.ignore verify"
step([$class: 'ArtifactArchiver', artifacts: '**/target/*.jar', fingerprint: true])
step([$class: 'JUnitResultArchiver', testResults: '**/target/surefire-reports/TEST-*.xml'])
}
The Jira Plugin is compatible with Pipeline.
This should work:
step([$class: 'hudson.plugins.jira.JiraIssueUpdater',
issueSelector: [$class: 'hudson.plugins.jira.selector.DefaultIssueSelector'],
scm: [$class: 'GitSCM', branches: [[name: '*/master']],
userRemoteConfigs: [[url: 'https://github.com/jglick/simple-maven-project-with-tests.git']]]])
You can get a full reference in the built-in Pipeline Snippet Generator.
The JIRA Steps Plugin provides a more declarative way to update an existing Jira Ticket:
node {
stage('JIRA') {
// Look at the IssueInput class for more information.
def testIssue = [fields: [ // id or key must be present for project.
project: [id: '10000'],
summary: 'New JIRA Created from Jenkins.',
description: 'New JIRA Created from Jenkins.',
customfield_1000: 'customValue',
// id or name must be present for issuetype.
issuetype: [id: '3']]]
response = jiraEditIssue idOrKey: 'TEST-01', issue: testIssue
echo response.successful.toString()
echo response.data.toString()
}
}
Since you would like to use the Jenkinsfile to define your pipeline, that should be the preferred way for you to go...
As this was way harder for me than it should have been, here is a working example. It updates a custom field of a ticket with a specific value:
step([$class: 'IssueFieldUpdateStep',
issueSelector: [$class: 'hudson.plugins.jira.selector.ExplicitIssueSelector', issueKeys: ticket],
fieldId: field,
fieldValue: value
])
The snippet generator did not work for me. The variables ticket, field and value are all strings. Starting from this, you can look for more options here: https://www.jenkins.io/doc/pipeline/steps/jira/
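For reference, the three strings could be defined along these lines; the values here are purely illustrative:
def ticket = 'PROJ-123'            // Jira issue key (illustrative)
def field  = 'customfield_10000'   // id of the field to update (illustrative)
def value  = 'deployed to staging' // new field value (illustrative)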
Yes, it seems like this page answers your question:
https://wiki.jenkins-ci.org/display/JENKINS/Jira+Issue+Updater+Plugin
After you install the plugin, add a build step or a pre/post build step to call this plugin.
There you can give it the REST URL of your Jira server, the credentials, and the JQL to find the issues.
