SparseCheckout in Jenkinsfile pipeline

In a Jenkinsfile, I have specified the folder I want to check out through SparseCheckoutPaths, but I am getting a checkout of the whole branch instead.
checkout([$class: 'GitSCM',
          branches: [[name: '*/branchName']],
          extensions: [[$class: 'SparseCheckoutPaths', path: 'FolderName']],
          userRemoteConfigs: [[credentialsId: 'someID',
                               url: 'git@link.git']]])

Here is the answer to my own question. For a bit of background on how it works: the git client has a configuration flag called core.sparsecheckout which is responsible for this kind of checkout. Additionally, a file named sparse-checkout listing the desired paths is also required. For more info, look here.
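For illustration, this is roughly what that configuration amounts to on the git side; a minimal sketch using an sh step, assuming the repository has already been cloned into the current workspace and folderName/ is a placeholder path:
sh '''
    # enable the sparse checkout feature for this clone
    git config core.sparsecheckout true
    # list the paths that should be materialized in the working tree
    echo "folderName/" >> .git/info/sparse-checkout
    # re-read the index so only the listed paths appear in the working tree
    git read-tree -mu HEAD
'''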
My problem was the Jenkinsfile syntax; the correct one is as follows:
checkout([$class: 'GitSCM',
          branches: [[name: '*/branchName']],
          doGenerateSubmoduleConfigurations: false,
          extensions: [
              [$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'folderName/']]]
          ],
          submoduleCfg: [],
          userRemoteConfigs: [[credentialsId: 'someID',
                               url: 'git@link.git']]])
For more info, see the GitHub link.

You can define a custom step sparseCheckout in a shared library that builds on top of the existing checkout scm.
vars/sparseCheckout.groovy:
def call(scm, files) {
    if (scm.class.simpleName == 'GitSCM') {
        def filesAsPaths = files.collect {
            [path: it]
        }
        return checkout([$class: 'GitSCM',
                         branches: scm.branches,
                         doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
                         extensions: scm.extensions +
                                 [[$class: 'SparseCheckoutPaths', sparseCheckoutPaths: filesAsPaths]],
                         submoduleCfg: scm.submoduleCfg,
                         userRemoteConfigs: scm.userRemoteConfigs
        ])
    } else {
        // fall back to checking out everything by default
        return checkout(scm)
    }
}
Then you call it with:
sparseCheckout(scm, ['path/to/file.xml', 'path2'])
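In a declarative pipeline, a minimal sketch of calling this custom step from a stage could look like the following (the file list is a placeholder; scm is the SCM configured for the job):
pipeline {
    agent any
    stages {
        stage('Sparse checkout') {
            steps {
                // the shared-library step defined above, reusing the job's SCM configuration
                sparseCheckout(scm, ['path/to/file.xml', 'another/folder/'])
            }
        }
    }
}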

Your syntax looks good, but, as seen in "jenkinsci/plugins/gitclient/CliGitAPIImpl.java", did you specify the right configuration?
private void sparseCheckout(@NonNull List<String> paths) throws GitException, InterruptedException {
    boolean coreSparseCheckoutConfigEnable;
    try {
        coreSparseCheckoutConfigEnable = launchCommand("config", "core.sparsecheckout").contains("true");
    } catch (GitException ge) {
        coreSparseCheckoutConfigEnable = false;
    }
In other words, is git config core.sparsecheckout equal to true in the repo you are about to checkout?
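To verify this from the pipeline itself, a quick hedged check (assuming the checkout already ran in the current workspace directory) could be:
// prints the current value of core.sparsecheckout, or a note if it is not set
sh 'git config core.sparsecheckout || echo "core.sparsecheckout is not set"'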

Related

Restarting one stage in a Jenkins pipeline wipes out the existing directory

I am using a Jenkins declarative pipeline Jenkinsfile for our project. We want to try the 'Restart from Stage' option.
pipeline {
    agent { label 'worker' }
    stages {
        stage('clean directory') {
            steps {
                cleanWs()
            }
        }
        stage('checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: 'develop']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'devops'], [$class: 'LocalBranch', localBranch: "**"]], userRemoteConfigs: [[credentialsId: 'xxxxxx', url: 'git@github.com:test/devops.git']]])
                checkout([$class: 'GitSCM', branches: [[name: 'develop']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'harness'], [$class: 'LocalBranch', localBranch: "**"]], userRemoteConfigs: [[credentialsId: 'xxxxxx', url: 'git@github.com:test/harness.git']]])
                checkout([$class: 'GitSCM', branches: [[name: 'develop']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'automation'], [$class: 'LocalBranch', localBranch: "**"]], userRemoteConfigs: [[credentialsId: 'xxxxxx', url: 'git@github.com:test/automation.git']]])
            }
        }
        stage('build initial commit to release train') {
            steps {
                sh '''#!/bin/bash
                export TASK="build_initial_commit"
                cd automation
                sh main.sh
                '''
            }
        }
        stage('deploy application') {
            steps {
                sh '''#!/bin/bash
                export TASK="deploy"
                cd automation
                sh main.sh
                '''
            }
        }
    }
}
In Jenkins I am using 'Pipeline script from SCM'. The Jenkinsfile is in the automation.git repo (which is also checked out in the checkout stage).
Whenever I restart the pipeline from the third stage in the GUI, the workspace directory automatically gets cleaned up and only automation.git is checked out again, so the run fails because the other cloned repos are gone.
How do I handle this? I want to restart the stage without wiping out the workspace directory.
If I just want to run only the 'deploy application' stage, I cannot, because it depends on all three repos. While restarting only that stage the workspace gets wiped out, and since the checkout is done in an earlier stage (which is skipped), the job fails.
How do I run only that stage while retaining the old workspace?
How about this:
SHOULD_CLEAN = true
pipeline {
    agent { label 'worker' }
    stages {
        stage('clean directory') {
            steps {
                script {
                    if (SHOULD_CLEAN) {
                        cleanWs()
                        SHOULD_CLEAN = false
                    } else {
                        echo 'Skipping workspace clean'
                    }
                }
            }
        }
        // ... remaining stages unchanged

Jenkins not sending email if changeset not empty and status is not failure

I have something really weird happening.
I use jenkins scripted pipeline to send an email with email ext plugin and template groovy-html.template.
The email is properly sent if the changeset is empty or if the build result is FAILURE, but if the build result is SUCCESS or UNSTABLE and the changeset is not empty, I never get the email.
I looked into all the Jenkins logs and did not find any error that could explain this behavior.
The issue also happens with the Jelly HTML or Groovy text email templates.
Any idea why I'm getting this behaviour?
Here is my code snippet:
emailext(
    subject: 'Deployment',
    body: '${SCRIPT, template="groovy-html.template"}',
    to: 'email@address.com')
And here is the complete pipeline.
Would you like to try using a declarative pipeline?
Change this section
node('master') {
    checkout(scm: [$class: 'GitSCM',
                   branches: [[name: "*/develop"]],
                   extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'repo1']],
                   userRemoteConfigs: [[credentialsId: 'bitbucket.jenkins',
                                        url: 'urlToRepo.git']]],
             changelog: true, poll: true)
    showChangeLogs()
    //currentBuild.result = 'FAILURE'
    emailext(
        subject: 'Deployment',
        body: '${SCRIPT, template="groovy-html.template"}',
        to: 'email@address.com')
}
with this one, so that the email is sent from a post { always } block regardless of the build result:
pipeline {
    agent any
    stages {
        stage('master') {
            steps {
                script {
                    checkout(scm: [$class: 'GitSCM',
                                   branches: [[name: "*/develop"]],
                                   extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'repo1']],
                                   userRemoteConfigs: [[credentialsId: 'bitbucket.jenkins',
                                                        url: 'urlToRepo.git']]],
                             changelog: true, poll: true)
                    showChangeLogs()
                    //currentBuild.result = 'FAILURE'
                }
            }
        }
    }
    post {
        always {
            emailext(
                subject: 'Deployment',
                body: '${SCRIPT, template="groovy-html.template"}',
                to: 'email@address.com')
        }
    }
}

Jenkins pipeline checkout based on parameter provided

I have a Jenkins declarative pipeline and want to check out a branch based on a parameter provided:
def envToBranch = [:]
envToBranch['dev'] = 'develop'
envToBranch['stg'] = 'stage'
envToBranch['prod'] = 'master'
pipeline {
    parameters {
        choice(name: 'ENV', choices: ['dev', 'stg', 'prod'], description: 'Application environment')
    }
    stages {
        stage('Checkout') {
            steps {
                checkout([$class: 'GitSCM',
                          branches: [[name: '<how-to-access-mapping-here>']],
                          doGenerateSubmoduleConfigurations: false,
                          extensions: [
                              [$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'ansible/']]]
                          ],
                          submoduleCfg: [],
                          userRemoteConfigs: [
                              [credentialsId: 'my-creds',
                               url: 'git@github.com:MyOrg/my-repo.git']
                          ]])
            }
        }
    }
}
So how can I access the mapping within the checkout step? Am I able to do the same within a script block?
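As a minimal sketch of one way this could work, assuming the map stays defined above the pipeline block as in the question, the lookup can be done inside a script block:
stage('Checkout') {
    steps {
        script {
            // look up the branch for the selected environment parameter
            def branch = envToBranch[params.ENV]
            checkout([$class: 'GitSCM',
                      branches: [[name: "*/${branch}"]],
                      userRemoteConfigs: [[credentialsId: 'my-creds',
                                           url: 'git@github.com:MyOrg/my-repo.git']]])
        }
    }
}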

How to start jobs (in parallel) from the main job with different inputs in Jenkins?

I am using Jenkins with scripted syntax in a Jenkinsfile.
In the main job, after the source checkout, I need to run another job n times (in parallel) with different inputs.
Any tips on how to start this?
def checkout(repo, branch) {
    checkout(changelog: false,
             poll: false,
             scm: [$class: 'GitSCM',
                   branches: [[name: "*/${branch}"]],
                   doGenerateSubmoduleConfigurations: false,
                   recursiveSubmodules: true,
                   extensions: [[$class: 'LocalBranch', localBranch: "${branch}"]],
                   submoduleCfg: [], userRemoteConfigs: [[credentialsId: '', url: "${repo}"]]])
    withCredentials([[$class: '',
                      credentialsId: '',
                      passwordVariable: '',
                      usernameVariable: '']]) {
        sh "git clean -f && git reset --hard origin/${branch}"
    }
}
node("jenkins02") {
    stage('Checkout') {
        checkout gitHubRepo, gitBranch
    }
}
We do this by storing all the jobs we want to run in a Map and then passing it into the parallel step for execution. So you just set up the different params, add each definition to the map, and then execute.
Map jobs = [:]
jobs.put('job-1', {
    stage('job-1') {
        node {
            build(job: "myorg/job-1/master", parameters: [new StringParameterValue('PARAM_NAME', 'VAL1')], propagate: false)
        }
    }
})
jobs.put('job-2', {
    stage('job-2') {
        node {
            build(job: "myorg/job-2/master", parameters: [new StringParameterValue('PARAM_NAME', 'VAL2')], propagate: false)
        }
    }
})
parallel(jobs)
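If the inputs come from a list, the map can also be built in a loop rather than spelling out each entry; a minimal sketch, assuming a single downstream job myorg/job-1/master and placeholder input values:
def inputs = ['VAL1', 'VAL2', 'VAL3']   // the different inputs to fan out over
Map jobs = [:]
for (input in inputs) {
    def value = input   // capture the loop variable for use inside the closure
    jobs.put("job-${value}", {
        stage("job-${value}") {
            node {
                build(job: "myorg/job-1/master",
                      parameters: [string(name: 'PARAM_NAME', value: value)],
                      propagate: false)
            }
        }
    })
}
parallel(jobs)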

Jenkins library step fails if not wrapped in script

I'm having a strange issue which I can't quite understand. I have written a custom step which accepts parameters and is used to clone GitHub/Bitbucket repositories more easily. The step works just fine: it calls the appropriate checkout() for branches and PRs, but for some reason this only works if you call it from a script { gitUtils.cloneRepo(...) } block. In a declarative pipeline it does not work without the script { } wrapper and fails with a super strange exception:
WorkflowScript: 25: Expected a symbol @ line 25, column 17.
    gitUtils().getCredentials(repo)
    ^
WorkflowScript: 26: Expected a symbol @ line 26, column 17.
    gitUtils().cloneRepo(url: repo)
    ^
WorkflowScript: 27: Expected a symbol @ line 27, column 17.
    gitUtils().getRevision()
    ^
WorkflowScript: 26: Invalid parameter "url", did you mean "message"? @ line 26, column 38.
    gitUtils().cloneRepo(url: repo)
    ^
WorkflowScript: 27: Missing required parameter: "message" @ line 27, column 17.
    gitUtils().getRevision()
Any ideas why this is happening?
import java.lang.IllegalArgumentException

def call() {
    return this
}

def cloneRepo(Map parameters = [url: null, branch: "master", credentials: null]) {
    def url = parameters.getOrDefault("url", null)
    def branch = parameters.getOrDefault("branch", "master")
    def credentials = parameters.getOrDefault("credentials", null)
    script {
        if (!url) {
            throw new IllegalArgumentException("cloneRepo() expects url argument to be present!")
        }
        if (credentials == null) {
            credentials = getCredentials(url)
        }
        if (branch.matches("\\d+") || branch.matches("PR-\\d+")) {
            if (branch.matches("PR-\\d+")) {
                branch = branch.substring(3)
            }
            checkout changelog: false, poll: false, scm: [
                $class: 'GitSCM',
                branches: [[name: 'pr/' + branch]],
                doGenerateSubmoduleConfigurations: false,
                extensions: [[$class: 'LocalBranch', localBranch: 'pr/' + branch]],
                submoduleCfg: [],
                userRemoteConfigs: [[
                    credentialsId: credentials,
                    refspec: 'refs/pull/' + branch + '/head:pr/' + branch,
                    url: url
                ]]
            ]
        } else {
            checkout changelog: false, poll: false, scm: [
                $class: 'GitSCM',
                branches: [[name: branch]],
                doGenerateSubmoduleConfigurations: false,
                extensions: [],
                submoduleCfg: [],
                userRemoteConfigs: [[
                    credentialsId: credentials,
                    url: url
                ]]
            ]
        }
    }
}
The script{} step takes a block of Scripted Pipeline (which includes functionality provided by the Groovy language) and executes it within a Declarative Pipeline.
Since gitUtils.cloneRepo(...) is a scripted-pipeline call, you need to use script{} so that it can be embedded in a Declarative Pipeline step.
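For illustration, a minimal sketch of the working form described in the question (the repository URL and stage name are placeholders):
pipeline {
    agent any
    stages {
        stage('Clone') {
            steps {
                script {
                    // the shared-library call must live inside a script {} block in a declarative pipeline
                    gitUtils.cloneRepo(url: 'git@github.com:my-org/my-repo.git', branch: 'master')
                }
            }
        }
    }
}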
