Can we use a single Jenkinsfile for a multibranch pipeline in Jenkins using shared libraries?

I am trying to write a Jenkinsfile for a multibranch pipeline that takes its data from shared libraries in Jenkins, something like below:
@Library('Template') _
if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}
so that the pipeline picks up the shared library depending on the if condition: if the branch is master the first call should run, otherwise the second one should be built.

Yes that’s possible. Actually we’re using a multibranch project to test our changes to our shared library that way.
You have to use the library step to load the library instead of the @Library annotation, like:
if (condition) {
    library("someLib@${env.BRANCH_NAME}")
} else {
    library('someOtherLib')
}
(Note the double quotes in the first call, so that ${env.BRANCH_NAME} is actually interpolated.)
See https://jenkins.io/doc/pipeline/steps/workflow-cps-global-lib/#library-load-a-shared-library-on-the-fly for all details.
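Applied to the question's Jenkinsfile, a minimal sketch could look like this (assuming jenkins1 and jenkins2 are global variables defined under vars/ in the Template library):
// load the shared library at runtime instead of via the @Library annotation
library 'Template'
if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}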
By the way: In case you’re planning to do Pull Requests the following Post might be useful to you as well: https://stackoverflow.com/a/51915362/4279361

Related

jenkins job dsl plugin issue where none of the internal jobs have access to the external jobs

I am using jenkins jobDsl as follows:
#!groovy
node('master') {
    stage('Prepare') {
        deleteDir()
        checkout scm
    }
    stage('Provision Jobs') {
        jobDsl(targets: ['jenkins/commons.groovy', 'folderA/jenkins/jobA.groovy'].join('\n'),
               removedJobAction: 'DELETE',
               removedViewAction: 'DELETE',
               sandbox: false)
    }
}
I want to use, from jobA.groovy, a function that is defined in commons.groovy.
Currently jobA.groovy doesn't have access to the function defined in commons.groovy. How can I allow this behavior?
Attached:
jobA.groovy:
test_job("param1", "param2")
commons.groovy:
def test_job(String team, String submodule) {
    pipelineJob("${team}/${submodule}/test_job") {
        displayName("Test Job")
        description("This is a Continuous Integration job for testing")
        properties {
            githubProjectUrl("githubUrl")
        }
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('githubUrl')
                            credentials('credentials')
                            refspec('+refs/pull/*:refs/remotes/origin/pr/*')
                        }
                        branch('${sha1}')
                        scriptPath("scriptPath")
                    }
                }
            }
        }
    }
}
The idea would be to be able to call this method test_job("param1", "param2") from jobA.groovy with no issues, but I am currently getting:
ERROR: (jobA.groovy, line 9) No signature of method: test_job() is applicable for argument types: (java.lang.String, java.lang.String)
JobDSL creates the jobs. Then at runtime you want your job to call your function. The function must be imported through a shared library.
Create a shared lib
here is a sample: https://github.com/sap-archive/jenkins-pipelayer
The most important piece there is that you need to create a vars/ folder that defines the functions you can call from your pipelines. Host the lib in its own repo or an orphan branch.
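As a minimal sketch of that layout (the step name testJob is illustrative, not part of the linked library):
// vars/testJob.groovy in the shared library repository
// becomes callable from any pipeline simply as: testJob('team', 'submodule')
def call(String team, String submodule) {
    echo "Running shared test job logic for ${team}/${submodule}"
    // ... shared logic goes here ...
}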
Import a shared lib
To import a library in Jenkins, go from the Manage page to Configure System. Under the section Global Pipeline Libraries, add a new library with a name of your choice, e.g. name-of-your-lib, default version master, modern scm git https://urlofyoursharedlib.git
Run the jobDSL job a first time, then go to the In Process Script Approval page and approve everything.
Use a shared lib
To import the library inside your job you must include, at the top of the file after the shebang, the statement @Library('name-of-your-lib') _
There is also a similar statement, library 'name-of-your-lib'. This one is useful to debug and fix a shared library, because when you hit the replay button you'll see the shared library files used in the pipeline.
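Putting the pieces together, the top of a pipeline using the library might look like this (library and step names are the placeholders from above):
#!groovy
@Library('name-of-your-lib') _

node('master') {
    // steps here can now call shared-library steps from vars/, e.g.:
    testJob('myteam', 'mymodule')
}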
Finally, if all you are trying to do is create job templates, I would recommend looking at what the shared library I linked is doing; it helps with creating declarative templates and solves issues and limitations you will encounter with jobDsl and shared pipelines.

Multiple Jenkins pipelines for a single repo

At the moment I have two MultiJob Projects for a single repo:
The first runs on the develop branch
The second runs on all opened Pull Requests
Each has a lot of nested Freestyle jobs, and they are quite different.
I'm looking at switching to Pipeline-as-Code by using a Jenkinsfile. So my question is: is there a way to switch the Jenkinsfile path/name based on, say, the branch name? I tried the MultiBranch Pipeline job type, but it only allows setting a single Jenkinsfile path and uses it across every branch, including pull requests.
Maybe there is a better way to achieve that? I'm open to discussion. Thank you.
You can do it in one Jenkinsfile by using a when expression, assuming your pipeline is not too big:
pipeline {
    agent any
    stages {
        stage("Set variables from external input") {
            when {
                branch "develop"
            }
            steps {
                // add the steps you want to execute when the branch is develop
                echo "develop branch"
            }
        }
        stage("2 for Pull request") {
            when {
                // note: contains('master|develop') would look for that literal
                // substring, so use a regex match instead
                expression { return !(env.GIT_BRANCH ==~ /.*(master|develop).*/) }
            }
            steps {
                // add the steps you want to execute for a pull request
                echo "pull request"
            }
        }
    }
}
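On a multibranch project, declarative pipeline also has a built-in when condition for pull requests, which avoids matching branch names altogether:
stage("2 for Pull request") {
    when {
        changeRequest() // true only when the build is for a pull request
    }
    steps {
        echo "pull request checks"
    }
}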

jenkins declarative pipeline ignoring changelog of jenkinsfiles

I have apps with their code in Git repositories, and the Jenkinsfiles for building the apps live in another repository. The problem is the build changelog: Jenkins adds the Jenkinsfile changes to the build changesets, and I don't want that, because those changes are about infrastructure and not relevant to the apps. How can I prevent this? I didn't find any workaround or solution.
If I understood your question correctly... I don't think you can remove the Jenkinsfile changes from the change set that Jenkins reads from Git, but instead you can make your pipeline skip the build if the only changes are to the Jenkinsfile.
If it helps...
First place, you need to read the change set:
def changedFiles = []
for (changeLogSet in currentBuild.changeSets) {
    for (entry in changeLogSet.getItems()) {
        for (file in entry.getAffectedFiles()) {
            changedFiles.add(file.getPath())
        }
    }
}
Then, you can check if it is only Jenkinsfile:
if (changedFiles.size() == 1 && changedFiles[0] == "Jenkinsfile") {
    println "Only Jenkinsfile has changed... Skipping build..."
    // stops the build prematurely while marking it (and the remaining stages) green
    currentBuild.getRawBuild().getExecutor().interrupt(hudson.model.Result.SUCCESS)
    sleep(1) // give the interrupt a moment to take effect before the next stage starts
} else {
    println "There are changes other than the Jenkinsfile, the build will proceed."
}
P.S. There are several ways to terminate a job early without failing the build, but this one is the cleanest in my experience, even though you need an admin to approve some method signatures in Jenkins' In-process Script Approval (I've seen it in another thread here some time ago, can't find it now though).

How to configure basic branch build strategies plugin using job dsl?

The multibranch pipeline plugin, awesome as it is, doesn't build tags out of the box. The basic-branch-build-strategies plugin is required to enable tag discovery and building.
My question is directly related to: Is there a way to automatically build tags using the Multibranch Pipeline Jenkins plugin?
This plugin works great in the UI but doesn't appear to be easily configurable using the Jenkins job dsl. Does anyone have any examples of how to set the build strategies using the dsl (or a dsl configure block) so that tags will be discovered and built?
Having examined the delta between the config.xml files when the settings are changed via the UI, it looks like I need to be able to add this trait:
<org.jenkinsci.plugins.github__branch__source.TagDiscoveryTrait />
and this section under build strategies:
<buildStrategies>
    <jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl plugin="basic-branch-build-strategies@1.1.1">
        <atLeastMillis>-1</atLeastMillis>
        <atMostMillis>172800000</atMostMillis>
    </jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl>
</buildStrategies>
Something like
multibranchPipelineJob('pipline') {
    ...
    branchSources {
        branchSource {
            source {
                github {
                    ...
                    traits {
                        ...
                        gitTagDiscovery()
                    }
                }
                buildStrategies {
                    buildTags {
                        atLeastDays '-1'
                        atMostDays '20'
                    }
                }
            }
        }
    }
}
is what I've been working with. It's not documented in the plugin, but that doesn't stop the job-dsl plugin from dynamically generating the API calls for it.
You can see what the API for your specific Jenkins installation is by going to {your_jenkins_url}/plugin/job-dsl/api-viewer/index.html.
Sometimes things won't appear there because a plugin lacks support for job-dsl.
In that case you can still generate the XML with the Configure Block.
However, this is pretty clumsy to use.
Edit: At least if I use gitHubTagDiscovery() as suggested by the dynamically generated API, Jenkins will crash. Instead, the configure block has to be used to get all the discovery methods for github.
configure {
    def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
    traits << 'org.jenkinsci.plugins.github__branch__source.BranchDiscoveryTrait' {
        strategyId(1)
    }
    traits << 'org.jenkinsci.plugins.github__branch__source.OriginPullRequestDiscoveryTrait' {
        strategyId(1)
    }
    traits << 'org.jenkinsci.plugins.github__branch__source.TagDiscoveryTrait'()
}
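The buildStrategies section from the question's config.xml can be grafted in with the same configure-block approach. A sketch along those lines, assuming the XML structure shown in the question (untested):
configure {
    def strategies = it / sources / data / 'jenkins.branch.BranchSource' / buildStrategies
    strategies << 'jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl'(plugin: 'basic-branch-build-strategies@1.1.1') {
        atLeastMillis(-1)
        atMostMillis(172800000) // two days, as in the question's XML
    }
}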

How to include multiple pipeline scripts into jenkinsfile

I have a Jenkinsfile as below:
pipelineJob('My pipeline job') {
    displayName('display name')
    logRotator {
        numToKeep(10)
        daysToKeep(30)
        artifactDaysToKeep(7)
        artifactNumToKeep(1)
    }
    definition {
        cps {
            script(readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy'))
            script(readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy'))
        }
    }
}
With the above Jenkinsfile, the last script call replaces the other scripts.
Basically I have split the tasks into multiple Groovy files so that I won't repeat the same code in every Jenkinsfile and can reuse them for other jobs as well; for example, I can now use the clone_git_code.groovy script in dev builds as well as QA builds.
You have to use shared libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/). You can define multiple Groovy files with classes that return a processed object, or simply create methods where each defines a step, and the execution will be sequential.
I had this same issue when trying to include multiple scripts into a Jenkins job. After doing some research, I found the below solution to be the simplest:
definition {
    cps {
        script(
            ScriptsLibrary.pipelineTest('did it work?') +
            ScriptsLibrary.scmConf('repoURL_input', 'accessCredentials', 'activeBranch')
        )
    }
}
Add the "+" to concatenate the Strings. Got the job done for me :)
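Applied to the original job definition, the same concatenation trick would look like this (a sketch reusing the asker's file paths; the '\n' guards against the first file not ending in a newline):
definition {
    cps {
        // concatenate both workspace files into a single pipeline script
        script(
            readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy') +
            '\n' +
            readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy')
        )
    }
}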
