How to include multiple pipeline scripts in a Jenkinsfile - Jenkins

I have a Jenkinsfile as below:
pipelineJob('My pipeline job') {
    displayName('display name')
    logRotator {
        numToKeep(10)
        daysToKeep(30)
        artifactDaysToKeep(7)
        artifactNumToKeep(1)
    }
    definition {
        cps {
            script(readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy'))
            script(readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy'))
        }
    }
}
With the above Jenkinsfile, the last script() call replaces the earlier ones, so only the last script file takes effect.
Basically, I have split the tasks into multiple Groovy files so that I don't repeat the same code in every Jenkinsfile and can reuse them for other jobs as well; for example, I can now use the clone_git_code.groovy script in dev builds as well as QA builds.

You have to use Shared Libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/). You can define multiple Groovy files with classes that return a processed object, or simply define call() methods in which you declare the steps; the execution will be sequential.
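For illustration, a minimal step in such a library's vars/ folder could look like this (the step name cloneGitCode is hypothetical, not from the original post):

// vars/cloneGitCode.groovy in the shared library repo (hypothetical name)
def call() {
    // reusable checkout step, callable from any Jenkinsfile that imports the library
    checkout scm
}

A Jenkinsfile that imports the library can then call cloneGitCode() inside any stage, in dev and QA jobs alike.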

I had this same issue when trying to include multiple scripts in a Jenkins job. After doing some research, I found the below solution to be the simplest:
definition {
    cps {
        script(
            ScriptsLibrary.pipelineTest('did it work?') +
            ScriptsLibrary.scmConf('repoURL_input', 'accessCredentials', 'activeBranch')
        )
    }
}
Add the "+" to concatenate the Strings. Got the job done for me :)
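Applied to the original question, the same trick works directly with readFileFromWorkspace, since it returns the file content as a String (a sketch, assuming the two files concatenate into one valid pipeline script):

definition {
    cps {
        script(
            readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy') +
            readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy')
        )
    }
}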

Related

Jenkins Job DSL plugin issue where none of the internal jobs have access to the external jobs

I am using the Jenkins jobDsl step as follows:
#!groovy
node('master') {
    stage('Prepare') {
        deleteDir()
        checkout scm
    }
    stage('Provision Jobs') {
        jobDsl(targets: ['jenkins/commons.groovy', 'folderA/jenkins/jobA.groovy'].join('\n'),
               removedJobAction: 'DELETE',
               removedViewAction: 'DELETE',
               sandbox: false)
    }
}
I want jobA.groovy to use a function that is defined in commons.groovy.
Currently, jobA.groovy doesn't have access to the function defined in commons.groovy; how can I allow this behavior?
Attached:
jobA.groovy:
test_job("param1", "param2")
commons.groovy:
def test_job(String team, String submodule) {
    pipelineJob("${team}/${submodule}/test_job") {
        displayName("Test Job")
        description("This is a Continuous Integration job for testing")
        properties {
            githubProjectUrl("githubUrl")
        }
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('githubUrl')
                            credentials('credentials')
                            refspec('+refs/pull/*:refs/remotes/origin/pr/*')
                        }
                        branch('${sha1}')
                        scriptPath("scriptPath")
                    }
                }
            }
        }
    }
}
The idea is to be able to call test_job("param1", "param2") from jobA.groovy without issues, but I am currently getting:
ERROR: (jobA.groovy, line 9) No signature of method: test_job() is applicable for argument types: (java.lang.String, java.lang.String)
Job DSL creates the jobs, and each target script is evaluated independently, so a function defined in one script is not visible in another. The function must instead be imported through a shared library.
Create a shared lib
Here is a sample: https://github.com/sap-archive/jenkins-pipelayer
The most important piece there is the vars/ folder, which defines the functions you can call from your pipelines. Host the lib in its own repo or in an orphan branch.
Import a shared lib
To import the library in Jenkins, go to the Manage page, then Configure System. Under the Global Pipeline Libraries section, add a new library with a name of your choice, e.g. name-of-your-lib, default version master, and a Modern SCM git source pointing to https://urlofyoursharedlib.git.
Run the jobDSL job a first time, then go to the In-process Script Approval page and approve everything.
Use a shared lib
To import the library inside your job, you must include at the top of the file, after the shebang, the statement @Library('name-of-your-lib') _
There is also a similar statement, library 'name-of-your-lib'. This one is useful to debug and fix a shared library, because when you hit the replay button you'll see the shared library files used in the pipeline.
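For instance, the top of a pipeline script using the library would look like this (name-of-your-lib and the vars/test_job.groovy step are placeholders for your own names):

#!groovy
@Library('name-of-your-lib') _

// test_job is resolved from vars/test_job.groovy in the shared library
test_job('param1', 'param2')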
Finally, if all you are trying to do is create job templates, I would recommend looking at what the shared library linked above is doing; it helps with creating declarative templates and works around issues and limitations you will encounter with Job DSL and shared pipelines.

Multiple Jenkins pipelines for a single repo

At the moment I have two MultiJob Projects for a single repo:
First runs on develop branch
Second runs on all opened Pull Requests
Each has a lot of nested Freestyle jobs, and they are quite different.
I'm looking at switching to Pipeline-as-Code by using a Jenkinsfile. So my question is: is there a way to switch the Jenkinsfile path/name based on, say, the branch name? I tried to use the MultiBranch Pipeline job type, but it only allows a single Jenkinsfile path and uses it across all branches, including pull requests.
Maybe there is a better way to achieve that? I'm open to discussion. Thank you
You can do it in one Jenkinsfile by using when expressions, assuming your pipeline is not too big:
pipeline {
    agent any
    stages {
        stage("Set variables from external input") {
            when {
                branch "develop"
            }
            steps {
                // add the steps you want to execute when the branch is develop
                echo "develop branch"
            }
        }
        stage("2 for Pull request") {
            when {
                // match any branch that is neither master nor develop
                expression { return !(env.GIT_BRANCH ==~ /.*(master|develop).*/) }
            }
            steps {
                // add the steps you want to execute when the branch is a pull request
                echo "pull request"
            }
        }
    }
}
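As a side note, when the job is a Multibranch Pipeline, the built-in changeRequest condition is a more direct way to detect pull request builds than matching branch names; a minimal sketch:

stage("2 for Pull request") {
    when {
        changeRequest()
    }
    steps {
        echo "runs only for pull request builds"
    }
}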

Integrate one jenkinsfile with declarative pipeline into another

Tell me please: can I integrate one Jenkinsfile with a declarative pipeline into another?
The idea is to run several processes in parallel that are split across different Jenkinsfiles.
stage('run-parallel-branches') {
    steps {
        parallel(
            a: {
                echo "call Jenkinsfile 1"
            },
            b: {
                echo "call Jenkinsfile 2"
            }
        )
    }
}
Thanks for the help.
You have two solutions:
If your goal is to avoid repeating code, you can use Shared Libraries: extract the work done in Jenkinsfile1 and Jenkinsfile2 into a library, then call the library instead.
Alternatively, your two Jenkinsfiles can have their own dedicated jobs, and you call those jobs from the parallel stages with build 'myJob1' and build 'myJob2' (the build step waits for the downstream job and propagates errors).
The problem with this approach, however, is that the invoked job will not necessarily build the same SCM commit as the parent one (if commits land in the meantime).
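Concretely, the second option would look like this in the parent pipeline (myJob1 and myJob2 stand for the jobs backed by the two Jenkinsfiles):

stage('run-parallel-jobs') {
    steps {
        parallel(
            a: {
                // triggers the job built from Jenkinsfile 1 and waits for it
                build 'myJob1'
            },
            b: {
                build 'myJob2'
            }
        )
    }
}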

Jenkins DSL script - Test Failure - Found multiple extensions which provide method lastCompleted

Trying to create multijobs in Jenkins with DSL scripting.
There are multiple jobs in a phase, and I want to create a consolidated report for the multijob from the downstream jobs.
I am using Copy Artifact to copy the results of the downstream jobs into the multijob's target dir, using the selector lastCompleted().
However, I am getting an error saying that multiple extensions provide the method, and the tests fail: lastCompleted() is apparently present in both the copyartifact and multijob plugins, and in this case I require both.
Here is my script:
multiJob('dailyMultiJob') {
    concurrentBuild(true)
    logRotator(-1, 10, -1, 10)
    triggers {
        cron('H H(0-4) * * 0-6')
    }
    steps {
        phase('Smoke Tests') {
            phaseJob('JobA')
            phaseJob('JobB')
            phaseJob('JobC')
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobA')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobB')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobC')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
    }
    publishers {
        allure {
            results {
                resultsConfig {
                    path('target/allure-results')
                }
            }
        }
        archiveArtifacts {
            pattern('target/reports/**/*.*')
            pattern('target/allure-results/**/*.*')
            allowEmpty(true)
        }
    }
}
I get the error below after running the Gradle tests:
Caused by: javaposse.jobdsl.dsl.DslException: Found multiple extensions which provide method lastCompleted with arguments []: [[hudson.plugins.copyartifact.LastCompletedBuildSelector, com.tikal.jenkins.plugins.multijob.MultiJobBuildSelector]]
I am not sure if there is a way to indicate which plugin's method should be used.
Been stuck on this for quite some time. Any help is highly appreciated. Thank you in advance!
I had come across the same issue a few months back.
There are two possible solutions to this issue:
1 - Keep only one of the plugins, which avoids the conflict (not recommended, as it might break other jobs).
2 - Use a configure block to modify the generated XML, which avoids the conflict and lets you keep multiple plugins that provide the same extension (recommended solution; see the sketch below).
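A rough sketch of the configure-block approach, assuming the generated config.xml uses hudson.plugins.copyartifact.LastCompletedBuildSelector as the selector class (verify against your own job's XML):

multiJob('dailyMultiJob') {
    steps {
        copyArtifacts {
            // leave the ambiguous lastCompleted() call out of the DSL
            projectName('JobA')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
        }
    }
    // write the selector class attribute into the job XML directly
    configure { project ->
        (project / 'builders' / 'hudson.plugins.copyartifact.CopyArtifact' / 'selector').@class =
            'hudson.plugins.copyartifact.LastCompletedBuildSelector'
    }
}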
Late update:
What I ended up doing was switching to scripted pipeline jobs instead.
Configure blocks are not allowed on all the methods you might want to use, and they are limited by design; I believe some plugins also don't allow them for security reasons.
Better to use Pipelines.
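For reference, in a pipeline job the selector is passed explicitly to the copyArtifacts step, so the name clash does not arise; a minimal scripted sketch (assuming the copyartifact plugin is installed, JobA archives its results, and your plugin version provides the lastCompleted() selector symbol):

node {
    stage('Copy results') {
        // the selector is an explicit argument here, so there is no ambiguity
        copyArtifacts(projectName: 'JobA',
                      selector: lastCompleted(),
                      filter: 'target/allure-results/*.*')
    }
}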

Is it possible to include calls in Declarative Pipelines in the shared library (Jenkins)?

As you know, with shared libraries in Jenkins it is possible to call from the Jenkinsfile the content of a file in the vars folder.
For example, the shared library's vars folder can have a file named build.groovy, and in the Jenkinsfile we can call it with:
build {
    parameter1 = "some param1"
    parameter2 = "some param2"
}
As described in this section.
I have no problem creating Groovy files and calling them via the call() method in a Jenkinsfile.
But I want to customize a pipeline and make it as generic as possible, so I want to call a Groovy file contained in the vars folder from within the same pipeline: calling genericStage.groovy from another file in the same vars folder of the shared library.
So what I have is a Groovy file in the vars folder, genericStage.groovy, and a pipeline that looks like this:
pipeline {
    agent { label myNode }
    stages {
        stage("init") {
            //steps
        }
        genericStage {
            parameter1 = "some param"
        }
    }
}
And in genericStage.groovy:
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    stage(config.parameter1) {
        steps {
            //steps
        }
    }
}
But I get the error:
Expected a stage @ line 125, column 6.
   genericStage{
   ^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
So how can I make such calls within the same shared library, as we do from the Jenkinsfile?
In a shared library method defining a DSL-style step like this, you can either define a block of steps and script that does some common processing you need to share between jobs (or between stages in a job), or define a whole complete pipeline. So your genericStage.groovy can only contain what comes after the "//steps" comment; it cannot contain the stage and steps definitions as written. I use this style of custom library step quite a lot, but without trying to define the stage/steps inside it, and it works fine. What is happening here is that the pipeline validator/parser rejects the main pipeline syntax before it even gets to your custom step, because the call is not wrapped in stage and steps definitions.
If you read the documentation link you have at the end, there is a section on defining a complete declarative pipeline in a shared library. It says at the end: "Only entire pipelines can be defined in shared libraries as of this time. This can only be done in vars/*.groovy, and only in a call method. Only one Declarative Pipeline can be executed in a single build, and if you attempt to execute a second one, your build will fail as a result." This suggests that a partial pipeline like yours won't work; you should have just steps/script, or a complete pipeline.
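A minimal sketch of the working pattern, with the stage wrapper kept in the Jenkinsfile and only plain steps in the library (the echo body is a placeholder for your shared logic):

// vars/genericStage.groovy -- contains only steps, no stage/steps wrapper
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    // shared processing goes here, as plain steps
    echo "running generic logic for ${config.parameter1}"
}

And in the Jenkinsfile the stage stays in the pipeline itself:

stage("generic") {
    steps {
        genericStage {
            parameter1 = "some param"
        }
    }
}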
