I'm trying to create a Jenkins pipeline job that uses different Jenkins build files depending on a job parameter. Is there a way to load the Jenkins build file during the execution of the job, something like:
node {
    stage("Determine build parameter") {
        String jenkinsFile = .....
    }
    // here the Jenkins build file should be loaded
    loadSomeHowBuildFile jenkinsFile
    // ... and then the pipeline steps defined in the jenkinsFile are executed
}
It would be really great if this worked...
I found the solution, which is quite simple: Jenkins only needs to load the file.
node {
    // declare the variable outside the stage so it is still visible to the load step below
    def jenkinsFile
    stage("Determine build file") {
        jenkinsFile = '/path/to/Jenkins/build/file'
    }
    // Here the Jenkins build file is loaded and executed
    load jenkinsFile
}
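For reference, the loaded file contains ordinary scripted-pipeline code, which runs as soon as it is loaded; if it also defines functions you want to call from the outer script, it has to end with return this. A minimal sketch (file contents and stage name are illustrative):

```groovy
// contents of the loaded build file — this stage runs immediately on load
stage("Build") {
    echo "Running the build steps from the loaded file"
}

// Only needed if the outer pipeline calls functions defined here,
// e.g. def helper = load jenkinsFile; helper.deploy()
def deploy() {
    echo "Deploying"
}
return this
```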
Our project uses a Jenkins shared library which has generic pipeline stages. We are looking at adding a stage that will inspect the code coverage and fail the pipeline if the coverage targets aren't met. The Cobertura plugin available for Jenkins is capable of doing so, but I am facing challenges implementing it. Is there a way to add a custom pipeline stage in our Jenkinsfile that will run after the shared-library code, all as part of the same pipeline? Is it possible to import two shared libraries and use both together in the same pipeline? I am relatively new to this and any help is greatly appreciated. Thanks!
To answer your questions:

Is there a way to add a custom pipeline stage in our Jenkinsfile that will run after the shared-library code, all as part of the same pipeline?

Is it possible to import two shared libraries and use both together in the same pipeline?

Yes to both questions: you can add any number of custom stages to your pipeline, and they can run after the shared-library code. Example Jenkinsfile and new shared-library file below:
// oldstagelibrary will hold the object returned by old_stagelibraries
// and is assigned in stage('Old stage')
def oldstagelibrary
// newstagelibrary will hold the object returned by your new shared-library file
def newstagelibrary

stage('Old stage') {
    steps {
        script {
            // Load the shared-library Groovy file old_stagelibraries.
            // Give the path of the old_stagelibraries file you created.
            oldstagelibrary = load 'C:\\Jenkins\\old_stagelibraries'
            // Execute the function available in the old_stagelibraries.groovy file.
            oldstagelibrary.MyOld_library()
        }
    }
}

// Add your new stage in the Jenkinsfile and use the new_stagelibraries file you created
stage('New stage') {
    steps {
        script {
            // Load the shared-library Groovy file new_stagelibraries, which contains your new functions.
            newstagelibrary = load 'C:\\Jenkins\\new_stagelibraries'
            // Execute the function MyNew_library available in the new_stagelibraries.groovy file.
            newstagelibrary.MyNew_library()
        }
    }
}
Create a file named new_stagelibraries.groovy:
#!groovy
// Write functions (definitions of stages) which will be called from your Jenkinsfile
def MyNew_library() {
    echo "Function execution of MyNew_library"
    // You can add your functionality here
}
return this
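As for importing two shared libraries, if both are configured in Jenkins' global settings (Manage Jenkins → Configure System → Global Pipeline Libraries), you can list them in a single @Library annotation. A sketch, where the library names and the steps called from them are illustrative:

```groovy
// Both names must match shared libraries configured in Jenkins' global settings
@Library(['my-first-shared-lib', 'my-second-shared-lib']) _

pipeline {
    agent any
    stages {
        stage('Use both libraries') {
            steps {
                // Call global vars/steps defined in either library
                // (stepFromFirstLib and stepFromSecondLib are hypothetical names)
                stepFromFirstLib()
                stepFromSecondLib()
            }
        }
    }
}
```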
I have a main pipeline in Jenkins that will check out the code, compile, test, and build, then push an image to Docker. This is the high level of the CI pipeline I have; say the job name is "MainJobA".
I need to create a new job just for Javadoc generation. For that I have created a new script in the Git repo and configured it in a pipeline job.
Now I need to execute this sub-job of Javadoc generation and publish the HTML reports from the workspace of "MainJobA". I need to run SubJobA's pipeline stages from
/home/jenkins/workspace/MainJobA
How can I achieve this?
There is a build step in Jenkins declarative pipelines.
Use it like:
pipeline {
    agent any
    stages {
        stage ("build") {
            steps {
                build 'Pipeline_B'
            }
        }
    }
}
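The build step also accepts parameters, which is one way to tell the sub-job which workspace to read from. A sketch, assuming SubJobA declares a string parameter named WORKSPACE_DIR (both names are illustrative):

```groovy
stage("javadoc") {
    steps {
        // Pass the parent job's workspace path to the downstream job;
        // wait: true (the default) blocks until the downstream build finishes
        build job: 'SubJobA',
              parameters: [string(name: 'WORKSPACE_DIR', value: "${env.WORKSPACE}")]
    }
}
```

Note that a workspace path is local to the node that created it, so this only works reliably if both jobs run on the same agent.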
Currently we are developing a centralized control system for our CI/CD projects. There are many projects with many branches, so we are using multibranch pipelines (this forces us to use the Jenkinsfile from the project branches, so we can't provide a custom Jenkinsfile like in pipeline projects). We want to control everything under one Git repo, where for every project there are Kubernetes YAMLs, a Dockerfile, and a Jenkinsfile. When a developer presses the build button, the Jenkinsfile from their project repo is supposed to run our Jenkinsfile. Is it possible to do this?
E.g. :
pipeline {
    agent any
    stages {
        stage('Retrieve Jenkinsfile From Repo') { // RETRIEVE JENKINSFILE FROM REPO
            steps {
                git branch: "master",
                    credentialsId: 'gitlab_credentials',
                    url: "jenkinsfile_repo"
                script {
                    // RUN JENKINSFILE FROM THE REPO
                }
            }
        }
    }
}
The main reason we are doing this is that there is sensitive content in the Jenkinsfile, like production database connections. We don't want to store the Jenkinsfile in the developers' repos. You can also suggest another correct way to achieve this besides using only one repo.
EDIT: https://plugins.jenkins.io/remote-file/
This plugin solved all my problems. I couldn't try the answers below.
As an option you can use the pipeline build step.
pipeline {
    agent any
    stages {
        stage ('build another job') {
            steps {
                build 'second_job_name_here'
            }
        }
    }
}
Try the load step:
script {
    // rename Jenkinsfile to .groovy so it can be loaded
    sh 'mv Jenkinsfile Jenkinsfile.groovy'
    // RUN JENKINSFILE FROM THE REPO
    load 'Jenkinsfile.groovy'
}
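Putting the two pieces together, the stage from the question could look like the following sketch (the branch, credentials ID, and repo URL are the placeholders from the question):

```groovy
pipeline {
    agent any
    stages {
        stage('Retrieve and run central Jenkinsfile') {
            steps {
                // Check out the central repo containing the shared Jenkinsfile
                git branch: 'master',
                    credentialsId: 'gitlab_credentials',
                    url: 'jenkinsfile_repo'
                script {
                    // load() expects a .groovy file, so rename first
                    sh 'mv Jenkinsfile Jenkinsfile.groovy'
                    load 'Jenkinsfile.groovy'
                }
            }
        }
    }
}
```

Keep in mind that the loaded file must contain scripted-pipeline code (node/stage blocks), not a second declarative pipeline {} block.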
According to the documentation at https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.helpers.wrapper.MavenWrapperContext.buildName
the following code should update the build name in the Build History of Jenkins jobs:
// define the build name based on the build number and an environment variable
job('example') {
    wrappers {
        buildName('#${BUILD_NUMBER} on ${ENV,var="BRANCH"}')
    }
}
Unfortunately, it does not.
Is there any way to change the build name from a Jenkins Job DSL script?
I know I can change it from a Jenkins pipeline script, but that is not what I need in this particular job. All I use in the job is steps:
steps {
    shell("docker cp ...")
    shell("git clone ...")
    ...
}
I would like to emphasise I am looking for a native Jenkins Job DSL solution and not a Jenkins Pipeline Script one or any other hacky way like manipulation of environment variables.
I managed to solve my issue today.
The script did not work because it requires the build-name-setter plugin to be installed in Jenkins. After I installed it, it works perfectly.
Unfortunately, by default the Job DSL processor does not report missing plugins. The parameter enabling that is described here: https://issues.jenkins-ci.org/browse/JENKINS-37417
Here's a minimal pipeline changing the build's display name and description. IMHO this is pretty straightforward.
pipeline {
    agent any
    environment {
        VERSION = "1.2.3-SNAPSHOT"
    }
    stages {
        stage("set build name") {
            steps {
                script {
                    currentBuild.displayName = "v${env.VERSION}"
                    currentBuild.description = "#${BUILD_NUMBER} (v${env.VERSION})"
                }
            }
        }
    }
}
In Jenkins' UI, the build then appears with the custom display name and description.
setBuildName("your_build_name") in a groovyPostBuild step may do the trick as well.
It needs the Groovy Postbuild plugin.
I want to generate my pipeline-plugin-based jobs via Job DSL, which is contained in a Git repository that is checked out by Jenkins.
However, I think it is not very nice to have the pipeline scripts as quoted strings inside the Job DSL script. So I want to read them into a string and pass that to the script() function:
definition {
    cps {
        sandbox()
        script( new File('Pipeline.groovy').text )
    }
}
Where do I have to put Pipeline.groovy for this to work? I tried putting it right next to my DSL script, and also in the resources/ folder of my DSL sources, but Jenkins always throws a "file not found" error.
Have you tried readFileFromWorkspace()? It should be able to find the files you check out from Git.
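A sketch of how that could look in the seed job's DSL script (the job name is illustrative; the path is relative to the seed job's workspace, which is why the plain new File(...) lookup fails):

```groovy
// readFileFromWorkspace is a Job DSL method available in seed-job scripts;
// it resolves the path against the seed job's workspace
def pipelineScript = readFileFromWorkspace('Pipeline.groovy')

pipelineJob('generated-job') {
    definition {
        cps {
            sandbox()
            script(pipelineScript)
        }
    }
}
```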
See the Job DSL pipelineJob reference (https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob), and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The Job DSL below creates a pipeline job, which pulls the actual job definition from a Jenkinsfile:
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it
                // only covers the 'master' branch and may not give you
                // enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
        }
    }
}
You/your Jenkins admins may want to separate Jenkins job creation from the actual job definition. That seems sensible to me: it's not a bad idea to centralize the scripts that create the Jenkins jobs in a single Jenkins DSL repo.