Can I use tokens in Jenkins job config? - jenkins

I'd like to use the Jenkins job name in the config:
Maven root POM should be: {JOB_NAME}\pom.xml
SCM sub path should be: {JOB_NAME}
Are there tokens I could use here?

For SCM the answer is 'yes' - use ${JOB_NAME} (e.g. svn://myserver/myrepo/trunk/${JOB_NAME}).
In the Maven build step it does not work. However, you may try using a custom workspace (click the 'Advanced' button under 'Build' -> check 'Use custom workspace') that contains ${JOB_NAME} (e.g. C:\workspaces\${JOB_NAME}) as a workaround.

Related

Jenkins Multibranch Pipeline: script / jenkinsfile as svn external

I have a multibranch pipeline in Jenkins. I want to include my script file (jenkinsfile) as an svn file external in my development branches, so the script is maintained centrally for all branches. Unfortunately, the scan of the multibranch pipeline isn't able to find the script file, as it only looks inside the declared branch and not in the included svn external locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
    scripts/
        jenkinsfile
    code/
        version1/
            branchX/
                ...
        version11/
            branchY/
                ...
SVN external property for branchX, branchY, etc.
Local path: jenkinsfile
URL: ^/scripts/jenkinsfile
Revision Peg: 12345
Multibranch job configuration:
Subversion
Project Repository Base: http://.../root/code/
Include branches: version1/branchX, version11/branchY
Build configuration
Mode: by Jenkinsfile
Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the subversion scm plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I've added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml)
But Jenkins is still not able to find the script. In fact, if I put my script directly in e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first to read the script file and the second because checkout is the first stage in the script itself).
Maybe my whole setup is wrong, or not the ideal way of doing this?
I would appreciate your help and tips. Thanks and greetings!
If you are working on a Linux or BSD (OSX) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available to all other branches (the file system keeps a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link would be:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks.

Jenkins Pipeline as Code with Docker Error

For one of my projects that I have on GitHub, I wanted to build it as a Docker image and push it to my Docker Hub. The project is an sbt one with a Scala codebase.
Here is how my Jenkinsfile is defined:
#!groovy
node {
    // set this in Jenkins server under Manage Jenkins > Credentials > System > Global Credentials
    docker.withRegistry('https://hub.docker.com/', 'joesan-docker-hub-credentials') {
        git credentialsId: '630bd271-01e7-48c3-bc5f-5df059c1abb8', url: 'https://github.com/joesan/monix-samples.git'
        sh "git rev-parse HEAD > .git/commit-id"
        def commit_id = readFile('.git/commit-id').trim()
        println comit_id
        stage "build" {
            def app = docker.build "Monix-Sample"
        }
        stage "publish" {
            app.push 'master'
            app.push "${commit_id}"
        }
    }
}
When I tried to run this from my Jenkins server, I got the following error:
java.io.FileNotFoundException
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:167)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:159)
at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:161)
at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:65)
at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:157)
at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:159)
at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:101)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:59)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:232)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Finished: FAILURE
Since this is running inside a VM on Azure, I thought the VM was not able to reach outside, but that seems not to be the case as I was able to ssh into the VM and git pull from the Git repo. So what is the problem here? How could I make this work?
For me, unchecking "Lightweight checkout" fixed the issue.
I experienced the exact same error. My setup:
Pipeline build inside a dockerized Jenkins (version 2.32.3)
In the configuration of the job, I specified a check out into a subdirectory: Open the configuration, e.g. https://myJenkins/job/my-job/configure. At the bottom, see section Pipeline -> Additional Behaviours -> Check out into a sub-directory with Local subdirectory for repo set to, e.g., my-sub-dir.
Expectation: Upon check out, the Jenkinsfile ends up in my-sub-dir/Jenkinsfile.
Via the option Script path, you configure the location of the Jenkinsfile so that Jenkins can start the build. I put my-sub-dir/Jenkinsfile as value.
I then received the exception you pasted in your question. I fixed it by setting Script Path to Jenkinsfile. If you don't specify a sub-directory for check out, then still try double checking values for Script Path.
Note: I have another Jenkins instance at work. There I have to specify Script Path including the custom check out sub-directory (as mentioned in Expectation above).
Go to Job --> Configure --> Pipeline and uncheck the "Lightweight checkout" checkbox.
Lightweight checkout: if selected, try to obtain the Pipeline script contents directly from the SCM without performing a full checkout. The advantage of this mode is its efficiency; however, you will not get any changelogs or polling based on the SCM. (If you use checkout scm during the build, this will populate the changelog and initialize polling.) Also, build parameters will not be substituted into the SCM configuration in this mode. Only selected SCM plugins support this mode.
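As the quoted help text notes, you can get the changelog and polling back even with lightweight checkout disabled by running checkout scm yourself. A minimal scripted-pipeline sketch:
node {
    // checking out the SCM configured on the job populates the changelog
    // and initializes polling when lightweight checkout is not used
    checkout scm
    // ... the rest of your stages
}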

Jenkins Pipeline: how to use the libraries built during previous stages?

We have decided to set up a Continuous Integration process for our multi-repository project. The idea is to automatically build for all target environments and run the regression tests.
Jenkins seems like a comprehensive FOSS solution for that purpose, and it promotes the use of its Pipeline plugin.
For the example, let's assume we have library A, which is a required dependency of library B. We created a freestyle project, build A, which successfully clones and compiles A.
From the documentation and the snippet generator, we started a pipeline whose first step is to run build A:
node {
    stage 'Build dependencies'
    build 'build A'
    //
    stage 'Build executable'
    git url: 'git@gitrepo:projectB', credentialsId: 'jenkins'
    sh 'cmake -DPATH_TO_A=XXX ./'
    // We do not know what to do then to use the built dependencies ?
    // In particular, XXX should be replaced by a path to the header and binaries
    // provided by A's build step.
}
We were unable to find out how to use this built dependency A in the build of project B.
You can try copying libA into libB so that you can access it.
For details, you can refer to this:
https://www.cloudbees.com/blog/copying-artifacts-between-builds-jenkins-workflow
Using folder notation
It should be pretty straightforward to reach your other job's workspace using simple folder notation: ../project-A-name from your current pipeline's workspace.
Your script would look like:
node {
    stage 'Build dependencies'
    def jobAName = "A"
    build "build ${jobAName}"

    stage 'Build executable'
    git url: 'git@gitrepo:projectB', credentialsId: 'jenkins'
    sh "cmake -DPATH_TO_A=../${jobAName}/yourartifact"
}
Note that I replaced the single quotes of your cmake step with double quotes to enable variable substitution.
You will also note that I defined a variable to record the name of job A; of course you could just use the name of your job directly in the build and sh steps, but I find repeating constants error-prone.
Using Copy Artifact Plugin
As Tim mentioned in his answer, you can also use the Copy Artifact Plugin to copy job A's artifact into your current project B pipeline. Your pipeline would look like this:
node {
    stage 'Build dependencies'
    def jobAName = "A"
    build "build ${jobAName}"

    stage 'Build executable'
    step([$class: 'CopyArtifact', projectName: "${jobAName}", filter: 'target/yourartifact', target: '.'])
    sh 'cmake -DPATH_TO_A=yourartifact'
}
Again, I've used a variable and double quotes for variable substitution.
In this step, the filter param selects the relative path of your artifact within project A's workspace, while the target param specifies where to copy the artifact within project B's workspace.
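Note that the Copy Artifact plugin can only copy what job A has archived. If job A were itself a pipeline, a minimal sketch of the archiving side could look like this (for the freestyle job described in the question, the equivalent is the "Archive the artifacts" post-build action); target/yourartifact is just the placeholder path used above:
node {
    stage 'Build A'
    // compile library A here...
    // archive the produced library so downstream jobs such as the B pipeline can copy it
    archiveArtifacts artifacts: 'target/yourartifact'
}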

Jenkins multibranch pipeline with Jenkinsfile from different repository

I have a Git repository with code I'd like to build but I'm not "allowed" to add a Jenkinsfile in its root (it is a Debian package so I can't add files to upstream source). Is there a way to store the Jenkinsfile in one repository and have it build code from another repository? Since my code repository has several branches to build (one for each Debian release) this should be a multibranch pipeline. Commits in either the code or Jenkinsfile repositories should trigger a build.
Bonus complexity: I have several code/packaging repositories like this and I'd like to reuse the same Jenkinsfile for all of them. Thus it should somehow dynamically fetch the right Git URL to use. The branches to build have the same names across all repositories.
The short answer is: you cannot do that with a multibranch pipeline. Multibranch pipelines are only designed (at least for now) to execute a specific pipeline in Pipeline script from SCM style, with a fixed Jenkinsfile at the root of the project.
You can however use the Multi-Branch Project plugin made for multibranch freestyle projects. First, you need to define your multibranch freestyle configuration just like you would with a multibranch pipeline configuration.
Select this new item type when creating the job.
This type of configuration will behave exactly the same as the multibranch pipeline type, i.e. it will create a folder with the name of your configuration and a sub-project for each branch it automatically detects.
The implementation should then be a piece of cake:
Specify your SCM repository in the multibranch configuration
Call another build as part of your build/post-build, as you would do in a standard freestyle project, except that you have to call a parameterized job (let's call it build-job) and give it your repository information, i.e. Git URL and current branch (you can use the pre-defined variables $GIT_URL and $GIT_BRANCH for this purpose)
In your build-job, define either an inline pipeline or a pipeline script checked out from SCM, and inside this script do an SCM checkout and go on with the steps you need to build. Example of build-job pipeline content:
node() {
    stage 'Checkout'
    checkout scm: [$class: 'GitSCM', branches: [[name: "*/${GIT_BRANCH}"]], userRemoteConfigs: [[url: "${GIT_URL}"]]]

    stage 'Build'
    // Build steps...
}
Of course, if your different multibranch projects need to be treated a bit differently, you could also use intermediate projects (say build-project-A, build-project-B, ...) that would in turn call the generic build-job pipeline.
The one major drawback of this solution is that you will only have one job responsible for all of your builds, making it harder to debug. You would still have your multibranch projects going blue/red in case of success/error, but you will have to go back to the called build-job to find the real cause of a failure.
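As a complement, here is a rough sketch of how the generic build-job could declare its GIT_URL and GIT_BRANCH string parameters in the script itself instead of in the job UI (parameter names and defaults here are assumptions to adjust to your setup):
properties([
    parameters([
        string(name: 'GIT_URL', defaultValue: '', description: 'Repository to check out'),
        string(name: 'GIT_BRANCH', defaultValue: 'master', description: 'Branch to build')
    ])
])

node() {
    stage 'Checkout'
    checkout scm: [$class: 'GitSCM', branches: [[name: "*/${params.GIT_BRANCH}"]], userRemoteConfigs: [[url: params.GIT_URL]]]
    stage 'Build'
    // Build steps...
}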
The best way I have found is to use the Remote Jenkinsfile Provider plugin. https://plugins.jenkins.io/remote-file/
This will add an option "by Remote Jenkinsfile Provider plugin" under Build Configuration > Mode; then you can point to another repo where the Jenkinsfile is. I find this to be a much better solution than the Pipeline Multibranch Defaults Plugin, which makes you store the Jenkinsfile in Jenkins itself rather than in source control.
You can make use of this plugin:
https://github.com/jenkinsci/pipeline-multibranch-defaults-plugin/blob/master/README.md
With it, you configure the Jenkinsfile in Jenkins itself rather than having it in each branch of your repo.
I have version 2.121, and you can do this in two ways:
Way 1
In the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and, in "Marker File" below it, put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition, select "Pipeline Script from SCM" and enter the "SCM" information for how to find the "Jenkinsfile" that holds the script you want to run. It can be in the same repo you are finding branches in to create the jobs (if you put in the same GitHub repo's info), but I can't find a way to indicate that you just use the same branch for the file.
Way 2
Same as above: in the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and, in "Marker File" below it, put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition select "Pipeline Script" and put a bit of Groovy in the text box to load whatever you want or to run some script that already got loaded into the workspace.
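For Way 2, the bit of Groovy could, for example, check the shared pipeline code out of its own repository and run it with the load step. A rough sketch with assumed names (shared-pipeline-scripts.git, pipeline.groovy, runBuild()); note the loaded script would need to end with return this for the call to work:
node {
    // check out the repository that holds the shared pipeline code (URL is an assumption)
    dir('pipeline-scripts') {
        git url: 'https://example.com/shared-pipeline-scripts.git', branch: 'master'
    }
    // evaluate the shared script and call its entry point
    def shared = load 'pipeline-scripts/pipeline.groovy'
    shared.runBuild()
}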
In my case, I have a scenario with a GitLab project based on Gradle that has dependencies on another GitLab project, also based on Gradle (same dashboard, but different commits, different developers).
I have added the following lines to my Jenkinsfile (the one with the dependency):
stage('Build') {
    steps {
        git branch: 'dev', credentialsId: 'jenkins-generated-ssh-key', url: 'git@gitlab.project.com:root/coreProject.git'
        sh './gradlew clean'
    }
}
Note: be aware of the order of the statements.
If you have doubts about how to create jenkins-generated-ssh-key, please ask me.

Changing Jenkins build number

Is there a way to change the build number that is sent via email after a job completes? The problem is that our product builds are NOT being done by Jenkins, so we want to be able to get the build number (i.e. from a text file) and update the build number in Jenkins to match it. I have tried to set the build number:
set BUILD_NUMBER=45
But the email is still showing the build number that Jenkins originally set.
If you have access to the script console (Manage Jenkins -> Script Console), then you can do this following:
Jenkins.instance.getItemByFullName("YourJobName").updateNextBuildNumber(45)
This can be done with the plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Next+Build+Number+Plugin
more info:
http://www.alexlea.me/2010/10/howto-set-hudson-next-build-number.html
If you don't like the plugin: if you want to change the build number via the nextBuildNumber file, you should "Reload Configuration from Disk" from the "Manage Jenkins" page.
Under the job folder, like:
C:\Program Files (x86)\Jenkins\jobs\job_name
there is a file named nextBuildNumber.
Setting the build number in the file and reloading the configuration from disk (Manage Jenkins menu) will force the next build you start to have the value from the file as BUILD_NUMBER.
If you have a branch name that includes a forward slash (when using git flow, for example), you will need to replace the forward slash with its URL-encoded form %2F within the branch name.
Here is an example for the pipeline My-Pipeline-Name and the branch release/my-release-branch-name
Jenkins.instance.getItemByFullName("My-Pipeline-Name/release%2Fmy-release-branch-name").updateNextBuildNumber(BUILD_NUMBER)
I was able to find out about this by running the following command, which lists the different jobs (branches) for your pipeline:
Jenkins.instance.getItem("My-Pipeline-Name").getAllJobs()
Hope it helps.
Perhaps a combination of these plugins may come in handy (a pipeline sketch follows the list):
Parameterized Build plugin - define a variable which holds your build number
Version Number plugin - use the variable to change the build number
Build Name Setter plugin - use the variable to change the build name
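In a pipeline job, a rough equivalent of that combination is to read the externally produced number and set it as the build's display name; build-number.txt is an assumed file name here:
node {
    // read the build number produced outside Jenkins (file name is an assumption)
    def externalNumber = readFile('build-number.txt').trim()
    // shown as the build's name in the UI and build history
    currentBuild.displayName = "#${externalNumber}"
}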
You can change build number by updating file ${JENKINS_HOME}/jobs/job_name/nextBuildNumber on Jenkins server.
You can also install the Next Build Number plugin to change the build number using the CLI or UI.
For multibranch pipeline projects, do this in the script console:
def project = Jenkins.instance.getItemByFullName("YourMultibranchPipelineProjectName")
project.getAllJobs().each { item ->
    if (item.name == 'jobName') { // master, develop, feature/......
        item.updateNextBuildNumber(#Number)
        item.saveNextBuildNumber()
        println('new build: ' + item.getNextBuildNumber())
    }
}
Follow the steps: Jenkins Console > Manage Jenkins > Script Console, then write the script as:
Jenkins.instance.getItemByFullName("Your_job_name").updateNextBuildNumber(45)
By using environment variables:
$BUILD_NUMBER=4
