How to configure a Jenkins 2 Pipeline so that Jenkinsfile uses a predefined variable

I have several projects that use a Jenkinsfile which is practically the same. The only difference is the git project that it has to checkout. This forces me to have one Jenkinsfile per project although they could share the same one:
node {
    def mvnHome = tool 'M3'
    def artifactId
    def pomVersion
    stage('Commit Stage') {
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: 'https://bitbucket.org/xxx/yyy.git'
        echo 'Building project and generating Docker image...'
        sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
...
Is there a way to preconfigure the git location as a variable during the job creation so I can reuse the same Jenkinsfile?
...
stage('Commit Stage') {
    echo 'Downloading from Git...'
    git branch: 'develop', credentialsId: 'xxx', url: env.GIT_REPO_LOCATION
...
I know I can set it up this way:
This project is parameterized -> String Parameter -> GIT_REPO_LOCATION, default= http://xxxx, and access it with env.GIT_REPO_LOCATION.
The downside is that the user is prompted to start the build with the default value or to change it. I need this to be transparent to the user. Is there a way to do that?

You can use the Pipeline Shared Groovy Library plugin to have a library that all your projects share in a git repository. In the documentation you can read about it in detail.
If you have a lot of Pipelines that are mostly similar, the global variable mechanism provides a handy tool to build a higher-level DSL that captures the similarity. For example, all Jenkins plugins are built and tested in the same way, so we might write a step named buildPlugin:
// vars/buildPlugin.groovy
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    // now build, based on the configuration provided
    node {
        git url: "https://github.com/jenkinsci/${config.name}-plugin.git"
        sh "mvn install"
        mail to: "...", subject: "${config.name} plugin build", body: "..."
    }
}
Assuming the script has either been loaded as a Global Shared Library or as a Folder-level Shared Library, the resulting Jenkinsfile will be dramatically simpler:
Jenkinsfile (Scripted Pipeline)
buildPlugin {
    name = 'git'
}
The example shows how a Jenkinsfile passes name = 'git' to the library.
I currently use a similar setup and am very happy with it.
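Applied to the question above, a minimal sketch of such a step might look like this (the step name buildProject and the config keys gitUrl and credentialsId are assumptions for illustration, not an existing API):
// vars/buildProject.groovy -- hypothetical shared-library step
def call(body) {
    // collect the per-project configuration from the closure, as in buildPlugin
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    node {
        def mvnHome = tool 'M3'
        stage('Commit Stage') {
            echo 'Downloading from Git...'
            git branch: 'develop', credentialsId: config.credentialsId, url: config.gitUrl
            echo 'Building project and generating Docker image...'
            sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
        }
    }
}
Each project's Jenkinsfile then shrinks to a call like:
buildProject {
    gitUrl        = 'https://bitbucket.org/xxx/yyy.git'
    credentialsId = 'xxx'
}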

Instead of having a Jenkinsfile in each Git repository, you can have an additional git repository from where you get the common Jenkinsfile - this works when using Pipeline type Job and selecting the option Pipeline script from SCM. This way Jenkins checks out the repo where you have the common Jenkinsfile before checking out the user repo.
In case the job can be triggered automatically, you can create a post-receive hook in each git repo that calls the Jenkins Pipeline with the repo as a parameter, so that the user does not have to manually run the job entering the repo as a parameter (GIT_REPO_LOCATION).
In case the job cannot be triggered automatically, the least annoying method I can think of is having a Choice parameter with a list of repositories instead of a String parameter.
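If you take the Choice-parameter route, the parameter can even be declared from the shared Jenkinsfile itself via the properties step, so every job using that Jenkinsfile shows the same list without per-job configuration. A minimal sketch (repository URLs are placeholders; on recent Jenkins versions choices may be a list, older ones expect a newline-separated string):
// Shared Jenkinsfile: declare the choice parameter in code (placeholder URLs).
// The parameter shows up in the job UI after the first run.
properties([
    parameters([
        choice(name: 'GIT_REPO_LOCATION',
               choices: ['https://bitbucket.org/xxx/project-a.git',
                         'https://bitbucket.org/xxx/project-b.git'],
               description: 'Repository to build')
    ])
])

node {
    stage('Commit Stage') {
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: params.GIT_REPO_LOCATION
    }
}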

Related

Teamcity Shared Library and Stash/Unstash Like Jenkins

I am currently a Jenkins user and am exploring TeamCity.
In Jenkins we have the concept of shared libraries, which basically extract generic Groovy code into a library that different Jenkins pipelines can reuse. This avoids re-writing the same functionality in each Jenkinsfile (following DRY, don't repeat yourself), hides implementation complexity, and keeps pipelines short and easier to understand.
Example:
There could be a repository holding all the Groovy functions, like:
Repo: http://github.com/DEVOPS/Utilities.git (repo Utilities)
Sample Groovy script ==>> GitUtils.groovy with the functions below
public void setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

public void gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
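Note that for new GitUtils() (as used in the Jenkinsfile below) to resolve, such a file usually lives under src/ in the library repo, wrapped in a class that receives the pipeline steps object so that sh is callable inside it. A minimal sketch of that common layout, with an assumed package name:
// src/org/example/GitUtils.groovy -- hypothetical path and package
package org.example

class GitUtils implements Serializable {
    private final def steps   // the running pipeline script; makes sh/echo callable here

    GitUtils(steps) { this.steps = steps }

    void setGitConfig(String userName, String email) {
        steps.sh "git config --global user.name ${userName}"
        steps.sh "git config --global user.email ${email}"
    }

    void gitPush(String branchName) {
        steps.sh "git push origin ${branchName}"
    }
}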
In a Jenkinsfile we can just call these functions like below (of course we need to configure the shared library in Jenkins first, so it knows the repo URL of the library and a name for it):
Pipeline
//name of shared library given in jenkins
@Library('utilities') _
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // log.info 'Starting'
                script {
                    // package and constructor as in the sketch above (assumed)
                    def gitUtils = new org.example.GitUtils(this)
                    gitUtils.setGitConfig("Ray", "Ray@rayban.com")
                }
            }
        }
    }
}
And that's it: anyone wanting the same functions just has to include the library in their Jenkinsfile and use it in the pipeline.
Questions:
Can we migrate the same thing over to TeamCity, and if yes, how can it be done? We do not want to spend a lot of time re-writing.
Jenkins also supports stashing and unstashing of the workspace between stages; is a similar concept present in TeamCity?
Example:
pipeline {
    agent any
    stages {
        stage('Git checkout') {
            steps {
                stash includes: '/root/hello-world/*', name: 'mysrc'
            }
        }
        stage('maven build') {
            agent { label 'slave-1' }
            steps {
                unstash 'mysrc'
                sh label: '', script: 'mvn clean package'
            }
        }
    }
}
As for reusing common TeamCity Kotlin DSL libraries, this can be done via Maven dependencies: you reference the library in the pom.xml file within your DSL code. You can also consider using JitPack if your DSL library code is hosted on GitHub, for example, and you do not want to handle building it separately and publishing its Maven artifacts.
That said, when migrating from Jenkins to TeamCity you will most likely have to rewrite the common library (if you still need one at all), as the TeamCity project model and DSL are quite different from what you have in Jenkins.
Speaking of stashing/unstashing workspaces, it may be covered by either artifact rules and artifact dependencies (as described here: https://www.jetbrains.com/help/teamcity/artifact-dependencies.html) or repository clone mirroring on agents.

What does the pollSCM trigger refer to in this Jenkinsfile?

Consider the following setup using Jenkins 2.176.1:
A new pipeline project named Foobar
Poll SCM as (only) build trigger, with: H/5 * * * * ... under the assumption that this refers to the SCM configured in the next step
Pipeline script from SCM with SCM Git and a working Git repository URL
Uncheck Lightweight checkout because of JENKINS-42971 and JENKINS-48431 (I am using build variables in the real project and Jenkinsfile; also this may affect how pollSCM works, so I include this step here)
Said repository contains a simple Jenkinsfile
The Jenkinsfile looks approximately like this:
#!groovy
pipeline {
    agent any
    triggers { pollSCM 'H/5 * * * *' }
    stages {
        stage('Source checkout') {
            steps {
                checkout(
                    [
                        $class: 'GitSCM',
                        branches: [],
                        browser: [],
                        doGenerateSubmoduleConfigurations: false,
                        extensions: [],
                        submoduleCfg: [],
                        userRemoteConfigs: [
                            [
                                url: 'git://server/project.git'
                            ]
                        ]
                    ]
                )
                stash 'source'
            }
        }
        stage('OS-specific binaries') {
            parallel {
                stage('Linux') {
                    agent { label 'gcc && linux' }
                    steps {
                        unstash 'source'
                        echo 'Pretending to do a build here'
                    }
                }
                stage('Windows') {
                    agent { label 'windows' }
                    steps {
                        unstash 'source'
                        echo 'Pretending to do a build here'
                    }
                }
            }
        }
    }
}
My understanding so far was that:
a change to the Jenkinsfile (not the whole repo) triggers the pipeline on any registered agent (or as configured in the pipeline project).
said agent (which is random) uses the pollSCM trigger in the Jenkinsfile to trigger the pipeline stages.
But where does the pollSCM trigger poll (what SCM repo)? And if it's a random agent then how can it reasonably detect changes across poll runs?
then the stages are being executed on the agents as allocated ...
Now I am confused about what refers to what. So here are my questions (all interrelated, which is why I keep them together in one question):
The pipeline project polls the SCM just for the Jenkinsfile or for any changes? The repository in my case is the same (for Jenkinsfile and source files to build binaries from).
If the (project-level) polling triggers at any change rather than changes to the Jenkinsfile: does the pollSCM trigger in the Jenkinsfile somehow automagically refer to the checkout step?
Then... what would happen if I had multiple checkout steps with differing settings?
What determines what repository (and what contents inside of that) gets polled?
... or is this akin to the checkout scm shorthand and pollSCM actually refers to the SCM configured in the pipeline project and so I can shorten the checkout() to checkout scm in the steps?
Unfortunately the user handbook didn't answer any of those questions and pollSCM has a total of four occurrences on a single page within the entire handbook.
I'll take a crack at this one:
The pipeline project polls the SCM just for the Jenkinsfile or for any changes? The repository in my case is the same (for Jenkinsfile and source files to build binaries from).
The pipeline project will poll the repo for ANY file changes, not just the Jenkinsfile. A Jenkinsfile in the source repo is common practice.
If the (project-level) polling triggers at any change rather than changes to the Jenkinsfile: does the pollSCM trigger in the Jenkinsfile somehow automagically refer to the checkout step?
Your pipeline will be executed when a change to the repo is seen, and the steps are run in the order that they appear in your Jenkinsfile.
Then... what would happen if I had multiple checkout steps with differing settings?
If you defined multiple repos with the checkout step (using multiple checkout SCM calls) then the main pipeline project repo would be polled for any changes and the repos you define in the pipeline would be checked out regardless of whether they changed or not.
What determines what repository (and what contents inside of that) gets polled? ... or is this akin to the checkout scm shorthand, and pollSCM actually refers to the SCM configured in the pipeline project, so I can shorten the checkout() to checkout scm in the steps?
pollSCM refers to the pipeline project's repo. The entire repo is cloned unless the project is otherwise configured (shallow clone, lightweight checkout, etc.).
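In other words, since the job is configured as Pipeline script from SCM, the verbose checkout() in the question can usually be shortened to the checkout scm shorthand, which checks out the same repository/revision the Jenkinsfile itself was loaded from. A minimal sketch of the first stage:
// Sketch: same first stage, using the 'checkout scm' shorthand instead of
// spelling out the GitSCM configuration a second time.
stage('Source checkout') {
    steps {
        checkout scm
        stash 'source'
    }
}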
The trigger defined as pollSCM polls the source control management (SCM) at the repository and branch in which this Jenkinsfile itself (and other code) is located.
For Pipelines which are integrated with a source such as GitHub or BitBucket, triggers may not be necessary as webhook-based integration will likely already be present. The triggers currently available are cron, pollSCM and upstream.
It also works in a multibranch pipeline as the trigger to execute the pipeline.
When Jenkins polls the SCM (exactly this repository and branch) and detects a change (i.e. a new commit), the Pipeline (defined in the Jenkinsfile) is executed.
Usually the SCM step checkout will then be executed, so that the specified project(s) can be built, tested and deployed.
See also:
SCM Poll in jenkins multibranch pipeline
ShellHacks (2020): Jenkins: Scan Multibranch Pipeline Without Build

jenkins pipeline get repository url variable under pipeline script from scm

I'm using a Jenkinsfile that is located in my Git repository.
I have configured a new job using Pipeline script from SCM, pointing to my Jenkinsfile. In the pipeline script I'm trying to use the git step to pull my data from my Git repo without configuring a pre-defined variable, just reusing the Repository URL that was already configured in my job under Pipeline script from SCM.
Is there a way to somehow get the Repository URL variable from this plugin without using parameters in my Jenkins pipeline script?
I have already tried the environment variable GIT_URL and other Git-related variables from here, but this didn't work.
You can find all information about the SCM in the scm variable (an instance of GitSCM if you are using Git).
You can get the repository URL this way:
def repositoryUrl = scm.userRemoteConfigs[0].url
But if you just want to checkout that repository you can simply invoke checkout scm without needing to specify anything else. See checkout step
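A minimal sketch of reading the URL this way without checking anything out (accessing these scm getters may require in-process script approval on a sandboxed Jenkins):
// Read the job's configured repository URL from the scm object, no checkout.
node {
    def repositoryUrl = scm.userRemoteConfigs[0].url
    echo "Configured repository: ${repositoryUrl}"
}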
From this post I found a way to use checkout scm to get the Git repo URL, like this:
checkout scm
def url = sh(returnStdout: true, script: 'git config remote.origin.url').trim()
But checkout scm will pull the code, and I want to avoid that.
So I found another way (not a pretty one):
node('master') {
    try {
        GIT_REPO_URL = null
        // extract the repository URL from the job's config.xml on the master
        command = "grep -oP '(?<=url>)[^<]+' /var/lib/jenkins/jobs/${JOB_NAME}/config.xml"
        GIT_REPO_URL = sh(returnStdout: true, script: command).trim()
        echo "Detected Git Repo URL: ${GIT_REPO_URL}"
    }
    catch (err) {
        echo "Could not find any Git repository for the job ${JOB_NAME}"
        throw err
    }
}
This did the trick for me.
Probably not directly a solution for your particular case, as you're working with git.
But for those still working with SVN using the SubversionSCM, the repository URL can be obtained using
def repositoryUrl = scm.locations[0].remote
I believe that the best solution is like this answer.
An example using declarative pipeline:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    def s = checkout scm
                    if (s.GIT_URL != null) print s.GIT_URL
                    else if (s.SVN_URL != null) print s.SVN_URL
                    else print s
                }
            }
        }
    }
}
Note - this does a full checkout. If that is not desirable, I would try to handle that in checkout parameters (like here)
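For example, a sketch of such checkout parameters using the GitSCM CloneOption extension to keep the clone shallow, reusing the job's configured branches and remotes (recent Jenkins versions return the environment map from checkout):
// Sketch: run the configured checkout, but shallow (depth 1) to keep it cheap.
def s = checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    userRemoteConfigs: scm.userRemoteConfigs,
    extensions: [[$class: 'CloneOption', shallow: true, depth: 1, noTags: true]]
])
print s.GIT_URL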

Multi-branch configuration with externally-defined Jenkinsfile

I have an open-source project, that resides in GitHub and is built using a build farm, controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, there is information that the Jenkinsfile would include that is not intrinsic to the project build itself, but rather specific to the build environment instance, which may change.
The way I accomplished this is:
Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
Tell Jenkins via the Jenkinsfile to call a function defined in a single global library
Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings. For example, using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                sh 'python build.py'
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the from SCM option and then check out the actual code, but this will force you to manually take care of the branch builds.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm repository"
    flow.runFlow()
}
Where the pipeline.groovy file, which contains the actual pipeline, would look like this:
def runFlow() {
    // your pipeline code
}

// Has to exit with 'return this;' in order to be used as library
return this;

Jenkins multibranch pipeline with Jenkinsfile from different repository

I have a Git repository with code I'd like to build but I'm not "allowed" to add a Jenkinsfile in its root (it is a Debian package so I can't add files to upstream source). Is there a way to store the Jenkinsfile in one repository and have it build code from another repository? Since my code repository has several branches to build (one for each Debian release) this should be a multibranch pipeline. Commits in either the code or Jenkinsfile repositories should trigger a build.
Bonus complexity: I have several code/packaging repositories like this and I'd like to reuse the same Jenkinsfile for all of them. Thus it should somehow dynamically fetch the right Git URL to use. The branches to build have the same names across all repositories.
Short answer is: you cannot do that with a multibranch pipeline. Multibranch pipelines are only designed (at least for now) to execute a specific pipeline in Pipeline script from SCM style, with a fixed Jenkinsfile at the root of the project.
You can however use the Multi-Branch Project plugin made for multibranch freestyle projects. First, you need to define your multibranch freestyle configuration just like you would with a multibranch pipeline configuration.
When creating your job, select this new multibranch freestyle item type.
This type of configuration will behave exactly the same as the multibranch pipeline type, i.e. it will create a folder with the name of your configuration and a sub-project for each branch it automatically detects.
The implementation should then be a piece of cake:
Specify your SCM repository in the multibranch configuration
Call another build as part of your build/post-build actions, as you would in a standard freestyle project, except that you call a parameterized job (let's call it build-job) and pass it your repository information, i.e. Git URL and current branch (you can use the pre-defined variables $GIT_URL and $GIT_BRANCH for this purpose; a sketch of this hand-off follows the example below)
In your build-job, just define either an inline pipeline or a pipeline script checked out from SCM, and inside this script do an SCM checkout and go on with the steps you need to build. Example of build-job pipeline content:
node() {
    stage 'Checkout'
    checkout scm: [$class: 'GitSCM', branches: [[name: '*/${GIT_BRANCH}']], userRemoteConfigs: [[url: '${GIT_URL}']]]
    stage 'Build'
    // Build steps...
}
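For reference, if the caller were itself a pipeline rather than a freestyle project, the hand-off to build-job could look like this sketch (job and parameter names as assumed above):
// Sketch: trigger the generic parameterized build-job with the repo information.
build job: 'build-job', parameters: [
    string(name: 'GIT_URL', value: env.GIT_URL),
    string(name: 'GIT_BRANCH', value: env.GIT_BRANCH)
]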
Of course, if your different multibranch projects need to be treated a bit differently, you could also use intermediate projects (let's say build-project-A, build-project-B, ...) that would in turn call the generic build-job pipeline.
The one major drawback of this solution is that you will have only one job responsible for all of your builds, making it harder to debug. You would still have your multibranch projects going blue/red in case of success/error, but you would have to go back to the called build-job to find the real cause of a failure.
The best way I have found is to use the Remote Jenkinsfile Provider plugin. https://plugins.jenkins.io/remote-file/
This will add an option "by Remote Jenkinsfile Provider plugin" under Build Configuration > Mode; then you can point to another repo where the Jenkinsfile is. I find this to be a much better solution than the Pipeline Multibranch Defaults Plugin, which makes you store the Jenkinsfile in Jenkins itself rather than in source control.
You can make use of this plugin:
https://github.com/jenkinsci/pipeline-multibranch-defaults-plugin/blob/master/README.md
With it, you configure the Jenkinsfile in Jenkins itself rather than having it on each branch of your repo.
I have version 2.121, and you can do this in two ways:
Way 1
In the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and in "Marker File" put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition select "Pipeline Script from SCM" and enter the "SCM" information for how to find the "Jenkinsfile" that holds the script you want to run. It can be in the same repo you are finding branches in to create the jobs (if you put in the same GitHub repo's info) but I can't find a way to indicate that you just use the same branch for the file.
Way 2
Same as above: in the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and in "Marker File" put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition, select "Pipeline Script" and put a bit of Groovy in the text box to load whatever you want or to run some script that already got loaded into the workspace.
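A minimal sketch of what that bit of Groovy for Way 2 might look like (the repository URL and script name are placeholders):
// Fetch a shared pipeline script from a separate repo and run it.
node {
    git url: 'https://example.com/shared-pipelines.git', branch: 'master'
    def flow = load 'commonPipeline.groovy' // the loaded script must end with 'return this'
    flow.runFlow()                          // runFlow() is a hypothetical function it exposes
}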
In my case, I have a scenario with a GitLab project based on Gradle that has dependencies on another GitLab project, also based on Gradle (same dashboard, but different commits, different developers).
I have added the following lines into my Jenkinsfile (the one of the project which depends on the other):
stage('Build') {
    steps {
        git branch: 'dev', credentialsId: 'jenkins-generated-ssh-key', url: 'git@gitlab.project.com:root/coreProject.git'
        sh './gradlew clean'
    }
}
Note: Be aware of the order of the statements.
If you have doubts about how to create the jenkins-generated-ssh-key credential, please ask me.
