How to execute a Jenkins pipeline from a config file

I have a generic multibranch project that I use on about 100 different git repos. The Jenkins jobs are generated automatically, and the only difference between them is the git repo.
Since they all build the same way and I don't want to copy the same Jenkins Groovy file into every repo, I use "Build Configuration -> Mode -> by default Jenkinsfile".
This breaks the rule of keeping the Jenkinsfile in SCM, which is what I would prefer. To minimize the impact, I would like that Groovy file to only check out the "real" Jenkinsfile and execute it.
I use this script:
pipeline {
    agent { label 'docker' }
    stages {
        stage('jenkinsfile checkout') {
            steps {
                checkout([$class: 'GitSCM',
                          branches: [[name: 'master']],
                          doGenerateSubmoduleConfigurations: false,
                          extensions: [[$class: 'RelativeTargetDirectory',
                                        relativeTargetDir: 'gipc_synthesis']],
                          submoduleCfg: [],
                          userRemoteConfigs: [[url: 'ssh://git@camtl1bitmirror.gad.local:7999/mtlstash/mvt/gipc_synthesis.git']]]
                )
            }
        }
        stage('Load user Jenkinsfile') {
            //agent any
            steps {
                load 'gipc_synthesis/jenkins/synthesis_job.groovy'
            }
        }
    }
}
The problem with that is that I can't have another pipeline block in the Groovy file I am loading. I don't want to define only functions; I really want the whole pipeline in that file. Is there any solution to that problem? I am also interested in solutions that avoid the whole issue altogether.
Thank you.

You can have a shared library with your pipeline inside:
// my-shared.git: vars/build.groovy
def call(String pathToGit) // and maybe additional params
{
    pipeline {
        agent { ... }
        stages {
            stage('jenkinsfile checkout') {
                steps {
                    checkout([$class: 'GitSCM',
                              branches: [[name: 'master']],
                              doGenerateSubmoduleConfigurations: false,
                              extensions: [[$class: 'RelativeTargetDirectory',
                                            relativeTargetDir: 'gipc_synthesis']],
                              submoduleCfg: [],
                              userRemoteConfigs: [[url: pathToGit]]]
                    )
                }
            }
        }
    }
}
and use it in your Jenkinsfile e.g. like this:
#!groovy
@Library('my-shared') _
def pathToGit = 'ssh://git@camtl1bitmirror.gad.local:7999/mtlstash/mvt/gipc_synthesis.git'
build(pathToGit)
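Note that the my-shared library also has to be registered in Jenkins (for example under Manage Jenkins -> Configure System -> Global Pipeline Libraries, or at folder level) and pointed at the my-shared.git repository; otherwise the @Library annotation cannot resolve it.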

Related

How to get the latest git commit time of another repository from a Jenkins declarative pipeline

I have a declarative pipeline. In this pipeline I want to add a verification step that checks the time of the last commit in another repository. Based on that commit time, I then have to decide whether to proceed with the next steps.
I am having trouble fetching the commit time. I am very new to git commands, so I am unable to resolve the issue.
Points: my Jenkinsfile is in one repository, and I need the commit details of another repository, specifically the commit time (epoch).
My code looks like this:
@Library('xx-libs') _
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'HOURS')
    }
    parameters {
        xxxxx
    }
    environment {
        xxxxxx
    }
    stages {
        stage('Checkout source code') {
            steps {
                script {
                    checkout([$class: 'GitSCM',
                              branches: [[name: '*/' + branch]],
                              extensions: [
                                  [$class: 'CloneOption', noTags: true, timeout: 20],
                                  [$class: 'RelativeTargetDirectory', relativeTargetDir: '/tmp/core/'],
                                  [$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'code/System/Infra/Version/version.cpp']]],
                                  [$class: 'CleanBeforeCheckout']
                              ],
                              userRemoteConfigs: [[
                                  credentialsId: 'azure-bearer-auto-updated',
                                  url: 'https://xx.xx.com/xxx/xxx/_git/core'
                              ]]
                    ])
                    git_time = sh script: "git show -s --format=%ct", returnStdout: true
                    echo "$git_time"
                }
            }
        }
    } // stages
}
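Not part of the original question, but one sketch of fetching the epoch time, assuming the sparse checkout above really lands in /tmp/core/ and git is available on the agent, is to run the git command from inside that directory and trim the output:
script {
    // Run git inside the directory the other repository was checked out to;
    // otherwise it runs against the current job's own workspace.
    def gitTime
    dir('/tmp/core') {
        gitTime = sh(script: 'git show -s --format=%ct', returnStdout: true).trim()
    }
    echo "Last commit epoch: ${gitTime}"
}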

Jenkins "Git Parameter" plugin with "useRepository" option

I'm using "Git Parameter" plugin to allow users to pick branch\tag for configured repositories.
This plugin has "useRepository" option to allow linking with the configured repos in the "Pipeline script from SCM" option:
This assumes that i need to preconfigure some repos in the Jenkins pipeline (in the Jenkins UI) to be able to use "Git Parameter" plugin in the pipelines.
But i want to dynamically predefine list of repos from the pipeline code itself, without any configuration in the "Pipeline script from SCM" section.
Unfortunately this doesn't work.
I'm tried to add "checkout" block with "GitSCM" class before calling "gitParameter" but with no success.
Code example:
def app_components = [
    BACKEND : ["NAME": "backend", "GIT": "ssh://git@xxx.local/_git/backend"],
    FRONTEND : ["NAME": "frontend", "GIT": "ssh://git@xxx.local/_git/fronend"]
]
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    dynamicParameters = []
                    app_components.eachWithIndex { name, components, index ->
                        checkout([
                            $class: 'GitSCM',
                            branches: [[name: '*/master']],
                            doGenerateSubmoduleConfigurations: false,
                            extensions: [[$class: 'CleanCheckout']],
                            submoduleCfg: [],
                            userRemoteConfigs: [[credentialsId: 'XXX', url: components.GIT]]
                        ])
                        dynamicParameters << gitParameter(name: 'BRANCH_' + name, defaultValue: 'develop', quickFilterEnabled: true, type: 'PT_BRANCH_TAG', listSize: '10', useRepository: components.GIT)
                    }
                    def userInput = input(
                        id: 'userInput', message: 'Test message:?',
                        parameters: dynamicParameters
                    )
                    println(userInput);
                }
            }
        }
    }
}
The Git Parameter text fields always end up empty.
Could you advise a solution for this scenario?
Try playing around with adding a branch filter:
gitParameter branchFilter: 'origin/(.*)',
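For example, a sketch of the parameter call from the question with that filter added (the names, type, and useRepository value are taken from the question, not verified):
dynamicParameters << gitParameter(
    name: 'BRANCH_' + name,
    defaultValue: 'develop',
    quickFilterEnabled: true,
    type: 'PT_BRANCH_TAG',
    listSize: '10',
    branchFilter: 'origin/(.*)',
    useRepository: components.GIT
)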

Jenkinsfile Declarative Pipeline: Checking out another repo in stage step

I have my Jenkinsfile located in repo "A" and I want to check out repo "B". What's the correct way to do that?
This is how I am currently doing it, but it's not working:
stage("Checkout my repo repo ") {
steps {
checkout([$class: 'GitSCM', branches: [[name: '*/master']],
extensions: [[$class: 'RelativeTargetDirectory',
relativeTargetDir: 'copy-repo-here']],
userRemoteConfigs: [[credentialsId: 'my-creds',
url: 'git#github.com:my-git-repo.git']]])
}
}
Any help is appreciated!
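One minimal sketch that usually works for this case, assuming 'my-creds' is a credential matching the URL style (both names are taken from the question, not verified), is to use the simpler git step inside a dir block:
stage('Checkout repo B') {
    steps {
        dir('copy-repo-here') {
            // Clones repo B into copy-repo-here relative to the workspace
            git url: 'git@github.com:my-git-repo.git',
                branch: 'master',
                credentialsId: 'my-creds'
        }
    }
}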

Checkout a specific folder from git using jenkins groovy "checkout" command

I'm pretty new to Jenkins and Groovy, and I'm trying to do a sparse checkout in my Jenkinsfile. Currently I simply do this:
stage('Check out branch from Gitlab') {
    echo 'Pulling...' + env.BRANCH_NAME
    checkout scm
}
I wish to execute a sparse checkout from a Jenkins Groovy script and I'm struggling to find a good way of doing this. Is there a way of using the "checkout" command to do this?
You should configure a set of parameters for the GitSCM class (more info here).
A basic configuration is presented as an example below:
pipeline {
    agent any
    stages {
        stage("Git Checkout") {
            steps {
                script {
                    checkout([
                        $class: 'GitSCM',
                        branches: [[name: "devel"]],
                        doGenerateSubmoduleConfigurations: false,
                        extensions: [[
                            $class: 'RelativeTargetDirectory',
                            relativeTargetDir: "/tmp/jenkins/devel"
                        ]],
                        submoduleCfg: [],
                        userRemoteConfigs: [[
                            credentialsId: 'jenkinsCredentialsId',
                            url: 'https://git.example.com/git/example'
                        ]]
                    ])
                }
            }
        }
    }
}
This is a fully working Jenkins pipeline with one stage. It checks out the devel branch of the repository https://git.example.com/git/example into the directory /tmp/jenkins/devel. Also note that you should add (if not already done) the credentials for the repository in Jenkins Credentials (/jenkins/credentials/); in the above example they are stored under the id jenkinsCredentialsId.
You can read the GitSCM link to find out more details and attributes that you can configure.
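Note that the example above does not actually limit which paths are checked out. A sketch of the same checkout with a SparseCheckoutPaths extension added (the path 'docs/subdir' is only a placeholder) could look like this:
checkout([
    $class: 'GitSCM',
    branches: [[name: 'devel']],
    extensions: [
        [$class: 'RelativeTargetDirectory', relativeTargetDir: '/tmp/jenkins/devel'],
        // Only the listed paths are populated in the working tree
        [$class: 'SparseCheckoutPaths',
         sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'docs/subdir']]]
    ],
    userRemoteConfigs: [[
        credentialsId: 'jenkinsCredentialsId',
        url: 'https://git.example.com/git/example'
    ]]
])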

Can I augment scm in Jenkinsfile?

It's taken me ages to understand what checkout scm really means in a Jenkinsfile (checkout is a step and scm is a default global variable, by the way).
Now that I've understood it, I want to augment scm, for example to increase the timeout for a particular checkout or to set sparseCheckoutPaths. Is this possible? If so, how?
For Git, checkout scm is basically equivalent to:
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
    extensions: scm.extensions,
    userRemoteConfigs: scm.userRemoteConfigs
])
If you want to add sparse checkout to the existing scm, what you would do is something like:
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
    extensions: scm.extensions + [$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'path/to/file.xml']]],
    userRemoteConfigs: scm.userRemoteConfigs
])
Even better, you can define a custom step, sparseCheckout, in a shared library:
def call(scm, files) {
    if (scm.class.simpleName == 'GitSCM') {
        def filesAsPaths = files.collect {
            [path: it]
        }
        return checkout([$class: 'GitSCM',
                         branches: scm.branches,
                         doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
                         extensions: scm.extensions +
                                 [[$class: 'SparseCheckoutPaths', sparseCheckoutPaths: filesAsPaths]],
                         submoduleCfg: scm.submoduleCfg,
                         userRemoteConfigs: scm.userRemoteConfigs
        ])
    } else {
        // fallback to checkout everything by default
        return checkout(scm)
    }
}
Then you call it with:
sparseCheckout(scm, ['path/to/file.xml'])
You can definitely customize the checkout scm command to add more flexibility. Check out this link for all of the options - https://jenkins.io/doc/pipeline/steps/workflow-scm-step/
Timeouts:
$class: 'CheckoutOption', timeout
Specify a timeout (in minutes) for checkout. This option overrides the default timeout of 10 minutes.
You can change the global git timeout via the property org.jenkinsci.plugins.gitclient.Git.timeOut (see JENKINS-11286). Note that the property should be set on both master and slave to have an effect (see JENKINS-22547).
Type: int
SparseCheckoutPaths:
$class: 'SparseCheckoutPaths'
Specify the paths that you'd like to sparse checkout. This may be used for saving space (think about a reference repository). Be sure to use a recent version of Git, at least above 1.7.10.
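Putting the two answers together, a sketch of overriding the checkout timeout on the existing scm object (the 30-minute value is only an example) would be:
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    // Append a CheckoutOption to whatever extensions the job already defines
    extensions: scm.extensions + [[$class: 'CheckoutOption', timeout: 30]],
    userRemoteConfigs: scm.userRemoteConfigs
])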
