Jenkins: Run migrations only when migrations folder has changed

I have a pipeline script and would like to take different actions depending on changes to the migrations folder.
Basically the workflow would be:
Pull changes from the repository
Check whether the migrations/ folder has new migrations or changes
If changes are present, run the migrations; if not, continue
I'm not sure how I could achieve this. I'm using Jenkins 2.1 and the Git plugin, and the repo is on a private server.

There's probably a way to do it directly with the plugin, but I only get the option for included regions if I add another branch source as "Single repository & branch", so for now I implemented this solution:
I added this to my Jenkinsfile to check for changes in the migrations/ folder:
script {
    env.CONTAINS_MIGRATIONS = sh(
        script: 'git diff --name-only --diff-filter=AMDR --cached HEAD^',
        returnStdout: true
    ).trim()
    if (env.CONTAINS_MIGRATIONS.contains('migrations')) {
        // Do migrations related stuff
    }
}
I'm doing this on the assumption that filename collisions with 'migrations' are unlikely, and if they do happen it's not a big deal.
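If you are on Declarative Pipeline, the same check can be hoisted into a when condition so the migration stage is skipped entirely when nothing under migrations/ changed. This is only a sketch under the same assumptions as above; the stage name and the migration command are placeholders, and Declarative's built-in changeset condition may also cover this if your build has changelog data:
stage('Run migrations') {
    when {
        expression {
            // Same diff as above: files touched by the commit being built
            def changed = sh(
                script: 'git diff --name-only --diff-filter=AMDR --cached HEAD^',
                returnStdout: true
            ).trim()
            return changed.contains('migrations')
        }
        // Alternative (Declarative built-in, relies on the build's changelog):
        // changeset 'migrations/**'
    }
    steps {
        sh './manage.py migrate'   // placeholder for your real migration command
    }
}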

In your case, the 'included regions' feature of the Git plugin should help. See this answer for details.
So, for a pipeline, you can generate the correct syntax using the pipeline-syntax generator (under http://<JENKINS_IP>:<JENKINS_PORT>/job/<PATH_TO_PIPELINE_JOB>/pipeline-syntax/, with Sample Step: checkout -> SCM: Git -> Additional Behaviours -> Polling ignores commits in certain paths). It will be something like this:
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'PathRestriction', excludedRegions: '', includedRegions: 'migrations/.*']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'test', url: 'http://test.com/test.git']]])
Check this documentation for details (extensions -> includedRegions).
For Job DSL, the syntax will look like this:
scm {
    git {
        remote {
            ...
        }
        extensions {
            cleanBeforeCheckout()
            disableRemotePoll() // this is important for path restrictions to work
            configure { git ->
                git / 'extensions' / 'hudson.plugins.git.extensions.impl.PathRestriction' {
                    includedRegions "somepath/.*"
                    excludedRegions "README.md\n\\.gitignore\npom.xml"
                }
            }
        }
    }
}
Also, you can use GitHub/GitLab/BitBucket webhooks to build the project when a change is pushed to the repository.
See this example for Github and BitBucket configuration and this example for GitLab configuration.
If you want to build the project only for changes in the migrations folder and not for every change in the repository, you can configure a commit-message regex for triggering a build and add a specific marker (e.g., "[changes in migrations folder]") to the commit message every time you want to trigger a build.
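For reference, a hedged sketch of that message filter using the Git plugin's "Polling ignores commits with certain messages" behaviour (MessageExclusion). The regex is an assumption: it tries to exclude every commit whose message does not contain the marker, so that only marked commits trigger polling-based builds, and the exact matching semantics (single-line vs. multi-line messages) should be verified against your plugin version:
checkout([$class: 'GitSCM',
    branches: [[name: '*/master']],
    extensions: [[$class: 'MessageExclusion',
        // Assumption: negative lookahead, so commits WITHOUT the marker are ignored by polling;
        // assumes the marker sits on the first line of the commit message
        excludedMessage: '^(?!.*\\[changes in migrations folder\\]).*$']],
    userRemoteConfigs: [[credentialsId: 'test', url: 'http://test.com/test.git']]])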

Related

Jenkins: Building multiple repos with different branches

I have multiple repos with their own Jenkinsfiles, and when I am working on one repo I need to build the others so that I have an end-to-end app deployed for feature development. As the app runs on AWS with the containers deployed into EKS, my preference is to be able to build and run on AWS.
There is an order to the building: the infrastructure needs to be deployed first, before the backend services (there are 3) and the UI.
Ideally I can choose which branches from the 5 repos are deployed, and when a change occurs on any branch that is deployed as part of the ephemeral environment, the pipeline will trigger.
So far what I am thinking is to have a Jenkinsfile in each repo and create a 6th repo, which will have just a YAML file and a Jenkinsfile of its own. The pipeline job for this repo would take data from the YAML file about which branches to use and trigger the other pipelines, passing the branch to each; it would be the only repo with an actual pipeline job.
Has anyone tried this? I'm not sure if it's possible to have a pipeline watch multiple different repos and branches and act as an orchestrator, kicking off other pipelines.
There might be a much easier way to do this, I have read a lot of posts and articles but none seem to achieve what I want.
One approach is to write a single Jenkinsfile by combining all the stages from each repo:
stages {
    stage('Infra Setup') {
        steps {
            // The below will clone your repo and check out the master branch by default.
            git credentialsId: 'jenkins_git_cred', url: '<your_git_url_for_clone>'
            sh "git checkout branchname"
            // Your steps
        }
    }
    stage('Backend1') {
        steps {
            // If you want to check out a specific branch by default instead of master, use the below in your pipeline stage.
            checkout([$class: 'GitSCM', branches: [[name: '*/branchname']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'jenkins_git_cred', url: 'your_git_url_for_clone']]])
        }
    }
    stage('backend_n') {
        steps {
            // One or more steps need to be included within the steps block.
        }
    }
    stage('UI') {
        steps {
            // One or more steps need to be included within the steps block.
        }
    }
}
You can generate the syntax using the Jenkins pipeline-syntax page:
https://your-jenkins-url.com/pipeline-syntax/
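For the orchestrator idea from the question, here is a rough scripted-pipeline sketch. It assumes the Pipeline Utility Steps plugin for readYaml, and the job names (infra, backend1..3, ui) and YAML keys are hypothetical; branch names containing '/' have to be URL-encoded in the downstream job path:
// Orchestrator Jenkinsfile in the 6th repo, alongside branches.yaml, e.g.:
//   infra: main
//   backend1: develop
//   backend2: develop
//   backend3: develop
//   ui: feature-new-header
node {
    checkout scm                              // the orchestrator repo itself
    def cfg = readYaml file: 'branches.yaml'  // which branch to build per repo

    stage('Infra') {
        build job: "infra/${cfg.infra}", wait: true
    }
    stage('Backends') {
        parallel(
            backend1: { build job: "backend1/${cfg.backend1}", wait: true },
            backend2: { build job: "backend2/${cfg.backend2}", wait: true },
            backend3: { build job: "backend3/${cfg.backend3}", wait: true }
        )
    }
    stage('UI') {
        build job: "ui/${cfg.ui}", wait: true
    }
}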

Using Jenkins declarative pipeline, how do I fetch and compare another branch with a private github repo?

Background
My team wants to update several linting rules in our project, however, doing so will cause our Jenkins build pipeline which lints, tests and builds each feature branch to break. We don't want to lose the value of linting each feature branch before merging, so we agree that linting only the files that the feature branch changes is a reasonable way to introduce these new lint rules without forcing us to re-lint the whole project up-front. Given that our entire project is already linted, this seems like a reasonable move.
A while ago I wrote a git tool to do exactly this. It determines which files have changed since the feature branch diverged from master and outputs those files so they can be consumed by eslint, pycodestyle and other linters. Here's the source if you're interested in how this is done.
Problem
The Jenkins declarative build process and its GitHub Branch Source plugin seem to have a brittle checkout behavior that can't be modified to check out more than just the feature branch it's called on to build.
If I call git fetch origin stage within a build step, Jenkins complains about missing credentials. I don't feel comfortable sticking credentials into my pipeline file; I'd MUCH prefer to continue using the Git plugin to manage credentials to our private GitHub repo and pull branches. However, I'm at a loss as to how to tell it to fetch more than just the feature branch.
For reference, here are the relevant portions of my Jenkinsfile.
As you can see, I've tried adding the GitSCM code block to no avail. I've read this medium article which solves a similar problem, but I'm not using SSH credentials and I'd prefer not to, given that we're already managing credentials using the Git plugin.
pipeline {
    agent any
    tools {
        nodejs 'node12.7.0'
    }
    stages {
        stage('checkout') {
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: '*']],
                    extensions: scm.extensions,
                    userRemoteConfigs: [],
                    doGenerateSubmoduleConfigurations: true
                ])
            }
        }
        stage('install') {
            steps {
                script {
                    sh 'git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master'
                    sh 'yarn install'
                }
            }
        }
        stage('lint & test') {
            failFast true
            parallel {
                stage('lint') {
                    when {
                        not {
                            anyOf {
                                branch 'stage'; branch 'int'; branch 'prod'
                            }
                        }
                    }
                    steps {
                        script {
                            sh """
                                git submodule update --init
                                yarn run lint
                            """
                        }
                    }
                }
                ...
            }
        }
        stage('deploy') {...}
    }
    post {
        failure {
            notifySlack()
        }
    }
}
Create a credential in your Jenkins with an SSH private key, which can then be referenced in the checkout userRemoteConfigs and used while checking out (the value below is just an example of one of the credential IDs in my Jenkins environment):
userRemoteConfigs: [[credentialsId: '7969s7612-adruj-au2cd-492msa802f']]
One frequent root cause - mentioned in the referenced medium article, too - is that Jenkins only checks out the current branch that needs to be built.
An easy option I just found to have other project branches available is to
Configure your pipeline job
Under Behaviors->General, add Specify ref specs
Optionally adjust the parameter to the refs you need, e.g. the branches to compare to. Or you can get all branches by maintaining the default +refs/heads/*:refs/remotes/#{remote}/* as shown in the screenshot:
[Screenshot: Jenkins pipeline job, ref spec configuration]
P.S.: This seems to be part of the GIT Jenkins plugin, but I couldn't find it in the docs...
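If you would rather keep this in the Jenkinsfile instead of the job configuration, a hedged sketch of the same idea is to reuse the URL and credentials from the injected scm object and add a refspec for the branch you want to compare against. Master here is an assumption, accessing the scm getters may need script approval, and PR builds (whose BRANCH_NAME is not a real branch ref) would need extra handling:
checkout([$class: 'GitSCM',
    branches: scm.branches,
    extensions: scm.extensions,
    userRemoteConfigs: [[
        url: scm.userRemoteConfigs[0].url,
        credentialsId: scm.userRemoteConfigs[0].credentialsId,
        // Fetch master in addition to the branch being built
        refspec: "+refs/heads/master:refs/remotes/origin/master " +
                 "+refs/heads/${env.BRANCH_NAME}:refs/remotes/origin/${env.BRANCH_NAME}"
    ]]
])
// origin/master is now available locally for comparison, e.g.:
sh 'git diff --name-only origin/master...HEAD'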

How do I configure a Jenkins multi-branch pipeline to build as a submodule?

I have a project kuma organized with submodules:
kuma/
    Jenkinsfile (configured to test kuma)
    locales/ (submodule)
    kumascript/ (submodule)
        Jenkinsfile (configured to test kumascript)
    A bunch of other files
I'd like to configure a multi-branch pipeline in Jenkins that watches for branches on the kumascript repo, to:
Check out the master branch of kuma
Update the locales to the commit in the master branch (a regular git submodule update --init)
Update the kumascript submodule to the branch to test
Run the Jenkinsfile in the kumascript branch
Is this possible? Is there a better way to do this?
Here's what worked for me.
First, the Jenkinsfile is read from the commit, before checkout, so it is easy to use the one in the kumascript submodule, and much, much harder (impossible?) to read it from a different repo.
In Jenkins 2.68 with Git plugin 3.4.1, I set up a multibranch pipeline. The one source is Git, pointing to the kumascript repository:
"Discover branches" finds branches in the repository and starts builds for them.
"Wipe out repository and force clone" works around an issue where jgit doesn't fetch a submodule repository before checking it out, and thus the target commit isn't available. It causes an error that looks like this in the Jenkins logs:
> git fetch --no-tags --progress https://github.com/mdn/kumascript +refs/heads/*:refs/remotes/origin/*
Checking out Revision 998d9e539127805742634ef1c850221cf04ca2c7 (build-with-locales-1340342)
org.eclipse.jgit.errors.MissingObjectException: Missing unknown 998d9e539127805742634ef1c850221cf04ca2c7
at org.eclipse.jgit.internal.storage.file.WindowCursor.open(WindowCursor.java:158)
at org.eclipse.jgit.lib.ObjectReader.open(ObjectReader.java:227)
at org.eclipse.jgit.revwalk.RevWalk.parseAny(RevWalk.java:859)
at org.eclipse.jgit.revwalk.RevWalk.parseCommit(RevWalk.java:772)
This issue appears to be reported in https://issues.jenkins-ci.org/browse/JENKINS-45729, and is reported fixed in Git client plugin 2.5.0.
Wiping out the repo appears to force the full fetch, and may be necessary when installing in a parent project.
Jenkins is now configured to create a build for each branch in the repository. To check it out as a submodule, the parent project will need to be manually checked out in the Jenkinsfile. I used Jenkins' "Pipeline Syntax" tool to help construct the command.
After some formatting, this goes in my Jenkinsfile:
stage("Prepare") {
// Checkout Kuma project's master branch
checkout([$class: 'GitSCM',
userRemoteConfigs: [[url: 'https://github.com/mozilla/kuma']],
branches: [[name: 'refs/heads/master']],
extensions: [[$class: 'SubmoduleOption',
disableSubmodules: false,
parentCredentials: false,
recursiveSubmodules: true,
reference: '',
trackingSubmodules: false]],
doGenerateSubmoduleConfigurations: false,
submoduleCfg: []
])
// Checkout KumaScript in subfolder
dir('kumascript') {
checkout scm
}
}
This checks out the kuma project and its submodules, and then uses the "vanilla" checkout to check out the requested branch, but in the submodule directory.
From then on, if I want to run a command in the kuma repo, I run it:
stage('Build') {
    sh 'make build-kumascript VERSION=latest'
}
and if I want to run it in the kumascript submodule, I wrap it in dir:
stage('Lint') {
    dir('kumascript') {
        sh 'make lint VERSION=latest'
        sh 'make lint-macros VERSION=latest'
    }
}

I want to checkout the 2nd Repo in my Jenkins multibranch pipeline

In my Jenkinsfile, the "checkout scm" command will check out whatever repo I have configured in the configuration panel.
But what if I add a 2nd repo to the Jenkinsfile - is there any way to check that out to a specific directory within the workspace? The catch is that I don't want to hard-code any URLs into my Jenkinsfile. Here's an illustration of what I'm trying to achieve:
stage("Checkout") {
checkout scm // Works fine, checks out the 1st consifured repo to workspace.
dir("src") {
checkout scm // Checks out the exact same repo again, but how can I change this to colone the 2nd repo instead?
}
}
Basically - what could I put instead of the 2nd "checkout scm" that would make it pull the 2nd repo configured in the Multibranch pipeline web config?
And supposing this isn't actually possible - what's even the point of allowing users to provide more than one repo in the config-form if there's no way of checking it out in the script?
Use the URL found at yourjenkinshostname.com/pipeline-syntax/ to generate a step for "checkout: General SCM". After that, fill out the info for the repo you want, click "Additional Behaviors", and add "Checkout to Subdirectory".
Lastly, click "Generate Pipeline Script". The output from that should be usable in your Jenkinsfile. Completed, the process looks like this:
[Screenshot: Pipeline Syntax generator with the "Checkout to Subdirectory" behavior]
Alternatively, if you're used to the checkout step, the "RelativeTargetDirectory" extension class can be used to do this. A checkout step with that included looks like this:
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'test-dir']], submoduleCfg: [], userRemoteConfigs: [[url: 'https://github.com/jenkinsci/puppet-jenkins.git']]])
The key part being...
extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'test-dir']]
EDIT: According to issues.jenkins-ci.org/browse/JENKINS-32018, the multiple sources of a multibranch job are not for two different repositories, but rather for multiple sources of a single repository.
You'll need to hardcode the URLs, I'm afraid. One approach is to have two multibranch jobs: one has repo A as the SCM and hardcodes a checkout of repo B, the other has repo B as the SCM and hardcodes a checkout of repo A.
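Put together, a minimal sketch of that two-repo layout (the second URL, directory name and credential ID are placeholders) looks like:
stage('Checkout') {
    checkout scm   // repo A: whatever the multibranch job itself is configured for
    dir('src') {
        // repo B has to be hardcoded (or passed in as a job parameter)
        checkout([$class: 'GitSCM',
                  branches: [[name: '*/master']],
                  userRemoteConfigs: [[url: 'https://github.com/example/repo-b.git',
                                       credentialsId: 'repo-b-creds']]])
    }
}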

Clean builds with Multibranch Workflow

Using Multibranch Workflow, the command to check out looks like
checkout scm
I can't find a way to tell Jenkins to perform a clean checkout. By "clean," I mean it should remove all files from the workspace that aren't under version control.
I'm not sure if this answers the original question or not (I couldn't tell whether the intention was to leave some files in the workspace), but why not just remove the workspace first? That would allow a clean checkout:
stage('Clean') {
    deleteDir()
}
stage('Checkout') {
    checkout scm
}
I ran into the same problem and here is my workaround.
I created a new scm object for the checkout and extended the extensions with CleanBeforeCheckout, but I kept the other configuration, like branches and userRemoteConfigs.
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    extensions: scm.extensions + [[$class: 'CleanBeforeCheckout']],
    userRemoteConfigs: scm.userRemoteConfigs
])
It's still not perfect because you have to create a new object :(
First, you cannot assume that a workflow job has a single workspace as was the case for freestyle jobs. Actually, a workflow job can use more than one workspace (one for each node or ws block).
That said, what I'm going to propose is kind of hacky: modify the scm object before checkout to set up a CleanCheckout extension (you will have to approve some method calls for this).
import hudson.plugins.git.extensions.impl.CleanCheckout
scm.extensions.replace(new CleanCheckout())
checkout scm
But I'd prefer Christopher Orr's proposal: use a shell step after checkout (sh 'git clean -fdx').
Behaviors can be added when configuring the branch source: "Clean before checkout", "Clean after checkout" and "Wipe out repository and force clone". This removes the need to add logic to the declarative / scripted pipelines.
Adding Christopher Orr's comment as an answer: just do
stage('Checkout') {
    checkout scm
    sh 'git clean -fdx'
}
Jenkins currently contains a page to generate groovy pipeline syntax. Selecting the checkout step you should be able to add all the additional options that you're used to.
I generated the following which should do what you want:
checkout poll: false, scm: [$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'CleanBeforeCheckout']], submoduleCfg: [], userRemoteConfigs: [[url: 'ssh://repo/location.git']]]
