I have one git repository containing multiple folders, and I want the Jenkins pipeline to trigger only when a specific folder changes.
How can I achieve this?
You can configure your Jenkinsfile to do that.
Take a look at the Jenkins built-in conditions; you need "changeset" here.
e.g.
stages {
    stage('yourStage') {
        when { changeset "FOLDERNAME/*" }
        steps {
            //.... what to do...
        }
    }
}
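For reference, here is the same condition in a complete minimal Jenkinsfile (FOLDERNAME is a placeholder; a "FOLDERNAME/**" pattern also catches changes in nested subdirectories):
pipeline {
    agent any
    stages {
        stage('yourStage') {
            // run this stage only when the built change touches files under FOLDERNAME
            when { changeset "FOLDERNAME/**" }
            steps {
                echo 'folder changed, do the real work here'
            }
        }
    }
}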
Related
I have a multibranch pipeline with the following behaviors:
And the following Jenkinsfile:
pipeline {
    agent {
        label 'apple'
    }
    stages {
        stage('Lint') {
            when {
                changeRequest()
            }
            steps {
                sh 'fastlane lint'
            }
        }
    }
    post {
        success {
            reportSuccess()
        }
        failure {
            reportFailure()
        }
    }
}
I use a slave to run the actual build, but the master still needs to check out the code to get the Jenkinsfile. For that, it seems to use the same behaviors as those defined in the job, even though it really only needs the Jenkinsfile.
My problem is that I want to discover pull requests by merging the pull request with the current target branch revision, but when there is a merge conflict the build will fail before the Jenkinsfile is executed. This prevents any kind of reporting done in post steps.
Is there a way to have the initial checkout not merge the target branch, but still have it merged when actually running the Jenkinsfile on a slave?
You may want to check out using the "Current Pull Request revision" strategy, and then issue a git merge command yourself once the build is running.
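A rough sketch of that idea, assuming a multibranch job where the standard CHANGE_TARGET environment variable holds the PR's target branch (reportSuccess/reportFailure are the shared-library steps from the question):
pipeline {
    agent { label 'apple' }
    stages {
        stage('Merge target') {
            when { changeRequest() }
            steps {
                // merge the target branch explicitly, now that the Jenkinsfile is already loaded
                sh "git fetch origin ${env.CHANGE_TARGET}"
                sh "git merge origin/${env.CHANGE_TARGET}"
            }
        }
        stage('Lint') {
            when { changeRequest() }
            steps {
                sh 'fastlane lint'
            }
        }
    }
    post {
        success { reportSuccess() }
        failure { reportFailure() } // now also reached when the merge itself fails
    }
}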
I have a Jenkins pipeline job which bootstraps itself. The idea here is to self-test changes, including changes to the Jenkins pipeline Groovy scripts, which are stored in git.
I was wondering how to bootstrap in a separate workspace from the build job. For example, if I have a job called "build_linux", Jenkins currently creates a "workspace/build_linux" folder, and everything in there both bootstraps and builds.
The issue here is that my build_repository.git is very large, and I don't want to check it out twice, once at the bootstrap stage and once at the build stage. However, if I don't check both repositories out in both stages, git complains that there is code in the workspace which shouldn't be there, and either fails or tries to clean it up (which is inefficient). (In actuality I'm syncing a manifest with both git repositories using Google's 'repo' scripts.)
Ideally I'd just have both "workspace/bootstrap/" and "workspace/linux_build/" job folders.
For example, my pipeline looks like this:
bootstrap.gvy:
--> entry point from my Jenkins config
stage('Bootstrap') {
    node("$BOOTSTRAP_NODE") {
        git checkout bootstrap_repository.git
        git checkout build_repository.git // I don't want to do this here
        pipeline = load "build/build.gvy"
    }
    pipeline.main(env.JOB_NAME) // which in this case is "build_linux"
}
build/build.gvy:
int main(String jobName) {
    switch (jobName) {
        case "build_linux":
            doBuildLinux()
            break
    }
}

def doBuildLinux() {
    stage("Build") {
        node("$BUILD_NODE") {
            git checkout bootstrap_repository.git // I don't want to do this here
            git checkout build_repository.git
        }
    }
}
Any suggestions for how to accomplish this? Perhaps I should use a different workspace for bootstrapping?
Thanks
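One way to sketch the separate-workspace idea in scripted pipeline is the ws step, which allocates a different workspace directory on the node; a rough sketch of the Bootstrap stage from the question (the repository URL is a placeholder):
stage('Bootstrap') {
    node("$BOOTSTRAP_NODE") {
        ws('bootstrap') {
            // the checkout and the load now happen in a 'bootstrap' workspace,
            // leaving the job's default workspace untouched for the build stage
            git url: 'ssh://server/bootstrap_repository.git' // placeholder URL
            pipeline = load 'build/build.gvy'
        }
    }
    pipeline.main(env.JOB_NAME)
}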
I'm trying to find a way to pass a configuration for a Multibranch pipeline job into the jenkinsfile when it's executing.
My goal is to configure something like the following:
Branch : Server
"master" : "prodServer"
"develop" : "devServer"
"release/*", "hotfix/*" : "stagingServer"
"feature/Thing-I-Want-To-Change-Regularly" : "testingServer"
where I can then write a Jenkinsfile like this:
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                //branch is in config branches
            }
            steps {
                //deploy to server
            }
        }
    }
}
I'm having trouble finding a way to achieve this. The EnvInject Plugin seems to be the solution for non-Pipeline projects, but it currently has security issues and only partial Pipeline support.
If you want to deploy to different servers depending on the branch, in Multibranch Pipelines you can use:
when { branch 'master' } (declarative)
or
${env.BRANCH_NAME} (scripted)
to access which branch you are on, and then add logic to deploy to the corresponding servers based on this.
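Building on that, one way to encode the branch-to-server table from the question is a small helper function in the Jenkinsfile (the server names and patterns are the question's examples; serverFor is a hypothetical name):
def serverFor(branch) {
    // hypothetical helper encoding the branch-to-server table
    if (branch == 'master') { return 'prodServer' }
    if (branch == 'develop') { return 'devServer' }
    if (branch ==~ '(release|hotfix)/.+') { return 'stagingServer' }
    if (branch == 'feature/Thing-I-Want-To-Change-Regularly') { return 'testingServer' }
    return null
}

pipeline {
    agent any
    stages {
        stage('Example Deploy') {
            when { expression { serverFor(env.BRANCH_NAME) != null } }
            steps {
                echo "deploying to ${serverFor(env.BRANCH_NAME)}"
            }
        }
    }
}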
Going to post my current best approach to a global config value and hope something better comes along.
In Manage Jenkins -> Configure System -> Global Properties you can define global environment variables which can be accessed from Jenkins jobs. A MY_BRANCH variable defined there could be accessed from a pipeline:
when { branch MY_BRANCH }
It could even hold a regex and be used like this:
when { expression { BRANCH_NAME ==~ MY_BRANCH } }
However, this has the disadvantage that the environment variables are shared between every Jenkins job, not just across all branches of a single job, so careful naming will be necessary.
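With that caveat, a minimal end-to-end sketch (MY_BRANCH is the hypothetical global environment variable described above, holding e.g. the regex release/.*):
pipeline {
    agent any
    stages {
        stage('Deploy') {
            // MY_BRANCH comes from Manage Jenkins -> Configure System -> Global Properties
            when { expression { env.BRANCH_NAME ==~ env.MY_BRANCH } }
            steps {
                echo "deploying ${env.BRANCH_NAME}"
            }
        }
    }
}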
I have a git monorepo with different apps. Currently I have a single Jenkinsfile in the root folder that contains the pipeline for all apps. It is very time-consuming to execute the full pipeline for all apps when a commit changes only one app.
We use a GitFlow-like approach to branching, so Multibranch Pipeline jobs in Jenkins are a perfect fit for our project.
I'm looking for a way to have several jobs in Jenkins, each one triggered only when the code of the appropriate application is changed.
The perfect solution for me looks like this:
I have several Multibranch Pipeline jobs in Jenkins. Each one looks for changes only to a given directory and its subdirectories, and each one uses its own Jenkinsfile. Jobs poll git every X minutes and, if there are changes to the appropriate directories in existing branches, initiate a build; if there are new branches with changes to the appropriate directories, they also initiate a build.
What stops me from this implementation:
I'm missing a way to define which folders' commits must be ignored during scan execution by the Multibranch pipeline. The "Additional Behaviours" section of a Multibranch pipeline doesn't have the "Polling ignores commits in certain paths" option that Pipeline or Freestyle jobs have, but I want to use a Multibranch pipeline.
The solution described here doesn't work for me: if there is a new branch with changes only to "project1", then whenever the Multibranch pipeline for "project2" is triggered it will discover this new branch anyway and build it. This means every one of my Multibranch pipelines will be executed at least once for every new branch, no matter whether the appropriate code was changed or not.
I'd appreciate any help or suggestions on how I can implement a few Multibranch pipelines watching the same git repository but triggered only when the appropriate pieces of code change.
This can be accomplished by using the Multibranch build strategy extension plugin. With this plugin, you can define a rule so that a build is initiated only when the changes belong to a given sub-directory:
Install the plugin
On the Multibranch pipeline configuration, add a Build strategy
Select Build included regions strategy
Put a sub-folder on the field, such as subfolder/**
This way the changes will still be discovered, but they won't initiate a build unless they belong to a certain set of files or folders.
This is the best approach I'm aware of so far, though ideally the changes wouldn't even be discovered in the first place.
Edit: Gerrit Code Review plugin configuration
In case you're using the Gerrit Code Review plugin, you can also prevent new changes from being discovered by using a custom query:
I solved this by creating a project that builds other projects depending on the files changed. For example, from your repo root:
/Jenkinsfile
#!/usr/bin/env groovy

pipeline {
    agent any
    options {
        timestamps()
    }
    triggers {
        bitbucketPush()
    }
    stages {
        stage('Build project A') {
            when {
                changeset "project-a/**"
            }
            steps {
                build 'project-a'
            }
        }
        stage('Build project B') {
            when {
                changeset "project-b/**"
            }
            steps {
                build 'project-b'
            }
        }
    }
}
You would then have other Pipeline projects with their own Jenkinsfile (i.e., project-a/Jenkinsfile).
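Each child project's Jenkinsfile can then stay focused on its own app; a minimal hypothetical project-a/Jenkinsfile (the build command is a placeholder):
#!/usr/bin/env groovy

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // placeholder build command for project A
                sh 'make build'
            }
        }
    }
}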
I know that this post is quite old, but I solved this problem by changing the "include branches" parameter for SVN repositories (for git repos this can possibly also be done with the "Filter by name (with wildcards)" property). Instead of supplying only the actual branch name, I also included the subfolder: instead of supplying "trunk", I used "trunk/subfolder". This limits scanning to only that specific directory. Note that I have not yet fully tested this solution.
Jenkins declarative pipelines offer a post directive to execute code after the stages have finished. Is there a similar way to run code before the stages run, and most importantly, before the SCM checkout?
For example something along the lines of:
pre {
    always {
        rm -rf ./*
    }
}
This would then clean the workspace of my build before the source code is checked out.
pre is a cool feature idea, but doesn't exist yet. skipDefaultCheckout and checkout scm (which is the same as the default checkout) are the keys:
pipeline {
    agent { label 'docker' }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('clean_workspace_and_checkout_source') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        stage('build') {
            steps {
                echo 'i build therefore i am'
            }
        }
    }
}
For the moment there are no pre-build steps, but for your purpose this can be done in the pipeline job configuration, and also in multibranch pipeline jobs: where you define the location of your Jenkinsfile, choose Additional Behaviours -> Wipe out repository & force clone.
Delete the contents of the workspace before building, ensuring a fully fresh workspace.
If you do not really want to delete everything and want to save some network usage, you can use this other option instead: Additional Behaviours -> Clean before checkout.
Clean up the workspace before every checkout by deleting all untracked files and directories, including those which are specified in .gitignore. It also resets all tracked files to their versioned state. This ensures that the workspace is in the same state as if you cloned and checked out in a brand-new empty directory, and ensures that your build is not affected by the files generated by the previous build.
This one will not delete the workspace; it just resets the repository to its original state and pulls new changes, if there are any.
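The same clean-before-checkout behaviour can also be requested from inside a Jenkinsfile rather than the job configuration; a sketch for a multibranch job, where it would replace a plain checkout scm step (this relies on the Git plugin's CleanBeforeCheckout extension and reuses the branches and remotes of the implied scm object):
checkout([$class: 'GitSCM',
          branches: scm.branches,
          userRemoteConfigs: scm.userRemoteConfigs,
          extensions: [[$class: 'CleanBeforeCheckout']]])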
I use "Prepare an environment for the run / Script Content"