In a multibranch pipeline job, I have configured builds (basic linting) to scan across branches for a Jenkinsfile. However, I still have to start each build manually. What property can I set to enable polling of GitHub or, even better, to trigger a build on new commits?
More generally, I'm trying to find a way to learn how all GUI fields map to keys I can use in the properties() method. I have no way to translate between a GUI form field and its script key-value option.
node('master') {
    properties([
        [$class: 'BuildDiscarderProperty', strategy: [$class: 'LogRotator', artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '10']],
        [$class: 'BuildTriggerProperty???', strategy: 'Build when a change is pushed to GitHub???']
    ])
    ...
}
Jenkins version 2.7
I'm trying to find a way to learn how all GUI fields map to keys I can use in the properties() method.
If I understood you correctly, the answer is:
Go to your Pipeline project page
Find Pipeline Syntax link in the left-side menu and follow it
Find Snippet Generator link in the left-side menu and follow it
Select properties: Set job properties from Sample Step dropdown
Choose whatever you want and click Generate Groovy
Profit =)
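For example, on a Jenkins where the corresponding trigger options are available, the generator emits something along these lines for a build discarder plus an SCM poll trigger (a sketch; the cron string is illustrative):
properties([
    buildDiscarder(logRotator(numToKeepStr: '10')),
    // poll the SCM roughly every 5 minutes; a push-based GitHub trigger
    // would appear here instead if the matching plugin is installed
    pipelineTriggers([pollSCM('H/5 * * * *')])
])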
This does not work (anymore?) as the only options are:
Build Triggers:
Build periodically
Build when another project is promoted
Build whenever a SNAPSHOT dependency is built
Monitor Docker Hub/Registry for image changes
Periodically if not otherwise run
Stash Pull Requests Builder
On a simple "pipeline" build, you can specify:
Build when a change is pushed to BitBucket
But MultiBranch doesn't have this option.
Related
In a Jenkins / git / Gerrit setup I currently maintain a bunch of "pipeline-scm" jobs: e.g. one of them gets triggered by Gerrit on an incoming change request in order to validate this change.
There is a "Gerrit Trigger" plugin for Jenkins that provides the job with a git refspec and the branch that the change relates to, among lots of other details.
When you set up a "Pipeline script from SCM" job, a job instance (build) is provided with an scm object, which can be used to check out the change and is also used by Jenkins itself to show some meta information on the build's overview page, e.g. the commits that define this change.
Now here comes my problem:
For [..] reasons I need to turn this "Pipeline script from SCM" job into a "Pipeline script" job. I.e. instead of just running checkout scm with the auto-generated scm object (which is already used on the master node to fetch the pipeline script, and most likely to gather the meta information about the change), I now have to create this object manually in my pipeline script:
scm = [
    $class: "GitSCM",
    userRemoteConfigs: [[
        credentialsId: <SOME ID>,
        url: <REPO-URL>,
    ]],
    branches: [[
        name: <BRANCH-NAME>,
    ]],
]
checkout scm
This raises two questions:
I know the Gerrit plugin provides details about a change in environment variables like GERRIT_PATCHSET_REVISION, GERRIT_REFSPEC and GERRIT_BRANCH, but how do I use them? I've seen examples which define branches using GERRIT_BRANCH, others use GERRIT_PATCHSET_REVISION, and others use a branch pattern like */master.
How does the meta information that can be extracted from a valid scm instance get to the build's overview page? I guess the pipeline-scm mechanism makes the master node populate this page automatically, but now the master node no longer has an scm object in advance. Is there a mechanism like currentBuild.updateMetaInfo(scm)?
Is there any documentation about how to properly set up and use an scm object (as Jenkins does) in a standalone pipeline job (i.e. without the Pipeline-from-SCM mechanism)?
Update:
This scm configuration works for me in terms of checkout:
scm = [
    $class: "GitSCM",
    userRemoteConfigs: [[
        credentialsId: <MY_ID>,
        url: <MY_REPO_URL>,
        refspec: env["GERRIT_REFSPEC"],
    ]],
    branches: [[name: "FETCH_HEAD"]],
]
Unfortunately, even after checkout scm, no "Changes" show up; the overview page displays no changes, and the build matrix in the job overview also states there are none.
Regarding your first question on how to use the GERRIT_ variables with the Jenkins Git SCM, check the docs:
To get the Git Plugin to download your change, set Refspec to
$GERRIT_REFSPEC and the Choosing strategy to Gerrit Trigger. This may
be under 'Additional Behaviours/Strategy For Choosing What To Build'
rather than directly visible as depicted in the screenshot. You may
also need to set 'Branches to build' to $GERRIT_BRANCH. If this does
not work for you, set Refspec to refs/changes/*:refs/changes/* and
'Branches to build' to $GERRIT_REFSPEC.
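In a pipeline checkout step, that advice translates to roughly the following (a sketch, assuming the Gerrit Trigger and Git plugins are installed; the credentials ID and repository URL are illustrative):
checkout([
    $class: 'GitSCM',
    branches: [[name: env.GERRIT_BRANCH]],
    // 'Strategy For Choosing What To Build' -> 'Gerrit Trigger'
    extensions: [[$class: 'BuildChooserSetting',
                  buildChooser: [$class: 'GerritTriggerBuildChooser']]],
    userRemoteConfigs: [[
        credentialsId: 'my-gerrit-key',                    // hypothetical credentials ID
        url: 'ssh://gerrit.example.com:29418/my/project',  // hypothetical repo URL
        refspec: env.GERRIT_REFSPEC
    ]]
])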
Regarding the second one:
The checkout step should add the information to the build status page.
Regarding your closing question:
The scm object is just a convenience object that represents the SCM you would configure in "Pipeline script from SCM", so you don't need to provide the details specified there in the Pipeline script again.
To generate a checkout step for a Pipeline without SCM, use the Snippet Generator from the Pipeline Syntax page.
Scenario:
A developer creates a PR against the master branch. Jenkins (CloudBees) goes to work building and validating that PR, and also generates a build_info.txt file as an artifact.
When the PR is merged into master I need to be able to access the artifact that was created in the PR validation step, extract the version information it contains and commit that version information into master along with PR code changes.
Problem:
I've printed out the env vars during the merge-to-master script (run_pr_merge), but I haven't seen any information (it might be there, I just don't recognize it) that would let me link back to the PR job being merged, or a way to say "give me the artifacts that were created during this PR's build and validation job".
My script looks something like this:
if (isMasterBranch()) {
    // run this when code is pushed to master
    sh "bash ./run_pr_merge.sh" // which requires build_info.txt from PR build
} else {
    // run this for each PR build
    sh "bash ./build_and_validate.sh"
    archiveArtifacts 'build_info.txt'
}
I am not super familiar with Jenkins/CloudBees, so there might be a better way of structuring the pipeline to achieve what I need, but I'm hoping there's a relatively easy way to get hold of the info for the PR being merged into master.
I have looked at copyArtifacts, but again I'm not sure how to reference the PR being merged. Any help greatly appreciated.
There's a CopyArtifact plugin for Jenkins that can be used to copy artifacts from another job. It supports both "Freestyle" jobs and pipelines. You need to provide the job URL and, optionally, a build number (by default it will use the last successful build). You can craft the URL of the job provided you have the PR number.
In our environment, the URL for a specific job looks something like:
https://jenkins.mycompany.com/job/mycompany/job/myproj/job/PR-<number>/<build_no>/
Provided I have the PR number, I can build the needed URL with e.g.
def pr_number = 12345
def job_url = "https://jenkins.mycompany.com/job/mycompany/job/myproj/job/PR-${pr_number}/"
step([
    $class: 'CopyArtifact',
    filter: 'build_info.txt',
    fingerprintArtifacts: true,
    optional: true,
    projectName: job_url
    // default selector is "last successful build"
    // selector: [$class: 'SpecificBuildSelector',
    //            buildNumber: build_number]
])
// check the file is there
sh "cat ${WORKSPACE}/build_info.txt"
I want to list my branches as a parameter in Jenkins. It's possible in a freestyle job (using the Git Parameter plugin), but I don't know how to make it work inside a pipeline.
The plugin docs say they have added pipeline support, but there isn't an example anywhere.
For a declarative Pipeline, you can add a git Parameter like this:
pipeline {
    agent any
    parameters {
        gitParameter(
            branch: '',
            branchFilter: ".*",
            defaultValue: "",
            description: '',
            listSize: '10',
            name: 'Version',
            quickFilterEnabled: false,
            selectedValue: 'NONE',
            sortMode: 'ASCENDING_SMART',
            tagFilter: "*",
            type: 'PT_BRANCH_TAG',
            useRepository: 'git@github.com:foo/bar.git')
    }
    stages {
        stage("echo Git Tag") {
            steps {
                echo "${params.Version}"
            }
        }
    }
}
The example above will show you all branches and tags available on the repo.
If you want to display only tags, change the type to
type: 'PT_TAG'
If you only want to show specific tags, you can filter; for example, to show only tags that start with "foo":
tagFilter: "foo*"
If you want to see more details, just check out the Pipeline Syntax Generator. You will find it at:
Sample Step -> properties -> This project is parameterised -> add Parameter -> git Parameter
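For a scripted Pipeline, the generator wraps the same parameter in properties(), along these lines (a sketch; the repository URL is illustrative):
properties([
    parameters([
        gitParameter(name: 'Version',
                     type: 'PT_BRANCH_TAG',
                     defaultValue: 'master',
                     selectedValue: 'NONE',
                     sortMode: 'ASCENDING_SMART',
                     useRepository: 'git@github.com:foo/bar.git')
    ])
])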
I advise you to go through the multibranch pipeline plugin.
Let's say you have more than one branch available in Git. Creating a multibranch pipeline job allows you to distinguish and run branch-based Jenkins jobs under a single project.
Apart from Git, it also supports Bitbucket, GitHub, Subversion, Mercurial, and single repository & branch.
I am creating a Jenkins job using Groovy pipeline scripts (I am new at this). I am stuck at a place where I want to trigger another job with some build options set.
Basically, without a Groovy pipeline script, I can do the above (as shown in the picture) using the Parameterized Trigger Plugin, which provides me useful variables like ${TRIGGERED_BUILD_NUMBER_<job name>} (as shown in the picture, I am triggering a job named Another-Job), and I can also set options like "Block until the triggered projects finish their builds" and the options below them (as shown in the picture).
I actually don't know how to do this using a pipeline script. Can someone help me with this or point me to the appropriate documentation?
Thanks in advance!
You can use the build step, which does exactly that:
build job: 'Another-Job', parameters: [
    [$class: 'StringParameterValue', name: 'operation', value: "${OPERATION}"],
    [$class: 'StringParameterValue', name: 'beanstalk_application_version', value: "${TRIGGERED_BUILD_NUMBER_Another_Job}-${GIT_COMMIT}"]]
Two things worth noting:
The "block until triggered project is finished" behaviour is the default for this build step, and the step also propagates any downstream error by default. You can use the propagate and wait parameters if you want to deactivate this default behaviour (see the sketch after these notes).
Environment variables and Groovy-defined variables are all available with the same notation as they would have been in your freestyle triggering job. Just make sure you use double quotes and not single quotes around your variables, otherwise the variables won't be interpolated when triggering downstream jobs.
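For instance, a sketch of those two switches:
// Trigger and wait, but don't fail this build if the downstream job fails:
def downstream = build job: 'Another-Job', propagate: false
echo "Downstream finished with result: ${downstream.result}"
// Fire-and-forget: trigger without waiting for the downstream job at all:
build job: 'Another-Job', wait: false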
To build a job with default settings simply write:
build 'Another-Job'
To build a job with parameters:
build job: 'Another-Job', parameters: [string(name: 'some-param-name', value: 'some-param-default-value')]
In general, to write pipeline code I suggest you work closely with the pipeline-syntax documentation provided by any running Jenkins at:
http://my-jenkins-url/job/my-job-name/pipeline-syntax/
I have a Git repository with code I'd like to build but I'm not "allowed" to add a Jenkinsfile in its root (it is a Debian package so I can't add files to upstream source). Is there a way to store the Jenkinsfile in one repository and have it build code from another repository? Since my code repository has several branches to build (one for each Debian release) this should be a multibranch pipeline. Commits in either the code or Jenkinsfile repositories should trigger a build.
Bonus complexity: I have several code/packaging repositories like this and I'd like to reuse the same Jenkinsfile for all of them. Thus it should somehow dynamically fetch the right Git URL to use. The branches to build have the same names across all repositories.
The short answer is: you cannot do that with a multibranch pipeline. Multibranch pipelines are only designed (at least for now) to execute a specific pipeline in "Pipeline script from SCM" style, with a fixed Jenkinsfile at the root of the project.
You can however use the Multi-Branch Project plugin made for multibranch freestyle projects. First, you need to define your multibranch freestyle configuration just like you would with a multibranch pipeline configuration.
Select this new item type when creating the job.
This type of configuration will behave exactly the same as the multibranch pipeline type, i.e. it will create a folder with the name of your configuration and a sub-project for each branch it automatically detects.
The implementation should then be a piece of cake:
Specify your SCM repository in the multibranch configuration
Call another build as part of your build/post-build steps as you would do in a standard freestyle project, except that you have to call a parameterized job (let's call it build-job) and give it your repository information, i.e. Git URL and current branch (you can use the pre-defined variables $GIT_URL and $GIT_BRANCH for this purpose; a sketch of the matching parameter declarations follows the example below)
In your build-job, just define either an inline pipeline or a pipeline script checked out from SCM, and inside this script do an SCM checkout and go on with the steps you need to build. Example of build-job pipeline content:
node() {
    stage 'Checkout'
    checkout scm: [$class: 'GitSCM', branches: [[name: "*/${GIT_BRANCH}"]], userRemoteConfigs: [[url: "${GIT_URL}"]]]
    stage 'Build'
    // Build steps...
}
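And build-job itself would declare the parameters it receives, e.g. (a sketch; the names just have to match what the multibranch job passes in):
properties([
    parameters([
        string(name: 'GIT_URL', description: 'Repository to check out'),
        string(name: 'GIT_BRANCH', description: 'Branch to build')
    ])
])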
Of course, if your different multibranch projects need to be treated a bit differently, you could also use intermediate projects (let's say build-project-A, build-project-B, ...) that would in turn call the generic build-job pipeline.
The one major drawback of this solution is that you will only have one job responsible for all of your builds, making it harder to debug. You would still have your multibranch projects going blue/red in case of success/error, but you will have to go back to the called build-job to find the real cause of a failure.
The best way I have found is to use the Remote Jenkinsfile Provider plugin. https://plugins.jenkins.io/remote-file/
This will add an option "by Remote Jenkinsfile Provider plugin" under Build Configuration > Mode; then you can point to another repo where the Jenkinsfile is. I find this to be a much better solution than the Pipeline Multibranch Defaults Plugin, which makes you store the Jenkinsfile in Jenkins itself rather than in source control.
You can make use of this plugin:
https://github.com/jenkinsci/pipeline-multibranch-defaults-plugin/blob/master/README.md
With it, you configure the Jenkinsfile in Jenkins itself rather than having it on each branch of your repo.
I have version 2.121, and you can do this in two ways:
Way 1
In the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and put in "Marker File" the name of a file you will use to identify branches that you want builds for.
Then, below that, in Pipeline > Definition, select "Pipeline Script from SCM" and enter the SCM information for how to find the Jenkinsfile that holds the script you want to run. It can be in the same repo you are finding branches in to create the jobs (if you put in the same GitHub repo's info), but I can't find a way to indicate that it should just use the same branch for the file.
Way 2
Same as above: in the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and put in "Marker File" the name of a file you will use to identify branches that you want builds for.
Then, below that, in Pipeline > Definition, select "Pipeline Script" and put a bit of Groovy in the text box to load whatever you want or to run some script that already got loaded into the workspace.
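For Way 2, the inline Groovy could look something like this (a sketch; the repository URL and script file name are hypothetical):
node {
    // Check out the repository that holds the shared pipeline code
    git url: 'git@github.com:example/shared-pipelines.git'  // hypothetical repo
    // Load the shared script (hypothetical file; it must end with "return this")
    def shared = load 'build.groovy'
    shared.run()  // assumes the loaded script defines a run() method
}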
In my case, I have a scenario with a GitLab project based on Gradle which depends on another GitLab project, also based on Gradle (same dashboard, but different commits, different developers).
I have added the following lines into my Jenkinsfile (the one that depends on the other project):
stage('Build') {
    steps {
        git branch: 'dev', credentialsId: 'jenkins-generated-ssh-key', url: 'git@gitlab.project.com:root/coreProject.git'
        sh './gradlew clean'
    }
}
Note: Be aware of the order of the statements.
If you have doubts about how to create the jenkins-generated-ssh-key credential, please ask me.