I am running my builds on Kubernetes agents, so Poll SCM won't work, since the agents are removed once the build completes.
I am using Plastic SCM, and I found that there is an option called "Poll on controller" for it in the README.
I am using the cm step in my Jenkinsfile to obtain the SCM environment variables.
For example:
scmvars = cm(
    branch: "${Branch_Name}",
    changeset: "${Change_Set}",
    repository: "${Repo_Name}",
    server: "${Plastic_Server_Url}",
    cleanup: 'DELETE'
)
Later I print the variables:
println "${scmvars.PLASTICSCM_BRANCH} ${scmvars.PLASTICSCM_CHANGESET_ID}"
But I didn't see the "Poll on controller" option in the pipeline syntax or in the documentation.
Consider the following setup using Jenkins 2.176.1:
A new pipeline project named Foobar
Poll SCM as the (only) build trigger, with H/5 * * * *, under the assumption that this refers to the SCM configured in the next step
Pipeline script from SCM with SCM Git and a working Git repository URL
Uncheck Lightweight checkout because of JENKINS-42971 and JENKINS-48431 (I am using build variables in the real project and Jenkinsfile; also this may affect how pollSCM works, so I include this step here)
Said repository contains a simple Jenkinsfile
The Jenkinsfile looks approximately like this:
#!groovy
pipeline {
    agent any
    triggers { pollSCM 'H/5 * * * *' }
    stages {
        stage('Source checkout') {
            steps {
                checkout(
                    [
                        $class: 'GitSCM',
                        branches: [],
                        browser: [],
                        doGenerateSubmoduleConfigurations: false,
                        extensions: [],
                        submoduleCfg: [],
                        userRemoteConfigs: [
                            [
                                url: 'git://server/project.git'
                            ]
                        ]
                    ]
                )
                stash 'source'
            }
        }
        stage('OS-specific binaries') {
            parallel {
                stage('Linux') {
                    agent { label 'gcc && linux' }
                    steps {
                        unstash 'source'
                        echo 'Pretending to do a build here'
                    }
                }
                stage('Windows') {
                    agent { label 'windows' }
                    steps {
                        unstash 'source'
                        echo 'Pretending to do a build here'
                    }
                }
            }
        }
    }
}
My understanding so far was that:
a change to the Jenkinsfile (not the whole repo) triggers the pipeline on any registered agent (or as configured in the pipeline project).
said agent (which is random) uses the pollSCM trigger in the Jenkinsfile to trigger the pipeline stages.
But where does the pollSCM trigger poll (what SCM repo)? And if it's a random agent then how can it reasonably detect changes across poll runs?
then the stages are executed on the agents as allocated ...
Now I am confused about what refers to what. So here are my questions (all interrelated, which is why I keep them together in one question):
Does the pipeline project poll the SCM just for changes to the Jenkinsfile, or for any changes? In my case the repository is the same (for the Jenkinsfile and the source files to build binaries from).
If the (project-level) polling triggers on any change rather than only on changes to the Jenkinsfile:
Does the pollSCM trigger in the Jenkinsfile somehow automagically refer to the checkout step?
Then ... what would happen if I had multiple checkout steps with differing settings?
What determines what repository (and what contents inside of that) gets polled?
... or is this akin to the checkout scm shorthand, i.e. pollSCM actually refers to the SCM configured in the pipeline project, and so I can shorten the checkout() to checkout scm in the steps?
Unfortunately the user handbook didn't answer any of those questions, and pollSCM has a total of four occurrences, all on a single page, within the entire handbook.
I'll take a crack at this one:
Does the pipeline project poll the SCM just for changes to the
Jenkinsfile, or for any changes? In my case the repository is the same
(for the Jenkinsfile and the source files to build binaries from).
The pipeline project will poll the repo for ANY file changes, not just the Jenkinsfile. A Jenkinsfile in the source repo is common practice.
If the (project-level) polling triggers on any change rather than
only on changes to the Jenkinsfile: does the pollSCM trigger in the
Jenkinsfile somehow automagically refer to the checkout step?
Your pipeline will be executed when a change to the repo is seen, and the steps are run in the order that they appear in your Jenkinsfile.
Then ... what would happen if I had multiple checkout steps with
differing settings?
If you defined multiple repos with the checkout step (using multiple checkout calls), then the main pipeline project repo would be polled for any changes, and the repos you define in the pipeline would be checked out regardless of whether they changed or not.
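For illustration, a minimal sketch of a stage with two checkout calls; the second repository URL (tools.git) and the target directory are hypothetical:

stage('Source checkout') {
    steps {
        // the project's main repository (the one the project-level trigger polls)
        checkout([$class: 'GitSCM',
                  userRemoteConfigs: [[url: 'git://server/project.git']]])
        // an additional, hypothetical repository, checked out into a
        // subdirectory on every run whether or not it changed
        dir('tools') {
            checkout([$class: 'GitSCM',
                      userRemoteConfigs: [[url: 'git://server/tools.git']]])
        }
    }
}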
What determines what repository (and what contents inside of that)
gets polled? ... or is this akin to the checkout scm shorthand, i.e.
pollSCM actually refers to the SCM configured in the pipeline project,
and so I can shorten the checkout() to checkout scm in the steps?
pollSCM refers to the pipeline project's repo. The entire repo is cloned unless the project is otherwise configured (shallow clone, lightweight checkout, etc.).
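And yes, when the project-level SCM and the repository checked out in the pipeline are the same (as in the setup above), the explicit checkout() block can be shortened to the checkout scm step. A minimal sketch of the first stage under that assumption:

stage('Source checkout') {
    steps {
        // checks out whatever SCM is configured in the pipeline project;
        // this also populates the changelog and initializes polling
        checkout scm
        stash 'source'
    }
}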
The trigger defined as pollSCM polls the source control management (SCM) system, i.e. the repository and branch in which this Jenkinsfile itself (and the other code) is located.
For Pipelines which are integrated with a source such as GitHub or BitBucket, triggers may not be necessary as webhooks-based integration will likely already be present. The triggers currently available are cron, pollSCM and upstream.
It also works as a trigger to execute a multibranch pipeline.
When Jenkins polls the SCM (exactly this repository and branch) and detects a change (i.e. a new commit), then this pipeline (defined in the Jenkinsfile) is executed.
Usually the SCM step checkout is then executed, so that the specified project(s) can be built, tested and deployed.
See also:
SCM Poll in jenkins multibranch pipeline
ShellHacks (2020): Jenkins: Scan Multibranch Pipeline Without Build
For one of my projects on GitHub, I wanted to build it as a Docker image and push it to my Docker Hub. The project is an sbt one with a Scala codebase.
Here is how my Jenkinsfile is defined:
#!groovy
node {
    // set this in the Jenkins server under Manage Jenkins > Credentials > System > Global Credentials
    docker.withRegistry('https://hub.docker.com/', 'joesan-docker-hub-credentials') {
        git credentialsId: '630bd271-01e7-48c3-bc5f-5df059c1abb8', url: 'https://github.com/joesan/monix-samples.git'
        sh "git rev-parse HEAD > .git/commit-id"
        def commit_id = readFile('.git/commit-id').trim()
        println commit_id
        def app
        stage('build') {
            app = docker.build 'Monix-Sample'
        }
        stage('publish') {
            app.push 'master'
            app.push "${commit_id}"
        }
    }
}
When I try to run this from my Jenkins server, I get the following error:
java.io.FileNotFoundException
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:167)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:159)
at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:161)
at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:65)
at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:157)
at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:159)
at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:101)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:59)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:232)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Finished: FAILURE
Since this is running inside a VM on Azure, I thought the VM was not able to reach outside, but that seems not to be the case as I was able to ssh into the VM and git pull from the Git repo. So what is the problem here? How could I make this work?
For me, unchecking "Lightweight checkout" fixed the issue.
I experienced the exact same error. My setup:
Pipeline build inside a dockerized Jenkins (version 2.32.3)
In the configuration of the job, I specified a checkout into a subdirectory: Open the configuration, e.g. https://myJenkins/job/my-job/configure. At the bottom, see the section Pipeline -> Additional Behaviours -> Check out into a sub-directory, with Local subdirectory for repo set to, e.g., my-sub-dir.
Expectation: Upon checkout, the Jenkinsfile ends up in my-sub-dir/Jenkinsfile.
Via the option Script Path, you configure the location of the Jenkinsfile so that Jenkins can start the build. I put my-sub-dir/Jenkinsfile as the value.
I then received the exception you pasted in your question. I fixed it by setting Script Path to Jenkinsfile. If you don't specify a subdirectory for checkout, still try double-checking the value of Script Path.
Note: I have another Jenkins instance at work. There I have to specify Script Path including the custom check out sub-directory (as mentioned in Expectation above).
Go to Job --> Configure --> Pipeline and uncheck the "Lightweight checkout" checkbox.
Lightweight checkout: if selected, try to obtain the Pipeline script contents directly from the SCM without performing a full checkout. The advantage of this mode is its efficiency; however, you will not get any changelogs or polling based on the SCM. (If you use checkout scm during the build, this will populate the changelog and initialize polling.) Also build parameters will not be substituted into SCM configuration in this mode. Only selected SCM plugins support this mode.
I have a Git repository with code I'd like to build but I'm not "allowed" to add a Jenkinsfile in its root (it is a Debian package so I can't add files to upstream source). Is there a way to store the Jenkinsfile in one repository and have it build code from another repository? Since my code repository has several branches to build (one for each Debian release) this should be a multibranch pipeline. Commits in either the code or Jenkinsfile repositories should trigger a build.
Bonus complexity: I have several code/packaging repositories like this and I'd like to reuse the same Jenkinsfile for all of them. Thus it should somehow dynamically fetch the right Git URL to use. The branches to build have the same names across all repositories.
Short answer: you cannot do that with a multibranch pipeline. Multibranch pipelines are designed (at least for now) only to execute a specific pipeline in Pipeline script from SCM style, with a fixed Jenkinsfile at the root of the project.
You can however use the Multi-Branch Project plugin made for multibranch freestyle projects. First, you need to define your multibranch freestyle configuration just like you would with a multibranch pipeline configuration.
Select this new item type when creating the job.
This type of configuration behaves exactly the same as the multibranch pipeline type, i.e. it creates a folder with the name of your configuration and a sub-project for each branch it automatically detects.
The implementation should then be a piece of cake:
Specify your SCM repository in the multibranch configuration
Call another build as part of your build/post-build as you would do in a standard freestyle project, except that you have to call a parameterized job (let's call it build-job) and give it your repository information, i.e. Git URL and current branch (you can use the pre-defined variables $GIT_URL and $GIT_BRANCH for this purpose)
In your build-job, just define either an inline pipeline or a pipeline script checked out from SCM, and inside this script do an SCM checkout and go on with the steps you need to build. Example of build-job pipeline content:
node() {
    stage 'Checkout'
    // GIT_URL and GIT_BRANCH are the parameters passed in by the calling job
    checkout scm: [$class: 'GitSCM',
                   branches: [[name: "*/${params.GIT_BRANCH}"]],
                   userRemoteConfigs: [[url: params.GIT_URL]]]
    stage 'Build'
    // Build steps...
}
Of course, if your different multibranch projects need to be treated a bit differently, you could also use intermediate projects (say build-project-A, build-project-B, ...) that would in turn call the generic build-job pipeline.
The one major drawback of this solution is that you will have only one job responsible for all of your builds, making it harder to debug. You would still have your multibranch projects going blue/red in case of success/error, but you will have to go back to the called build-job to find the real cause of a failure.
The best way I have found is to use the Remote Jenkinsfile Provider plugin. https://plugins.jenkins.io/remote-file/
This adds an option "by Remote Jenkinsfile Provider plugin" under Build Configuration > Mode; you can then point to another repo where the Jenkinsfile is. I find this to be a much better solution than the Pipeline Multibranch Defaults Plugin, which makes you store the Jenkinsfile in Jenkins itself rather than in source control.
You can make use of this plugin:
https://github.com/jenkinsci/pipeline-multibranch-defaults-plugin/blob/master/README.md
With it, you configure the Jenkinsfile in Jenkins rather than having it on each branch of your repo.
I have version 2.121, and you can do this in two ways:
Way 1
In the multibranch pipeline configuration, under Build Configuration > Mode, select "Custom Script" and, in "Marker File" below, put the name of a file you will use to identify branches that you want builds for.
Then, below that, under Pipeline > Definition, select "Pipeline script from SCM" and enter the SCM information for how to find the Jenkinsfile that holds the script you want to run. It can be in the same repo you are finding branches in to create the jobs (if you put in the same GitHub repo's info), but I can't find a way to indicate that you just use the same branch for the file.
Way 2
Same as above: in the multibranch pipeline configuration, under Build Configuration > Mode, select "Custom Script" and, in "Marker File" below, put the name of a file you will use to identify branches that you want builds for.
Then, below that, under Pipeline > Definition, select "Pipeline script" and put a bit of Groovy in the text box to load whatever you want or to run some script that has already been loaded into the workspace.
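For example, the inline script for Way 2 could be as small as the following sketch. It assumes that checkout scm resolves to the discovered branch in this mode and that the repo contains a file named build.groovy that defines a run() method and ends with return this (the file and method names are hypothetical):

node {
    // check out the branch this sub-project was created for
    checkout scm
    // load a Groovy script from the workspace and run it
    def build = load 'build.groovy'
    build.run()
}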
In my case, I have a scenario with a GitLab project based on Gradle which has dependencies on another GitLab project, also based on Gradle (same dashboard, but different commits, different developers).
I have added the following lines to my Jenkinsfile (the one of the dependent project):
stage('Build') {
    steps {
        git branch: 'dev', credentialsId: 'jenkins-generated-ssh-key', url: 'git@gitlab.project.com:root/coreProject.git'
        sh './gradlew clean'
    }
}
Note: Be aware of the order of the statements.
If you have doubts about how to create jenkins-generated-ssh-key, please ask me.
I am configuring a post-commit hook and would like to be able to trigger a build on the branch that has been committed to.
So far I have set up the post-commit hook file:
curl http://jenkins.local:8080/git/notifyCommit?url=GITHUB_URL/REPO_NAME.git
Within Jenkins I have set:
This build is parameterized
**String Parameter**
Name: branch
Branches to build: $branch
How can I get Jenkins to build the branch that has just been committed?
See http://<Your Jenkins>/job/<Your job's name>/api/:
Perform a build
If the build has parameters, post to this URL [http://<Your Jenkins>/job/<Your job's name>/buildWithParameters] and provide the parameters as form data.
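As a sketch only (the job name my-job and the user:apiToken credentials are placeholders to adapt; the parameter name branch matches the one configured above), the post-commit hook could post the branch as form data like this:

#!/bin/sh
# post-commit hook: determine the branch that was just committed to
BRANCH=$(git rev-parse --abbrev-ref HEAD)
# trigger the parameterized job, passing the branch as a form parameter
curl -X POST "http://jenkins.local:8080/job/my-job/buildWithParameters" \
     --user user:apiToken \
     --data-urlencode "branch=${BRANCH}"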
I need to know which branch is being built in my Jenkins multibranch pipeline in order for it to run steps correctly.
We are using a gitflow pattern with dev, release, and master branches that all are used to create artifacts. The dev branch auto deploys, the other two do not. Also there are feature, bugfix and hotfix branches. These branches should be built, but not produce an artifact. They should just be used to inform the developer if there is a problem with their code.
In a standard build, I have access to the $GIT_BRANCH variable to know which branch is being built, but that variable isn't set in my multibranch pipeline. I have tried env.GIT_BRANCH too, and I tried to pass $GIT_BRANCH as a parameter to the build. Nothing seems to work. I assumed that, since the build knows about the branch being built (I can see the branch name at the top of the console output), there is something that I can use - I just can't find any reference to it.
The env.BRANCH_NAME variable contains the branch name.
As of Pipeline Groovy Plugin 2.18, you can also just use BRANCH_NAME (env isn't required but is still accepted).
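For the branch-based gating described in the question, a minimal declarative sketch using that variable (the stage contents are illustrative only):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "Building ${env.BRANCH_NAME}"
            }
        }
        stage('Deploy') {
            // only the dev branch auto-deploys; feature/bugfix/hotfix branches just build
            when { branch 'dev' }
            steps {
                echo "Deploying ${env.BRANCH_NAME}"
            }
        }
    }
}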
There is not a dedicated variable for this purpose yet (JENKINS-30252). In the meantime you can take advantage of the fact that the subproject name is taken from the branch name, and use
env.JOB_NAME.replaceFirst('.+/', '')
This has now been resolved, see Krzysztof Krasoń's answer.
There are 2 branches to consider in a Jenkins multibranch pipeline job:
The Jenkins job branch - env.BRANCH_NAME. This may have the same name as a git branch, but might also be called PR-123 or similar
The git branch - env.GIT_BRANCH. This is the actual branch name in git.
So a job might have BRANCH_NAME=PR-123 and GIT_BRANCH=my-scm-branch-name
The Jenkins documentation has a list of all the env variables for your perusal here
Another way is to use the git command to obtain the branch name in the current Jenkins pipeline. For example, you can add the following snippet to print the branch name in your Jenkinsfile:
...
script {
    def BRANCH = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
    echo "${BRANCH}"
}
...
I found the example in this Stack Overflow post useful: Git Variables in Jenkins Workflow plugin
// ...
sh 'git rev-parse --abbrev-ref HEAD > GIT_BRANCH'
def git_branch = readFile('GIT_BRANCH').trim()
echo git_branch
// ...