We are using Jenkins for manually triggered jobs to deploy some code.
The Jenkinsfile describing our pipeline is located in a dedicated Jenkinsfile repo (not in the deployed code repo).
We are using declarative pipeline syntax and shared libraries in our Jenkinsfiles.
In the Blue Ocean interface there are two interesting attributes (branch and commit) that are automatically filled when pipelines are triggered by plugins (like GitHub Organization).
I'm looking for a way to set/update these two attributes manually from within the pipeline code for our manual pipelines.
The job description and name can easily be updated using something like:
stage('Set pipeline description') {
    steps {
        script {
            currentBuild.description = "Deploying branch ${branch} on ${targetEnv}"
        }
    }
}
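For completeness, the build display name can be set the same way. As a workaround this at least surfaces the branch (and commit, if you have it) in the build list, although it does not populate the Blue Ocean branch/commit attributes themselves; branch and commit are assumed to be parameters or variables already available in the pipeline:

stage('Set pipeline display name') {
    steps {
        script {
            // Workaround sketch: show branch and commit in the build name.
            // 'branch' and 'commit' are assumed pipeline parameters/variables.
            currentBuild.displayName = "#${currentBuild.number} ${branch} @ ${commit}"
        }
    }
}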
But I couldn't find anywhere how to update the branch or commit values.
Did anybody try this?
This issue is reported as a bug (see link).
"We are using GitLab web hooks to trigger Jenkins Pipeline project builds on new commit push to GitLab. Build is triggered, CI commit status report is being sent back to gitlab, but can't see branch and commit field in Header-details element."
Please vote for this issue on the Jenkins CI website if you want it to be resolved sooner.
Related
I am pretty new to Jenkins, and when I make a pull request, the GitHub webhook triggers a rebuild of all branches in the repo. However, I only want the branch associated with the pull request to be built. I suspect that all branches are built because nothing is specified in the "Branch specifier".
Is there a way to specify to build only the branch the pull request was made to?
Thanks!
To resolve your issue, try using a multibranch pipeline job, or use a branch filter in a simple pipeline job.
To use a multibranch pipeline job you need to install the Pipeline: Multibranch plugin.
Pipeline job example
In a simple pipeline job, choose "Build when a change is pushed to GitLab" in the "Build Triggers" section -> Advanced -> Filter branches by name.
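If you prefer to keep that configuration in the Jenkinsfile, here is a minimal, hedged sketch of the same branch filter using the GitLab plugin's declarative trigger; the option names are taken from the plugin's documented trigger and may differ between plugin versions:

pipeline {
    agent any
    triggers {
        // Assumed GitLab plugin trigger options: build on push, only for 'master'.
        gitlab(triggerOnPush: true,
               triggerOnMergeRequest: false,
               branchFilterType: 'NameBasedFilter',
               includeBranchesSpec: 'master')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by a push to a matching branch'
            }
        }
    }
}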
I have a pipeline project added to Jenkins. I would like to tell Jenkins to build a specific commit in this project by using a REST query.
The project is configured as a Jenkinsfile pipeline project. I would like Jenkins to get the Jenkinsfile from the specific commit I've selected, and then the pipeline script would check out the code and perform the build.
My problem is that I can't force Jenkins to take the Jenkinsfile from my selected commit. I can configure it to get the Jenkinsfile from master or any other branch, but I don't know how to parameterize where Jenkins should take the Jenkinsfile from.
I.e. when building commit foo, I want Jenkins to check out the repository I've provided, switch to commit foo and use the Jenkinsfile from that commit. The next build might want to build commit bar; in that case Jenkins would clone the repo, switch to the bar commit, and run the Jenkinsfile from the bar commit.
Is this action supported by Jenkins?
Rubber duck debugging in action.
My problem was resolved by making sure the "Lightweight checkout" option in the Git repository settings pane was unchecked. A bug in Jenkins?
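For reference, once the job is parameterized (and, per the answer above, lightweight checkout is disabled), the pipeline can check out whatever commit was requested. This is a hedged sketch, not the exact setup from the question; COMMIT is a hypothetical parameter name and the repository URL is a placeholder. The job can then be started remotely via buildWithParameters with COMMIT set to the desired SHA:

pipeline {
    agent any
    parameters {
        // Hypothetical parameter: branch name, tag or commit SHA to build.
        string(name: 'COMMIT', defaultValue: 'master', description: 'Ref or commit SHA to build')
    }
    stages {
        stage('Checkout') {
            steps {
                // GitSCM accepts a commit SHA in place of a branch name here.
                checkout([$class: 'GitSCM',
                          branches: [[name: params.COMMIT]],
                          userRemoteConfigs: [[url: 'ssh://git@example.com/project/repo.git']]])
            }
        }
        stage('Build') {
            steps {
                echo "Building ${params.COMMIT}"
            }
        }
    }
}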
I am currently using CloudBees Jenkins Core as my Jenkins solution.
I am using Jenkins Pipelines to write our Jenkins job configuration. These pipelines are stored in GitHub repositories. Each Jenkins job, when created, is connected to a GitHub repository from which the source code is pulled, and that is where the Jenkinsfile is stored and read from.
Below are some high-level screenshots of how our Jenkins jobs are configured.
The advantage of the way these jobs are configured is that the Jenkinsfile is always read from the master branch. Meaning that if a rogue developer tries to remove stages from the Jenkinsfile in their own branch, it doesn't matter, because the Jenkinsfile is always read from the master branch (which is always protected).
However, the one massive drawback is: how do the teams and developers doing DevOps engineering make changes to the Jenkinsfile? For example, let's say a developer creates a branch called feature-jenkins-search and edits the Jenkinsfile, adding a new stage to the pipeline. Whenever they push these changes to GitHub to test, they can't test, because the Jenkinsfile is always read from the master branch. Does that mean DevOps engineers have to work directly on the master branch? Surely this is not the best way to go, and there is a better configuration to use?
We do still want the security that if a developer goes rogue, they cannot remove stages from the Jenkinsfile and have that change take effect.
You should really look into the Jenkins multibranch pipeline feature. A multibranch pipeline lets you create a single configuration item in Jenkins (a bit like a folder) that can detect all the branches and pull requests with a Jenkinsfile in a GitHub repository and build them using automatically created jobs. Inside this multibranch pipeline object, once it is configured in Jenkins, you will find a number of jobs to build the various branches and pull requests in the GitHub repository.
So your developers should maintain a Jenkinsfile in every branch they work on in GitHub to build that branch in your Jenkins server.
It is possible to make the Jenkinsfile do branch-specific handling if required, with conditional stages / when conditions in the Jenkinsfile in each branch, as sketched below.
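A minimal sketch of such a conditional stage (stage names and the branch are only examples):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Runs for every branch and pull request'
            }
        }
        stage('Deploy') {
            // Branch-specific handling: this stage runs only on builds of master.
            when {
                branch 'master'
            }
            steps {
                echo 'Runs only when the multibranch job builds master'
            }
        }
    }
}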
You can lock down the master branch so that code and Jenkinsfile changes from other branches can only be merged with an approved PR (pull request). There is good integration between Jenkins and GitHub such that you can configure the master branch to only allow a PR to be merged if the PR is buildable in Jenkins. So if developers add new stages / processing to a Jenkinsfile on a branch being merged to master, it should be validated so that builds of your master branch are not broken.
There is a lot of configurability in the Jenkins multi-branch pipeline object for detection and handling of branches and it may be necessary to experiment to get it right for what you need with your team. If you cannot find this feature in Jenkins, it is probably because the correct Jenkins pipeline and GitHub related plugins are not installed.
You could also have a look at a similar Jenkins feature called the GitHub Organization Folder, which lets you detect and build all repos and branches at the GitHub organization level. But when starting out, I would suggest looking into the multibranch pipeline at the single-repo level first.
These features are discussed in the Jenkins pipeline documentation. We use these features with our internal GitHub and Jenkins server and it works very well.
I think you will find that the idea of using a single Jenkinsfile in the master branch to build all branches is unworkable, as you have seen!
I have successfully setup a webhook trigger in bitbucket for a Jenkins freestyle project, for test purposes.
Unfortunately my Jenkins project is using the Pipeline format, and I am unable to get Bitbucket to trigger that kind of project; the problem seems to be that there is no Jenkins project registered to pull from the repo that the Bitbucket webhook is coming from, and Jenkins replies with:
Error: Jenkins response: No git jobs using repository: ssh://git@myhost:7999/xxx/testing-jenkins.git and branches: master No Git consumers using SCM API plugin for: ssh://git@myhost:7999/xxx/testing-jenkins.git
The pipeline project is set up so that the Jenkinsfile is to be found in the given repository (ssh://git@myhost:7999/xxx/testing-jenkins.git), by using the "Pipeline script from SCM" option.
Therefore there is actually a kind of "git consumer" for the Pipeline, but this does not seem to be taken into account by Jenkins, probably because this is not a real project source, but a pipeline source.
Are there any examples of integration of Bitbucket and Jenkins Pipeline projects? I have been unable to find any.
If you are looking for a full Bitbucket and Jenkins Pipeline integration, I highly recommend using the Bitbucket Branch Source Plugin. The plugin will discover all branches and pull requests and build all those that have a Jenkinsfile in the root of the repo.
You can also create the project as a Bitbucket Team, which will scan all repos of your organization:
See the official CloudBees documentation.
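For completeness, a minimal illustrative Jenkinsfile of the kind each branch needs at its root for the plugin to pick it up (the contents are only an example):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // BRANCH_NAME is provided automatically by multibranch jobs.
                echo "Building ${env.BRANCH_NAME}"
            }
        }
    }
}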
I was struggling with the same problem. These are the key points I followed.
In the Jenkins pipeline job:
Under Build Triggers, check 'Trigger builds remotely (e.g., from scripts)' and fill in the 'Authentication Token' with some random and unique token.
In the Bitbucket repository:
Go to Settings > Services
Select 'Jenkins' from the drop down and 'Add service'.
Check 'Csrf Enabled'
Endpoint : http://username:apitoken@yourjenkinsurl.com/
You can find username and apitoken at Jenkins home > People
Select the user and click on configure. Under 'API Token', click on the 'Show API Token' button and you will see the username and API token.
Module name : This is optional. It can be any particular file or folder which is to be watched.
Project name : The project name in Jenkins.
If the job is in some folder structure, say I have 'MyTestFolder/MyTestPipelineJob', Project name to be mentioned is 'MyTestFolder/job/MyTestPipelineJob'
Token : 'Authentication Token' created in Jenkins job.
You are ready to go!!
I referred to http://felixleong.com/blog/2012/02/hooking-bitbucket-up-with-jenkins/ and some of my instincts. :)
A simple solution is to use Generic Webhook Trigger Plugin in Jenkins.
You would need to
Enable it in a freestyle or pipeline job.
Configure a token string.
Construct JSONPaths to gather whatever you need from the Bitbucket webhook.
Add the plugin endpoint in Bitbucket: JENKINS_URL/generic-webhook-trigger/invoke?token=whatever_you_picked
The plugin will give you clear feedback when it is invoked so that troubleshooting is made easy.
It is up to you to pick whatever values you need from the webhook in order to clone the correct repository, or whatever it is you want to do when it is invoked.
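If the job is a pipeline, the same trigger can also be declared in the Jenkinsfile. A minimal, hedged sketch follows; the variable name and the JSONPath are assumptions based on a Bitbucket Cloud push payload, so adjust them to the webhook body you actually receive:

pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                // Branch name from the push event (assumed payload structure).
                [key: 'PUSHED_BRANCH', value: '$.push.changes[0].new.name']
            ],
            token: 'whatever_you_picked',
            causeString: 'Push to $PUSHED_BRANCH',
            printContributedVariables: true,
            printPostContent: true
        )
    }
    stages {
        stage('Build') {
            steps {
                echo "Webhook reported branch: ${env.PUSHED_BRANCH}"
            }
        }
    }
}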
I have this same issue. My workaround was just to create a freestyle project that can be triggered by the webhook, and have the Pipeline triggered by that project's completion.
In the mean time, here's the Jenkins bug you can watch for a fix:
https://issues.jenkins-ci.org/browse/JENKINS-38447
Spent hours figuring out how to do this in 2017.10.
As @JPLemelin described, create a new Jenkins item using a Bitbucket Team/Project.
Refer to the doc https://support.cloudbees.com/hc/en-us/articles/115000051132-How-to-Trigger-Multibranch-Jobs-from-BitBucket-Cloud- and install the Bitbucket Branch Source plugin.
Go to Bitbucket and add a webhook: ${your-jenkins-url}/bitbucket-scmsource-hook/notify
After these 3 steps, I finally got the pipeline jobs to run after a new commit to Bitbucket.
I had the same exact issue...
The cause was using */master for branch specifier. I needed to spell it out: origin/master (no wildcards).
It works well now.
I was finally able to make this work with a Jenkinsfile in a multibranch pipeline:
In Bitbucket I created a webhook with my Jenkins URL and my clone URL, and in the webhook I put the following URL (exactly the URL in the Jenkins project):
http://<jenkins>/git/notifyCommit?url=http://<user>@<bitbucket>/scm/<project>/<repo>.git
When I test the trigger, the result is the following:
No git jobs using repository: http://<user>@<bitbucket>/scm/<project>/<repo>.git and branches:
Scheduled indexing of <repo>
So it didn't trigger any jobs, but it did trigger the multibranch scanning, so my changed branches are built.
I am migrating my old Jenkins freestyle jobs to multibranch pipelines. I also want to use GitLab hooks with them.
My problem is branch detection. I am doing it manually, but I want it to be automatic: when a new branch is pushed, GitLab should trigger a Jenkins job that triggers branch detection if the branch reported by GitLab is not yet known to Jenkins. Is it possible to do this, or does it not exist?
FYI: I tried to launch the multi-branch pipeline job but Jenkins says:
ERROR: No parameterized job named XXX found.
Enable "Build Periodcally" in your multibranch job configuration and the branch indexing will automatically started.
What you really need is a branch source plugin for GitLab with webhook integration, which is tracked as an RFE in JIRA.
Failing that, use a plain Git branch source and configure GitLab to send Jenkins notifications to /git/notifyCommit (IIRC), as documented on the Git plugin wiki. You need to specify only a URL, no other details. The branch indexing this triggers should detect both new or removed branches and changes to the head of an existing branch, and schedule builds accordingly.
You can set webhook in GitLab for push events and URL like http://<yourserver>/git/notifyCommit?url=<URL of the Git repository>.
See https://wiki.jenkins-ci.org/display/JENKINS/Git+Plugin#GitPlugin-Pushnotificationfromrepository
GitLab notifies Jenkins on push events which should trigger branch detection also for multibranch pipeline.
I didn't receive the answer I wanted, but I ran into this issue today which answered the question:
https://github.com/jenkinsci/gitlab-plugin/issues/298
TL;DR: Multibranch pipelines cannot yet easily be triggered by GitLab commits. There is a workaround; look at the link above.