Jenkins pipeline job started by webhook on a release tag - jenkins

I have a Bitbucket repo, and I want to start my Jenkins pipeline job only after a commit with a tag like "release-1.0.*".
So, I set up my job with this pipeline script:
pipeline {
    agent any
    stages {
        stage('Prepare') {
            when {
                tag "release*"
            }
            steps {
                git branch: 'tag1', url: 'git@bitbucket.org:m*********ny/tests.git'
            }
        }
        stage('Deploy') {
            steps {
                sshPublisher(publishers: [sshPublisherDesc(configName: "JenkinsSrv", transfers: [sshTransfer(execCommand: 'pwd')])])
            }
        }
    }
    post ('POST BUILD') {
        always {
            echo 'This is post action!!!'
        }
    }
}
Also, I turned on the Bitbucket webhook plugin, so my repo notifies Jenkins about new changes.
But my solution doesn't work. Help me resolve this case.

According to the official documentation for a Jenkins pipeline, the option you are looking for is the changelog condition inside the when directive. For example:
when { changelog 'release*' }
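For reference, here is a minimal sketch of where that condition sits in a declarative pipeline. Note that changelog matches a regular expression against the commit message, so the exact pattern below is only an assumption to adapt to your tag naming:

pipeline {
    agent any
    stages {
        stage('Prepare') {
            when {
                // regular expression matched against the commit message (assumed pattern)
                changelog '.*release-1\\.0\\..*'
            }
            steps {
                echo 'Commit message matched the release pattern'
            }
        }
    }
}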

Related

How to invoke the job from one slave(server) to another slave(server) in jenkins

Please can you advise, as I am planning to invoke a job that is on a different server. For example:
Slave1, server1: deploy job
Slave2, server2: build job
I want the build job to trigger the deploy job. Any suggestion, please?
If you are trying to run two jobs on the same Jenkins server, your pipeline should look something like below. From the build job you can call the deploy job.
Build Job
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                build job: 'DeployJobName'
            }
        }
    }
}
Deploy Job
pipeline {
    agent { label 'server1' }
    stages {
        stage('Deploy') {
            steps {
                // Deploy something
            }
        }
    }
}
Update
If you want to trigger a job in a different Jenkins server, you can use a plugin like RemoteTriggerPlugin or simply use the Jenkins API.
curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                sh 'curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN'
            }
        }
    }
}
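If security is enabled on the target Jenkins, the token URL alone is usually rejected; a hedged variant is to also pass credentials with curl. The user name and API token below are placeholders, and "Trigger builds remotely" must be enabled on the deploy job:

// jenkins-user and API_TOKEN are placeholders for a real account on server1
sh 'curl --user jenkins-user:API_TOKEN "http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN"'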

jenkins configuration for building on different branches

I am doing code review with Gerrit Code Review and I need to create a Jenkins pipeline for CI/CD. I am using the events triggered by the Gerrit Trigger plugin.
I want to obtain this:
PatchSet Created
  build starts on the refs/changes/**/**/** ref
  report results to Gerrit for code review
Change Merged (into develop) or Ref Updated (develop)
  build starts on the origin/develop branch
  deploy code to the internal server
Ref Updated (master)
  build starts on the origin/master branch
  deploy code to the external server
Questions for which I didn't find good answers:
do I need to use a simple pipeline or a multibranch pipeline?
how do I start the build on the correct branch?
how can I check out the correct branch using a Jenkinsfile instead of using the configuration page?
You should create a multibranch pipeline and write your declarative/scripted pipeline in a Jenkinsfile.
Example pipeline:
pipeline {
    agent any
    tools {
        maven 'maven-3.3.6'
        jdk 'jdk-11'
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    stages {
        stage('Build/Test') {
            when {
                changeRequest()
            }
            steps {
                sh "mvn clean verify"
            }
            post {
                success {
                    gerritReview labels: [Verified: 1], message: "Successful build, ${env.RUN_DISPLAY_URL}."
                }
                unstable {
                    gerritReview labels: [Verified: 0], message: "Unstable build, ${env.RUN_DISPLAY_URL}"
                }
                failure {
                    gerritReview labels: [Verified: -1], message: "Failed build, ${env.RUN_DISPLAY_URL}"
                }
            }
        }
        stage('Deploy') {
            when {
                branch 'develop'
            }
            steps {
                sh 'mvn deploy'
            }
        }
    }
}
The Build/Test stage will run for any change request; any new change or patchset will trigger it.
The Deploy stage will be triggered for any change merged into develop.
You can have multiple stages for one branch; they will be executed in sequence.
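If you prefer to stay on a simple (non-multibranch) pipeline, one hedged way to check out whatever ref the Gerrit Trigger plugin reports is an explicit checkout step driven by its GERRIT_REFSPEC environment variable; the repository URL below is a placeholder:

steps {
    // Placeholder Gerrit URL; GERRIT_REFSPEC is set by the Gerrit Trigger plugin for patchset events
    checkout([$class: 'GitSCM',
              branches: [[name: 'FETCH_HEAD']],
              userRemoteConfigs: [[url: 'ssh://gerrit.example.com:29418/my-project',
                                   refspec: env.GERRIT_REFSPEC ?: 'refs/heads/develop']]])
}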

How to build multiple projects the same time using jenkins pipeline

I already wrote an example Jenkinsfile to check out, build and deploy a single project. Is there a way to do all of this for multiple projects in different git repos at the same time, using just one Jenkinsfile? I know I can set these projects up as independent jobs and use a Jenkinsfile to call them, but I'm wondering if I can do this without independent jobs.
Thanks.
You can make use of the Job DSL Plugin to achieve this.
The Jenkins Job DSL API will help you write DSL scripts; it lists all the built-in DSL methods you need to construct jobs.
Example pipeline script:
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                // Pipeline job
                jobDsl scriptText: '''pipelineJob(\"$job1\") {
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        name('origin')
                                        url('https://github.com/satta19/user-node.git')
                                        credentials('git2-cred')
                                    }
                                    branch('master')
                                }
                            }
                            scriptPath('Jenkinsfile')
                        }
                    }
                }'''
            }
        }
        stage('Job2') {
            steps {
                // Freestyle job
                jobDsl scriptText: '''job(\"$job2\") {
                    steps {
                        shell(\'echo Hello World!\')
                    }
                }'''
            }
        }
    }
}
Note: I have taken the job names as string parameters, i.e. $job1 and $job2, in the above example pipeline script.
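If you also want the seed pipeline to start the jobs it just created, a hedged follow-up stage (reusing the same hypothetical $job1/$job2 parameters) could be appended inside the stages block:

stage('Run created jobs') {
    steps {
        // wait: false lets both downstream jobs start without blocking this pipeline
        build job: "$job1", wait: false
        build job: "$job2", wait: false
    }
}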

Is there a way to add pre-build step for Jenkins pipeline?

Currently I'm able to use a post directive in my Jenkinsfile. Is there a way to trigger a pre-build step similar to this?
post {
    always {
        sh '''rm -rf build/workspace'''
    }
}
I believe this newer question may have the answer: Is there a way to run a pre-checkout step in declarative Jenkins pipelines?
A pre directive is a cool feature idea, but it doesn't exist yet. skipDefaultCheckout and checkout scm (which is the same as the default checkout) are the keys:
pipeline {
    agent { label 'docker' }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('clean_workspace_and_checkout_source') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        stage('build') {
            steps {
                echo 'i build therefore i am'
            }
        }
    }
}

Create a Build Pipeline View based on Pipeline job using Job DSL plugin in Jenkins

Since there's a limitation in Jenkins Pipelines in that you cannot add a manual build step without hanging the build (see for example this Stack Overflow question), I'm experimenting with a combination of Jenkins Pipeline and the Build Pipeline Plugin using the Job DSL plugin.
My plan was to create a Job DSL script that first executes the Jenkins Pipeline (defined in a Jenkinsfile) and then creates a downstream job that deploys to production (this is the manual step). I've created this Job DSL script as a test:
pipelineJob("${REPO_NAME} jobs") {
    logRotator(-1, 10)
    def repo = "https://path-to-repo/${REPO_NAME}.git"
    triggers {
        scm('* * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
    publishers {
        buildPipelineTrigger("${REPO_NAME} deploy to prod") {
            parameters {
                currentBuild()
            }
        }
    }
}

freeStyleJob("${REPO_NAME} deploy to prod") {
}

buildPipelineView("$REPO_NAME Build Pipeline") {
    selectedJob("${REPO_NAME} jobs")
}
where REPO_NAME is defined as an environment variable. The Jenkinsfile looks like this:
node {
    stage('build') {
        echo "building"
    }
    stage('run tests') {
        echo "running tests"
    }
    stage('package Docker') {
        echo "packaging"
    }
    stage('Deploy to Test') {
        echo "Deploying to Test"
    }
}
The problem is that the selectedJob points to "${REPO_NAME} jobs" which doesn't seem to be a valid option as "Initial Job" in the Build Pipeline Plugin view (you can't select it manually either).
Is there a workaround for this? I.e. how can I use a Jenkins Pipeline as the "Initial Job" for the Build Pipeline Plugin?
From the documentation at yourDomain.com/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.views.NestedViewsContext.envDashboardView, it shows that buildPipelineView can only be used within a View block that is inside a Folder block.
Folder {
    View {
        buildPipelineView {
        }
    }
}
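As a concrete, hedged illustration of that pseudocode, assuming the Nested View plugin (which is what NestedViewsContext in the linked documentation describes) and reusing the names from the question:

nestedView("${REPO_NAME} views") {
    views {
        // buildPipelineView is available inside this views { } block per NestedViewsContext
        buildPipelineView("$REPO_NAME Build Pipeline") {
            selectedJob("${REPO_NAME} jobs")
            displayedBuilds(5)
        }
    }
}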
