How to trigger a task inside a job from a Jenkins pipeline

I have a Maven project that builds a war file and a separate batch task inside the build job to deploy it to a server (this is basically a shell script executed on Jenkins).
This is the pipeline script:
node {
    stage('Build') {
        build job: 'core UAT'
    }
    stage('Deploy') {
        build job: 'core UAT'
    }
}
Is there a way to specify something like:
stage('Deploy') {
    build job: 'core UAT -> Deploy'
}
I suppose I could manually copy-paste the batch task steps into the pipeline stage, but I want to trigger it as if it were run explicitly.

Related

How to use a Jenkinsfile for these build steps?

I'm learning how to use Jenkins and am working on configuring a Jenkinsfile instead of building via the Jenkins UI.
The job is currently configured through the UI: a source code management step that builds from Bitbucket, a build step that builds a Docker container, and the build itself is a multi-configuration project (screenshots omitted).
I am reading the Jenkinsfile documentation at https://www.jenkins.io/doc/book/pipeline/jenkinsfile/index.html and creating a new build using Pipeline.
I'm unsure how to translate the steps I've configured via the UI (Source Code Management and Build). How can I convert the Docker and Bitbucket configuration so it can be used with a Jenkinsfile?
The SCM configuration does not change regardless of whether you use the UI or a pipeline, although in theory you can do the git clone from steps inside the pipeline if you really insist on converting the SCM configuration into pure pipeline steps.
The pipeline can have multiple stages, and each stage can have a different execution environment. You can use the Docker Pipeline plug-in, or you can use plain sh to issue the docker commands on the build agent.
Here is small sample from one of my manual build pipelines:
pipeline {
    agent none
    stages {
        stage('Init') {
            agent { label 'docker-x86' }
            steps {
                checkout scm
                sh 'docker stop demo-001c || true'
                sh 'docker rm demo-001c || true'
            }
        }
        stage('Build Back-end') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker build -t demo-001:latest ./docker'
            }
        }
        stage('Test') {
            agent {
                docker {
                    label 'docker-x86'
                }
            }
            steps {
                sh 'docker run --name demo-001c demo-001:latest'
                sh 'cd test && make test-back-end'
            }
        }
    }
}
You need to create a Pipeline type of project and specify the SCM configuration in the General tab. In the Pipeline tab, you will have the option to select Pipeline script or Pipeline script from SCM. It's always better to start with Pipeline script while you are building and modifying your workflow; once it has stabilized, you can add it to the repository.
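To address the original Bitbucket and Docker steps directly, a minimal declarative sketch along the following lines could replace the two UI steps; the repository URL, credentials ID, and image name are placeholders, not values from the question:
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Replaces the Source Code Management UI step (Bitbucket over git).
                // URL and credentials ID are placeholders.
                git url: 'https://bitbucket.org/your-team/your-repo.git', credentialsId: 'bitbucket-creds', branch: 'master'
            }
        }
        stage('Docker Build') {
            steps {
                // Replaces the UI build step that builds the Docker container.
                // The image name is a placeholder.
                sh 'docker build -t your-image:latest .'
            }
        }
    }
}
With Pipeline script from SCM, a file like this would live as the Jenkinsfile in the Bitbucket repository itself.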

Jenkins - Execute Pipeline from Specific folder

I have a main pipeline in Jenkins that will check out the code, compile, test, and build, then push an image to Docker. This is the high level of the CI pipeline that I have; say the job name is "MainJobA".
I need to create a new job just for JavaDoc generation. For that I have created a new script in the Git repo and configured it in a Pipeline job.
Now I need to execute this sub job of JavaDoc generation and publish the HTML reports from the workspace of "MainJobA". I need to run SubJobA's pipeline stages from
/home/jenkins/workspace/MainJobA
How can I achieve this?
There is a build step in Jenkins declarative pipelines.
Use it like:
pipeline {
    agent any
    stages {
        stage("build") {
            steps {
                build 'Pipeline_B'
            }
        }
    }
}
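If the downstream job needs to know where the upstream workspace is, the build step can also pass parameters. A minimal sketch, assuming SubJobA is a parameterized job with a string parameter named UPSTREAM_WORKSPACE (the parameter name is made up here):
pipeline {
    agent any
    stages {
        stage('Trigger JavaDoc job') {
            steps {
                // Hand the current workspace path to the downstream job as a parameter.
                build job: 'SubJobA', parameters: [string(name: 'UPSTREAM_WORKSPACE', value: "${env.WORKSPACE}")]
            }
        }
    }
}
SubJobA can then read params.UPSTREAM_WORKSPACE and dir() into that path; note that this only works reliably if both jobs run on the same agent.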

Is it possible to have multiple Jenkinsfiles and a custom name for a Jenkinsfile?

All my scripts are in one branch of the repo and I have multiple Jenkins pipeline jobs:
1. smoke
2. Regression
3. Epic wise Execution
Each has a different pipeline script. So is it possible to have multiple Jenkinsfiles with custom names?
node('Slave-Machine-1') {
    env.NODE_HOME = "${tool '8.9.4'}"
    env.PATH = "${env.NODE_HOME}/bin:${env.PATH}"
    def AUTO = ''
    stage("Install Dependency") {
        sshagent(['agent-id']) {
            sh 'npm install'
            sh 'npm run webdriver-install'
        }
    }
    stage("smoke") {
        sh 'npm run smoke-test'
    }
}
This is my sample pipeline script; similarly, I have multiple pipeline scripts.
You can name your pipeline scripts random_joe or anything you like, as long as:
You do not use multibranch or organization pipeline projects, which specifically look for the filename Jenkinsfile to automatically create new jobs.
You do not mind your text editor not syntax-highlighting the pipeline scripts until you add the extension .groovy to them.
It is advisable to follow conventions wherever practicable, though.
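With plain (non-multibranch) Pipeline jobs, each job simply points its "Pipeline script from SCM" Script Path at a differently named file. A sketch of how this could be automated with the Job DSL plugin; the repository URL, branch, and file paths are placeholders:
// Job DSL sketch: one pipeline job per script, each reading a differently
// named Groovy file from the same branch (URL and paths are placeholders).
['smoke', 'regression', 'epic'].each { name ->
    pipelineJob(name) {
        definition {
            cpsScm {
                scm {
                    git {
                        remote { url('https://example.com/your/repo.git') }
                        branch('*/master')
                    }
                }
                scriptPath("pipelines/${name}.groovy")
            }
        }
    }
}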

How to run a post-build task on a Jenkins slave?

I have a requirement to run a post-build task on a Jenkins slave machine. I cannot use the "Restrict where this project can be run" option because the entire project does not need to run on the slave.
Two possibilities:
Use a Jenkins pipeline
node("master") {
stage("do main build thing") {
// do something
}
}
node("slave") {
stage("do postbuild") {
// do post build task
}
}
More information about the Jenkins pipeline: https://jenkins.io/doc/book/pipeline/jenkinsfile/
Use a second job
You can configure a job that only executes the post-build task and restrict it to the slave with "Restrict where this project can be run". On the main job you then add a post-build action that triggers it: https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin
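If the main job is itself a pipeline, the same hand-off can be expressed with the build step instead of the plugin's post-build action. A minimal sketch, assuming a downstream job named post-build-task that is restricted to the slave label (the job name is made up here):
node("master") {
    stage("main build") {
        // main build steps run on the master
    }
    stage("trigger post-build job") {
        // Equivalent of the plugin's post-build trigger: start the downstream
        // job that is restricted to the slave; wait: false makes it non-blocking.
        build job: 'post-build-task', wait: false
    }
}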

Auto generate build pipeline for gradle build using Jenkinsfile

I am trying to create a build pipeline based on Gradle tasks. I have viewed the Jenkinsfile configuration in the Pipeline-as-code demo, but I am unable to create a pipeline for Gradle tasks. Please suggest a possible way to use a Jenkinsfile so that the build pipeline is shown automatically just by reading the configuration from the Jenkinsfile.
Thank you.
In case your project uses the Gradle Wrapper, you can use the following snippet in your Jenkinsfile:
stage('Gradle Build') {
    if (isUnix()) {
        sh './gradlew clean build'
    } else {
        bat 'gradlew.bat clean build'
    }
}
If you check out into a subdirectory sub-dir, you might want to use
stage('Gradle Build') {
    if (isUnix()) {
        dir('sub-dir') { sh './gradlew clean build' }
    } else {
        dir('sub-dir') { bat 'gradlew.bat clean build' }
    }
}
In case you're using Artifactory to resolve your build dependencies or to deploy your build artifacts, it is recommended to use the Pipeline DSL for Gradle build with Artifactory.
Here's an example taken from the Jenkins Pipeline Examples page:
node {
    // Get Artifactory server instance, defined in the Artifactory Plugin administration page.
    def server = Artifactory.server "SERVER_ID"
    // Create an Artifactory Gradle instance.
    def rtGradle = Artifactory.newGradleBuild()

    stage 'Clone sources'
    git url: 'https://github.com/jfrogdev/project-examples.git'

    stage 'Artifactory configuration'
    // Tool name from Jenkins configuration
    rtGradle.tool = "Gradle-2.4"
    // Set Artifactory repositories for dependencies resolution and artifacts deployment.
    rtGradle.deployer repo: 'ext-release-local', server: server
    rtGradle.resolver repo: 'remote-repos', server: server

    stage 'Gradle build'
    def buildInfo = rtGradle.run rootDir: "gradle-examples/4/gradle-example-ci-server/", buildFile: 'build.gradle', tasks: 'clean artifactoryPublish'

    stage 'Publish build info'
    server.publishBuildInfo buildInfo
}
Otherwise, you can simply run the gradle command with the sh or bat Pipeline steps.
In Jenkins you can create a pipeline using a script written in a Jenkinsfile.
We write the script using 'stage' and 'node' as building blocks; these building blocks allow you to specify instructions that should be executed as part of the Jenkins pipeline.
To execute a Gradle build from a Jenkinsfile, first check the operating system and call the appropriate shell that can execute the Gradle task, as below:
Jenkinsfile
stage 'build_Project'
node {
    if (isUnix()) {
        sh 'gradle build --info'
    } else {
        bat 'gradle build --info'
    }
}
The above code snippet creates a stage named build_Project and executes the Gradle build of the current project.