How to use the NodeLabel parameter plugin in a declarative pipeline - Jenkins

I'm trying to convert my freestyle job to a declarative pipeline job, since the pipeline provides more flexibility. However, I cannot figure out how to use the NodeLabel Parameter plugin (https://wiki.jenkins.io/display/JENKINS/NodeLabel+Parameter+Plugin) in a pipeline.
pipeline {
    agent any
    parameters {
        // Would like something like LabelParameter here
    }
    stages {
        stage('Dummy1') {
            steps {
                cleanWs()
                sh('ls')
                sh('pwd')
                sh('hostname')
            }
        }
        stage('Dummy2') {
            steps {
                node("comms-test02") {
                    sh('ls')
                    sh('pwd')
                    sh('hostname')
                }
            }
        }
    }
}
I basically just need a way to start the job using a parameter that specifies where to build it (using a slave label).
Jenkins requires an agent field to be present, which I set to 'any'. But there doesn't seem to be a label parameter available?
As an alternative I tried using the 'node' step (https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#-node- allocate node). But that leaves me with two running jobs which, while working, doesn't look that pretty.
Does anyone know if the NodeLabel parameter plugin can be used? Or maybe someone has a cleaner approach?
Edit: Maybe I wasn't clear. I need to be able to run jobs on different nodes, and the node to run on should be decided when triggering the job, through a parameter. The NodeLabel plugin does this perfectly. However, I have not been able to reproduce this behavior in a pipeline.

Here's a full example:
pipeline {
    parameters {
        choice(name: 'node', choices: [nodesByLabel('label')], description: 'The node to run on') // example 1: just list all the nodes with the label
        choice(name: 'node2', choices: ['label'] + nodesByLabel('label'), description: 'The node to run on') // example 2: add the label itself as the first choice to make "any of the nodes" the default choice
    }
    agent none
    stages {
        stage('Test') {
            agent { label params.node }
            stages {
                stage('Print environment settings') {
                    steps {
                        echo "running on ${env.NODE_NAME}"
                        sh 'printenv | sort'
                    }
                }
            }
        }
    }
}
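Note: the nodesByLabel step used above is provided by the Pipeline Utility Steps plugin, so that plugin needs to be installed for this example to work.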

Let's say you added a parameter (say, named slaveName) using the NodeLabel plugin on your pipeline job. You now need to extract the value of slaveName and feed it into the agent -> node -> label field.
You can specify the node using the node property inside the agent block, like this:
agent {
    node {
        label "${slaveName}"
    }
}
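Putting it together, a minimal sketch (assuming a NodeLabel-plugin parameter named slaveName has been added through the job configuration, as described above; the stage itself is just an illustration):
pipeline {
    agent {
        node {
            // slaveName is the NodeLabel-plugin parameter defined on the job
            label "${slaveName}"
        }
    }
    stages {
        stage('Show host') {
            steps {
                sh 'hostname'
            }
        }
    }
}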

The following script worked for me to run multiple jobs in parallel on different nodes.
I took the reference from the build step plugin documentation:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
def build_one() {
    parallel one: {
        stage('XYZ') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [
                    [$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['nodeName'], nodeEligibility: [$class: 'AllNodeEligibility']],
                    string(name: 'ParentBuildName', value: "XX"),
                    string(name: 'Browser', value: 'chrome'),
                    string(name: 'Environment', value: 'envName')
                ]
            }
        }
    },
    two: {
        stage('SecondArea') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [
                    [$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['Your'], nodeEligibility: [$class: 'AllNodeEligibility']],
                    string(name: 'ParentBuildName', value: "XYX"),
                    string(name: 'Browser', value: 'firefox'),
                    string(name: 'Environment', value: 'envName')
                ]
            }
        }
    }
}

build_one()
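Note the catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') wrapper: it swallows a downstream failure so the parent build stays SUCCESS, while the stage itself is still marked FAILURE in the stage view.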

Related

set pipelineTriggers in Jenkinsfile to set 'Trigger builds remotely (e.g., from scripts)'

I have a Jenkinsfile, and I want to set a pipelineTriggers property in my 'setup parameters' stage:
#!/usr/bin/env groovy
pipeline {
    agent any
    stages {
        stage('setup parameters') {
            steps {
                script {
                    properties([
                        parameters([
                            string(name: 'payload', defaultValue: '')
                        ]),
                        pipelineTriggers([])
                    ])
                }
            }
        }
    }
}
What I'm trying to do: after the first run of the job, the 'Trigger builds remotely (e.g., from scripts)' checkbox should be checked, with the token filled out.
When I looked at the pipeline syntax, it does not list this as one of the trigger options.
Thanks!
It's authenticationToken in the pipelineJob:
pipelineJob('project-name') {
    definition {
        ...
    }
    parameters {
        ...
    }
    authenticationToken('TOKENHERE')
}
https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.jobs.WorkflowJob.authenticationToken
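Once the token is set, the job can be triggered remotely over HTTP, for example from another pipeline's sh step. A hypothetical sketch; the Jenkins URL, job name, and token are placeholders:
// trigger the parameterized job remotely, using the configured token
sh 'curl -X POST "https://jenkins.example.com/job/project-name/buildWithParameters?token=TOKENHERE&payload=abc"'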

When doing multiple jobs in a pipeline Jenkinsfile, how do I capture the logs for the given build

As the title states, I want to capture the logs for all the stages in my build, which looks like this:
pipeline {
    agent any
    stages {
        stage('Build First Repo') {
            steps {
                build job: 'jobOne', parameters: [string(name: 'branch', value: "${params.branch}")], quietPeriod: 1
            }
        }
        stage('Build Second Repo') {
            steps {
                build job: 'jobTwo', parameters: [string(name: 'branch', value: "${params.someOtherBranch}")], quietPeriod: 1
            }
        }
        stage('Deploy') {
            steps {
                build job: 'jobThree', parameters: [string(name: 'buildEnvironment', value: "${params.environment}")], quietPeriod: 1
            }
        }
        stage('Remote Build') {
            steps {
                build job: 'jobFour', parameters: [string(name: 'Hosts', value: "${params.hosts}")], quietPeriod: 1
            }
        }
    }
    post {
        always {
            mail to: 'me@mydomain.com', subject: "${currentBuild.currentResult} - ${currentBuild.fullDisplayName}", body: "...${currentBuild.rawBuild.getLog(100)}"
        }
    }
}
Currently, I can only get the pipeline build log (which I am e-mailing in the post/always section), which is helpful but not sufficient; I'd like to get the logs from each of the stages. I thought of maybe capturing them per stage and creating an environment variable or something but I'm not sure how to even access the logs for the build of those jobs. Can someone point me in the right direction on how to capture the logs for those jobs?
You can add a post section after any stage, and you can set up sending emails there.
To send a single email from the build, you can use the stash/unstash steps: stash the log in each stage, then finally unstash them all and send the email.
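A minimal sketch of that idea for one stage (assuming the downstream jobs are triggered with the build step as in the question; the jobOne.log file name is just an illustration, and rawBuild needs in-process script approval):
stage('Build First Repo') {
    steps {
        script {
            // build returns a RunWrapper describing the downstream build
            def run = build job: 'jobOne', parameters: [string(name: 'branch', value: "${params.branch}")], quietPeriod: 1
            // rawBuild/getLog need script approval on most installations
            writeFile file: 'jobOne.log', text: run.rawBuild.getLog(100).join('\n')
            stash name: 'jobOne-log', includes: 'jobOne.log'
        }
    }
}
In the final post/always block, unstash 'jobOne-log' (and the stashes from the other stages) and build the mail body with readFile('jobOne.log').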

different parameters in each branch in jenkins multibranch declarative pipeline

I am using Jenkins scripted pipeline in a multibranch job.
There is a parameter that is only supposed to be available in the trunk, not in any of the branches of the multibranch job.
Currently this is easy to do with a scripted pipeline (inside a shared library or directly in the Jenkinsfile):
def jobParams = [
    booleanParam(defaultValue: false, description: 'param1', name: 'param1')
]

if (whateverCondition) {
    jobParams.add(booleanParam(defaultValue: false, description: 'param2', name: 'param2'))
}

properties([
    parameters(jobParams)
])
I am currently trying to migrate to the declarative syntax, but I don't see a simple way to create a parameter that is only available under some conditions (I know I can ignore it, but I don't want it to show up at all).
The only solution so far is to move the pipeline itself into a shared library (possible since Declarative 1.2). I don't like this solution because the entire pipeline must be replicated, which seems a bit too extreme for just one line.
if (whateverCondition) {
    pipeline {
        agent any
        parameters {
            booleanParam(defaultValue: false, description: 'param1', name: 'param1')
            booleanParam(defaultValue: false, description: 'param2', name: 'param2')
        }
        (...)
    }
} else {
    pipeline {
        agent any
        parameters {
            booleanParam(defaultValue: false, description: 'param1', name: 'param1')
        }
        (...)
    }
}
Is there a way I can extract just the parameter-definition part of the declarative pipeline into a global variable of a shared library, or something similar?
Thanks in advance for any help!
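One way this is sometimes approached (a sketch, not from the original thread): keep the declarative pipeline untouched and define the parameters in a scripted properties() call before the pipeline block, with the conditional list computed by a shared-library global variable. The file vars/jobParams.groovy and the trunk condition below are hypothetical names:
// vars/jobParams.groovy in the shared library (hypothetical)
def call(boolean includeParam2) {
    def jobParams = [booleanParam(defaultValue: false, description: 'param1', name: 'param1')]
    if (includeParam2) {
        jobParams << booleanParam(defaultValue: false, description: 'param2', name: 'param2')
    }
    return jobParams
}
And in the Jenkinsfile:
properties([parameters(jobParams(env.BRANCH_NAME == 'trunk'))])

pipeline {
    agent any
    // no parameters block here; the properties() call above defines them
    (...)
}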

Append to Job properties

My job parameters defined in job-dsl.groovy are overwritten by those defined in the pipeline.
I am using the job-dsl-plugin and a Jenkins pipeline to generate a Jenkins job for each git branch. Since my code is stored in GitLab, the jobs require GitLab integration, which I am providing with the gitlab-plugin. The problem is with the gitLabConnection: it looks like it can only be applied from inside the Jenkins pipeline.
So if in job-dsl I would do:
branches.each { branch ->
    String safeBranchName = branch.name.replaceAll('/', '-')
    if (safeBranchName ==~ "^release.*") {
        return
    }
    def branch_folder = "${basePath}/${safeBranchName}"
    folder branch_folder
    pipelineJob("$branch_folder/build") {
        logRotator {
            numToKeep 20
        }
        parameters {
            stringParam("BRANCH_NAME", "${safeBranchName}", "")
            stringParam("PROJECT_NAME", "${basePath}", "")
        }
    }
}
And then in my Jenkins pipeline I would add the 'gitLabConnection'
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal')
    ])
    stage('clean up') {
        deleteDir()
    }
    // (...)
I have to do it like:
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal'),
        parameters([
            string(name: 'BRANCH_NAME', defaultValue: BRANCH_NAME, description: ''),
            string(name: 'PROJECT_NAME', defaultValue: PROJECT_NAME, description: '')
        ])
    ])
    stage('clean up') {
        deleteDir()
    }
    // (...)
So that my BRANCH_NAME and PROJECT_NAME are not overwritten.
Is there another way to tackle this?
Is it possible to append the gitLabConnection('gitlab.internal') to the properties in the Jenkins pipeline?
Unfortunately it doesn't seem like there is a way to do this yet. There's some discussion about this at https://issues.jenkins-ci.org/browse/JENKINS-43758, and I may end up opening a feature request to allow people to "append to properties".
There are two ways to solve this. The first one uses only Jenkins pipeline code, but if you choose this path the initial job run will most likely fail. This initial failure happens because the pipeline only creates the Jenkins job parameters on the first run; once the parameters are created, the job will work.
Option '1' - using Jenkins pipeline only.
In 'Pipeline Syntax'/'Snippet Generator' check 'This project is parameterised', add the parameter(s) you need, and hit 'Generate Pipeline Script'. In my case I get:
properties([
    gitLabConnection(gitLabConnection: 'my_gitlab_connection', jobCredentialId: '', useAlternativeCredential: false),
    [$class: 'JobRestrictionProperty'],
    parameters([
        string(defaultValue: 'test', description: 'test', name: 'test', trim: false)
    ]),
    throttleJobProperty(categories: [], limitOneJobWithMatchingParams: false, maxConcurrentPerNode: 0, maxConcurrentTotal: 0, paramsToUseForLimit: '', throttleEnabled: false, throttleOption: 'project')
])
Option '2' - more complicated, but also far more powerful. It's the one I finally took, because of the issues described above.
Use the Jenkins Job DSL plugin: https://github.com/jenkinsci/job-dsl-plugin
The GitLab plugin works quite well with it: https://github.com/jenkinsci/gitlab-plugin#declarative-pipeline-jobs
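For declarative pipelines, the gitlab-plugin documentation linked above sets the connection in the options block rather than through a properties() call, which leaves the Job DSL-defined parameters untouched. A minimal sketch, reusing the connection name from the question:
pipeline {
    agent any
    options {
        gitLabConnection('gitlab.internal')
    }
    stages {
        stage('clean up') {
            steps {
                deleteDir()
            }
        }
    }
}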

How to invoke a Jenkins pipeline A in another Jenkins pipeline B

I have two Jenkins pipelines, let's say pipeline-A and pipeline-B. I want to invoke pipeline-A in pipeline-B. How can I do this?
(pipeline-A is a subset of pipeline-B. Pipeline-A is responsible for doing some routine stuff which can be reused in pipeline-B)
I have installed Jenkins 2.41 on my machine.
The following solution works for me:
pipeline {
    agent {
        node {
            label 'master'
            customWorkspace "${env.JobPath}"
        }
    }
    stages {
        stage('Start') {
            steps {
                sh 'ls'
            }
        }
        stage('Invoke_pipeline') {
            steps {
                build job: 'pipeline1', parameters: [
                    string(name: 'param1', value: "value1")
                ]
            }
        }
        stage('End') {
            steps {
                sh 'ls'
            }
        }
    }
}
Adding a link to the official documentation of the "Pipeline: Build Step" here:
https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
A little unclear if you want to invoke another pipeline script or job, so I answer both:
Pipeline script
The "load" step will execute the other pipeline script. If you have both scripts in the same directory, you can load it like this:
def pipelineA = load "pipeline_A.groovy"
pipelineA.someMethod()
Other script (pipeline_A.groovy):
def someMethod() {
    // do something
}
return this
Pipeline job
If you are talking about executing another pipeline job,
the "build job" step can accomplish this:
build job: '<Project name>', propagate: true, wait: true
propagate: Propagate errors
wait: Wait for completion
If you have parameters on the job, you can add them like this:
build job: '<Project name>', parameters: [[$class: 'StringParameterValue', name: 'param1', value: 'test_param']]
As mentioned by @Matias Snellingen and @Céline Aussourd, in the case of launching a multibranch job you have to specify the branch to build, like this:
stage('Invoke_pipeline') {
    steps {
        build job: 'pipeline1/master', parameters: [
            string(name: 'param1', value: "value1")
        ]
    }
}
In my case it solved the problem.
I am going to post my solution, which is similar to those of @Michael COLL, @Matias Snellingen, and @Céline Aussourd.
For the multibranch pipeline I am using the following code in the Jenkinsfile to trigger my multibranch B from multibranch A (the example covers both a pipeline and a multibranch pipeline):
post {
    always {
        echo 'We are in post part and Jenkins build with QA tests is going to be triggered.'
        // For triggering a Pipeline
        //build job: 'WGF-QA WITH ALLURE', parameters: [string(name: 'QA-Automation', value: 'value from Build pipeline')]
        // For triggering a Multibranch Pipeline
        build job: 'Testing QA/QA Selenium Tests/feature%2FGET-585', parameters: [string(name: 'QA-Automation', value: 'value from Build pipeline')]
    }
}
Just be sure to define the whole path to the branch as it is defined in the job, and instead of / in the branch name use %2F (feature/GET-585 -> feature%2FGET-585).
To add to what @matias-snellingen said: if you have multiple functions, the return this should be under the function that will be called in the main pipeline script. For example:
def someMethod() {
    helperMethod1()
    helperMethod2()
}

return this

def helperMethod1() {
    // do stuff
}

def helperMethod2() {
    // do stuff
}
someMethod() is the one that will be called in the main pipeline script.
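A short usage sketch of that layout (assuming the script above is saved as pipeline_A.groovy, as in the earlier answer):
node {
    def pipelineA = load "pipeline_A.groovy"
    pipelineA.someMethod()   // helperMethod1() and helperMethod2() run inside someMethod()
}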
Another option is to create a package, load it and execute it from the package.
package name.of.package
import groovy.json.*

def myFunc(var1) {
    return result
}
Then consume it:
@Library('name_of_repo')
import name.of.package.*

utils = new name_of_pipeline()
// here you can invoke
utils.myFunc(var)
Hope it helps.
