Append to Job properties - jenkins

My job parameters defined in job-dsl.groovy are overwritten by those defined in the pipeline.
I am using the job-dsl-plugin and a Jenkins pipeline to generate a Jenkins job for each git branch. Since my code is stored in GitLab, the jobs require GitLab integration, which I provide through the gitlab-plugin. The problem is with 'gitLabConnection': it looks like it can only be applied from inside the Jenkins pipeline.
So if in job-dsl I would do:
branches.each { branch ->
    String safeBranchName = branch.name.replaceAll('/', '-')
    if (safeBranchName ==~ "^release.*") {
        return
    }
    def branch_folder = "${basePath}/${safeBranchName}"
    folder branch_folder
    pipelineJob("$branch_folder/build") {
        logRotator {
            numToKeep 20
        }
        parameters {
            stringParam("BRANCH_NAME", "${safeBranchName}", "")
            stringParam("PROJECT_NAME", "${basePath}", "")
        }
    }
}
And then in my Jenkins pipeline I would add the 'gitLabConnection'
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal')
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
I have to do it like:
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal'),
        parameters([
            string(name: 'BRANCH_NAME', defaultValue: BRANCH_NAME, description: ''),
            string(name: 'PROJECT_NAME', defaultValue: PROJECT_NAME, description: '')
        ])
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
So that my BRANCH_NAME and PROJECT_NAME are not overwritten.
Is there another way to tackle this?
Is it possible to append the 'gitLabConnection('gitlab.internal')' to the properties in the Jenkins pipeline?

Unfortunately it doesn't seem like there is a way to do this yet. There's some discussion about this at https://issues.jenkins-ci.org/browse/JENKINS-43758, and I may end up opening a feature request to allow people to "append to properties".
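In the meantime, one workaround is to build the whole list in one place so that a single properties() call carries everything. A minimal sketch (defaultJobProperties() is a hypothetical helper, e.g. in a shared library; the merge itself is plain Groovy):
// Hypothetical helper collecting the properties every job should have.
def defaultJobProperties() {
    [gitLabConnection('gitlab.internal')]
}

node('node_A') {
    // One properties() call applies the merged list, so nothing gets overwritten.
    properties(defaultJobProperties() + [
        parameters([
            string(name: 'BRANCH_NAME', defaultValue: BRANCH_NAME, description: ''),
            string(name: 'PROJECT_NAME', defaultValue: PROJECT_NAME, description: '')
        ])
    ])
    ///(...)
}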

There are two ways to solve this. The first uses only Jenkins pipeline code, but if you choose this path the initial job run will most likely fail. It fails because the pipeline only creates the Jenkins job parameters during the first run; once the parameters exist, the job works.
Option '1' - using Jenkins pipeline only.
In 'Pipeline Syntax'/'Snippet Generator' check 'This project is parameterised', add the parameter(s) you need, and hit 'Generate Pipeline Script'. In my case I get:
properties([
    gitLabConnection(gitLabConnection: 'my_gitlab_connection', jobCredentialId: '', useAlternativeCredential: false),
    [$class: 'JobRestrictionProperty'],
    parameters([
        string(defaultValue: 'test', description: 'test', name: 'test', trim: false)
    ]),
    throttleJobProperty(categories: [], limitOneJobWithMatchingParams: false, maxConcurrentPerNode: 0, maxConcurrentTotal: 0, paramsToUseForLimit: '', throttleEnabled: false, throttleOption: 'project')
])
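To soften the first-run failure described above, you can fall back to a default whenever the parameter does not exist yet; a sketch in plain Groovy (the default value is a placeholder):
// On the very first run, params.test is not defined yet, so guard the access:
def testValue = params.test ?: 'some-sensible-default'
echo "test = ${testValue}"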
Option '2' - more complicated, but also far more powerful. This is the one I finally took, because of the issues described above.
Use the Jenkins Job DSL plugin - https://github.com/jenkinsci/job-dsl-plugin
The GitLab plugin works quite well with it: https://github.com/jenkinsci/gitlab-plugin#declarative-pipeline-jobs
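As a sketch of what a seed script can look like with this option (repository URL and job name are placeholders), the generated job just points at the Jenkinsfile in the repository, which is where the GitLab specifics live:
// Job DSL seed script sketch; URL and names are placeholders.
pipelineJob('example/build') {
    logRotator {
        numToKeep 20
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('git@gitlab.internal:group/project.git')
                    }
                    branch('master')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}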

Related

How to trigger a build on commit to branch in scripted pipeline

triggers { pollSCM('H */4 * * 1-5') } - will this work for a merge to a branch?
I see we have 3 options for triggers, but these seem to be for declarative pipelines:
cron
pollSCM
upstream
For a scripted pipeline, is it something like this:
properties([pipelineTrigger([triggers('gitPush')])])
OR
properties([pipelineTriggers([githubPush()])]) // With this, should I also enable an option on the Jenkins instance?
You can also use the Generic Webhook Trigger plugin.
You will need to create a webhook in GitHub, and in the Jenkins pipeline something like below.
triggers {
    GenericTrigger(
        genericVariables: [
            [defaultValue: '', key: 'commit_author', regexpFilter: '', value: '$.pusher.name'],
            [defaultValue: '', key: 'ref', regexpFilter: '', value: '$.ref']
        ],
        causeString: '$commit_author committed to $ref',
        printContributedVariables: false,
        printPostContent: false,
        regexpFilterText: '$ref',
        regexpFilterExpression: 'refs/heads/develop',
        token: '12345'
    )
}
Hope this helps.
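For the scripted-pipeline syntax the question asks about, a minimal sketch (githubPush() requires the GitHub plugin; pollSCM works the same way inside pipelineTriggers):
// Scripted pipeline: triggers are registered through the properties step.
properties([
    pipelineTriggers([
        githubPush()                    // build on GitHub push events
        // or: pollSCM('H */4 * * 1-5') // poll SCM instead of relying on webhooks
    ])
])

node {
    stage('build') {
        echo 'triggered'
    }
}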
Use something like this in your Jenkinsfile. Use only the options you need and remove the ones you don't want.
pipeline {
    agent any
    triggers {
        // Note: these options belong to the GitLab plugin's trigger,
        // so the trigger is gitlab(), not github()
        gitlab(
            triggerOnPush: true,
            triggerOnMergeRequest: true,
            triggerOpenMergeRequestOnPush: "source",
            branchFilterType: "All",
            triggerOnNoteRequest: false)
    }
    stages {
        ...
    }
}
NOTE: Make sure you have done all the webhook configuration between your Git server and Jenkins, and installed the webhook plugin on Jenkins.

How to use NodeLabel parameter plugin in declarative pipeline

I'm trying to convert my freestyle job to a declarative pipeline job, since the pipeline provides more flexibility. However, I cannot figure out how to use the NodeLabel parameter plugin (https://wiki.jenkins.io/display/JENKINS/NodeLabel+Parameter+Plugin) in a pipeline.
pipeline {
    agent any
    parameters {
        // Would like something like LabelParameter here
    }
    stages {
        stage('Dummy1') {
            steps {
                cleanWs()
                sh('ls')
                sh('pwd')
                sh('hostname')
            }
        }
        stage('Dummy2') {
            steps {
                node("comms-test02") {
                    sh('ls')
                    sh('pwd')
                    sh('hostname')
                }
            }
        }
    }
}
I basically just need a way to start the job using a parameter that specifies where to build the job (using a slave label).
Jenkins requires an agent field to be present, which I set to 'any'. But it doesn't seem like there is a label parameter available?
As an alternative I tried using the 'node' command (https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#-node- allocate node). But that leaves me with two running jobs which, while working, doesn't look that pretty.
Does anyone know if the NodeLabel parameter plugin can be used? Or maybe someone has a cleaner approach?
Edit: Maybe I wasn't clear. I need to be able to run jobs on different nodes, and the node to run on should be decided when triggering the job, through a parameter. The NodeLabel plugin does this perfectly, but I have not been able to reproduce this behaviour in a pipeline.
Here's a full example:
pipeline {
    parameters {
        choice(name: 'node', choices: nodesByLabel('label'), description: 'The node to run on') // example 1: just listing all the nodes with the label
        choice(name: 'node2', choices: ['label'] + nodesByLabel('label'), description: 'The node to run on') // example 2: add the label itself as the first choice to make "any of the nodes" the default choice
    }
    agent none
    stages {
        stage('Test') {
            agent { label params.node }
            stages {
                stage('Print environment settings') {
                    steps {
                        echo "running on ${env.NODE_NAME}"
                        sh 'printenv | sort'
                    }
                }
            }
        }
    }
}
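Note: nodesByLabel comes from the Pipeline Utility Steps plugin; it returns the list of node names carrying the label, which is what makes the choices above work. A tiny sketch:
// Returns a List of node names matching the label
// (offline nodes are excluded by default).
def nodes = nodesByLabel('label')
echo "matching nodes: ${nodes}"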
Let's say you added a parameter (say, named slaveName) using the NodeLabel plugin on your pipeline. You now need to extract the value of slaveName and feed it into the agent->node->label field.
You can specify the node using the node property inside the agent, like this:
agent {
    node {
        label "${slaveName}"
    }
}
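Putting it together, a minimal sketch (assuming the job already has a NodeLabel parameter named slaveName defined in its configuration, since the declarative parameters block has no NodeLabel equivalent):
pipeline {
    // params.slaveName carries the node chosen when triggering the job.
    agent {
        node {
            label "${params.slaveName}"
        }
    }
    stages {
        stage('Where am I') {
            steps {
                sh 'hostname'
            }
        }
    }
}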
The following script worked for me to run multiple jobs in parallel on different nodes.
I took the reference from the build step plugin documentation:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
def build_one() {
    parallel one: {
        stage('XYZ') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [[$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['nodeName'], nodeEligibility: [$class: 'AllNodeEligibility']], string(name: 'ParentBuildName', value: "XX"), string(name: 'Browser', value: 'chrome'), string(name: 'Environment', value: 'envName')]
            }
        }
    },
    two: {
        stage('SecondArea') {
            catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                build job: 'yourDownStreamJob', parameters: [[$class: 'NodeParameterValue', name: 'NodeToRun', labels: ['Your'], nodeEligibility: [$class: 'AllNodeEligibility']], string(name: 'ParentBuildName', value: "XYX"), string(name: 'Browser', value: 'firefox'), string(name: 'Environment', value: 'envName')]
            }
        }
    }
}

build_one()

different parameters in each branch in jenkins multibranch declarative pipeline

I am using a Jenkins scripted pipeline in a multibranch job.
There is a parameter that is only supposed to be available in the trunk, not in any of the branches of the multibranch job.
Currently with a scripted pipeline this is easy to do (inside a shared library or directly in the Jenkinsfile):
def jobParams = [
    booleanParam(defaultValue: false, description: 'param1', name: 'param1')
]
if (whateverCondition) {
    jobParams.add(booleanParam(defaultValue: false, description: 'param2', name: 'param2'))
}
properties([
    parameters(jobParams)
])
I am currently trying to migrate to the Jenkins declarative syntax, but I don't see a simple way to create a parameter that is only available under some conditions (I know I can ignore it, but I don't really want it to show up at all).
The only solution so far is to move the pipeline to a shared library as well (possible since Declarative 1.2). I don't like this solution because the entire pipeline must be replicated, which seems a bit too extreme just for one line.
if (whateverCondition) {
    pipeline {
        agent any
        parameters {
            booleanParam(defaultValue: false, description: 'param1', name: 'param1')
            booleanParam(defaultValue: false, description: 'param2', name: 'param2')
        }
        (...)
    }
} else {
    pipeline {
        agent any
        parameters {
            booleanParam(defaultValue: false, description: 'param1', name: 'param1')
        }
        (...)
    }
}
Is there a way I can extract just the parameter-definition part of the declarative pipeline into a global variable of a shared library, or something similar?
Thanks in advance for any help!
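One possible workaround worth sketching (an assumption on my part, not confirmed in this thread): a Jenkinsfile can run a short scripted preamble before the pipeline block, so the conditional properties() call from the scripted example is kept while everything else stays declarative:
// Scripted preamble: register the conditional parameter list first...
def jobParams = [booleanParam(defaultValue: false, description: 'param1', name: 'param1')]
if (whateverCondition) {
    jobParams.add(booleanParam(defaultValue: false, description: 'param2', name: 'param2'))
}
properties([parameters(jobParams)])

// ...then hand over to a single declarative pipeline with no parameters block.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "param1 = ${params.param1}"
            }
        }
    }
}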

Initialize Jenkins with Pipeline Job

I am building Jenkins with a Dockerfile, and during the Docker build I would like to have Jenkins pre-configured with a set of jobs. I find this works well with Job DSL, where jobs are seeded, but I have yet to preconfigure the "Pipeline" DSL. Given the direction of Jenkins and the use of Jenkinsfile, Pipeline, etc., I think there must be some way to allow Jenkins to automatically run with a set of jobs that were built using the Pipeline approach.
Example Pipeline:
pipeline {
    agent {
        label 'cft'
    }
    parameters {
        string(name: 'StackName', defaultValue: 'cft-stack', description: 'The name to give the CFT stack.')
        string(name: 'KeyName', defaultValue: 'ACCOUNT', description: 'The account key to use for encryption.')
        string(name: 'VpcId', defaultValue: 'vpc-1234', description: 'The VPC to assign to the cluster resources.')
        string(name: 'SubnetID', defaultValue: 'subnet-1234, subnet-6789', description: 'The subnet(s) to assign to the cluster resources.')
    }
    stages {
        stage('Build') {
            steps {
                s3Download(file: 'cft.yaml',
                    bucket: 'cft-resources',
                    path: 'cft.yaml',
                    force: true)
                cfnUpdate(stack: "${params.StackName}",
                    file: "cft.yaml",
                    params: [
                        "SnapshotId=${params.SnapshotId}",
                        "KeyName=${params.KeyName}",
                        "VpcId=${params.VpcId}"
                    ],
                    timeoutInMinutes: 20)
            }
        }
    }
    post {
        failure {
            echo 'FAILURE'
            cfnDelete(stack: "${params.StackName}")
        }
    }
}
Dockerfile:
COPY ./groovy/*.groovy /usr/share/jenkins/ref/init.groovy.d/
Pipeline's Groovy files differ from the Groovy code that can be executed to configure Jenkins. You can't add pipelines the way you're trying to.
Your options include:
copy the XML file for the job definition (pointing to your repo, as the pipeline should be in the Jenkinsfile in the repo)
create a job using Groovy and configure it (not really practicable IMHO)
use Job DSL (again, with XML as a starting point) to specify your Jenkins jobs. An example for automatically adding this can be found in tknerr/jenkins-pipes-infra.
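As a sketch of the second option (an init.groovy.d script that creates a pipeline job programmatically; the job name and inline script are placeholders, and the workflow-job plugin is assumed to be installed):
// /usr/share/jenkins/ref/init.groovy.d/seed-pipeline.groovy
import jenkins.model.Jenkins
import org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition
import org.jenkinsci.plugins.workflow.job.WorkflowJob

def jenkins = Jenkins.instance

// Create the pipeline job only if it does not exist yet.
if (jenkins.getItem('seeded-pipeline') == null) {
    WorkflowJob job = jenkins.createProject(WorkflowJob, 'seeded-pipeline')
    // Inline script for brevity; pointing at SCM (CpsScmFlowDefinition)
    // is closer to the Jenkinsfile-in-repo approach mentioned above.
    job.definition = new CpsFlowDefinition('''
        pipeline {
            agent any
            stages {
                stage('Hello') {
                    steps { echo 'seeded' }
                }
            }
        }'''.stripIndent(), true) // true = run in the Groovy sandbox
    job.save()
}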

Pass variables across Jenkins Multi-Branch Declarative Pipeline Projects

I have project A and project B. I would like to pass parameters (like the BranchName and ArtifactoryID) from project A to project B. Both are multi-branch pipelines using a declarative Jenkinsfile.
When I use the Snippet Generator it tells me the project "is not parameterized". When looking at the config of the multi-branch pipeline, I don't see a way to parameterize it. What am I missing?
A Google result shows this, but I'm not sure how it's supposed to pass params between multi-branch pipelines: https://issues.jenkins-ci.org/browse/JENKINS-32780
I figured this out. I leveraged an answer from a comment here: Pipeline pass parameters to downstream jobs
For a detailed explanation using my example shown above, my Project A jenkinsfile would have the following before the stages:
parameters {
    string(name: 'BRANCH_PASSED_OVER', defaultValue: "${env.BRANCH_NAME}", description: 'pass branch value')
    string(name: 'PERSON2', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
}
...and the following for the build step phase
stage('Build downstream') {
    steps {
        build job: 'BUILD/CMTest2/' + env.BRANCH_NAME.replaceAll("/", "%2F"), wait: false, parameters: [string(name: 'PERSON2', value: params.PERSON2), string(name: 'PASS_BRANCH_NAME', value: env.BRANCH_NAME)]
    }
}
In Project B, my Jenkinsfile could then read the params like so:
stage('Collect Info') {
    steps {
        echo "Hello ${params.PERSON2}"
        echo "PASS_BRANCH_NAME: ${params.PASS_BRANCH_NAME}"
    }
}
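One detail worth adding (my assumption, based on how pipeline parameters resolve): Project B should also declare these parameters in its own Jenkinsfile, otherwise params.PERSON2 and params.PASS_BRANCH_NAME are unset on the first run; a sketch with placeholder defaults:
parameters {
    string(name: 'PERSON2', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
    string(name: 'PASS_BRANCH_NAME', defaultValue: '', description: 'Branch name passed from Project A')
}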
