I want to implement reusable functions/methods to use in my Jenkins pipeline...
listAllUnitTests([
$class: 'MyUtilities',
arg1: 'foo',
arg2: 'bar'
])
What's not clear is how to actually do it; is this a plugin, extension, what is it?
The Hunt
So I started with something familiar, such as Git checkout...
node {
checkout([
$class: 'GitSCM',
branches: scm.branches,
extensions: scm.extensions,
userRemoteConfigs: scm.userRemoteConfigs
])
}
Looking at the source for GitSCM, a Jenkins plugin, the checkout method appears to be fairly standard; no special annotations or anything else, although I'm not sure how the pipeline arguments align with the method signature because there's clearly a mismatch. I suspect I'm on the wrong track.
@Override
public void checkout(
Run<?, ?> build,
Launcher launcher,
FilePath workspace,
TaskListener listener,
File changelogFile,
SCMRevisionState baseline)
throws IOException, InterruptedException {
Question
I'll keep it simple: how do I implement parameterized functionality to invoke from Jenkins pipelines to achieve something like this?
node {
stage('test'){
myUtilMethod([
$class: 'MyUtilities',
arg1: 'foo',
arg2: 'bar'
])
}
}
You can implement one or more libraries using https://github.com/jenkinsci/workflow-cps-global-lib-plugin/
I recommend explicitly specifying that you need the library (with the @Library annotation as mentioned in the page above) rather than making it implicitly available. This way you can use a specific branch of it on test jobs while developing and testing your library.
Check out fabric8 for a pretty comprehensive set of examples: https://github.com/fabric8io/fabric8-pipeline-library
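As a minimal sketch of what such a library function could look like — the name myUtilMethod and its arguments are taken from the question, not from any real API — you create a file in the library's vars/ folder:

```groovy
// vars/myUtilMethod.groovy in the shared library repository.
// A global variable defined this way becomes a pipeline step
// named after the file.
def call(Map args) {
    // args is the map passed from the pipeline, e.g. [arg1: 'foo', arg2: 'bar']
    echo "Running with arg1=${args.arg1}, arg2=${args.arg2}"
}
```

After importing the library with @Library('my-shared-lib') _ at the top of the Jenkinsfile, the step can be called as myUtilMethod(arg1: 'foo', arg2: 'bar'). Note that with a shared library you don't need the $class map style from the question at all.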
Related
I am trying for a lot of time to get the current branch name in MultibranchPipeline Job inside an Active Choice Reactive Reference Parameter Formatted HTML parameter script block
[
$class: 'DynamicReferenceParameter',
choiceType: 'ET_FORMATTED_HTML',
name: 'TestParam',
omitValueField: true,
description: 'Test.',
script: [
$class: 'GroovyScript',
fallbackScript: [
classpath: [],
sandbox: false,
script: '''
return """
<p>FallbackScript. Error in main script</p>
"""
'''
],
script: [
classpath: [],
sandbox: false,
script: '''
String branchName = env.BRANCH_NAME
return """
<p>${branchName}</p>
"""
'''
]
]
]
The thing is that, I believe, the BRANCH_NAME param is injected after you press the Build button.
I've tried a lot of things, and I mean a lot, but I still didn't manage to find a way. The scm variable doesn't exist either; I tried to find something through jenkins.model.Jenkins.instance, but no luck.
Is it possible? I would love to ask this question on their GitHub repo, but opening issues is not allowed there. And to open an issue on Jenkins you need a Jira account or something. SO is the only place.
Thanks to Michael's answer, I managed to find a way to make this work. There is a lot more to it than meets the eye, but I will go through all the details. I also answered this question here.
I make the assumption that the reader is familiar with the Active Choices plugin. Also, I played with this in a multibranch pipeline job. You might encounter different behaviours with other kinds of jobs.
The parameters sadly don't have access to the environment variables. This is a bit of a limitation which I hope will be fixed/thought of in the future by the plugin's maintainers.
Some environment variables are only populated at build time, like BRANCH_NAME. In this case, even if we had access to the env vars we wouldn't have the actual value at hand.
To be able to use the env.BRANCH_NAME we need two reactive parameters.
The plugin has a parameter named FORMATTED_HIDDEN_HTML. This parameter doesn't get displayed to the user. This is great since we wouldn't want to see in a multibranch pipeline job a parameter with the same name as the branch we are currently on.
To set this parameter, we can write something like this in a Jenkinsfile.
[
$class: 'DynamicReferenceParameter',
choiceType: 'ET_FORMATTED_HIDDEN_HTML',
name: 'BranchName',
omitValueField: true,
script: [
$class: 'GroovyScript',
fallbackScript: [
classpath: [],
sandbox: true,
script: '''
return '<p>error</p>'
'''
],
script: [
classpath: [],
sandbox: true,
script: """
return '<input name="value" value="${env.BRANCH_NAME}" type="text">'
"""
]
]
]
There are a lot of things to note here.
The sandbox property is set to true. If you don't do that, you will need to approve the script in the Script Approval menu in Jenkins.
We use triple-double quotes when we define the script property.
script: """
return '<input name="value" value="${env.BRANCH_NAME}" type="text">'
"""
When the job is started for the first time, the BRANCH_NAME variable is populated. This results in a string interpolation which gets your script property in the following state:
script: """
return '<input name="value" value="myBranchName" type="text">'
"""
If we had used triple-single quotes, we would have gotten an error like:
hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: env for class: WorkflowScript
This gets us back to the fact that we don't have access to the environment variables.
What to conclude from this? Well, if we use triple-double quotes, first we have a string interpolation, then the script is run.
The HTML element that must be used is input. This is explained in the docs if you read them carefully. Not only that, but the name attribute must also be set to value; this too is explained in the docs.
omitValueField should be set to true, or else you will get a trailing comma in your value. E.g.: myBranchName,
Basically, the first time you run the job your branch name is populated via string interpolation. Only after the second build will you have the value available to use; you will always reference the value from the previous build.
After all that, you can reference this parameter in other Active Choices parameter types via referencedParameters property.
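For illustration, a visible parameter consuming the hidden one might look roughly like this — a sketch assuming the BranchName parameter defined above; referenced parameters are injected into the Active Choices script as Groovy variables:

```groovy
[
    $class: 'DynamicReferenceParameter',
    choiceType: 'ET_FORMATTED_HTML',
    name: 'BranchInfo',
    omitValueField: true,
    // The hidden BranchName parameter becomes a variable in the script below.
    referencedParameters: 'BranchName',
    script: [
        $class: 'GroovyScript',
        fallbackScript: [classpath: [], sandbox: true, script: 'return "<p>error</p>"'],
        script: [
            classpath: [],
            sandbox: true,
            // Triple-single quotes on purpose: interpolation happens inside
            // the Active Choices script, not in the Jenkinsfile.
            script: '''return "<p>Branch is ${BranchName}</p>"'''
        ]
    ]
]
```

This is not from the original answer; the parameter name and HTML are placeholders, but the referencedParameters wiring is the mechanism the answer alludes to.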
I desperately needed this because I have a complex use case scenario. I'm making requests to an Azure Container Registry to get all the tags for a certain image for a certain branch.
This plugin is great, and I'm glad it exists. I would've loved a lot more documentation and examples, though.
Have a look at Groovy's string interpolation.
tl;dr You can access values by using """ and ${variable}
script: """
return "<p>${env.BRANCH_NAME}</p>"
"""
I am using jenkins jobDsl as follows:
#!groovy
node('master') {
stage('Prepare') {
deleteDir()
checkout scm
}
stage('Provision Jobs') {
jobDsl(targets: ['jenkins/commons.groovy', 'folderA/jenkins/jobA.groovy'].join('\n'),
removedJobAction: 'DELETE',
removedViewAction: 'DELETE',
sandbox: false)
}
}
Where I want to use, from jobA.groovy, a function that is defined in commons.groovy.
Currently, jobA.groovy doesn't have access to the function defined in commons.groovy. How can I enable this behavior?
Attached:
jobA.groovy:
test_job("param1", "param2")
commons.groovy:
def test_job(String team, String submodule) {
pipelineJob("${team}/${submodule}/test_job") {
displayName("Test Job")
description("This is a Continuous Integration job for testing")
properties {
githubProjectUrl("githubUrl")
}
definition {
cpsScm {
scm {
git {
remote {
url('githubUrl')
credentials('credentials')
refspec('+refs/pull/*:refs/remotes/origin/pr/*')
}
branch('${sha1}')
scriptPath("scriptPath")
}
}
}
}
}
}
The idea is to be able to call this method, test_job("param1", "param2"), from jobA.groovy with no issues, but I am currently getting:
ERROR: (jobA.groovy, line 9) No signature of method: test_job() is applicable for argument types: (java.lang.String, java.lang.String)
JobDSL creates the jobs. Then at runtime you want your job to call your function. The function must be imported through a shared library.
Create a shared lib
here is a sample: https://github.com/sap-archive/jenkins-pipelayer
The most important piece there is that you need to create a vars/ folder that defines the functions you can call from your pipelines. Host the library in its own repository or an orphan branch.
Import a shared lib
To import a library in Jenkins: from the Manage Jenkins page, go to Configure System. Under the Global Pipeline Libraries section, add a new library with a name of your choice, e.g. name-of-your-lib, default version master, and a Modern SCM Git source pointing at https://urlofyoursharedlib.git.
Run the jobDsl job once, then go to the In-process Script Approval page and approve everything.
Use a shared lib
To import the library inside your job you must include, at the top of the file after the shebang, the statement @Library('name-of-your-lib') _
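In a Jenkinsfile this looks like the following (the library name is whatever you registered under Global Pipeline Libraries; the node body is just a placeholder):

```groovy
#!/usr/bin/env groovy
// Annotation form: the shared library is loaded when the pipeline is compiled.
@Library('name-of-your-lib') _

node {
    // Functions defined in the library's vars/ folder can be called here.
}
```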
There is also a similar step, library 'name-of-your-lib'. This one is useful for debugging and fixing a shared library, because when you hit the Replay button you'll see the shared library files used in the pipeline.
Finally, if all you are trying to do is create job templates, I would recommend studying what the shared library linked above does: it helps with creating declarative templates and works around issues and limitations you will encounter with Job DSL and shared pipelines.
I am trying to create Multibranch Pipeline Jobs using Job DSL, but I want to disable concurrent builds on each branch. I have tried the following code snippet but it didn't work, "Do not allow concurrent builds" is still unchecked on new branches.
multibranchPipelineJob("${FOLDER_NAME}/${JOB_NAME}") {
branchSources {
git {
remote("https://gitlab.com/${REPO_PATH}")
credentialsId('gitlab_credentials')
includes('*')
}
}
configure {
def factory = it / factory(class: 'com.cloudbees.workflow.multibranch.CustomBranchProjectFactory')
factory << disableConcurrentBuilds()
}
orphanedItemStrategy {
discardOldItems {
numToKeep(1)
}
}
}
I also tried this in configure closure:
factory << properties {
disableConcurrentBuilds()
}
But this one caused following exception to be thrown:
19:03:50 groovy.lang.GroovyRuntimeException: Ambiguous method overloading for method groovy.util.Node#leftShift.
19:03:50 Cannot resolve which method to invoke for [null] due to overlapping prototypes between:
19:03:50 [class groovy.util.Node]
19:03:50 [class java.lang.String]
I have this need as well. I notice that in my Jenkins instance the Job DSL API docs indicate that the disableConcurrentBuilds() property is NOT supported in multibranch pipeline jobs.
I just returned to a related discussion I was having with @tknerr, in which he pointed out that there IS a rate-limiting feature available to multibranch pipelines via Job DSL.
My team just ran into a problem with pollSCM triggers running amok due to this Jenkins bug, so I'm implementing this in Job DSL to make our jobs more robust. Like you, I had wanted to just use disableConcurrentBuilds like can be done in pipelines, but since rate limiting appears to be the only option currently available to multibranch pipelines, I experimented with putting this in our Job DSL:
strategy {
defaultBranchPropertyStrategy {
props {
rateLimitBranchProperty {
count(2)
durationName("hour")
}
}
}
}
This is of course a horrible workaround, since it places a nasty dependency in the Job DSL of needing to know how long builds take, but I'm willing to accept this alternative to having to push the disableConcurrentBuilds option to the Jenkinsfile on hundreds of branches.
It is also barely effective at achieving the goal, since we want to allow concurrent builds across branches but prevent individual branch jobs from being built "too fast".
We should check if there is a feature request in Jenkins for this (your original request).
In my Jenkins instance (v2.222.3, Pipeline: Multibranch v2.22), the setting is described here for applying it to "all branches":
https://<my_jenkins_fqdn>/plugin/job-dsl/api-viewer/index.html#path/multibranchPipelineJob-branchSources-branchSource-strategy-allBranchesSame-props-rateLimit
and here for applying it to specific branches:
https://<my_jenkins_fqdn>/plugin/job-dsl/api-viewer/index.html#path/multibranchPipelineJob-branchSources-branchSource-strategy-namedBranchesDifferent-defaultProperties-rateLimit
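Based on that first API path, applying the rate limit to all branches in Job DSL would look roughly like this — a sketch only; verify the exact block names against your own instance's API viewer:

```groovy
multibranchPipelineJob('my-folder/my-job') {
    branchSources {
        branchSource {
            source {
                git {
                    id('my-git-source')
                    remote('https://gitlab.com/group/repo.git')
                }
            }
            // Apply the same property to every branch discovered by this source.
            strategy {
                allBranchesSame {
                    props {
                        rateLimit {
                            count(2)
                            durationName('hour')
                        }
                    }
                }
            }
        }
    }
}
```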
EDIT: Also wanted to link to a related Jenkins issue here.
I'm migrating a Free Style job to a Pipeline on Jenkins. The Freestyle Job uses the ExportParametersBuilder (Export Parameters to File) plug-in. This is important for our workflow because the application expects the parameters as a JSON file.
I have tried with a Basic Step, as documented in Pipeline: Basic Steps - Jenkins documentation (search for ExportParametersBuilder):
step([
$class: 'ExportParametersBuilder',
filePath: 'config/parameters',
fileFormat: 'json',
keyPattern: '',
useRegexp: 'false'
])
But when I try to run the Pipeline I get the following error:
No known implementation of interface jenkins.tasks.SimpleBuildStep is named ExportParametersBuilder
The Pipeline Job is running on the same Jenkins instance as the Freestyle Job (which is currently working). So, the Plug-in is installed and working. I'm not sure why this is happening.
Does anyone know if this plug-in can be used in Pipeline jobs? And if so, how? What am I missing?
If it cannot be used, my apologies; Jenkins' documentation is often misleading.
I couldn't find a way to use the plug-in, but I found an alternative. I'm leaving it here in case it's useful for someone else.
// Import the JsonOutput class at the top of your Jenkinsfile
import groovy.json.JsonOutput
...
stage('Environment Setup') {
steps {
writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(params))
}
}
This is probably not the cleanest or most elegant way to do it, but it works. The params are all written to the JSON file, and the JsonOutput class takes care of all the escaping magic and so on.
Do keep in mind that the format of the JSON file is a little different from the one ExportParametersBuilder created, so you'll need to adapt for it:
ExportParametersBuilder format:
[
...
{
"key": "target_node",
"value": "c3po"
}
...
]
JsonOutput format:
{
...
"target_node": "c3po"
...
}
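If a consumer really depends on the old list-of-objects shape, a small adapter — a hypothetical helper, not part of the original answer — can reshape the params map before serializing (in a sandboxed pipeline, collect may require script approval):

```groovy
import groovy.json.JsonOutput

// Reshape a flat parameters map into the list-of-objects layout
// that ExportParametersBuilder used to produce.
def toLegacyFormat(Map params) {
    JsonOutput.toJson(params.collect { k, v -> [key: k.toString(), value: v.toString()] })
}

// Example with plain Groovy; in a Jenkinsfile you would pass the real params map.
def sample = [target_node: 'c3po', build_type: 'debug']
println toLegacyFormat(sample)
// → [{"key":"target_node","value":"c3po"},{"key":"build_type","value":"debug"}]
```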
I'm trying to convert old Jenkins jobs to declarative pipeline code.
For the choice parameter, I implemented a function that should return updated values; if the values are not the most recent ones, the job will fail.
The problem is that after the first build, which looks OK, the values stay static; they don't get updated afterwards, which, as I said above, fails my job.
It's like the function I wrote runs only once, at the first build, and never runs again.
I've tried writing the code so that the output is written to a file and read back from it, hoping the function would pick up the updated text from the file; that didn't work.
I've tried looking at the Jenkins documentation / a lot of other threads and didn't find a thing.
My code looks like this:
def GetNames() {
def workspace = "..."
def proc = "${workspace}/script.sh list".execute()
return proc.text
}
${workspace} - Is just my workspace, doesn't matter.
script.sh - A script that 100% works and tested
return proc.text - Does return the values, I've tested it in my Jenkins website/script section and the values do return properly and updated.
My parameters section:
parameters {
choice(name: 'Names', choices: GetNames(), description: 'The names')
}
On the first build I get 5 names, which is good because those are the updated values. On the second build I know there are 10 values, but I still get the 5 from before, and on every build after that I still get the same 5 names. They do not get updated at all; the function does not get triggered again.
It seems like this is a very long-running issue that still hasn't been patched. The only thread that mentioned it was this one:
Jenkins dynamic declarative pipeline parameters, but the solution there is scripted rather than declarative.
Well, I've finally figured it out: the solution is combining the declarative and scripted ways, using the Active Choices parameter plugin.
node {
properties([
parameters([
[$class: 'ChoiceParameter',
choiceType: 'PT_SINGLE_SELECT',
description: 'The names',
filterLength: 1,
filterable: true,
name: 'Name',
randomName: 'choice-parameter-5631314439613978',
script: [
$class: 'GroovyScript',
script: [
classpath: [],
sandbox: false,
script: '''
some code.....
return something'''
]
]
],
])
])
}
pipeline {
agent any
.
.
This way, the script part of the Active Choices parameter runs every time you load the build page, so the values come back updated every time.