Using global shared libraries in Jenkins to define parameter options - jenkins

I am trying to use a global class that I've defined in a shared library to help organise job parameters. It's not working, and I'm not even sure if it is possible.
My job looks something like this:
pipelineJob('My-Job') {
    definition {
        // Job definition goes here
    }
    parameters {
        choiceParam('awsAccount', awsAccount.ALL)
    }
}
In a file in /vars/awsAccount.groovy I have the following code:
class awsAccount implements Serializable {
    final String SANDPIT = "sandpit",
    final String DEV = "dev",
    final String PROD = "prod"
    static String[] ALL = [SANDPIT, DEV, PROD]
}
Global pipeline libraries are configured to load implicitly from my repository's master branch.
When attempting to update the DSL scripts I receive the error:
ERROR: (myJob.groovy, line 67) No such property: awsAccount for class: javaposse.jobdsl.dsl.helpers.BuildParametersContext
Why does it not find the class, and is it even possible to use shared library classes like this in a pipeline job?

Disclaimer: I know this works using a Jenkinsfile. Unfortunately, it is not tested using Declarative Pipelines - but there are no answers yet, so it may be worth a try.
Regarding your first question: there are several reasons why a class from your shared lib might not be found, starting with the library import, the library syntax, etc. But shared-lib classes definitely do work with the DSL. To be more precise about your case, additional information would be needed. But make sure that:
You have your groovy class definition using exactly the directory structure as described in the documentation (https://www.jenkins.io/doc/book/pipeline/shared-libraries/)
Give the shared lib a name in Jenkins when you configure it, and be sure it is exactly the name you use in the import
Use the import as described in the documentation (under Using Libraries)
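For reference, a minimal sketch of what the import side usually looks like in a Jenkinsfile (the library name my-shared-lib and the package org.acme are placeholders for whatever you actually configured):

@Library('my-shared-lib') _   // can be omitted if the library is configured to load implicitly
// Classes under src/ need an explicit import; globals under vars/ (such as vars/awsAccount.groovy)
// are available by file name without any import.
import org.acme.AwsAccount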
Regarding your second question (the one that gives this SO question its title): yes, you can build job parameters from information in your shared lib - at least when using Jenkinsfiles. You can even define properties to be included in the pipeline. I got it working with a slightly tricky syntax due to various problems.
Again, I am using Jenkinsfile and this is what worked for me:
In my shared-lib class, I added a static function that introduces the build parameters. Notice the input parameters that function needs and its usage:
class awsAccount implements Serializable {
    //
    static giveMeParameters(script) {
        return [
            // Some parms
            script.string(defaultValue: '', description: 'A default parameter', name: 'textParm'),
            script.booleanParam(defaultValue: false, description: 'If set to True, do whatever you need - otherwise, do not do it', name: 'boolOption'),
        ]
    }
}
To introduce those parameters in the pipeline, you need to place the return value of the function into the parameters array:
properties([
    parameters(
        awsAccount.giveMeParameters(this)
    )
])
Again, notice the syntax when calling the function. In a similar way, you can also define functions in the shared lib that return properties and reuse them in multiple jobs (disableConcurrentBuilds, buildDiscarder, etc.); a sketch of that follows below.
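For example, a minimal sketch of such a helper, assuming a made-up class name jobDefaults and an arbitrary retention value (the pass-the-script trick is the same as above):

class jobDefaults implements Serializable {
    // Returns a list of job properties so that several pipelines can share the same defaults
    static giveMeProperties(script) {
        return [
            script.disableConcurrentBuilds(),
            script.buildDiscarder(script.logRotator(numToKeepStr: '10'))
        ]
    }
}

In the Jenkinsfile both helpers can then feed a single properties call, e.g. properties(jobDefaults.giveMeProperties(this) + [parameters(awsAccount.giveMeParameters(this))]).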

Related

How can I reference my constant within a Jenkins Parameter?

I have the following code in a Pipelineconstant.groovy file:
public static final list ACTION_CHOICES = [
    N_A,
    FULL_BLUE_GREEN,
    STAGE,
    FLIP,
    CLEANUP
]
and these parameters in my Jenkins multi-Rapper-file:
parameters {
    string (name: 'ChangeTicket', defaultValue: '000000', description : 'Prod change ticket otherwise 000000')
    choice (name: 'AssetAreaName', choices: ['fpukviewwholeof', 'fpukdocrhs', 'fpuklegstatus', 'fpukbooksandjournals', 'fpukleglinks', 'fpukcasesoverview'], description: 'Select the AssetAreaName.')
    /* groovylint-disable-next-line DuplicateStringLiteral */
    choice (name: 'AssetGroup', choices: ['pdc1c', 'pdc2c'])
}
I would like to reference ACTION_CHOICES in the parameters like this:
choice (name: 'Action', choices: constants.ACTION_CHOICES, description: 'Multi Version deployment actions')
but it doesn't work for me.
You're almost there! A Jenkinsfile can be extended with variables/constants defined either directly in the file or (better, I'd say) in a Jenkins shared library (this scenario).
The parameter syntax within your pipeline was fine, as was the idea of a list of constants; what was missing was a proper link between those parts - the library import. See the example below (the names in the example are not carved in stone and can of course be changed, but watch out - Jenkins is quite sensitive about filenames and paths, especially in shared libraries):
A) With a Jenkins shared library: Pipelineconstant.groovy should be placed in src/org/pipelines of your Jenkins shared library.
Pipelineconstant.groovy
package org.pipelines

class Pipelineconstant {
    public static final List<String> ACTION_CHOICES = ["N_A", "FULL_BLUE_GREEN", "STAGE", "FLIP", "CLEANUP"]
}
and then you can reference this list of constants within your Jenkinsfile pipeline.
Jenkinsfile
@Library('jsl-constants') _
import org.pipelines.Pipelineconstant

pipeline {
    agent any
    parameters {
        choice (name: 'Action', choices: Pipelineconstant.ACTION_CHOICES, description: 'Multi Version deployment actions')
    }
    // rest of your pipeline code
}
The first two lines of the pipeline are important - the first loads the shared library (JSL) itself, and only then can the import on the second line be used (otherwise Jenkins would not know where to find that Pipelineconstant.groovy file).
B) Without a Jenkins shared library (files in one repo):
I've found this topic discussed and solved for scripted pipeline here: Load jenkins parameters from external groovy file
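For completeness, a minimal sketch of that scripted-pipeline approach (the file name parameters.groovy and the helper name buildParameters are placeholders; the file lives in the same repo as the Jenkinsfile):

// parameters.groovy
def buildParameters() {
    return [
        choice(name: 'Action',
               choices: ['N_A', 'FULL_BLUE_GREEN', 'STAGE', 'FLIP', 'CLEANUP'],
               description: 'Multi Version deployment actions')
    ]
}
return this

// Jenkinsfile (scripted)
node {
    checkout scm                               // the file must be in the workspace before load
    def external = load 'parameters.groovy'    // returns the script object thanks to 'return this'
    properties([parameters(external.buildParameters())])
}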

Jenkinsfile shared params in source control

I'm new to Jenkins and inherited a bunch of declarative pipelines of unknown code quality. Each pipeline uses folder properties to set shared default parameter values. This puts essential variables outside of source control, which kills our PR process and our history for debugging. For example:
//pipelineA/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    //etc
}

//pipelineB/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    //etc
}
Then in the root folder a property important_variable is set to "Hello World"
Is there a way to get this into source control either by setting the folder property to extract the variable from a yaml, or by using shared libraries?
Thank you for any help!
In case anyone reads this, we ended up:
Create a bootstrap.groovy file
This file MUST go in a /vars directory at the absolute top of your repo
Using the Jenkins UI we went to the pipeline's parent directory > config and created a shared library called config-lib that points at our repo and automatically exposes the bootstrap.groovy file methods as long as the file is in the right place
The bootstrap.groovy file has a call method that returns a map with key-value pairs for our default parameters. This method has to be named call (a sketch of the file follows below).
In the Jenkinsfile for the pipeline we include the following two lines:
@Library("config-lib") _
config = bootstrap()
The @Library annotation (note the line ends with _) loads the config-lib library configured in the Jenkins UI
The bootstrap function calls the call method from the bootstrap.groovy file in that config-lib library
In the Jenkinsfile, use the config map to populate the parameter default values:
pipeline {
    parameters {
        string name: 'foo', defaultValue: config.foo
    }
    // etc
}
And it's done.
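For illustration, a minimal sketch of what such a vars/bootstrap.groovy could look like (the key foo and its value are made up; only the call name and the returned map come from the steps above):

// vars/bootstrap.groovy
def call() {
    return [
        foo: 'Hello World'   // default values consumed by the pipelines' parameters
    ]
}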
This video helped immensely: https://youtu.be/Wj-weFEsTb0

Invoke block passed to pipeline step with parameters, from plugin

I'm trying to write a Jenkins plugin that provides a step myStep which expects a block with a single parameter, per below:
myStep { someParameter -> <user code> }
I've found that BodyInvoker (retrieved from StepContext.newBodyInvoker()) provides no facility to invoke the user-provided block with parameters.
Expanding the environment would not be ideal: even though the type of the parameter is serializable (to/from String), I'd have to provide additional helpers to carry out this serialization, e.g.
myStep { deserialize "${env.value}" <user code> }
Do I have any other option to pass a non-string type into the provided block? Would the type information of the parameter survive even if I did?
NB: I understand you can return a value from your Execution.run(), which will be the return value of the step in the pipeline. It's just that in a related shared pipeline library I'm already leaning heavily into this pattern:
withFoo { computedFoo ->
    // something with computedFoo
    withBar(computedFoo) { computedBar ->
    }
}
I prefer this over:
computedFoo = withFoo()
// something with computedFoo
withBar(computedFoo)
...then again, I couldn't find any plugins pulling this off.
No matter how closely I look at workflow-step-api-plugin, this doesn't seem possible today. The options are:
expand the environment context with a string value
add a custom object to the context ( requires access to step context in pipeline )
use a return value

gradle test fail when using slackNotifier in Jenkins Job DSL definition

Update:
From the bottom of the Automatically Generated DSL wiki entry: "... The generated DSL is only supported when running in Jenkins, ..."
Since slackNotifier is generated DSL, it doesn't appear that there is a way to test this in our particular infrastructure. We're going to write a function which generates the config using the configure block.
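For reference, a rough sketch of what such a configure-block approach might look like inline in the job definition (the XML element name jenkins.plugins.slack.SlackNotifier and the exact set of child elements are assumptions - check the config.xml of a job configured through the UI for the real structure):

job('seed-dsl') {
    // ... other configuration ...
    configure { project ->
        project / 'publishers' << 'jenkins.plugins.slack.SlackNotifier' {
            room('my-team-room')
            notifyAborted(true)
            notifyFailure(true)
            notifyBackToNormal(true)
            notifySuccess(false)
        }
    }
}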
I have a seed job definition which is failing gradle test even though it seems to work fine when we use it in Jenkins.
Job Definition Excerpt
//package master
// GitURL
def gitUrl = 'https://github.com/team/myapp'
def slackRoom = null

job('seed-dsl') {
    description('This seed is updated from the seed-dsl-updater job')
    properties {
        //Set github project URL
        githubProjectUrl(gitUrl)
    }
    ...
    // publishers is another name for post build steps
    publishers {
        mailer('', false, true)
        slackNotifier {
            room(slackRoom)
            notifyAborted(true)
            notifyFailure(true)
            notifyNotBuilt(true)
            notifyUnstable(true)
            notifyBackToNormal(true)
            notifySuccess(false)
            notifyRepeatedFailure(false)
            startNotification(false)
            includeTestSummary(false)
            includeCustomMessage(false)
            customMessage(null)
            buildServerUrl(null)
            sendAs(null)
            commitInfoChoice('NONE')
            teamDomain(null)
            authToken(null)
        }
    }
}
The gradle test command works fine when I comment out the slackNotifier declaration, but fails with the following error when it's enabled:
Test output excerpt
Caused by:
javaposse.jobdsl.dsl.DslScriptException: (script, line 79) No signature of method: javaposse.jobdsl.dsl.helpers.publisher.PublisherContext.slackNotifier() is applicable for argument types: (script$_run_closure1$_closure9$_closure14) values: [script$_run_closure1$_closure9$_closure14#d2392a1]
Possible solutions: stashNotifier(), stashNotifier(groovy.lang.Closure)
at javaposse.jobdsl.dsl.DslScriptLoader.runScriptEngine(DslScriptLoader.groovy:135)
at javaposse.jobdsl.dsl.DslScriptLoader.runScriptsWithClassLoader_closure1(DslScriptLoader.groovy:78)
According to the migration doc, slackNotifier has been supported since 1.47. In my build.gradle, I'm using 1.48. I see the same errors with plugin version 1.50.
build.gradle excerpt
ext {
    jobDslVersion = '1.48'
    ...
}
...
// Job DSL plugin including plugin dependencies
testCompile "org.jenkins-ci.plugins:job-dsl:${jobDslVersion}"
testCompile "org.jenkins-ci.plugins:job-dsl:${jobDslVersion}@jar"
...
The build.gradle also includes the following, as suggested by the testing docs (https://github.com/jenkinsci/job-dsl-plugin/wiki/Testing-DSL-Scripts):
testPlugins 'org.jenkins-ci.plugins:slack:2.0.1'
What do I need to do to be able to successfully test my job definitions? Is this a bug, or have I missed something else?
removed incorrect reply
EDIT
I see I missed the point.
The new approach is to reuse the @DataBoundConstructor exposed by plugins, so nothing needs to be written to support a new plugin, assuming it has a @DataBoundConstructor.
Your SlackNotifier has this - note the DSL lower-cases the first letter for you:
@DataBoundConstructor
public SlackNotifier(
        final String teamDomain,
        final String authToken,
        final String room,
        final String buildServerUrl,
        final String sendAs,
        final boolean startNotification,
        final boolean notifyAborted,
        final boolean notifyFailure,
        final boolean notifyNotBuilt,
        final boolean notifySuccess,
        final boolean notifyUnstable,
        final boolean notifyBackToNormal,
        final boolean notifyRepeatedFailure,
        final boolean includeTestSummary,
        CommitInfoChoice commitInfoChoice,
        boolean includeCustomMessage,
        String customMessage) {
    ...
}
Unfortunately there is an embedded type in the parameter list, CommitInfoChoice, which does not have a @DataBoundConstructor, and it's an enum too.
public enum CommitInfoChoice {
    NONE("nothing about commits", false, false),
    AUTHORS("commit list with authors only", true, false),
    AUTHORS_AND_TITLES("commit list with authors and titles", true, true);
    ...
}
I'll go out on a limb and say that it won't work out of the box until the nested enum implements a data-bound constructor and also has a descriptor, sorry.
I don't have the plugin, but you can look at the XML of a real job created with the plugin and see what goes into this section. I suspect it is a nested structure.
You can try the Job DSL Google group - link to a post about the generic approach.
We ran into this as well. The solution for us was to add the Slack plugin version we were using on Jenkins to our list of plugins in Gradle.
To be more specific, in our build.gradle file under dependencies, we added the following code to get our plugins included and hence allow the auto-generated DSL to work.
You can see this described here and an example of a different plugin next to testPlugins:
https://github.com/jenkinsci/job-dsl-plugin/wiki/Testing-DSL-Scripts
Like the following:
dependencies {
    ...
    // plugins to install in test instance
    testPlugins 'org.jenkins-ci.plugins:ghprb:1.31.4'
    testPlugins 'com.coravy.hudson.plugins.github:github:1.19.0'
}
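In this case, the corresponding line for the Slack plugin is the one already shown in the question above (the version should match whatever your Jenkins instance actually runs):

testPlugins 'org.jenkins-ci.plugins:slack:2.0.1'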

How does variable scoping work when splitting a workflow into smaller chunks?

I have a very long workflow for building and testing our application. So long, in fact, that when we try to load the main workflow script, we get this exception:
java.lang.ClassFormatError: Invalid method Code length 67768 in class file WorkflowScript
I am not proud of this. I'm trying to split the workflow into smaller scripts that we load from the main workflow script, but am running into an issue with variable scoping. For example:
def a = 'foo' //some variable referenced in multiple workflow stages
node {
    echo a
}
//... and then a whole bunch of other stages
might become
def a = 'foo' //some variable referenced in multiple workflow stages
node {
    git: ...
    load 'flowPartA.groovy'
}()
where flowPartA.groovy looks like:
{ ->
    node {
        echo a
    }
}
Based on my understanding of the documentation, where flowPartA.groovy is interpreted as a closure, I expected the variable 'a' to remain in scope, but instead I get an exception to the contrary:
groovy.lang.MissingPropertyException: No such property: a for class: groovy.lang.Binding
Am I missing something about the way workflow interprets the flow scripts? Is there a good way to take a huge workflow that uses many, many parameters and split it into smaller chunks?
You have to define a function in the external Groovy file and call it, passing in all required parameters:
def a = 'foo'
node('slave') {
    git '…'
    def flow = load 'flowPartA.groovy'
    flow.echoFromA(a)
}
And flowPartA.groovy contains:
def echoFromA(String a) {
    echo a
}

return this
See the documentation for more information.
