I have a Jenkins Job DSL seed job that generates a couple of pipeline jobs, e.g.:
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
pipelineJob("job2") {
definition {
cps {
script(readFileFromWorkspace('job2.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
job1.groovy and job2.groovy are standard Jenkinsfile style pipelines.
I want to pass a couple of common maps into these jobs. These contain things that may vary between environments, e.g. target servers, credential names.
Something like:
def SERVERS_MAP = [
    'prod': [
        'prod-server1',
        'prod-server2',
    ],
    'dev': [
        'dev-server1',
        'dev-server2',
    ],
]
Can I define a map in my seed job that I can then pass and access as a map in my pipeline jobs?
I've come up with a hacky workaround using the pipeline-utility-steps plugin.
Essentially I pass my data maps around as JSON.
So my seed job might contain:
def SERVERS_MAP = '''
{
    "prod": [
        "prod-server1",
        "prod-server2"
    ],
    "dev": [
        "dev-server1",
        "dev-server2"
    ]
}
'''
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
stringParam('SERVERS_MAP', "${SERVERS_MAP}", "")
}
}
}
and my pipeline would contain something like:
def serversMap = readJSON text: SERVERS_MAP
def targetServers = serversMap["${ENV}"]
targetServers.each { server ->
    echo server
}
I could also extract these variables into a JSON file and read them from there.
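For example, a minimal sketch of that file-based variant (assuming a servers.json file ends up in the pipeline job's workspace; the file name is just an illustration):
// servers.json (hypothetical): {"prod": ["prod-server1", "prod-server2"], "dev": ["dev-server1", "dev-server2"]}
def serversMap = readJSON file: 'servers.json'   // pipeline-utility-steps
def targetServers = serversMap[params.ENV]
targetServers.each { server ->
    echo server
}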
Although it works, it feels wrong somehow.
You can use a string parameter to pass the map value; the downstream job then reads it back as JSON.
UPSTREAM PIPELINE
timestamps {
    node("sse_lab_CI_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage("-- regression execute --") {
            def test_map = """
            {
                "gerrit_patchset_commit": "aad5fce",
                "build_cpu_x86_ubuntu": [
                    "centos_compatible_build_test",
                    "gdb_compatible_build_test",
                    "visual_profiler_compatible_build_test"
                ]
            }
            """
            build(job: 'tops_regression_down',
                  parameters: [
                      string(name: 'UPSTREAM_JOB_NAME', value: "${env.JOB_BASE_NAME}"),
                      string(name: 'UPSTREAM_BUILD_NUM', value: "${env.BUILD_NUMBER}"),
                      string(name: 'MAP_PARAM', value: "${test_map}"),
                  ],
                  propagate: true,
                  wait: true)
        }
    }
}
DOWNSTREAM PIPELINE
timestamps {
    node("sse_lab_inspur_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage('--in precondition--') {
            dir('./') {
                cleanWs()
                println("hello world")
                println("${env.MAP_PARAM}")
                Map result_json = readJSON(text: "${env.MAP_PARAM}")
                println(result_json)
            }
        }
    }
}
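If you would rather not hand-craft the JSON string, the same approach works with a real Groovy map serialized via groovy.json.JsonOutput (a sketch of the upstream side only, reusing the job and parameter names from above):
import groovy.json.JsonOutput

def testMap = [
    gerrit_patchset_commit: 'aad5fce',
    build_cpu_x86_ubuntu: [
        'centos_compatible_build_test',
        'gdb_compatible_build_test',
        'visual_profiler_compatible_build_test'
    ]
]

build(job: 'tops_regression_down',
      parameters: [string(name: 'MAP_PARAM', value: JsonOutput.toJson(testMap))],
      propagate: true,
      wait: true)
The downstream job can keep using readJSON text: "${env.MAP_PARAM}" exactly as shown above.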
Related
I've written a pipeline as a shared library and I would like to call it as one of the stages of a master pipeline, but I am getting an error, probably because node is not defined. What is the best approach for that? Alternatively, I could rewrite sharedTest as a standalone pipeline and use "build job" instead of calling the shared library, but then I would be repeating code in some places.
So generally I would like to have:
sharedTest as an independent pipeline that I can also reuse in some places. The first part is simple, because I can create a separate pipeline that imports the library and calls the library method. The problem is when I want to use the shared pipeline as a stage of the master pipeline.
sharedTest.groovy :
def call() {
    pipeline {
        agent {
            label "ansirobotSpy3-devel"
        }
        parameters {
            choice(name: 'TEST', choices: ['bts1', 'bts2'], description: '')
            string(name: 'PATH', defaultValue: '/bts1/', description: '')
        }
        environment {
            HTTPS_PROXY = 'http://1.1.1.1'
            HTTP_PROXY = 'http://1.1.1.1'
        }
        stages {
            stage('Test stage') {
                steps {
                    script {
                        sh "ls -lart ./*"
                        installPyLibs('pytest')
                    }
                }
            }
        }
    }
}
master pipeline:
...
stage("tests") {
    agent none
    options {
        skipDefaultCheckout()
    }
    when {
        beforeAgent true
        allOf {
            not { expression { currentBuild.result == 'ABORTED' } }
            not { expression { SharedTest == 'true' } }
        }
    }
    steps {
        script {
            stage("Seek && Destroy") {
                sharedTest()
            }
            stage("Deploy") {
                def deploy = build job: 'Deploy',
                    parameters: [
                        string(name: 'BUILD_NUMBER', value: "${env.NEW_BUILD_NR_VAR}")
                    ], wait: true, propagate: false
            }
...
In my experience, Jenkins doesn't allow using shared libraries locally. My workaround was to register the shared library this way:
library identifier: 'LIBRARYNAME#BRANCH',
retriever: modernSCM([$class: 'GitSCMSource',
credentialsId: 'CREDENTIALS_FROM_JENKINS',
id: 'GUID',
remote: 'CLONE_LINK_TO_GIT_REPO',
traits: [[$class: 'jenkins.plugins.git.traits.BranchDiscoveryTrait']]])
This also can be achieved using UI and registering the library. More on that here: https://www.jenkins.io/doc/book/pipeline/shared-libraries/
As for the code, I assume your pipeline is inside the vars folder in your repository (details about this here: folder structure). This way it will be accessible during the pipeline.
Let's assume I have a file vars/internalStepTestEcho.groovy. After loading the library, this can be accessed using:
internalStepTestEcho()
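For reference, a minimal sketch of what that vars/internalStepTestEcho.groovy might contain (the body is only an illustration):
// vars/internalStepTestEcho.groovy
def call(String message = 'hello from the shared library') {
    echo message
}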
Using the declarative pipeline syntax, I want to be able to define parameters based on an array of repos, so that when starting the build, the user can check/uncheck the repos that should not be included when the job runs.
final String[] repos = [
    'one',
    'two',
    'three',
]

pipeline {
    parameters {
        booleanParam(name: ...) // static param
        // now include a booleanParam for each item in the `repos` array
        // like this but it's not allowed
        script {
            repos.each {
                booleanParam(name: it, defaultValue: true, description: "Include the ${it} repo in the release?")
            }
        }
    }
    // later on, I'll loop through each repo and do stuff only if its value in `params` is `true`
}
Of course, you can't have a script within the parameters block, so this won't work. How can I achieve this?
Using the Active Choices Parameter plugin is probably the best choice, but if for some reason you can't (or don't want to) use a plugin, you can still achieve dynamic parameters in a Declarative Pipeline.
Here is a sample Jenkinsfile:
def list_wrap() {
    sh(script: 'echo choice1 choice2 choice3 choice4 | sed -e "s/ /\\n/g"', returnStdout: true)
}
pipeline {
    agent any
    stages {
        stage('Gather Parameters') {
            steps {
                timeout(time: 30, unit: 'SECONDS') {
                    script {
                        properties([
                            parameters([
                                choice(
                                    description: 'List of arguments',
                                    name: 'service_name',
                                    choices: 'DEFAULT\n' + list_wrap()
                                ),
                                booleanParam(
                                    defaultValue: false,
                                    description: 'Whether we should apply changes',
                                    name: 'apply'
                                )
                            ])
                        ])
                    }
                }
            }
        }
        stage('Run command') {
            when { expression { params.apply == true } }
            steps {
                sh """
                    echo choice: ${params.service_name} ;
                """
            }
        }
    }
}
This embeds a script {} in a stage, which calls a function, which runs a shell script on the agent/node of the Declarative Pipeline, and uses the script's output to set the choices for the parameters. The parameters are then available in the next stages.
The gotcha is that you must first run the job with no build parameters in order for Jenkins to populate the parameters, so they're always going to be one run out of date. That's why the Active Choices Parameter plugin is probably a better idea.
You could also combine this with an input command to cause the pipeline to prompt the user for a parameter:
script {
def INPUT_PARAMS = input message: 'Please Provide Parameters', ok: 'Next',
parameters: [
choice(name: 'ENVIRONMENT', choices: ['dev','qa'].join('\n'), description: 'Please select the Environment'),
choice(name: 'IMAGE_TAG', choices: getDockerImages(), description: 'Available Docker Images')]
env.ENVIRONMENT = INPUT_PARAMS.ENVIRONMENT
env.IMAGE_TAG = INPUT_PARAMS.IMAGE_TAG
}
Credit goes to Alex Lashford (https://medium.com/disney-streaming/jenkins-pipeline-with-dynamic-user-input-9f340fb8d9e2) for this method.
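Note that getDockerImages() in the snippet above is a helper you would have to provide yourself; a purely hypothetical sketch might be:
// Hypothetical helper: replace the echo with whatever actually lists your image tags
// (registry CLI, curl against the registry API, etc.)
def getDockerImages() {
    def out = sh(script: 'echo 1.0.0 1.0.1 latest', returnStdout: true).trim()
    return out.tokenize().join('\n')   // choice() accepts a newline-separated string
}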
You can use Jenkins' CHOICE parameter, with which the user can select a repository.
pipeline {
    agent any
    parameters {
        choice(name: "REPOS", choices: ['REPO1', 'REPO2', 'REPO3'])
    }
    stages {
        stage('stage 1') {
            steps {
                // the repository selected by the user will be printed
                println("$params.REPOS")
            }
        }
    }
}
You can also use the Active Choices Parameter plugin if you want to do multiple select: https://plugins.jenkins.io/uno-choice/#documentation
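A rough sketch of a multi-select version with Active Choices (the PT_CHECKBOX choice type and parameter name here are just one possible configuration, not taken from the question):
properties([
    parameters([[
        $class: 'ChoiceParameter',
        choiceType: 'PT_CHECKBOX',   // checkbox list lets the user tick several repos
        name: 'REPOS',
        description: 'Repositories to include in the release',
        script: [
            $class: 'GroovyScript',
            fallbackScript: [classpath: [], sandbox: false, script: 'return ["one"]'],
            script: [classpath: [], sandbox: false, script: 'return ["one", "two", "three"]']
        ]
    ]])
])
The selected values typically come back in params.REPOS as a comma-separated string.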
You can also visit the Pipeline Syntax page and configure it there to generate a code snippet, then copy the snippet and paste it at the start of your Jenkinsfile.
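For illustration, a generated properties snippet pasted at the top of a Jenkinsfile might look roughly like this (the repo names just mirror the example in the question):
properties([
    parameters([
        booleanParam(name: 'one', defaultValue: true, description: 'Include the one repo in the release?'),
        booleanParam(name: 'two', defaultValue: true, description: 'Include the two repo in the release?'),
        booleanParam(name: 'three', defaultValue: true, description: 'Include the three repo in the release?')
    ])
])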
I have the following pipeline. I need this pipeline to run on 4 different nodes at the same time. I have read that using a matrix section within the declarative pipeline is key to making this work. How can I go about doing that with the pipeline below?
pipeline {
    stages {
        stage('Test') {
            steps {
                script {
                    def test_proj_choices = ['AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA']
                    for (choice in test_proj_choices) {
                        stage("${choice}") {
                            echo "Running ${choice}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: choice), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
One helpful article can be found here: https://www.jenkins.io/blog/2019/11/22/welcome-to-the-matrix/
The official documentation is here: https://www.jenkins.io/doc/book/pipeline/syntax/#declarative-matrix
Accordingly, the syntax should be:
pipeline {
    agent none
    stages {
        stage('Tests') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'CHOICE'
                        values 'AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA'
                    }
                }
                stages {
                    stage("Test") {
                        steps {
                            echo "Running ${CHOICE}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: CHOICE), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
Note that your inner stage cannot be named dynamically, you'd get a syntax error trying to expand "${CHOICE}".
pipeline {
    agent any
    stages {
        stage("foo") {
            steps {
                script {
                    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
                        parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
                                            description: 'What is the release scope?')]
                }
                echo "${env.RELEASE_SCOPE}"
            }
        }
    }
}
In the above code, the choices are hardcoded (patch\nminor\nmajor). My requirement is to provide the choice values for the dropdown dynamically.
I get the values by calling an API - a list of artifact (.zip) file names from Artifactory.
In the above example, it requests input when we run the build, but I want to do a "Build with parameters".
Please suggest/help on this.
Depending on how you get the data from the API there will be different options; for example, let's imagine that you get the data as a List of Strings (let's call it releaseScope). In that case your code would be the following:
...
script {
    def releaseScopeChoices = ''
    releaseScope.each {
        releaseScopeChoices += it + '\n'
    }
    parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices, description: 'What is the release scope?')]
}
...
hope it will help.
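If the goal is for the parameter to actually be registered on the job (so that "Build with parameters" offers it on the next run), the choice is usually wrapped in a properties()/parameters() call; a sketch, still assuming releaseScope is a List of Strings:
script {
    properties([
        parameters([
            choice(name: 'RELEASE_SCOPE',
                   choices: releaseScope.join('\n'),
                   description: 'What is the release scope?')
        ])
    ])
}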
This is a cutdown version of what we use. We separate stuff into shared libraries but I have consolidated a bit to make it easier.
Jenkinsfile looks something like this:
#!groovy
@Library('shared') _

def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')

properties([
    buildDiscarder(logRotator(numToKeepStr: '20')),
    parameters([
        choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
    ])
])
The shared library that looks at Artifactory is pretty simple.
Essentially it makes a GET request (providing auth credentials), then filters/splits the result to whittle it down to the desired values and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher

List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
    // URL components
    String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
    String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
    Object responseJson = getRequest(url)
    String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
    Pattern regex = ~regexPattern
    List<String> outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
    List<String> artifactlist = []
    for (i in outlist) {
        artifactlist.add(i['uri'].tokenize('/')[-1])
    }
    return artifactlist.reverse()
}

// Artifactory GET request - consume in other methods
@NonCPS
Object getRequest(url_string) {
    URL url = url_string.toURL()
    // Open connection
    URLConnection connection = url.openConnection()
    connection.setRequestProperty("Authorization", basicAuthString())
    // Open input stream
    InputStream inputStream = connection.getInputStream()
    def json_data = new groovy.json.JsonSlurper().parseText(inputStream.text)
    // Close the stream
    inputStream.close()
    return json_data
}

// Build the Basic Auth header - consume in other methods
@NonCPS
Object basicAuthString() {
    // Retrieve password
    String username = "artifactoryMachineUsername"
    String credid = "artifactoryApiKey"
    def apiKey = null
    def credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
        'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
    )
    credentials_store[0].credentials.each { it ->
        if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
            if (it.getId() == credid) {
                apiKey = it.getSecret()
            }
        }
    }
    // Create authorization header format using Base64 encoding
    String userpass = username + ":" + apiKey
    String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes())
    return basicAuth
}
I could achieve it without any plugin:
With Jenkins 2.249.2, using a declarative pipeline, the following pattern prompts the user with a dynamic dropdown menu
(for them to choose a branch).
(The surrounding withCredentials block is optional, required only if your script and Jenkins configuration use credentials.)
node {
    withCredentials([[$class: 'UsernamePasswordMultiBinding',
                      credentialsId: 'user-credential-in-gitlab',
                      usernameVariable: 'GIT_USERNAME',
                      passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
        BRANCH_NAMES = sh(script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\' ', returnStdout: true).trim()
    }
}
pipeline {
    agent any
    parameters {
        choice(
            name: 'BranchName',
            choices: "${BRANCH_NAMES}",
            description: 'to refresh the list, go to configure, disable "this build has parameters", launch build (without parameters) to reload the list and stop it, then launch it again (with parameters)'
        )
    }
    stages {
        stage("Run Tests") {
            steps {
                sh "echo SUCCESS on ${BranchName}"
            }
        }
    }
}
The drawback is that one has to refresh the Jenkins configuration and use a blank run for the list to be refreshed by the script...
Solution (not from me): this limitation can be made less annoying by using an additional parameter specifically to refresh the values:
parameters {
    booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
then within the stage:
stage('a stage') {
    when {
        expression {
            return !params.REFRESH_BRANCHES.toBoolean()
        }
    }
    ...
}
This is my solution.
def envList
def dockerId

node {
    envList = "defaultValue\n" + sh(script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}

pipeline {
    agent any
    parameters {
        choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
        booleanParam(name: 'SECURITY_SCAN', defaultValue: false, description: 'container vulnerability scan')
    }
    // ... stages follow
}
The example Jenkinsfile below contains an AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. The Active Choices plugin is required.
Note! You need to approve the script specified in parameters after the first run in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it automatically (might require administrator permissions).
properties([
    parameters([[
        $class: 'ChoiceParameter',
        choiceType: 'PT_SINGLE_SELECT',
        name: 'image',
        description: 'Docker image',
        filterLength: 1,
        filterable: false,
        script: [
            $class: 'GroovyScript',
            fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
            script: [
                classpath: [],
                sandbox: false,
                script: '''\
                    def repository = "frontend"
                    def aws_ecr_cmd = "aws ecr list-images" +
                                      " --repository-name ${repository}" +
                                      " --filter tagStatus=TAGGED" +
                                      " --query imageIds[*].[imageTag]" +
                                      " --region us-east-1 --output text"
                    def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
                    def images = aws_ecr_out.text.tokenize().reverse()
                    return images
                '''.stripIndent()
            ]
        ]
    ]])
])

pipeline {
    agent any
    stages {
        stage('First stage') {
            steps {
                sh 'echo "${image}"'
            }
        }
    }
}
choiceArray = ["patch", "minor", "major"]

properties([
    parameters([
        choice(choices: choiceArray.join('\n'),
               description: '',
               name: 'SOME_CHOICE')
    ])
])
Lately, I'm working on a project where I'm swapping deprecated build flows for pipelines in several Job DSL-generated Jenkins instances.
One particular problem was to convert the build flow guard-rescue mechanism to pipeline syntax. I'm curious what you think of my solution:
See the following flow DSL:
guard {
    b = build("parameterised_job")
} rescue {
    build("analyzer_job",
          PARAMETER_ONE: b.environment.get("PARAMETER_ONE"),
          PARAMETER_TWO: b.environment.get("PARAMETER_TWO")
    )
}
I created the following alternative with the pipeline syntax:
pipeline {
    agent any
    stages {
        stage("build") {
            steps {
                script {
                    def b = build(job: "parameterised_job", propagate: false)
                    build(job: "analyzer_job",
                          parameters: [
                              [$class: 'StringParameterValue', name: 'PARAMETER_ONE', value: b.buildVariables.PARAMETER_ONE],
                              [$class: 'StringParameterValue', name: 'PARAMETER_TWO', value: b.buildVariables.PARAMETER_TWO]
                          ])
                    if (b.result == 'FAILURE') {
                        error("${b.projectName} FAILED")
                    }
                }
            }
        }
    }
}