I'm trying to pass the Jenkins global variable JOB_BASE_NAME (from the Global Variable Reference) into the Groovy script of an extendedChoice parameter. The script lists container images from a specific ECR repository. In my case, Jenkins job names and ECR repository names are equivalent.
Ex:
Jenkins Job Name = http://jenkins.localhost/job/application-abc
ECR Repo name = abc/application-abc
I tried several things, but every time the container image listing came back empty.
Please help me figure out whether this is possible out of the box, or how I can implement it.
Thanks.
Here is my code:
pipeline {
agent {
label 'centos7-slave'
}
stages {
stage('Re Tag RELEASE TAG AS UAT') {
environment {
BRANCH = "${params.GITHUB_BRANCH_TAG}"
}
input {
message 'Select tag'
ok 'Release!'
parameters {
extendedChoice(
bindings: '',
groovyClasspath: '',
multiSelectDelimiter: ',',
name: 'DOCKER_RELEASE_TAG',
quoteValue: false,
saveJSONParameterToFile: false,
type: 'PT_SINGLE_SELECT',
visibleItemCount: 5,
groovyScript: '''
import groovy.json.JsonSlurper
def AWS_ECR = ("/usr/local/bin/aws ecr list-images --repository-name abc/${JOB_BASE_NAME} --filter tagStatus=TAGGED --region ap-southeast-1").execute()
def DATA = new JsonSlurper().parseText(AWS_ECR.text)
def ECR_IMAGES = []
DATA.imageIds.each {
if(("$it.imageTag".length()>3))
{
ECR_IMAGES.push("$it.imageTag")
}
}
return ECR_IMAGES.grep( ~/.*beta.*/ ).sort().reverse()
'''
)
}
}
steps {
script {
def DOCKER_TAG = sh(returnStdout: true, script:"""
#!/bin/bash
set -e
set -x
DOCKER_TAG_NUM=`echo $DOCKER_RELEASE_TAG | cut -d "-" -f1`
echo \$DOCKER_TAG_NUM
""")
DOCKER_TAG = DOCKER_TAG.trim()
DOCKER_TAG_NUM = DOCKER_TAG
}
sh "echo ${AWS_ECR} | docker login --username AWS --password-stdin ${ECR}"
sh "docker pull ${ECR}/${REPOSITORY}:${DOCKER_RELEASE_TAG}"
sh " docker tag ${ECR}/${REPOSITORY}:${DOCKER_RELEASE_TAG} ${ECR}/${REPOSITORY}:${DOCKER_TAG_NUM}-rc"
sh "docker push ${ECR}/${REPOSITORY}:${DOCKER_TAG_NUM}-rc"
}
}
}
}
You can leverage Groovy string interpolation to substitute the job base name into the parameter's script, but the script itself can't access any variable outside its own scope.
You can try the following:
Use a function to compose the Groovy script for the parameter
The function accepts the JOB_BASE_NAME value
Use Groovy string interpolation to substitute the real value
pipeline {
agent {
label 'centos7-slave'
}
stages {
stage('Re Tag RELEASE TAG AS UAT') {
environment {
BRANCH = "${params.GITHUB_BRANCH_TAG}"
}
input {
message 'Select tag'
ok 'Release!'
parameters {
extendedChoice(
bindings: '',
groovyClasspath: '',
multiSelectDelimiter: ',',
name: 'DOCKER_RELEASE_TAG',
quoteValue: false,
saveJSONParameterToFile: false,
type: 'PT_SINGLE_SELECT',
visibleItemCount: 5,
groovyScript: list_ecr_images("${env.JOB_BASE_NAME}")
)
}
}
steps {
script {
def DOCKER_TAG = sh(returnStdout: true, script:"""
#!/bin/bash
set -e
set -x
DOCKER_TAG_NUM=`echo $DOCKER_RELEASE_TAG | cut -d "-" -f1`
echo \$DOCKER_TAG_NUM
""")
DOCKER_TAG = DOCKER_TAG.trim()
DOCKER_TAG_NUM = DOCKER_TAG
}
sh "echo ${AWS_ECR} | docker login --username AWS --password-stdin ${ECR}"
sh "docker pull ${ECR}/${REPOSITORY}:${DOCKER_RELEASE_TAG}"
sh " docker tag ${ECR}/${REPOSITORY}:${DOCKER_RELEASE_TAG} ${ECR}/${REPOSITORY}:${DOCKER_TAG_NUM}-rc"
sh "docker push ${ECR}/${REPOSITORY}:${DOCKER_TAG_NUM}-rc"
}
}
}
}
def list_ecr_images(jobBaseName) {
def _script = """
import groovy.json.JsonSlurper
// NOTE: each CLI token must be its own list element, otherwise the
// aws binary receives e.g. "ecr list-images" as a single argument.
def AWS_ECR = [
    '/usr/local/bin/aws', 'ecr', 'list-images',
    '--repository-name', "abc/${jobBaseName}",
    '--filter', 'tagStatus=TAGGED',
    '--region', 'ap-southeast-1'
].execute().text
def DATA = new JsonSlurper().parseText(AWS_ECR)
def ECR_IMAGES = []
DATA.imageIds.each {
if((it.imageTag.length()>3))
{
ECR_IMAGES.push(it.imageTag)
}
}
return ECR_IMAGES.grep( ~/.*beta.*/ ).sort().reverse()
"""
return _script.stripIndent()
}
I have a shared library that accepts parameters, which I set up to compress files into a tar archive. The Jenkins pipeline looks like this:
stage("Package"){
steps{
compress_files("arg1", "arg2")
}
}
The shared library compress_files looks like this:
#!/usr/bin/env groovy
// Process any number of arguments.
def call(String... args) {
sh label: 'Create Directory to store tar files.', returnStdout: true,
script: """ mkdir -p "$WORKSPACE/${env.PROJECT_NAME}" """
args.each {
sh label: 'Creating project directory.', returnStdout: true,
script: """ mkdir -p "$WORKSPACE/${env.PROJECT_NAME}" """
sh label: 'Coping contents to project directory.', returnStdout: true,
script: """ cp -rv ${it} "$WORKSPACE/${env.PROJECT_NAME}/." """
}
sh label: 'Compressing project directory to a tar file.', returnStdout: true,
script: """ tar -czf "${env.PROJECT_NAME}.tar.gz" "${env.PROJECT_NAME}" """
sh label: 'Remove the Project directory..', returnStdout: true,
script: """ rm -rf "$WORKSPACE/${env.PROJECT_NAME}" """
}
A new requirement is to use an array instead of updating the argument values. Can we pass an array name in the Jenkinsfile stage, and if so, how?
Yes, it's possible. In the Jenkinsfile you can define the array inside or outside stage() and use it like this:
In a declarative pipeline:
def files = ["arg1", "arg2"] as String[]
pipeline {
agent any
stages {
stage("Package") {
steps {
// script is optional
script {
// you can manipulate the variable value of files here
}
compress_files(files)
}
}
}
}
In a scripted pipeline:
node() {
//You can define the value here as well
// def files = ["arg1", "arg2"] as String[]
stage("Package"){
def files = ["arg1", "arg2"] as String[]
compress_files(files)
}
}
And in the shared library, the method will look like:
// vars/compress_files.groovy
def call(String[] args) {
args.each {
// retrieve the value from ${it} and proceed with your logic
}
}
or
def call(String... args) {
args.each {
// retrieve the value from ${it} and proceed with your logic
}
}
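For completeness, here is a sketch of the original compress_files adapted to the array-based signature, reusing the shell steps from the question:

// vars/compress_files.groovy -- a sketch combining the question's shell
// steps with the array-based call signature.
def call(String[] args) {
    sh label: 'Create directory to store tar files.',
       script: """ mkdir -p "$WORKSPACE/${env.PROJECT_NAME}" """
    args.each {
        sh label: 'Copying contents to project directory.',
           script: """ cp -rv ${it} "$WORKSPACE/${env.PROJECT_NAME}/." """
    }
    sh label: 'Compressing project directory to a tar file.',
       script: """ tar -czf "${env.PROJECT_NAME}.tar.gz" "${env.PROJECT_NAME}" """
    sh label: 'Removing the project directory.',
       script: """ rm -rf "$WORKSPACE/${env.PROJECT_NAME}" """
}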
Below is a Jenkins pipeline which runs a state against the list of minions stored in a .txt file on the Salt master server. The command below runs fine on the Salt master CLI:
salt --list `awk -vORS=, '{ print $1 }' /srv/salt/TMjenkins/minions.txt | sed 's/,$/\n/'` test.ping
However, when I run it through the Jenkins pipeline, I get "illegal string body character after dollar sign". The Salt master is on a remote server, so I can't execute the command natively.
So far I have tried wrapping the command in """ """ and ''' ''', and also { print \"${1}\" }. Nothing has worked. Any suggestion is appreciated.
pipeline = {
ansiColor('xterm') {
def remote = [:]
remote.name = 'saltmaster'
remote.host = 'xx.xxx.xx.x'
remote.allowAnyHosts = true
withCredentials([usernamePassword(credentialsId: 'saltmaster', passwordVariable: 'password', usernameVariable: 'ops')]) {
remote.user = 'xxx'
remote.password = password
stage('Filetransfer') {
sshCommand remote: remote, command: " salt -L `awk -vORS=, '{ print \"${1}\" }' /srv/salt/TMjenkins/minions.txt | sed 's/,$/\n/'` test.ping "
}
}
sh '/home/jenkins/jenkins-data/slack_notification.sh " ${minionid}" "Deployment finished successfully" "good" ":jenkins:"'
}
}
postFailure = {
sh '/home/jenkins/jenkins-data/slack_notification.sh " ${minionid}" "Unfortunately deployment was unsuccessful this time" "danger" ":jenkinserror:"'
}
postAlways = {
echo 'Cleaning Workspace now'
env.WORKSPACE = pwd()
sh "rm ${env.WORKSPACE}/* -fr"
}
node{
properties([
parameters([
string(name: 'Region', defaultValue: '', description: 'Region for which the process should run. ')
])
])
try {
pipeline()
} catch (e) {
postFailure()
throw e
} finally {
postAlways()
}
}
You need to escape the $ sign if you want it to pass through to the shell. So:
awk -vORS=, '{ print $1 }' /srv/salt/TMjenkins/minions.txt
becomes
sh "awk -vORS=, '{ print \$1 }' /srv/salt/TMjenkins/minions.txt "
and
sed 's/,$/\n/'
becomes
sh "sed 's/,\$/\n/'"
Finally, instead of using bash scripts to send Slack notifications, you should use the Slack plugin for Jenkins, like this:
slackSend color: "danger", text: "Failed"
I have a fairly simple script, but the Jenkinsfile never substitutes the variable (since) and I am not sure why.
I have tried the $since and ${since} syntax and each time the substitution is empty. The parameters work just fine.
since = ''
pipeline {
agent any
options {
buildDiscarder(logRotator(numToKeepStr: '5', artifactNumToKeepStr: '5'))
}
parameters {
string(defaultValue: '', description: 'The service name you wish to display status', name: 'serviceName')
choice(choices: ['swarm-group-test', 'swarm-group-prod'], description: 'Swarm environment to remove from', name: 'swarmEnv')
string(defaultValue: '5', description: 'Number of minutes to look back in logs', name: 'numMinutes')
}
tools {
nodejs "node-js-11.12.0"
}
stages {
stage('Invoke Playbook') {
steps {
script {
sh (script: "node -p -e \"var a = new Date(); a.setMinutes(a.getMinutes() - ${numMinutes}); a.toISOString();\"", returnStdout: true).trim()
}
echo "since: ${since}"
ansiColor('xterm') {
sh '''
export ANSIBLE_FORCE_COLOR=true
export ANSIBLE_STDOUT_CALLBACK=debug
ansible-playbook -b -v -u jenkins playbooks/display-service-status.yml -k --extra-vars="serviceName=$serviceName swarmEnv=$swarmEnv since=$since" -i playbooks/swarm-hosts
'''
}
}
}
}
}
Output:
since: 2019-09-30T12:43:12.134Z
ansible-playbook -b -v -u jenkins playbooks/display-service-status.yml -k '--extra-vars=serviceName=TEST_openam swarmEnv=swarm-group-test since=' -i playbooks/swarm-hosts
I believe that you cannot substitute Jenkins variables in single-quoted strings: Groovy performs no interpolation there. Also, since is a Groovy variable rather than an environment variable, so the shell cannot resolve $since on its own either.
Here's a helpful document: https://gist.github.com/Faheetah/e11bd0315c34ed32e681616e41279ef4
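As a minimal illustration of the difference (assuming the pipeline defines a Groovy variable since; the single-quoted form only works when since is also exported as an environment variable):

// Double quotes: Groovy substitutes ${since} before the shell runs.
sh "echo since is ${since}"
// Single quotes: Groovy passes the text through untouched; $since is
// left for the shell, which resolves it only from the environment.
sh 'echo since is $since'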
A few things to remember:
Environment variables should be declared within an environment block. They can have global or stage scope.
Variables can also be declared with the def keyword in some specific locations/scopes.
We can reference variables using Groovy-style interpolation, that is, $ and curly braces inside double-quoted strings.
Example:
environment {
since = 'hello'
varThatUsesAnotherVar = "${since} world"
}
stages {
stage('Invoke Playbook') {
steps {
echo "since: ${since}"
echo "varThatUsesAnotherVar : ${varThatUsesAnotherVar}"
}
}
}
I am using the extendedChoice plugin for my Jenkins pipeline. It fetches the S3 objects from a bucket and provides the list of values using a short Groovy script. The issue is that I need to parameterize the S3 bucket using the corresponding variable defined within the pipeline's environment section. How can I do this?
I tried a lot of different snippets to get the env vars, with no result.
import jenkins.model.*
// This will print out the requested var from the global Jenkins config.
def envVars = Jenkins.instance.getGlobalNodeProperties()[0].getEnvVars()
return envVars['S3_BUCKET']
// This will print out values from the env vars of the node itself where the Jenkins is running.
def env = System.getenv('S3_BUCKET')
return env
// This is what I have now
def domainsList = "aws s3api list-objects-v2 --bucket someRandomBucket --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
domainsList.waitFor()
def output = domainsList.in.text
return output.split('COMMONPREFIXES')
// This is the Jenkinsfile
pipeline {
agent any
environment {
DOMAIN_NAME = "${params.DOMAIN_NAME}"
MODEL_VERSION = "${params.MODEL_VERSION}"
S3_BUCKET = "someRandomBucket"
}
parameters {
extendedChoice(
bindings: '',
defaultValue: '',
description: '',
descriptionPropertyValue: '',
groovyClasspath: '',
groovyScript: '''
def domainsList = "aws s3api list-objects-v2 --bucket someRandomBucket --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
domainsList.waitFor()
def output = domainsList.in.text
return output.split('COMMONPREFIXES')
''',
multiSelectDelimiter: ',',
name: 'DOMAIN_NAME',
quoteValue: false,
saveJSONParameterToFile: false,
type: 'PT_SINGLE_SELECT',
visibleItemCount: 10)
choice(
choices: ['a', 'b'],
description: 'Select a model version for processing',
name: 'MODEL_VERSION')
}
stages {
stage('Clean workdir') {
steps {
cleanWs()
}
}
stage('build') {
steps {
sh "echo $S3_BUCKET"
sh "echo $DOMAIN_NAME"
sh "echo $MODEL_VERSION"
}
}
}
}
As I mentioned above, I need to substitute the hardcoded someRandomBucket with the S3_BUCKET env var value in the Groovy script within the extendedChoice parameter.
RESOLVED: environment variables can be injected specifically for the parameter via the Jenkins job UI.
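Alternatively, the script-composition trick from the first answer above applies here as well; a sketch, using the bucket name from the question:

// Sketch: build the parameter's Groovy script as a GString so the bucket
// name is baked in before the Extended Choice plugin evaluates the script.
def S3_BUCKET = 'someRandomBucket'
def s3DomainsScript = """
    def domainsList = "aws s3api list-objects-v2 --bucket ${S3_BUCKET} --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
    domainsList.waitFor()
    return domainsList.in.text.split('COMMONPREFIXES')
""".stripIndent()

and then pass groovyScript: s3DomainsScript in the extendedChoice block.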
pipeline {
agent any
stages {
stage("foo") {
steps {
script {
env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
description: 'What is the release scope?')]
}
echo "${env.RELEASE_SCOPE}"
}
}
}
}
In the above code, the choices are hardcoded (patch\nminor\nmajor). My requirement is to provide the choice values in the dropdown dynamically.
I get the values from an API call that lists artifact (.zip) file names from Artifactory.
Also, the above example requests input while the build runs, but I want to use "Build with parameters".
Please suggest how to approach this.
Depending on how you get the data from the API there are different options. For example, let's imagine you get the data as a List of Strings (call it releaseScope); in that case your code would be the following:
...
script {
def releaseScopeChoices = ''
releaseScope.each {
releaseScopeChoices += it + '\n'
}
parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices, description: 'What is the release scope?')]
}
...
Hope it helps.
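For the "Build with parameters" part of the question, the same dynamically built string can be fed into properties() instead of an input step. A sketch, assuming releaseScope already holds the list of names fetched from the Artifactory API:

// Sketch: expose the dynamic list as a build parameter rather than a
// runtime input prompt; releaseScope is assumed to be a List of Strings.
def releaseScopeChoices = releaseScope.join('\n')
properties([
    parameters([
        choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices,
               description: 'What is the release scope?')
    ])
])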
This is a cut-down version of what we use. We separate things into shared libraries, but I have consolidated a bit to make it easier to follow.
Jenkinsfile looks something like this:
#!groovy
@Library('shared') _
def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')
properties([
buildDiscarder(logRotator(numToKeepStr: '20')),
parameters([
choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
])
])
The shared library that looks at Artifactory is pretty simple.
Essentially it makes a GET request (providing auth credentials), then filters/splits the result to whittle it down to the desired values, and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher
List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
// URL components
String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
Object responseJson = getRequest(url)
String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
Pattern regex = ~ regexPattern
List<String> outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
List<String> artifactlist=[]
for (i in outlist) {
artifactlist.add(i['uri'].tokenize('/')[-1])
}
return artifactlist.reverse()
}
// Artifactory GET request - consumed by other methods
@NonCPS
Object getRequest(url_string) {
    URL url = url_string.toURL()
    // Open connection
    URLConnection connection = url.openConnection()
    connection.setRequestProperty("Authorization", basicAuthString())
    // Open input stream
    InputStream inputStream = connection.getInputStream()
    def json_data = new JsonSlurper().parseText(inputStream.text)
    // Close the stream
    inputStream.close()
    return json_data
}
// Build the Basic auth header - consumed by getRequest
@NonCPS
Object basicAuthString() {
    // Retrieve the API key from the Jenkins credentials store
    String username = "artifactoryMachineUsername"
    String credid = "artifactoryApiKey"
    def apiKey
    def credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
        'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
    )
    credentials_store[0].credentials.each { it ->
        if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
            if (it.getId() == credid) {
                apiKey = it.getSecret()
            }
        }
    }
    // Create the authorization header using Base64 encoding
    String userpass = username + ":" + apiKey
    String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes())
    return basicAuth
}
I could achieve it without any plugin:
With Jenkins 2.249.2, using a declarative pipeline,
the following pattern prompts the user with a dynamic dropdown menu
(to choose a branch):
(the surrounding withCredentials block is optional; it is required only if your script and Jenkins configuration use credentials)
node {
withCredentials([[$class: 'UsernamePasswordMultiBinding',
credentialsId: 'user-credential-in-gitlab',
usernameVariable: 'GIT_USERNAME',
passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
BRANCH_NAMES = sh (script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\' ', returnStdout:true).trim()
}
}
pipeline {
agent any
parameters {
choice(
name: 'BranchName',
choices: "${BRANCH_NAMES}",
description: 'to refresh the list, go to configure, disable "this build has parameters", launch build (without parameters)to reload the list and stop it, then launch it again (with parameters)'
)
}
stages {
stage("Run Tests") {
steps {
sh "echo SUCCESS on ${BranchName}"
}
}
}
}
The drawback is that one has to refresh the Jenkins configuration and run a blank build for the list to be refreshed by the script.
Solution (not from me): this limitation can be made less annoying by using an additional parameter whose only purpose is to refresh the values:
parameters {
booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
then within the stage:
stage('a stage') {
when {
expression {
return ! params.REFRESH_BRANCHES.toBoolean()
}
}
...
}
This is my solution:
def envList
def dockerId
node {
envList = "defaultValue\n" + sh (script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}
pipeline {
agent any
parameters {
choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
booleanParam(name: 'SECURITY_SCAN',defaultValue: false, description: 'container vulnerability scan')
    }
    stages {
        // minimal stage so the snippet is a complete pipeline
        stage('Deploy') {
            steps {
                echo "Deploying to ${params.DEPLOYMENT_ENVIRONMENT}"
            }
        }
    }
}
The example Jenkinsfile below uses an AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. The Active Choices plug-in is required.
Note: you need to approve the script specified in the parameters after the first run, in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it
automatically (might require administrator permissions).
properties([
parameters([[
$class: 'ChoiceParameter',
choiceType: 'PT_SINGLE_SELECT',
name: 'image',
description: 'Docker image',
filterLength: 1,
filterable: false,
script: [
$class: 'GroovyScript',
fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
script: [
classpath: [],
sandbox: false,
script: '''\
def repository = "frontend"
def aws_ecr_cmd = "aws ecr list-images" +
" --repository-name ${repository}" +
" --filter tagStatus=TAGGED" +
" --query imageIds[*].[imageTag]" +
" --region us-east-1 --output text"
def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
def images = aws_ecr_out.text.tokenize().reverse()
return images
'''.stripIndent()
]
]
]])
])
pipeline {
agent any
stages {
stage('First stage') {
steps {
sh 'echo "${image}"'
}
}
}
}
choiceArray = [ "patch" , "minor" , "major" ]
properties([
parameters([
choice(choices: choiceArray.join('\n'), // newline-separated: one choice per line
description: '',
name: 'SOME_CHOICE')
])
])