Jenkins Pipeline environment issue

Hi, I have the following Jenkins pipeline:
pipeline {
    agent any
    environment {
        //JSON_NAME = sh(returnStdout: true, script: "sed -n '2 p' package.json | awk '{print \$2}' | sed 's/\\,//g'").trim()
        JSON_NAME = sh(returnStdout: true, script: "sed -n '2 p' package.json | awk '{print \$2}' | sed 's/\\,//g' | awk -F "/" '{print \$2}'").trim()
    }
    stages {
        stage ('Update Italy.json') {
            when { expression { fileExists('italy.json') } }
            steps {
                sh "echo ${JSON_NAME}"
            }
        }
    }
}
As you can see, I have two environment variables in the environment block.
The first one is commented out, and it works.
But when I try to use the second one, it gives me an error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: java.lang.String.div() is applicable for argument types: (java.lang.String) values: [ '{print $2}']
I couldn't figure out what's wrong with this env variable. Any ideas? Thanks in advance.
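A likely culprit: in the second JSON_NAME line, `-F "/"` sits inside a Groovy double-quoted string, so the inner `"` terminates the string early and Groovy is left evaluating one string divided by another, which it dispatches to `String.div()` and fails with exactly the MissingMethodException above. Single-quoting the awk separator avoids the clash. A minimal sketch of the fixed command, runnable in a plain shell; the sample package.json and the trailing `tr -d '"'` (which strips the JSON quotes) are my additions for illustration:

```shell
# hypothetical package.json whose line 2 holds a scoped name like "@myorg/myapp"
printf '{\n  "name": "@myorg/myapp",\n  "version": "1.0.0"\n}\n' > package.json

# same pipeline as in the question, but with -F '/' in single quotes;
# tr -d '"' (my addition) strips the surrounding JSON quotes
sed -n '2 p' package.json | awk '{print $2}' | sed 's/,//g' | awk -F '/' '{print $2}' | tr -d '"'
```

Inside the Groovy double-quoted `script:` string, the same command works as written, since single quotes need no escaping there.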

Related

Jenkins Pipeline: I cannot add a variable from a concatenated command in bash script

I have created several bash scripts that work perfectly in the Linux shell, but when I try to incorporate them into a Jenkins pipeline I get multiple errors. I attach an example of my pipeline where I just want to show the value of my variables. The pipeline works fine except when I add the environment block in line 5; you can see that there are special characters that Groovy does not interpret the way Bash does.
pipeline {
    agent {
        label params.LABS == "any" ? "" : params.LABS
    }
    environment {
        PORT_INSTANCE="${docker ps --format 'table {{ .Names }} \{{ .Ports }}' --filter expose=7000-8999/tcp | (read -r; printf "%s\n"; sort -k 3) | grep web | tail -1 | sed 's/.*0.0.0.0.0://g'|sed 's/->.*//g'}"
    }
    stages {
        stage('Setup parameters') {
            steps {
                script {
                    properties([
                        parameters([
                            choice(
                                choices: ['LAB-2', 'LAB-3'],
                                name: 'LABS'
                            ),
                            string(
                                defaultValue: 'cliente-1',
                                name: 'INSTANCE_NAME',
                                trim: true
                            ),
                            string(
                                defaultValue: '8888',
                                name: 'PORT_NUMBER',
                                trim: true
                            ),
                            string(
                                defaultValue: 'lab.domain.com',
                                name: 'DOMAIN_NAME',
                                trim: true
                            )
                        ])
                    ])
                }
                sh """
                echo '${params.INSTANCE_NAME}'
                echo '${params.PORT_NUMBER}'
                echo '${params.DOMAIN_NAME}'
                echo '${PORT_INSTANCE}'
                """
            }
        }
    }
}
I already tried the same thing from the sh section (""" command """) and it throws the same errors.
Can someone help me understand how to run advanced commands that work in the Linux shell (bash) from Jenkins? In other words, is there a way to migrate scripts from bash to Jenkins?
Thank you very much for your help ;)
I want to be able to create a variable from a bash script command from the Pipeline in Jenkins
I believe you can't execute a bash script in the environment block, based on the documentation.
You can create a variable from a bash script using the sh step with returnStdout set to true. Declarative Pipeline doesn't allow you to assign the return value to a variable directly, so you will need to call sh inside a script block like this:
stage('Calculate port') {
    steps {
        script {
            // When you don't use `def` in front of a variable, you implicitly create a global variable.
            // This means that the variable will keep its value and can be used in any following line in your script.
            PORT_INSTANCE = sh returnStdout: true, script: "docker ps --format 'table {{ .Names }} {{ .Ports }}' --filter expose=7000-8999/tcp | (read -r; printf \"%s\\n\"; sort -k 3) | grep web | tail -1 | sed 's/.*0.0.0.0.0://g' | sed 's/->.*//g'"
            // Shell output will contain a newline character at the end; remove it
            PORT_INSTANCE = PORT_INSTANCE.trim()
        }
    }
}
I would add a stage like this as the first stage in my pipeline.
Note that I didn't run the same shell command as you when testing this, so my command may have issues like un-escaped quotes.
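To see what the sed tail of that pipeline does, here is the same extraction run on a single hypothetical `docker ps` row (the sample line is mine, and I use the plain `0.0.0.0:` prefix rather than the answer's `0.0.0.0.0:` pattern, which appears to contain an extra `.0`):

```shell
# hypothetical "NAMES   PORTS" row as printed by docker ps
line='web-1   0.0.0.0:8443->7000/tcp'

# strip everything up to and including the host IP, then drop the "->container" part
echo "$line" | sed 's/.*0.0.0.0://' | sed 's/->.*//'
```

This leaves only the published host port, which is what PORT_INSTANCE is meant to hold.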

Declarative Pipeline Jenkinsfile: Export variables out of sh call

How can I export some variables out of an sh block so they can be used in later stages?
The following does not give me any errors but the values are never available as environment variables in later stages.
steps {
    sh """
    ASSUME_ROLE_RESPONSE=\$(aws sts assume-role --role-arn "arn:aws:iam::${env.NON_PROD_ACCOUNT_ID}:role/${env.AWS_ROLE}" --role-session-name "${env.AWS_ROLE_SESSION}" --duration-seconds 3600)
    ${env.ACCESS_KEY_ID}=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.AccessKeyId')
    ${env.SECRET_ACCESS_KEY}=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.SecretAccessKey')
    ${env.SESSION_TOKEN}=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.SessionToken')
    echo "AWS_ACCESS_KEY_ID=${ACCESS_KEY_ID},AWS_SECRET_ACCESS_KEY=${SECRET_ACCESS_KEY},AWS_SESSION_TOKEN=${SESSION_TOKEN}"
    printenv | sort
    """
}
I have got this working, but I cannot say it is elegant; if someone has a better/cleaner answer I would happily accept it.
Here is my solution:
stage("Authenticate To Non-Prod Account") {
    steps {
        script {
            aws_credentials = sh(script: """
            ASSUME_ROLE_RESPONSE=\$(aws sts assume-role --role-arn "arn:aws:iam::${env.NON_PROD_ACCOUNT_ID}:role/${env.AWS_ROLE}" --role-session-name "${env.AWS_ROLE_SESSION}" --duration-seconds 3600)
            ACCESS_KEY_ID=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.AccessKeyId')
            SECRET_ACCESS_KEY=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.SecretAccessKey')
            SESSION_TOKEN=\$(echo \$ASSUME_ROLE_RESPONSE | jq --raw-output '.Credentials.SessionToken')
            echo "AWS_ACCESS_KEY_ID=\$ACCESS_KEY_ID,AWS_SECRET_ACCESS_KEY=\$SECRET_ACCESS_KEY,AWS_SESSION_TOKEN=\$SESSION_TOKEN"
            """, returnStdout: true)
            env.ACCESS_KEY_ID = aws_credentials.split(',')[0].split('=')[1].trim()
            env.AWS_SECRET_KEY = aws_credentials.split(',')[1].split('=')[1].trim()
            env.SESSION_TOKEN = aws_credentials.split(',')[2].split('=')[1].trim()
        }
    }
}
I am answering because I have read a heap of posts suggesting ideas that didn't work for me, so this, for now, is the best option I have for authenticating to AWS and ensuring the credentials are available in subsequent stages.
To export vars out of sh, try using this:
env.var = sh (returnStdout: true, script: ''' SOME SH COMMAND ''').trim()
This will export your bash values to Groovy variables; in fact, it can push the vars to the environment.
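The Groovy `split(',')[n].split('=')[1]` parsing used in the accepted solution can also be checked in plain shell. With a hypothetical credentials line in the same KEY=value,KEY=value format, `cut` does the equivalent of those splits:

```shell
# hypothetical output line in the format echoed by the assume-role script
aws_credentials='AWS_ACCESS_KEY_ID=AKIAEXAMPLE,AWS_SECRET_ACCESS_KEY=abc123,AWS_SESSION_TOKEN=tok456'

# field 1 by comma, then field 2 by '=' -- same as split(',')[0].split('=')[1]
echo "$aws_credentials" | cut -d',' -f1 | cut -d'=' -f2
```

The second and third credentials fall out the same way with `-f2` and `-f3` on the first `cut`.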

sed within pipeline doesn't work. It either throws "invalid reference \1 on `s' command's RHS" or it produces unexpected output

I am not sure if this is appropriate, or whether there is a way to continue posting a question in the existing threads, namely:
sed error: "invalid reference \1 on `s' command's RHS"
and
Invalid reference \1 using sed when trying to print matching expression
The issue I am facing is with a Jenkins (2.249.1) pipeline script:
pipeline {
    agent any
    stages {
        stage("exp") {
            steps {
                script {
                    def output=sh(returnStdout: true, script: "echo ab4d | sed 's/b\\(([0-9]\\))/B\\1/' ").trim()
                    echo "output=$output";
                }
            }
        }
    }
}
When I run the build, I get this output, where I was expecting only "B4":
+ echo ab4d
+ sed 's/b\(([0-9]\))/B\1/'
[Pipeline] echo
output=ab4d
The second version that I tried was :
def output=sh(returnStdout: true, script: "echo ab4d | sed 's/b\([0-9]\)/B\1/' ").trim()
+ echo ab4d
+ sed 's/b\([0-9]\)/B\1/'
[Pipeline] echo
output=aB4d
With the third version:
def output=sh(returnStdout: true, script: "echo ab4d | sed 's/b([0-9])/B\1/' ").trim()
+ echo ab4d
+ sed 's/b([0-9])/B\1/'
sed: -e expression #1, char 15: invalid reference \1 on `s' command's RHS
The output I want is just B4. Please let me know the right approach to fix this issue, and help me understand sed in a Pipeline.
PS: I am not sure what my Jenkins supports with respect to the -r option, etc., so I am purposely avoiding it until I know whether it is a must.
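For what it's worth, the second version (`\\(`…`\\)` in the Groovy string, which reaches sed as the BRE grouping `\(`…`\)`) is the correctly escaped one; it printed aB4d because the substitution replaced only the matched part `b4` and left the surrounding characters alone. To get just B4, anchor the pattern over the whole line so the unmatched `a` and `d` are consumed too. A sketch in plain sed (inside a Groovy double-quoted script string each backslash would be doubled):

```shell
# match the entire line; keep only the captured digit, prefixed with B
echo ab4d | sed 's/.*b\([0-9]\).*/B\1/'
```

This prints B4 on its own, which is the output the question asks for.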

Jenkinsfile how to grep file name in variable

I have a file on a different server, and its name will change.
File: testfile-1.2-12345.sh, where "12345" is going to change.
How do I get the changing text into a variable?
On the server machine this works and prints 12345:
ls ~/test/testfile* | awk -F'[-.s]' '{print $5}'
But when I do it from Jenkins it won't work:
def versio = sh "ssh user@${ip} ls ~/test/testfile* | awk -F'[-.s]' '{print \$5}'"
It prints "12345" but if I try to print ${versio} it shows null.
Your command is correct, but in a pipeline you need to specify returnStdout: true. Detailed documentation is here.
def versio = sh returnStdout: true, script: 'ssh user@${ip} ls ~/test/testfile* | awk -F\'[-.s]\' \'{print \\$5}\''
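The `-F'[-.s]'` split can be tested locally: with the bare filename from the question, `-`, `.`, and `s` all act as separators, and the changing build number lands in field 5. Note the field positions shift if the full path contains extra `-`, `.`, or `s` characters, so it may be safer to run the awk over the basename:

```shell
# the example filename from the question; field 5 is the changing build number
printf 'testfile-1.2-12345.sh\n' | awk -F'[-.s]' '{print $5}'
```

This prints 12345, matching the behavior the asker sees on the server.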

extract version from package.json with bash inside jenkins pipeline

Script in package.json:
"scripts": {
    "version": "echo $npm_package_version"
},
One of the stages in the Jenkins pipeline:
stage('Build'){
    sh 'npm install'
    def packageVersion = sh 'npm run version'
    echo $packageVersion
    sh 'VERSION=${packageVersion} npm run build'
}
I got the version from the npm script output, but the following line
echo $packageVersion
returns null.
Is the packageVersion value not correctly assigned?
Edited:
When using Pipeline Utility Steps with the following code
stage('Build'){
    def packageJSON = readJSON file: 'package.json'
    def packageJSONVersion = packageJSON.version
    echo packageJSONVersion
    sh 'VERSION=${packageJSONVersion}_${BUILD_NUMBER}_${BRANCH_NAME} npm run build'
}
I get
[Pipeline] echo
1.1.0
[Pipeline] sh
[...] Running shell script + VERSION=_16_SOME_BRANCH npm run build
So I am able to extract the version, but I still cannot pass it when running the script.
After your edit using readJSON, the string interpolation is now wrong. Variables within single quotes are not interpolated in Groovy, only within double quotes.
sh 'VERSION=${packageJSONVersion}_${BUILD_NUMBER}_${BRANCH_NAME} npm run build'
must be
sh "VERSION=${packageJSONVersion}_${BUILD_NUMBER}_${BRANCH_NAME} npm run build"
The sh step by default returns nothing, so packageVersion should be null.
To return the output of the executed command, use it like this:
sh(script: 'npm run version', returnStdout: true)
This variant of sh returns the output instead of printing it.
Actually, I am wondering why echo $packageVersion doesn't fail with an error, as this variable is not defined; it should be echo packageVersion.
For me, it was this:
sh(script: "grep \"version\" package.json | cut -d '\"' -f4 | tr -d '[[:space:]]'", returnStdout: true)
This worked for me:
Full version number:
PACKAGE_VERSION = sh returnStdout: true, script: '''grep 'version' package.json | cut -d '"' -f4 | tr '\n' '\0''''
echo "Current package version: $PACKAGE_VERSION"
$ > 1.2.3
Major version only:
PACKAGE_VERSION = sh returnStdout: true, script: '''grep 'version' package.json | cut -d '"' -f4 | cut -d '.' -f1 | tr '\n' '\0''''
echo "Current package Major version: $PACKAGE_VERSION"
$ > 1
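Both grep/cut variants can be checked outside Jenkins. With a minimal hypothetical package.json, `cut -d '"' -f4` picks the value between the third and fourth double quote on the matched line:

```shell
# minimal hypothetical package.json
printf '{\n  "name": "demo",\n  "version": "1.2.3"\n}\n' > package.json

# field 4 when splitting on double quotes is the version value
grep 'version' package.json | cut -d '"' -f4
```

The `tr '\n' '\0'` in the answers above only strips the trailing newline from this output.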
stage('Read JSON') {
    steps {
        script {
            def packageJson = readJSON file: 'package.json'
            def packageVersion = packageJson.version
            echo "${packageVersion}"
        }
    }
}
The below snippet worked for me; credits to @Ferenc Takacs:
version = sh(returnStdout: true, script: "grep 'version' package.json | cut -d '\"' -f4 | tr '\\n' '\\0'")
This command takes exactly the version: ... property in package.json and works on both Mac and Linux. The other solutions using grep will not give you the correct answer if you have more than one version keyword in your package.json (they return all of them instead of just the one you want):
awk -F'"' '/"version": ".+"/{ print $4; exit; }' package.json
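Unlike a plain `grep 'version'`, this awk one-liner stops at the first match (the `exit`), so a nested dependency that also carries a version key is ignored. A quick check with a hypothetical file containing two version keys:

```shell
# hypothetical package.json with a second, nested "version" key
printf '{\n  "version": "1.2.3",\n  "deps": { "x": { "version": "9.9.9" } }\n}\n' > package.json

# prints only the top-level version, then exits before the nested one
awk -F'"' '/"version": ".+"/{ print $4; exit; }' package.json
```

Only 1.2.3 is printed; the nested 9.9.9 never reaches the output.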
To access an npm environment variable outside the scope of a run-script, parse the variable with bash:
$ npm run env | grep npm_package_version | cut -d '=' -f 2
Author: https://remarkablemark.org/blog/2018/08/14/package-json-version/
