I have a Jenkinsfile with many steps.
This is my issue:
1) I want to run an Ansible playbook and keep a variable in the Jenkins run (like an environment variable or something).
2) I want to run ANOTHER playbook in a different step and USE that variable.
example:
pipeline {
    stages {
        stage('run ansible play1') {
            steps {
                dir("${WORKSPACE}") {
                    ansiblePlaybook([
                        inventory   : 'hosts',
                        playbook    : 'playbook1.yml',
                        installation: 'ansible',
                        colorized   : true,
                        extraVars   : [
                            var1: "blah1",
                            var2: "blah2",
                        ]
                    ])
                }
            }
        }
        stage('run ansible play2') {
            steps {
                dir("${WORKSPACE}") {
                    ansiblePlaybook([
                        inventory   : 'hosts',
                        playbook    : 'playbook2.yml',
                        installation: 'ansible',
                        colorized   : true,
                        extraVars   : [
                            var_from_last_play: "some_value",
                        ]
                    ])
                }
            }
        }
    }
}
I hope I have made myself clear... Thanks for your help, and if you need more info, let me know.
In Jenkins pipelines you can declare environment variables at the top of the script like this:
pipeline {
    environment {
        MY_ENV_VAR = 'something' // Added variable
    }
    stages {
        stage('run ansible play1') {
            steps {
                dir("${WORKSPACE}") {
                    ansiblePlaybook([
                        inventory   : 'hosts',
                        playbook    : 'playbook1.yml',
                        installation: 'ansible',
                        colorized   : true,
                        extraVars   : [
                            var1: "blah1",
                            var2: "blah2",
                        ]
                    ])
                    // You can also assign a new value to the env variable
                    // depending on the results of the script execution
                    // (in a declarative pipeline the assignment needs a script block):
                    // script { env.MY_ENV_VAR = 'something' }
                }
            }
        }
        stage('run ansible play2') {
            steps {
                dir("${WORKSPACE}") {
                    ansiblePlaybook([
                        inventory   : 'hosts',
                        playbook    : 'playbook2.yml',
                        installation: 'ansible',
                        colorized   : true,
                        extraVars   : [
                            var_from_last_play: env.MY_ENV_VAR,
                        ]
                    ])
                }
            }
        }
    }
}
This way you can read the variable in any stage you want, and you can also set it to a desired value in an earlier stage (output from a script or whatever).
Another way is to declare a global variable at the top, outside the pipeline { ... } block, as a Groovy def and do the same, but it is not as clean as the other solution:
def myVar = ''
pipeline {
...
}
Let me know if this is what you wanted.
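For completeness, one hedged sketch of how the hand-off itself could work: if playbook1.yml writes the value to a file in the workspace (the path out/myvar.txt here is hypothetical, not part of the question), a stage in between can read it back into env before the second play runs:

```groovy
stage('capture var from play1') {
    steps {
        script {
            // readFile is a core pipeline step; trim the trailing newline
            env.MY_ENV_VAR = readFile('out/myvar.txt').trim()
        }
    }
}
```

The second stage can then pass env.MY_ENV_VAR in extraVars exactly as shown above.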
I am trying to have a separate file holding variables for a Jenkins pipeline, because it will be used by multiple pipelines. But I can't seem to find the proper way to include it. Is there any way to include it?
MapA:
def MapA = [
    ItemA: [
        Environment: 'envA',
        Name: 'ItemA',
        Version: '1.0.0.2',
    ],
    ItemB: [
        Environment: 'envB',
        Name: 'ItemB',
        Version: '2.0.0.1',
    ]
]
return this;
MainScript:
def NodeLabel = 'windows'
def CustomWorkSpace = "C:/Workspace"

// Tried loading it here (Location 1)
load 'MapA'

pipeline {
    agent {
        node {
            // Restrict Project Execution
            label NodeLabel
            // Use Custom Workspace
            customWorkspace CustomWorkSpace
            // Tried loading it here (Location 2)
            load 'MapA'
        }
    }
    stages {
        // Solution
        stage('Solution') {
            steps {
                script {
                    // Using it here
                    MapA.each { Solution ->
                        stage("Stage A") {
                            ...
                        }
                        stage("Stage B") {
                            ...
                        }
                        // Extract Commit Solution
                        stage("Stage C") {
                            ...
                            echo "${Solution.value.Environment}"
                            echo "${Solution.value.Name}"
                            echo "${Solution.value.Version}"
                        }
                    }
                }
            }
        }
    }
}
On Location 1, outside the pipeline and node sections, it gave the below error:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
On Location 2, inside the node section, it gave the below error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 7: Expected to find ‘someKey "someValue"’ @ line 7, column 14.
   load 'MapA'
node {
^
You can achieve your scenario in 2 ways:
#1
If you want, you can hardcode the map in the same Jenkinsfile and make use of it in your pipeline, like the below example.
Jenkinsfile content:
def MapA = [
    ItemA: [
        Environment: 'envA',
        Name: 'ItemA',
        Version: '1.0.0.2',
    ],
    ItemB: [
        Environment: 'envB',
        Name: 'ItemB',
        Version: '2.0.0.1',
    ]
]

pipeline {
    agent any;
    stages {
        stage('debug') {
            steps {
                script {
                    MapA.each { k, v ->
                        stage(k) {
                            v.each { k1, v1 ->
                                // do your actual task by accessing the map value like below
                                echo "${k} , ${k1} value is : ${v1}"
                            }
                        }
                    }
                }
            }
        }
    }
}
#2
If you would like to keep the variable in a separate Groovy file in a Git repo, it will look like below.
Git Repo file and folder structure
.
├── Jenkinsfile
├── README.md
└── var.groovy
var.groovy
def mapA() {
    return [
        ItemA: [
            Environment: 'envA',
            Name: 'ItemA',
            Version: '1.0.0.2',
        ],
        ItemB: [
            Environment: 'envB',
            Name: 'ItemB',
            Version: '2.0.0.1',
        ]
    ]
}

def helloWorld() {
    println "Hello World!"
}

return this;
Jenkinsfile
pipeline {
    agent any
    stages {
        stage("iterate") {
            steps {
                sh """
                  ls -al
                """
                script {
                    def x = load "${env.WORKSPACE}/var.groovy"
                    x.helloWorld()
                    x.mapA().each { k, v ->
                        stage(k) {
                            v.each { k1, v1 ->
                                echo "for ${k} value of ${k1} is ${v1}"
                            }
                        } //stage
                    } //each
                } //script
            } //steps
        } //stage
    }
}
Within Jenkins, I would like to parse the Ansible playbook "Play Recap" output section for the failing hostname(s). I want to put the information into an email or other notification. This could also be used to fire off another Jenkins job.
I'm currently submitting an ansible-playbook run as a Jenkins job to deploy software across a number of systems. I'm using a Jenkins Pipeline script, which was necessary for sshagent to be applied correctly.
pipeline {
    agent any
    options {
        ansiColor('xterm')
    }
    stages {
        stage("setup environment") {
            steps {
                deleteDir()
            } //steps
        } //stage - setup environment
        stage("clone the repo") {
            environment {
                GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
            } //environment
            steps {
                sshagent(['my_git']) {
                    sh "git clone ssh://git@github.com/~usr/ansible.git"
                } //sshagent
            } //steps
        } //stage - clone the repo
        stage("run ansible playbook") {
            steps {
                sshagent (credentials: ['apps']) {
                    withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
                        dir('ansible') {
                            ansiblePlaybook(
                                becomeUser: null,
                                colorized: true,
                                credentialsId: 'apps',
                                disableHostKeyChecking: true,
                                forks: 50,
                                hostKeyChecking: false,
                                inventory: 'hosts',
                                limit: 'production:&*generic',
                                playbook: 'demo_play.yml',
                                sudoUser: null,
                                extras: '-vvvvv'
                            ) //ansiblePlaybook
                        } //dir
                    } //withEnv
                } //sshagent
            } //steps
        } //stage - run ansible playbook
    } //stages
    post {
        failure {
            emailext body: "Please go to ${env.BUILD_URL}/consoleText for more details.",
                recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
                subject: "${env.JOB_NAME}",
                to: 'our.dev.team@gmail.com',
                attachLog: true
            office365ConnectorSend message: "A production system appears to be unreachable.",
                status: "Failed",
                color: "f00000",
                factDefinitions: [[name: "Credentials ID", template: "apps"],
                                  [name: "Build Duration", template: "${currentBuild.durationString}"],
                                  [name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
                webhookUrl: 'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
        } //failure
    } //post
} //pipeline
There are several Jenkins plug-ins for parsing the console output, but none will let me capture and reuse the matched text. I have looked at log-parser and text-finder.
The only lead I have is using Groovy to script this:
https://devops.stackexchange.com/questions/5363/jenkins-groovy-to-parse-console-output-and-mark-build-failure
An example of "Play Recap" within the console output is:
PLAY RECAP **************************************************************************************************************************************************
some.host.name : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
I am trying to get either a list or a delimited string of each host that is failing. In the case of a list, though, I would also need to figure out how to send multiple notifications.
If anyone could help me with the full solution, I would very much appreciate it.
Q: "Parse the ansible playbook 'Play Recap' output section."
A: Use json callback and parse the output with jq. For example
shell> ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml | jq .stats
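To sketch what that looks like end-to-end (the file name out.json is my choice): the json callback's stats object is keyed by hostname, so failing or unreachable hosts can be selected directly. The snippet below fakes the stats file so the jq filter can be shown on its own; normally you would produce it with ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml > out.json.

```shell
# Fake a stats object shaped like the one the json callback emits,
# using the two hosts from the "Play Recap" sample in the question.
cat > out.json <<'EOF'
{"stats": {
  "some.host.name":  {"ok": 25, "changed": 2, "unreachable": 0, "failures": 1, "skipped": 2, "rescued": 0, "ignored": 0},
  "some.ip.address": {"ok": 22, "changed": 2, "unreachable": 0, "failures": 0, "skipped": 1, "rescued": 0, "ignored": 0}
}}
EOF

# Keep only hosts whose failures or unreachable counts are non-zero
jq -r '.stats | to_entries[]
       | select(.value.failures > 0 or .value.unreachable > 0)
       | .key' out.json   # prints: some.host.name
```

The resulting hostname list could then be fed into emailext or office365ConnectorSend from the pipeline.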
There are a few 'gotchas' that I came across as I solved this problem.
The only successful way I could access the output of the Ansible plugin was through pulling the raw log file: def log = currentBuild.rawBuild.getLog(100). In this case I only pulled the last 100 lines, as I'm only looking for the Play Recap box. This method requires special permissions; the console log will display the error and provide a link where the calls can be approved.
The Ansible output should not be colorized (colorized: false). Colorized output is quite difficult to parse. The console log doesn't show you the colorized markup, but if you look at consoleText you will see it.
When using regex, you will most likely have a matcher object, which is non-serializable. To use it in Jenkins, it may need to be placed in a function annotated with @NonCPS, which stops Jenkins from trying to serialize the object. I had mixed results with needing this, so I don't exhaustively understand where it's required.
The regex statement was one of the harder parts for me. I came up with a generic statement that can be easily modified for different scenarios, e.g. failed or unreachable. I also had more luck using the slashy-style regex in Groovy, which places a forward slash on either end of the statement with no need for quotes of any kind. You'll note the failed portion is different, failed=([1-9]|[1-9][0-9]), so that it only matches a statement where the failure count is non-zero.
/([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
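As a quick illustration (plain Groovy, outside Jenkins), running that slashy regex against the two sample "Play Recap" lines from the question matches only the host with a non-zero failed count:

```groovy
def recap = '''some.host.name   : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address  : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0'''

def matches = recap =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
matches.each { m -> println m[0] }   // prints only some.host.name
```

The lookahead is zero-width, so each match is just the hostname; m[0] is the full match and m[1] through m[5] are the capture groups.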
Here's the full pipeline code that I came up with.
pipeline {
    agent any
    options {
        ansiColor('xterm')
    }
    stages {
        stage("setup environment") {
            steps {
                deleteDir()
            } //steps
        } //stage - setup environment
        stage("clone the repo") {
            environment {
                GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
            } //environment
            steps {
                sshagent(['my_git']) {
                    sh "git clone ssh://git@github.com/~usr/ansible.git"
                } //sshagent
            } //steps
        } //stage - clone the repo
        stage("run ansible playbook") {
            steps {
                sshagent (credentials: ['apps']) {
                    withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
                        dir('ansible') {
                            ansiblePlaybook(
                                becomeUser: null,
                                colorized: false,
                                credentialsId: 'apps',
                                disableHostKeyChecking: true,
                                forks: 50,
                                hostKeyChecking: false,
                                inventory: 'hosts',
                                limit: 'production:&*generic',
                                playbook: 'demo_play.yml',
                                sudoUser: null,
                                extras: '-vvvvv'
                            ) //ansiblePlaybook
                        } //dir
                    } //withEnv
                } //sshagent
            } //steps
        } //stage - run ansible playbook
    } //stages
    post {
        failure {
            script {
                problem_hosts = get_the_hostnames()
            }
            emailext body: "${problem_hosts} has failed. Please go to ${env.BUILD_URL}/consoleText for more details.",
                recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
                subject: "${env.JOB_NAME}",
                to: 'our.dev.team@gmail.com',
                attachLog: true
            office365ConnectorSend message: "${problem_hosts} has failed.",
                status: "Failed",
                color: "f00000",
                factDefinitions: [[name: "Credentials ID", template: "apps"],
                                  [name: "Build Duration", template: "${currentBuild.durationString}"],
                                  [name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
                webhookUrl: 'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
        } //failure
    } //post
} //pipeline
// @NonCPS
def get_the_hostnames() {
    // Get the last 100 lines of the log
    def log = currentBuild.rawBuild.getLog(100)
    print log
    // Grep the log for the failed hostnames
    def matches = log =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
    def hostnames = null
    // if any matches occurred
    if (matches) {
        // iterate over the matches
        for (int i = 0; i < matches.size(); i++) {
            // matches[i][0] is the full match, i.e. the hostname;
            // if there is already a name, concatenate the next one, else populate it
            if (hostnames?.trim()) {
                hostnames = hostnames + " " + matches[i][0]
            } else {
                hostnames = matches[i][0]
            } // if/else
        } // for
    } // if
    if (!hostnames?.trim()) {
        hostnames = "No hostnames identified."
    }
    return hostnames
}
What I am trying to achieve is: the user will select an environment, and then I can set appropriate env vars for that environment (URL, DB, etc.).
Is it possible?
I tried a lot of things: interpolation in the environment block, defining the environments map in different places, but no luck.
My environments map variable is not accessible in the environment section, and there also seem to be some limitations on what can be done inside the environment section. I got messages like:
you can concatenate only with +, env var can only be value or function call.
I tried some variations based on those hints but still no luck.
def environments = [
    TEST: [APP_URL: 'http://test'],
    DEV:  [APP_URL: 'https://dev'],
    QA:   [APP_URL: 'https://qa']
]

pipeline {
    agent any
    parameters {
        choice(name: 'environment', choices: "${environments.keySet().join('\n')}")
    }
    stages {
        stage('Test') {
            environment {
                APP_URL = environments[params.environment]['APP_URL']
            }
            steps {
                sh 'env'
            }
        }
    }
}
This works :)
def environments = [
    TEST: [APP_URL: 'http://test'],
    DEV:  [APP_URL: 'https://dev'],
    QA:   [APP_URL: 'https://qa']
]

pipeline {
    agent any
    parameters {
        choice(name: 'environment', choices: "${environments.keySet().join('\n')}")
    }
    stages {
        stage('Test') {
            steps {
                sh """
                  export APP_URL=${environments[params.environment]['APP_URL']}
                  env
                """
            }
        }
    }
}
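Another option (a sketch, using the same environments map) is withEnv inside a script block, which scopes the variable to everything in the closure and avoids exporting inside the sh script:

```groovy
stage('Test') {
    steps {
        script {
            withEnv(["APP_URL=${environments[params.environment]['APP_URL']}"]) {
                sh 'env | grep APP_URL'
            }
        }
    }
}
```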
I'm new to Jenkins Pipeline; I'm defining a declarative-syntax pipeline and I don't know if I can solve my problem, because I didn't find a solution.
In this example, I need to pass a variable to the Ansible plugin (in the old version I used an ENV_VAR or injected it from a file with the inject plugin); that variable comes from a script.
This is my perfect scenario (but it doesn't work because of the environment {} block):
pipeline {
    agent { node { label 'jenkins-node' } }
    stages {
        stage('Deploy') {
            environment {
                ANSIBLE_CONFIG = '${WORKSPACE}/chimera-ci/ansible/ansible.cfg'
                VERSION = sh("python3.5 docker/get_version.py")
            }
            steps {
                ansiblePlaybook credentialsId: 'example-credential', extras: '-e version=${VERSION}', inventory: 'development', playbook: 'deploy.yml'
            }
        }
    }
}
I tried other ways to test how env vars work, based on other posts, for example:
pipeline {
    agent { node { label 'jenkins-node' } }
    stages {
        stage('PREPARE VARS') {
            steps {
                script {
                    env['VERSION'] = sh(script: "python3.5 get_version.py")
                }
                echo env.VERSION
            }
        }
    }
}
but "echo env.VERSION" returns null.
I also tried the same example with:
- VERSION=python3.5 get_version.py
- VERSION=python3.5 get_version.py > props.file (and tried to inject it, but didn't find how)
If this is not possible, I will do it in the Ansible role.
UPDATE
There is another "issue" in the Ansible plugin: to use variables in extra vars, you must use double quotes instead of single quotes so the variable is interpolated.
ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
You can create variables before the pipeline block starts, and you can have sh return stdout to assign to these variables. You don't have the same flexibility to assign to environment variables in the environment stanza. So substitute python3.5 get_version.py where I have echo 0.0.1 in the script below (and make sure your Python script prints just the version to stdout):
def awesomeVersion = 'UNKNOWN'

pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
                }
            }
        }
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
The output of the above pipeline is:
awesomeVersion: 0.0.1
In Jenkins 2.76 I was able to simplify the solution from @burnettk to:
pipeline {
    agent { label 'docker' }
    environment {
        // note: sh output includes a trailing newline; add .trim() if that matters
        awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1')
    }
    stages {
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
Using the Pipeline Utility Steps plugin, you can define general variables, available to all stages, from a properties file. For example, let props.txt be:
version=1.0
fix=alfa
and mix scripted and declarative Jenkins pipeline as:
def props
def VERSION
def FIX
def RELEASE

node {
    props = readProperties file: 'props.txt'
    VERSION = props['version']
    FIX = props['fix']
    RELEASE = VERSION + "_" + FIX
}

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "${RELEASE}"
            }
        }
    }
}
A possible variation of the main answer is to provide the variable using another pipeline instead of an sh script.
Example (the pipeline that sets the variables): my-set-env-variables
script {
    env.my_dev_version = "0.0.4-SNAPSHOT"
    env.my_qa_version = "0.0.4-SNAPSHOT"
    env.my_pp_version = "0.0.2"
    env.my_prd_version = "0.0.2"
    echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
(use these variables) in another pipeline, my-set-env-variables-test:
script {
    env.dev_version = "NOT DEFINED DEV"
    env.qa_version = "NOT DEFINED QA"
    env.pp_version = "NOT DEFINED PP"
    env.prd_version = "NOT DEFINED PRD"
}

stage('inject variables') {
    echo "PRE DEV version = ${env.dev_version}"
    script {
        // call the set-variables job
        def variables = build job: 'my-set-env-variables'
        def vars = variables.getBuildVariables()
        // println "found variables" + vars
        env.dev_version = vars.my_dev_version
        env.qa_version = vars.my_qa_version
        env.pp_version = vars.my_pp_version
        env.prd_version = vars.my_prd_version
    }
}

stage('next job') {
    echo "NEXT JOB DEV version = ${env.dev_version}"
    echo "NEXT JOB QA version = ${env.qa_version}"
    echo "NEXT JOB PP version = ${env.pp_version}"
    echo "NEXT JOB PRD version = ${env.prd_version}"
}
For those who want the environment variable keys to be dynamic, the following code can be used:
stage('Prepare Environment') {
    steps {
        script {
            def data = [
                "k1": "v1",
                "k2": "v2",
            ]
            data.each { key, value ->
                env."$key" = value
                // env[key] = value // can be used as well, but needs approval on the In-process Script Approval page
            }
        }
    }
}
You can also dump all your vars into a file and then use the '-e @file' syntax. This is very useful if you have many vars to populate.
steps {
    echo "hello World!!"
    sh """
        echo 'var1: ${params.var1}' > vars
        echo 'var2: ${params.var2}' >> vars
    """
    ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', sudoUser: null, extras: '-e @vars'
}
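A variant of the same idea (a sketch; the file name vars.yml is my choice, not from the original) uses the core writeFile step instead of shell redirection, which keeps the YAML formatting explicit:

```groovy
steps {
    // writeFile is a core pipeline step; the text becomes the vars file verbatim
    writeFile file: 'vars.yml', text: """---
var1: ${params.var1}
var2: ${params.var2}
"""
    ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', extras: '-e @vars.yml'
}
```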
You can also use library functions in the environment section, like so:
@Library('mylibrary') _ // contains functions.groovy with several functions

pipeline {
    environment {
        ENV_VAR = functions.myfunc()
    }
    …
}
My pipeline script is:
VersionNumber([
    versionNumberString : '1.0.${BUILD_DAY}',
    projectStartDate    : '1990-07-01',
    PrefixVariable      : ''
])
Through jobs it creates an environment variable. But through a pipeline, how can I echo the version number string?
Just assign it to an environment variable and use it:
environment {
    VERSION = VersionNumber([
        versionNumberString : '${BUILD_YEAR}.${BUILD_MONTH}.${BUILD_ID}',
        projectStartDate    : '2014-05-19'
    ]);
}
Then you can output it to a file:
steps {
    sh 'echo "$VERSION" > version.txt';
}
or to the console:
steps {
    sh 'echo "$VERSION"';
}
Wherever you use $VERSION it will be replaced with your version number.
Try the following code snippet:
environment {
    VERSION = VersionNumber([
        projectStartDate: '2017-05-12',
        skipFailedBuilds: true,
        versionNumberString: '${YEARS_SINCE_PROJECT_START, XX}.${BUILD_MONTH, XX}.${BUILDS_THIS_MONTH}',
        versionPrefix: 'v'
    ]);
}
Here is a Jenkins Declarative Pipeline example:
pipeline {
    agent any
    environment {
        XCODE_BUILD_NUMBER = VersionNumber(projectStartDate: '1970-01-01', versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMddHHmm"}', versionPrefix: '')
    }
    stages {
        stage('Example Print') {
            steps {
                echo XCODE_BUILD_NUMBER
                sh 'echo "when using sh, add a dollar sign: $XCODE_BUILD_NUMBER"'
            }
        }
    }
}