I have the below Jenkins pipeline and it is working fine:
pipeline {
agent
{
node
{
label 'test'
}
}
environment{
ansible_pass = 'credentials('ans-pass')'
}
stages {
stage('Load Vars'){
steps{
script{
configFileProvider([configFile(fileId: "${ENV_CONFIG_ID}", targetLocation: "${ENV_CONFIG_FILE}")]) {
load "${ENV_CONFIG_FILE}"
}
}
}
}
stage('svc install') {
steps {
sshagent(["${SSH_KEY_ID}"])
{
sh '''
ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file $ansible_pass
'''
}
}
}
}
}
Now I want to pass the credential ID from a global environment variable instead of hardcoding it:
ansible_pass = 'credentials('ans-pass')'
The ID ansible-pass1 should come from the managed files (Config File Provider).
I already have the below in the managed file:
env.ARTI_TOKEN_ID='art-token'
env.PLAYBOOK_REPO='dep.stg'
env.SSH_KEY_ID = 'test_key'
Now, how do I add this credential ID in this file? I tried like below:
env.ansible_pass = 'ansible-pass1'
and in the Jenkins pipeline referred to it as below:
environment{
ansible_pass = 'credentials($ansible_pass)'
}
But it didn't work. Could you please advise?
As you are using secrets in a config file, it is better to use the 'secret file' credential type in Jenkins. Follow the link to read about the different types of credentials.
Also, the correct way of setting credentials is:
environment{
ansible_pass = credentials('credentials-id-here')
}
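Note also that the top-level environment directive is evaluated when the pipeline starts, before the 'Load Vars' stage has loaded the managed file, so a credentials ID that only becomes known there cannot feed credentials(). A minimal sketch of one workaround, assuming env.ansible_pass = 'ansible-pass1' comes from the managed file as in the question, that 'ansible-pass1' is a 'secret file' credential holding the vault password, and that the hardcoded environment { ansible_pass = ... } entry is removed:
stage('svc install') {
    steps {
        sshagent(["${SSH_KEY_ID}"]) {
            // Bind the credential whose ID was loaded from the managed file.
            // The file() binding writes the secret to a temp file and exposes its path.
            withCredentials([file(credentialsId: env.ansible_pass, variable: 'VAULT_PASS_FILE')]) {
                sh 'ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file "$VAULT_PASS_FILE"'
            }
        }
    }
}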
I'd like my Jenkins deploy pipeline to:
1. attempt a shell command,
2. provide an input step if that command fails, and then
3. re-try the command and continue the pipeline on "ok".
Here's the (start) of my attempt to do so.
stage('Get config') {
steps {
sh 'aws appconfig get-configuration [etc etc]'
}
post {
failure {
input {
message "There is no config deployed for this environment. Set it up in AWS and then continue."
ok "Continue"
}
steps {
sh 'aws appconfig get-configuration [etc etc]'
}
}
}
}
When running the input directly in a stage, this example does show the input. However, when putting it in the post { failure }, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 27: Missing required parameter: "message" @ line 27, column 21.
input {
^
Do Jenkins declarative pipelines allow input in post?
Is there a better way to accomplish my desired outcome?
As per the documentation:
Post-condition blocks contain steps the same as the steps section.
This means that input in your code is interpreted as a step instead of a directive.
Solution using script syntax (try/catch would also be fine there):
stage('Get config') {
steps {
script {
def isConfigOk = sh( script: 'aws appconfig get-configuration [etc etc]', returnStatus: true) == 0
if ( ! isConfigOk ) {
input (message: "There is no config deployed for this environment. Set it up in AWS and then continue.", ok: "Continue")
sh 'aws appconfig get-configuration [etc etc]'
}
}
}
}
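For completeness, a try/catch variant of the same idea might look roughly like this (a sketch; the aws command is the placeholder from the question):
stage('Get config') {
    steps {
        script {
            try {
                sh 'aws appconfig get-configuration [etc etc]'
            } catch (err) {
                // Pause for manual remediation, then retry once; if the retry fails, the stage fails.
                input (message: "There is no config deployed for this environment. Set it up in AWS and then continue.", ok: "Continue")
                sh 'aws appconfig get-configuration [etc etc]'
            }
        }
    }
}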
Using post section:
stage('Get config') {
steps {
sh 'aws appconfig get-configuration [etc etc]'
}
post {
failure {
input (message: "There is no config deployed for this environment. Set it up in AWS and then continue.", ok: "Continue")
sh 'aws appconfig get-configuration [etc etc]'
}
}
}
Remember that with the post-section approach the outcome of the second aws appconfig get-configuration [etc etc] is ignored and the stage still fails. There is a way to change this behaviour, but I wouldn't call that solution clean.
Within Jenkins, I would like to parse the ansible playbook "Play Recap" output section for the failing hostname(s). I want to put the information into an email or other notification. This could also be used to fire off another Jenkins job.
I'm currently submitting an ansible-playbook run as a Jenkins job to deploy software across a number of systems. I'm using a Jenkins Pipeline script, which was necessary for sshagent to be applied correctly.
pipeline {
agent any
options {
ansiColor('xterm')
}
stages {
stage("setup environment") {
steps {
deleteDir()
} //steps
} //stage - setup environment
stage("clone the repo") {
environment {
GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
} //environment
steps {
sshagent(['my_git']) {
sh "git clone ssh://git#github.com/~usr/ansible.git"
} //sshagent
} //steps
} //stage - clone the repo
stage("run ansible playbook") {
steps {
sshagent (credentials: ['apps']) {
withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
dir('ansible') {
ansiblePlaybook(
becomeUser: null,
colorized: true,
credentialsId: 'apps',
disableHostKeyChecking: true,
forks: 50,
hostKeyChecking: false,
inventory: 'hosts',
limit: 'production:&*generic',
playbook: 'demo_play.yml',
sudoUser: null,
extras: '-vvvvv'
) //ansiblePlaybook
} //dir
} //withEnv
} //sshagent
} //steps
} //stage - run ansible playbook
} //stages
post {
failure {
emailext body: "Please go to ${env.BUILD_URL}/consoleText for more details.",
recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
subject: "${env.JOB_NAME}",
to: 'our.dev.team@gmail.com',
attachLog: true
office365ConnectorSend message:"A production system appears to be unreachable.",
status:"Failed",
color:"f00000",
factDefinitions: [[name: "Credentials ID", template: "apps"],
[name: "Build Duration", template: "${currentBuild.durationString}"],
[name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
webhookUrl:'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
} //failure
} //post
} //pipeline
There are several Jenkins plug-ins for parsing the console output, but none of them will let me capture and reuse the matched text. I have looked at Log Parser and Text Finder.
The only lead I have is using Groovy to script this:
https://devops.stackexchange.com/questions/5363/jenkins-groovy-to-parse-console-output-and-mark-build-failure
An example of "Play Recap" within the console output is:
PLAY RECAP **************************************************************************************************************************************************
some.host.name : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
I am trying to get either a list or a delimited string of the hosts that are failing. In the case of a list, though, I would need to figure out how to send multiple notifications.
If anyone could help me with the full solution, I would very much appreciate it.
Q: "Parse the ansible playbook 'Play Recap' output section."
A: Use the json callback and parse the output with jq. For example:
shell> ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml | jq .stats
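If you only want the names of the failing or unreachable hosts, a jq filter along these lines should do it (a sketch; .stats holds the per-host totals, including failures and unreachable, in the json callback output):
shell> ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml | jq -r '.stats | to_entries[] | select(.value.failures > 0 or .value.unreachable > 0) | .key'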
There are a few 'gotchas' that I came across as I solved this problem.
The only successful way I could access the output of the ansible plugin was by pulling the raw log: def log = currentBuild.rawBuild.getLog(100). In this case I only pulled the last 100 lines, as I'm only looking for the Play Recap box. This method requires extra permissions (script approval); the console log will display the error and provide a link where the calls can be approved.
The Ansible output should not be colorized (colorized: false); colorized output is quite difficult to parse. The 'console log' doesn't show you the colour markup, but if you look at the 'consoleText' you will see it.
When using regex, you will most likely have a matcher object, which is non-serializable. To use it in Jenkins, it may need to be placed in a function annotated with @NonCPS, which stops Jenkins from trying to serialize the object. I had mixed results with needing this, so I don't fully understand where it's required.
The regex statement was one of the harder parts for me. I came up with a generic pattern that can easily be modified for different scenarios, e.g. failed or unreachable. I also had more luck using the 'slashy' string syntax in Groovy, which places a forward slash on either end of the pattern with no need for quotes of any kind. You'll note the 'failed' portion is different (failed=([1-9]|[1-9][0-9])) so that it only matches a line where the failure count is non-zero.
/([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
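Just to illustrate what the pattern captures, here is a tiny offline Groovy check against the sample recap lines above (illustrative only; in the real pipeline the same matching happens inside the function shown below):
def sample = '''some.host.name : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0'''
def matches = sample =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
// Only the host with a non-zero failed count is matched.
assert matches.size() == 1
assert matches[0][0] == 'some.host.name'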
Here's the full pipeline code that I came up with.
pipeline {
agent any
options {
ansiColor('xterm')
}
stages {
stage("setup environment") {
steps {
deleteDir()
} //steps
} //stage - setup environment
stage("clone the repo") {
environment {
GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
} //environment
steps {
sshagent(['my_git']) {
sh "git clone ssh://git#github.com/~usr/ansible.git"
} //sshagent
} //steps
} //stage - clone the repo
stage("run ansible playbook") {
steps {
sshagent (credentials: ['apps']) {
withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
dir('ansible') {
ansiblePlaybook(
becomeUser: null,
colorized: false,
credentialsId: 'apps',
disableHostKeyChecking: true,
forks: 50,
hostKeyChecking: false,
inventory: 'hosts',
limit: 'production:&*generic',
playbook: 'demo_play.yml',
sudoUser: null,
extras: '-vvvvv'
) //ansiblePlaybook
} //dir
} //withEnv
} //sshagent
} //steps
} //stage - run ansible playbook
} //stages
post {
failure {
script {
problem_hosts = get_the_hostnames()
}
emailext body: "${problem_hosts} has failed. Please go to ${env.BUILD_URL}/consoleText for more details.",
recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
subject: "${env.JOB_NAME}",
to: 'our.dev.team@gmail.com',
attachLog: true
office365ConnectorSend message:"${problem_hosts} has failed.",
status:"Failed",
color:"f00000",
factDefinitions: [[name: "Credentials ID", template: "apps"],
[name: "Build Duration", template: "${currentBuild.durationString}"],
[name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
webhookUrl:'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
} //failure
} //post
} //pipeline
//@NonCPS
def get_the_hostnames() {
// Get the last 100 lines of the log
def log = currentBuild.rawBuild.getLog(100)
print log
// GREP the log for the failed hostnames
def matches = log =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
def hostnames = null
// if any matches occurred
if (matches) {
// iterate over the matches
for (int i = 0; i < matches.size(); i++) {
// if there is a name, concatenate it
// else populate it
if (hostnames?.trim()) {
hostnames = hostnames + " " + matches[i][0]
} else {
hostnames = matches[i][0]
} // if/else
} // for
} // if
if (!hostnames?.trim()) {
hostnames = "No hostnames identified."
}
return hostnames
}
I have a Jenkins pipeline which needs to run on a slave node. I currently have issues with passing variables set by the withCredentials plugin: when I try to use them on the slave node they are empty, but they work on the master.
Here is the pipeline snippet.
#!groovy
@Library('sharedPipelineLib@master') _
pipeline {
agent { node
{ label 'jenkins-slave-docker' }
}
options {
skipDefaultCheckout(true)
}
environment {
sonar = credentials('SONAR')
}
stages {
stage('Checkout') {
steps {
cleanWs()
script {
checkout scm
}
}
}
stage('Deploy backend') {
steps {
script {
withCredentials([
[
$class : 'AmazonWebServicesCredentialsBinding',
credentialsId : 'AWS_ACCOUNT_ID_DEV',
accessKeyVariable: 'AWS_ACCESS_KEY_ID_DEV',
secretKeyVariable: 'AWS_SECRET_ACCESS_KEY_DEV'
],
[
$class : 'AmazonWebServicesCredentialsBinding',
credentialsId : 'AWS_ACCOUNT_ID_DNS',
accessKeyVariable: 'AWS_ACCESS_KEY_ID_DNS',
secretKeyVariable: 'AWS_SECRET_ACCESS_KEY_DNS'
]
]){
sh '''
echo "$AWS_ACCESS_KEY_ID_DEV\\n$AWS_SECRET_ACCESS_KEY_DEV\\n\\n" | aws configure --profile profile_705229686812
echo "$AWS_ACCESS_KEY_ID_DNS\\n$AWS_SECRET_ACCESS_KEY_DNS\\n\\n" | aws configure --profile profile_417752960097
'''
}
}
}
}
}
}
And here is the log:
[Pipeline] withCredentials
Masking supported pattern matches of $AWS_ACCESS_KEY_ID_DEV or $AWS_SECRET_ACCESS_KEY_DEV or $AWS_SECRET_ACCESS_KEY_DNS or $AWS_ACCESS_KEY_ID_DNS
[Pipeline] {
[Pipeline] sh
echo '\n\n\n'
aws configure --profile profile_705229686812
AWS Access Key ID [None]: AWS Secret Access Key [None]:
EOF when reading a line
The issue was again the echo command. I had to use printf instead, because echo adds a newline, which causes the aws configure prompts to fail.
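A minimal sketch of the printf version, reusing the variable and profile names from the question (the two trailing blank answers cover the region and output-format prompts):
// Unlike echo, printf emits only the newlines written in its format string.
sh '''
printf '%s\\n%s\\n\\n\\n' "$AWS_ACCESS_KEY_ID_DEV" "$AWS_SECRET_ACCESS_KEY_DEV" | aws configure --profile profile_705229686812
printf '%s\\n%s\\n\\n\\n' "$AWS_ACCESS_KEY_ID_DNS" "$AWS_SECRET_ACCESS_KEY_DNS" | aws configure --profile profile_417752960097
'''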
I have created a credential test_cred of type secret text to store a password, which should be passed to an ansible playbook.
I am passing this parameter as an extra variable root_pass to Ansible, but the value of root_pass evaluates to the string test_cred instead of the secret text it contains. Can somebody please help me get the value of the credential test_cred so that I can pass it to Ansible?
stages {
stage('Execution') {
steps {
withCredentials([string(credentialsId: 'test_cred', variable: 'test')]) {
}
ansiblePlaybook(
installation: 'ansible',
inventory: "inventory/hosts",
playbook: "${PLAYBOOK}",
extraVars: [
server: "${params.Server}",
client: "${params.Client}",
root_pass: "${test}"
]
)
}
}
}
Thank you Zeitounator. The working code is:
stages {
stage('Execution') {
steps {
withCredentials([string(credentialsId: 'test_cred', variable: 'test')]) {
ansiblePlaybook(
installation: 'ansible',
inventory: "inventory/hosts",
playbook: "${PLAYBOOK}",
extraVars: [
server: "${params.Server}",
client: "${params.Client}",
root_pass: "${test}"
]
)
}
}
}
}