How to write common code for Jenkins Pipelines

I would like to know how one would write common code that is used by other vars files. Any pointers to decent tutorials would be appreciated, as I'm sure I will be writing many more modules. I have been trying to follow this: https://www.jenkins.io/doc/book/pipeline/shared-libraries/#writing-libraries but have very little knowledge of Groovy.
From what I understand, files in vars/ are loaded and are available within the pipeline syntax. Files located in src/ need to be imported by the files in vars/ to be used. Is this correct?
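As I understand it, the layout would look roughly like this (using my own names):
jenkins-shared-lib-global/
├── src/com/mycompany/awsCommon/awsCommon.groovy   // helper classes, imported explicitly
└── vars/ECRPush.groovy                            // global steps, exposed to pipelines as ECRPush(...)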
I would like to produce the following:
A file that has common code such as AWS account numbers and functions like ECR logins, plus an ECR string creator which requires the account number. I foresee many other common AWS tasks living in here.
A file that has a list of commands to execute to do the work I want to do.
What I have currently (doesn't work):
// vars/ECRPush.groovy
// No @Library annotation is needed here: classes under src/ of the same
// shared library are already on the classpath of files in vars/.
import com.mycompany.awsCommon.awsCommon

def call(Map config = [:]) {
    // pass `this` so the class can call pipeline steps (sh, echo, withCredentials, ...)
    def aws = new awsCommon(this)
    aws.dockerLogin(config.account)
    def ecrAddress = aws.ecrAddress(account: config.account, name: config.name) + ":" + config.tag
    sh("docker tag ${config.sourceContainer} ${ecrAddress}")
    sh("docker push ${ecrAddress}")
}
// src/com/mycompany/awsCommon/awsCommon.groovy
#!/usr/bin/env groovy
package com.mycompany.awsCommon

class awsCommon implements Serializable {
    // handle on the pipeline script, needed to call steps from inside a class
    def steps

    awsCommon(steps) {
        this.steps = steps
    }

    Map accounts = [
        "dev": "123",
        "shared": "456"
    ]

    // used as:
    // account: key from accounts
    // name: 'service'
    String ecrAddress(Map config = [:]) {
        return accounts[config.account] + ".dkr.ecr.eu-west-1.amazonaws.com/" + config.name
    }

    def dockerLogin(String account) {
        switch (account) {
            case "shared":
                steps.echo 'login to Shared ECR'
                steps.withCredentials([steps.aws(credentialsId: 'jenkins-aws-shared')]) {
                    steps.sh '$(aws ecr get-login --no-include-email --region eu-west-1)'
                }
                break
            case "dev":
                steps.echo 'login to Dev ECR'
                steps.withCredentials([steps.aws(credentialsId: 'jenkins-aws-dev')]) {
                    steps.sh '$(aws ecr get-login --no-include-email --region eu-west-1)'
                }
                break
        }
    }
}
// Example desired usage
@Library('jenkins-shared-lib-global') _
pipeline {
    agent any   // declarative pipelines require an agent and a stages block
    stages {
        stage('ecr push') {
            steps {
                ECRPush(
                    account: 'dev',
                    name: 'ecs-testing',
                    sourceContainer: 'ecs-testing:dev',
                    tag: 'jenkins'
                )
            }
        }
    }
}

Related

How to feed credentials/password type values into a CloudFormation template using the deploy command's --parameter-overrides from a Jenkins pipeline?

I want to create some secrets in AWS Secrets Manager. These secret values are stored in the Jenkins credential store, and my Jenkinsfile reads them using Jenkins credentials.
I can easily feed these values to my CloudFormation templates as follows:
stage("aws-stack-deploy") {
agent { label 'linux' }
steps {
withCredentials([string(credentialsId: env.SECRET_TEXT_KEY, variable: 'SECRET_TEXT_VALUE')]) {
withAwsCli(credentialsId: "${env.AWS_CREDENTIALS}", defaultRegion: "${env.AWS_REGION}") {
aws cloudformation deploy --template-file ./secretmananger.yaml --parameter-overrides $(cat /cft-param.json | tr '\n' ', ') ${PARAM_SECRET_KEY}=${SECRET_TEXT_VALUE}
}
}
}
}
But I am looking for a more scalable way to feed these secrets, where I can add a variable number of secrets in an automated fashion.
I'd like to know if there is a better way to do this. Thanks!
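One way this could scale (a sketch, untested; the parameter names and credential IDs below are illustrative, not from the original setup): keep the mapping of CloudFormation parameter names to Jenkins credential IDs in a map, build the withCredentials bindings in a loop, and assemble the override string for the shell to expand:
def secretParams = ['ParamKeyA': 'cred-id-a', 'ParamKeyB': 'cred-id-b']  // illustrative names
def bindings = []
def overrides = ''
for (entry in secretParams) {
    bindings << string(credentialsId: entry.value, variable: entry.key)
    overrides += " ${entry.key}=\${${entry.key}}"   // leave ${...} for the shell to expand
}
withCredentials(bindings) {
    withAwsCli(credentialsId: "${env.AWS_CREDENTIALS}", defaultRegion: "${env.AWS_REGION}") {
        sh "aws cloudformation deploy --template-file ./secretmananger.yaml --parameter-overrides${overrides}"
    }
}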

Scripts not permitted to use staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods write java.io.File java.lang.String

I'm trying to create a vault deployment using Jenkins. Here's a link to my repo.
When running the script I'm getting
"Scripts not permitted to use staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods write java.io.File java.lang.String. Administrators can decide whether to approve or reject this signature." issue.
I got this issue after adding a stage "Generate Vars".
If I remove this stage, the other stages work, but they don't complete the job, because it needs to get the token for the vault deployment from the .tfvars file.
It's not a good idea to share my variables on GitHub, so I'm trying to create vault.tfvars through Jenkins and provide any token before running a pipeline job.
Does anyone know how to fix this?
If some part is not clear please feel free to ask questions!
If I find the solution for this issue I will share it here with the link to my GitHub.
Thanks
Here is my code, Jenkinsfile.groovy:
node('master') {
    properties([parameters([
        string(defaultValue: 'plan', description: 'Please provide what action you want? (plan,apply,destroy)', name: 'terraformPlan', trim: true),
        string(defaultValue: 'default_token_add_here', description: 'Please provide a token for vault', name: 'vault_token', trim: true)
    ])])
    checkout scm
    stage('Generate Vars') {
        def file = new File("${WORKSPACE}/vaultDeployment/vault.tfvars")
        file.write """
vault_token = "${vault_token}"
"""
    }
    stage("Terraform init") {
        dir("${workspace}/vaultDeployment/") {
            sh 'ls'
            sh 'pwd'
            sh "terraform init"
        }
    }
    stage("Terraform Plan/Apply/Destroy") {
        if (params.terraformPlan.toLowerCase() == 'plan') {
            dir("${workspace}/vaultDeployment/") {
                sh "terraform plan -var-file=variables.tfvars"
            }
        }
        if (params.terraformPlan.toLowerCase() == 'apply') {
            dir("${workspace}/vaultDeployment/") {
                sh "terraform apply --auto-approve"
            }
        }
        if (params.terraformPlan.toLowerCase() == 'destroy') {
            dir("${workspace}/vaultDeployment/") {
                sh "terraform destroy --auto-approve"
            }
        }
    }
}
Generally, we choose to execute pipelines in the Groovy sandbox, which restricts some operations for security reasons, such as using the new keyword or calling static methods.
To lift a restriction you need a Jenkins admin to add the signature to the whitelist in Jenkins > Manage Jenkins > In-process Script Approval.
To write a file, Jenkins Pipeline supplies the alternative writeFile step, which has no such restriction.
writeFile file: '<file path>', text: """
vault_token = "${vault_token}"
"""
As @yong already pointed out, the right way to achieve this, and to avoid such restrictions in environments where we don't have admin control, is to use writeFile,
i.e.:
writeFile file: 'tmp/query.sql', text: "SELECT * FROM table"
The advantage of this is that migrating from a fully managed to a restricted environment will be painless.
Subfolders, like 'tmp' in the example, are created automatically, and the code itself is quite explicit.

How to read log file from within pipeline?

I have a pipeline job that runs a maven build. In the "post" section of the pipeline, I want to get the log file so that I can perform some failure analysis on it using some regexes. I have tried the following:
def logContent = Jenkins.getInstance()
    .getItemByFullName(JOB_NAME)
    .getBuildByNumber(Integer.parseInt(BUILD_NUMBER))
    .logFile.text
Error for the above code:
Scripts not permitted to use staticMethod jenkins.model.Jenkins getInstance
currentBuild.rawBuild.getLogFile()
Error for the above code:
Scripts not permitted to use method hudson.model.Run getLogFile
From my research, when I encounter these, I should be able to go to the scriptApproval page and see a prompt to approve these scripts, but when I go to that page, there are no new prompts.
I've also tried loading the script from a separate file and running it on a different node, with no luck.
I'm not sure what else to try at this point, so that's why I'm here. Any help is greatly appreciated.
P.S. I'm aware of the BFA tool, and I've tried manually triggering the analysis early, but in order to do that, I need to be able to access the log file, so I run into the same issue.
You can use the pipeline step httpRequest, from the HTTP Request plugin:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Test fetch build log'
            }
            post {
                always {
                    script {
                        def logUrl = env.BUILD_URL + 'consoleText'
                        def response = httpRequest(
                            url: logUrl,
                            authentication: '<credentialsId of jenkins user>',
                            ignoreSslErrors: true
                        )
                        def log = response.content
                        echo 'Build log: ' + log
                    }
                }
            }
        }
    }
}
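With the log text in hand, the failure analysis mentioned in the question can be plain Groovy; a minimal sketch (the pattern here is just an example):
def suspicious = log.split('\n').findAll { it =~ /(?i)error|exception/ }
if (suspicious) {
    echo "Found ${suspicious.size()} suspicious lines in the build log"
}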
If your Jenkins job can run on a Linux machine, you can use curl to achieve the same goal.
pipeline {
    agent any
    stages {
        stage('Build') {
            environment {
                JENKINS_AUTH = credentials('<credentialsId of jenkins user>')
            }
            steps {
                sh 'pwd'
            }
            post {
                always {
                    script {
                        def logUrl = env.BUILD_URL + 'consoleText'
                        // single-quoted so the shell expands ${JENKINS_AUTH}
                        def cmd = 'curl -u ${JENKINS_AUTH} -k ' + logUrl
                        def log = sh(returnStdout: true, script: cmd).trim()
                        echo 'Build log: ' + log
                    }
                }
            }
        }
    }
}
Both approaches above require the credentials to be in Username with password format. For more detail about what that is and how to add one in Jenkins, see the Jenkins credentials documentation.
Currently this is not possible via the RunWrapper object that is made available. See https://issues.jenkins.io/browse/JENKINS-46376 for a request to add this.
So the only options are:
explicitly whitelisting the methods, or
reading the log via the URL as described in the other answer, which requires either anonymous read access or proper credentials.

Hiding passwords in Jenkins Pipeline log output without using WithCredentials

I have a parametrized Jenkins pipeline based on a Jenkinsfile. Some of the parameters contain sensitive passwords that I don't want to appear in the job's build logs.
So my question is: can I somehow register a String within the Jenkinsfile that is then replaced, by let's say **********, whenever it appears in the log output?
I am aware of the withCredentials step, but I can't use it, since the credentials are not stored in the Jenkins credentials store (but provided as parameters at runtime).
I found this answer here https://stackoverflow.com/a/42372859/1549950 and tried it like this:
def secrets = [
    [password: firstPassword, var: 'SECRET'],
    [password: secondPassword, var: 'SECRET'],
    [password: thirdPassword, var: 'SECRET']
]
node() {
    wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: secrets]) {
        // my stages containing steps...
    }
}
Where firstPassword, secondPassword, thirdPassword are variables containing my passwords. But I still get the content of firstPassword... displayed in plain text in the log output.
I have the Mask Passwords plugin, version 2.12.0, installed on my Jenkins.
Basically I am searching for something like this: https://issues.jenkins-ci.org/browse/JENKINS-27486 - the ticket is resolved, but no sample snippet of the final implementation is given.
Actually I don't know why this didn't work in the first place, but here is the solution to the problem.
Define an array with secrets that you want to hide like this:
// declared without `def` so they are binding variables,
// visible inside the getSecrets() method below
splunkPassword = 'verySecretPa55w0rd'
basicAuthPassword = 'my8asicAuthPa55w0rd'

def getSecrets() {
    [
        [password: splunkPassword, var: 'SECRET'],
        [password: basicAuthPassword, var: 'SECRET']
    ]
}
Disclaimer: I don't know whether the SECRET value has an important role; I copied and pasted it from some snippet and it works as expected :)
Afterwards, you can wrap any calls in your scripted pipeline like this:
node {
    wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: getSecrets()]) {
        stage('First Stage') { ... }
        stage('Second Stage') { ... }
    }
}
All passwords provided in the getSecrets() array will then be masked like this in your build output:
SPLUNK_PASSWORD: ********
BASIC_AUTH_ADMIN_PASSWORD: ********
I think you are looking for JENKINS-36007?
Update 26 May 2020
The workaround below stopped working for me recently. My guess is that something changed in a recent Jenkins update. I was trying to avoid installing another plugin, but I eventually gave up and installed the Mask Passwords plugin.
I used the following syntax for use with parameters:
parameters {
    string(name: 'USERNAME', defaultValue: '', description: 'Username')
    password(name: 'PASSWORD', defaultValue: '', description: 'Password')
}
Then in the build stage:
steps {
    script {
        wrap([$class: 'MaskPasswordsBuildWrapper',
            varPasswordPairs: [
                [password: "${USERNAME}", var: 'USR'],
                [password: "${PASSWORD}", var: 'PSW']
            ]
        ]) {
            sh '''
                echo "Username: ${USERNAME}"
                echo "Password: ${PASSWORD}"
            '''
        }
    }
}
The original workaround is below, in case anyone else tries to go down the same path.
I've discovered a workaround that is a bit of a hack, but seems to work well. The trick is to use withCredentials, but override the variable with a parameter.
Here's an example which uses the environment directive's credentials() helper method to populate an environment variable, then overrides the two additional environment variables that are automatically defined (and masked in the logs).
First, create a dummy Username with password credential. The username and password values don't matter; we just need a credential to use as a placeholder. Enter an ID such as dummy-credentials.
Then define an environment variable using the dummy credentials, and override the automatically defined variables with the parameters (MYUSERNAME and MYPASSWORD in this example):
environment {
    MY_CREDS = credentials('dummy-credentials')
    MY_CREDS_USR = "${params.MYUSERNAME}"
    MY_CREDS_PSW = "${params.MYPASSWORD}"
}
Use the MY_CREDS_USR and MY_CREDS_PSW environment variables wherever you need to reference the secrets. Their contents will be masked in the console log.
sh '''
    echo "Username: ${MY_CREDS_USR}"
    echo "Password: ${MY_CREDS_PSW}"
'''
You might have a look at https://github.com/jenkinsci/log-file-filter-plugin
This plugin allows filtering Jenkins' console output by means of regular expressions. If a pattern matches, the matched string is replaced by a replacement string that can be configured per pattern.
Currently the plugin doesn't support adding filter patterns from a Jenkinsfile, only from the Jenkins global settings.
Highly brutish workaround.
Write a simple script, e.g. in Bash, and echo the parameter credentials into a file of whatever format suits your approach.
E.g. basic shell script:
$ cat executor/obfuscate.sh
#!/bin/bash
echo "PASSWORD: ${AWX_PW}" > ./executor/credential.yml
In your pipeline then:
stages {
    stage('Placing') {
        steps {
            sh './executor/obfuscate.sh'
        }
        [...]
        < something reading credential.yml>
    }
}
Outcome: nothing shows up in the console.

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are Libraries, but it does not seem like I could include the stage definition using them.
You will have to create a shared library to implement what I am about to suggest. For shared library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (kind of a template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared library repository with a vars directory in it, you just have to create a Groovy file (let's say commonPipeline.groovy) inside the vars directory.
Here's an example that works, because I have used it earlier in multiple jobs:
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you probably already have a Jenkinsfile in each Jenkins project (if not, create one), you just have to replace the existing content of that file with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` & set `Default version` to the `master` branch in the
// `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _
def params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file, so all your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all those projects. Also note that I have used jenkins_var above. It can be any name; it's not actually used, but some argument is required because call is declared with a Map parameter that has no default value.
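For instance, if there is nothing to pass, either of these would work (a sketch):
commonPipeline([:])   // an empty map satisfies the signature
// ...or give the parameter a default in vars/commonPipeline.groovy:
def call(Map config = [:]) { ... }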
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
