Pact in Jenkinsfile: can-i-deploy check

We are using Pact (https://pact.io/) in our project. The can-i-deploy check, which determines whether a deployment can go ahead, is done like this:
Jenkins environment variables (see <JENKINS_URL>/configure):
PACT_BROKER_URL
PACT_RUBY_STANDALONE_VERSION (taken from https://github.com/pact-foundation/pact-ruby-standalone/releases)
Jenkinsfile:
environment {
    SOURCE_BRANCH_NAME = sourceBranchName(env.BRANCH_NAME, env.CHANGE_BRANCH)
}
...
def sourceBranchName(String branchName, String changeBranchName) {
    return changeBranchName == null ? branchName : changeBranchName
}
...
stage('can-i-deploy') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'XXX', passwordVariable: 'PACT_BROKER_PASSWORD', usernameVariable: 'PACT_BROKER_USERNAME')]) {
            sh "curl -LO https://github.com/pact-foundation/pact-ruby-standalone/releases/download/v${PACT_RUBY_STANDALONE_VERSION}/pact-${PACT_RUBY_STANDALONE_VERSION}-linux-x86_64.tar.gz"
            sh "tar xzf pact-${PACT_RUBY_STANDALONE_VERSION}-linux-x86_64.tar.gz"
            echo "Performing can-i-deploy check"
            sh "pact/bin/pact-broker can-i-deploy --broker-base-url=${PACT_BROKER_URL} --broker-username=${PACT_BROKER_USERNAME} --broker-password=${PACT_BROKER_PASSWORD} --pacticipant=project-frontend --latest=${env.SOURCE_BRANCH_NAME} --pacticipant=project-backend --latest=${env.SOURCE_BRANCH_NAME} --pacticipant=other-project-backend --latest=${env.SOURCE_BRANCH_NAME}"
        }
    }
}
Is there a more elegant way to do this?

I can't speak for the Jenkins side, but there are two things worth changing in the arguments passed to can-i-deploy:
It's not recommended to use the --latest flag. Use --version to indicate the version of the application you are deploying, and the --to flag to denote the target environment (--latest risks race conditions between builds, giving you false positives/negatives).
You don't need to specify the other projects: can-i-deploy automatically detects all dependent components.
So it would look more like this:
can-i-deploy --broker-base-url=${PACT_BROKER_URL} --broker-username=${PACT_BROKER_USERNAME} --broker-password=${PACT_BROKER_PASSWORD} --pacticipant=project-frontend --version some-sha-1234 --to prod
If you have access to Docker, you might prefer to use our container.
P.S. If you export the following environment variables, you can drop them from the argument list as well:
PACT_BROKER_BASE_URL (note the minor difference from the PACT_BROKER_URL you're using)
PACT_BROKER_USERNAME
PACT_BROKER_PASSWORD
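Putting that advice together, the stage from the question could shrink to something like the sketch below. The credentialsId 'XXX' and the pacticipant name are taken from the question; using GIT_COMMIT as the application version and prod as the target environment are assumptions — use whatever version identifier you publish pacts with.

```groovy
// Hedged sketch, not a drop-in: assumes PACT_BROKER_BASE_URL is set as a
// global Jenkins environment variable and that pacts were published with
// the commit SHA as the application version.
stage('can-i-deploy') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'XXX',
                usernameVariable: 'PACT_BROKER_USERNAME',
                passwordVariable: 'PACT_BROKER_PASSWORD')]) {
            // With the PACT_BROKER_* variables in the environment, no
            // --broker-* flags are needed, and dependent pacticipants
            // are detected automatically.
            sh "pact/bin/pact-broker can-i-deploy --pacticipant=project-frontend --version=${env.GIT_COMMIT} --to prod"
        }
    }
}
```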

Related

Jenkins build failed due to command not being recognized

I have a build error saying the pandoc command is not recognized when I build my pipeline on Jenkins.
But when I run the exact same command using cmd.exe from the same repository, it works perfectly.
So what's wrong here? pandoc is properly installed and can be used from cmd.exe without issue, so why doesn't it work from Jenkins?
Here is my Jenkins code (the part causing the error is in the "Build" stage):
pipeline {
    agent any
    stages {
        stage('Prerequisites') {
            steps {
                //bat 'RMDIR C:\\wamp64\\www\\html\\doc'
                bat 'MKDIR C:\\wamp64\\www\\html\\doc'
            }
        }
        stage('Build') {
            steps {
                bat 'pandoc -s C:\\wamp64\\www\\index.md -o C:\\wamp64\\www\\index.html'
                bat 'pandoc -s C:\\wamp64\\www\\index.md -o C:\\wamp64\\www\\index.docx'
            }
        }
        stage('Deploy') {
            steps {
                bat 'COPY C:\\wamp64\\www\\index.html C:\\wamp64\\www\\html\\index.html'
                bat 'COPY C:\\wamp64\\www\\index.docx C:\\wamp64\\www\\html\\doc\\index.docx'
            }
        }
    }
}
Thanks for helping.
Jenkins doesn't automatically pick up your Windows Path environment variable. Instead, go to Jenkins -> Configure System -> Global properties -> Environment variables and add a new variable called Path. For the value, set $Path, and your path entries should start getting registered.
The issue has been discussed extensively in this question.
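Alternatively, the Path can be extended directly in the Jenkinsfile; a minimal sketch, assuming pandoc is installed under C:\Program Files\Pandoc (adjust to wherever pandoc.exe actually lives on your agent):

```groovy
pipeline {
    agent any
    environment {
        // Prepend the assumed pandoc install directory to the agent's PATH.
        PATH = "C:\\Program Files\\Pandoc;${env.PATH}"
    }
    stages {
        stage('Build') {
            steps {
                bat 'pandoc -s C:\\wamp64\\www\\index.md -o C:\\wamp64\\www\\index.html'
            }
        }
    }
}
```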

Jenkinsfile shell command not using env variables as expected

In my Jenkinsfile I want to dynamically find the Unity version using a Python script, like so:
environment {
    UNITY_EDITOR = bat(script: "py $WORKSPACE/get_versions.py --unity", returnStdout: true).trim()
    UNITY_BASE = "C:/Program Files/Unity/Hub/Editor/$UNITY_EDITOR/Editor/Unity.exe"
    UNITY_WRAPPER = "UnityBatchWrapper -silent-crashes -no-dialogs -batchmode -quit -unityPath \"$UNITY_BASE\""
}
post {
    always {
        script {
            echo "Returning license"
            licenseReturnStatus = bat (
                script: "$UNITY_WRAPPER -returnlicense",
                returnStatus: true
            ) == 0
        }
    }
}
From other Stack Overflow answers this seems like it should work, but instead my Jenkins job errors out during the post-build step because $UNITY_WRAPPER isn't defined:
groovy.lang.MissingPropertyException: No such property: UNITY_WRAPPER for class: groovy.lang.Binding
I'm thinking the batch step is what's failing, even though Jenkins doesn't complain about it. I've also tried using $env.WORKSPACE and %WORKSPACE%, and neither works.
I'm beginning to think $WORKSPACE doesn't exist until after the environment step...
Turns out I didn't have Python installed, since it was an ephemeral GCP builder and I hadn't updated the node label yet.
For anyone reading this who has trouble with bat commands: be sure to put an @ sign in front of your command, as in "@py ...", or else the command itself will be echoed into the captured output. Also trim your output so it doesn't have a trailing CRLF in it.
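Applied to the environment block from the question, both tips would look like this (a sketch; get_versions.py and its --unity flag are taken from the question):

```groovy
environment {
    // '@' stops cmd.exe from echoing the command itself into the captured
    // stdout; trim() removes the trailing CRLF that bat output carries.
    UNITY_EDITOR = bat(script: "@py ${WORKSPACE}/get_versions.py --unity",
                       returnStdout: true).trim()
}
```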

Jenkins - Is it possible to use two different versions of Terraform in a Jenkinsfile?

Currently we are using Terraform 0.11, but we would like to start moving to 0.12. The idea is to move module by module, which means some modules will stay on Terraform 0.11 while those that can run on 0.12 move over.
My question: in our Jenkinsfile we have a stage that downloads Terraform 0.11 and further stages that run it. Is it possible to download Terraform 0.12 as well and have some stages use 0.11 and others use 0.12?
stage('Download Terraform') {
    steps {
        sh "wget path/terraform/0.11.8/terraform-0.11.8.zip"
        sh "unzip -o terraform-0.11.8.zip"
        sh "rm terraform-0.11.8.zip"
    }
}
stage('Create .terraformrc') {
    steps {
        sh "echo ~"
        writeFile file: "/home/user/.terraformrc", text: """
credentials "" {
    token = ""
}
"""
    }
}
stage('Enable CloudTrail') {
    steps {
        {code}
    }
}
stage('Create Automation Lambdas') {
    steps {
        {code}
    }
}
In the above example, I would like the "Enable CloudTrail" stage to run with Terraform 0.12 and the "Create Automation Lambdas" stage to run with Terraform 0.11.
This is how I solved this (sorry if it is not exactly what you needed):
Install the Terraform plugin.
In the Jenkins UI, under Global Tool Configuration, add multiple Terraform installations and name them in a consistent, predictable way.
In your pipeline you can now 'pick' the version to use:
pipeline {
    agent { label 'terraformdevagent' }
    environment {
        TF_HOME = tool('terraform-0.14.4')
        PATH = "$TF_HOME:$PATH"
    }
}
You can add an optional parameter and a function that returns the parameter if it is not null, and a hardcoded value if it is.
This way you can have hundreds of jobs on one Terraform version and still test the new version in isolation. It also lets you handle exceptions and not block your rollouts because of a handful of projects that need to be refactored for the new version.
TF_HOME = tool(params.newTerraformVersion ?: 'terraform-0.14.4')
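Combining this with the per-stage environment block, version selection per stage could look like the following sketch. The tool names 'terraform-0.12.31' and 'terraform-0.11.8' are assumptions and must match installations defined under Global Tool Configuration:

```groovy
pipeline {
    agent any
    stages {
        stage('Enable CloudTrail') {
            environment {
                TF_HOME = tool('terraform-0.12.31')   // assumed tool name
                PATH = "${TF_HOME}:${env.PATH}"       // this stage resolves 0.12
            }
            steps {
                sh 'terraform version'
            }
        }
        stage('Create Automation Lambdas') {
            environment {
                TF_HOME = tool('terraform-0.11.8')    // assumed tool name
                PATH = "${TF_HOME}:${env.PATH}"       // this stage resolves 0.11
            }
            steps {
                sh 'terraform version'
            }
        }
    }
}
```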

Jenkins Pipeline - How do I use the 'tool' option to specify a custom tool?

I have a custom tool defined within Jenkins via the Custom Tools plugin. If I create a freestyle project the Install custom tools option correctly finds and uses the tool (Salesforce DX) during execution.
However, I cannot find a way to do the same via a pipeline file. I have used the pipeline syntax snippet generator to get:
tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
I have put that into my stage definition:
stage('FetchMetadata') {
    print 'Collect Prod metadata via SFDX'
    tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
    sh('sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml')
}
but I get an error message stating line 2: sfdx: command not found
Is there some other way I should be using this snippet?
Full Jenkinsfile for info:
node {
    currentBuild.result = 'SUCCESS'
    try {
        stage('CheckoutRepo') {
            print 'Get the latest code from the MASTER branch'
            checkout scm
        }
        stage('FetchMetadata') {
            print 'Collect Prod metadata via SFDX'
            tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
            sh('sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml')
        }
        stage('ConvertMetadata') {
            print 'Unzip retrieved metadata file'
            sh('unzip unpackaged.zip .')
            print 'Convert metadata to SFDX format'
            sh('/usr/local/bin/sfdx force:mdapi:convert -r metadata/unpackaged/ -d force-app/')
        }
        stage('CommitChanges') {
            sh('git add --all')
            print 'Check if any changes need committing'
            sh('if ! git diff-index --quiet HEAD --; then echo "changes found - pushing to repo"; git commit -m "Autocommit from Prod # $(date +%H:%M:%S\' \'%d/%m/%Y)"; else echo "no changes found"; fi')
            sshagent(['xxx-xxx-xxx-xxx']) {
                sh('git push -u origin master')
            }
        }
    }
    catch (err) {
        currentBuild.result = 'FAILURE'
        print 'Build failed'
        error(err)
    }
}
UPDATE
I have made some progress using this example Jenkinsfile
My stage now looks like this:
stage('FetchMetadata') {
    print 'Collect Prod metadata via SFDX'
    def sfdxLoc = tool 'sfdx'
    sh script: "cd topLevel; ${sfdxLoc}/sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml"
}
Unfortunately, although it looks like Jenkins is now finding and running the sfdx tool, I get a new error:
TypeError: Cannot read property 'run' of undefined
at Object.<anonymous> (/var/lib/jenkins/.cache/sfdx/tmp/heroku-script-509584048:20:4)
at Module._compile (module.js:570:32)
at Object.Module._extensions..js (module.js:579:10)
at Module.load (module.js:487:32)
at tryModuleLoad (module.js:446:12)
at Function.Module._load (module.js:438:3)
at Module.runMain (module.js:604:10)
at run (bootstrap_node.js:394:7)
at startup (bootstrap_node.js:149:9)
at bootstrap_node.js:509:3
I ran into the same problem. I got to this workaround:
environment {
    GROOVY_HOME = tool name: 'Groovy-2.4.9', type: 'hudson.plugins.groovy.GroovyInstallation'
}
stages {
    stage('Run Groovy') {
        steps {
            bat "${groovy_home}/bin/groovy <script.name>"
        }
    }
}
Somehow the tool path is not added to PATH by default (as was customary on my Jenkins 1.6 server install). Adding ${groovy_home} when executing the bat command fixes that for me.
This way of calling a tool is basically borrowed from the scripted pipeline syntax.
I am using this for all my custom tools (not only Groovy).
The tool part:
tool name: 'Groovy-2.4.9', type: 'hudson.plugins.groovy.GroovyInstallation'
was generated by the snippet generator like you did.
According to the Jenkins users mailing list, work is still ongoing for a definitive solution, so my solution really is a workaround.
This is my first time commenting on Stack Overflow, but I've been looking for this answer for a few days and I think I have a potential solution, expanding on Fholst's answer.
The environment stanza may work for declarative syntax, but in a scripted pipeline you must use the withEnv() equivalent and pass in the tool via a GString, i.e. ${tool 'nameOfToolDefinedInGlobalTools'}.
For my particular use case, for reasons beyond my control, we do not have Maven installed on our Jenkins host machine, but there is one defined in the global tools configuration. This means I need to add mvn to the PATH before executing my sh commands within my steps. What I have been able to do is this:
withEnv(["PATH+MVN=${tool 'NameOfMavenTool'}/bin"]) {
    sh '''
        echo "PATH = ${PATH}"
    '''
}
This should give you what you need. Please ignore the triple single quotes on the sh line; I actually load several environment variables there and simply removed them from the snippet.
Hope this helps anyone who has been searching for this solution for days. I feel your pain. Cobbled this together from looking through the console output of a declarative pipeline script (if you use tools{} stanza it will show you how it builds those environment variables and wraps your subsequent declarative steps) and the following link: https://go.cloudbees.com/docs/cloudbees-documentation/use/automating-projects/jenkinsfile/
You may be having a problem because of the path to your sfdx install folder if you are on Windows. The Dreamhouse Jenkinsfile was written for a Linux shell or a Mac terminal, so some changes are necessary to make it work on Windows.
${sfdxLoc}/sfdx
Should be
\"${sfdxLoc}/sfdx\"
So that the command line handles any spaces properly.
https://wipdeveloper.com/2017/06/22/salesforce-dx-jenkins-jenkinsfile/

Echo off in Jenkins Console Output

I'm following a guide on how to sign an Android APK with Jenkins. I have a parametrized Jenkins job with KSTOREPWD and KEYPWD. Part of the job's configuration (Build -> Execute shell) takes those parameters and stores them as environment variables:
export KSTOREPWD=${KSTOREPWD}
export KEYPWD=${KEYPWD}
...
./gradlew assembleRelease
The problem is that when the build is over, anybody can access the build's "Console Output" and see what passwords were entered; part of that output:
08:06:57 + export KSTOREPWD=secretStorePwd
08:06:57 + KSTOREPWD=secretStorePwd
08:06:57 + export KEYPWD=secretPwd
08:06:57 + KEYPWD=secretPwd
So I'd like to suppress echo before the export commands and re-enable it after them.
By default, Jenkins launches the Execute Shell script with set -x. This causes all commands to be echoed.
You can type set +x before any command to temporarily override that behavior. Of course you will need set -x to start showing them again.
You can override this behavior for the whole script by putting the following at the top of the build step:
#!/bin/bash +x
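The toggling behavior can be seen in a plain shell script; the password value below is a placeholder for illustration:

```shell
set -x                              # Jenkins' default: trace every command
echo "this command is traced"
set +x                              # stop tracing before handling secrets
export KSTOREPWD='secretStorePwd'   # placeholder value; not traced to the console
set -x                              # re-enable tracing for the rest of the script
echo "tracing is back on"
```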
Here is an example of how to write the sh parameter in a Jenkinsfile with no output, in a more secure way, as suggested in the official documentation. The set +x does the main magic, as described in the answer above.
The single quotes cause the secret to be expanded by the shell as an environment variable. The double quotes are potentially less secure, as the secret is interpolated by Groovy, and so typical operating-system process listings (as well as Blue Ocean, and the pipeline steps tree in the classic UI) will accidentally disclose it:
Insecure, wrong usage:
node {
    withCredentials([string(credentialsId: 'mytoken', variable: 'TOKEN')]) {
        sh /* WRONG! */ """
            set +x
            curl -H 'Token: $TOKEN' https://some.api/
        """
    }
}
Correct usage ✅:
node {
    withCredentials([string(credentialsId: 'mytoken', variable: 'TOKEN')]) {
        sh '''
            set +x
            curl -H 'Token: $TOKEN' https://some.api/
        '''
    }
}
In your specific situation (using Gradle and Jenkins) you could also use a Password Parameter, following Gradle's naming pattern for environment variables (ORG_GRADLE_PROJECT_prop). Gradle will then set a prop property on your project.
In your case this would mean naming the parameters ORG_GRADLE_PROJECT_KSTOREPWD and ORG_GRADLE_PROJECT_KEYPWD.
You can then use the resulting properties in your build.gradle signing config like this:
signingConfigs {
    release {
        storeFile file(KEYSTORE)
        storePassword KSTOREPWD
        keyAlias ALIAS
        keyPassword KEYPWD
    }
}
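For reference, the mapping works because Gradle exposes any environment variable named ORG_GRADLE_PROJECT_<prop> as project property prop; a sketch with placeholder values (the gradlew invocation is commented out since it needs a real project):

```shell
# Gradle maps ORG_GRADLE_PROJECT_<prop> environment variables to project
# properties, so a Jenkins password parameter named ORG_GRADLE_PROJECT_KSTOREPWD
# surfaces as the KSTOREPWD property read by the signingConfigs block above.
export ORG_GRADLE_PROJECT_KSTOREPWD='secretStorePwd'   # placeholder value
export ORG_GRADLE_PROJECT_KEYPWD='secretPwd'           # placeholder value
# ./gradlew assembleRelease   # the build would now see KSTOREPWD and KEYPWD
echo "signing properties exported"
```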
BTW - I recommend using the Credentials Binding plugin for the KEYSTORE as well.
