I'm following a guideline on how to sign an Android APK with Jenkins. I have a parametrized Jenkins job with KSTOREPWD and KEYPWD. Part of the Jenkins job configuration (Build -> Execute shell) takes those parameters and stores them as environment variables:
export KSTOREPWD=${KSTOREPWD}
export KEYPWD=${KEYPWD}
...
./gradlew assembleRelease
The problem is that when the build is over, anybody can open the build's "Console Output" and see which passwords were entered; part of that output:
08:06:57 + export KSTOREPWD=secretStorePwd
08:06:57 + KSTOREPWD=secretStorePwd
08:06:57 + export KEYPWD=secretPwd
08:06:57 + KEYPWD=secretPwd
So I'd like to suppress echoing before the export commands and re-enable it after them.
By default, Jenkins launches the Execute Shell script with set -x, which causes all commands to be echoed.
You can type set +x before any command to temporarily override that behavior. Of course, you will need set -x to start showing them again.
You can override this behaviour for the whole script by putting the following at the top of the build step:
#!/bin/bash +x
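Applied to the build step from the question, a minimal sketch with echoing disabled only around the sensitive exports might look like this (all names are taken from the question):

set +x
export KSTOREPWD=${KSTOREPWD}
export KEYPWD=${KEYPWD}
set -x
...
./gradlew assembleRelease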
Here is an example of how to write the sh step in a Jenkinsfile with no output, in a more secure way, as suggested in the official documentation. The set +x does the main magic, as described in the answer above.
The single-quotes will cause the secret to be expanded by the shell as an environment variable. The double-quotes are potentially less secure as the secret is interpolated by Groovy, and so typical operating system process listings (as well as Blue Ocean, and the pipeline steps tree in the classic UI) will accidentally disclose it:
Insecure, wrong usage:
node {
    withCredentials([string(credentialsId: 'mytoken', variable: 'TOKEN')]) {
        sh /* WRONG! */ """
            set +x
            curl -H 'Token: $TOKEN' https://some.api/
        """
    }
}
Correct usage ✅:
node {
    withCredentials([string(credentialsId: 'mytoken', variable: 'TOKEN')]) {
        sh '''
            set +x
            curl -H 'Token: $TOKEN' https://some.api/
        '''
    }
}
In your specific situation (using Gradle and Jenkins) you could also use a Password Parameter, using Gradle's pattern for environment variables (ORG_GRADLE_PROJECT_prop). Gradle will then set a prop property on your project.
In your case this would look something like this:
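A sketch of the idea (the original answer showed this as a screenshot of the job configuration; the exact parameter names are an assumption, following the ORG_GRADLE_PROJECT_ pattern and the property names used below):

Password Parameter: ORG_GRADLE_PROJECT_KSTOREPWD
Password Parameter: ORG_GRADLE_PROJECT_KEYPWD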
And you can use it in your build.gradle like this:
signingConfigs {
    release {
        storeFile file(KEYSTORE)
        storePassword KSTOREPWD
        keyAlias ALIAS
        keyPassword KEYPWD
    }
}
BTW - I recommend using the Credentials Binding plugin for the KEYSTORE as well; a sketch is shown below.
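A rough pipeline sketch of that binding (the credential IDs are assumptions; the file binding exposes the keystore as a temporary file path, and the variable names follow the ORG_GRADLE_PROJECT_ pattern described above):

withCredentials([
    // secret file credential holding the keystore; the variable receives the path to a temporary copy
    file(credentialsId: 'android-keystore', variable: 'ORG_GRADLE_PROJECT_KEYSTORE'),
    // secret text credentials for the two passwords
    string(credentialsId: 'android-kstorepwd', variable: 'ORG_GRADLE_PROJECT_KSTOREPWD'),
    string(credentialsId: 'android-keypwd', variable: 'ORG_GRADLE_PROJECT_KEYPWD')
]) {
    sh './gradlew assembleRelease'
}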
Related
I have a legacy project in Jenkins that has to be pipelined (for later parallelization), hence I am moving from a simple tcsh script to a pipeline.
Running the script as
#!/bin/tcsh
source ./mysettings.sh
update
works, but the same pipeline step fails due to missing alias expansion:
stage ('update') {
    steps {
        // should be working but alias expansion fails
        sh 'tcsh -c "source ./mysettings.sh; alias; update"'
        // manually expanding the alias works fine
        sh 'tcsh -c "source ./mysettings.sh; alias; python update.py;"'
    }
}
Calling alias in the steps properly lists all the set aliases, so I can see them, but I cannot use them. I know that in bash, alias expansion has to be enabled explicitly:
#enable shell option for alias_expansion
shopt -s expand_aliases
but in csh/tcsh that should be taken care of by source.
What am I missing?
Found the solution:
sh '#!/bin/tcsh \n' +
   'source ./mysettings.sh \n' +
   'echo "Calling my alias" \n' +
   'my_alias \n'
Every line starting with sh launches a new shell, so the whole script has to go into a single sh call, including the line breaks.
Further adding to the confusion: the Jenkins documentation says it starts "a bash", but it actually launched /bin/sh, which in my case pointed to something else.
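For reference, the same thing can be written as one multi-line Groovy string, which reads a little more naturally (a sketch of the same idea; the shebang must stay on the first line of the script):

sh '''#!/bin/tcsh
source ./mysettings.sh
echo "Calling my alias"
my_alias
'''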
We are using Pact (https://pact.io/) in our project. The can-i-deploy check, which determines whether the deployment can be executed, is done like this:
Jenkins Environment Variables (see <JENKINS_URL>/configure)
PACT_BROKER_URL
PACT_RUBY_STANDALONE_VERSION
(PACT_RUBY_STANDALONE_VERSION from https://github.com/pact-foundation/pact-ruby-standalone/releases)
Jenkinsfile:
environment {
    SOURCE_BRANCH_NAME = sourceBranchName(env.BRANCH_NAME, env.CHANGE_BRANCH)
}
...
def sourceBranchName(String branchName, String changeBranchName) {
    return changeBranchName == null ? branchName : changeBranchName
}
...
stage('can-i-deploy') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'XXX', passwordVariable: 'PACT_BROKER_PASSWORD', usernameVariable: 'PACT_BROKER_USERNAME')]) {
            sh "curl -LO https://github.com/pact-foundation/pact-ruby-standalone/releases/download/v${PACT_RUBY_STANDALONE_VERSION}/pact-${PACT_RUBY_STANDALONE_VERSION}-linux-x86_64.tar.gz"
            sh "tar xzf pact-${PACT_RUBY_STANDALONE_VERSION}-linux-x86_64.tar.gz"
            echo "Performing can-i-deploy check"
            sh "pact/bin/./pact-broker can-i-deploy --broker-base-url=${PACT_BROKER_URL} --broker-username=${PACT_BROKER_USERNAME} --broker-password=${PACT_BROKER_PASSWORD} --pacticipant=project-frontend --latest=${env.SOURCE_BRANCH_NAME} --pacticipant=project-backend --latest=${env.SOURCE_BRANCH_NAME} --pacticipant=other-project-backend --latest=${env.SOURCE_BRANCH_NAME}"
        }
    }
}
Is there a more elegant way to do this?
I can't speak for Jenkins, but there are two things worth changing in the arguments sent to can-i-deploy:
It's not recommended to use the --latest flag. You should use --version to indicate the version of the application you are deploying and the --to flag to denote the target environment (--latest runs the risk of race conditions between builds, giving you false positives/negatives).
You don't need to specify the other compatible projects; can-i-deploy will automatically detect all dependent components.
So it would look more like this:
can-i-deploy --broker-base-url=${PACT_BROKER_URL} --broker-username=${PACT_BROKER_USERNAME} --broker-password=${PACT_BROKER_PASSWORD} --pacticipant=project-frontend --version some-sha-1234 --to prod
If you have access to Docker, you might prefer to use our container; for example:
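(A sketch only; the image name and exact invocation are from memory of the Pact CLI docs and should be double-checked there.)

docker run --rm \
  -e PACT_BROKER_BASE_URL \
  -e PACT_BROKER_USERNAME \
  -e PACT_BROKER_PASSWORD \
  pactfoundation/pact-cli:latest \
  pact-broker can-i-deploy --pacticipant=project-frontend --version some-sha-1234 --to prod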
P.S. If you simply export the following environment variables, you can also drop them from the argument list (see the sketch after this list):
PACT_BROKER_BASE_URL (please note the minor difference from what you're using)
PACT_BROKER_USERNAME
PACT_BROKER_PASSWORD
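Putting those pieces together, the stage from the question might shrink to something like this (a sketch; the credentialsId is copied from the question, and using GIT_COMMIT as the application version is an assumption — use whatever uniquely identifies the build, and your real environment name instead of prod):

stage('can-i-deploy') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'XXX', passwordVariable: 'PACT_BROKER_PASSWORD', usernameVariable: 'PACT_BROKER_USERNAME')]) {
            // PACT_BROKER_BASE_URL, PACT_BROKER_USERNAME and PACT_BROKER_PASSWORD are read from the
            // environment, so they no longer need to be passed as arguments
            sh '''
                export PACT_BROKER_BASE_URL="${PACT_BROKER_URL}"
                pact/bin/pact-broker can-i-deploy --pacticipant=project-frontend --version "${GIT_COMMIT}" --to prod
            '''
        }
    }
}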
When executing the following code in a Jenkins pipeline, a "The following steps that have been detected may have insecure interpolation of sensitive variables" warning is added to the build, with a link to https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#string-interpolation for an explanation.
powershell script: """
\$ErrorActionPreference = "Stop"
cd "${WORKSPACE}\\MyDirectory"
& .\\myScript.ps1 -user "${creds_USR}" -passw "${creds_PSW}"
"""
I've already tried to change it as described in the link above, but then the variables no longer seem to be replaced.
powershell script: '''
    \$ErrorActionPreference = \"Stop\"
    cd \"$WORKSPACE\\MyDirectory\"
    & .\\myScript.ps1 -user \"$creds_USR\" -passw \"$creds_PSW\"
'''
Would somebody know a working solution for this please?
Presumably you have a block like this that's generating those values:
environment {
    creds = credentials('some-credentials')
}
So your build environment already exposes those variables to PowerShell. Rather than interpolating the string that constitutes the PowerShell script, just write the script so that it pulls the data from the environment.
powershell script: '''\
    $ErrorActionPreference = "Stop"
    cd "$Env:WORKSPACE\MyDirectory"
    & .\myScript.ps1 -user "$Env:creds_USR" -passw "$Env:creds_PSW"
'''
I have a custom tool defined within Jenkins via the Custom Tools plugin. If I create a freestyle project, the "Install custom tools" option correctly finds and uses the tool (Salesforce DX) during execution.
However, I cannot find a way to do the same via a pipeline file. I have used the pipeline syntax snippet generator to get:
tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
I have put that into my stage definition:
stage('FetchMetadata') {
    print 'Collect Prod metadata via SFDX'
    tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
    sh('sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml')
}
but I get an error message stating line 2: sfdx: command not found
Is there some other way I should be using this snippet?
Full Jenkinsfile for info:
node {
    currentBuild.result = 'SUCCESS'
    try {
        stage('CheckoutRepo') {
            print 'Get the latest code from the MASTER branch'
            checkout scm
        }
        stage('FetchMetadata') {
            print 'Collect Prod metadata via SFDX'
            tool name: 'sfdx', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
            sh('sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml')
        }
        stage('ConvertMetadata') {
            print 'Unzip retrieved metadata file'
            sh('unzip unpackaged.zip .')
            print 'Convert metadata to SFDX format'
            sh('/usr/local/bin/sfdx force:mdapi:convert -r metadata/unpackaged/ -d force-app/')
        }
        stage('CommitChanges') {
            sh('git add --all')
            print 'Check if any changes need committing'
            sh('if ! git diff-index --quiet HEAD --; then echo "changes found - pushing to repo"; git commit -m "Autocommit from Prod # $(date +%H:%M:%S\' \'%d/%m/%Y)"; else echo "no changes found"; fi')
            sshagent(['xxx-xxx-xxx-xxx']) {
                sh('git push -u origin master')
            }
        }
    }
    catch (err) {
        currentBuild.result = 'FAILURE'
        print 'Build failed'
        error(err)
    }
}
UPDATE
I have made some progress using this example Jenkinsfile
My stage now looks like this:
stage('FetchMetadata') {
    print 'Collect Prod metadata via SFDX'
    def sfdxLoc = tool 'sfdx'
    sh script: "cd topLevel; ${sfdxLoc}/sfdx force:mdapi:retrieve -r metadata/ -u DevHub -k ./metadata/package.xml"
}
Unfortunately, although it looks like Jenkins is now finding and running the sfdx tool, I get a new error:
TypeError: Cannot read property 'run' of undefined
    at Object.<anonymous> (/var/lib/jenkins/.cache/sfdx/tmp/heroku-script-509584048:20:4)
    at Module._compile (module.js:570:32)
    at Object.Module._extensions..js (module.js:579:10)
    at Module.load (module.js:487:32)
    at tryModuleLoad (module.js:446:12)
    at Function.Module._load (module.js:438:3)
    at Module.runMain (module.js:604:10)
    at run (bootstrap_node.js:394:7)
    at startup (bootstrap_node.js:149:9)
    at bootstrap_node.js:509:3
I ran into the same problem. I got to this workaround:
environment {
    GROOVY_HOME = tool name: 'Groovy-2.4.9', type: 'hudson.plugins.groovy.GroovyInstallation'
}
stages {
    stage('Run Groovy') {
        steps {
            bat "${groovy_home}/bin/groovy <script.name>"
        }
    }
}
Somehow the tool path is not added to the PATH by default (as was customary on my Jenkins 1.6 server install). Adding ${groovy_home} when executing the bat command fixes that for me.
This way of calling a tool is basically borrowed from the scripted pipeline syntax.
I am using this for all my custom tools (not only groovy).
The tool part:
tool name: 'Groovy-2.4.9', type: 'hudson.plugins.groovy.GroovyInstallation'
was generated by the snippet generator like you did.
According to the Jenkins users mailing list, work is still ongoing for a definitive solution, so my solution really is a work around.
This is my first time commenting on Stack Overflow, but I've been looking for this answer for a few days and I think I have a potential solution, expanding on Fholst's answer. That environment stanza may work for declarative syntax, but in a scripted pipeline you must use the withEnv() equivalent and pass in the tool path via a GString, i.e. ${tool 'nameOfToolDefinedInGlobalTools'}. For my particular use case, for reasons beyond my control, we do not have Maven installed on our Jenkins host machine, but there is one defined in the global tools configuration. This means I need to add mvn to the PATH before executing my sh commands within my steps. What I have been able to do is this:
withEnv(["PATH+MVN=${tool 'NameOfMavenTool'}/bin"]){
sh '''
echo "PATH = ${PATH}"
'''
}
This should give you what you need. Please ignore the triple single quotes on the sh line; I actually have several environment variables loaded and simply removed them from my snippet.
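For example, inside that block a step like the following should then find Maven on the PATH (mvn --version here is just an illustration):

withEnv(["PATH+MVN=${tool 'NameOfMavenTool'}/bin"]) {
    sh 'mvn --version'
}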
Hope this helps anyone who has been searching for this solution for days. I feel your pain. I cobbled this together by looking through the console output of a declarative pipeline script (if you use the tools {} stanza, it will show you how it builds those environment variables and wraps your subsequent declarative steps) and the following link: https://go.cloudbees.com/docs/cloudbees-documentation/use/automating-projects/jenkinsfile/
You may be having a problem because of the path to your sfdx install folder if you are on Windows. The Dreamhouse Jenkinsfile was written for a Linux shell or macOS terminal, so some changes are necessary to make it work on Windows.
${sfdxLoc}/sfdx
Should be
\"${sfdxLoc}/sfdx\"
So that the command line handles any spaces properly.
https://wipdeveloper.com/2017/06/22/salesforce-dx-jenkins-jenkinsfile/
Given the following pipeline:
stages {
    stage ("Checkout SCM") {
        steps {
            checkout scm
            sh "echo ${CHANGE_AUTHOR_EMAIL}"
            sh "echo ${CHANGE_ID}"
        }
    }
}
Why do these variables fail to resolve and provide a value?
Eventually I want to use these environment variables to send an email and merge a pull request:
post {
    failure {
        emailext (
            attachLog: true,
            subject: '[Jenkins] $PROJECT_NAME :: Build #$BUILD_NUMBER :: build failure',
            to: '$CHANGE_AUTHOR_EMAIL',
            replyTo: 'iadar#...',
            body: '''<p>You are receiving this email because your pull request was involved in a failed build. Check the attached log file, or the console output at: $BUILD_URL to view the build results.</p>'''
        )
    }
}
and
sh "curl -X PUT -d '{\'commit_title\': \'Merge pull request\'}' <git url>/pulls/${CHANGE_ID}/merge?access_token=<token>"
Oddly enough, $PROJECT_NAME, $BUILD_NUMBER, $BUILD_URL do work...
Update: this may be an open bug... https://issues.jenkins-ci.org/browse/JENKINS-40486 :-(
Is there any workaround to get these values?
You need to be careful about how you refer to environment variables depending on whether it is shell or Groovy code, and how you are quoting.
When you do sh "echo ${CHANGE_ID}", what actually happens is that Groovy interpolates the string first, replacing ${CHANGE_ID} with the Groovy property CHANGE_ID, and that's where your error message comes from. In Groovy, the environment variables are exposed via the env object.
If you want to refer to the environment variables directly from your shell script, you either have to interpolate with env, use single quotes, or escape the dollar sign. All of the following should work:
sh 'echo $CHANGE_ID'
sh "echo \$CHANGE_ID"
sh "echo ${env.CHANGE_ID}"
For anyone who may come across this, these variables are available only if the checkbox for Build origin PRs (merged with base branch) was checked (this is in a multi-branch job).
See more in this other Jenkins issue: https://issues.jenkins-ci.org/browse/JENKINS-39838