Jenkins withCredentials in dynamically selected parameters

Could you help me with a little trouble?
I tried to find a solution with Jenkins and your wonderful plugin, uno-choice, but I couldn't.
I have a very simple script:
#!/usr/bin/env groovy
def sout = new StringBuffer(), serr = new StringBuffer()
def proc = '/var/lib/jenkins/script.sh location'.execute()
proc.consumeProcessOutput(sout, serr)
proc.waitForOrKill(1000)

def credential(name) {
    def v
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: name, usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
        v = "${env.USERNAME}"
    }
    return v
}

def key = credential('aws_prod_api')
String str = sout.toString()
String s = str.trim()
String[] items = s.split(",")
def v1 = Arrays.asList(items)
return v1
In general, I want to get AWS credentials that are stored in Jenkins from a bash script and then do something with them.
I want to use withCredentials in the block that builds the select list, but I don't understand how I can do it.
Could you help me with it?
I would really appreciate it.
I tried using withCredentials inside Groovy, but I got this error:
Fallback to default script... groovy.lang.MissingMethodException: No signature of method: Script1.withCredentials() is applicable for argument types: (java.util.ArrayList, Script1$_credential_closure1) values: [[[$class:UsernamePasswordMultiBinding, credentialsId:aws_prod_api, ...]], ...]
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:81)
at ...

That's because withCredentials does not exist in the scope of Script1; it exists in the scope of the Jenkinsfile DSL. You need to pass it in.
I suggest converting your script to functions, then passing the Jenkinsfile DSL through to your Groovy code.
def doAwsStuff(dsl) {
    ...
    def key = credential(dsl, 'aws_prod_api')
    ...
}

def credential(dsl, name) {
    def v
    dsl.withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: name, usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
        v = "${dsl.env.USERNAME}"   // env also has to be reached through the DSL object
    }
    return v
}
and then call it from your Jenkinsfile with:
def result = MyAwsStuff.doAwsStuff(this)
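For completeness, here is a minimal Jenkinsfile sketch of that call, assuming the functions above live in a file named MyAwsStuff.groovy in the workspace and are picked up with the load step (the file name and location are assumptions):
// Jenkinsfile (scripted) -- sketch; MyAwsStuff.groovy must end with 'return this'
// so that 'load' hands back the script object whose methods we call
node {
    def MyAwsStuff = load 'MyAwsStuff.groovy'
    def result = MyAwsStuff.doAwsStuff(this)   // pass the pipeline DSL in as 'dsl'
    echo "Result: ${result}"
}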

Related

How to create withXCredentials that wraps Jenkins withCredentials using Closure in Jenkins shared variable?

I want to have this code with exactly this syntax in my pipeline script:
withXCredentials(id: 'some-cred-id', usernameVar: 'USER', passwordVar: 'PASS') {
    // do some stuff with $USER and $PASS
    echo "${env.USER} - ${env.PASS}"
}
Note that you can put any code within withXCredentials to be executed. withXCredentials.groovy resides in my Jenkins shared library under the vars folder, and it will use Jenkins' original withCredentials:
//withXCredentials.groovy
def userVar = params.usernameVar
def passwordVar = params.passwordVar
def credentialsId = params.credentialsId
withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: credentialsId, usernameVariable: usernameVar, passwordVariable: passwordVar]]) {
    body()
}
I am still learning advanced groovy stuff but I can't work out how to do this.
Please note:
My question is more about the Groovy syntax and using Closure, and the answer here is not what I am after. With that solution, I need to instantiate the class first and then call the method, so I'm trying to avoid doing something like this:
new WithXCredentials(this).doSomething(credentialsId, userVar, passwordVar)
The Jenkins documentation has an example of using a closure:
// vars/windows.groovy
def call(Closure body) {
    node('windows') {
        body()
    }
}

// the above can be called like this:
windows {
    bat "cmd /?"
}
But it doesn't explain how to pass parameters, like this:
windows(param1, param2) {
    bat "cmd /?"
}
So after digging through the internet I finally found the answer. In case anyone needs the same thing, the following code will work:
// filename in shared lib: /vars/withXCredentials.groovy
def call(map, Closure body) {
    def credentialsId = map.credentialsId
    def passwordVariable = map.passwordVariable
    def usernameVariable = map.usernameVariable
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: credentialsId, usernameVariable: usernameVariable, passwordVariable: passwordVariable]]) {
        echo 'INSIDE withXCredentials'
        echo env."${passwordVariable}"
        echo env."${usernameVariable}"
        body()
    }
}
With this you can have the following in your pipeline:
node('name') {
    withXCredentials([credentialsId: 'some-credential', passwordVariable: 'my_password',
                      usernameVariable: 'my_username']) {
        echo 'Outside withXCredentials'
        checkout_some_code username: "$env.my_username", password: "$env.my_password"
    }
}
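A small follow-up on the syntax: because Groovy collects named arguments into a single Map that is passed as the first parameter, the square brackets can be dropped, which gives almost exactly the call style asked for in the question:
withXCredentials(credentialsId: 'some-credential', usernameVariable: 'my_username', passwordVariable: 'my_password') {
    echo "$env.my_username"
}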

Groovy shared library testing pipeline step method with Spock

I have a shared library that calls a pipeline step method (withCredentials). I am trying to test that the withCredentials method is called correctly with the sh script when calling myMethodToTest, but I am facing an error:
class myClass implements Serializable {
    def steps
    public myClass(steps) { this.steps = steps }
    public void myMethodToTest(script, String credentialsId) {
        steps.withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "${credentialsId}", usernameVariable: 'USR', passwordVariable: 'PWD']]) {
            steps.sh """
                export USR=${script.USR}
                export PWD=${script.PWD}
                $mvn -X clean deploy
            """
        }
    }
}
//Mocking
class Steps {
    def withCredentials(List args, Closure closure) {}
}

class Script {
    public Map env = [:]
}
//Test case
def "testMyMethod"() {
    given:
    def steps = Mock(Steps)
    def script = Mock(Script)
    def myClassObj = new myClass(steps)
    script.env['USR'] = "test-user"

    when:
    def result = myClassObj.myMethodToTest(script, credId)

    then:
    1 * steps.withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "mycredId", usernameVariable: 'USR', passwordVariable: 'PWD']])
    1 * steps.sh(shString)

    where:
    credId | shString
    "mycredId" | "export USR='test-user'"
//Error
Too few invocations for:
1 * steps.withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "mycredId", usernameVariable: ‘USR’, passwordVariable: ‘PWD’]]) (0 invocations)
Unmatched invocations (ordered by similarity):
1 * steps.withCredentials([['$class':'UsernamePasswordMultiBinding', 'credentialsId':mycredId, 'usernameVariable’:’USR’, 'passwordVariable':'PWD’]]
You have a whole bunch of subtle and not so subtle errors in your code, in both the test and the application classes. So let me provide a new MCVE (minimal, complete, verifiable example) in which I fixed everything and commented a few crucial parts inside the test:
package de.scrum_master.stackoverflow.q59442086

class Script {
    public Map env = [:]
}

package de.scrum_master.stackoverflow.q59442086

class Steps {
    def withCredentials(List args, Closure closure) {
        println "withCredentials: $args, " + closure
        closure()
    }

    def sh(String script) {
        println "sh: $script"
    }
}
package de.scrum_master.stackoverflow.q59442086

class MyClass implements Serializable {
    Steps steps
    String mvn = "/my/path/mvn"

    MyClass(steps) {
        this.steps = steps
    }

    void myMethodToTest(script, String credentialsId) {
        steps.withCredentials(
            [
                [
                    class: "UsernamePasswordMultiBinding",
                    credentialsId: "$credentialsId",
                    usernameVariable: "USR",
                    passwordVariable: "PWD"
                ]
            ]
        ) {
            steps.sh """
                export USR=${script.env["USR"]}
                export PWD=${script.env["PWD"]}
                $mvn -X clean deploy
            """.stripIndent()
        }
    }
}
package de.scrum_master.stackoverflow.q59442086

import spock.lang.Specification

class MyClassTest extends Specification {
    def "testMyMethod"() {
        given:
        // Cannot use mock here because mock would have 'env' set to null. Furthermore,
        // we want to test the side effect of 'steps.sh()' being called from within the
        // closure, which also would not work with a mock. Thus, we need a spy.
        def steps = Spy(Steps)
        def myClass = new MyClass(steps)
        def script = new Script()
        script.env['USR'] = "test-user"

        when:
        myClass.myMethodToTest(script, credId)

        then:
        1 * steps.withCredentials(
            [
                [
                    class: 'UsernamePasswordMultiBinding',
                    credentialsId: credId,
                    usernameVariable: 'USR',
                    passwordVariable: 'PWD'
                ]
            ],
            _ // Don't forget the closure parameter!
        )
        // Here we need to test for a substring via argument constraint
        1 * steps.sh({ it.contains(shString) })

        where:
        credId | shString
        "mycredId" | "export USR=test-user"
    }
}

Active Choices Parameter with Credentials

I'm trying to get access to the credentials stored in Jenkins without having to hardcode them in the script itself.
#!/usr/bin/env groovy
withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'GroovyAWSScMgr', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
    return ["${env.AWS_ACCESS_KEY_ID}"]
}
I've tried:
return [AWS_ACCESS_KEY_ID]
return [env.AWS_ACCESS_KEY_ID]
return ["${env.AWS_ACCESS_KEY_ID}"]
The result continues to be null.
You can try this:
import jenkins.model.*

credentialsId = 'GroovyAWSScMgr'
def creds = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(
    com.cloudbees.plugins.credentials.common.StandardUsernameCredentials.class, Jenkins.instance, null, null
).find { it.id == credentialsId }
return [creds.username]
You can use creds.username and creds.password in your script.
I'm not sure if it is secure.
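If the lookup is used directly as an Active Choices script, it helps to guard against a missing credential id so the parameter falls back to an empty list instead of failing. A sketch based on the code above (for AWS-type credentials you would look up the credential class provided by the CloudBees AWS Credentials plugin instead of StandardUsernameCredentials; treat that class name as an assumption to verify against your installation):
import jenkins.model.*
import com.cloudbees.plugins.credentials.CredentialsProvider
import com.cloudbees.plugins.credentials.common.StandardUsernameCredentials

def credentialsId = 'GroovyAWSScMgr'
def creds = CredentialsProvider.lookupCredentials(
        StandardUsernameCredentials.class, Jenkins.instance, null, null)
        .find { it.id == credentialsId }

// Active Choices expects a List of options; return an empty list if the id was not found
return creds ? [creds.username] : []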
I tried something similar in an Active Choices Parameter for one of my jobs and nothing worked. Instead, I used the approach below to avoid hardcoding credentials.
Define your credentials (in your case AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, with appropriate values) as environment variables in Manage Jenkins -> Configure System -> Global properties, and retrieve them in your script:
import jenkins.model.*

instance = Jenkins.getInstance()
globalNodeProperties = instance.getGlobalNodeProperties()
aws_access_key_id = ''
aws_secret_key = ''
globalNodeProperties.each {
    envVars = it.getEnvVars()
    if (envVars.get('AWS_ACCESS_KEY_ID') != null) {
        aws_access_key_id = envVars.get('AWS_ACCESS_KEY_ID')
    }
    if (envVars.get('AWS_SECRET_ACCESS_KEY') != null) {
        aws_secret_key = envVars.get('AWS_SECRET_ACCESS_KEY')
    }
}
You can refer to them in your script as ${aws_access_key_id} and ${aws_secret_key}.
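In an Active Choices script the retrieved value can then be returned as the choice list, for example:
// Active Choices scripts must return a List of options
return [aws_access_key_id]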

Jenkins Pipelines: How to use withCredentials() from a shared-variable script

I'd like to use a withCredentials() block in a shared-variable ("vars/") script rather than directly in the Jenkins pipeline because this is a lower-level semantic of a particular library, and also may or may not be required depending on the situation. However, withCredentials (or, at least, that signature of it) doesn't appear to be in scope.
script:
def credentials = [
[$class: 'UsernamePasswordMultiBinding', credentialsId: '6a55c310-aaf9-4822-bf41-5500cd82af4e', passwordVariable: 'GERRIT_PASSWORD', usernameVariable: 'GERRIT_USERNAME'],
[$class: 'StringBinding', credentialsId: 'SVC_SWREGISTRY_PASSWORD', variable: 'SVC_SWREGISTRY_PASSWORD']
]
withCredentials(credentials) {
// ...
}
Console:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: BuildagentInstallAndRun.withCredentials() is applicable for argument types: (java.util.ArrayList, org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [[[$class:UsernamePasswordMultiBinding, credentialsId:6a55c310-aaf9-4822-bf41-5500cd82af4e, ...], ...], ...]
Has anyone had any success with this?
I'm using a shared library rather than a shared variable, but I guess it is a similar situation.
I'm not using the $class parameter; instead, I directly call one of the binding functions suggested by the pipeline snippet generator (which lists the available bindings). In the example below, I use the usernameColonPassword binding.
In the pipeline, I instantiate the Utilities class and pass this to the constructor. Then, in the library, I use the steps object to access the pipeline steps (such as withCredentials or usernameColonPassword).
class Utilities implements Serializable {
    def steps

    Utilities(steps) {
        this.steps = steps
    }

    def doArchiveToNexus(String credentials, String artifact, String artifact_registry_path) {
        try {
            this.steps.withCredentials([steps.usernameColonPassword(credentialsId: credentials, variable: 'JENKINS_USER')]) {
                this.steps.sh "curl --user " + '${JENKINS_USER}' + " --upload-file ${artifact} ${artifact_registry_path}"
            }
        } catch (error) {
            this.steps.echo error.getMessage()
            throw error
        }
    }
}
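A minimal sketch of the pipeline side described above, assuming the class sits under src/ of a shared library registered as my-shared-lib (the library name, credentials id, and paths are placeholders):
// Jenkinsfile (scripted) -- names below are placeholders
@Library('my-shared-lib') _

node {
    def utils = new Utilities(this)   // 'this' is the pipeline DSL, stored as 'steps' inside the class
    utils.doArchiveToNexus('nexus-upload-creds',
                           'target/app.jar',
                           'https://nexus.example.com/repository/releases/app.jar')
}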
You can try the following:
import jenkins.model.*

credentialsId = '6a55c310-aaf9-4822-bf41-5500cd82af4e'
def creds = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(
    com.cloudbees.plugins.credentials.common.StandardUsernameCredentials.class, Jenkins.instance, null, null
).find { it.id == credentialsId }
println creds.username
println creds.password
But it is not secure; everything will end up in the console log.
I was able to obtain credentials inside the shared library, with proper password masking, with code like this:
class Utilities implements Serializable {
    def steps
    def credentialsId   // id of the credentials to bind

    Utilities(steps) {
        this.steps = steps
    }

    def execute() {
        this.steps.withCredentials(
            bindings: [
                this.steps.usernameColonPassword(
                    credentialsId: this.credentialsId,
                    variable: "unameColonPwd")
            ]) {
            this.steps.sh "echo ${this.steps.env.unameColonPwd}"
        }
    }
}

Correct way to structure a jenkins groovy pipeline script

I wrote a pipeline that works with Jenkins, but as a newbie to Jenkins scripting there are a lot of things that are not clear to me. Here's the whole script; I'll list the issues below.
SCRIPT:
node()
{
    def libName = "PROJECT"
    def slnPath = pwd();
    def slnName = "${slnPath}\\${libName}.sln"
    def webProject = "${slnPath}\\PROJECT.Web\\PROJECT.Web.csproj"
    def profile = getProperty("profiles");
    def version = getProperty("Version");
    def deployFolder = "${slnPath}Deploy";
    def buildRevision = "";
    def msbHome = "C:\\Program Files (x86)\\Microsoft Visual Studio\\2017\\Professional\\MSBuild\\15.0\\Bin\\msbuild.exe"
    def msdHome = "C:\\Program Files (x86)\\IIS\\Microsoft Web Deploy V3\\msdeploy.exe"
    def nuget = "F:\\NugetBin\\nuget.exe";
    def assemblyScript = "F:\\Build\\Tools\\AssemblyInfoUpdatePowershellScript\\SetAssemblyVersion.ps1";
    def webserverName = "192.168.0.116";
    def buildName = "PROJECT";
    def filenameBase = "PROJECT";

    stage('SCM update')
    {
        checkout([$class: 'SubversionSCM', additionalCredentials: [], excludedCommitMessages: '', excludedRegions: '', excludedRevprop: '', excludedUsers: '', filterChangelog: false, ignoreDirPropChanges: false, includedRegions: '', locations: [[credentialsId: '08ae9e8c-8db8-43e1-b081-eb352eb14d11', depthOption: 'infinity', ignoreExternalsOption: true, local: '.', remote: 'http://someurl:18080/svn/Prod/Projects/PROJECT/PROJECT/trunk']], workspaceUpdater: [$class: 'UpdateWithRevertUpdater']])
    }
    stage('SCM Revision')
    {
        bat("svn upgrade");
        bat("svn info \"${slnPath}\" >revision.txt");
        for (String i : readFile('revision.txt').split("\r?\n"))
        {
            if (i.contains("Last Changed Rev: "))
            {
                def splitted = i.split(": ")
                echo "Revisione : " + splitted[1];
                buildName += "." + splitted[1];
                currentBuild.displayName = buildName;
                buildRevision += version + "." + splitted[1];
            }
        }
    }
    stage("AssemblyInfo update")
    {
        powerShell("${assemblyScript} ${buildRevision} -path .")
    }
    stage('Nuget restore')
    {
        bat("${nuget} restore \"${slnName}\"")
    }
    stage('Main build')
    {
        bat("\"${msbHome}\" \"${slnName}\" /p:Configuration=Release /p:PublishProfile=Release /p:DeployOnBuild=true /p:Profile=Release ");
        stash includes: 'Deploy/Web/**', name: 'web_artifact'
        stash includes: 'PROJECT.Web/Web.*', name: 'web_config_files'
        stash includes: 'output/client/release/**', name: 'client_artifact'
        stash includes: 'PROJECT.WPF/App.*', name: 'client_config_files'
        stash includes: 'PROJECT.WPF/Setup//**', name: 'client_setup'
    }
    stage('Profile\'s customizations')
    {
        if (profile != "")
        {
            def buildProfile = profile.split(',');
            def stepsForParallel = buildProfile.collectEntries {
                ["echoing ${it}": performTransformation(it, filenameBase, buildRevision)]
            }
            parallel stepsForParallel;
        }
    }
    post
    {
        always
        {
            echo "mimmo";
        }
    }
}

def powerShell(psCmd) {
    bat "powershell.exe -NonInteractive -ExecutionPolicy Bypass -Command \"\$ErrorActionPreference='Stop';[Console]::OutputEncoding=[System.Text.Encoding]::UTF8;$psCmd;EXIT \$global:LastExitCode\""
}

def performTransformation(profile, filename, buildRevision) {
    return {
        node {
            def ctt = "F:\\Build\\Tools\\ConfigTransformationTool\\ctt.exe";
            def nsiTool = "F:\\Build\\Tools\\NSIS\\makensis.exe";
            def slnPath = pwd();
            unstash 'web_artifact'
            unstash 'web_config_files'
            def source = 'Deploy/Web/Web.config';
            def transform = 'PROJECT.Web\\web.' + profile + '.config';
            bat("\"${ctt}\" i s:\"${source}\" t:\"${transform}\" d:\"${source}\"")
            def fname = filename + "_" + profile + "_" + buildRevision + "_web.zip";
            if (fileExists(fname))
                bat("del " + fname);
            zip(zipFile: fname, dir: "Deploy\\Web")
            archiveArtifacts artifacts: fname
            //Now I generate the client part
            unstash 'client_artifact'
            unstash 'client_config_files'
            unstash 'client_setup'
            def sourceClient = 'output/client/release/PROJECT.WPF.exe.config';
            def transformClient = 'PROJECT.WPF/App.' + profile + '.config';
            bat("\"${ctt}\" i s:\"${sourceClient}\" t:\"${transformClient}\" d:\"${sourceClient}\"")
            def directory = new File(pwd() + "\\output\\installer\\")
            if (!directory.exists())
            {
                bat("mkdir output\\installer");
            }
            directory = new File(pwd() + "\\output\\installer\\${profile}")
            if (!directory.exists())
            {
                echo " directory does not exist";
                bat("mkdir output\\installer\\${profile}");
            }
            else
            {
                echo " directory exists";
            }
            def filename2 = filename + "_" + profile + "_" + buildRevision + "_client.zip";
            bat("${nsiTool} /DAPP_VERSION=${buildRevision} /DDEST_FOLDER=\"${slnPath}\\output\\installer\\${profile}\" /DTARGET=\"${profile}\" /DSOURCE_FILES=\"${slnPath}\\output\\client\\release\" \"${slnPath}\\PROJECT.WPF\\Setup\\setup.nsi\" ");
            if (fileExists(filename2))
                bat("del " + filename2);
            zip(zipFile: filename2, dir: "output\\installer\\" + profile);
            archiveArtifacts artifacts: filename2
        }
    }
}
Here are my questions:
I've seen some scripts where everything is wrapped in a pipeline {} block; is this necessary, or does the Jenkins Pipeline plugin add it?
I really dislike having all those definitions inside the node and then having them replicated below.
I don't see any parallelism in the Jenkins workflow, even though I have 4 executors idle.
I'm not able to use the post pipeline event to clean the workspace (right now it's just an echo).
There are two types of pipeline. Straight Groovy like you have written is referred to as a scripted pipeline. The style that has the pipeline {} block around it is a declarative pipeline. Declarative tends to be easier for newer Pipeline users and is a good choice for starting out; many pipelines don't need the complexity that scripted allows.
This is Groovy. If you want to declare a bunch of variables, you have to do it somewhere; otherwise you hard-code those values somewhere in your script. In Groovy you don't HAVE to declare every variable, but you have to define it somewhere, and unless you know how the declaration affects scope, you should just declare them. Most programming languages require some kind of variable declaration, especially when you have to worry about scope, so I don't see this as a problem. I think it is very clean to define all of the variable values in one place at the top; it is easier for maintenance.
At first glance, your parallel execution looks like it should work, but unless I set this up and run it, it is hard to say. It could be that the parallel parts run fast enough that the UI doesn't update. You should be able to see in the console output whether these are running in parallel.
The post block is not available in scripted pipelines; it is part of the declarative pipeline syntax. In a scripted pipeline, to do similar things you have to use try/catch/finally to catch errors and run post-type steps.
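A minimal scripted-pipeline sketch of that try/catch approach, using the core deleteDir step for workspace cleanup (cleanWs would require the Workspace Cleanup plugin):
node {
    try {
        stage('Main build') {
            // ... your existing stages go here ...
        }
    } catch (err) {
        currentBuild.result = 'FAILURE'
        throw err
    } finally {
        // runs whether the build succeeded or failed, like a declarative 'post { always { ... } }'
        echo "cleaning workspace"
        deleteDir()
    }
}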
