Jenkins Active Choices plugin - how to get value of password parameter

I have a parameterized Jenkins job with 2 parameters:
The 1st job parameter is APIKEY, of type 'Password parameter'.
The 2nd job parameter is SERVICE, of type 'Active Choices Reactive Parameter' (single select), referencing the parameter APIKEY and using the following Groovy script, which returns the value of the APIKEY parameter in the single-select UI control:
[ APIKEY ]
When I start a build of this job, the value offered in the single-select UI control for the SERVICE parameter is the garbled (encrypted?) value of APIKEY.
What I want is to be able to use the actual (decrypted) value of the entered APIKEY password parameter in the script code of the SERVICE parameter.
I tried decrypting the garbled APIKEY value using hudson.util.Secret as below, but with no luck:
def apikey = hudson.util.Secret.fromString(APIKEY).getPlainText()
Is there any way to get the actual password parameter value from the Active Choices Reactive Parameter Groovy script code?

After trying this out a little more, it turns out this is working properly after all, but only when the password parameter is entered manually, not with the default password parameter value (not sure if this is a bug or a feature).
The first time the job is run, the default password parameter value provided is garbled, but entering the value again in the password field then gives the correct value in the Groovy script.
This worked for me:
Run the job build.
At this point the APIKEY value in the Groovy script code of the SERVICE field is not evaluated correctly; it is the garbled value.
Enter the correct value in the APIKEY password parameter field, e.g. "abc123".
Switch focus to the SERVICE field.
The SERVICE field's Groovy code now executes and shows the actual entered value of APIKEY: "abc123".
Since in my use case entering APIKEY is mandatory every time the job is built, this is good enough for me.
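For completeness, the SERVICE parameter script that works for me after that manual step looks roughly like this (a minimal sketch, reusing the decryption call from the question):
// Active Choices Reactive Parameter script for SERVICE, referencing APIKEY.
// Note: per the steps above, this only yields the real value once APIKEY has been typed in manually.
def apikey = hudson.util.Secret.fromString(APIKEY).getPlainText()
return [apikey]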

This is an old topic, but I found a solution, so I'll add it here in case anyone else still needs it. This was working code, but I sanitized it for publication.
This Groovy script runs in an Active Choices Reactive Parameter. The task is to provide a list of the build versions available to deploy from an internal Artifactory archive. The API key needed for the REST call is stored as Secret Text in our Jenkins instance. So this code reads from the Credentials plugin's store to find the secret text, then adds it to the header of the HTTP request.
This is a clunky solution. There is a much more elegant withCredentials method for Groovy, but it may only work in Jenkins pipelines; I didn't find a way to use it in this parameter (see the sketch after the code below).
This solution also does not use HTTPBuilder, which would have been simpler, but it wasn't available in our Groovy plugin.
import org.apache.http.client.methods.*
import org.apache.http.impl.client.*
import groovy.json.JsonSlurper
import jenkins.model.Jenkins

def APP_FULL_NAME = "My.Project.Name"
def request = new HttpGet("https://fakeDns/artifactory/api/search/versions?r=releases&a=" + APP_FULL_NAME)

// Look up all credentials stored via the Jenkins Credentials plugin.
def jenkinsCredentials = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(
        com.cloudbees.plugins.credentials.Credentials.class,
        Jenkins.instance,
        null,
        null
)

// Find the Secret Text credential holding the Artifactory API key.
def apiKey
for (creds in jenkinsCredentials) {
    if (creds.id == "my_target_api_key") {
        apiKey = creds.secret.getPlainText()
        break
    }
}

request.addHeader("X-API-KEY", apiKey)
def responseString = new DefaultHttpClient().execute(request, new BasicResponseHandler())

// The Artifactory versions API returns JSON like {"results":[{"version":"1.2.3"}, ...]}
def branchList = new JsonSlurper().parseText(responseString)
def myList = []
branchList.results.each { myList << it }
return myList.version
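For comparison, in a scripted pipeline the same Secret Text credential could be fetched with withCredentials instead (a rough sketch, reusing the credential ID and URL from the code above; as noted, I did not find a way to use this inside the parameter script itself):
// Pipeline-only alternative (sketch): the Credentials Binding plugin resolves the
// Secret Text credential and exposes it as an environment variable inside the block.
withCredentials([string(credentialsId: 'my_target_api_key', variable: 'API_KEY')]) {
    sh 'curl -s -H "X-API-KEY: $API_KEY" "https://fakeDns/artifactory/api/search/versions?r=releases&a=My.Project.Name"'
}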

Related

AWS CDK DocDB::DBCluster fails with 'not a valid password'

I am trying to use AWS CDK (Java) to create a DocumentDB instance.
This works with a "simple" plaintext password, but fails when I try to use a DatabaseSecret and a password stored in Secrets Manager.
The error I get is this:
1:44:42 PM | CREATE_FAILED | AWS::DocDB::DBCluster | ApiDocDb15EB2C21
The parameter MasterUserPassword is not a valid password. Only printable ASCII characters besides '/', '#', '"', ' ' may be used.
(Service: AmazonRDS; Status Code: 400; Error Code: InvalidParameterValue; Request ID: c786d247-8ff2-4f30-9a8a-5065fc89d3d1; Proxy: null)
which is clear enough, but it continues to happen even if I set the password to something such as simplepassword, so I am now somewhat confused as to what I am supposed to fix.
Here is the code, mostly adapted from the DocDB documentation:
String id = String.format(DOCDB_PASSWORD_ID);
return DatabaseSecret.Builder.create(scope, id)
        .secretName(store.getSsmSecretName())
        .encryptionKey(passwordKey)
        .username(store.getAdminUser())
        .build();
where ssmSecretName is the name of the secret in Secrets Manager:
└─( aws secretsmanager get-secret-value --secret-id api-db-admin-pwd
ARN: arn:aws:secretsmanager:us-west-2:<ACCT>:secret:api-db-admin-pwd-HHxpFf
Name: api-db-admin-pwd
SecretString: '{"api-db-admin-pwd":"simplepassword"}'
This is the code used to build the DbCluster:
DatabaseCluster dbCluster = DatabaseCluster.Builder.create(scope, id)
        .dbClusterName(properties.getDbName())
        .masterUser(Login.builder()
                .username(properties.getAdminUser())
                .kmsKey(passwordKey)
                .password(masterPassword.getSecretValue())
                .build())
        .vpc(vpc)
        .vpcSubnets(ISOLATED_SUBNETS)
        .securityGroup(dbSecurityGroup)
        .instanceType(InstanceType.of(InstanceClass.MEMORY5, InstanceSize.LARGE))
        .instances(properties.getReplicas())
        .storageEncrypted(true)
        .build();
The question I have is: should I use a DatabaseSecret, or just retrieve the password from Secrets Manager and be done with it?
A sub-question then: what is one supposed to use the DatabaseSecret for?
(NOTE -- this is the same class, almost, as in the rds package; but here I am using the docdb package)
Thanks for any suggestion!
Turns out that the DatabaseSecret creates a key/value pair as the secret:
{
  "username": <value of username()>,
  "password": <generated>
}
However, the call to Login.password() completely ignores this and treats the whole JSON body as the password (so the double quotes trip it).
The trick is to use DatabaseSecret.secretValueFromJson("password") in the call to Login.password(), and it works just fine.
This is (incidentally) inconsistent with the behavior of rds.DatabaseCluster and the rds.Credentials class (which take a JSON SecretValue and parse it correctly for the "password" field).
Leaving it here in case others stumble on this, as there really is NO information out there.

How to fetch SSM Parameters from two different accounts using AWS CDK

I have a scenario where I'm using CodePipeline to deploy my CDK project from a tools account to several environment accounts.
The way my pipeline deploys is by running cdk deploy from within a CodeBuild job.
My team has decided to use SSM Parameter Store for configuration, and we ended up with some parameters living in the environment account, for example the VPC_ID (resources/vpc/id), which I can read at deployment time via ssm.StringParameter.valueForStringParameter.
However, other parameters live in the tools account, such as the account IDs of my environment accounts (environment/nonprod/account/id) and other global config. I'm having trouble fetching those values.
At the moment, the only approach I could think of was to read all those values in a previous step and load them into the context values.
Is there a more elegant approach to this problem? I was hoping I could specify which account to get the SSM values from. Any ideas?
Thank you.
As you already stated, there is no native support for that. I am also using CodePipeline in cross-account deployments, so all the automation or product-specific parameters are stored in a secured account, and CodePipeline deploys the resources using CloudFormation as an action provider.
Cross-account resolution of SSM parameters isn't supported, so in the end I added an extra step (stage) to my CodePipeline, which is nothing more than a CodeBuild project that runs a script in a containerized environment; the script then "syncs" the parameters from the automation account to the destination account.
As part of your pipeline, I would add a preliminary step to execute a Lambda. That Lambda can then execute whatever queries you wish to obtain whatever metadata/config that is required. The output from that Lambda can then be passed in to the CodeBuild step.
e.g. within the Lambda:
// Assumed imports (not in the original snippet): aws-sdk v2 and the aws-lambda typings.
import * as AWS from 'aws-sdk';
import { CodePipelineEvent, Context } from 'aws-lambda';

export class ConfigFetcher {
  codepipeline = new AWS.CodePipeline();

  async fetchConfig(event: CodePipelineEvent, context: Context): Promise<void> {
    // Retrieve the Job ID from the Lambda action
    const jobId = event['CodePipeline.job'].id;
    // now get your config by executing whatever queries you need, even cross-account, via the SDK
    // we assume that the answer is in the variable someValue
    const params = {
      jobId: jobId,
      outputVariables: {
        MY_CONFIG: someValue,
      },
    };
    // now tell CodePipeline you're done
    await this.codepipeline.putJobSuccessResult(params).promise().catch(err => {
      console.error('Error reporting build success to CodePipeline: ' + err);
      throw err;
    });
    // make sure you have some sort of catch wrapping the above to post a failure to CodePipeline
    // ...
  }
}

const configFetcher = new ConfigFetcher();

exports.handler = async function fetchConfigMetadata(event: CodePipelineEvent, context: Context): Promise<void> {
  return configFetcher.fetchConfig(event, context);
};
Assuming that you create your pipeline using CDK, then your Lambda step will be created using something like this:
const fetcherAction = new LambdaInvokeAction({
  actionName: 'FetchConfigMetadata',
  lambda: configFetcher,
  variablesNamespace: 'ConfigMetadata',
});
Note the use of variablesNamespace: we need to refer to this later in order to retrieve the values from the Lambda's output and insert them as env variables into the CodeBuild environment.
Now our CodeBuild definition, again assuming we create using CDK:
new CodeBuildAction({
  // ...
  environmentVariables: {
    MY_CONFIG: {
      type: BuildEnvironmentVariableType.PLAINTEXT,
      value: '#{ConfigMetadata.MY_CONFIG}',
    },
  },
});
We can call the variable whatever we want within CodeBuild, but note that ConfigMetadata.MY_CONFIG needs to match the namespace and output value of the Lambda.
You can have your Lambda do anything you want to retrieve whatever data it needs; it just needs to be given appropriate permissions to reach across into other AWS accounts if required, which you can do using role assumption. Using a Lambda as a pipeline step will be a LOT faster than using a CodeBuild step in the pipeline, plus it's easier to change: if you write your Lambda code in TypeScript/JS or Python, you can even use the AWS console to do in-place edits while you test that it executes correctly.
AFAIK there is no native way to achieve what you described. If there is a way, I'd like to know too. I believe you can use a CloudFormation custom resource backed by Lambda for this purpose.
You can pass parameters to the lambda request and get information back from the lambda response.
See https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-custom-resources-lambda.html, https://www.2ndwatch.com/blog/a-step-by-step-guide-on-using-aws-lambda-backed-custom-resources-with-amazon-cfts/ and https://docs.aws.amazon.com/cdk/api/latest/docs/custom-resources-readme.html for more information.
This question is a year old, but a simpler method I found for retrieving parameters from your tools/deployment account is to specify them as env variables in your buildspec file. CodeBuild will always pull these from whatever account your job is running in (which in this question's scenario would be the tools account).
To pull parameters from your target environment accounts, it's best to use the CDK SSM approach suggested by the question author.

Searching LDAP from Jenkins scripted pipeline

We store a GitHub account in one of the AD user attributes. When receiving a Pull Request webhook from GitHub, I want to find the user based on the GitHub account and notify them about test results. How can I do that?
If I understand you correctly, you want to get the email attribute value by another user attribute. I can't test the following code, but it should give you an idea of how to do that.
import javax.naming.directory.InitialDirContext
import javax.naming.directory.DirContext

Properties ldapProps = [
    'java.naming.factory.initial'        : 'com.sun.jndi.ldap.LdapCtxFactory',
    'java.naming.security.authentication': 'simple',
    'java.naming.provider.url'           : 'ldap://ldap-host:389',
    'java.naming.security.principal'     : 'ldap-access-username',
    'java.naming.security.credentials'   : 'ldap-access-password',
] as Properties

String user = 'user-to-search'
String attr = 'mail'             // attribute name for email; it could be different in your LDAP
String ldapFilter = "CN=${user}" // use your GitHub-account attribute instead of `CN`
String[] attrValues = []         // array because there could be several values for one attribute

DirContext ldapCtx = new InitialDirContext(ldapProps)
ldapCtx.search("", ldapFilter, null).each { ldapUser ->
    def ldapAttr = ldapUser.getAttributes().get(attr)
    attrValues = ldapAttr?.getAll()?.collect { it.toString() }
}
ldapCtx.close()
println "found values: $attrValues"

Retrieve jenkins job builder credentials in groovy

I am trying to extract, in a Jenkins Groovy script, the username and password of the user who initiated the build. I need these details to post comments on Jira under my name.
So, for example, when I log into Jenkins and start a job, my login credentials should be used to post the comment on Jira.
I went through a lot of posts but didn't find anything related to my requirement.
Any help will be appreciated.
After a few seconds of Googling, I found this script officially published by CloudBees.
So, as follows:
Jenkins.instance.getAllItems(Job).each {
    def jobBuilds = it.getBuilds()
    // for each such job we can get all the builds (or you can limit the number at your convenience)
    jobBuilds.each { build ->
        def runningSince  = groovy.time.TimeCategory.minus(new Date(), build.getTime())
        def currentStatus = build.buildStatusSummary.message
        def cause         = build.getCauses()[0] // we keep the first cause
        // This is a simple case where we want to get information on the cause if the build was
        // triggered by a user
        def user = cause instanceof Cause.UserIdCause ? cause.getUserId() : ""
        // This is an easy way to show the information on screen but can be changed at convenience
        println "Build: ${build} | Since: ${runningSince} | Status: ${currentStatus} | Cause: ${cause} | User: ${user}"
        // You can get all the information available for build parameters.
        def parameters = build.getAction(ParametersAction)?.parameters
        parameters?.each {
            println "Type: ${it.class} Name: ${it.name}, Value: ${it.dump()}"
        }
    }
}
You will get the user ID of the user who started the job; you will certainly not be able to get their credentials, at least not in plain text.
A little explanation:
// to get all jobs
Jenkins.instance.getAllItems(Job).each { ... }
// to get the builds per job
def jobBuilds = it.getBuilds()
// to get the build cause
def cause = build.getCauses()[0] // we keep the first cause
// if triggered by a user, get the id, otherwise an empty string
def user = cause instanceof Cause.UserIdCause ? cause.getUserId() : ""
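If you only need the user who triggered the current build (e.g. to pass it on to Jira), a shorter sketch along the same lines, run from a scripted pipeline (accessing rawBuild requires script approval outside the sandbox), would be:
// Sketch: read the UserIdCause of the current build only.
def cause  = currentBuild.rawBuild.getCause(hudson.model.Cause.UserIdCause)
def userId = cause?.getUserId() ?: 'not started by a user'
echo "Build started by: ${userId}"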

How to know which user answered a Jenkins-Pipeline input step?

I have a Jenkinsfile script that tests for the possibility to perform an SVN merge and then asks the user for the permission to commit the merge.
I would like to know the username that answers the "input" step in order to write it into the commit message.
Is this possible?
This is what hypothetically I would like to do:
outcome = input message: 'Merge trunk into branch?', ok: 'Merge'
echo "User that allowed merge: ${outcome.user}"
The input step has an optional submitterParameter, which lets you specify the key of the returned Map that should contain the ID of the user who submitted the input dialog:
If specified, this is the name of the return value that will contain the ID of the user that approves this input.
The return value will be handled in a fashion similar to the parameters value.
Type: String
This looks then as follows:
def feedback = input(submitterParameter: 'submitter', ...)
echo "It was ${feedback.submitter} who submitted the dialog."
P.S: If anybody is interested in a full-fledged code snippet returning the user both for positive and negative feedback to the dialog (and timeout as well), I kindly point to our pipeline library.
It is not currently possible; for now only entry parameters are returned in the input step answer, as mentioned in the source code:
// TODO: perhaps we should return a different object to allow the workflow to look up
// who approved it, etc?
switch (mapResult.size()) {
    case 0:
        return null; // no value if there's no parameter
    case 1:
        return mapResult.values().iterator().next();
    default:
        return mapResult;
}
If you'd like to restrict which user(s) can approve the input step, you can however use the submitter parameter, e.g.:
input message: 'Approve ?', submitter: 'authorized-submitter'
EDIT: Since January 2017 it is possible to request additional parameters to be sent. Please see StephenKing's answer above.
If you are not asking for any parameters on the input, then adding the submitterParameter kind of worked: it didn't add it as a parameter on the return object; instead, it turned the returned object into a string containing the username.
def feedback = input(submitterParameter: 'submitter')
echo "It was ${feedback} who submitted the dialog."
You can also get the user when the input is aborted, by catching the exception, if you turn off the Groovy sandbox:
try {
    input 'Deploy to production?'
    node {
        sh 'echo deploying'
    }
} catch (e) {
    def user = e.getCauses()[0].getUser()
    echo "Production deployment aborted by:\n ${user}"
}
