Passing user-scoped credentials to a downstream job in Jenkins gives an error - jenkins

I am trying to pass user-scoped credentials to a downstream job in Jenkins in a declarative pipeline, so the downstream job can use them for AWS authentication. I have checked the option "Run as User who triggered the build" in the Jenkins settings. When I trigger the job directly it works, but when I trigger it from another job it gives me an error like "Error: " followed by the credentials ID. So the credentials are passed to the job, but for some reason they cannot be used.
I use the credentials like this: environment { creds = credentials("${AWSCredentials}") } in a stage of the declarative pipeline, and it fails right there. My goal is to make all the jobs run with each user's personal credentials instead of global credentials when accessing and modifying AWS resources through those jobs.
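Stripped down, the setup looks roughly like this sketch (the job name and stage contents are just placeholders). The upstream job forwards the credentials ID selected by the triggering user:

build job: 'aws-deploy', parameters: [
    [$class: 'CredentialsParameterValue', name: 'AWSCredentials', value: params.AWSCredentials]
]

and the downstream declarative pipeline binds that ID:

pipeline {
    agent any
    parameters {
        credentials(name: 'AWSCredentials', description: 'User-scoped AWS credentials to use')
    }
    environment {
        creds = credentials("${AWSCredentials}")   // this is where it fails when triggered from the other job
    }
    stages {
        stage('aws') {
            steps {
                sh 'aws sts get-caller-identity'
            }
        }
    }
}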

Related

Why is a Jenkins script job failing to use proper AWS credentials?

I have a simple jenkins job that just runs aws ssm send-command and it fails with:
"An error occurred (AccessDeniedException) when calling the SendCommand operation: User: arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc is not authorized to perform: ssm:SendCommand on resource: arn:aws:ssm:us-east-1:1234567890:document/my-document-name"
However, the IAM permissions are correct. To prove it, I directly SSH onto that instance and run the exact same ssm command, and it works. I verify it's using the instance role by running aws sts get-caller-identity and it returns arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc which is the same user mentioned in the error message.
So indeed, this assumed role can run the command.
I even modified the jenkins job to run aws sts get-caller-identity first, and it outputs the same user json.
Does jenkins do some caching that I am unaware of? Why would I get that AccessDeniedException if that jenkins-live user can run the command otherwise?
First, install the AWS Credentials and AWS Steps plugins and register your AWS access key and secret access key in the Jenkins credentials store. The next steps depend on whether you're using a freestyle job or a declarative/scripted pipeline.
If you're using a freestyle job: under "Build Environment", tick "Use secret text(s) or file(s)" and follow the next steps. After that, your credentials are available as variables in the build;
If you're using a declarative/scripted pipeline: enclose your AWS calls in a withAWS block, something like this:
withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
    // let's explode something: any AWS call in this block uses the bound credentials
    sh 'aws sts get-caller-identity'
}
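For reference, wrapped in a declarative stage the same idea might look like the sketch below (this assumes the Pipeline: AWS Steps plugin is installed and 'my-pretty-credentials' is the ID of a stored AWS credentials entry; the document and instance IDs are taken from the question above):

pipeline {
    agent any
    stages {
        stage('ssm') {
            steps {
                withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
                    // should report the identity the bound credentials resolve to
                    sh 'aws sts get-caller-identity'
                    // the call from the question, now running under the bound credentials
                    sh 'aws ssm send-command --document-name my-document-name --targets Key=InstanceIds,Values=i-1234567890abc'
                }
            }
        }
    }
}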
Best regards.

How to use Jenkins credential in downstream promoted build?

I am trying to trigger a job (we'll call it downstream) from a parent job promotion (we'll call it parent) using the promoted builds plugin. But the downstream job fails with the following error:
ERROR: Could not find credentials entry with ID '${credentialssh}'
The parent job has a promotion which passes the parameter credentialssh=mycred in the "Trigger parameterized build on other projects" action. This triggers the downstream job.
The downstream job is parameterized with a parameter of type Credential Parameter named credentialssh. The credential type is SSH Username with private key. The build environment is set to Use secret text(s) or file(s); there is a binding for SSH User Private Key and the parameter expansion is set to ${credentialssh}.
I have an SSH Username with private key global credential with ID mycred.
It fails when running the promotion from parent. But when I run the downstream job manually (selecting the mycred credential) it works.
Versions:
Jenkins 2.204.2
Credentials Binding 1.20
Credentials Plugin 2.3.1
promoted builds plugin 3.5
Potentially related Jenkins Jira issues (which, based on the comments, I am not quite certain how to handle):
https://issues.jenkins-ci.org/browse/JENKINS-23977
https://issues.jenkins-ci.org/browse/JENKINS-58967
How can I get this working from the promoted builds plugin?
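For reference, if the parent were a pipeline rather than a promoted freestyle build, the equivalent hand-off would look something like the sketch below ('downstream' is a placeholder job name; CredentialsParameterValue comes from the Credentials plugin):

build job: 'downstream', parameters: [
    [$class: 'CredentialsParameterValue', name: 'credentialssh', value: 'mycred']
]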

Jenkinsfile build remotely - How to set 'Authentication token' from jenkinsfile

I have a Jenkinsfile project which I want to trigger remotely (from a script). I'm familiar with 'Trigger builds remotely (e.g. from scripts)', but I think that because I use a Jenkinsfile (and not the Jenkins GUI), I don't have the option to save the token in the project settings.
So I'm trying to create an 'Authentication Token' from the Jenkinsfile, but I don't know how.
I tried the following code I found online, but I got "Undefined section 'authenticationToken'":
pipeline {
    authenticationToken('mytokenn')
}
Do you know how to create a token from a Jenkinsfile?

How do I pass SSH keys from Jenkins Pipeline to Jenkins build jobs?

I'm working on a set of jobs to tag a bunch of related Git repos with the same tag. At the moment, the flow is decomposed into three types of jobs: an overall Jenkins scripted Pipeline, a job that does a build and drops a tag if the build succeeds, and a job triggered by the tagging job that does the final release build. My intention is to allow users to run either the overall pipeline or one of the jobs beneath it, depending on whether they need to re-run a step in the process or do an entire release.
One of my requirements is that this all needs to happen with the invoking user's credentials, which are then passed to Git so the updates (Maven POM changes, etc.) are logged into the commit history as their user. I was successful in this by combining user-scoped credentials with the Authorize Project plugin (so the job can access the user-scoped credentials), the Build User Vars plugin to set user.name and user.email in Git, and the SSH Agent plugin to supply the keys to Git so the commit and tag can be pushed as the correct user.
What I'm trying to do now is collect the user's SSH key with a credentials parameter to the scripted pipeline job and then pass that credentials parameter to the downstream tagging job (which also takes a credentials parameter). Unfortunately, when I do that the downstream job fails because the SSH Agent in the downstream job can't retrieve the credentials based on the value that the credentials parameter in the pipeline passes on to the credentials parameter in the tagging job.
The error I'm getting is:
FATAL: [ssh-agent] Could not find specified credentials
java.io.IOException: [ssh-agent] Could not find specified credentials
    at com.cloudbees.jenkins.plugins.sshagent.SSHAgentBuildWrapper.preCheckout(SSHAgentBuildWrapper.java:209)
    at jenkins.scm.SCMCheckoutStrategy.preCheckout(SCMCheckoutStrategy.java:76)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:490)
    at hudson.model.Run.execute(Run.java:1737)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:97)
    at hudson.model.Executor.run(Executor.java:421)
Right now, my Job DSL for the pipeline job looks like this:
parameters {
    stringParam('sitePrefix', Projects.siteAbbr, "Three-character site code")
    activeChoiceParam('modules') {
        choiceType('MULTI_SELECT')
        groovyScript {
            script("[${projectsAsGroovyString}]")
        }
        description("Modules to build")
    }
    credentialsParam('gitUser') {
        type('com.cloudbees.jenkins.plugins.sshcredentials.impl.BasicSSHUserPrivateKey')
        required()
        description('Personal SSH Key for tagging and releasing')
    }
    stringParam('gitBranch', 'develop', 'Branch to tag')
    stringParam('releaseVersion', null, 'Version you want to release')
    stringParam('developmentVersion', null, 'Snapshot version to set after release. If unset, generates a new patch snapshot based on the release version')
}
and my actual pipeline contains code like this:
def tag_params = [
    [$class: 'com.cloudbees.plugins.credentials.CredentialsParameterValue', name: 'gitUser', value: params.gitUser],
    // credentials(name: 'gitUser', value: params.gitUser),
    string(name: 'gitBranch', value: params.gitBranch),
    string(name: 'releaseVersion', value: params.releaseVersion),
    string(name: 'developmentVersion', value: params.developmentVersion),
    booleanParam(name: 'buildRelease', value: false),
]
stage('Tag bom') {
    // Run tag job
    build job: "bom_tag_release", parameters: tag_params
    // Run release build
    build job: "bom_tag_build", parameters: build_params
}
The downstream job is just using another credentials parameter to receive the credentials, not the Credentials Binding plugin, because that only seems to handle secret files, not the SSH keys that SSH Agent needs. Is passing a credential ID from a pipeline to a job even possible, or should I be looking at another approach?
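For context, the downstream job is a freestyle job using the SSH Agent build wrapper; in pipeline terms, what it effectively needs to do with the received parameter is roughly this sketch (the sshagent step comes from the SSH Agent plugin, and the push command is just an example):

node {
    // 'gitUser' is the credentials parameter that should arrive from the upstream pipeline
    sshagent(credentials: [params.gitUser]) {
        // push the commit and tag as the invoking user
        sh 'git push origin --tags'
    }
}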
Thanks!

How to build with parameters in Jenkins from Gitlab push?

I have GitLab Community Edition 8.15.2 successfully triggering pipeline projects in Jenkins 2.32.1 using a webhook. I want the GitLab push to trigger a build with parameters, but the parameter value is null when it comes through, so the build fails.
The gitlab webhook looks like:
http://jenkins.server:8080/project/project-a/buildWithParameters?MYPARAM=foo
In my pipeline project I echo the parameter value out with
echo "MYPARAM: ${MYPARAM}"
and it's not set to anything. Any ideas on where I've gone wrong?
UPDATE
The actual code I'm using in the pipeline is:
node {
    try {
        echo "VM_HOST: ${VM_HOST}"
        echo "VM_NAME: ${VM_NAME}"
        stage('checkout') {
            deleteDir()
            git 'http://git-server/project/automated-build.git'
        }
        stage('build') {
            bat 'powershell -nologo -file Remove-MyVM.ps1 -VMHostName %VM_HOST% -VMName "%VM_NAME%" -Verbose'
        }
        ...
    }
}
The parameter VM_HOST has a default value but VM_NAME doesn't. In my Console output in Jenkins I can see:
[Pipeline] echo
VM_HOST: HyperVHost
[Pipeline] echo
VM_NAME:
I have been struggling with this for weeks. I had it working once, but I couldn't get it to work again until today. And the solution was mind-blowingly obvious, of course...
For each pipeline job I had automatically ticked the following box:
Build when a change is pushed to GitLab. GitLab CI Service URL:
http://jenkins.dev:8080/project/MyProject
Then from GitLab I used the webhook to trigger the above.
Like you, I tried to add /buildWithParameters and many other things that didn't work.
The problem was, I ticked the wrong checkbox!
Since I trigger the build from a GitLab webhook, the above checkbox (build when a...) does not have to be checked at all.
What needs to be checked is:
Trigger builds remotely (e.g., from scripts)
That checkbox provides you with a new URL:
Use the following URL to trigger build remotely:
JENKINS_URL/job/MyProject/build?token=TOKEN_NAME or
/buildWithParameters?token=TOKEN_NAME
As all the documentation I came across states, and as you can see, the URL now no longer starts with /project but with /job instead!
So tick that box and change your URL accordingly:
http://jenkins.server:8080/job/project-a/buildWithParameters?token=TOKEN_NAME&MYPARAM=foo
Lastly, I want to mention the token:
In the GitLab webhook there is a separate field for "token", which states:
Use this token to validate received payloads. It will be sent with the request in the X-Gitlab-Token HTTP header.
So, the token provided there will be sent along with the request as an HTTP header.
This is the token which can be provided globally in the Jenkins setup.
The token you must provide in the Jenkins job when ticking the box Use the following URL to trigger build remotely must be sent in the URL as a GET parameter, just like the example shows.
Final note: personally, I never got this working completely, because I can't get the Jenkins CSRF protection off my back. Disabling it gives me another error. However, hopefully the above does fix the problem for you and others.
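As a side note, if the job is configured entirely from the Jenkinsfile, the parameters can also be declared there so buildWithParameters has something to populate; a minimal sketch in scripted-pipeline form, using the parameter names from the question (the defaults are examples only):

properties([
    parameters([
        string(name: 'VM_HOST', defaultValue: 'HyperVHost', description: 'Hyper-V host to clean up'),
        string(name: 'VM_NAME', defaultValue: '', description: 'Name of the VM to remove')
    ])
])

node {
    echo "VM_HOST: ${params.VM_HOST}"
    echo "VM_NAME: ${params.VM_NAME}"
}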
The GitLab plugin does not allow you to pass arbitrary parameters. There is an open issue for this in their project that deserves to be upvoted.
My convoluted solution was to use the values desired for the push trigger as the default parameters of the job. Then I used the Parameterized Scheduler plugin to use other values for the scheduled executions.
The downside is that the job's usability suffers when it is run manually, since the default parameters are tailored to the push hook.
I found the solution here https://www.jittagornp.me/blog/jenkins-gitlab-webhook/
I verified it with Jenkins 2.263.1 and GitLab Community Edition 13.6.1
Your webhook URL will look like:
https://hunter:11a403302a4f01b9b4975c0ac27441a5cc@jenkinsservername.com/job/yourjenkinsproject/buildWithParameters?token=Aju9ryHUu6t7W8wLSeCWtY2bWjzQduYNPyY7B3gs&yourparam=yourvalue
"hunter" ist your username in Jenkins.
The following is the Jenkins API Token you have to create in your Jenkins User Managment independent of the project.
The last Token is the one you specify in the jenkins project options under "Trigger builds remotely (e.g., from scripts)"
The last thing is to add your Parameter and value to the url with &param=value
