I have a simple jenkins job that just runs aws ssm send-command and it fails with:
"An error occurred (AccessDeniedException) when calling the SendCommand operation: User: arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc is not authorized to perform: ssm:SendCommand on resource: arn:aws:ssm:us-east-1:1234567890:document/my-document-name"
However, the IAM permissions are correct. To prove it, I directly SSH onto that instance and run the exact same ssm command, and it works. I verify it's using the instance role by running aws sts get-caller-identity and it returns arn:aws:sts::1234567890:assumed-role/jenkins-live/i-1234567890abc which is the same user mentioned in the error message.
So indeed, this assumed role can run the command.
I even modified the jenkins job to run aws sts get-caller-identity first, and it outputs the same user json.
Does jenkins do some caching that I am unaware of? Why would I get that AccessDeniedException if that jenkins-live user can run the command otherwise?
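For what it's worth, here is roughly what the job runs, reconstructed from the description above (the instance id used as the target is a placeholder; the document name and region are the ones from the error message):
# identity check, prints the same jenkins-live assumed role that appears in the error
aws sts get-caller-identity
# the call that gets AccessDeniedException from Jenkins but works over SSH on the instance
aws ssm send-command --region us-east-1 --document-name "my-document-name" --instance-ids "i-1234567890abc"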
First, install the AWS Credentials and AWS Steps plugins and register your AWS access key ID and secret access key in the Jenkins credential store. Then, the next steps depend on whether you're using a freestyle or a declarative/scripted pipeline.
If you're using a freestyle pipeline: On "Build Environment", click on "Use secret text(s) or file(s)" and follow the next steps. After that, you're gonna have your credentials as variables in your pipeline;
If you're using a declarative/scripted pipeline: Enclose your aws calls with a withAWS block, something like this:
withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
    // let's explode something
}
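If you are on a declarative pipeline, the same block simply goes inside a steps section. A minimal sketch, assuming a credentials entry with the ID 'my-pretty-credentials' exists in the Jenkins store:
pipeline {
    agent any
    stages {
        stage('AWS call') {
            steps {
                // provided by the AWS Steps (pipeline-aws) plugin
                withAWS(region: 'us-east-1', credentials: 'my-pretty-credentials') {
                    sh 'aws sts get-caller-identity'
                }
            }
        }
    }
}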
Best regards.
Related
I am trying to pass user-scoped credentials to a downstream job in Jenkins in the declarative pipeline, to be used by the downstream job for AWS authentication. I have checked the option "Run as User who triggered the build" in the Jenkins settings. When I trigger the job directly it works, but when I try to trigger it from another job it gives me an error like "Error: " followed by the credentials ID, which means the credentials are passed to the job but for some reason they cannot be used.
I use the credentials like this: environment { creds = credentials("${AWSCredentials}") } in a stage of the declarative pipeline, and it is failing right there. My goal is to make all the jobs run with each user's personalized credentials and not use global credentials to access and modify AWS resources through those jobs.
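For context, a minimal sketch of the stage in question, assuming AWSCredentials resolves to the credentials ID handed over by the upstream job:
pipeline {
    agent any
    stages {
        stage('Deploy') {
            environment {
                // binds the credential identified by ${AWSCredentials}; this is where the build fails
                creds = credentials("${AWSCredentials}")
            }
            steps {
                sh 'aws sts get-caller-identity'
            }
        }
    }
}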
I have a Jenkinsfile to deploy my application into an EKS cluster. On the Jenkins side I installed the AWS Credentials plugin and added my secret key and access key values as a Jenkins credential.
Next, when I run the Jenkins build, the deployment stage fails with the error below.
Unable to connect to the server: getting credentials: exec: executable aws not found
It looks like you are trying to use a client-go credential plugin that is not installed.
I faced a similar issue and found it was a PATH settings issue. Basically, aws is not found in PATH. What you could do is add "env" to the code and see what PATH values appear in the console output. To set the PATH correctly:
Manage Jenkins -> Configure System -> Global properties -> Environment variables: name=PATH, value= (Ex: /usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin/ )
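A quick way to check this from the build itself before changing the global configuration (the extra PATH entry below is only an example):
# print the PATH the build step actually sees and check for the aws binary
env | grep -i '^PATH='
which aws || echo "aws not found on PATH"
# as a test, extend PATH for this step only, then retry
export PATH=$PATH:/usr/local/bin
aws --version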
It might sound silly but I was trying to store my Docker Hub password inside Manage Credentials of Jenkins as Secret text so that it can be accessed in the pipeline script.
Here is the secret which I have created
Here is a pipeline script where I am trying to access the password using the ID
node {
    stage("Docker Login") {
        sh 'docker login -u rahulwagh17 -p ${DOCKER_HUB_PASSWORD}'
    }
}
But it always fails with -
You're looking for the withCredentials method of Jenkins' pipeline DSL.
Have a look here:
https://support.cloudbees.com/hc/en-us/articles/203802500-Injecting-Secrets-into-Jenkins-Build-Jobs
Every job has its Pipeline Syntax button available in its dashboard:
$JENKINS_URL/$YOUR_JOB/pipeline-syntax/.
You can generate an adequate withCredentials block there.
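For the secret-text case above, the generated snippet comes out roughly like this (the credentials ID dockerhub-password is whatever ID you gave the secret):
node {
    stage("Docker Login") {
        // binds the Secret text credential to an environment variable for the duration of the block
        withCredentials([string(credentialsId: 'dockerhub-password', variable: 'DOCKER_HUB_PASSWORD')]) {
            // single quotes so the secret is expanded by the shell, not interpolated into the Groovy string
            sh 'docker login -u rahulwagh17 -p "$DOCKER_HUB_PASSWORD"'
        }
    }
}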
terraform plan -var-file=uservar.tfvars
Refreshing Terraform state in-memory prior to plan...
The refreshed state will be used to calculate this plan, but will not be
persisted to local or remote state storage.

Error refreshing state: 1 error(s) occurred:
* provider.aws: InvalidClientTokenId: The security token included in the request is invalid.
  status code: 403, request id: 39888d7e-b3f1-11e7-b6d2-9b6dc0727868
Build step 'Execute shell' marked build as failure
Finished: FAILURE
You need to run terraform init first, then terraform plan
Go through this document first: terraform Command: init
Secondly, for the error The security token included in the request is invalid., go through your AWS configuration to make sure you have set the AWS security token properly.
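In a Jenkins "Execute shell" step that typically amounts to something like this (the variable names holding the injected credentials are placeholders):
# make valid credentials visible to the AWS provider
export AWS_ACCESS_KEY_ID="$MY_ACCESS_KEY"
export AWS_SECRET_ACCESS_KEY="$MY_SECRET_KEY"
export AWS_DEFAULT_REGION="us-east-1"
# initialize the working directory first, then plan
terraform init
terraform plan -var-file=uservar.tfvars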
I got a resolution for this one.
Do you pull your code from Git? If yes, pull only once.
Note: the code on Git does not have access_key and secret_key, so if your Jenkins job checks out from source control on every build, the values (access_key and secret_key) are overridden.
-> Pull the code only once
-> In your Jenkins workspace, set your access_key and secret_key (or you can also mention them in the Jenkins build)
Before executing the Jenkins job, set the AWS CLI parameters:
aws configure
Set access_key, secret_key & region
and then execute
terraform init
terraform plan -var-file=uservar.tfvars
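Since aws configure is interactive, in a Jenkins shell step the same thing is usually done with its non-interactive form (the key values below are placeholders):
aws configure set aws_access_key_id "$MY_ACCESS_KEY"
aws configure set aws_secret_access_key "$MY_SECRET_KEY"
aws configure set region us-east-1
terraform init
terraform plan -var-file=uservar.tfvars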
I am currently trying to find a way to allow only certain users to use stored credentials in Jenkins. I have not found a way to do this using the Credentials plugin. I am using the role-based access plugin as well.
Is there a way to create credential domains that can only be accessed by allowed users?
How can a user use the credentials that they provide in their own "user profile" configuration area?
Is this possible? Or is there another plugin that can do this?
Not to miss an obvious answer - undocumented, but easy enough to figure out.
It only works for ssh/sftp access.
Edit a job.
Mark it "parameterized".
Add a parameter
type: credential
named my-ssh-private-key
Mark "SSH agent"
parameterized credential = ${my-ssh-private-key}
Enter ssh my_host "echo Hello world" as job's batch script action
Try to build the job. You would be asked to choose credentials, and here, lo and behold, you can also pick your private credentials amongst the global ones. But other users can only run this job if they have their own authorized ssh private key (with the public part included in the remote authorized_keys).
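If the job is a pipeline rather than a freestyle job, the same SSH Agent plugin exposes an sshagent step; a minimal sketch, assuming a credentials parameter named my-ssh-private-key:
node {
    // the credential chosen for the build parameter is loaded into an ssh-agent for this block
    sshagent(credentials: [params['my-ssh-private-key']]) {
        sh 'ssh my_host "echo Hello world"'
    }
}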
If you have "host key verification" error, just go to jenkins shell to fix:
su - jenkins
ssh my_host # confirm the host key to be saved into your known_hosts
tail ~/.ssh/known_hosts