Jenkins - aws configure --profile

I'm trying to automate the creation of a new user with the AWS CLI.
steps {
    sh '''
        aws --version
        aws configure --profile superappaws
    '''
}
This throws an EOF error:
+ aws configure --profile superappaws
AWS Access Key ID [None]:
EOF when reading a line
Is there a way to enter these credentials, or how can I create the user with Jenkins?
I have to create a pipeline with these steps:
activate env
pip install awscli
aws configure --profile superappaws
Enter the credentials
export AWS_PROFILE=superappaws
aws s3 ls # check if the user is created

You can set the AWS credentials as environment variables.
environment {
    AWS_ACCESS_KEY_ID     = credentials('jenkins-aws-secret-key-id')
    AWS_SECRET_ACCESS_KEY = credentials('jenkins-aws-secret-access-key')
}

Related

How to fetch PCF credentials configured in Jenkins variable?

Jenkins is configured to deploy a PCF application. The PCF login credentials are configured in Jenkins as variables. Is there any way to fetch the PCF login credential details from the Jenkins variables?
echo "Pushing PCF App"
cf login -a https://api.pcf.org.com -u $cduser -p $cdpass -o ORG -s ORG_Dev
cf push pcf-app-04v2_$BUILD_NUMBER -b java_buildpack -n pcf-app-04v2_$BUILD_NUMBER -f manifest-prod.yml -p build/libs/*.jar
cf map-route pcf-app-04v2_$BUILD_NUMBER apps-pr03.cf.org.com --hostname pcf-app
cf delete -f pcf-app
cf rename pcf-app-04v2_$BUILD_NUMBER pcf-app
cf delete-orphaned-routes -f
Rather than accessing Jenkins credentials from outside Jenkins to run your app manually, you can define a simple pipeline in Jenkins and add a script step to it that performs these tasks. In the script you can access the credentials through the credentials() function and use the resulting environment variables in your commands.
E.g.
environment {
    CF_USER = credentials('YOUR JENKINS CREDENTIAL KEY')
    CF_PWD  = credentials('CF_PWD')
}

def deploy() {
    script {
        sh '''#!/bin/bash
            set -x
            cf login -a https://api.pcf.org.com -u ${CF_USER} -p ${CF_PWD} -o ORG -s ORG_Dev
            pwd
        '''
    }
}
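Note that when the bound credential is of the type "Username with password", credentials() additionally exposes the two halves as variables with _USR and _PSW suffixes, so a single binding covers both flags of cf login. A sketch, assuming a hypothetical credential ID cf-credentials:

environment {
    CF_CREDS = credentials('cf-credentials')   // assumption: a "Username with password" credential
}
// later, inside a stage or function:
sh 'cf login -a https://api.pcf.org.com -u $CF_CREDS_USR -p $CF_CREDS_PSW -o ORG -s ORG_Dev'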

New Docker Build secret information for use with aws cli

I would like to use the new --secret flag in order to retrieve something from AWS with its CLI during the build process.
# syntax = docker/dockerfile:1.0-experimental
FROM alpine
RUN --mount=type=secret,id=mysecret,dst=/root/.aws cat /root/.aws
I can see the credentials when running the following command:
docker build --no-cache --progress=plain --secret id=mysecret,src=%USERPROFILE%/.aws/credentials .
However, if I adjust the command to be run, the AWS CLI cannot find the credentials file and asks me to run aws configure:
RUN --mount=type=secret,id=mysecret,dst=/root/.aws aws ssm get-parameter
Any ideas?
The following works:
# syntax = docker/dockerfile:1.0-experimental
FROM alpine
RUN --mount=type=secret,id=aws,dst=/aws export AWS_SHARED_CREDENTIALS_FILE=/aws && aws ssm get-parameter ...
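The original attempt failed because dst=/root/.aws mounts the secret as a single file named .aws, whereas the CLI looks for a credentials file inside the ~/.aws directory; pointing AWS_SHARED_CREDENTIALS_FILE at the mounted file sidesteps that. The matching build command is the same as the one above, with the secret id changed to aws:

docker build --no-cache --progress=plain --secret id=aws,src=%USERPROFILE%/.aws/credentials .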

Unable to use gcloud in a jenkins docker-agent

I'm trying to run a Jenkins pipeline with a Docker agent (google/cloud-sdk:alpine) to deploy my code to App Engine. Unfortunately, it seems I have no permission to do that, although I'm root in the Docker container.
The issue tends to be the same as in this post : Jenkins Pipeline gcloud problems in docker
But there is no right answer to this issue.
When I launch these commands by hand, everything works.
My Jenkinsfile is:
pipeline {
    agent {
        docker {
            image 'registry.hub.docker.com/google/cloud-sdk:alpine'
            args '-v $HOME:/home -w /home'
        }
    }
    stages {
        stage('Deploy') {
            steps {
                withCredentials([file(credentialsId: 'bnc-hub', variable: 'SECRET_JSON')]) {
                    sh '''
                        set +x
                        gcloud auth activate-service-account --key-file $SECRET_JSON
                        gcloud config set project bnc-hub
                        gcloud app deploy app.yaml
                    '''
                }
            }
        }
    }
}
The output in Jenkins is:
[workspace] Running shell script
+ set +x
WARNING: Could not setup log file in /.config/gcloud/logs, (Error: Could not create directory [/.config/gcloud/logs/2018.12.28]: Permission denied.
Please verify that you have permissions to write to the parent directory.)
script returned exit code 1
By default HOME=/ inside the container, so gcloud tries to write its config under /.config, which is not writable. Adding
HOME=$WORKSPACE
before
gcloud auth activate-service-account --key-file=${GCP_SA}
worked for me.
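Applied to the Jenkinsfile above, the Deploy step would look like this (a sketch; the credential ID and project are taken from the question):

withCredentials([file(credentialsId: 'bnc-hub', variable: 'SECRET_JSON')]) {
    sh '''
        set +x
        export HOME=$WORKSPACE   # give gcloud a writable home for its config and logs
        gcloud auth activate-service-account --key-file $SECRET_JSON
        gcloud config set project bnc-hub
        gcloud app deploy app.yaml
    '''
}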

Install Python dependencies on another server using Jenkins Pipeline

Hi, is there a possibility to SSH to a server, activate a virtualenv, and install requirements from my Jenkins pipeline project?
I have tried this, but it does not seem to maintain my virtualenv session:
node {
    sh '''
        ssh server virtualenv myvenv
        ssh server source myvenv/bin/activate && which python
    '''
}
I found the solution; you have to run it like this:
ssh server "source myvenv/bin/activate; which python"

How to use credential type "SSH Username with private key" inside jenkinsfile for pipeline job

I'm trying to run remote commands using SSH from a Jenkinsfile. For this I'm using a *.pem file for accessing the remote machine. I already have a Jenkins credential created with the type "SSH Username with private key" inside Jenkins.
Is there any way I can use that credential inside the Jenkinsfile instead of authenticating with sh "ssh -i *.pem username@hostname command"?
No, you would need to use the Jenkins Credentials Binding Plugin. Basically you create a binding from the key file and bind the key to a variable, let's say mykey, and then you can use the key in your build scripts, e.g. cat mykey.
Or use it in your build script like this:
cat mykey > sshkey
chmod 600 sshkey
eval `ssh-agent -s`
ssh-add sshkey
and then you can ssh since the ssh key is added
This post lays out all the features of the Jenkins Credentials Binding plugin really well: https://support.cloudbees.com/hc/en-us/articles/203802500-Injecting-Secrets-into-Jenkins-Build-Jobs
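For completeness, the same plugin also lets you bind an "SSH Username with private key" credential directly in a Jenkinsfile with withCredentials. A minimal sketch, assuming a hypothetical credential ID remote-ssh-key and a host named hostname:

withCredentials([sshUserPrivateKey(credentialsId: 'remote-ssh-key',
                                   keyFileVariable: 'SSH_KEY',
                                   usernameVariable: 'SSH_USER')]) {
    // SSH_KEY points at a temporary file holding the private key; SSH_USER is the stored username
    sh 'ssh -i "$SSH_KEY" -o StrictHostKeyChecking=no $SSH_USER@hostname command'
}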
