Assign Terraform output to a variable within Jenkins Pipeline

I am creating an EC2 instance using aws_cloudformation_stack and capturing its PrivateIp output. I want to use this output within my Jenkins pipeline so that I can assign it to a variable and then inject that variable into my Ansible vars file.
I am having a lot of trouble assigning the value of the Terraform output to my private_ip variable in Jenkins.
Is something like this possible in Terraform, or should I be working with a template file instead?
TF code:
resource "aws_cloudformation_stack" "test_instance" {
...
}
output "test_instance_private_ip" {
value = aws_cloudformation_stack.test_instance.outputs.PrivateIp
}
Jenkins pipeline code:
def private_ip
stage('terraform') {
    sh """
        terraform init
        terraform plan
        terraform apply --auto-approve
    """
    private_ip = sh(returnStdout: true, script: "terraform output test_instance_private_ip").trim()
}
stage('ansible') {
    sh """
        ansible-playbook -i inventory.yml --limit testgroup -e "private_ip=${private_ip}" playbook.yml -v
    """
}
Current output:
Warning: No outputs found
Warning: Empty or non-existent state
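The warnings suggest that terraform output is reading a different (empty) state than the one apply just wrote, which typically happens when the commands do not share a working directory. A minimal sketch of one way to rule that out, assuming the configuration lives in a terraform/ subdirectory (that path is an assumption) and Terraform 0.15+ for the -raw flag:
def private_ip
stage('terraform') {
    // dir() keeps init/apply/output in one working directory,
    // so they all read and write the same terraform.tfstate.
    dir('terraform') {
        sh """
            terraform init -input=false
            terraform apply -auto-approve -input=false
        """
        // -raw (Terraform 0.15+) prints the bare value; on older
        // versions, drop it and strip the surrounding quotes instead.
        private_ip = sh(returnStdout: true, script: 'terraform output -raw test_instance_private_ip').trim()
    }
}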

Related

Environment directive with secret file content not passed to shell stage

I'm trying to use the content of a Secret File as an environment variable as follows:
stage("Terraform_plan"){
environment {
TF_VAR__private_key = credentials('SFTP_DEV_KEY')
}
sh '''
## printenv --> Can't see the TF_VAR__private_key env var.
terraform -chdir=terraform/${COMPONENT} init -var-file=./tfvars/${ENV}.${GROUP}.tfvars -upgrade -input=false
terraform -chdir=terraform/${COMPONENT} plan -var-file=./tfvars/${ENV}.${GROUP}.tfvars -out=./plan
'''
}
Getting this error from Terraform:
Error: No value for required variable
on variables.tf line 77:
77: variable "private_key" {
The root module input variable "private_key" is not set, and has no
default value. Use a -var or -var-file command line argument to provide a
value for this variable.
Also tried:
stage("Terraform_plan"){
environment {
TF_VAR_private_key = credentials('SFTP_DEV_KEY')
}
sh '''
export TF_VAR_private_key = '${env.TF_VAR_private_key}'
## printenv --> Can't see the TF_VAR__private_key env var.
terraform -chdir=terraform/${COMPONENT} init -var-file=./tfvars/${ENV}.${GROUP}.tfvars -upgrade -input=false
terraform -chdir=terraform/${COMPONENT} plan -var-file=./tfvars/${ENV}.${GROUP}.tfvars -out=./plan
'''
}
This gets Terraform to the apply stage, but when applying I still get an error that there is no value for the variable.
printenv shows TF_VAR_private_key=${env.TF_VAR_private_key}, not the actual value of TF_VAR_private_key.
However, if I export the env var manually inside the shell with export TF_VAR_private_key, it works.
Shouldn't the env var already be set inside the shell?
Thanks in advance,
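A few things stand out from the snippets alone, so treat this as a guess: TF_VAR__private_key (double underscore) maps to a Terraform variable named _private_key, not private_key; the stage is missing a steps block around sh; and for a Secret File credential, credentials() yields the path to a temporary file, not its content. A minimal sketch under those assumptions, reusing the credential ID and paths from the question (PRIVATE_KEY_FILE is an arbitrary intermediate name):
stage('Terraform_plan') {
    environment {
        // A Secret File credential binds a *path* to a temp file.
        PRIVATE_KEY_FILE = credentials('SFTP_DEV_KEY')
    }
    steps {
        sh '''
            # Single underscore: TF_VAR_private_key maps to variable "private_key".
            # Export the file content, not the path.
            export TF_VAR_private_key="$(cat "$PRIVATE_KEY_FILE")"
            terraform -chdir=terraform/${COMPONENT} init -var-file=./tfvars/${ENV}.${GROUP}.tfvars -upgrade -input=false
            terraform -chdir=terraform/${COMPONENT} plan -var-file=./tfvars/${ENV}.${GROUP}.tfvars -out=./plan
        '''
    }
}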

variable does not get resolved in my jenkinsfile

I am new to Jenkins/Groovy and am currently facing the following issue.
I defined a deployment pipeline in a Jenkinsfile; it consists of deploying scripts to two different environments running under Linux. The deployment script copy_files.sh has to run as a specific user to preserve permissions:
def my_dst = '/opt/scripts'
pipeline {
    agent { label '<a generic label>' }
    stages {
        stage('Deployment env1') {
            agent { label '<a specific label>' }
            steps {
                script {
                    echo 'Deploying scripts...'
                    sh '/bin/sudo su -c "${WORKSPACE}/copy_files.sh ${WORKSPACE} ${my_dst}" - <another user>'
                }
            }
        }
        stage('Deployment env2') {
            agent { label '<another specific label>' }
            steps {
                script {
                    echo 'Deploying scripts...'
                    sh '/bin/sudo su -c "${WORKSPACE}/copy_files.sh ${WORKSPACE} ${my_dst}" - <another user>'
                }
            }
        }
    }
}
I defined the destination path where files are supposed to be copied as a variable (my_dst, the same on both envs). While the env variable $WORKSPACE gets resolved, my variable my_dst does not, so copy_files.sh aborts because of a missing argument.
How can I quote my command properly so that my variable gets resolved? Running the sudo command with the hard-coded destination path /opt/scripts works.
Thanks in advance
You want Groovy to inject my_dst but WORKSPACE to come from the shell environment, so you need double quotes (so Groovy does the templating) and an escaped $ on WORKSPACE so that Groovy leaves it alone and passes it to the shell as-is:
sh "/bin/sudo su -c \"\${WORKSPACE}/copy_files.sh \${WORKSPACE} ${my_dst}\" - <another user>"

setting environment variable in jenkins scripted pipeline

I'm trying to set an environment variable (VIRTUAL_ENV) in a Jenkins stage (Check_style) and use it in the shell, but it throws an error.
withEnv(['VIRTUAL_ENV=${env.WORKSPACE}/venv']) {
    stage('Check_style') {
        sh """
            export PATH=${VIRTUAL_ENV}/bin:${PATH}
            make flake8 | tee report/flake8.log || true
        """
    }
}
Error:
PATH=${env.WORKSPACE}/venv/bin:/usr/bin:/bin:/usr/sbin:/sbin: bad substitution
withEnv(["VIRTUAL_ENV=${env.WORKSPACE}/venv"]) should work

Calling functions in jenkinsfile with variables

I need some help calling a function in a Jenkinsfile along with a variable.
I have created a function to copy certain test results from a Jenkins slave to the Jenkins master's userContent directory.
I want to use this function across different jobs. Different jobs might have different report paths; instead of hardcoding the path inside the function, I want to pass it in from the Jenkinsfile as a variable.
Here is my function:
def call() {
    sh '''
        mkdir -p $JOB_NAME
        foldername="$BUILD_NUMBER.$(date '+%d-%m-%Y')"
        echo ${foldername}
        mkdir -p $JOB_NAME/${foldername}
        pwd
        reportPath=""
        dest="./$JOB_NAME/${foldername}"
        cp -R ${reportPath}/*.xml ${dest}
        scp -r $JOB_NAME jenkins@master_ip:/var/lib/jenkins/userContent/
    '''
}
How do I call the function in the Jenkinsfile and pass it a variable with the report path?
I guess you want to copy a Jenkins slave artifact to the master. You can use the Copy Artifact plugin: https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
You can use the following method for declarative pipeline jobs:
stages {
    stage('Copy Archive') {
        steps {
            script {
                step([$class: 'CopyArtifact',
                      projectName: 'Create_archive',
                      filter: "packages/infra*.zip",
                      target: 'Infra'])
            }
        }
    }
}
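For the original question of passing the report path into the step itself, here is a hedged sketch of a parameterized shared-library step. The file name vars/copyReports.groovy and the example path are assumptions, not from the question:
// vars/copyReports.groovy (assumed name) — takes the report path as an argument
def call(String reportPath) {
    // Groovy interpolates ${reportPath}; every other $ is escaped (\$)
    // so the shell resolves those variables at run time.
    sh """
        mkdir -p \$JOB_NAME
        foldername="\$BUILD_NUMBER.\$(date '+%d-%m-%Y')"
        dest="./\$JOB_NAME/\${foldername}"
        mkdir -p "\${dest}"
        cp -R ${reportPath}/*.xml "\${dest}"
        scp -r \$JOB_NAME jenkins@master_ip:/var/lib/jenkins/userContent/
    """
}
A Jenkinsfile would then call it as, e.g., copyReports('target/surefire-reports') — the path being whatever that job's report directory is.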

Passing parameter in Jenkinsfile to a shell command within a Docker container

I have a Jenkinsfile with a String parameter env_vars. With this parameter I want to set custom environment variables later, with a shell command inside the started Docker container. It is important that these environment variables are set at runtime.
This is my simple Jenkinsfile:
pipeline {
    options {
        timestamps()
    }
    agent {
        node {
            label 'master'
        }
    }
    parameters {
        string(name: 'env_vars', defaultValue: 'MY_USER_PASSWORD=abc MY_USER_NAME=def', description: 'the ENV variables to set before starting the tests')
    }
    stages {
        stage('TESTS') {
            steps {
                script {
                    withDockerRegistry([credentialsId: 'XXX', url: 'http://example.com']) {
                        withDockerContainer(image: 'myDockerImage:latest') {
                            withCredentials([string(credentialsId: 'cred1', variable: 'cred1'), string(credentialsId: 'cred2', variable: 'cred2')]) {
                                sh '''
                                    # here we go to run npm
                                    ${env_vars} npm run test -- chrome --tag=enabled
                                '''
                            }
                        }
                    }
                }
            }
        }
    }
}
And this is the error I get in Jenkins:
/var/lib/jenkins/jenkins3/jobs/zTestMG/workspace#tmp/durable-40340d0e/script.sh: line 4: MY_USER_PASSWORD=abc: command not found
One possible workaround is using eval for the shell command:
eval "${env_vars} npm run test -- chrome --tag=enabled"
But I don't want to use eval, because later I have to evaluate the result of the npm run command, and with eval I run into new problems there.
How can I use the String parameter in the shell command within the Docker container?
I have found a possible solution: I replaced my shell command with two separate ones:
export ${env_vars}
npm run ${run_script_method} -- ${browser} --tag=${tags}
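In context, a sketch of the resulting sh step — run_script_method, browser, and tags are extra parameters from the poster's fuller setup, so with only the env_vars parameter from the question it reduces to:
sh '''
    # export expands ${env_vars} into its KEY=value pairs first, so the
    # first pair is no longer parsed as a command name.
    export ${env_vars}
    npm run test -- chrome --tag=enabled
'''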
