How can I load a credentials.json file into a Jenkins pipeline job?

I have a script which runs against the Google Directory API.
The purpose of this script is to download all users from our company's Google Directory.
But when I run the script it gives an error:
FileNotFoundError: [Errno 2] No such file or directory: 'credentials.json'
even though the file is in the folder.
Do I need to put that file into the credentials manager, or ....?
I also have a Groovy file which just installs pip and activates the venv, nothing else.
Here is the Groovy code:
stage('Check activity') {
    steps {
        sh 'pwd'
        sh '''#!/bin/bash
            set -e
            if ! which pipenv >/dev/null; then
                echo 'no pipenv, installing...'
                pip3 install --user pipenv
                if ! which pipenv >/dev/null; then
                    # default location: /home/jenkins/.local/bin/pipenv
                    my_pip_env="/home/${USER}/.local/bin/pipenv"
                fi
            else
                echo 'pipenv already installed, nothing to do.'
                my_pip_env=$(which pipenv)
            fi

            # pipenv version check & install
            ${my_pip_env} --version
            ${my_pip_env} install

            # run the script
            PYTHONPATH=$(pwd):${PYTHONPATH} \\
            PIPENV_PIPFILE=$(realpath ./Pipfile) \\
            ${my_pip_env} run -v python3 ./it/google-users/google_user.py -v "${VERSION}" -dr "${DRY_RUN}" -et "${EXCLUDED_TYPES}"

            # remove the virtualenv project
            ${my_pip_env} --rm
        '''
    }
}
Do I need to define environment variables in the Groovy script?

When reading a file from your workspace, it is best to use the readFile step:
readFile('credentials.json')
In your case, you could read it into a variable and then pass it into the next steps of your script, something like this:
pipeline {
    agent any
    stages {
        stage('Check activity') {
            steps {
                script {
                    sh '''#!/bin/bash
                        echo "hello" > hello.txt
                    '''
                    def mydata = readFile('hello.txt')
                    sh "echo My file data: ${mydata}"
                }
            }
        }
    }
}
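Applied to your job, a rough sketch of the same idea (it assumes google_user.py opens credentials.json relative to its own directory, which may not match your setup; adjust the paths to your repository layout):
stage('Check activity') {
    steps {
        script {
            // read credentials.json from the workspace root (assumed location)
            def credentials = readFile('credentials.json')
            // write a copy next to the script so a relative open() can find it;
            // the it/google-users path is taken from the pipeline above
            writeFile file: 'it/google-users/credentials.json', text: credentials
        }
    }
}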

Related

How to go inside a specific directory and run commands inside it in a Jenkins pipeline

I am trying to run a Gradle command inside a Jenkins pipeline, and for that I need to cd into the <location> where the Gradle files are.
I added a cd command inside my pipeline, but it is not working. I did this:
stage('build & SonarQube Scan') {
    withSonarQubeEnv('sonarhost') {
        sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic'
        sh 'echo ${PWD}'
        sh 'gradle tasks --all'
        sh 'gradle sonarqube --debug'
    }
}
But the cd is not working. I tried the dir step as suggested in the pipeline docs, but I want to cd inside the $WORKSPACE folder.
How can I fix this?
Jenkins resets the working directory for each sh step, so after the first sh it goes back to the previous location. The dir step is the correct approach; the path it takes is relative to the workspace, so in your case it would be used like this:
dir('sonarqube-scanner-gradle/gradle-basic') {
}
similar to how you have used withSonarQubeEnv.
Alternatively, you can simply chain all the commands with &&:
sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic && echo ${PWD} && ...'
This is not recommended, but since everything runs in the same shell invocation it will work.
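Putting that together, a sketch of the stage from the question using dir (keeping the question's scripted style; the path is the one relative to the workspace root):
stage('build & SonarQube Scan') {
    withSonarQubeEnv('sonarhost') {
        // dir() changes the working directory for every step inside the block
        dir('sonarqube-scanner-gradle/gradle-basic') {
            sh 'echo ${PWD}'
            sh 'gradle tasks --all'
            sh 'gradle sonarqube --debug'
        }
    }
}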

Jenkins Jenkinsfile Groovy bash command no such file or directory

The file validates and I appear to have the proper syntax:
script {
    sh """
        summon -f folder/file.yml --provider summon-aws-secrets \
            sh -c 'bash folder/bin/run_me.sh'
    """
}
But it fails with:
open folder/file.yml: no such file or directory
I have confirmed the existence of the file and the workspace location.
Try using the full path with the workspace variable:
script {
    sh """
        summon -f ${WORKSPACE}/folder/file.yml --provider summon-aws-secrets \
            sh -c 'bash folder/bin/run_me.sh'
    """
}
So what I see happening is that I wrapped the file into a script, then ran git add, git commit and git push. I updated the Jenkinsfile to ls -l the folder, and I notice that the file is missing, so I am not sure if this is a git issue, a Jenkins issue, or something else.
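If it helps narrow that down, here is a small debugging sketch (paths copied from the question, purely a suggestion) that lists the folder in the workspace and asks git whether the file is being ignored:
script {
    // list the folder as it exists in the checked-out workspace
    sh 'ls -l "$WORKSPACE/folder"'
    // git check-ignore exits non-zero when the file is NOT ignored,
    // so the || echo keeps the step from failing in that case
    sh 'git check-ignore -v folder/file.yml || echo "not ignored by git"'
}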

Build and Run Docker Container in Jenkins

I need to run a Docker container in Jenkins so that libraries installed in it, like pycodestyle, are available in the following steps.
I successfully built the Docker container (from a Dockerfile).
How do I access the container so that I can use it in the next step? (Please look for the line marked with >> << in the Build step below.)
Thanks
stage('Build') {
    // Install python libraries from requirements.txt (check the Dockerfile for more detail)
    sh "docker login -u '${DOCKER_USR}' -p '${DOCKER_PSW}' ${DOCKER_REGISTRY}"
    sh "docker build \
        --tag '${DOCKER_REGISTRY}/${DOCKER_TAG}:latest' \
        --build-arg HTTPS_PROXY=${PIP_PROXY} ."
>>  sh "docker run -ti ${DOCKER_REGISTRY}/${DOCKER_TAG}:latest sh"  <<
}
stage('Linting') {
    sh '''
        awd=$(pwd)
        echo '===== Linting START ====='
        for file in $(find . -name '*.py'); do
            filename=$(basename $file)
            if [[ ${file:(-3)} == ".py" ]] && [[ $filename = *"test"* ]]; then
                echo "perform PEP8 lint (python pylint blah) for $filename"
                cd $awd && cd $(dirname "${file}") && pycodestyle "${filename}"
            fi
        done
        echo '===== Linting END ====='
    '''
}
You need to mount the workspace of your Jenkins job (containing your Python project) as a volume into your container (see the docker run -v option) and then run the "next step" build step inside this container. You can do this by providing a shell script as part of your project's source code which performs the "next step", or by writing this script in a previous build stage.
It would be something like this:
sh "chmod +x build.sh"
sh "docker run -v $WORKSPACE:/workspace ${DOCKER_REGISTRY}/${DOCKER_TAG}:latest /workspace/build.sh"
build.sh is an executable script which is part of your project's workspace and performs the "next step".
$WORKSPACE is the folder used by your Jenkins job (normally /var/jenkins_home/jobs//workspace); it is provided by Jenkins as a build variable.
Please note: this solution requires that the Docker daemon is running on the same host as Jenkins! Otherwise the workspace will not be available to your container.
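For context, a minimal sketch of how this could sit inside a pipeline stage (the stage name is made up, and build.sh is assumed to run pycodestyle or whatever your "next step" needs):
stage('Lint in container') {
    // make the helper script executable, then run it inside the image built
    // in the previous stage, with the job workspace mounted at /workspace
    sh "chmod +x build.sh"
    sh "docker run --rm -v $WORKSPACE:/workspace ${DOCKER_REGISTRY}/${DOCKER_TAG}:latest /workspace/build.sh"
}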
Another solution would be to run Jenkins itself as a Docker container, so you can easily share the Jenkins home/workspaces with the containers you run within your build jobs, as described here:
Running Jenkins tests in Docker containers build from dockerfile in codebase

Multistep shell commands on Jenkins pipeline

I have a Jenkins job that has a shell step with the following commands. It runs great!
sudo yum install python36
virtualenv -p python3 test
source test/bin/activate
<some other command>
Now I want to make this into a pipeline. How do I write the same thing in Groovy?
I tried syntax like this, but it fails:
stage('Test') {
    steps {
        sh 'sudo yum install python36'
        sh 'virtualenv -p python3 test'
    }
}
In order to execute multiple shell commands you need to wrap them in a pair of triple single quotes ''':
stage('Test') {
    steps {
        sh '''
            sudo yum install python36
            virtualenv -p python3 test
        '''
    }
}
And if your shell commands contain GStrings like ${some_package}, use triple double quotes:
stage('Test') {
    steps {
        sh """
            sudo yum install ${some_package}
            virtualenv -p python3 test
        """
    }
}
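Since all lines inside one triple-quoted sh step run in the same shell, the virtualenv activation from your original job also carries over to the commands after it. A rough sketch (the -y flag is an addition so yum does not prompt in a non-interactive build):
stage('Test') {
    steps {
        sh '''
            sudo yum install -y python36
            virtualenv -p python3 test
            source test/bin/activate
            # <some other command> goes here -- the virtualenv is still active
            # because everything above ran in the same shell
        '''
    }
}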

Building Go app with "vendor" directory on Jenkins with Docker

I'm trying to set up a Jenkins Pipeline to build and deploy my first Go project using a Jenkinsfile and docker.image().inside. I can't figure out how to get Go to pick up the dependencies in the vendor/ directory.
When I run the build, I get a bunch of errors:
+ goapp test ./...
src/dao/demo_dao.go:8:2: cannot find package "github.com/dgrijalva/jwt-go" in any of:
/usr/lib/go_appengine/goroot/src/github.com/dgrijalva/jwt-go (from $GOROOT)
/usr/lib/go_appengine/gopath/src/github.com/dgrijalva/jwt-go (from $GOPATH)
/workspace/src/github.com/dgrijalva/jwt-go
...why isn't it picking up the vendor directory?
When I throw in some logging, it seems that after running sh "cd /workspace/src/bitbucket.org/nalbion/go-demo", the next sh command is still in the original ${WORKSPACE} directory. I really like the idea of the Jenkinsfile, but I can't find any decent documentation for it.
(Edit: there is decent documentation here, but dir("/workspace/src/bitbucket.org/nalbion/go-demo") {} doesn't seem to work within docker.image().inside.)
My Dockerfile resembles:
FROM golang:1.6.2
# Google's App Engine Go SDK
RUN wget https://storage.googleapis.com/appengine-sdks/featured/go_appengine_sdk_linux_amd64-1.9.40.zip -q -O go_appengine_sdk.zip && \
unzip -q go_appengine_sdk.zip -d /usr/lib/ && \
rm go_appengine_sdk.zip
ENV PATH /usr/lib/go_appengine:/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ENV GOPATH /usr/lib/go_appengine/gopath
# Add Jenkins user
RUN groupadd -g 132 jenkins && useradd -d "/var/jenkins_home" -u 122 -g 132 -m -s /bin/bash jenkins
And my Jenkinsfile:
node('docker') {
    currentBuild.result = "SUCCESS"
    try {
        stage 'Checkout'
        checkout scm

        stage 'Build and Test'
        env.WORKSPACE = pwd()
        docker.image('nalbion/go-web-build:latest').inside(
                "-v ${env.WORKSPACE}:/workspace/src/bitbucket.org/nalbion/go-demo " +
                "-e GOPATH=/usr/lib/go_appengine/gopath:/workspace") {
            // Debugging
            sh 'echo GOPATH: $GOPATH'
            sh "ls -al /workspace/src/bitbucket.org/nalbion/go-demo"
            sh "cd /workspace/src/bitbucket.org/nalbion/go-demo"
            sh "pwd"
            sh "go vet ./src/..."
            sh "goapp test ./..."
        }

        stage 'Deploy to DEV'
        docker.image('nalbion/go-web-build').inside {
            sh "goapp deploy --application go-demo --version v${v} app.yaml"
        }

        timeout(time:5, unit:'DAYS') {
            input message:'Approve deployment?', submitter: 'qa'
        }

        stage 'Deploy to PROD'
        docker.image('nalbion/go-web-build').inside {
            sh "goapp deploy --application go-demo --version v${v} app.yaml"
        }
    } catch (err) {
        currentBuild.result = "FAILURE"
        // send notifications
        throw err
    }
}
I managed to get it working by including the cd in the same sh statement:
docker.image('nalbion/go-web-build:latest')
    .inside("-v ${env.WORKSPACE}:/workspace/src/bitbucket.org/nalbion/go-demo " +
            "-e GOPATH=/usr/lib/go_appengine/gopath:/workspace") {
    sh """
        cd /workspace/src/bitbucket.org/nalbion/go-demo
        go vet ./src/...
        goapp test ./...
    """
}
