I have a Jenkins job that has a shell step with the following commands. It runs great!
sudo yum install python36
virtualenv -p python3 test
source test/bin/activate
<some other command>
Now I want to make this into a pipeline. How do I write the same thing in Groovy?
I tried using syntax like this, but it fails:
stage('Test') {
    steps {
        sh 'sudo yum install python36'
        sh 'virtualenv -p python3 test'
    }
}
To execute multiple shell commands in one step, you need to wrap them in a pair of triple single quotes ('''):
stage('Test') {
    steps {
        sh '''
            sudo yum install python36
            virtualenv -p python3 test
        '''
    }
}
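Note that this also matters for your original source test/bin/activate line: every sh step starts a fresh shell, so anything activated or exported in one sh step is gone in the next. Keeping the activation and the commands that depend on it inside the same triple-quoted block is what makes the virtualenv usable.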
And if your shell commands contain GStrings like ${some_str}, use double quotes:
stage('Test') {
    steps {
        sh """
            sudo yum install ${some_package}
            virtualenv -p python3 test
        """
    }
}
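For this to work, some_package must be resolvable by Groovy rather than by the shell, for example a pipeline parameter (params.some_package) or an environment entry. A hypothetical definition:

environment {
    // hypothetical value; Groovy interpolates it into the sh step above
    some_package = 'python36'
}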
I have a script which runs against the Google Directory API.
The purpose of this script is to download all users from our company's Google Directory.
But when I run the script it gives an error:
FileNotFoundError: [Errno 2] No such file or directory: 'credentials.json'
although that file is in the folder.
Do I need to put that file into the credential manager, or ...?
I also have a Groovy file which just installs pipenv and activates the venv, nothing else.
Here is the Groovy code:
stage('Check activity') {
    steps {
        sh 'pwd'
        sh '''#!/bin/bash
            set -e
            if ! which pipenv >/dev/null; then
                echo 'no pipenv, installing...'
                pip3 install --user pipenv
                if ! which pipenv >/dev/null; then
                    # not on PATH yet; default location, e.g. /home/jenkins/.local/bin/pipenv
                    my_pip_env="/home/${USER}/.local/bin/pipenv"
                else
                    my_pip_env=$(which pipenv)
                fi
            else
                echo 'pipenv already installed, nothing to do.'
                my_pip_env=$(which pipenv)
            fi
            # pipenv version check & dependency install
            ${my_pip_env} --version
            ${my_pip_env} install
            # run the script
            PYTHONPATH=$(pwd):${PYTHONPATH} \\
            PIPENV_PIPFILE=$(realpath ./Pipfile) \\
            ${my_pip_env} run -v python3 ./it/google-users/google_user.py -v "${VERSION}" -dr "${DRY_RUN}" -et "${EXCLUDED_TYPES}"
            # remove the virtualenv project
            ${my_pip_env} --rm
        '''
    }
}
Do I need to define environment variables in the Groovy script?
When reading a file from your workspace, it is best to use the readFile function:
readFile('credentials.json')
In your case, you could read it into a variable and then pass it into the next steps of your script. Something like this:
pipeline {
    agent any
    stages {
        stage('Check activity') {
            steps {
                script {
                    sh '''#!/bin/bash
                        echo "hello" > hello.txt
                    '''
                    def mydata = readFile('hello.txt')
                    sh "echo My file data: ${mydata}"
                }
            }
        }
    }
}
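As for the credential manager question: you don't have to, but if you do upload credentials.json as a Secret file credential, the Credentials Binding plugin can place it on disk for the duration of a step. A sketch, assuming a hypothetical credential id google-api-creds:

withCredentials([file(credentialsId: 'google-api-creds', variable: 'GOOGLE_CREDS')]) {
    // copy the bound secret file to where the script expects it
    sh 'cp "$GOOGLE_CREDS" credentials.json'
}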
I am trying to send commands to a Docker container within docker run via a Jenkins pipeline.
The Jenkins machine and the Docker image are on different servers.
When I hard-code the environment param, the scripts execute as expected, but whenever I try to replace it with the param, it errors out saying:
bash: ${params.Environment} bad substitution
This is my pipeline script:
pipeline {
    agent any
    parameters {
        choice(
            name: 'Environment',
            choices: ['qa','dev'],
            description: 'Passing the Environment'
        )
    }
    stages {
        stage('Environment') {
            steps {
                echo " The environment is ${params.Environment}"
            }
        }
        stage('run') {
            steps {
                sh 'ssh 10.x.x.x \'sudo docker run --name docker_container_name docker_image_name sh -c "cd mytests ; pip3 install -r requirements.txt ; python3 runTests.py -env ${params.Environment} "\''
            }
        }
    }
}
The sh step's argument needs to be in double quotes, or triple double quotes:
sh """ssh 10.x.x.x 'sudo docker run --name docker_container_name docker_image_name sh -c "cd mytests ; pip3 install -r requirements.txt ; python3 runTests.py -env ${params.Environment} "'"""
In the Groovy language used by pipeline scripts, single-quoted strings don't do any interpolation at all, so the ${params.Environment} string gets passed on as-is to the shell. Double-quoted strings do perform interpolation, so the Groovy engine substitutes ${params.Environment} before invoking the shell.
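A minimal illustration of the difference (pkg is a hypothetical variable):

def pkg = 'python36'
sh 'echo ${pkg}'   // shell tries to expand an (unset) shell variable pkg; prints nothing
sh "echo ${pkg}"   // Groovy substitutes first; the shell runs: echo python36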
(You might look at the native support for Using Docker with Pipeline which can avoid the ssh 'sudo "..."' wrapping, though it requires Jenkins be able to run Docker itself on the worker nodes.)
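For reference, a minimal sketch of that native support, assuming the Docker Pipeline plugin is installed and the agent itself can run Docker (the image name and commands are placeholders from the question):

stage('run') {
    steps {
        script {
            // runs the commands inside the container on the Jenkins agent,
            // no ssh/sudo wrapping needed
            docker.image('docker_image_name').inside {
                sh "cd mytests && pip3 install -r requirements.txt && python3 runTests.py -env ${params.Environment}"
            }
        }
    }
}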
I am trying to SSH into a remote host and then execute certain commands in the remote host's shell. Following is my pipeline code.
pipeline {
    agent any
    environment {
        // comment added
        APPLICATION = 'app'
        ENVIRONMENT = 'dev'
        MAINTAINER_NAME = 'jenkins'
        MAINTAINER_EMAIL = 'jenkins@email.com'
    }
    stages {
        stage('clone repository') {
            steps {
                // cloning repo
                checkout scm
            }
        }
        stage('Build Image') {
            steps {
                script {
                    sshagent(credentials : ['jenkins-pem']) {
                        sh "echo pwd"
                        sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no'
                        sh "echo pwd"
                        sh 'sudo -i -u root'
                        sh 'cd /opt/docker/web'
                        sh 'echo pwd'
                    }
                }
            }
        }
    }
}
But upon running this job, it executes sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no' successfully but stops there and does not execute any further commands. I want to execute the commands that come after the ssh command inside the remote host's shell. Any help is appreciated.
I would try something like this:
sshagent(credentials : ['jenkins-pem']) {
    sh "echo pwd"
    sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no "echo pwd && sudo -i -u root && cd /opt/docker/web && echo pwd"'
}
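One caveat with this chaining: everything runs in a single remote shell and && stops at the first failure, but sudo -i -u root in the middle of the chain starts an interactive root shell rather than applying to the commands after it, so the root-only part may need to be written as, e.g., sudo sh -c 'cd /opt/docker/web && ...'.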
I resolved this issue:
script {
    sh """ssh -tt login@host << EOF
your command
exit
EOF"""
}
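The -tt here forces pseudo-terminal allocation even though stdin is not a terminal, which is what lets the heredoc drive commands that expect a TTY (many sudo configurations, for instance).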
stage("DEPLOY CONTAINER"){
steps {
script {
sh """
#!/bin/bash
sudo ssh -i /path/path/keyname.pem username#serverip << EOF
sudo bash /opt/filename.sh
exit 0
<< EOF
"""
}
}
}
There is a better way to run commands on a remote host over SSH. I know this is a late answer, but I just explored this, so I would like to share it; it should help others resolve this problem easily.
I found this link helpful on how to run multiple commands on a remote host over SSH; you can also run multiple commands conditionally, as mentioned in the blog.
Going through it, I found the syntax:
ssh username@hostname "command1; command2; commandN"
Now, how do you run commands on a remote host over SSH in a Jenkins pipeline?
Here is the solution:
pipeline {
    agent any
    environment {
        /*
         * Define your commands in a variable
         */
        remoteCommands =
            """java --version;
            java --version;
            java --version """
    }
    stages {
        stage('Login to remote host') {
            steps {
                sshagent(['ubnt-creds']) {
                    /*
                     * Provide the variable as an argument to the ssh command
                     */
                    sh 'ssh -tt username@hostname $remoteCommands'
                }
            }
        }
    }
}
First (and optionally), you can define a variable that holds all the commands separated by semicolons, then pass it as a parameter to the ssh command.
Alternatively, you can pass your commands directly to the ssh command:
sh "ssh -tt username@hostname 'command1; command2; commandN'"
I have used it in my code and it's working great!
Happy Learning :)
I am setting up a Jenkins pipeline (declarative script) using a Docker container agent built from a Dockerfile. I want one of the build stages to fetch dependent packages (Debian packages, from Artifactory, in my case) and then install them within the Docker container. Installing those packages (using dpkg, in my case) needs super-user permission, and thus sudo. How do I set up the pipeline and/or Dockerfile to enable that?
At present, my Jenkinsfile is somewhat like this:
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.jenkins'
        }
    }
    stages {
        stage('Set up dependencies') {
            steps {
                sh 'rm -rf dependent-packages && mkdir dependent-packages'
                script { // Fetch packages from Artifactory
                    def packageserver = Artifactory.server 'deb-repo-srv'
                    def downloadSpec = ...
                    packageserver.download(downloadSpec)
                }
                sh 'sudo dpkg -i -R dependent-packages/'
            }
        }
        ...
    }
}
And my Dockerfile is like this:
# Set up the O/S environment
FROM debian:9
# Add the build and test tools
RUN apt-get -y update && apt-get -y install \
    cmake \
    doxygen \
    g++ \
    libcppunit-dev \
    make \
    libxerces-c-dev
Because I am using a Dockerfile agent, simply adding the jenkins user to the sudoers file of the Jenkins server will not work.
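One commonly used workaround (an assumption on my part, not something from the original post) is to skip sudo entirely and let the pipeline run the container as root via the dockerfile agent's args parameter, which is passed straight to docker run:

agent {
    dockerfile {
        filename 'Dockerfile.jenkins'
        // run the container as root so 'dpkg -i' needs no sudo at all
        args '-u root'
    }
}

The alternative is to bake sudo into the image itself: install the sudo package in the Dockerfile and add a passwordless sudoers entry covering the UID that Jenkins runs the container with.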
I am trying to remove the directory junit located in the workspace of my Jenkins job, using a scripted Pipeline which looks somewhat like this:
node {
    stage('Build') {
        checkout scm
        app = docker.build("...")
    }
    stage('Test') {
        app.withRun("--name = ${CONTAINER_ID} ...") {
            // sh "mkdir -p junit"
            // sh "rm -rf junit/"
            dir "junit" {
                deleteDir
            }
            sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
            sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
        }
    }
    junit 'junit/*.xml'
}
However, I am getting the following (red herring?) error:
java.lang.ClassCastException: hudson.tasks.junit.pipeline.JUnitResultsStep.testResults expects class java.lang.String but received class org.jenkinsci.plugins.workflow.cps.CpsClosure2
However, when I use the shell steps:
sh "mkdir -p junit"
sh "rm -rf junit/"
It works as expected. What am I doing wrong?
Try to use parentheses:
dir ("junit") {
deleteDir()
}
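The parentheses matter twice here: bare deleteDir is read as a property reference rather than a step invocation, and without dir("junit") Groovy does not parse the command expression as the dir step taking a string plus a closure; judging by the error message, the closure ends up bound to the junit step's testResults parameter instead, hence the ClassCastException.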