I've been trying to access a subdirectory inside my Jenkins workspace with the Unix command sh "cd ${workspace}/Myfolder", but the command does not work. I am using a Groovy script in Jenkins (Jenkinsfile).
My ${workspace} directory is: /var/lib/jenkins/workspace/test_sam_single_pipeline
When I execute sh "cd ${workspace}/Myfolder" and then sh "pwd", the output is:
/var/lib/jenkins/workspace/test_sam_single_pipeline
It seems I cannot access the "Myfolder" subdirectory by using the "cd" command.
What am I missing?
Each sh step starts its own shell in the workspace root, so a cd in one step does not carry over to the next. In a declarative pipeline you can use the dir step:
dir('MyFolder') {
    sh "pwd"
}
or use one shell for all your commands:
sh """
cd MyFolder
pwd
"""
or join the commands:
sh "cd MyFolder && pwd"
I am trying to run a Gradle command inside a Jenkins pipeline, and for that I need to cd to the <location> where the Gradle files are.
I added a cd command inside my pipeline, but it is not working. I did this:
stage('build & SonarQube Scan') {
withSonarQubeEnv('sonarhost') {
sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic'
sh 'echo ${PWD}'
sh 'gradle tasks --all'
sh 'gradle sonarqube --debug'
}
}
But the cd is not working. I tried the dir step as suggested in the pipeline docs, but I want to cd inside the $WORKSPACE folder.
How can I fix this?
Jenkins starts a fresh shell for each sh step, so after the first sh it goes back to the previous location. The dir step is the correct approach, but it should be used like this:
dir('sonarqube-scanner-gradle/gradle-basic') {
    // the steps to run inside the sub-directory go here
}
Similar to how you have used withSonarQubeEnv
Alternatively, you can simply chain all the commands
sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic && echo ${PWD} && ...'
This is not recommended for readability, but since everything runs in the same shell invocation, it will work fine.
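Putting it together, a sketch of your stage using dir (the paths and step names are taken from your question):
stage('build & SonarQube Scan') {
    withSonarQubeEnv('sonarhost') {
        // every step in this block runs inside the sub-directory
        dir('sonarqube-scanner-gradle/gradle-basic') {
            sh 'pwd'
            sh 'gradle tasks --all'
            sh 'gradle sonarqube --debug'
        }
    }
}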
The Jenkinsfile validates and I appear to have the proper syntax.
script {
    sh """
    summon -f folder/file.yml --provider summon-aws-secrets \
        sh -c 'bash folder/bin/run_me.sh'
    """
}
But I get the following error:
open folder/file.yml: no such file or directory
I confirmed the existence of the file and workspace location.
Try using the full path with the WORKSPACE variable:
script {
    sh """
    summon -f ${WORKSPACE}/folder/file.yml --provider summon-aws-secrets \
        sh -c 'bash folder/bin/run_me.sh'
    """
}
So what I see happening is: I wrapped the file into a script, ran git add, git commit, and git push. I updated the Jenkinsfile to ls -l the folder and I notice that the file is missing. So I am not sure if this is a Git issue, a Jenkins issue, or something else.
I need to run a Docker container in Jenkins so that installed libraries like pycodestyle are usable in the following steps.
I successfully built the Docker container (see the Dockerfile).
How do I access the container so that I can use it in the next step? (Please look for the >> << code in the Build stage below.)
Thanks
stage('Build') {
// Install python libraries from requirements.txt (Check Dockerfile for more detail)
sh "docker login -u '${DOCKER_USR}' -p '${DOCKER_PSW}' ${DOCKER_REGISTRY}"
sh "docker build \
--tag '${DOCKER_REGISTRY}/${DOCKER_TAG}:latest' \
--build-arg HTTPS_PROXY=${PIP_PROXY} ."
>> sh "docker run -ti ${DOCKER_REGISTRY}/${DOCKER_TAG}:latest sh" <<<
}
}
stage('Linting') {
sh '''
awd=$(pwd)
echo '===== Linting START ====='
for file in $(find . -name '*.py'); do
filename=$(basename $file)
if [[ ${file:(-3)} == ".py" ]] && [[ $filename = *"test"* ]] ; then
echo "perform PEP8 lint (python pylint blah) for $filename"
cd $awd && cd $(dirname "${file}") && pycodestyle "${filename}"
fi
done
echo '===== Linting END ====='
'''
}
You need to mount the workspace of your Jenkins job (containing your Python project) as a volume into your container (see the docker run -v option) and then run the "next step" build step inside this container. You can do this by providing a shell script as part of your project's source code which performs the "next step", or by writing this script in a previous build stage.
It would be something like this:
sh "chmod +x build.sh"
sh "docker run -v $WORKSPACE:/workspace ${DOCKER_REGISTRY}/${DOCKER_TAG}:latest /workspace/build.sh"
build.sh is an executable script, which is part of your project's workspace and performs the "next step".
$WORKSPACE is the folder used by your Jenkins job (normally /var/jenkins_home/jobs/<job_name>/workspace); it is provided by Jenkins as a build variable.
Please note: This solution requires that the Docker daemon is running on the same host as Jenkins! Otherwise the workspace will not be available to your container.
Another solution would be to run Jenkins itself as a Docker container, so you can easily share the Jenkins home/workspaces with the containers you run within your build jobs, as described here:
Running Jenkins tests in Docker containers build from dockerfile in codebase
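As a side note, if the Docker Pipeline plugin is available, its inside() step mounts the workspace into the container automatically and runs the enclosed steps there; a minimal sketch using the image tag from your question:
stage('Linting') {
    // inside() starts the container with the workspace mounted and runs the steps in it
    docker.image("${DOCKER_REGISTRY}/${DOCKER_TAG}:latest").inside {
        sh 'pycodestyle --version'
    }
}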
I am trying to remove the directory junit located in the workspace of my Jenkins job using scripted Pipeline which looks somewhat like this:
node {
stage('Build') {
checkout scm
app = docker.build("...")
}
stage('Test') {
app.withRun("--name = ${CONTAINER_ID} ...") {
// sh "mkdir -p junit"
// sh "rm -rf junit/"
dir "junit" {
deleteDir
}
sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
}
}
junit 'junit/*.xml'
}
However, I am getting the following (red herring?) error:
java.lang.ClassCastException: hudson.tasks.junit.pipeline.JUnitResultsStep.testResults expects class java.lang.String but received class org.jenkinsci.plugins.workflow.cps.CpsClosure2
However, when I use the plain shell steps:
sh "mkdir -p junit"
sh "rm -rf junit/"
It works as expected. What am I doing wrong?
Use parentheses. Without them, Groovy parses dir "junit" { ... } as dir(junit({ ... })): the closure gets handed to the junit step, which expects a String for testResults, and that is exactly the ClassCastException you are seeing. deleteDir also needs () or it is read as a property instead of a step invocation:
dir ("junit") {
deleteDir()
}
I'm trying to create a virtualenv in a stage in Jenkins and to set the needed environment variables before the virtualenv can be created.
stage('create virtualenvironment') {
sh 'export PATH=/usr/local/bin/virtualenv:$PATH'
sh 'export VIRTUALENVWRAPPER_PYTHON=/usr/local/bin/python'
sh 'export VIRTUALENVWRAPPER_VIRTUALENV=/usr/local/bin/virtualenv'
sh 'source /usr/local/bin/virtualenvwrapper.sh'
echo 'createvirtualenvwrapper'
sh 'mkvirtualenv testproject'
}
When I execute this script, I get this message:
mkvirtualenv: command not found
When I print the above env variables, nothing is set. I am not sure the sh command is working as expected in a scripted pipeline.
I'm not 100% sure, but my guess is that when you do sh 'some command', it executes a shell script and is done.
So each of your sh commands is treated as a separate shell script: it executes the commands, stays alive only for that session, and closes once the script is done, so any exported variables are lost with it.
So combine all of the above commands into a single sh command, together with mkvirtualenv testproject, and it should work, as sketched below.
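A minimal sketch of that combined step, reusing the commands from your question (the #!/bin/bash shebang is added because source is a bash built-in and the sh step defaults to /bin/sh):
sh '''#!/bin/bash
export PATH=/usr/local/bin/virtualenv:$PATH
export VIRTUALENVWRAPPER_PYTHON=/usr/local/bin/python
export VIRTUALENVWRAPPER_VIRTUALENV=/usr/local/bin/virtualenv
# load virtualenvwrapper so mkvirtualenv is defined in this same shell session
source /usr/local/bin/virtualenvwrapper.sh
mkvirtualenv testproject
'''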
For readability, create a new shell script such as runProject.sh, put the above commands into it, and then you can just call
sh './runProject.sh'
(after making it executable, e.g. with chmod +x runProject.sh).
Hope it helps :)