Jenkinsfile agent fails to find source code file - docker

I have a Jenkins pipeline running in a Docker container. My pipeline consists of three stages: Build, Test, and Deliver. Each stage uses its own agent, and the Build and Test stages work perfectly. However, the Deliver stage fails because the cdrx/pyinstaller-linux:python2 agent that runs the pyinstaller command can't find the source code in the mounted volume. I verified that the file exists and is in the correct location, yet when the job gets to stage 3, "Deliver", it fails to find add2vals.py. Any idea why this is happening? I'm baffled.
Jenkinsfile Pipeline Script
pipeline {
    agent none
    options {
        skipStagesAfterUnstable()
    }
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
                stash(name: 'compiled-results', includes: 'sources/*.py*')
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
        stage('Deliver') {
            agent any
            environment {
                VOLUME = '$(pwd)/sources:/src'
                IMAGE = 'cdrx/pyinstaller-linux:python2'
            }
            steps {
                dir(path: env.BUILD_ID) {
                    unstash(name: 'compiled-results')
                    sh "docker run --rm -v ${VOLUME} ${IMAGE} 'pyinstaller -F add2vals.py'"
                }
            }
            post {
                success {
                    archiveArtifacts "${env.BUILD_ID}/sources/dist/add2vals"
                    sh "docker run --rm -v ${VOLUME} ${IMAGE} 'rm -rf build dist'"
                }
            }
        }
    }
}
EDIT
After about two days of almost full-time research and attempts to resolve this issue, I've been unable to. As of now I think there is a high likelihood of this being a bug in Docker: the files in the mounted volume simply are not visible at the path on the container they are mounted to. I will keep at it and update when I have something useful. If you encounter this, I highly suggest using DinD (Docker-in-Docker) as opposed to the Docker CLI installed on a Jenkins container. Note that this applies to a Windows 10 host with Docker Desktop installed, using Linux containers. Hope this is helpful for the time being.
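One workaround worth trying before switching to DinD is to let Groovy resolve the host path instead of passing $(pwd) through to the shell, since host-path translation on Windows hosts is a common source of empty mounts. Below is an untested sketch of a rewritten Deliver stage; it assumes the workspace directory is on a drive shared with Docker Desktop:

```groovy
stage('Deliver') {
    agent any
    environment {
        IMAGE = 'cdrx/pyinstaller-linux:python2'
    }
    steps {
        dir(path: env.BUILD_ID) {
            unstash(name: 'compiled-results')
            // Build the -v argument from Jenkins' own variables so the
            // host path is already a plain absolute path when Docker sees it,
            // rather than a $(pwd) substitution performed by the shell.
            sh "docker run --rm -v '${env.WORKSPACE}/${env.BUILD_ID}/sources:/src' ${IMAGE} 'pyinstaller -F add2vals.py'"
        }
    }
}
```

If the mount is still empty with an absolute path, that points at Docker Desktop's file-sharing settings rather than the pipeline itself.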

Related

How to set docker path for Jenkins local? docker: command not found

I am trying to run Zalenium from a local Jenkins installed on my Mac. I am able to execute tests locally from Eclipse by first spinning up Docker from the Terminal. Now I am trying to execute the tests via a pipeline.
Here's the pipeline code:
pipeline {
    agent any
    tools {
        maven 'M2_HOME'
        jdk 'JAVA_HOME'
    }
    stages {
        stage('Code and Dependencies') {
            parallel {
                stage('Checkout Code') {
                    steps {
                        git(url: 'https://github.com/xxxxx')
                    }
                }
                stage('Initialise Tools') {
                    steps {
                        tool(name: 'M2_HOME', type: 'maven')
                        tool(name: 'JAVA_HOME', type: 'jdk')
                    }
                }
                stage('Install Dependencies') {
                    steps {
                        sh 'docker pull elgalu/selenium'
                        sh 'docker pull dosel/zalenium'
                    }
                }
            }
        }
    }
}
Global tools configuration:
(screenshot of the Global Tool Configuration settings)
testuser@blr-ml-test ~ % which docker
/usr/local/bin/docker
testuser@blr-ml-test ~ % docker -v
Docker version 19.03.12, build 48a66213fe
But when I run the job, I get:
/Users/test/.jenkins/workspace/ZaleniumPipeline#tmp/durable-16989357/script.sh: line 1: docker: command not found
I am able to run from local Jenkins though. I suspect this is a PATH setting issue. I tried the solutions from a few similar questions but none worked for me. What am I doing wrong?
I am following this example: https://github.com/DevOpsPlayground/Hands-on-with-Continuous-Testing-using-Jenkins-and-Zalenium
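A common fix on macOS is to prepend the directory that holds the docker binary (here /usr/local/bin, matching the "which docker" output above) to PATH inside the pipeline itself, because Jenkins launched via launchd often inherits a minimal PATH. A minimal sketch, assuming that location:

```groovy
pipeline {
    agent any
    environment {
        // Prepend the directory containing the docker CLI so that
        // sh steps can find it even with launchd's stripped-down PATH.
        PATH = "/usr/local/bin:${env.PATH}"
    }
    stages {
        stage('Check docker') {
            steps {
                sh 'which docker && docker pull dosel/zalenium'
            }
        }
    }
}
```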

How to execute single step or post-build action on a docker host if Jenkins pipeline is dockerized?

Suppose I have a dockerized pipeline with multiple steps. The docker container is defined at the beginning of the Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'gradle:latest'
        }
    }
    stages {
        // multiple stages, all executed in 'gradle' container
    }
    post {
        always {
            sh 'git whatever-command' // will not work in 'gradle' container
        }
    }
}
I would like to execute some git commands in a post-build action. The problem is that the gradle image does not have a git executable.
script.sh: line 1: git: command not found
How can I execute it on the Docker host while still using the gradle container for all other build steps? Of course I do not want to explicitly specify a container for each step, only for that specific post-build action.
OK, below is my working solution, grouping multiple stages (Build and Test) in a single dockerized stage (Dockerized gradle) with a single workspace reused between the docker host and the docker container (see the reuseNode docs):
pipeline {
    agent {
        // the code will be checked out on one of the available docker hosts
        label 'docker'
    }
    stages {
        stage('Dockerized gradle') {
            agent {
                docker {
                    reuseNode true // <-- the most important part
                    image 'gradle:6.5.1-jdk11'
                }
            }
            stages {
                // Stages in this block will be executed inside a gradle container
                stage('Build') {
                    steps {
                        script {
                            sh "gradle build -x test"
                        }
                    }
                }
                stage('Test') {
                    steps {
                        script {
                            sh "gradle test"
                        }
                    }
                }
            }
        }
        stage('Cucumber Report') {
            // this stage will be executed on the docker host labeled 'docker'
            steps {
                cucumber 'build/cucumber.json'
            }
        }
    }
    post {
        always {
            sh 'git whatever-command' // this also works outside of the 'gradle' container and reuses the original workspace
        }
    }
}

Running docker container inside Jenkins pipeline

When I try to run a docker container inside a Jenkins pipeline, it fails (see log). Jenkins is local. Since there is a
Jenkins does not seem to be running inside a container
line in the console output, I assume it might be necessary to run a containerized Jenkins?
Dockerfile
FROM ubuntu
ENV customEnvVar="test."
Jenkinsfile
#!groovy
pipeline {
    agent { dockerfile true }
    stages {
        stage('Print env var') {
            steps {
                sh 'echo customEnvVar = $customEnvVar'
            }
        }
    }
}

Docker command not found in local Jenkins multi branch pipeline

I have a BookStore Spring Boot project that needs to be deployed through Jenkins. Docker is installed on my local machine (macOS) and the Jenkinsfile was created as follows:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            // This exposes the application through port 8081 to the outside world
            args '-u root -p 8081:8081 -v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
        stage('Test') {
            steps {
                //sh 'mvn test'
                sh 'echo "test"'
            }
            post {
                always {
                    //junit 'target/surefire-reports/*.xml'
                    sh 'echo "test"'
                }
            }
        }
        stage('Deliver for development') {
            when {
                branch 'development'
            }
            steps {
                sh './jenkins/scripts/deliver-for-development.sh'
                input message: 'Finished using the web site? (Click "Proceed" to continue)'
            }
        }
        stage('Deploy for production') {
            when {
                branch 'production'
            }
            steps {
                sh './jenkins/scripts/deploy-for-production.sh'
                input message: 'Finished using the web site? (Click "Proceed" to continue)'
            }
        }
        stage('Deliver') {
            when {
                branch 'production'
            }
            steps {
                sh 'bash ./jenkins/deliver.sh'
            }
        }
    }
}
I created a multi-branch pipeline in Jenkins, and when I try to run it I get the following error:
/Users/Shared/Jenkins/Home/workspace/BookStore_master-VPWQ32ZZPV7CVOXNI4XOB3VSGH56MTF3W34KXKZFJKOBMSGLRZQQ#tmp/durable-70dd5a81/script.sh: line 2: docker: command not found
script returned exit code 127
This looks strange to me, as docker is available on my local machine, and I also configured the Global Tool Configuration section with the appropriate details as shown below. I looked into several posts and none of the solutions worked so far.
I faced the same issue on my Mac and the following answer helped me:
docker: command not found (mac mini) only happens in a Jenkins shell step but works from the command prompt.
The solution is to add the following lines to the /usr/local/Cellar/jenkins-lts/2.176.3/homebrew.mxcl.jenkins-lts.plist file so that Jenkins is able to find the docker command from the host machine (restart the jenkins-lts service afterwards so launchd picks up the change).
<key>EnvironmentVariables</key>
<dict>
    <key>PATH</key>
    <string>/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Applications/Docker.app/Contents/Resources/bin/:/Users/Kh0a/Library/Group\ Containers/group.com.docker/Applications/Docker.app/Contents/Resources/bin</string>
</dict>
I had the same issue and was able to resolve it thanks to this thread https://stackoverflow.com/a/50029962/6943587.
You need to specify the docker label, aka which agent(s) have docker. There are two ways to do this, that I know of.
(Option 1 - preferred) Set docker label in Jenkinsfile
Set the agent as docker image with docker agent label.
// Jenkinsfile
pipeline {
    // Assign to docker agent(s) label, could also be 'any'
    agent {
        label 'docker'
    }
    stages {
        stage('Docker node test') {
            agent {
                docker {
                    // Set both label and image
                    label 'docker'
                    image 'node:7-alpine'
                    args '--name docker-node' // list any args
                }
            }
            steps {
                // Steps run in node:7-alpine docker container on docker agent
                sh 'node --version'
            }
        }
        stage('Docker maven test') {
            agent {
                docker {
                    // Set both label and image
                    label 'docker'
                    image 'maven:3-alpine'
                }
            }
            steps {
                // Steps run in maven:3-alpine docker container on docker agent
                sh 'mvn --version'
            }
        }
    }
}
(Option 2) Set docker label in configuration
Set the "docker label" in the Jenkins configuration under "Pipeline Model Definition", per the Jenkins docs here. This will only run the pipeline builds on agents with this label. Then you can create your pipeline like so...
// Jenkinsfile
pipeline {
    // "Top-level" agent is assigned to docker agents via Jenkins pipeline configuration
    agent none
    stages {
        stage('Docker node test') {
            agent {
                docker {
                    image 'node:7-alpine'
                    args '--name docker-node' // list any args
                }
            }
            steps {
                // Steps run in node:7-alpine docker container on docker agent
                sh 'node --version'
            }
        }
        stage('Docker maven test') {
            agent {
                docker {
                    image 'maven:3-alpine'
                }
            }
            steps {
                // Steps run in maven:3-alpine docker container on docker agent
                sh 'mvn --version'
            }
        }
    }
}
Hope this helps
Option 1 is preferred over option 2 because the Jenkinsfile configures what machine(s) to run the docker agents on without relying on the Jenkins pipeline configuration, which could be deleted or edited in the future.
Since you have chosen the "Install automatically" option in the Global Tool Configuration section, Jenkins will not look for docker on your system. You can resolve this issue by unchecking the "Install automatically" option for docker in the Global Tool Configuration section, downloading the docker installer, installing it, and giving the installation path to Jenkins. An example screenshot is below.
(screenshot: docker installer path set up in Jenkins under Global Tool Configuration)
I was able to solve this by retrieving the Docker and Maven tool locations from the Global Tool Configuration section and adding them to the PATH environment variable, as shown below.
Updated Jenkinsfile:
node {
    stage('Initialize') {
        def dockerHome = tool 'MyDocker'
        def mavenHome = tool 'MyMaven'
        env.PATH = "${dockerHome}/bin:${mavenHome}/bin:${env.PATH}"
    }
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        sh 'uname -a'
        sh 'mvn -B -DskipTests clean package'
    }
    stage('Test') {
        //sh 'mvn test'
        sh 'ifconfig'
    }
    stage('Deliver') {
        sh 'bash ./jenkins/deliver.sh'
    }
}
There seems to be an issue with the automated docker installer; I encountered the same problem with Docker on CentOS 7. I downloaded the docker CLI executables from https://download.docker.com/linux/static/stable/x86_64/ and extracted them into the Jenkins docker volume on the host (/var/lib/docker/volumes/jenkins_home/_data/docker), then copied them from /var/jenkins_home/docker to /usr/bin using a shell inside the docker container. After copying the executables, the build worked as expected.
In my case I had docker command issues because I was using jenkins-lts, which itself runs inside a Docker container. After trying to debug for quite a while, I realized that referencing the docker command from within a Docker container might be the issue. I stopped the jenkins-lts service, downloaded the jenkins.war file, and ran the same pipeline script with the docker command; it started working. My pipeline script has agent any, and it still works in the jenkins.war version of Jenkins.
If you are on Windows, follow the steps here:
https://www.katacoda.com/courses/jenkins/build-docker-images
Set the line separator to the Unix/macOS style ("\n") in your ".sh" files using your code editor. It worked for me.
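The same fix can be applied from a shell instead of an editor, by stripping the carriage returns with tr. A small sketch (the file name build.sh is just an example):

```shell
# A .sh file saved with Windows (CRLF) line endings
printf 'echo hello\r\n' > build.sh
# Strip the carriage returns so /bin/sh no longer sees a stray "\r"
tr -d '\r' < build.sh > build.sh.unix && mv build.sh.unix build.sh
# The script now runs cleanly
sh build.sh   # prints "hello"
```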
Add -v $(which docker):/usr/bin/docker when running the Jenkins container, so the host's docker binary is available inside it.

Deploy generated WAR to glassfish

I have a Jenkins container in docker.
When I build something successfully, I want to deploy it to a glassfish docker container.
https://docs.oracle.com/cd/E19798-01/821-1757/ghgmi/index.html
As mentioned on the given website, copying a WAR into the autodeploy folder will deploy it automatically. But how do I connect to the glassfish container?
https://github.com/jenkinsci/postbuildscript-plugin
With this plugin you can execute a script after building.
I use Jenkins Pipeline job to control my containers.
In that case, you can use something like this in your pipeline script:
node("YOUR_SLAVE_MACHINE_NAME") {
    stage('Build Image') {
        app = docker.build('NAME_OF_IMAGE:latest', '/jenkins_home/workspace/NAME_OF_THIS_JOB')
    }
    stage('Run container') {
        try {
            app.inside(' -p 8080:8080 ') { // or any properties you want to deliver
                sh '/usr/local/glassfish4/bin/asadmin start-domain'
                sh '/usr/local/glassfish4/bin/asadmin -u admin deploy /YOUR_APP.war'
                sh '/usr/local/glassfish4/bin/asadmin stop-domain'
                sh '/usr/local/glassfish4/bin/asadmin start-domain --verbose'
                sh 'sleep 10000d' // keep the container alive
            }
        }
        catch (exc) {
            echo 'Application container is down. ' + exc
        }
    }
}
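Alternatively, if the GlassFish instance runs in a separate, already-running container on the same host, one option is to copy the built WAR into its autodeploy folder from a post-build step. A sketch; the container name glassfish and the domain path are assumptions, so adjust them to your setup:

```groovy
post {
    success {
        // GlassFish scans the autodeploy directory and deploys the WAR automatically
        sh 'docker cp target/YOUR_APP.war glassfish:/usr/local/glassfish4/glassfish/domains/domain1/autodeploy/'
    }
}
```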
