I have a BookStore Spring Boot project that needs to be deployed through Jenkins. Docker is installed on my local machine (macOS), and I created the following Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            //This exposes application through port 8081 to outside world
            args '-u root -p 8081:8081 -v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
        stage('Test') {
            steps {
                //sh 'mvn test'
                sh 'echo "test"'
            }
            post {
                always {
                    //junit 'target/surefire-reports/*.xml'
                    sh 'echo "test"'
                }
            }
        }
        stage('Deliver for development') {
            when {
                branch 'development'
            }
            steps {
                sh './jenkins/scripts/deliver-for-development.sh'
                input message: 'Finished using the web site? (Click "Proceed" to continue)'
            }
        }
        stage('Deploy for production') {
            when {
                branch 'production'
            }
            steps {
                sh './jenkins/scripts/deploy-for-production.sh'
                input message: 'Finished using the web site? (Click "Proceed" to continue)'
            }
        }
        stage('Deliver') {
            when {
                branch 'production'
            }
            steps {
                sh 'bash ./jenkins/deliver.sh'
            }
        }
    }
}
I created a multi-branch pipeline in Jenkins, and when I try to run it I get the following error:
/Users/Shared/Jenkins/Home/workspace/BookStore_master-VPWQ32ZZPV7CVOXNI4XOB3VSGH56MTF3W34KXKZFJKOBMSGLRZQQ#tmp/durable-70dd5a81/script.sh: line 2: docker: command not found
script returned exit code 127
This looks strange to me, as Docker is available on the local machine, and I have also configured the Global Tool Configuration section with the appropriate details as shown below. I have looked into several posts, and none of the solutions have worked so far.
I faced the same issue on the Mac, and the following answer helped me:
docker: command not found (mac mini) — it only happens in a Jenkins shell step but works from the command prompt.
The solution is to add the following lines to the /usr/local/Cellar/jenkins-lts/2.176.3/homebrew.mxcl.jenkins-lts.plist file so that Jenkins is able to find the docker command from the host machine:
<key>EnvironmentVariables</key>
<dict>
    <key>PATH</key>
    <string>/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Applications/Docker.app/Contents/Resources/bin/:/Users/Kh0a/Library/Group\ Containers/group.com.docker/Applications/Docker.app/Contents/Resources/bin</string>
</dict>
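After editing the plist, the Jenkins service has to be reloaded before it sees the new PATH. Assuming a Homebrew jenkins-lts install (as in the answer above), something like:

```shell
# Reload the service so the new EnvironmentVariables take effect
brew services restart jenkins-lts

# Sanity check: this is the docker location that must be covered
# by one of the directories in the plist's PATH string
which docker
```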
I had the same issue and was able to resolve it thanks to this thread: https://stackoverflow.com/a/50029962/6943587.
You need to specify the docker label, i.e., which agent(s) have Docker. There are two ways to do this, that I know of.
(Option 1 - preferred) Set docker label in Jenkinsfile
Set the agent as docker image with docker agent label.
// Jenkinsfile
// Jenkinsfile
pipeline {
    // Assign to docker agent(s) label, could also be 'any'
    agent {
        label 'docker'
    }
    stages {
        stage('Docker node test') {
            agent {
                docker {
                    // Set both label and image
                    label 'docker'
                    image 'node:7-alpine'
                    args '--name docker-node' // list any args
                }
            }
            steps {
                // Steps run in node:7-alpine docker container on docker agent
                sh 'node --version'
            }
        }
        stage('Docker maven test') {
            agent {
                docker {
                    // Set both label and image
                    label 'docker'
                    image 'maven:3-alpine'
                }
            }
            steps {
                // Steps run in maven:3-alpine docker container on docker agent
                sh 'mvn --version'
            }
        }
    }
}
(Option 2) Set docker label in configuration
Set the "docker label" in the Jenkins configuration under "Pipeline Model Definition", per the Jenkins docs here. This will only run the pipeline builds on agents with this label. Then you can create your pipeline like so...
// Jenkinsfile
// Jenkinsfile
pipeline {
    // "Top-level" agent is assigned to docker agents via Jenkins pipeline configuration
    agent none
    stages {
        stage('Docker node test') {
            agent {
                docker {
                    image 'node:7-alpine'
                    args '--name docker-node' // list any args
                }
            }
            steps {
                // Steps run in node:7-alpine docker container on docker agent
                sh 'node --version'
            }
        }
        stage('Docker maven test') {
            agent {
                docker {
                    image 'maven:3-alpine'
                }
            }
            steps {
                // Steps run in maven:3-alpine docker container on docker agent
                sh 'mvn --version'
            }
        }
    }
}
Hope this helps
Option 1 is preferred over option 2 because the Jenkinsfile configures which machine(s) to run the docker agents on, without relying on the Jenkins pipeline configuration, which could be deleted or edited in the future.
Since you have chosen the install automatically option in the Global Tool Configuration section, Jenkins will not look for docker on your system.
You can resolve this issue by:
unchecking the install automatically option for docker in the Global Tool Configuration section,
downloading the docker installer,
installing it, and
giving the installation path to Jenkins.
Example screenshot: the docker installer path set up in Jenkins under Global Tool Configuration.
I was able to solve this by retrieving the Docker and Maven locations from the Global Tool Configuration section and adding them to the PATH environment variable, as shown below.
Updated Jenkinsfile:
node {
    stage('Initialize') {
        def dockerHome = tool 'MyDocker'
        def mavenHome = tool 'MyMaven'
        env.PATH = "${dockerHome}/bin:${mavenHome}/bin:${env.PATH}"
    }
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        sh 'uname -a'
        sh 'mvn -B -DskipTests clean package'
    }
    stage('Test') {
        //sh 'mvn test'
        sh 'ifconfig'
    }
    stage('Deliver') {
        sh 'bash ./jenkins/deliver.sh'
    }
}
There seems to be an issue with the automated docker installer. I encountered the same issue with docker on CentOS 7.
I downloaded the docker CLI executables from https://download.docker.com/linux/static/stable/x86_64/ and extracted them into the jenkins docker volume on the host (/var/lib/docker/volumes/jenkins_home/_data/docker). Then I copied them from /var/jenkins_home/docker to /usr/bin using a shell on the docker container.
After copying the executables, the build worked as expected.
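A rough sketch of those steps (the version number and the container name jenkins are examples, not from the original answer):

```shell
# Download a static docker CLI build (pick a version from the download page)
curl -fsSL https://download.docker.com/linux/static/stable/x86_64/docker-19.03.12.tgz \
    -o /tmp/docker.tgz
tar -xzf /tmp/docker.tgz -C /tmp

# Copy the client binary into the Jenkins container so `docker` is on PATH
docker cp /tmp/docker/docker jenkins:/usr/bin/docker
```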
In my case I had docker command issues because I was using jenkins-lts, which itself runs as a Docker container. After trying to debug for quite a while, I realized that referencing the docker command from within a Docker container might be the issue. I stopped the jenkins-lts service, downloaded the jenkins.war file, and ran the same pipeline script with the docker command. It started working. My pipeline script has agent any, and it still works in the jenkins.war version of Jenkins.
If you are on Windows, follow from here:
https://www.katacoda.com/courses/jenkins/build-docker-images
Just apply the Unix/macOS line separator ("\n") to your ".sh" files in your code editor. It worked for me.
Add -v $(which docker):/usr/bin/docker when running the container.
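In context, that flag mounts the host's docker binary into the Jenkins container alongside the daemon socket. A minimal sketch (the container name, image, and ports are examples):

```shell
# Share both the docker CLI and the daemon socket with the container.
# Caveat: mounting the host binary only works if its shared-library
# dependencies also exist inside the container image.
docker run -d --name jenkins \
    -v "$(which docker)":/usr/bin/docker \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 8080:8080 \
    jenkins/jenkins:lts
```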
Related
I am trying to run Zalenium from a Jenkins instance installed locally on my Mac. I am able to execute tests locally from Eclipse by first spinning up Docker from the Terminal. Now I am trying to execute the tests via a pipeline.
Here's the pipeline code:
pipeline {
    agent any
    tools {
        maven 'M2_HOME'
        jdk 'JAVA_HOME'
    }
    stages {
        stage('Code and Dependencies') {
            parallel {
                stage('Checkout Code') {
                    steps {
                        git(url: 'https://github.com/xxxxx')
                    }
                }
                stage('Initialise Tools') {
                    steps {
                        tool(name: 'M2_HOME', type: 'maven')
                        tool(name: 'JAVA_HOME', type: 'jdk')
                    }
                }
                stage('Install Dependencies') {
                    steps {
                        sh 'docker pull elgalu/selenium'
                        sh 'docker pull dosel/zalenium'
                    }
                }
            }
        }
    }
}
Global tools configuration:
testuser@blr-ml-test ~ % which docker
/usr/local/bin/docker
testuser@blr-ml-test ~ % docker -v
Docker version 19.03.12, build 48a66213fe
But when I run the job, I get:
/Users/test/.jenkins/workspace/ZaleniumPipeline#tmp/durable-16989357/script.sh: line 1: docker: command not found
I am able to run it locally, though. I suspect this is a PATH issue. I tried a few similar questions, but none of them worked for me. What am I doing wrong?
I am following this example: https://github.com/DevOpsPlayground/Hands-on-with-Continuous-Testing-using-Jenkins-and-Zalenium
Hi all,
I am trying to use Jenkins to build a Java app with Maven (https://jenkins.io/doc/tutorials/build-a-java-app-with-maven/).
The problem is that my environment has to use a private proxy to access the network.
My Jenkins runs in a container, and when I use the pipeline below, the Jenkins container pulls the maven image and runs Maven in a container. But because the environment has a proxy and the maven container is not configured to use it, Maven cannot download its dependencies.
Can anyone help me configure the maven container to use a proxy? Thanks.
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
Did you try using withEnv to set the proxy_host environment variable?
Also take a look at:
https://git.oneapi-project.net/microgateway/plugins/backend-jwt/commit/7287ff0516d5c9a91f5f34b37ec2d6e80350f5c7
https://issues.jenkins-ci.org/browse/JENKINS-43077
https://wiki.jenkins.io/display/JENKINS/JenkinsBehindProxy
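For illustration, a withEnv-based sketch might look like the following. The proxy host and port are placeholders, and whether Maven honors these JVM system properties depends on the transport in use, so treat this as an experiment rather than the documented route:

```groovy
// Hypothetical sketch: pass proxy system properties to the Maven JVM
// via MAVEN_OPTS (proxy.example.com and 3128 are placeholders)
stage('Build') {
    steps {
        withEnv(['MAVEN_OPTS=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 ' +
                 '-Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=3128']) {
            sh 'mvn -B -DskipTests clean package'
        }
    }
}
```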
First: to use a corporate proxy with Maven, you need to configure it in your settings.xml (see Configuring a proxy in the Maven documentation).
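A minimal settings.xml proxy section might look like this (the id, host, and port values are placeholders):

```xml
<!-- ~/.m2/settings.xml -->
<settings>
  <proxies>
    <proxy>
      <id>corporate-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>
      <port>3128</port>
      <nonProxyHosts>localhost</nonProxyHosts>
    </proxy>
  </proxies>
</settings>
```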
Second: in a pipeline running in a container, you can do this in several ways. Here are some of them:
Given you are mapping /root/.m2:/root/.m2, you could simply put your settings.xml in /root/.m2. This assumes you're running Jenkins as root, which I strongly discourage in production for security reasons. If you're running Jenkins as a different user, adapt the volume mapping accordingly.
Map your settings.xml file as a volume in your container and tell Maven to use it, like so:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2 -v /path/to/settings.xml:/my/settings.xml:ro'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -s /my/settings.xml -B -DskipTests clean package'
            }
        }
    }
}
Store the settings.xml file as a credential in Jenkins and use it with withCredentials, which makes sense if your settings.xml contains passwords:
...
steps {
    withCredentials([
        file(credentialsId: 'maven-settings', variable: 'MAVEN_SETTINGS')
    ]) {
        sh 'mvn -s $MAVEN_SETTINGS -B -DskipTests clean package'
    }
}
...
Instead of using the official Maven image maven:3-alpine, build a custom image containing your settings.xml file and use it in your pipeline.
I have a Jenkins pipeline running in a Docker container. My pipeline consists of three stages: Build, Test, and Deliver. Each stage uses an agent; the Build and Test stages work perfectly. However, for some reason the Deliver stage fails because the cdrx/pyinstaller-linux:python2 agent that runs the pyinstaller command can't find the source code in the mounted volume. I verified that the file exists and is in the correct location, yet when the job gets to the third stage, Deliver, it fails to find add2vals.py. Any idea why this is happening? I'm baffled, miffed, jaded.
Jenkinsfile Pipeline Script
pipeline {
    agent none
    options {
        skipStagesAfterUnstable()
    }
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
                stash(name: 'compiled-results', includes: 'sources/*.py*')
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
        stage('Deliver') {
            agent any
            environment {
                VOLUME = '$(pwd)/sources:/src'
                IMAGE = 'cdrx/pyinstaller-linux:python2'
            }
            steps {
                dir(path: env.BUILD_ID) {
                    unstash(name: 'compiled-results')
                    sh "docker run --rm -v ${VOLUME} ${IMAGE} 'pyinstaller -F add2vals.py'"
                }
            }
            post {
                success {
                    archiveArtifacts "${env.BUILD_ID}/sources/dist/add2vals"
                    sh "docker run --rm -v ${VOLUME} ${IMAGE} 'rm -rf build dist'"
                }
            }
        }
    }
}
EDIT
After about two days of almost full-time research and attempts to resolve this issue, I've been unable to. As of now I think there is a high likelihood of this being a bug in Docker: the files in the mounted volume are simply not visible at the path on the container they are mounted to, plain and simple. So be advised; I will keep at it and update when I have something useful. If you encounter this, I highly suggest using DinD (Docker-in-Docker) as opposed to the Docker CLI installed on a Jenkins container. Note this applies to a Windows 10 host with Docker Desktop installed, using Linux containers. Hope this is helpful for the time being.
I have Jenkins running on an EC2 instance, with Docker installed on it. I am using a pipeline job and want to run tests using docker-compose. Which pipeline agent can I use?
The agent should be able to handle docker-compose and make commands.
I have done this, but I am not using docker-compose; instead I am using a normal Dockerfile, similar to this:
pipeline {
    agent { dockerfile true }
    stages {
        stage('Test') {
            steps {
                sh 'node --version'
                sh 'svn --version'
            }
        }
    }
}
You can also find more here: https://jenkins.io/doc/book/pipeline/docker/
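If docker-compose is installed on the EC2 host itself, agent any should be enough and the steps can simply shell out to it. A minimal sketch (the compose file name and make target are assumptions, not from the question):

```groovy
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                // Runs with the docker-compose and make installed on the EC2 host
                sh 'docker-compose -f docker-compose.test.yml up --abort-on-container-exit'
                sh 'make test'
            }
        }
    }
}
```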
Why is docker not found when I use docker as an agent in a Jenkins pipeline?
+ docker inspect -f . node:7-alpine
/var/jenkins_home/workspace/poobao-aws-services#tmp/durable-13f890b0/script.sh: 2: /var/jenkins_home/workspace/project-name#tmp/durable-13f890b0/script.sh: docker: not found
In Global Tool Configuration, I have docker set to install automatically.
I have docker set to install automatically, with a declarative pipeline, as follows.
My Jenkinsfile then has this initialization stage (adapted from here):
stage('Install dependencies') {
    steps {
        script {
            def dockerTool = tool name: 'docker', type: 'org.jenkinsci.plugins.docker.commons.tools.DockerTool'
            withEnv(["DOCKER=${dockerTool}/bin"]) {
                // stages go here; we can now trigger: sh "sudo ${DOCKER}/docker ..."
            }
        }
    }
}
When built, it then installs automatically...