How to configure a Jenkins job to build with Docker?

I'm attempting to build a branch using Jenkins and a 'docker in docker' container to build a container from source.
I define the Docker cloud instance here:
Should an extra tab be available that enables the job to use the Docker cloud instance set up above?
The job is a multi-branch pipeline:
But when I attempt to configure a job that uses the Docker cloud instance configured above, the option to build with Docker is not available:
The build log contains:
time="2021-04-04T14:27:16Z" level=error msg="failed to dial gRPC:
cannot connect to the Docker daemon. Is 'docker daemon' running on
this host?: dial unix /var/run/docker.sock: connect: no such file or
directory" error during connect: Post
http://%2Fvar%2Frun%2Fdocker.sock/v1.40/build?buildargs=%7B%7D&cachefrom=%5B%5D&cgroupparent=&cpuperiod=0&cpuquota=0&cpusetcpus=&cpusetmems=&cpushares=0&dockerfile=Dockerfile&labels=%7B%7D&memory=0&memswap=0&networkmode=default&rm=1&session=vgpahcarinxfh05klhxyk02gg&shmsize=0&t=ron%2Fml-services&target=&ulimits=null&version=1:
context canceled
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
[Bitbucket] Notifying commit build result
[Bitbucket] Build result notified
ERROR: script returned exit code 1
Finished: FAILURE
which suggests the build is looking for Docker on the same host as Jenkins, even though I'm attempting to build with Docker on a different host.
Have I configured Docker with Jenkins correctly?
My Jenkinsfile contains:
node {
    def app
    stage('Clone repository') {
        checkout scm
    }
    stage('Build image') {
        app = docker.build("ron/services")
    }
    stage('Push image') {
        docker.withRegistry('https://registry.hub.docker.com', 'git') {
            app.push("${env.BUILD_NUMBER}")
            app.push("latest")
        }
    }
}
Update:
After ticking the Expose DOCKER_HOST checkbox and rebuilding, the log contains this error:
+ docker build -t ron/services .
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
[Bitbucket] Notifying commit build result
[Bitbucket] Build result notified
ERROR: script returned exit code 1
Finished: FAILURE

The Docker CLI is trying to connect through the Docker socket in /var/run, which means no external daemon has been configured, for example via the DOCKER_HOST environment variable.
Try ticking the Expose DOCKER_HOST checkbox.
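As a rough illustration of what that option provides, the scripted pipeline below points the Docker CLI at a remote daemon by setting DOCKER_HOST explicitly. The tcp://10.241.0.198:2371 endpoint is only an assumption based on the cloud configuration discussed below; substitute whatever host and port your daemon actually listens on.
node {
    // Hypothetical endpoint; 'Expose DOCKER_HOST' injects a value like this automatically
    withEnv(['DOCKER_HOST=tcp://10.241.0.198:2371']) {
        stage('Build image') {
            // docker.build shells out to the docker CLI, which now talks to the
            // remote daemon instead of the local /var/run/docker.sock
            docker.build("ron/services")
        }
    }
}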

It's not clear if this is what you are trying to do, but configuring a Docker cloud tells Jenkins to launch a container on 10.241.0.198 (the client) and run your Jenkins job in that container. To make this work, there are a couple of things to check:
1. Ensure that the jenkins user on the Jenkins server can access port 2371 on the client, i.e. 'Test Connection' returns success.
2. Turn on 'Expose DOCKER_HOST' if you want to use Docker inside the container.
3. Configure SSH so that the jenkins user on the Jenkins server can ssh into the container while it is running on the client (CMD ["/usr/sbin/sshd", "-D"] in the Dockerfile).
4. In Docker Agent Template: configure a label, turn on 'Enabled', configure a Docker image to run in the container, and set Remote Filesystem Root to /home/jenkins.
5. In Container Settings (very important!): add /var/run/docker.sock:/var/run/docker.sock to Volumes.
To get your Pipeline job to run on that Docker image, set the agent label to the label you provided in step 4, as in the sketch below.
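As a minimal sketch, assuming the Docker Agent Template label is docker-agent (a placeholder name), the pipeline is pinned to it like this:
node('docker-agent') {
    stage('Build image') {
        // Runs inside the container provisioned by the Docker cloud; the docker CLI
        // reaches the host daemon through the socket mounted in step 5
        sh 'docker build -t ron/services .'
    }
}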
A couple of gotchas when creating the image to run in the container (a sketch of such a Dockerfile follows this list):
install both openssh-clients and openssh-server
install java
install any other build tools you might need, e.g. git
install Docker if you want docker-in-docker support
configure SFTP in /etc/ssh/sshd_config, e.g. add:
# override default of no subsystems
Subsystem sftp /usr/lib/openssh/sftp-server
Match group sftp
X11Forwarding no
AllowTCPForwarding no
ForceCommand internal-sftp
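Putting those gotchas together, a rough sketch of such an agent image might look like the following Dockerfile. The package names assume a Debian/Ubuntu base and are illustrative only; SSH credentials for the jenkins user and the full sshd_config changes above are left out.
FROM ubuntu:20.04

# Build tools, Java for the agent, an SSH server so the cloud can attach,
# and the docker CLI for docker-in-docker style builds
RUN apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y \
        openssh-client openssh-server \
        openjdk-11-jre-headless \
        git \
        docker.io \
    && mkdir -p /var/run/sshd

# The user Jenkins logs in as; its home doubles as the Remote Filesystem Root
RUN useradd -m -d /home/jenkins -s /bin/bash jenkins

# Apply the sftp settings shown above to /etc/ssh/sshd_config here,
# e.g. by COPYing a prepared sshd_config into the image

EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]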

Related

How to resolve connection refused when Jenkins in Docker connects to the Docker daemon?

I am trying to run the Python pipeline example.
I am able to start Jenkins in Docker and connect to Git, but I get a "docker not found" error when I trigger the job:
+ docker inspect -f . python:3.5.1
/var/jenkins_home/workspace/example-project#tmp/durable-6edc6c68/script.sh: 1: /var/jenkins_home/workspace/example-project#tmp/durable-6edc6c68/script.sh: docker: not found
[Pipeline] isUnix
[Pipeline] sh
+ docker pull python:3.5.1
/var/jenkins_home/workspace/example-project#tmp/durable-bd8f56f3/script.sh: 1: /var/jenkins_home/workspace/example-project#tmp/durable-bd8f56f3/script.sh: docker: not found
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
I have also tried another online solution by setting Docker to be installed automatically in Global Tool Configuration:
However, my Jenkins running in a Docker container is unable to connect to the Docker daemon when I configure the Docker plugin.
On the other hand, another Jenkins instance installed on Windows 10 is able to connect to the Docker daemon. What could be going wrong?

Docker and Jenkins integration [duplicate]

This question already has answers here:
Docker not found when building docker image using Docker Jenkins container pipeline
I've added the BitBucket server integration plugin (https://plugins.jenkins.io/atlassian-bitbucket-server-integration/) and can connect to the BitBucket cloud repo from Jenkins:
But I receive an error when I try to build:
/var/jenkins_home/workspace/bb_add-jenkins-file#tmp/durable-c49dbeca/script.sh: 1: /var/jenkins_home/workspace/bb_add-jenkins-file#tmp/durable-c49dbeca/script.sh: docker: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
[Bitbucket] Notifying commit build result
[Bitbucket] Build result notified
ERROR: script returned exit code 127
Finished: FAILURE
So it seems I need to install Docker on the Jenkins instance?
https://plugins.jenkins.io/docker-build-publish/
I'm following this tutorial to configure Docker with Jenkins: https://medium.com/@karthi.net/docker-tutorial-build-docker-images-using-jenkins-d2880e65b74
and have reached this step:
On my own Jenkins Docker setup page I have:
I'm unsure what Docker URL should be used. Do I need to provision a new container instance within the Kubernetes cluster and run Docker within it? Would this new Docker instance then be the Docker Host URI field?
I think that plugin requires the docker cli to be present.
If you run Jenkins as a Docker image itself, use an image that provides the Docker CLI, for example https://hub.docker.com/r/trion/jenkins-docker-client
If you want to use the host docker daemon for building, you need to bind-mount the docker socket.
If you want to use a sidecar container to provide the docker daemon, for example using a docker-in-docker setup you can usually use the container name as docker host or kubernetes service name. This depends on how you provide the sidecar container and there is no general answer to that.
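For the bind-mount option, a typical starting point looks something like this. The image is the one suggested above, the ports are the standard Jenkins UI and agent ports, and the named volume for JENKINS_HOME is just an assumption; treat it as a sketch rather than a hardened setup.
# The jenkins user inside the container must be allowed to use the mounted socket
docker run -d --name jenkins \
    -p 8080:8080 -p 50000:50000 \
    -v jenkins_home:/var/jenkins_home \
    -v /var/run/docker.sock:/var/run/docker.sock \
    trion/jenkins-docker-client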

Jenkins declarative pipeline problem when running docker-in-docker

I just encountered a problem when running a Jenkins declarative pipeline on a Jenkins server that is itself running inside Docker, with access to the host's docker.sock.
The structure of the pipeline is rather simple:
pipeline {
    agent {
        docker { image 'gradle:jdk11' }
    }
    stages {
        stage('Checkout') {
            steps {
                // ...
            }
        }
        stage('Assemble public API documentation') {
            environment {
                // ...
            }
            steps {
                // ...
            }
        }
        stage('Generate documentation') {
            steps {
                // ...
            }
        }
        stage('Upload documentation to Firebase') {
            agent {
                docker {
                    image 'node:12'
                    reuseNode false
                }
            }
            steps {
                // ...
            }
        }
    }
}
The idea is to run three stages in the first container, and then create a new container for the final stage.
The following is printed when entering the last stage:
[Pipeline] stage
[Pipeline] { (Upload documentation to Firebase)
[Pipeline] getContext
[Pipeline] isUnix
[Pipeline] sh
+ docker inspect -f . node:12
/var/jenkins_home/workspace/publish_public_api_doc#tmp/durable-bc4d65d1/script.sh: 1: /var/jenkins_home/workspace/publish_public_api_doc#tmp/durable-bc4d65d1/script.sh: docker: not found
[Pipeline] isUnix
[Pipeline] sh
+ docker pull node:12
/var/jenkins_home/workspace/publish_public_api_doc#tmp/durable-297d223a/script.sh: 1: /var/jenkins_home/workspace/publish_public_api_doc#tmp/durable-297d223a/script.sh: docker: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
$ docker stop --time=1 367647f97c9eed52bf85c13c2bc2203bb7194adac803d37cab0e0d0435325efa
$ docker rm -f 367647f97c9eed52bf85c13c2bc2203bb7194adac803d37cab0e0d0435325efa
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
I don't really understand what is happening here.
In order to debug this, I logged in to that machine, and ran the docker command from the host, as well as from inside the running Jenkins container, and it was working.
The way this is set up is that the Docker client is installed in the image, i.e. the binary itself is not shared into the container.
Since the docker command is "not found", the only explanation that I have is that the docker command to start the agent for the final stage is not executed in the "top-level" Jenkins container, but in the JDK one, which does not have the docker executable inside.
This, however, would seem unexpected, if not a bug.
I'd be thankful if anyone was shedding some light on this.
Jenkins pipeline agents/nodes
Your pipeline has specified an agent to run on at the top-most level. The pipeline will execute all commands on that agent (or within a docker container, in your scenario) until another agent is specified. When a new agent is specified, the top-level agent will connect to it via some protocol, and the new agent will execute all pipeline stages/steps that are within that agent's scope. Once out of scope, the connection to the new agent is closed and the top-level agent once again executes all commands.
What's causing the error?
The fourth stage attempts to change the execution context to a new agent. The current agent, the gradle:jdk11 container, will execute the steps needed to connect to this new agent. As the new agent is a Docker container, the gradle:jdk11 container will attempt to use the docker command itself to spin up the new container.
As you suspected there is no docker binary/service within this container.
Why is this the expected behaviour?
Assume that the top-level agent is a different physical machine connected via TCP or SSH, rather than a docker container. This machine would need to have all the tools installed on it for compiling, generating docs, running unit tests, etc. E.g. it wouldn't use the doxygen binary installed on the Jenkins master, as it should provide this itself (throwing errors if doxygen doesn't exist in the $PATH). Likewise, this machine would need Docker installed to spin up the container in the fourth stage.
How can I get my pipeline working?
You could create your own custom Docker image inheriting from gradle:jdk11 and share the host system's Docker. This would allow your custom image to spin up the Docker image required in the fourth stage. You would use agent { docker { image 'my-custom-img' } } at a global scope.
Alternatively you could use the master agent (or other physical machines) at a global scope and have each stage spin up its own container. Each stage would have a clean working environment, so you'd need to use stash/unstash or a mounted volume to share src/docs between stages.
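Here is a sketch of the first option, assuming a hypothetical my-custom-img built from gradle:jdk11 with the docker CLI added, sharing the host daemon through the socket:
pipeline {
    agent {
        docker {
            // Hypothetical image: gradle:jdk11 plus the docker CLI
            image 'my-custom-img'
            // Share the host daemon so this agent can start the node:12 container below
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    stages {
        stage('Generate documentation') {
            steps {
                // gradle and docker are both available inside this agent
                sh 'gradle --version && docker version'
            }
        }
        stage('Upload documentation to Firebase') {
            agent {
                docker {
                    image 'node:12'
                    reuseNode false
                }
            }
            steps {
                sh 'node --version'
            }
        }
    }
}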

How to run docker commands inside Jenkins pipeline jobs

In my Manage Jenkins > Global Tool Configuration, I have already configured a tool called "docker" as follows:
name: docker
install automatically: CHECKED
docker version: latest
Then all I have in my jenkinsfile is the following and nothing else:
node {
    DOCKER_HOME = tool "docker"
    sh """
        echo $DOCKER_HOME
        ls $DOCKER_HOME/bin/
        $DOCKER_HOME/bin/docker images
        $DOCKER_HOME/bin/docker ps -a
    """
}
I get an error like this: "Cannot connect to the Docker daemon. Is the docker daemon running on this host?"
Following is the full console log:
Started by user Syed Rakib Al Hasan
[Pipeline] node
Running on master in /var/jenkins_home/workspace/helloDocker
[Pipeline] {
[Pipeline] tool
[Pipeline] sh
[helloDocker] Running shell script
+ echo /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker
/var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker
+ ls /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker/bin/
docker
+ /var/jenkins_home/tools/org.jenkinsci.plugins.docker.commons.tools.DockerTool/docker/bin/docker images
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
How do I ensure that the Docker daemon/service is running/started before my pipeline reaches the line that runs the docker commands?
Is there another, more native way (e.g. via the docker-build-step plugin) to achieve what I am doing here, such as docker ps -a, docker images, or docker build -t?
Some assumptions:
Let's say my chosen node does not already have docker/docker-engine installed/running on the host machine. That's the purpose of the tool command: to automatically install Docker on the node if it is not already there.
This Jenkins plugin provides only the Docker client; I'd work around it by:
setting up Jenkins slaves where a Docker daemon is reachable, and adding a label to them
setting up a housekeeping job that fails if the Docker daemon is not reachable (so the infra team is notified without QA having to figure out and escalate the problem)
assigning jobs that assume the Docker daemon to be reachable to this label
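For example, with a label such as docker (an arbitrary name) attached to the slaves that can reach a daemon, a job can be pinned to those slaves and fail fast if the daemon is unreachable:
node('docker') {
    stage('Docker sanity check') {
        // Fails immediately if no daemon is reachable from this agent,
        // which is essentially what the housekeeping job above verifies
        sh 'docker info'
    }
}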
I hope it helps, and I'm curious if any of you have a better solution!

Cannot get Jenkins Docker slave to build docker images

I am currently experimenting with Docker in combination with Jenkins to streamline the CI/CD workflow for a new project. I do so on a Mac with Docker 1.12 installed.
This is what I do:
Use docker machine to create a new Docker server
Use the official Jenkins Docker image to spin up a Jenkins instance on that server
Install the "Yet Another Docker Plugin" and "CloudBees Docker Pipeline" plugins.
Add a "Docker Cloud" using the IP of the Docker server above and the third party Docker DinD image tehranian/dind-jenkins-slave
With this setup, I run a very simple pipeline job like this:
node('docker') {
    docker.image('hseeberger/scala-sbt').inside {
        stage 'Checkout'
        echo 'We got here!'
    }
}
Jenkins spins up a Docker instance as expected and executes the job. So the basic Docker setup is working as expected.
But the Docker command within the job fails. Log output looks something like this:
[Pipeline] node
Still waiting to schedule task
Docker-23ebf3d8dd4f is offline
Running on Docker-23ebf3d8dd4f in /home/jenkins/workspace/docker-test
[Pipeline] {
[Pipeline] sh
[docker-test] Running shell script
+ docker inspect -f . hseeberger/scala-sbt
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
[Pipeline] sh
[docker-test] Running shell script
+ docker pull hseeberger/scala-sbt
Using default tag: latest
Warning: failed to get default registry endpoint from daemon (Cannot connect to the Docker daemon. Is the docker daemon running on this host?). Using system default: https://index.docker.io/v1/
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Now when I browse around for solutions, it is usually mentioned that the Docker socket needs to be provided to the container as a volume, but that doesn't seem to work either.
Since the general setup seems to be working, wouldn't the slave simply have to do the same thing as the Jenkins plugin does to spin up the Docker slave in the first place? That is, use the URL of the Docker server to control it? Since I assume this is an extremely common use-case, there must be a Docker image for Jenkins Docker slaves that can do this out of the box, right? What am I missing?
You might need to set the Docker environment variables, i.e. use the output of docker-machine env node in your shell script.
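A rough sketch of that idea, assuming docker-machine is available on the agent and the machine created above was named default (a placeholder):
node('docker') {
    // The exported variables only apply within this shell step, so configure
    // the environment and run the docker commands in the same block
    sh '''
        eval "$(docker-machine env default)"
        docker info
    '''
}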

Resources