I'm using the JKube Maven plugin to generate a Docker image via a Jenkins pipeline on an AWS EC2 instance running Ubuntu.
When the pipeline executes mvn clean install k8s:build, I get this error:
[ERROR] Failed to execute goal org.eclipse.jkube:kubernetes-maven-plugin:1.3.0:build (default-cli) on project social-carpooling-frontend: Execution default-cli of goal org.eclipse.jkube:kubernetes-maven-plugin:1.3.0:build failed: No <dockerHost> given, no DOCKER_HOST environment variable, no read/writable '/var/run/docker.sock' or '//./pipe/docker_engine' and no external provider like Docker machine configured -> [Help 1]
And this is the Jenkins pipeline :
pipeline {
    agent any
    stages {
        stage('Docker Check Stage') {
            steps {
                sh '/home/bitnami/downloads/apache-maven-3.8.1/bin/mvn clean install k8s:build -Premote'
            }
        }
    }
}
When I log in to this machine via ssh and execute docker -v, it reports Docker version 20.10.0, build 7287ab3.
So Docker really is installed and the daemon is running, but when I trigger the build via Maven it doesn't find it!
Any ideas?
The problem was that the user running the Maven command didn't have access to docker.sock.
The solution is to modify the read/write permissions on docker.sock like this:
sudo chmod 776 /var/run/docker.sock
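Note that permissions set with chmod on the socket are reset when the Docker daemon restarts, and widening them this far has security implications. A more durable alternative is a sketch like the following, assuming the pipeline runs as a user named jenkins (the user name is an assumption):

```shell
# Alternative fix: members of the 'docker' group get read/write access
# to /var/run/docker.sock without changing the socket's mode bits.
# 'jenkins' is an assumed user name; substitute the user running Maven.
sudo usermod -aG docker jenkins

# Group membership only takes effect in new sessions, so restart the
# Jenkins agent (or log out and back in) before re-running the pipeline.
```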
Hello, I want to build some Maven projects in Docker.
During the build process, additional Docker containers are spawned, so I want to isolate them inside... Docker.
For example:
Develop-branch[database-container, core-container]
Feature(1..n)[database-container, core-container]
I tried using the Jenkins Docker plugin and configured the Linux server as a cloud using the Unix socket.
For the build container I use docker:dind with Maven, Java, and Git installed inside:
FROM docker:dind
RUN apk add openjdk11 maven git
Compiling works, but when the build step calls docker I get the error:
[ERROR] DOCKER> Cannot create docker access object [No such file or directory]
I'm trying to use a pipeline build in OpenShift 3.9 where I need the Docker CLI. I can't figure out how to make the docker command available in my pipeline.
I've tried the code below with a declarative pipeline, but I'm getting "docker: command not found":
pipeline {
    agent {
        docker { image 'node:7-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                sh 'node --version'
            }
        }
    }
}
The code was copied from here:
https://jenkins.io/doc/book/pipeline/docker/
I also tried the scripted version of it:
node {
    /* Requires the Docker Pipeline plugin to be installed */
    docker.image('node:7-alpine').inside {
        stage('Test') {
            sh 'node --version'
        }
    }
}
But getting the same error: "docker: command not found"
The docker pipeline plugin is installed (version: 1.17)
Openshift version: 3.9
Any suggestions? Thank you!
It seems that you don't have Docker installed on the node where your pipeline runs. You need to install it first. You cannot use the docker command right after installing the Docker plugin without pre-configuration:
By default, the Docker Pipeline plugin will communicate with a local Docker daemon, typically accessed through /var/run/docker.sock.
OpenShift provides Jenkins slave images of three types:
Maven supported
Node.js supported
Base image
None of them has docker installed and, believe me, that's not a good idea.
In OpenShift, Jenkins runs as a pod (with the docker container running inside), and you want docker within that container. So if you want to make docker available:
create a Jenkins slave image by extending the base image and adding docker to it,
push it to the registry,
start using it!
But do the above only if you really, really want it; just a word of caution ;)
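If you do go that route, the extension itself is small. A minimal sketch of the three steps above (the base image, registry name, and package manager are assumptions; adapt them to your cluster):

```shell
# Hypothetical sketch: extend an OpenShift Jenkins slave base image with the Docker CLI.
# The base image name below is an assumption.
cat > Dockerfile <<'EOF'
FROM openshift/jenkins-slave-base-centos7
USER root
RUN yum install -y docker && yum clean all
USER 1001
EOF

# Build and push to a registry the cluster can pull from (registry name is an assumption).
docker build -t registry.example.com/jenkins-slave-docker:latest .
docker push registry.example.com/jenkins-slave-docker:latest
```

The slave image can then be selected for the job via its Kubernetes pod template.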
When I tried to run a Jenkins pipeline project, it failed with this message under "docker pull node:6-alpine":
<.jenkins/workspace/simple-node-js-react-npm-app#tmp/durable-431710c5/script.sh: line 2: docker: command not found
script returned exit code 127>
I have no idea what's going on here, and I couldn't access the directory mentioned in the error. I'm pretty new to Jenkins.
As mentioned here, using the Jenkins Docker plugin or the Jenkins Docker Pipeline plugin is not enough to let a node use Docker.
You still need to install Docker on the node itself.
Please follow the steps below:
Install the Docker engine (yum install docker) on the server where Jenkins is running.
Verify Docker is installed: run the command which docker.
In Jenkins, go to Manage Plugins and install the Docker plugin.
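The first two steps above can be sketched as shell commands (this assumes a RHEL/CentOS-family host, matching the yum command above):

```shell
# 1. Install the Docker engine on the server where Jenkins runs
sudo yum install -y docker

# 2. Verify the CLI is on the PATH, then start the daemon and
#    enable it at boot (systemd hosts)
which docker
sudo systemctl enable --now docker

# Step 3, the Docker plugin, is installed from the Jenkins UI
# under Manage Jenkins > Manage Plugins.
```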
I have a Jenkins pipeline script which creates a Docker image and deploys it to Docker Hub. I have installed the Docker plugin, but it complains "docker command not found". I'm not sure whether I need to install Docker on the same machine or something else needs to happen.
Yes, you have to install Docker on the slave machine that runs the pipeline script. I would suggest adding a label docker to the slave that has Docker installed and then using the pipeline script as:
node('docker') {
    ...
}
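In a declarative pipeline, the same label-based routing would look like this (a sketch; the stage and image names are assumptions):

```groovy
pipeline {
    // Run only on agents carrying the 'docker' label,
    // i.e. agents that actually have Docker installed
    agent { label 'docker' }
    stages {
        stage('Build image') {
            steps {
                // The image name is an assumption; use your own repository
                sh 'docker build -t myorg/myapp:latest .'
            }
        }
    }
}
```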
I have Jenkins running as a Docker container. I installed the Jenkins Docker Build and Publish plugin and copied a Dockerfile into the Jenkins workspace, but whenever I run the build, it gives me:
Started by user Jenkins Admin
Building in workspace /var/lib/jenkins/jobs/workspace
[workspace] $ docker build -t index.docker.io/test/openshift:latest --pull=true /var/lib/jenkins/jobs/test/workspace
ERROR: Cannot run program "docker" (in directory "/var/lib/jenkins/jobs/workspace"): error=2, No such file or directory
java.io.IOException: Cannot run program "docker" (in directory "/var/lib/jenkins/jobs/workspace"): error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at hudson.Proc$LocalProc.<init>(Proc.java:244)
at hudson.Proc$LocalProc.<init>(Proc.java:216)
at hudson.Launcher$LocalLauncher.launch(Launcher.java:803)
at hudson.Launcher$ProcStarter.start(Launcher.java:381)
Build step 'Docker Build and Publish' marked build as failure
Finished: FAILURE
Could you please tell me why that is?
Inside a Docker container you have no access to the docker binary by default (hence the error message "No such file or directory").
If you want to use Docker within a Docker container, you need either DinD (Docker-in-Docker) or DooD (Docker-outside-of-Docker).
The first is a separate Docker installation within your Jenkins container; the second only mounts the host's Docker installation via volumes.
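For the DooD variant, a minimal sketch of starting the Jenkins container looks like this (the image, volume names, and paths are common defaults, but treat them as assumptions for your setup):

```shell
# DooD: mount the host's Docker socket (and optionally the CLI binary)
# into the Jenkins container, so 'docker' commands inside the container
# talk to the host's daemon instead of needing a nested one.
docker run -d --name jenkins \
  -p 8080:8080 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /usr/bin/docker:/usr/bin/docker \
  jenkins/jenkins:lts
```

The Jenkins user inside the container still needs permission on the mounted socket (see the docker group discussion earlier in this thread).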
Further reading about DinD in general and in regards to Jenkins:
https://jpetazzo.github.io/2015/09/03/do-not-use-docker-in-docker-for-ci/
https://github.com/killercentury/docker-jenkins-dind
https://github.com/tehranian/dind-jenkins-slave
Further reading about DooD in general and in regards to Jenkins:
http://container-solutions.com/running-docker-in-jenkins-in-docker/
https://hub.docker.com/r/axltxl/jenkins-dood/
Update
The information on using the Workflow plugin below is no longer correct.
I have since written a plugin called docker-swarm-slave that offers a build wrapper you can configure for a job, which automatically provisions a Docker container for the build if you use my jenkins-dood image or run directly on bare metal.
Documentation unfortunately is rather sparse, but maybe it is useful to somebody.
I have a similar use-case: I want to be able to automatically start a Docker-container with a specified image running a Jenkins Swarm client that will take over the build.
My jenkins-dood-image contains a script docker-slave which lets me automatically provision a Docker-Swarm-slave and execute what I need on it using the Workflow-plugin with a script like the following:
node('master') {
    stage 'Create docker-slave'
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'swarm-login', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
        sh 'docker-slave --job-name $JOB_NAME --build-number $BUILD_NUMBER -i pitkley/python-swarm:3.4 -u $USERNAME -p $PASSWORD -- -labels "${JOB_NAME}_${BUILD_NUMBER}"'
    }

    stage 'Execute on docker-slave'
    node("${env.JOB_NAME}_${env.BUILD_NUMBER}") {
        sh 'hostname'
    }

    stage 'Remove docker-slave'
    sh 'docker-slave --job-name $JOB_NAME --build-number $BUILD_NUMBER --rm'
}
(This assumes you need credentials to authenticate, which are saved under the credentials ID swarm-login.)