Declarative Jenkins pipeline

I am using a declarative Jenkins pipeline.
I am a newbie with Jenkins, so there are things I don't yet understand. I don't know how to handle this error, and can someone tell me what the best option is for doing these builds?
I have a bash script which builds, tags and pushes a Docker image to a repository.
This is part of my Jenkinsfile:
pipeline {
  agent {
    kubernetes {
      label 'bmf-worker'
      defaultContainer 'jnlp'
      yaml """
apiVersion: v1
kind: Pod
metadata:
  labels:
    component: ci
spec:
  # Use service account that can deploy to all namespaces
  serviceAccountName: service-reader
  containers:
  - name: docker
    image: docker
    command:
    - cat
    tty: true
  - name: kubectl
    image: gcr.io/cloud-builders/kubectl
    command:
    - cat
    tty: true
  - name: gcloud
    image: google/cloud-sdk
    command:
    - cat
    tty: true
"""
    }
  }
  stages {
    stage('Code Checkout and Setup') {
      steps {
        echo 'Code Checkout and Setup'
      }
    }
    stage('Build') {
      parallel {
        stage('Build') {
          steps {
            echo 'Start building Frontend and Backend Docker images'
          }
        }
        stage('Build BMF Frontend') {
          steps {
            container('gcloud') {
              echo 'Building Bmf Frontend Image'
              sh 'chmod +x build.sh'
              sh './build.sh --build_bmf_frontend'
            }
          }
        }
        stage('Tag BMF Frontend') {
          steps {
            container('gcloud') {
              echo 'Building Bmf Frontend Image'
              sh 'chmod +x build.sh'
              sh './build.sh --tag_frontend'
            }
          }
        }
        stage('Build BMF Backend') {
          steps {
            container('gcloud') {
              echo 'Building Bmf Backend Images'
              sh 'chmod +x build.sh'
              sh './build.sh --build_bmf_backend'
            }
          }
        }
        stage('Tag BMF Backend') {
          steps {
            container('gcloud') {
              echo 'Building Bmf Frontend Image'
              sh 'chmod +x build.sh'
              sh './build.sh --tag_frontend'
            }
          }
        }
      }
    }
How do I use the podTemplate to execute my steps? When I use the docker container for the 'Build BMF Backend' stage I get these errors:
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
/home/jenkins/workspace/BMF/bmf-web#tmp/durable-c146e810/script.sh: line 1: ./build.sh: not found
With the gcloud container defined in the podTemplate:
time="2019-03-12T13:40:56Z" level=error msg="failed to dial gRPC: cannot connect to the Docker daemon. Is 'docker daemon' running on this host?: dial unix /var/run/docker.sock: connect: no such file or directory"
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
And how do I tag the Docker images? I need docker for tagging and git because the tag is the short commit hash, but when I use the docker container there is no git.
My Jenkins master is on Google Cloud Kubernetes.
Can someone explain a better way to execute these jobs?

The first issue you are facing is related to Docker, not Jenkins.
Docker commands can only be run by root, or by users in the docker group.
If you want the Jenkins user to be able to execute Docker commands, you can run the following command as root to add the jenkins user to the docker group:
usermod -aG docker jenkins
This is documented in the Docker docs.
Be aware that giving a user access to Docker effectively grants them root access, so be cautious about which users you add to this group.
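On the Kubernetes pod in the question, the same idea boils down to giving the docker container in the podTemplate a Docker daemon to talk to at /var/run/docker.sock. A minimal sketch, assuming the GKE nodes run Docker and you are comfortable exposing the node's daemon to the build (the volume name docker-sock is made up here), is to extend the pod YAML with a hostPath mount:

spec:
  serviceAccountName: service-reader
  containers:
  - name: docker
    image: docker
    command:
    - cat
    tty: true
    volumeMounts:
    - name: docker-sock              # hypothetical volume name
      mountPath: /var/run/docker.sock
  volumes:
  - name: docker-sock
    hostPath:
      path: /var/run/docker.sock     # the node's Docker socket

The build/tag/push steps would then run inside container('docker') { sh './build.sh ...' } rather than the gcloud container, which does not ship a Docker daemon. A docker:dind sidecar container with DOCKER_HOST pointed at it is a common alternative with the same effect.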

Related

Docker: not found when running cmds in jenkinsfile

I am new to Docker and CI. I am trying to create a Jenkinsfile that builds and tests my application, then builds a Docker image with the Dockerfile I've composed, and then pushes it to AWS ECR. The step I am stuck on is building the image with Docker; I receive the error docker: not found. I downloaded the Docker plug-in and configured it in the Global Tool Configuration tab. Am I not adding it to tools correctly?
There was another post where you could use Kubernetes to do that, however Kubernetes no longer supports Docker.
The error:
/var/jenkins_home/workspace/client-pipeline_feature-jenkins#tmp/durable-41220eb0/script.sh: 1: /var/jenkins_home/workspace/client-pipeline_feature-jenkins#tmp/durable-41220eb0/script.sh: docker: not found
I also get a permission error on /var/run/docker.sock.
def gv
containerVersion = "1.0"
appName = "foodcore"
imageName = appName + ":" + version
pipeline {
  agent any
  environment {
    CI = 'true'
  }
  tools {
    nodejs "node"
    docker "docker"
  }
  stages {
    stage("init") {
      steps {
        script {
          gv = load "script.groovy"
          CODE_CHANGES = gv.getGitChanges()
        }
      }
    }
    stage("build frontend") {
      steps {
        dir("client") {
          sh 'npm install'
        }
      }
    }
    stage("build backend") {
      steps {
        dir("server") {
          sh 'npm install'
        }
      }
    }
    stage("test") {
      when {
        expression {
          script {
            CODE_CHANGES == false
          }
        }
      }
      steps {
        dir("client") {
          sh 'npm test'
        }
      }
    }
    stage("build docker image") {
      when {
        expression {
          script {
            env.BRANCH_NAME.toString().equals('Main') && CODE_CHANGES == false
          }
        }
      }
      steps {
        sh "docker build -t ${imageName} ."
      }
    }
    stage("push docker image") {
      when {
        expression {
          env.BRANCH_NAME.toString().equals('Main')
        }
      }
      steps {
        sh 'aws ecr get-login-password --region us-east-2 | docker login --username AWS --password-stdin repoURI'
        sh 'docker tag foodcore:latest ...repoURI'
        sh 'docker push repoURI'
      }
    }
  }
}
Docker should be installed on the server Jenkins is running on. The Docker plugin provided by Jenkins is just a tool to generate some snippets for the pipeline scripts; installing and configuring the tool doesn't install a Docker daemon. Please check whether Docker is installed on the OS or not.
As we can see in the thread, you then start getting permission denied on docker.sock.
The docker.sock permissions will be lost if you restart the system or the Docker service.
To make the fix persistent, set up a cron job to change the permissions after each reboot:
@reboot chmod 777 /var/run/docker.sock
And when you restart Docker, make sure to run the command below again:
chmod 777 /var/run/docker.sock
Or you can also put it in a cron job that executes every 5 minutes.
There are two methods to use Docker inside a Jenkins build:
Use the Jenkins Docker plugin as described in the solution above.
Or install Docker itself in the Jenkins container and mount the docker.sock file.
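A minimal sketch of the second method, assuming the official jenkins/jenkins:lts image and a named volume (both placeholders, not taken from this thread): run the Jenkins container with the host's Docker socket mounted, and make sure a Docker client binary exists inside the container so that sh 'docker ...' steps have something to execute.

# run Jenkins with the host's Docker socket mounted
docker run -d --name jenkins \
  -p 8080:8080 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts

# the image does not bundle a docker client; one way is a derived image, e.g.:
# FROM jenkins/jenkins:lts
# USER root
# RUN apt-get update && apt-get install -y docker.io
# USER jenkins

The jenkins user inside the container also needs permission on the mounted socket, which is where the chmod/group advice above comes in.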

How to pull & run docker image on remote server through jenkins pipeline

I have two AWS Ubuntu instances: 1st-server and 2nd-server.
Below is my Jenkins pipeline script, which creates a Docker image, runs the container on 1st-server and pushes the image to my Docker Hub repo. That's working fine.
I now want to pull the image and deploy it on 2nd-server.
When I ssh to the 2nd server through the pipeline script below, it actually logs in to 1st-server, even though the ssh credential ('my-ssh-key') belongs to 2nd-server. I'm confused how it is logging in to 1st-server; I checked with touch commands and the file is created on 1st-server.
pipeline {
  environment {
    registry = "docker-user/docker-repo"
    registryCredential = 'docker-cred'
    dockerImage = ''
  }
  agent any
  stages {
    stage('Cloning Git') {
      steps {
        git url: 'https://github.com/git-user/jenkins-flask-tutorial.git/'
      }
    }
    stage('Building image') {
      steps {
        script {
          sh "sudo docker build -t flask-app-one ."
          sh "sudo docker run -p 5000:5000 --name flask-app-one -d flask-app-one "
          sh "docker tag flask-app-one:latest docker-user/myrepo:flask-app-push-test"
        }
      }
    }
    stage('Push Image') {
      steps {
        script {
          docker.withRegistry( '', registryCredential ) {
            sh "docker push docker-user/docker-repo:flask-app-push-test"
            sshagent(['my-ssh-key']) {
              sh 'ssh -o StrictHostKeyChecking=no ubuntu@2ndserver && cd /home/ubuntu/ && sudo touch test-file && docker pull docker-user/docker-repo:flask-app-push-test'
            }
          }
        }
      }
    }
My question is: how do I log in to the 2nd server and pull the Docker image there through the Jenkins pipeline script? Help me out with where I'm going wrong.
This is more of an alternative than a solution. You can execute the remote commands as part of ssh; this will execute the command on the remote server and then disconnect.
ssh name@ip "ls -la /home/ubuntu/"
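Applied to the pipeline above, that means passing the remote commands as a single argument to ssh instead of chaining them with && (with the original command, everything after the first && runs locally on 1st-server once ssh returns, which is why the test file shows up there). A rough sketch, assuming the ubuntu user on 2nd-server can run Docker via sudo:

sshagent(['my-ssh-key']) {
  sh '''
    ssh -o StrictHostKeyChecking=no ubuntu@2ndserver "
      cd /home/ubuntu &&
      sudo docker pull docker-user/docker-repo:flask-app-push-test &&
      sudo docker run -d -p 5000:5000 docker-user/docker-repo:flask-app-push-test
    "
  '''
}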

Docker is listening to port specified in run command

I created a pipeline in Jenkins which takes an app from GitHub, builds the app, builds a Docker image, and then finally runs that image with the app.
The Dockerfile is:
FROM javastreets/mule:latest
COPY ./target/jenkins-demo-api-1.0.0-1.0.0-SNAPSHOT-mule-application.jar /opt/mule/apps/
CMD [ "/opt/mule/bin/mule"]
Here, jenkins-demo-api-1.0.0-1.0.0-SNAPSHOT-mule-application.jar is the app that is built in Jenkins from GitHub.
The pipeline script is:
pipeline {
  agent any
  tools {
    maven 'M3'
  }
  stages {
    stage('git pull') {
      steps {
        git branch: 'master', credentialsId: '025fbee3-18cc-4298-ac9b-adac*****', url: 'https://github.com/treadston-e/mule-jenkins.git'
      }
    }
    stage('Build') {
      steps {
        bat "mvn clean package"
      }
    }
    stage('build image') {
      steps {
        bat 'docker build -t docker-demo .'
      }
    }
    stage('run image') {
      steps {
        bat 'docker run -d -p 127.0.0.1:8081:8081 docker-demo'
      }
    }
  }
}
The pipeline executes successfully, but when I try to hit http://localhost:8081 the response I receive is "This page isn't working".
What should I do?
The localhost you are referring to is the localhost of the Docker container, which is not the same as your client's. Try specifying the network in your docker run command:
docker run -d --network host -p 8081:8081 docker-demo
If you would like to check which IP address the bridge network is running on, you can inspect it as follows:
docker network inspect bridge
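To see where the app is actually reachable, it can also help to inspect the running container's port mappings first (the container name below is whatever docker run assigned, check docker ps). Note that with --network host the container shares the host's network stack, so any -p publish flags are ignored:

# list running containers with their published ports
docker ps --format "table {{.Names}}\t{{.Ports}}"

# show the mapping for one container (name or id from docker ps)
docker port <container-name>

# check the app from the Docker host itself
curl -v http://127.0.0.1:8081/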

Jenkins Pipeline Docker Network not found

So I have this setup:
stage('Build') {
  steps {
    sh """ docker-compose -f docker-compose.yml up -d """
    sh """ docker-compose -f docker-compose.yml exec -T app buildApp """
  }
}
stage('Start UI server') {
  steps {
    script { env.NETWORK_ID = get network id with some script }
    sh """ docker-compose -f docker-compose.yml exec -d -T app startUiServer """
  }
}
stage('UI Smoke Testing') {
  agent {
    docker {
      alwaysPull true
      image 'some custom image'
      registryUrl 'some custom registry'
      registryCredentialsId 'some credentials'
      args "-u root --network ${env.NETWORK_ID}"
    }
  }
  steps { sh """ run the tests """ }
}
And for some reason the pipeline fails with this error, most of the time but not all the time:
java.io.IOException: Failed to run image 'my image'. Error: docker: Error response from daemon: network 3c5b5b45ca0e not found.
So the network ID is the right one; I've checked.
Any ideas why this is failing? I really appreciate any help.

Build docker image in Jenkins (in docker container) pipeline

I use Jenkins from a Docker container, and I want to build a Docker image in a Jenkins pipeline, but Docker does not exist in that container (the one where Jenkins runs).
The Jenkins container is deployed by Docker Compose; the yml file:
version: "3.3"
services:
jenkins:
image: jenkins:alpine
ports:
- 8085:8080
volumes:
- ./FOR_JENKINS:/var/jenkins_home
What can we do to build a Docker image in the Jenkins pipeline? Can we deploy some container with Docker in it and use it just for building images, or something else? How do you handle this?
Edit:
Thanks @VonC, I checked your information, but... "permission denied".
Docker Compose file:
version: "3.3"
services:
jenkins:
image: jenkins:alpine
ports:
- 8085:8080
volumes:
- ./FOR_JENKINS:/var/jenkins_home
# - /var/run/docker.sock:/var/run/docker.sock:rw
- /var/run:/var/run:rw
Jenkinsfile:
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        echo "Compiling..."
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt compile"
      }
    }
    /*stage('Unit Test') {
      steps {
        echo "Testing..."
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt coverage 'test-only * -- -F 4'"
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt coverageReport"
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt scalastyle || true"
      }
    }*/
    stage('DockerPublish') {
      steps {
        echo "Docker Stage ..."
        // Generate Jenkinsfile and prepare the artifact files.
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt docker:stage"
        echo "Docker Build-2 ..."
        // Run the Docker tool to build the image
        script {
          docker.withTool('docker') {
            echo "D1- ..."
            //withDockerServer([credentialsId: "AWS-Jenkins-Build-Slave", uri: "tcp://192.168.0.29:2376"]) {
            echo "D2- ..."
            sh "printenv"
            echo "D3- ..."
            //sh "docker images"
            echo "D4- ..."
            docker.build('my-app:latest', 'target/docker/stage').inside("--volume=/var/run/docker.sock:/var/run/docker.sock")
            echo "D5- ..."
            //base.push("tmp-fromjenkins")
            //}
          }
        }
      }
    }
  }
}
Result:
[job1] Running shell script
+ docker build -t my-app:latest target/docker/stage
Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.29/build?buildargs=%7B%7D&cachefrom=%5B%5D&cgroupparent=&cpuperiod=0&cpuquota=0&cpusetcpus=&cpusetmems=&cpushares=0&dockerfile=Dockerfile&labels=%7B%7D&memory=0&memswap=0&networkmode=default&rm=1&shmsize=0&t=my-app%3Alatest&target=&ulimits=null: dial unix /var/run/docker.sock: connect: permission denied
script returned exit code 1
Edit:
The last problem with "permission denied" was fixed with:
sudo chmod 0777 /var/run/docker.sock
Working state:
Run on the host:
sudo chmod 0777 /var/run/docker.sock
Docker Compose file:
version: "3.3"
services:
jenkins:
image: jenkins:alpine
ports:
- 8085:8080
volumes:
- ./FOR_JENKINS:/var/jenkins_home
# - /var/run/docker.sock:/var/run/docker.sock:rw
- /var/run:/var/run:rw
Jenkinsfile:
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        echo "Compiling..."
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt compile"
      }
    }
    /*stage('Unit Test') {
      steps {
        echo "Testing..."
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt coverage 'test-only * -- -F 4'"
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt coverageReport"
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt scalastyle || true"
      }
    }*/
    stage('DockerPublish') {
      steps {
        echo "Docker Stage ..."
        // Generate Jenkinsfile and prepare the artifact files.
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt docker:stage"
        echo "Docker Build-2 ..."
        // Run the Docker tool to build the image
        script {
          docker.withTool('docker') {
            echo "D1- ..."
            //withDockerServer([credentialsId: "AWS-Jenkins-Build-Slave", uri: "tcp://192.168.0.29:2376"]) {
            echo "D2- ..."
            sh "printenv"
            echo "D3- ..."
            //sh "docker images"
            echo "D4- ..."
            docker.build('my-app:latest', 'target/docker/stage')
            echo "D5- ..."
            //base.push("tmp-fromjenkins")
            //}
          }
        }
      }
    }
  }
}
My resolution:
I added some steps to the Jenkinsfile and ended up with:
pipeline {
  agent any
  //def app
  stages {
    stage('Build') {
      steps {
        echo "Compiling..."
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt compile"
      }
    }
    stage('DockerPublish') {
      steps {
        echo "Docker Stage ..."
        // Generate Jenkinsfile and prepare the artifact files.
        sh "${tool name: 'sbt', type: 'org.jvnet.hudson.plugins.SbtPluginBuilder$SbtInstallation'}/bin/sbt docker:stage"
        echo "Docker Build ..."
        // Run the Docker tool to build the image
        script {
          docker.withTool('docker') {
            echo "Environment:"
            sh "printenv"
            app = docker.build('ivanbuh/myservice:latest', 'target/docker/stage')
            echo "Push to Docker repository ..."
            docker.withRegistry('https://registry.hub.docker.com', 'docker-hub-credentials') {
              app.push("${env.BUILD_NUMBER}")
              app.push("latest")
            }
            echo "Complated ..."
          }
        }
      }
    }
    //https://boxboat.com/2017/05/30/jenkins-blue-ocean-pipeline/
    //https://gist.github.com/bvis/68f3ab6946134f7379c80f1a9132057a
    stage ('Deploy') {
      steps {
        sh "docker stack deploy myservice --compose-file docker-compose.yml"
      }
    }
  }
}
You can look at "Docker in Docker in Jenkins pipeline". It includes the step:
inside the Jenkinsfile, I need to connect my build container to the outer Docker instance. This is done by mounting the Docker socket itself:
docker.build('my-build-image').inside("--volume=/var/run/docker.sock:/var/run/docker.sock") {
// The build here
}
You can see a similar approach in "Building containers with Docker in Docker and Jenkins":
In order to make the Docker from the host system available, I need to make the API available to the Jenkins Docker container. You can do this by mapping the Docker socket that is available on the parent system.
I have created a small docker-compose file where I map both my volumes and the Docker socket as follows:
jenkins:
  container_name: jenkins
  image: myjenkins:latest
  ports:
    - "8080:8080"
  volumes:
    - /Users/devuser/dev/docker/volumes/jenkins:/var/jenkins_home
    - /var/run:/var/run:rw
Please note the special mapping of '/var/run' with rw privileges; this is needed to make sure the Jenkins container has access to the host system's docker.sock.
And, as I mentioned before, you might need to run Docker in privileged mode.
Or, as the OP reported:
sudo chmod 0777 /var/run/docker.sock
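A slightly less permissive alternative to chmod 0777 is to give the container's jenkins user group access to the socket. A sketch for the jenkins:alpine image used above; the GID 999 is only an example, use whatever group owns /var/run/docker.sock on your host:

# inside the Jenkins container, as root (or baked into a derived image)
addgroup -g 999 docker        # 999 = GID owning /var/run/docker.sock on the host (example)
addgroup jenkins docker
# restart the Jenkins container so the new group membership takes effect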
