Building Docker images inside a Jenkins container

I am using a Jenkins container to execute a pipeline based on this Jenkinsfile:
pipeline {
    agent any
    tools {
        maven 'Maven 3.6.0'
        jdk 'jdk8'
    }
    stages {
        stage('Pull from git') {
            steps {
                checkout scm
            }
        }
        stage('Compile App') {
            steps {
                sh "mvn clean install"
            }
        }
        stage('Build Image') {
            steps {
                script {
                    docker.withTool("docker") {
                        def readyImage = docker.build("dummy-project/dummy-project-image", "./docker")
                    }
                }
            }
        }
    }
}
At the last stage I'm getting this error when it tries to build the Docker image.
Is it possible to build a Docker image inside a Jenkins container?

Your pipeline's executing agent can't communicate with the Docker daemon, so you need to configure that. There are three ways I know of:
1) Provide your agent with a Docker installation, for example by giving it access to the host's Docker daemon (a sketch of this follows the podTemplate example below)
2) Add a Docker installation from $JENKINS_URL/configureTools/
3) If you use Kubernetes as your orchestrator, you can add a podTemplate definition at the beginning of your pipeline and then use it. Here is an example:
// Name of the application (do not use spaces)
def appName = "my-app"

// Start of podTemplate
def label = "mypod-${UUID.randomUUID().toString()}"
podTemplate(
    label: label,
    containers: [
        containerTemplate(
            name: 'docker',
            image: 'docker',
            command: 'cat',
            ttyEnabled: true)],
    volumes: [
        hostPathVolume(hostPath: '/var/run/docker.sock', mountPath: '/var/run/docker.sock'),
        hostPathVolume(hostPath: '/usr/bin/kubectl', mountPath: '/usr/bin/kubectl'),
        secretVolume(mountPath: '/etc/kubernetes', secretName: 'cluster-admin')],
    annotations: [
        podAnnotation(key: "development", value: appName)]
)
// End of podTemplate
[...inside your pipeline]
container('docker') {
    stage('Docker Image and Push') {
        docker.withRegistry('https://registry.domain.it', 'nexus') {
            def img = docker.build(appName, '.')
            img.push('latest')
        }
    }
}
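For option 1, a minimal sketch of running the Jenkins container with access to the host's Docker daemon (the image name and ports are assumptions; the container also needs a docker CLI installed, for example via a derived image):
# Sketch: mount the host's Docker socket into the Jenkins container
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts
With the socket mounted, docker.build(...) in your pipeline talks to the host's daemon.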
I hope this helps you.

Related

How to run dynamic stages in parallel on Jenkins with a separate Kubernetes agent for each stage

I tried combining things I have found on the syntax, but this is as close as I can get. It creates multiple stages, but it says they have no steps.
I can get it to run a bunch of parallel steps on the same agent if I move the agent syntax down to where the "test" stage is defined, but I want to spin up separate pods for each one so I can actually use the Kubernetes cluster effectively and do my work in parallel.
Attached is an example Jenkinsfile for reference:
def parallelStagesMap

def generateStage(job) {
    return {
        stage("$job.key") {
            agent {
                kubernetes {
                    cloud 'kubernetes'
                    yaml """
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: name
    image: image
    command:
    - sleep
    args:
    - infinity
"""
                }
            }
            steps {
                sh """
                do some important stuff
                """
            }
        }
    }
}

pipeline {
    agent none
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def map = [
                        "name"  : "aparam",
                        "name2" : "aparam2"
                    ]
                    parallelStagesMap = map.collectEntries {
                        ["${it.key}" : generateStage(it)]
                    }
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
        stage('Release') {
            agent etc
            steps {
                etc
            }
        }
    }
}
To run your dynamically created jobs in parallel you will have to use scripted pipeline syntax.
The equivalent of the declarative Kubernetes agent in a scripted pipeline is podTemplate and node (see the full documentation):
podTemplate(yaml: '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.8.1-jdk-8
    command:
    - sleep
    args:
    - 99d
''') {
    node(POD_LABEL) {
        ...
    }
}
Notice that podTemplate can receive a cloud parameter in addition to the yaml, but it defaults to kubernetes, so there is no need to pass it.
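For completeness, passing the cloud explicitly would look roughly like this (a sketch, assuming a cloud named 'kubernetes' is configured in Jenkins):
podTemplate(cloud: 'kubernetes', yaml: '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.8.1-jdk-8
    command: ["sleep", "99d"]
''') {
    node(POD_LABEL) {
        container('maven') {
            // Runs inside the maven container of the pod scheduled on the named cloud
            sh 'mvn --version'
        }
    }
}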
So in your case you can use this syntax to run the jobs in parallel on different agents:
// Assuming yaml is same for all nodes - if not it can be passed as parameter
podYaml = """
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: name
    image: image
    command:
    - sleep
    args:
    - infinity
"""

pipeline {
    agent none
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def map = ["name"  : "aparam",
                               "name2" : "aparam2"]
                    parallel map.collectEntries {
                        ["${it.key}" : generateStage(it)]
                    }
                }
            }
        }
    }
}

def generateStage(job) {
    return {
        stage(job.key) {
            podTemplate(yaml: podYaml) {
                node(POD_LABEL) {
                    // Each execution runs on its own node (pod)
                    sh "do some important stuff with ${job.value}"
                }
            }
        }
    }
}
As explained in this answer:
Dynamic parallel stages can only be created using scripted pipelines. The API built into declarative pipelines (like agent) is not available there.
So you can't run dynamic declarative stages in parallel on different agents.
To achieve what you want, one solution is to trigger another pipeline that runs on a new Kubernetes pod and wait for its completion before the next steps.
Here are the Jenkinsfiles for reference:
Main job Jenkinsfile:
def parallelJobsMap

def triggerJob(item) {
    return {
        build job: 'myChildJob', parameters: [string(name: 'MY_PARAM', value: "${item.value}")], wait: true
    }
}

pipeline {
    agent none
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def map = [
                        "name"  : "aparam",
                        "name2" : "aparam2"
                    ]
                    parallelJobsMap = map.collectEntries {
                        ["${it.key}" : triggerJob(it)]
                    }
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    parallel parallelJobsMap
                }
            }
        }
        stage('Release') {
            agent any
            steps {
                echo "Release stuff"
            }
        }
    }
}
Child job Jenkinsfile:
pipeline {
    agent none
    parameters {
        string(
            name: 'MY_PARAM',
            description: 'My beautiful parameter',
            defaultValue: 'A default value',
            trim: true
        )
    }
    stages {
        stage("Job") {
            agent {
                kubernetes {
                    cloud 'kubernetes'
                    yaml """
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: name
    image: image
    command:
    - sleep
    args:
    - infinity
"""
                }
            }
            steps {
                echo "Do some important stuff with the parameter " + params.MY_PARAM
            }
        }
    }
}

Enabling Jenkins to build on a separate Docker in Docker container

I have a Docker-in-Docker setup that builds Docker images, and it is not on the same node as the Jenkins node. When I try to build using the Jenkins node I receive:
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
To work around this, I can build a Docker image using the following in my Jenkinsfile:
stage('Docker Build') {
    agent any
    steps {
        script {
            withDockerServer([uri: "tcp://10.44.10.8:2375"]) {
                withDockerRegistry([credentialsId: 'docker', url: "https://index.docker.io/v1/"]) {
                    def image = docker.build("ron/reactive")
                    image.push()
                }
            }
        }
    }
}
This works as expected: I can use the above pipeline config to build a Docker container.
I'm now attempting to use the Docker server running at tcp://10.44.10.8:2375 to package a Java Maven project in a new container running on Docker. I've defined the pipeline build as:
pipeline {
    agent any
    stages {
        stage('Maven package') {
            agent {
                docker {
                    image 'maven:3-alpine'
                    args '-v /root/.m2:/root/.m2'
                }
            }
            stages {
                stage('Build') {
                    steps {
                        sh 'mvn -B -DskipTests clean package'
                    }
                }
            }
        }
    }
}
And receive this message from Jenkins with no further output:
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Maven package)
[Pipeline] node
Still waiting to schedule task
‘Jenkins’ doesn’t have label ‘dockerserverlabel’
I've configured the Docker label in Jenkins (screenshot omitted) so that it matches the 'Docker Build' settings from the Jenkinsfile above.
But it seems I've not included some other configuration in Jenkins and/or the Jenkinsfile to allow the image to be built on tcp://10.44.10.8:2375?
I'm working through https://www.jenkins.io/doc/tutorials/build-a-java-app-with-maven/ which describes a pipeline for building a maven project on Docker:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
But it does not describe how to configure the build on a separate Docker host.
Can this Jenkins config:
stage('Docker Build') {
    agent any
    steps {
        script {
            withDockerServer([uri: "tcp://10.44.10.8:2375"]) {
                withDockerRegistry([credentialsId: 'docker', url: "https://index.docker.io/v1/"]) {
                    def image = docker.build("ron/reactive")
                    image.push()
                }
            }
        }
    }
}
be used with
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
?
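One way the two snippets might be combined, staying with scripted steps rather than the declarative docker agent, is to start the Maven container on the remote daemon via withDockerServer and docker.image(...).inside (a sketch only, reusing the uri and image from the question; note that inside expects the workspace to be reachable from the Docker host, so this may require the agent and daemon to share storage):
stage('Maven package') {
    agent any
    steps {
        script {
            withDockerServer([uri: "tcp://10.44.10.8:2375"]) {
                // Run the Maven build inside a container started on the remote Docker host
                docker.image('maven:3-alpine').inside('-v /root/.m2:/root/.m2') {
                    sh 'mvn -B -DskipTests clean package'
                }
            }
        }
    }
}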

Jenkins Declarative: Kubernetes Plugin with multiple agents

I am trying to set up a Jenkins declarative pipeline that uses two different agents during its execution. The agents are dynamically spawned by the Kubernetes plugin. For the sake of argument and simplicity, let's assume I want to do this:
On Agent 1 (Cloud name: "ubuntu"):
Run apt-get and some installs
Run a shell script
Additional steps
On Agent 2 (Cloud name: "fedora"):
Run dnf and some installs
Run a shell script
Additional steps
The problem I have is that if I use a global agent declaration:
pipeline {
    agent {
        kubernetes {
            cloud 'ubuntu'
            label "ubuntu-agent"
            containerTemplate {
                name 'support'
                image 'blenderfox/support'
                ttyEnabled true
                command 'cat'
            }
        }
    }
    ...
}
Then that is used across all the stages if I don't declare an agent on each of the stages.
If I use agent none:
pipeline {
    agent none
    ...
}
Then I have to declare an agent spec for each stage, for example:
stage ("apt update") {
agent {
kubernetes {
cloud 'ubuntu'
label "ubuntu-agent"
containerTemplate {
name 'support'
image 'blenderfox/support'
ttyEnabled true
command 'cat'
}
}
}
steps {
sh """
apt update
"""
}
}
While this works in the sense that I can declare per stage which agent I want, the problem with this method is that it spins up a new agent for each stage, meaning state isn't carried between, for example, these two stages:
stage ("apt-update") {
agent {
....
}
steps {
sh """
apt update
"""
}
}
stage ("apt-install") {
agent {
....
}
steps {
sh """
apt install -y ....
"""
}
}
Can I reuse the same agent across stages? For example, something like this:
stage ("provision agent") {
agent {
...
label "ubuntu-agent"
...
}
steps {
sh """
echo "Provisioning agent"
"""
}
}
stage ("apt-update") {
agent {
label "ubuntu-agent" //reuse agent from previous stage
}
steps {
sh """
apt update
"""
}
}
stage ("apt-install") {
agent {
label "ubuntu-agent" //reuse agent from previous stage
}
steps {
sh """
apt install -y ....
"""
}
}
I found a solution. It's very hacky, but it works:
pipeline {
    agent none
    stages {
        stage("Provision dev agent") {
            agent {
                kubernetes {
                    cloud 'dev-cloud'
                    label "dev-agent-${env.BUILD_NUMBER}"
                    slaveConnectTimeout 300
                    idleMinutes 5
                    yamlFile "jenkins-dev-agent.yaml"
                }
            }
            steps {
                sh """
                ## Do any agent init steps here
                """
            }
        }
        stage("Do something on dev agent") {
            agent {
                kubernetes {
                    label "dev-agent-${env.BUILD_NUMBER}"
                }
            }
            steps {
                sh """
                ## Do something here
                """
            }
        }
        stage("Provision production agent") {
            agent {
                kubernetes {
                    cloud 'prod-cloud'
                    label "prod-agent-${env.BUILD_NUMBER}"
                    slaveConnectTimeout 300
                    idleMinutes 5
                    yamlFile "jenkins-prod-agent.yaml"
                }
            }
            steps {
                sh """
                ## Do any agent init steps here
                """
            }
        }
        stage("Do something on prod agent") {
            agent {
                kubernetes {
                    label "prod-agent-${env.BUILD_NUMBER}"
                }
            }
            steps {
                sh """
                ## Do something here
                """
            }
        }
    }
}
The agent yamls vary, but you can do something like this:
spec:
  containers:
  - name: docker
    image: docker:18.06.1
    command: ["tail", "-f", "/dev/null"]
    imagePullPolicy: Always
    volumeMounts:
    - name: docker
      mountPath: /var/run/docker.sock
  volumes:
  - hostPath:
      path: "/var/run/docker.sock"
    name: "docker"
And then use the agent like so:
stage ("docker build") {
agent {
kubernetes {
label "dev-agent-${env.BUILD_NUMBER}"
}
}
steps {
container('docker') {
sh """
## docker build....
"""
}
}
}
There's a solution for this using sequential stages: you define a stage with your agent, and then you can nest other stages inside it:
pipeline {
    agent none
    stages {
        stage("Provision dev agent") {
            agent {
                kubernetes {
                    cloud 'dev-cloud'
                    slaveConnectTimeout 300
                    yamlFile "jenkins-dev-agent.yaml"
                }
            }
            stages {
                stage("Do something on dev agent") {
                    steps {
                        sh """
                        ## Do something here
                        """
                    }
                }
                stage("Do something else on dev agent") {
                    steps {
                        sh """
                        ## Do something here
                        """
                    }
                }
            }
        }
        stage("Provision prod agent") {
            agent {
                kubernetes {
                    cloud 'prod-cloud'
                    slaveConnectTimeout 300
                    yamlFile "jenkins-prod-agent.yaml"
                }
            }
            stages {
                stage("Do something on prod agent") {
                    steps {
                        sh """
                        ## Do something here
                        """
                    }
                }
                stage("Do something else on prod agent") {
                    steps {
                        sh """
                        ## Do something here
                        """
                    }
                }
            }
        }
    }
}

Jenkins Pipeline docker.withRegistry() push leads to endless loop

I managed to set up Jenkins on Kubernetes and GitBucket on Kubernetes. Now I am trying to create my first Dockerfile and upload the image to Docker Hub. Unfortunately it fails while uploading. The build is successful, but I can't figure out how to upload the image to my (private) Docker Hub repository.
Jenkinsfile
def label = "${BUILD_TAG}"
podTemplate(label: label,
    containers: [
        containerTemplate(name: 'docker', image: 'docker:latest', command: 'cat', ttyEnabled: true)
    ],
    volumes: [
        hostPathVolume(mountPath: '/var/run/docker.sock', hostPath: '/var/run/docker.sock')
    ]) {
    node(label) {
        def app
        def myRepo = checkout scm
        def gitCommit = myRepo.GIT_COMMIT
        def gitBranch = myRepo.GIT_BRANCH
        def shortGitCommit = "${gitCommit[0..10]}"
        def previousGitCommit = sh(script: "git rev-parse ${gitCommit}~", returnStdout: true)

        stage('Decommission Infrastructure') {
            container('kubectl') {
                echo "Decommission..."
            }
        }
        stage('Build application') {
            container('docker') {
                app = docker.build("fasautomation/recon", ".")
            }
        }
        stage('Run unit tests') {
            container('docker') {
                app.inside {
                    sh 'echo "Test passed"'
                }
            }
        }
        stage('Docker publish') {
            container('docker') {
                docker.withRegistry('https://registry.hub.docker.com', '<<jenkins store-credentials>>') {
                    echo "Pushing 1..."
                    // Push tagged version
                    app.push("${env.BUILD_NUMBER}")
                    echo "Pushing 2..."
                    // Push latest-tagged version
                    app.push("latest")
                    echo "Pushed!"
                }
            }
        }
        stage('Deployment') {
            container('docker') {
                // Deploy to Kubernetes
                echo 'Deploying'
            }
        }
        stage('Provision Infrastructure') {
            container('kubectl') {
                echo 'Provision...'
            }
        }
    }
}
Jenkins Logs
[...]
[Pipeline] stage
[Pipeline] { (Docker publish)
[Pipeline] container
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withDockerRegistry
Executing sh script inside container docker of pod jenkins-recon-master-116-0ksw8-f7779
Executing command: "docker" "login" "-u" "*****" "-p" ******** "https://index.docker.io/v1/"
exit
<<endless loading symbol>>
Does anyone have a clue how to debug this? The credentials work. I'm not sure why there is an exit in the log without the push logs afterwards... :-(
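One way to narrow this down (a sketch, not a confirmed fix; the credentials id is the placeholder from above) is to bypass docker.withRegistry and run the login and push directly with sh, so the full client output shows up in the build log:
stage('Docker publish (debug)') {
    container('docker') {
        withCredentials([usernamePassword(credentialsId: '<<jenkins store-credentials>>',
                                          usernameVariable: 'DOCKER_USER',
                                          passwordVariable: 'DOCKER_PASS')]) {
            // Shell variables (not Groovy interpolation) keep the secret out of the build log
            sh '''
                echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin
                docker push fasautomation/recon:latest
            '''
        }
    }
}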

node is not a kubernetes node

I am trying to run a simple Jenkins pipeline for a Maven project. When I run it on Jenkins, I get the error below:
ERROR: Node is not a Kubernetes node:
I have searched everything related to this error but could not find anything.
Can someone tell me where I am making a mistake?
Jenkinsfile:
pipeline {
    agent {
        kubernetes {
            cloud 'openshift'
            label 'test'
            yamlFile 'jenkins/BuildPod.yaml'
        }
    }
    stages {
        stage('Build stage') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
        stage('Test stage') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Package stage') {
            steps {
                sh 'mvn package'
            }
        }
    }
}
BuildPod.yaml:
kind: Pod
apiVersion: v1
metadata:
  name: test
  labels:
    app: test
spec:
  containers:
  - name: jnlp
    image: openshift/jenkins-slave-base-centos7:latest
    envFrom:
    - configMapRef:
        name: jenkins-config
  - name: oc-dev
    image: reliefmelone/ocalpine-os:latest
    tty: true
    command:
    - cat
  - name: maven
    image: maven:3.6.1-jdk-13
    tty: true
    command:
    - cat
  - name: jdk
    image: 13-jdk-alpine
    tty: true
    command:
    - cat
I just want to build my project now, but it is not working.
You're missing the container step inside your stages.
Example:
stage('Build stage') {
    steps {
        container('maven') {
            sh 'mvn -B clean verify'
        }
    }
}
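The same wrapper applies to the other stages. Applied to the pipeline from the question, the stages block would look roughly like this (a sketch, assuming the maven container defined in BuildPod.yaml is the right one for all three stages):
stages {
    stage('Build stage') {
        steps {
            container('maven') {
                sh 'mvn -B clean verify'
            }
        }
    }
    stage('Test stage') {
        steps {
            container('maven') {
                sh 'mvn test'
            }
        }
    }
    stage('Package stage') {
        steps {
            container('maven') {
                sh 'mvn package'
            }
        }
    }
}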
