Is it possible to specify a Dockerfile as basis for a container job, instead of a reference to an already built container?
This is possible with Jenkins, and a much appreciated feature:
pipeline {
    stages {
        stage("My Stage") {
            agent {
                dockerfile {
                    filename 'Dockerfile'
                }
            }
            steps {
                //
            }
        }
    }
}
I would suggest something like:
container:
  dockerfile: Dockerfile
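Until something like that exists, a common workaround is to build the Dockerfile in an ordinary step and run the job's commands inside the resulting image manually. A minimal sketch (the image tag, job name, and `make test` command are placeholders, not part of any real API):

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # build the image from the repository's Dockerfile
      - run: docker build -t local/ci-image:latest .
      # run the actual job commands inside that image,
      # mounting the checked-out workspace
      - run: docker run --rm -v "$PWD:/work" -w /work local/ci-image:latest make test
```

The obvious downside compared to native support is that each step runs a fresh `docker run`, so state does not carry between steps the way it does in a real container job.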
How can I use an image built by kaniko in a Jenkins pipeline? I want to build the image, use that image to run tests for my app, and then push the image. With docker that would look something like this:
steps {
    container('docker') {
        script {
            myimage = docker.build('image:tag')
        }
    }
    container('docker') {
        script {
            myimage.inside {
                sh 'pipenv run test'
            }
        }
    }
    // and somewhere below I would use `docker.withRegistry('some registry') { myimage.push() }`
}
I am not sure how to translate the `myimage.inside` part from docker. With kaniko I have this:
steps {
    container('kaniko') {
        script {
            sh '/kaniko/executor --tar-path=myimage.tar --context . --no-push --destination myregistry:tag'
        }
    }
    container(???) {
        // how can I use that image from above to run my tests??
    }
    // and somewhere below I use `crane` to push the image.
}
Not sure if this is relevant, but the whole pipeline runs in a Kubernetes environment, so I want to avoid Docker-in-Docker (DinD).
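For the push part at least, `crane` can push a kaniko-built tarball directly to a registry without a Docker daemon. A hedged sketch of those two stages (the registry path, container names, and stage layout are assumptions; running the tests still requires a later pod to pull the pushed image, since without a daemon the tarball cannot simply be "entered"):

```groovy
steps {
    container('kaniko') {
        // build to a tarball without pushing
        sh '/kaniko/executor --context . --no-push --tar-path=myimage.tar --destination myregistry/myimage:tag'
    }
    container('crane') {
        // crane push accepts a kaniko/docker-save tarball as its source
        sh 'crane push myimage.tar myregistry/myimage:tag'
    }
    // a subsequent stage can then run its pod with image 'myregistry/myimage:tag'
    // and execute `pipenv run test` inside it
}
```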
I have multiple stages defined inside my Jenkinsfile (pipeline mode).
I would like to run one stage (the local build) inside one Docker container,
and some other stages (the cross-build) inside another Docker container.
However, I would also like the Docker containers to be specified by corresponding Dockerfiles instead of just by an image name, since I need some customisation in the respective containers.
The Jenkins documentation states that both these things are possible.
However, it also states that Dockerfile will always be used as the filename for the docker container specification.
How can I create two separate Dockerfiles and tell Jenkins to use one of them for one stage, and the other one for some other stage(s)?
From my understanding, the Jenkinsfile should look something like this:
pipeline {
    agent none
    stages {
        stage('Local Build') {
            agent { dockerfile true }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Cross-build') {
            agent { dockerfile true } // <--- How do I specify *which*
                                      //      Dockerfile to use here?
            steps {
                sh 'node --version'
            }
        }
    }
}
You can add the following details to the dockerfile agent; the filename and dir options let each stage point at its own Dockerfile.
agent {
    // Equivalent to "docker build -f Dockerfile.build --build-arg version=1.0.2 ./build/"
    dockerfile {
        filename 'Dockerfile.build'
        dir 'build'
        label 'my-defined-label'
        additionalBuildArgs '--build-arg version=1.0.2'
        args '-v /tmp:/tmp'
    }
}
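Applied to the original question, that gives each stage its own agent block. A sketch assuming the two files are named Dockerfile.local and Dockerfile.cross (the names are placeholders):

```groovy
pipeline {
    agent none
    stages {
        stage('Local Build') {
            // built from Dockerfile.local in the repo root
            agent { dockerfile { filename 'Dockerfile.local' } }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Cross-build') {
            // built from Dockerfile.cross in the repo root
            agent { dockerfile { filename 'Dockerfile.cross' } }
            steps {
                sh 'node --version'
            }
        }
    }
}
```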
Suppose I have a dockerized pipeline with multiple steps. The docker container is defined at the beginning of the Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'gradle:latest'
        }
    }
    stages {
        // multiple steps, all executed in 'gradle' container
    }
    post {
        always {
            sh 'git whatever-command' // will not work in 'gradle' container
        }
    }
}
I would like to execute some git commands in a post-build action. The problem is that the gradle image does not include a git executable:
script.sh: line 1: git: command not found
How can I execute it on the Docker host while still using the gradle container for all other build steps? Of course I do not want to explicitly specify a container for each step, only for that specific post-build action.
OK, below is my working solution, which groups multiple stages (Build and Test) into a single dockerized parent stage (Dockerized gradle), with a single workspace reused between the docker host and the docker container (see the reuseNode docs):
pipeline {
    agent {
        // the code will be checked out on one of the available docker hosts
        label 'docker'
    }
    stages {
        stage('Dockerized gradle') {
            agent {
                docker {
                    reuseNode true // <-- the most important part
                    image 'gradle:6.5.1-jdk11'
                }
            }
            stages {
                // Stages in this block will be executed inside the gradle container
                stage('Build') {
                    steps {
                        script {
                            sh "gradle build -x test"
                        }
                    }
                }
                stage('Test') {
                    steps {
                        script {
                            sh "gradle test"
                        }
                    }
                }
            }
        }
        stage('Cucumber Report') {
            // this stage will be executed on the docker host labeled 'docker'
            steps {
                cucumber 'build/cucumber.json'
            }
        }
    }
    post {
        always {
            sh 'git whatever-command' // this also works outside the 'gradle' container and reuses the original workspace
        }
    }
}
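An alternative, if restructuring the stages is not desirable, is to run just that one post step inside a throwaway container that ships git. A sketch using the scripted docker.image(...).inside step (the alpine/git image is one common choice, not the only option; this requires the Docker Pipeline plugin and a docker-capable node):

```groovy
post {
    always {
        script {
            // run only this command in an image that has git installed;
            // inside() mounts the current workspace into the container
            docker.image('alpine/git:latest').inside {
                sh 'git whatever-command'
            }
        }
    }
}
```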
I am using the declarative format for pipeline files and running inside of a docker container that is defined using a Dockerfile in my project's root directory.
My Jenkinsfile looks like this:
pipeline {
    agent {
        dockerfile {
            additionalBuildArgs '--network host'
        }
    }
    stages {
        stage('Test') {
            steps {
                sh 'pytest --version'
            }
        }
    }
}
I would like to pass additional arguments to the docker run command, similar to this question: How to pass docker container arguments when running the image in a Jenkinsfile.
Is it possible to do that in the declarative pipeline format, or should I switch?
Edit:
This is essentially the equivalent of what I am trying to do in non-declarative:
node {
    def pytestImage = docker.build('pytest-image:latest', '--network host .')
    pytestImage.inside('--network=host') {
        sh 'pytest --version'
        // other commands ...
    }
}
You can add the args option to your dockerfile agent. It passes arguments directly to the docker run invocation:
pipeline {
    agent {
        dockerfile {
            additionalBuildArgs '--network host'
            args '--network=host'
        }
    }
    stages {
        stage('Test') {
            steps {
                sh 'pytest --version'
            }
        }
    }
}
More information here
We have a Jenkins master and some slaves, each on its own server.
We have now installed Docker on the slaves and are trying to create a very basic pipeline that performs one step in a container.
I saw configurations where the agent is a docker container.
But we want something like this:
pipeline {
    agent any
    triggers {
        pollSCM pipelineParams.polling
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3'))
    }
    stages {
        stage('Clone') {
            steps {
                // clone repo scm..
            }
        }
        stage('npm') {
            steps {
                script {
                    sh 'npm ...'
                }
            }
        }
        stage('docker') {
            steps {
                // start docker container and mount project in it
            }
        }
        ...
How do we have to configure the docker step? Do we have to define a new agent inside the stage even though we already have agent any above?
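Yes, a stage-level agent overrides the top-level one for just that stage. A sketch of the docker stage (the node:18 image and the npm command are assumptions for illustration; reuseNode keeps the workspace checked out by the top-level agent, which Jenkins mounts into the container automatically):

```groovy
stage('docker') {
    agent {
        docker {
            image 'node:18'
            reuseNode true // reuse the workspace from the 'agent any' node
        }
    }
    steps {
        // runs inside the node:18 container with the project mounted
        sh 'npm test'
    }
}
```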