Using container and Docker image for Windows and Linux tasks - docker

I have a requirement to run specific stages on Linux and Windows agents. I can achieve this by declaring an agent with a label in every single stage, but then the pipeline requests a new executor for every stage, which consumes a lot of time.
So I am looking for a way to shorten the pipeline runtime by using the same Linux label for the Linux tasks and a Windows Docker image for the Windows tasks.
With only Linux tasks the pipeline works fine and no new executor is assigned, which indeed saves runtime. However, when there is a parameter to build the project, I am unable to invoke the Docker agent. I am appending a sample Jenkins pipeline below for quick reference.
Error: docker: not found.
I think this is because Docker is not available on the 'Linux' label. If I put an agent in every single stage it works fine, but starting an agent every time consumes a lot of time. So I want to achieve this with the same resources.
pipeline {
    agent {
        label 'Linux'
    }
    stages {
        stage('Linux') {
            steps {
                container('linux') {
                    script {
                        // some task
                    }
                }
            }
        }
        stage('env setup') {
            steps {
                container('linux') {
                    script {
                        // some task
                    }
                }
            }
        }
        stage('windows') {
            agent {
                docker {
                    label 'docker label'
                    image "docker image"
                    reuseNode true
                }
            }
            stages {
                stage('build') {
                    steps {
                        script {
                            // build with windows
                        }
                    }
                }
                stage('Push') {
                    steps {
                        script {
                            // push with windows
                        }
                    }
                }
            }
        }
        stage('Deploy') {
            steps {
                container('linux') {
                    script {
                        // deploy using linux
                    }
                }
            }
        }
    }
}
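For what it's worth, as far as I understand, reuseNode true makes the docker {} agent start its container on the node that is already running the pipeline (the 'Linux' agent here), so that node needs the Docker client on its PATH; the separate 'docker label' is then not used to pick a new node. A small diagnostic stage like the sketch below (an assumption, not part of the original pipeline) can confirm whether the Docker CLI is actually available where the stage runs:

stage('check docker availability') {
    steps {
        container('linux') {
            // If this prints the fallback message, the docker {} agent cannot be
            // created on this node either.
            sh 'command -v docker || echo "Docker CLI is not installed on this node"'
        }
    }
}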

Related

How to invoke a job from one slave (server) to another slave (server) in Jenkins

Please, can you advise, as I am planning to invoke a job that is on a different server. For example:
Slave1 (server1): deploy job
Slave2 (server2): build job
I want the build job to trigger the deploy job. Any suggestions, please?
If you are trying to run two jobs on the same Jenkins server, your pipeline should look something like the one below. From the build job you can call the deploy job.
Build Job
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                build job: 'DeployJobName'
            }
        }
    }
}
Deploy Job
pipeline {
    agent { label 'server1' }
    stages {
        stage('Deploy') {
            steps {
                // Deploy something
            }
        }
    }
}
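If the deploy job needs input from the build, or the build should wait for the deployment result, the build step also accepts parameters, wait and propagate arguments. A minimal sketch; DEPLOY_ENV is a hypothetical parameter assumed to exist on DeployJobName:

stage('Build') {
    steps {
        // Trigger the downstream job, pass a parameter, and wait for its result.
        build job: 'DeployJobName',
              parameters: [string(name: 'DEPLOY_ENV', value: 'staging')],
              wait: true,
              propagate: true
    }
}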
Update
If you want to trigger a Job on a different Jenkins server, you can use a plugin like RemoteTriggerPlugin or simply use the Jenkins API.
curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                sh 'curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN'
            }
        }
    }
}
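If the remote Jenkins server has security enabled, the trigger request usually needs to authenticate as well, which curl can do with a user name and API token. A sketch, assuming a hypothetical username/password credential with id jenkins-api-token that stores a Jenkins user and its API token:

stage('Build') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                          usernameVariable: 'JUSER',
                                          passwordVariable: 'JTOKEN')]) {
            // POST the remote trigger with basic auth in addition to the job token.
            sh 'curl -X POST --user "$JUSER:$JTOKEN" "http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN"'
        }
    }
}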

Can't use specific images in Jenkinsfile

I've got a problem with a pipeline in which I want to build ARM images.
So I wanted to use an arm32v7 image as the build agent:
pipeline {
    agent {
        docker {
            image 'arm32v7/docker:dind'
        }
    }
    stages {
        stage('Build atom images') {
            steps {
                // Building my images.
            }
        }
        stage('Push to registry') {
            agent any
            steps {
                script {
                    withDockerRegistry(credentialsId: 'cred', url: 'https://registry.custom') {
                        // Pushing images to registry.
                    }
                }
            }
        }
    }
}
But when the pipeline runs, here's what I get:
+ docker inspect -f . arm32v7/docker:dind
.
Failed to run top '81cc646256b727780420048da5ff10e5a3256510fc8a787137651941ee54d8a0'. Error: Error response from daemon: Container 81cc646256b727780420048da5ff10e5a3256510fc8a787137651941ee54d8a0 is not running
This happens with every kind of image I choose. Can you help me with this? Am I missing something?
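The "Container ... is not running" message means the agent container exits immediately after Jenkins starts it. One thing worth checking (a diagnostic sketch only, with a 'docker' label assumed to point at a node that has the Docker CLI) is whether the host can execute the arm32v7 image at all; on a plain x86_64 host without QEMU/binfmt emulation such an image typically dies instantly with an exec format error:

pipeline {
    agent { label 'docker' }   // assumption: a node where the Docker CLI is available
    stages {
        stage('Inspect image') {
            steps {
                // Architecture of the host running the Docker daemon.
                sh 'uname -m'
                // Try to run uname inside the ARM image; on an x86_64 host without
                // emulation this usually fails with "exec format error".
                sh 'docker run --rm arm32v7/docker:dind uname -m'
            }
        }
    }
}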

Jenkins pipeline post build to run on a previously used agent workspace

In a Jenkins pipeline master-slave environment with multiple slave types, how can I force a post-build task to run on the same slave and workspace where a previous stage was executed?
For example, in the following pipeline snippet, three different slaves are used. The "DynamicServer Creation" stage is executed on a "miscslave" agent, where we run "vagrant up". The next two stages are executed on different slaves, 'performance_slave' and 'seleniumslave'. Once the tests have run against the dynamic servers, we need to destroy them by running "vagrant destroy" as a post-stage task. However, it needs to run from the same "miscslave" workspace where "vagrant up" was executed, because that is where the ".vagrant" directory with the dynamic server machine info was created, and that directory is required to run the destroy.
How can we force the pipeline to execute this post-build task in the same workspace where "DynamicServer Creation" was executed?
pipeline {
    agent { label 'miscslave' }
    stages {
        stage('Stage 1') { }
        stage("DynamicServer Creation") {
            agent {
                label 'miscslave'
            }
            stages {
                stage('DynamicServer Creation') {
                    // Create the dynamic server using "vagrant up"; this creates a .vagrant
                    // dir holding the created machine info
                }
                stage('DynamicServer Test') {
                    parallel {
                        stage("Execute Performance Tests") {
                            agent { label 'performance_slave' }
                        }
                        stage("Execute UI Tests") {
                            agent { label 'seleniumslave' }
                        }
                    }
                }
            }
            post {
                always {
                    // Delete the dynamic server using "vagrant destroy". It has to run in
                    // the same workspace where "vagrant up" was executed in the
                    // "DynamicServer Creation" stage
                }
            }
        }
    }
}
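One common way to handle this (only a sketch; CREATE_NODE and CREATE_WS are hypothetical environment variables introduced here for illustration) is to record the node and workspace in the creation stage and then pin the cleanup back to them from a scripted node/dir block in post:

stage('DynamicServer Creation') {
    steps {
        script {
            // Remember where "vagrant up" ran so the cleanup can come back here.
            env.CREATE_NODE = env.NODE_NAME
            env.CREATE_WS   = env.WORKSPACE
            sh 'vagrant up'
        }
    }
}
...
post {
    always {
        script {
            // Re-enter the same node and the same workspace that created the machine.
            node(env.CREATE_NODE) {
                dir(env.CREATE_WS) {
                    sh 'vagrant destroy -f'
                }
            }
        }
    }
}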

How to configure Jenkins to run my test cases in parallel?

I am building a test system with Jenkins with multiple slave nodes. I have multiple test cases, each of which takes more than 15 minutes to run.
I want to set the system up so that when I start the tests, Jenkins runs each test case on a free node and, at the end, collects and summarizes the results.
I have created a generic, parameterized test-case job whose parameter is the test name. But I see that Jenkins executes the builds sequentially.
How can I configure Jenkins to run builds for the same job (with different parameters) in parallel?
Simple syntax for parallel stages:
pipeline {
    agent none   // each parallel stage declares its own agent below
    stages {
        stage('Run Tests In Parallel') {
            parallel {
                stage('Projects Test 1') {
                    agent {
                        node { label "your jenkins label" }
                    }
                    steps {
                        script {
                            // your test 1
                        }
                    }
                    post {
                        always {
                            script {
                                echo 'always'
                            }
                        }
                    }
                }
                stage('Projects Test 2') {
                    agent {
                        node { label "your jenkins label" }
                    }
                    steps {
                        script {
                            // your test 2
                        }
                    }
                    post {
                        always {
                            script {
                                echo 'always'
                            }
                        }
                    }
                }
            }
        }
    }
}
Hi, you can use parallel stages in Jenkins, which run in parallel. Also use agent any at each stage so it will use any free node.
Check the Parallel Stages documentation for more info.
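If the set of test cases is not fixed, the parallel branches can also be generated dynamically inside a script block, so every test name gets its own branch and Jenkins schedules each branch on whichever labeled node is free. A sketch only; the test names and run_test.sh are made-up placeholders:

pipeline {
    agent none
    stages {
        stage('Run Tests In Parallel') {
            steps {
                script {
                    // Hypothetical list of test names; in practice this could come
                    // from a job parameter or a file in the repository.
                    def testNames = ['login', 'checkout', 'search']
                    def branches = [:]
                    testNames.each { name ->
                        branches[name] = {
                            // Each branch grabs any free node carrying the label.
                            node('your jenkins label') {
                                sh "./run_test.sh ${name}"   // run_test.sh is a placeholder
                            }
                        }
                    }
                    parallel branches
                }
            }
        }
    }
}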

Jenkinsfile - Agents questions

I have a few questions about the following example:
pipeline {
    agent { label "docker" }
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'maven:3.5.0-jdk-8'
                }
            }
            steps {
                ...
            }
        }
    }
}
Question 1:
When I declare an agent at the top level of the Jenkinsfile, it is used for all the stages below it. So what is the difference between:
agent { label "docker" }
and
agent {
    docker {
        image 'maven:3.5.0-jdk-8'
    }
}
Will the first one use the docker agent and the second use a Docker agent with the Maven image as the execution environment? Where is the agent with label "docker" configured/installed?
Question 2:
How does the label tag work? I know that an agent has already been created somewhere and that with label I just point to it, like in the example above: by default I use the "docker" agent? Does that also mean that during steps {...} this agent is overridden by the Maven agent?
Question 3:
Last question, for the following example:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v ... -e ...'
        }
    }
    stages {
        stage('Maven Build') {
            steps {
                sh '...'
            }
        }
        stage('Docker push') {
            agent {
                docker 'openjdk:8-jdk-alpine'
            }
            steps {
                script {
                    docker.build("my-image")
                }
            }
        }
    }
    post {
        ...
    }
}
I want to build the first stage using a Docker container with the maven:3-alpine image. During the build, the following error is printed:
...tmp/durable-54b54bdc/script.sh: line 1: docker: not found
So I modified this example; here is the working result:
pipeline {
    agent any
    stages {
        stage('Docker push') {
            steps {
                script {
                    docker.build("my-image")
                }
            }
        }
    }
}
How does agent any work in this case? Which agent can execute docker.build?
Answer 1:
agent { label "docker" }
This will try to find an agent with the label docker and execute the steps on that agent.
agent {
    docker {
        image 'maven:3.5.0-jdk-8'
    }
}
This will try to pull the Docker image maven:3.5.0-jdk-8, start the container, and execute the steps mentioned in the pipeline. If you are using a MultiJob setup, this will be executed on the slave whose label matches the Docker Label configured in Jenkins' global configuration.
Answer 2:
Defining an agent at the top level of the pipeline ensures that an executor will be assigned on the agent labeled docker. To my knowledge, the Docker container will then be created on the agent labeled docker and the steps will be executed inside the container.
Answer 3:
The reason could be that you have not configured the Docker Label (mentioned above), so the task may have executed on the master, where Docker is not installed. The reason the other one works could be that the job is executed on an agent where Docker is installed.
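If relying on the globally configured Docker Label is not desirable, the node can also be pinned directly in the agent block, so the container is guaranteed to be started on a node that actually has Docker installed. A minimal sketch, where 'docker' is assumed to be the label of such a node:

pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            label 'docker'   // assumption: nodes with this label have Docker installed
        }
    }
    stages {
        stage('Maven Build') {
            steps {
                // Runs inside the maven:3-alpine container on a 'docker'-labeled node.
                sh 'mvn -v'
            }
        }
    }
}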
