Using withEnv in a declarative pipeline - Jenkins

I'm trying to run a docker command in my declarative pipeline. To install the docker environment on my slave machine I'm trying to use the Docker Commons plugin (https://plugins.jenkins.io/docker-commons/), but with no success.
On further research I found the link below, which describes how to use this plugin:
https://automatingguy.com/2017/11/06/jenkins-pipelines-simple-delivery-flow/
I have configured docker in Manage Jenkins -> Global Tool Configuration, but I can't figure out how to use the section below in my declarative Jenkins pipeline; I think this structure/syntax only works for a scripted Jenkins pipeline:
def dockerTool = tool name: 'docker',
    type: 'org.jenkinsci.plugins.docker.commons.tools.DockerTool'
withEnv(["DOCKER=${dockerTool}/bin"]) {
    stages {}
}
Can someone please help me understand how I can use the Docker Commons tool in a declarative Jenkins pipeline?
Note: I cannot switch to a scripted pipeline due to standardization with other projects.

Here is a working example:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    test_env = "this is test env"
                    withEnv(["myEnv=${test_env}"]) {
                        echo "${env.myEnv}"
                    }
                }
            }
        }
    }
}

I have a feeling that you don't need to use either withEnv or Docker Commons. Have you seen this? https://www.jenkins.io/doc/book/pipeline/docker/
There are plenty of good examples there of how to use docker with a Jenkinsfile.
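For instance, running the whole pipeline inside a container needs nothing more than a docker agent. A minimal sketch (the image name here is just an example):
pipeline {
    agent {
        docker {
            image 'python:3.6'  // any image the agent can pull
        }
    }
    stages {
        stage('check') {
            steps {
                // runs inside the container, so the image's tools are on PATH
                sh 'python --version'
            }
        }
    }
}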
My attempt to answer your question (if I got it right): if you are asking about a declarative equivalent for the scripted withEnv, then you are probably looking for environment {}. Something like this:
pipeline {
    agent any
    environment {
        // note: dockerTool would still need to be resolved somewhere,
        // e.g. via the tool step inside a script block
        DOCKER = "${dockerTool}/bin"
    }
    stages {
        stage('One') {
            steps {
                // steps here
            }
        }
    }
}

Here is a working declarative pipeline solution as of Docker Commons v1.17.
Note: the tool name dockerTool is a keyword, and docker-19.03.11 is the name I gave my installation on the Jenkins > Manage Jenkins > Global Tool Configuration page.
pipeline {
    agent any
    tools {
        dockerTool 'docker-19.03.11'
    }
    stages {
        stage('build') {
            steps {
                sh '''
                    echo 'FROM mongo:3.2' > Dockerfile
                    echo 'CMD ["/bin/echo", "HELLO WORLD...."]' >> Dockerfile
                '''
                script {
                    docker.withRegistry('http://192.168.99.100:5000/v2/') {
                        def image = docker.build("test/helloworld2:${env.BUILD_NUMBER}")
                    }
                }
            }
        }
    }
}

Related

How to select a Jenkins agent for a build based on the GitHub branch?

Our project is written in Java and deployed to a Solaris environment in production. But most of our test and dev machines are Linux, and so are most of our Jenkins agents.
I'm looking for a way to run the Jenkins build on a Solaris agent only when the branch is master/release_branch, and to choose a Linux agent when the branch is something else.
The idea is to ensure we haven't introduced any compatibility issues on Solaris.
I'm looking for a declarative pipeline approach, something like the following, which will also select a Linux agent when the condition is not met:
stage('build') {
    steps {
        sh 'mvn clean'
    }
    when {
        branch comparator: 'EQUALS', pattern: 'master'
        beforeAgent true
    }
    agent {
        label 'Solaris'
    }
}
I don't think when can do that. If you don't mind a bit of scripting, you can do it like this:
pipeline {
    agent none
    stages {
        stage('build') {
            agent {
                label env.GIT_BRANCH == 'master' ? 'Solaris' : 'Linux'
            }
            steps {
                sh 'hostname'
            }
        }
    }
}
This assumes you are storing the pipeline in git. I'm sure Bitbucket has a similar env variable.
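As a side note, and depending on your job type: in a multibranch pipeline the branch name is exposed as env.BRANCH_NAME rather than GIT_BRANCH, so the same trick would become:
agent {
    // BRANCH_NAME is populated by multibranch pipeline jobs
    label env.BRANCH_NAME == 'master' ? 'Solaris' : 'Linux'
}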

Jenkins Declarative Pipeline with Mandatory stages

I am trying to build a Jenkins declarative pipeline with a Jenkinsfile. The Jenkinsfile would be present in the repo of the project and would be something like the following:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
However, I would like to enforce some stages in Jenkins regardless of the file. So, as an example, the pipeline would run the Build -> Test -> Deploy stages from the file plus an additional stage predefined in Jenkins, as if it were a scripted pipeline.
Do you know if this is possible? How can I do it?
You could use a shared library to reuse code. It won't be as smooth as you'd probably like, but you can use global variables and/or custom steps to encapsulate common functionality.
This requires some glue code (@Library(), script { }, etc.), but the approach is very explicit and powerful - you can use a library anywhere in your pipeline.
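For example, a global variable in the library can wrap the enforced stage so each repo's Jenkinsfile only supplies the project-specific part. A minimal sketch, with made-up library and step names:
// vars/standardPipeline.groovy in the shared library
def call(Closure projectSteps) {
    pipeline {
        agent any
        stages {
            stage('Project steps') {
                steps {
                    script {
                        projectSteps()  // the part each repo supplies
                    }
                }
            }
            stage('Mandatory stage') {  // enforced for every project
                steps {
                    echo 'Running the mandatory stage...'
                }
            }
        }
    }
}
The repo's Jenkinsfile then shrinks to:
@Library('my-shared-lib') _
standardPipeline {
    sh './build.sh'
}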
Maybe you can also try out the Jenkins Templating Engine.
It gives you comprehensive Jenkins pipeline governance and templating capabilities.

Set environment variables from a shell script in Jenkins

I am trying to automate my build using Jenkins. My build process needs to execute three different shell scripts. The first script sets some environment variables which are used by the second and third scripts.
I am trying a pipeline job in Jenkins where each script is executed stage by stage. However, I am unable to get the environment variables from the first script into the next one.
NB: There is a whole set of variables being set, so I don't feel a single simple variable will do.
Please help.
You are probably confusing declarative pipelines with scripted pipelines.
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                sh 'printenv'
            }
        }
    }
}
Jenkinsfile (Scripted Pipeline)
node {
    withEnv(['DISABLE_AUTH=true',
             'DB_ENGINE=sqlite']) {
        stage('Build') {
            sh 'printenv'
        }
    }
}
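If the variables really are computed by the first shell script at runtime, neither block above will pick them up by itself. One common workaround is to have the script write them to a properties file and read it back; a sketch assuming the Pipeline Utility Steps plugin is installed (the script and file names are made up):
pipeline {
    agent any
    stages {
        stage('Generate') {
            steps {
                // the first script dumps its variables as KEY=VALUE lines
                sh './set-vars.sh > build.properties'
            }
        }
        stage('Use') {
            steps {
                script {
                    // readProperties is provided by Pipeline Utility Steps
                    def props = readProperties file: 'build.properties'
                    withEnv(props.collect { k, v -> "${k}=${v}" }) {
                        sh './second-script.sh'  // sees the variables
                    }
                }
            }
        }
    }
}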

How to run the same job with two different agents with Declarative Syntax?

I have two jobs running on two different OSes; all the build steps are the same, and it is the tools (JDK and Maven) and the metadata that differ.
I want to make one job that covers both jobs on two agents, depending on the OS.
I'm using Jenkins declarative pipeline syntax; the problem is that I couldn't find a way to declare tools for a specific agent.
In a Jenkins pipeline we can declare tools for the entire pipeline or for a specific stage, and that's it.
PS: I need to use the declarative syntax: no use of node {}.
If I do this:
stage('Environment Set Up Linux') {
    agent {
        label "linux"
    }
    tools {
        jdk 'oracle-jdk-1.8'
    }
    steps {
        echo "Environment tools have been configured"
    }
}
stage('Environment Set Up Solaris') {
    agent {
        label "solaris-64"
    }
    tools {
        jdk 'oracle-jdk-1.7'
    }
    steps {
        echo "Environment tools have been configured"
    }
}
The tools will be used only for those stages, not all stages, and declaring tools in every stage would be stupid.
Define the common tools, which are available on every slave, for the entire pipeline, and the specific ones in the stage section:
pipeline {
    agent any
    tools {
        maven 'Maven 3.3.9'
    }
    stages {
        stage('test') {
            tools {
                maven 'Maven 2.2.1'
            }
            steps {
                sh 'mvn --version'
            }
        }
        stage('random') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
In this case the output in the stage 'test' is 2.2.1, because I define the tools in the stage section, which overrides the global pipeline definition. In the stage 'random' I define no tools inside the stage, so the tools defined in the global pipeline are used and 3.3.9 is printed. I hope this is what you meant.
In your case it could be that all agents contain JDK 1.8 and you want to use it in nearly every stage (define it in the pipeline); if there is one stage in which you want to use JDK 1.7, just define the tools in that stage section, which will override the global config.
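Applied to the JDKs from the question, that would look something like this (a sketch; the tool names must match what is configured under Global Tool Configuration):
pipeline {
    agent any
    tools {
        jdk 'oracle-jdk-1.8'  // default for every stage
    }
    stages {
        stage('Solaris build') {
            agent {
                label 'solaris-64'
            }
            tools {
                jdk 'oracle-jdk-1.7'  // overrides the global JDK for this stage only
            }
            steps {
                sh 'java -version'
            }
        }
        stage('Any other stage') {
            steps {
                sh 'java -version'  // uses the global oracle-jdk-1.8
            }
        }
    }
}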

Jenkins Pipeline Across Multiple Docker Images

Using a declarative pipeline in Jenkins, how do I run stages across multiple versions of a docker image? I want to execute the following Jenkinsfile on Python 2.7, 3.5, and 3.6. Below is a pipeline file for building and testing a Python project in a docker container:
pipeline {
    agent {
        docker {
            image 'python:2.7.14'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'pip install pipenv'
                sh 'pipenv install --dev'
            }
        }
        stage('Test') {
            steps {
                sh 'pipenv run pytest --junitxml=TestResults.xml'
            }
        }
    }
    post {
        always {
            junit 'TestResults.xml'
        }
    }
}
What is the minimal amount of code to make sure the same steps succeed across Python 3.5 and 3.6? The hope is that if a test fails, it is evident which version(s) it fails on.
Or is what I'm asking for not possible for declarative pipelines (i.e. are scripted pipelines what would most elegantly solve this problem)?
As a comparison, Travis CI lets you specify runs across different Python versions declaratively in .travis.yml.
I had to resort to a scripted pipeline and combine all the stages:
def pythons = ["2.7.14", "3.5.4", "3.6.2"]
def steps = pythons.collectEntries {
    ["python $it": job(it)]
}
parallel steps

def job(version) {
    return {
        docker.image("python:${version}").inside {
            checkout scm
            sh 'pip install pipenv'
            sh 'pipenv install --dev'
            sh 'pipenv run pytest --junitxml=TestResults.xml'
            junit 'TestResults.xml'
        }
    }
}
The resulting pipeline runs the three versions as parallel branches, so a failing test is attributed to the version it failed on.
Ideally we'd be able to break each job up into stages (Setup, Build, Test), but the UI currently doesn't support this (still not supported).
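For what it's worth, newer versions of the Declarative Pipeline plugin added a matrix directive that can express this without dropping to scripted syntax. A sketch, assuming a plugin version recent enough to support matrix:
pipeline {
    agent none
    stages {
        stage('BuildAndTest') {
            matrix {
                axes {
                    axis {
                        name 'PYTHON_VERSION'
                        values '2.7.14', '3.5.4', '3.6.2'
                    }
                }
                // each cell of the matrix runs in its own container
                agent {
                    docker { image "python:${PYTHON_VERSION}" }
                }
                stages {
                    stage('Build') {
                        steps {
                            sh 'pip install pipenv'
                            sh 'pipenv install --dev'
                        }
                    }
                    stage('Test') {
                        steps {
                            sh 'pipenv run pytest --junitxml=TestResults.xml'
                        }
                    }
                }
                post {
                    always {
                        junit 'TestResults.xml'
                    }
                }
            }
        }
    }
}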
