Set environment variables from a shell script in Jenkins

I am trying to automate my build using Jenkins. My build process needs to execute three different shell scripts. The first script sets some environment variables which are used by the second and third scripts.
I am trying a pipeline job in Jenkins where each script is executed stage by stage. However, I am unable to get the environment variables from the first script across to the next one.
NB: a whole set of variables is being set, so I don't think passing a single variable around will do.
Please help.

You are probably confusing Declarative Pipeline with Scripted Pipeline.
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                sh 'printenv'
            }
        }
    }
}
Jenkinsfile (Scripted Pipeline)
node {
    withEnv(['DISABLE_AUTH=true',
             'DB_ENGINE=sqlite']) {
        stage('Build') {
            sh 'printenv'
        }
    }
}
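Both snippets above set static values. Since the original question involves variables produced by a shell script, one common pattern is to have the first script write KEY=VALUE lines to a file and load them back before the later stages run. A minimal sketch, assuming the Pipeline Utility Steps plugin (for readProperties); setenv.sh and build.properties are hypothetical names:
pipeline {
    agent any
    stages {
        stage('Set variables') {
            steps {
                // setenv.sh is assumed to print lines such as DB_ENGINE=sqlite
                sh './setenv.sh > build.properties'
                script {
                    def props = readProperties file: 'build.properties'
                    // Promote every entry into the build environment so later stages see it
                    props.each { k, v -> env."${k}" = v }
                }
            }
        }
        stage('Use variables') {
            steps {
                sh 'echo "DB_ENGINE is $DB_ENGINE"'
            }
        }
    }
}
This loads the whole set of variables at once, which matches the NB in the question.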

Related

How can I use the Environment Injector Plugin with a Jenkinsfile or Pipeline?

I have a Jenkins declarative file and have added the Environment Injector Plugin to Jenkins.
I use the Environment Injector Plugin to configure/inject environment variables. The problem is that when I run printenv to check the available environment variables, nothing has been added.
I use the same plugin for a Freestyle project and it works fine. Can I use the Environment Injector Plugin with a pipeline, or is it not possible at all?
I have tried echo $VAR_NAME and printenv inside the pipeline declaration, with no luck.
You don't have to rely on the Environment Injector plugin to set environment variables in a declarative pipeline. You can use an environment block instead. Check here for details.
pipeline {
    agent {
        label '!windows'
    }
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                echo "Database engine is ${DB_ENGINE}"
                echo "DISABLE_AUTH is ${DISABLE_AUTH}"
                sh 'printenv'
            }
        }
    }
}

Jenkins Declarative Pipeline with Mandatory stages

I am trying to build a Jenkins declarative pipeline with a Jenkinsfile. The Jenkinsfile would live in the project's repo.
It would be something like the following:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
However, I would like to enforce some stages in Jenkins regardless of the file. So, as an example, the pipeline would run the Build -> Test -> Deploy stages from the file plus an additional stage predefined on Jenkins, as if it were a scripted pipeline.
Do you know if this is possible? How can I do it?
You could use a shared library to reuse code. It won't be as smooth as you would probably like, but you can use global variables and/or custom steps to encapsulate common functionality.
This requires some glue code (@Library(), script { }, etc.), but the approach is very explicit and powerful - you can use the library anywhere in your pipeline. A minimal sketch is shown below.
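For illustration, here is a sketch of that approach; the names standardPipeline and my-shared-library are hypothetical. The library defines a global variable that wraps the caller's stages with a stage enforced on the Jenkins side:
// vars/standardPipeline.groovy in the shared library repository
def call(Closure body) {
    node {
        // Mandatory stage, runs regardless of what the project Jenkinsfile contains
        stage('Enforced checks') {
            echo 'Running stages predefined on Jenkins...'
        }
        // Now run whatever stages the project Jenkinsfile supplies
        body()
    }
}
The project's Jenkinsfile then shrinks to:
@Library('my-shared-library') _
standardPipeline {
    stage('Build') {
        echo 'Building..'
    }
    stage('Test') {
        echo 'Testing..'
    }
}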
Alternatively, maybe you can try out the Jenkins Templating Engine. It gives you comprehensive Jenkins pipeline governance and templating capability.

Using withEnv in a declarative pipeline

I'm trying to run a docker command in my declarative pipeline. To set up the docker environment on my slave machine I'm trying to use the Docker Commons plugin (https://plugins.jenkins.io/docker-commons/), but with no success.
Further research led me to the link below, which mentions how to use this plugin:
https://automatingguy.com/2017/11/06/jenkins-pipelines-simple-delivery-flow/
I have configured docker in Manage Jenkins -> Global Tool Configuration, but I can't figure out how to use the section below in my declarative Jenkins pipeline; I think this structure/syntax only works for a scripted Jenkins pipeline:
def dockerTool = tool name: 'docker', type: 'org.jenkinsci.plugins.docker.commons.tools.DockerTool'
withEnv(["DOCKER=${dockerTool}/bin"]) {
    stages {}
}
Can someone please help me understand how I can use the Docker Commons tool in a declarative Jenkins pipeline?
Note: I cannot switch to a scripted pipeline due to standardization with other projects.
Here is a working example:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    test_env = "this is test env"
                    withEnv(["myEnv=${test_env}"]) {
                        echo "${env.myEnv}"
                    }
                }
            }
        }
    }
}
I have a feeling that you don't need either withEnv or Docker Commons. Have you seen this? https://www.jenkins.io/doc/book/pipeline/docker/
There are plenty of good examples there of how to use docker with a Jenkinsfile, along the lines of the sketch below.
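For reference, the basic pattern from that page runs a stage's steps inside a container; it assumes the Docker Pipeline plugin, and the image name here is only an example:
pipeline {
    agent {
        docker { image 'maven:3-alpine' }
    }
    stages {
        stage('Build') {
            steps {
                // Executed inside the maven:3-alpine container
                sh 'mvn --version'
            }
        }
    }
}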
My attempt to answer your question (if I got it right): if you are asking about the declarative equivalent of the scripted withEnv, then you are probably looking for environment {}. Something like this:
pipeline {
    agent any
    environment {
        // dockerTool would need to be defined elsewhere, e.g. via a tools block as in the next answer
        DOCKER = "${dockerTool}/bin"
    }
    stages {
        stage('One') {
            steps {
                // steps here
            }
        }
    }
}
Here is a working declarative pipeline solution as of Docker Commons v1.17.
Note: the tool name dockerTool is a keyword, and docker-19.03.11 is the name I gave my installation on the Jenkins > Manage Jenkins > Global Tool Configuration page. The docker.withRegistry and docker.build calls come from the Docker Pipeline plugin, which is why they sit inside a script block.
pipeline {
    agent any
    tools {
        dockerTool 'docker-19.03.11'
    }
    stages {
        stage('build') {
            steps {
                sh '''
                    echo 'FROM mongo:3.2' > Dockerfile
                    echo 'CMD ["/bin/echo", "HELLO WORLD...."]' >> Dockerfile
                '''
                script {
                    docker.withRegistry('http://192.168.99.100:5000/v2/') {
                        def image = docker.build('test/helloworld2:$BUILD_NUMBER')
                    }
                }
            }
        }
    }
}

How to run the same job with two different agents with Declarative Syntax?

I have two jobs running on two different OSes; all the build steps are the same, and it is the tools (JDK and Maven) and the metadata that differ.
I want to make one job that covers both jobs on two agents, depending on the OS.
I'm using Jenkins Pipeline declarative syntax, and the problem is that I couldn't find a way to declare tools for a specific agent.
In a Jenkins pipeline, we can declare tools for the entire pipeline or for a specific stage, and that's it.
PS: I need to use the declarative syntax: no use of node {}.
If I do so:
stage('Environment Set Up Linux') {
    agent {
        label "linux"
    }
    tools {
        jdk 'oracle-jdk-1.8'
    }
    steps {
        echo "Environment tools have been configured"
    }
}
stage('Environment Set Up Solaris') {
    agent {
        label "solaris-64"
    }
    tools {
        jdk 'oracle-jdk-1.7'
    }
    steps {
        echo "Environment tools have been configured"
    }
}
The tools will be used only in those stages, not all stages, and declaring tools in every single stage would be stupid.
Define the common tools that are available on every slave at the pipeline level, and the specific ones in the stage section:
pipeline {
    agent any
    tools {
        maven 'Maven 3.3.9'
    }
    stages {
        stage('test') {
            tools {
                maven 'Maven 2.2.1'
            }
            steps {
                sh 'mvn --version'
            }
        }
        stage('random') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}
In this case the output in the stage 'test' is 2.2.1, because the tools defined in the stage section override the pipeline-level ones. In the stage 'random' I define no tools, so the tools defined at the pipeline level are used and 3.3.9 is printed. I hope this is what you meant.
In your case it could be that all agents contain JDK 1.8 and you want to use it in nearly every stage (define it at the pipeline level); if there is one stage in which you want to use JDK 1.7, just define the tools in that stage's section, which will override the global config. A sketch applied to your Linux/Solaris setup follows.
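A minimal sketch of this applied to the original question (agent labels and tool installation names are taken from the question and assumed to exist in Global Tool Configuration):
pipeline {
    agent any
    tools {
        jdk 'oracle-jdk-1.8'    // default JDK for stages that do not override it
    }
    stages {
        stage('Build on Linux') {
            agent { label 'linux' }
            steps {
                sh 'java -version'
            }
        }
        stage('Build on Solaris') {
            agent { label 'solaris-64' }
            tools {
                jdk 'oracle-jdk-1.7'    // overrides the pipeline-level JDK for this stage only
            }
            steps {
                sh 'java -version'
            }
        }
    }
}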

No such DSL method `stages`

I'm trying to create my first Groovy script for Jenkins. After looking at https://jenkins.io/doc/book/pipeline/, I created this:
node {
    stages {
        stage('HelloWorld') {
            echo 'Hello World'
        }
        stage('git clone') {
            git clone "ssh://git@mywebsite.example/myrepo.git"
        }
    }
}
However, I'm getting:
java.lang.NoSuchMethodError: No such DSL method "stages" found among steps
What am I missing?
Also, how can I pass my credentials to the Git Repository without writing the password in plain text?
You are confusing and mixing Scripted Pipeline with Declarative Pipeline; for the complete difference, see here. The short story:
Declarative Pipeline is a newer extension of the pipeline DSL. It is basically a pipeline script with only one step, a pipeline step whose arguments (called directives) must follow a specific syntax. The point of this format is that it is stricter and therefore should be easier for those new to pipelines, allows for graphical editing, and much more.
Scripted Pipeline is the fallback for advanced requirements.
So, if we look at your script: you first open a node step, which is from Scripted Pipeline. Then you use stages, which is one of the directives of the pipeline step defined in Declarative Pipeline. So you could, for example, write:
pipeline {
    ...
    stages {
        stage('HelloWorld') {
            steps {
                echo 'Hello World'
            }
        }
        stage('git clone') {
            steps {
                // The git step takes the URL directly; there is no 'clone' keyword
                git "ssh://git@mywebsite.example/myrepo.git"
            }
        }
    }
}
So if you want to use Declarative Pipeline, that is the way to go.
If you want a Scripted Pipeline, then you write:
node {
    stage('HelloWorld') {
        echo 'Hello World'
    }
    stage('git clone') {
        // As above: no 'clone' keyword in the git step
        git "ssh://git@mywebsite.example/myrepo.git"
    }
}
That is: skip the stages block.
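As for passing credentials to the Git repository without writing the password in plain text: the git step accepts a credentialsId that refers to credentials stored in Jenkins (Manage Jenkins > Credentials), so nothing secret ends up in the Jenkinsfile. The ID my-ssh-key below is hypothetical:
git url: "ssh://git@mywebsite.example/myrepo.git", credentialsId: 'my-ssh-key'
The actual key or password lives in the Jenkins credentials store, not in the Jenkinsfile.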
A Jenkinsfile can be written using two types of syntax - Declarative and Scripted.
Declarative and Scripted Pipelines are constructed fundamentally differently. Declarative Pipeline is a more recent feature of Jenkins Pipeline which:
provides richer syntactical features over Scripted Pipeline syntax, and
is designed to make writing and reading Pipeline code easier.
Many of the individual syntactical components (or "steps") written into a Jenkinsfile, however, are common to both Declarative and Scripted Pipeline.
Example:
Declarative Pipeline fundamentals
In Declarative Pipeline syntax, the pipeline block defines all the work done throughout your entire Pipeline.
Jenkinsfile (Declarative Pipeline):
pipeline {
    agent any                // (1)
    stages {
        stage('Build') {     // (2)
            steps {
                // (3)
            }
        }
        stage('Test') {      // (4)
            steps {
                // (5)
            }
        }
        stage('Deploy') {    // (6)
            steps {
                // (7)
            }
        }
    }
}
1. Execute this Pipeline, or any of its stages, on any available agent.
2. Defines the "Build" stage.
3. Perform some steps related to the "Build" stage.
4. Defines the "Test" stage.
5. Perform some steps related to the "Test" stage.
6. Defines the "Deploy" stage.
7. Perform some steps related to the "Deploy" stage.
Scripted Pipeline fundamentals
In Scripted Pipeline syntax, one or more node blocks do the core work throughout the entire Pipeline. Although this is not a mandatory requirement of Scripted Pipeline syntax, confining your Pipeline's work inside of a node block does two things:
1. Schedules the steps contained within the block to run by adding an item to the Jenkins queue. As soon as an executor is free on a node, the steps will run.
2. Creates a workspace (a directory specific to that particular Pipeline) where work can be done on files checked out from source control.
Caution: Depending on your Jenkins configuration, some workspaces may not get automatically cleaned up after a period of inactivity. See tickets and discussion linked from JENKINS-2111 for more information.
Jenkinsfile (Scripted Pipeline):
node {                   // (1)
    stage('Build') {     // (2)
        // (3)
    }
    stage('Test') {      // (4)
        // (5)
    }
    stage('Deploy') {    // (6)
        // (7)
    }
}
1. Execute this Pipeline, or any of its stages, on any available agent.
2. Defines the "Build" stage. stage blocks are optional in Scripted Pipeline syntax. However, implementing stage blocks in a Scripted Pipeline provides clearer visualization of each stage's subset of tasks/steps in the Jenkins UI.
3. Perform some steps related to the "Build" stage.
4. Defines the "Test" stage.
5. Perform some steps related to the "Test" stage.
6. Defines the "Deploy" stage.
7. Perform some steps related to the "Deploy" stage.
Pipeline example
Here is an example Jenkinsfile in Declarative Pipeline syntax and its equivalent Scripted Pipeline syntax:
Jenkinsfile (Declarative Pipeline):
pipeline {
    agent any
    options {
        skipStagesAfterUnstable()
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
        stage('Test') {
            steps {
                sh 'make check'
                junit 'reports/**/*.xml'
            }
        }
        stage('Deploy') {
            steps {
                sh 'make publish'
            }
        }
    }
}
Jenkinsfile (Scripted Pipeline):
node {
    stage('Build') {
        sh 'make'
    }
    stage('Test') {
        sh 'make check'
        junit 'reports/**/*.xml'
    }
    // Equivalent of skipStagesAfterUnstable(): deploy only while the build is still successful
    if (currentBuild.currentResult == 'SUCCESS') {
        stage('Deploy') {
            sh 'make publish'
        }
    }
}
