How to retrieve the Jenkins environment from a Groovy script?

I am setting up Jenkins. I am programming my pipeline using Global Pipeline Libraries to increase reusability. The scripts are object oriented and written in Groovy. Information about the concept can be found there.
I can't manage to retrieve the Jenkins-specific environment variables from my library script. For instance, I would like to access:
BUILD_ID
BUILD_NUMBER
JOB_NAME
WORKSPACE (the workspace path)
I tried to use env.WORKSPACE but it returns null. I can retrieve it directly in the pipeline, but that is not my goal.
I am using Jenkins 2.303.1.

Depending on how you write your scripts, you might need to inject the Jenkins steps into your classes. For example, if you go for a more object-oriented approach:
// vars/whatever.groovy
import groovy.transform.Field
import ...

@Field
def myTool = new MyTool(this)

// src/.../MyTool.groovy
import ...

class MyTool {
    private final jenkins

    MyTool(steps) {
        this.jenkins = steps
    }

    def echoBuildNumber() {
        this.jenkins.echo(this.jenkins.env.BUILD_NUMBER)
    }
}

// Jenkinsfile
@Library(...) _
node {
    echo env.BUILD_NUMBER              // echoes the build number
    whatever.myTool.echoBuildNumber()  // echoes the build number via the library class
}

The environment variables you are looking for can be accessed like this in a Groovy script (inside a double-quoted string):
${env.BUILD_NUMBER}
${env.JOB_NAME}
${env.WORKSPACE}
${env.BUILD_ID}
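For instance, a quick sketch of reading these standard Jenkins build variables in a scripted pipeline:
node {
    // all of these are provided by Jenkins for every build
    echo "Build #${env.BUILD_NUMBER} (ID ${env.BUILD_ID}) of job ${env.JOB_NAME}"
    echo "Workspace path: ${env.WORKSPACE}"
}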

Related

Jenkins to Bamboo Migration & Running Groovy

I'm fairly new to Jenkins and a total newbie to Bamboo. I have a Jenkins Pipeline and I'm trying to create an equivalent in Bamboo (I believe it's called a Plan).
I've got some groovy code that I want to run in my Bamboo plan.
I'll simplify the code below for brevity and clarity.
Assume this file is called me_myEvent.groovy and is stored at https://github.com/myuser/repo1
def processEvent(Map args) {
    String strArg1 = args.myArg1
    String strArg2 = args.myArg2
    // etc...
}
My Jenkins pipeline has a global pipeline library (myGitLibraryFromGlobal) linking to https://github.com/myuser/repo1 and my pipeline is:
@Library('myGitLibraryFromGlobal@master') abc
pipeline {
    agent any
    stages {
        stage('First Stage') {
            steps {
                script {
                    def myObj = new com.mysite.me_myEvent()
                    def returnVal = myObj.processEvent(arg1: 'foo', arg2: 'bar')
                }
            }
        }
    }
}
I've got the GitHub repo saved in Bamboo as a global linked repository called abc123.
Can I achieve the same thing in Bamboo using the script task? What would this look like in Bamboo?
The short answer is NO, as Atlassian Bamboo doesn't support DSL Groovy or scripted Groovy pipelines. Also, keep in mind that when you run a Jenkins Groovy pipeline, Jenkins adds its own environment to the script execution; it is not just running a "bare" Groovy script (i.e. one without the exposed Jenkins steps and variables).
If you need to run a "bare" Groovy script while keeping the idea of Configuration as Code, one solution is to create a Bamboo Java/YAML spec and add a ScriptTask to it.
// Bamboo Java Specs: this is called when the plan and stages are created
import com.atlassian.bamboo.specs.api.builders.plan.Job
import com.atlassian.bamboo.specs.builders.task.ScriptTask
import com.atlassian.bamboo.specs.builders.task.VcsCheckoutTask

new Job("JobName", "JobKey").tasks(
        new VcsCheckoutTask(),   // to download the groovy script
        new ScriptTask().inlineBody("groovy me_myEvent.groovy").interpreterBinSh())
Note: your Bamboo build agent needs to have Groovy pre-installed.
Another solution is to use the Groovy Tasks for Bamboo plugin.

Jenkins Share common environment variables in a groovy method

I am building a declarative Jenkinsfile, and I have some common variables that I want to share across several Jenkins projects and jobs.
So I created a Jenkins shared library, but for some reason I can't get my Jenkinsfile to read the common environment variables from the shared Groovy file.
pipeline {
    environment {
        commonEnv()
        Email_Notification_Enabled = "true"
        Slack_Notification_Enabled = "false"
    }
}
and in my Groovy file I had:
def call() {
    a = "abc"
    b = "abc"
}
It throws an error saying that commonEnv() is not allowed in the environment block.
What is a possible way to achieve this behaviour?
You could write a Groovy method that sets the common environment variables. Please refer to this Stack Overflow question to see how to do this. Include that method in your Jenkins pipeline shared library.
Now call this Groovy method in the declarative pipeline of each of your jobs. Remember that in a declarative pipeline you may use Groovy only inside the script step, so your pipeline would look something like:
pipeline {
    stages {
        stage("First stage") {
            steps {
                script {
                    // call to Groovy method that sets environment variables
                }
                // other steps
            }
        }
        // other stages
    }
}
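For illustration, a minimal sketch of such a shared-library step, reusing the commonEnv name and variables from the question, could simply assign to env from inside that script block:
// vars/commonEnv.groovy (shared-library step)
def call() {
    // assigning to env.<NAME> makes the value visible to later steps and stages
    env.Email_Notification_Enabled = 'true'
    env.Slack_Notification_Enabled = 'false'
}

// In the Jenkinsfile:
// steps {
//     script {
//         commonEnv()
//     }
// }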
Hope it helps.
Since you need environment variables that are shared across all Jenkins projects and jobs, you should set them up at the Jenkins instance level rather than at the project or job level.
So, instead of doing it in a Jenkinsfile (which works at the job level), I would do it in Manage Jenkins > Configure System > Global properties > Environment Variables:
The environment variables can then be read in the pipeline script from the Jenkins global variable env:
echo "This is my Jenkins global environment variable ${env.MY_ENV_VAR_NAME}"

How to set build name in Jenkins Job DSL?

According to the documentation at https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.helpers.wrapper.MavenWrapperContext.buildName, the following code should update the build name in the Build History of Jenkins jobs:
// define the build name based on the build number and an environment variable
job('example') {
    wrappers {
        buildName('#${BUILD_NUMBER} on ${ENV,var="BRANCH"}')
    }
}
Unfortunately, it does not do that.
Is there any way to change the build name from a Jenkins Job DSL script?
I know I can change it from a Jenkins Pipeline script, but that is not what I need for this particular job. All I use in the job is steps.
steps {
    shell("docker cp ...")
    shell("git clone ...")
    ...
}
I would like to emphasise that I am looking for a native Jenkins Job DSL solution, not a Jenkins Pipeline script or any other hacky workaround like manipulating environment variables.
I managed to solve my issue today.
The script did not work because it requires the build-name-setter plugin to be installed in Jenkins. After I installed it, the snippet works perfectly.
Unfortunately, by default the Job DSL processor does not report missing plugins. The parameter that enables this is described at https://issues.jenkins-ci.org/browse/JENKINS-37417
Here's a minimal pipeline changing the build's display name and description. IMHO this is pretty straightforward.
pipeline {
    agent any
    environment {
        VERSION = "1.2.3-SNAPSHOT"
    }
    stages {
        stage("set build name") {
            steps {
                script {
                    currentBuild.displayName = "v${env.VERSION}"
                    currentBuild.description = "#${BUILD_NUMBER} (v${env.VERSION})"
                }
            }
        }
    }
}
It results in the corresponding build name and description being shown in the Jenkins UI.
setBuildName("your_build_name") in a groovyPostBuild step may do the trick as well. It requires the Groovy Postbuild plugin.

In a Jenkins declarative pipeline, how can I set an environment variable based on a method?

In a Jenkins declarative pipeline, how can I set the value of an environment variable based on a custom Groovy/PowerShell method? For instance, if I have a declarative pipeline as follows, can I use a shared library method to set this value?
Essentially I am trying to use a multibranch declarative pipeline Jenkins job which has a deploy stage, but I need to ensure that develop branches are deployed to DEV and release branches are deployed to STG, using the same pipeline. My thought was to create an environment variable that is set by a custom method (perhaps Groovy in a shared library), and that method would simply look at the current value of env.BRANCH and contain a little logic to set the target deploy environment. Here is an example of what I envision:
pipeline {
    environment {
        DEPLOY_ENV = mapBranchToDeployEnvironment(${BRANCH})
    }
And then in my deploy stage I would use this value in two PowerShell invocations:
bat "powershell .\\Deploy-Service -Environment ${DEPLOY_ENV}"
bat "powershell .\\Deploy-ServiceProxy -Environment ${DEPLOY_ENV}"
Otherwise, how are people currently solving the problem of using the same pipeline to deploy to different environments, while using the variable across many other function invocations? What is the recommended approach from Jenkins for mapping the branch name that triggered the build to the environment (if any) it should be deployed to?
Based on my understanding, declarative pipelines allow a job to be multibranch; if the job also deploys, each branch needs to map to a deploy environment. How else would a multibranch pipeline deploy to multiple environments when all the global Jenkins pipeline environment variables have the same value for every job/branch execution?
In the above scenario, the pipeline variable DEPLOY_ENV is derived from other environment variables that are set by the job and are typically available at the stage level, but here we want to set the value globally so that we can use it across stages.
Update: My issue was that I didn't realize how simple it was; I thought I had to pass a stage or script object into a Groovy shared library function, when in fact it's as simple as creating a shared library and then referencing the environment variables directly in the method. Easy. Thank you.
I had exactly the same problem, and indeed it is possible to use a shared library method. But there is another, simpler solution if you do not have a shared library set up yet: define a Groovy method before the pipeline statement and then use it inside your pipeline like this:
def getEnvFromBranch(branch) {
    if (branch == 'master') {
        return 'production'
    } else {
        return 'staging'
    }
}

pipeline {
    agent any
    environment {
        targetedEnv = getEnvFromBranch(env.BRANCH_NAME)
    }
    stages {
        stage('Build') {
            steps {
                echo "Building in ${env.targetedEnv}"
            }
        }
    }
}
You can do exactly what you're suggesting. You should create a Jenkins shared library with a var (a new DSL method), which can be called to assign a pipeline-wide environment variable. You had it basically correct. Here's a Jenkinsfile fragment that assigns to an environment variable:
environment {
    DEPLOY_ENV = mapBranchToDeployEnvironment()
}
You don't need to pass the branch to the mapBranchToDeployEnvironment DSL method, since you can access the branch inside that method. Sample contents of vars/mapBranchToDeployEnvironment.groovy in the shared library look like this:
def call() {
    echo "branch is: ${env.BRANCH_NAME}"
    if (env.BRANCH_NAME == 'master') {
        return 'prod'
    } else {
        return 'staging'
    }
}
You probably shouldn't expect this to be a five minute task, but you'll get it. Good luck!
stage('Prepare env variables') {
    steps {
        script {
            if (env.BRANCH_NAME == 'master') {
                echo 'Copying project-stg.env file...'
                sh 'cp /opt/project-stg.env .env'
            } else {
                echo 'Copying project-dev.env file...'
                sh 'cp /opt/project-dev.env .env'
            }
        }
    }
}

Running a groovy script through jenkinsfile which runs on a remote linux box

I have an abc.groovy script which takes an argument. Locally I run it as
$ groovy abc.groovy <argumentValue>
I have stored this abc.groovy on a remote Linux box under the path "/home/path/to a directory/" and I have a Jenkins pipeline job with a Jenkinsfile. How can I call abc.groovy from the Jenkinsfile?
You can use GroovyShell to evaluate your script.
// parse the script file, then call one of the methods it defines
GroovyShell shell = new GroovyShell()
def execute = shell.parse(new File('/path/to/abc.groovy'))
execute.method()   // replace method() with a method defined in abc.groovy
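Since abc.groovy takes an argument, a variant (a sketch, assuming the script reads its args binding) is to use GroovyShell.run, which passes command-line style arguments to the script:
// Sketch: run abc.groovy with an argument, as if invoked as `groovy abc.groovy argumentValue`
GroovyShell shell = new GroovyShell()
shell.run(new File('/path/to/abc.groovy'), ['argumentValue'])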
You'll want to use the load step in your Jenkinsfile, like this:
pipeline {
    agent { label 'slave' }
    stages {
        stage('Load Groovy Script') {
            steps {
                load 'path/to/abc.groovy'
            }
        }
    }
}
(This example uses the declarative pipeline syntax, but is easily ported to scripted.)
Note: you can't pass parameters to the Groovy script in the load step; however, this isn't hard to work around.
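For example, a sketch of that workaround (it assumes abc.groovy defines a method such as processEvent(Map) and ends with return this, so the object returned by load can be reused):
// abc.groovy (on the agent's filesystem)
def processEvent(Map args) {
    echo "called with ${args.myArg1} and ${args.myArg2}"
}
return this

// In the Jenkinsfile:
script {
    def abc = load 'path/to/abc.groovy'
    abc.processEvent(myArg1: 'foo', myArg2: 'bar')
}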
