Using variables created inside another job's build in a Jenkinsfile - Jenkins

I need to read some variables created inside another job. It is easier to explain with pseudo code:
my job:
{
    build job: "create cluster" // this job will create some vars (cluster_name)
    // use that var from my job
    echo "${cluster_name}"
}
Ideally this would be done with declarative pipelines, but I can always use a script {} block.

First, in your create cluster job you need to put that variable into an environment variable. You can do it this way:
//create cluster Jenkinsfile
env.CLUSTER_NAME = cluster_name
Then, in your upstream job, you can read that variable from the result of the build step:
def result = build job: 'create cluster'
echo result.buildVariables.CLUSTER_NAME
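For reference, here is a minimal declarative sketch of the upstream job, assuming the downstream job is literally named create cluster and exports CLUSTER_NAME as shown above (job and variable names are just placeholders):

// Jenkinsfile of the upstream job (hypothetical)
pipeline {
    agent any
    stages {
        stage('Create cluster') {
            steps {
                script {
                    // buildVariables exposes the env vars set by the downstream build
                    def result = build job: 'create cluster'
                    env.CLUSTER_NAME = result.buildVariables.CLUSTER_NAME
                }
            }
        }
        stage('Use cluster') {
            steps {
                echo "Cluster name: ${env.CLUSTER_NAME}"
            }
        }
    }
}

Copying the value into env.CLUSTER_NAME inside the script block makes it visible to later declarative stages.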

Related

Groovy script to get Jenkins pipeline's script path

I want to run a script in the Jenkins Script Console to retrieve the scriptPath parameter of all jobs/pipelines configured in Jenkins. I found a way to get the names of the pipelines, but I want the scriptPath parameter of each pipeline.
Any leads?
All pipeline jobs are instances of org.jenkinsci.plugins.workflow.job.WorkflowJob and can be found using the Jenkins.instance.getAllItems function.
Once found, each job contains an attribute of the FlowDefinition class which is accessible via the getDefinition() method. There are two types of definition for pipelines:
CpsFlowDefinition - for pipelines which define an inline script (not SCM), the script is accessible via the getScript() method.
CpsScmFlowDefinition - for pipelines which define an SCM script, the script is accessible via the getScriptPath() method.
So to achieve what you want, you can iterate over the relevant jobs and extract the relevant attribute:
def pipelineJobs = Jenkins.instance.getAllItems(org.jenkinsci.plugins.workflow.job.WorkflowJob)
def scmJobs = pipelineJobs.findAll { it.definition =~ 'CpsScmFlowDefinition' }
scmJobs.each {
    println "Pipeline Name: ${it.name}"
    println "SCM Script Path: ${it.definition.scriptPath}"
}
If all your jobs are SCM pipelines you can use the following one-liner:
Jenkins.instance.getAllItems(org.jenkinsci.plugins.workflow.job.WorkflowJob)*.definition.scriptPath
For a single specific job you can use:
Jenkins.instance.getItemByFullName("<PIPELINE_NAME>").definition.scriptPath // or just script for inline definition
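If your instance mixes inline and SCM pipelines, a sketch along these lines (same getAllItems API, just branching on the definition type; class names assume the standard workflow-cps plugin) prints whichever attribute applies:

Jenkins.instance.getAllItems(org.jenkinsci.plugins.workflow.job.WorkflowJob).each { job ->
    def definition = job.definition
    if (definition instanceof org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition) {
        // SCM pipeline: the Jenkinsfile lives in the repository
        println "${job.fullName}: script path = ${definition.scriptPath}"
    } else if (definition instanceof org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition) {
        // Inline pipeline: the script is stored in the job configuration
        println "${job.fullName}: inline script (${definition.script.length()} characters)"
    }
}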

How to retrieve Jenkins environment from Groovy script?

I am setting up Jenkins. I am writing my pipeline using Global Pipeline Libraries to increase reusability. The scripts are object oriented and written in Groovy. Information about the concept can be found there.
I can't manage to retrieve the Jenkins-specific environment from my library script. For instance, I would like to access:
Build_ID
Build_Number
JOB_Name
Workspace_path
I tried to use env.WORKSPACE but it returns null. I can retrieve it directly in the pipeline, but that is not my goal.
I am using Jenkins 2.303.1.
Depending on how you write your scripts, you might need to inject the Jenkins environment. For example, if you go for a more object-oriented style:
// vars/whatever.groovy
import groovy.transform.Field

@Field
def myTool = new MyTool(this)

// src/.../MyTool.groovy
import ...

class MyTool {
    private final jenkins
    MyTool(steps) {
        this.jenkins = steps
    }
    def echoBuildNumber() {
        this.jenkins.echo(this.jenkins.env.BUILD_NUMBER)
    }
}

// Jenkinsfile
@Library(...)
node {
    echo env.BUILD_NUMBER             // echoes build number
    whatever.myTool.echoBuildNumber() // echoes build number
}
So the environment variables you are looking for can be accessed like this in a Groovy script:
${env.BUILD_NUMBER}
${env.JOB_NAME}
${env.WORKSPACE}
${env.BUILD_ID}
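As a minimal sketch (hypothetical step name printBuildInfo), the same variables are reachable from a shared-library vars step, since such a step runs with the pipeline's script context:

// vars/printBuildInfo.groovy (hypothetical shared-library step)
def call() {
    echo "Build ID:     ${env.BUILD_ID}"
    echo "Build number: ${env.BUILD_NUMBER}"
    echo "Job name:     ${env.JOB_NAME}"
    echo "Workspace:    ${env.WORKSPACE}" // only set once the build is inside a node/agent
}

Note that env.WORKSPACE is only defined inside a node (or agent) context, which is a common reason for it coming back as null.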

How to pass environment variable to Jenkins Remote API when submitting job

I have a declarative pipeline job (this is not a multi-branch pipeline job using a Jenkinsfile) without parameters, but some stages are conditional based on the value of an environment variable:
stage('deploy-release') {
    when {
        environment name: 'GIT_BRANCH', value: 'master'
    }
    steps {
        sh "mvn deploy:deploy-file -B -DpomFile=pom.xml -Dfile=target/example.jar -DrepositoryId=maven-releases -Durl=${NEXUS_URL}/repository/maven-releases/"
    }
}
I want to trigger the job from an external system, but I need to pass the correct value of the given environment variable. Is there some way to do that via the Jenkins Remote API?
To pass the value of the given environment variable, you need to define a parameter with the exact same name as the environment variable for your job by selecting "This build is parameterized".
You can refer to Parameterized Build.
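A minimal sketch of that approach, assuming a string parameter named GIT_BRANCH (declarative parameters are exposed to stages as environment variables of the same name, so the when condition above keeps working):

pipeline {
    agent any
    parameters {
        // Exposed to the pipeline as env.GIT_BRANCH
        string(name: 'GIT_BRANCH', defaultValue: 'master', description: 'Branch being built')
    }
    // ... stages, including the conditional deploy-release stage ...
}

The job can then be triggered remotely with a POST to JENKINS_URL/job/<job-name>/buildWithParameters, passing GIT_BRANCH=<value> as form data along with your user and API token.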

Passing workspace url of Job A to Job B in Jenkins

I have two pipeline jobs, Job A and Job B. I need to pass the workspace URL of Job A (say /var/lib/jenkins/workspace/JobA) to be used by Job B. The main idea is that I am trying to copy the contents of the target folder generated by the Maven build, but I don't want to use the Copy Artifacts or Archive Artifacts plugins to achieve this.
I have tried using the "This job is parameterized" option, with Job A as the upstream of Job B, but I have been unable to do it that way.
Can anyone help me achieve this?
The WORKSPACE variable is an environment variable from Jenkins and points to <jenkins_path>/<job_name>.
For example, if the job name is Job_A, the workspace value will be <jenkins_path>/Job_A.
Likewise, if the job name is Job_B, the workspace value will be <jenkins_path>/Job_B.
So you can't use the WORKSPACE variable and expect Job_B to point to Job_A's workspace value.
The link below can be used to get certain properties from the upstream job:
Jenkins - How to get and use upstream info in downstream
Even if you want to hard code the path in Job_B it will work (not recommended).
Also, for this to work, both jobs must run on the same node.
I have found a way to do this and it is working fine.
I made Job B a parameterized job using "This project is parameterized" with a string parameter.
Then, in the pipeline script of Job A, I invoked Job B, passing the WORKSPACE environment variable. Here is the declarative pipeline script for Job A:
pipeline {
    agent any
    stages {
        stage('Build JobB') {
            steps {
                build job: 'jobB', parameters: [string(name: 'UPSTREAM_WORKSPACE', value: "${env.WORKSPACE}")]
            }
        }
    }
}
Now, in the Job B pipeline, you can echo the variable UPSTREAM_WORKSPACE. This is how we can pass the workspace URL and use it to copy the artifacts.
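For completeness, a minimal sketch of the Job B side, assuming the UPSTREAM_WORKSPACE string parameter from above and both jobs running on the same node:

// Jenkinsfile of jobB (UPSTREAM_WORKSPACE is defined as a string parameter on the job)
pipeline {
    agent any
    stages {
        stage('Copy build output') {
            steps {
                echo "Upstream workspace: ${params.UPSTREAM_WORKSPACE}"
                // Only works when jobA and jobB run on the same node
                sh "cp -r '${params.UPSTREAM_WORKSPACE}/target' ."
            }
        }
    }
}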

Can I update a Jenkins Global Environment Variable from a pipeline script?

If I define an environment variable (eg. VersionNum) under Jenkins Global Properties, can I update the value within a pipeline script? I was hoping to use it to store version information and update according to script execution results.
What I want to do is write a pipeline script like:
node {
    stage('Stage1') {
        VersionNum = '5'
    }
}
that will update the global environment variable so that the new value persists and can be used by other Jenkins jobs.
Rather than try to use the global environment variable, I read a properties file with the Pipeline Utility Steps plugin:
def props = readProperties file:"${WORKSPACE}\\BuildVersion.properties"
MajVersion = props['MAJOR_VERSION'].trim()
MinVersion = props['MINOR_VERSION'].trim()
Then if I change a value, I write it back with:
bat "(echo MAJOR_VERSION=${MajVersion} && echo MINOR_VERSION=${MinVersion}) \u003E \"%WORKSPACE%\\BuildVersion.properties\""
