How to share an environment variable value across different Jenkins Pipelines?

I have two Jenkins Pipelines:
Pipeline A: In a stage, I define an environment variable called MAVEN_PROFILE (the user can choose a value from a list).
Pipeline B: I need to get the MAVEN_PROFILE environment variable value that was set in Pipeline A.
I need two pipelines because I can't do it in a single Pipeline for process reasons.
I saw there were some answers on how to share a variable between stages within a single Pipeline, but that's not my case.
I want to share an environment variable value between different Pipelines.
Pipeline A
pipeline {
    agent any
    ...
    stages {
        stage('Profile Selection') {
            steps {
                script {
                    env.MAVEN_PROFILE = input message: 'Choose the profile :',
                        parameters: [choice(name: 'MAVEN_PROFILE',
                        choices: 'all\nserver\nclient', description: 'Profiles')]
                }
            }
        }
        stage(...) {
            steps {
                script {
                    bat "mvn deploy -P ${env.MAVEN_PROFILE}"
                }
            }
        }
        ... other stages
    }
}
Pipeline B
pipeline {
    agent any
    ...
    stages {
        ... other stages
        stage(...) {
            steps {
                script {
                    bat "mvn release ... -P ${env.environmentVariableValueFromPipelineA}"
                }
            }
        }
    }
}

They're not running in the same environment, so they can't directly share environment variables. The easiest approach is probably to write these values to a file in the workspace in pipeline A, and read them back in pipeline B. Something like this:
Pipeline A:
sh "echo ${MAVEN_PROFILE} > .MAVEN_PROFILE"
Pipeline B:
def MAVEN_PROFILE = sh(script: 'cat .MAVEN_PROFILE', returnStdout: true).trim()
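For context, a minimal sketch of how those two snippets could sit inside the pipelines from the question. The stage names, the 'Pipeline-A' job name, and the archive/copy steps are my own illustration; writeFile/readFile sidestep the sh-vs-bat platform question, and copyArtifacts needs the Copy Artifact plugin and is only required if the two pipelines don't share a workspace:
Pipeline A:
stage('Profile Selection') {
    steps {
        script {
            env.MAVEN_PROFILE = input message: 'Choose the profile :',
                parameters: [choice(name: 'MAVEN_PROFILE', choices: 'all\nserver\nclient', description: 'Profiles')]
            // persist the chosen value as a plain file and archive it for other jobs
            writeFile file: '.MAVEN_PROFILE', text: env.MAVEN_PROFILE
            archiveArtifacts artifacts: '.MAVEN_PROFILE'
        }
    }
}
Pipeline B:
stage('Release') {
    steps {
        script {
            // 'Pipeline-A' is a placeholder for the real job name of pipeline A
            copyArtifacts projectName: 'Pipeline-A', selector: lastSuccessful(), filter: '.MAVEN_PROFILE'
            def mavenProfile = readFile('.MAVEN_PROFILE').trim()
            bat "mvn release ... -P ${mavenProfile}"
        }
    }
}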

Related

What is the difference between using def and not using def in a Jenkinsfile script block?

I have two sample Jenkinsfiles.
The content of A_Jenkinsfile is:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    foo = "bar"
                }
                sh "echo ${foo}"
            }
        }
        stage("two") {
            steps {
                sh "echo ${foo}"
            }
        }
    }
}
The other one is B_Jenkinsfile and its content is:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "bar"
                }
                sh "echo ${foo}"
            }
        }
        stage("two") {
            steps {
                sh "echo ${foo}"
            }
        }
    }
}
When I build them, B_Jenkinsfile fails and A_Jenkinsfile succeeds.
What is the difference between using def and not using def in a Jenkinsfile script block?
There are two types of Pipeline syntax: Declarative Pipeline and Scripted Pipeline. A Declarative Pipeline starts with a pipeline {} wrapper and has Stages and Steps; it limits what is available to the user with a stricter, pre-defined structure. Scripted Pipeline, on the other hand, is much closer to Groovy, and users have more flexibility in what they can do. When you run something in a script block in a Declarative Pipeline, the script step takes a block of Scripted Pipeline and executes it inside the Declarative Pipeline. Basically, it runs a Groovy script for you. So your question can be rephrased as: what does def mean in a Groovy script?
Simply put, in a Groovy script, if you omit the def keyword the variable is added to the current script's binding, so it is treated as a global variable. If you use def, the variable is scoped, and you can only use it in the current script block. There are multiple detailed answers for this here, so I'm not going to repeat them.
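As a rough plain-Groovy illustration of that scoping rule (the names are invented for the example):
// scope.groovy
foo = "bar"        // no def: stored in the script's binding, effectively global to the script
def baz = "qux"    // def: a local variable, visible only in the enclosing scope

def show() {
    println foo    // works, resolved through the binding
    // println baz // would fail with groovy.lang.MissingPropertyException
}
show()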

Passing/Injecting/Referring Environment variables from Jenkins file to Shared Library which has the generic code

I have a common Jenkins shared library for all the repositories as below
vars/_publish.groovy
def call(opts) {
    pipeline {
        environment {
            abc = credentials('abc')
            def = credentials('def')
        }
        stages {
            stage('Build') {
                steps {
                    sh 'docker build'
                }
            }
        }
    }
}
Jenkinsfile
@Library('my-shared-library@branch') _
_publish() {
}
This works fine for a single service. Now I want to keep this Jenkins shared library for all the services/projects, but each service has different env variables.
The environment {} block in vars/_publish.groovy has 10 env variables that are not constant across projects/services; the values differ per project/service.
How can I pass env variables to this Jenkins shared library for different projects? Each project/service has a different Jenkins pipeline. Can I pass the variables from the Jenkinsfile to the shared library?
Can anyone help?
You can pass them with the help of the withEnv method (see the documentation) from your Jenkinsfile.
@Library('my-shared-library@branch') _
node('') {
    withEnv([
        "credId=id-cred",
        "y=20"
    ]) {
        _publish() {}
    }
}
// shared lib
def call(opts) {
    stage('Build') {
        echo "env is ${env.credId}"
        withCredentials([usernamePassword(credentialsId: env.credId, usernameVariable: 'USER', passwordVariable: 'PASSWORD')]) {
            sh 'docker build'
        }
    }
}
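Since _publish() already takes an opts argument, another option is to pass the per-project values as map entries straight from each Jenkinsfile and read them inside the library. This is only a sketch of mine with invented parameter names, not something from the thread:
// Jenkinsfile of one project
@Library('my-shared-library@branch') _
node('') {
    _publish(credId: 'id-cred', y: '20')
}

// vars/_publish.groovy
def call(Map opts = [:]) {
    stage('Build') {
        echo "credentials id is ${opts.credId}"
        withCredentials([usernamePassword(credentialsId: opts.credId, usernameVariable: 'USER', passwordVariable: 'PASSWORD')]) {
            sh 'docker build .'
        }
    }
}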

passing Jenkins env variables between stages on different agents

I've looked at this Pass Artifact or String to upstream job in Jenkins Pipeline and this Pass variables between Jenkins stages and this How do I pass variables between stages in a declarative Jenkins pipeline?, but none of these questions seem to deal with my specific problem.
Basically I have a pipeline consisting of multiple stages, each running on its own agent.
In the first stage I run a shell script, where two variables are generated. I would like to use these variables in the next stage. The methods I've seen so far only seem to work when passing variables within the same agent.
pipeline {
    agent none
    stages {
        stage("stage 1") {
            agent {
                docker {
                    image 'my_image:latest'
                }
            }
            steps {
                sh ("""
                    export VAR1=foo
                    export VAR2=bar
                """)
            }
        }
        stage("stage 2") {
            agent {
                docker {
                    image 'my_other_image:latest'
                }
            }
            steps {
                sh ('echo "$VAR1 $VAR2"')
                // expecting to see "foo bar" printed here
            }
        }
    }
}
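One common pattern here (a sketch of my own, not an answer from this thread; the variable names follow the question) is to capture the values in a script block and assign them to env. Values set on env belong to the run on the controller, so they remain visible in later stages even on a different agent:
stage("stage 1") {
    agent { docker { image 'my_image:latest' } }
    steps {
        script {
            // capture the shell output on the first agent and store it on the run
            env.VAR1 = sh(script: 'echo foo', returnStdout: true).trim()
            env.VAR2 = sh(script: 'echo bar', returnStdout: true).trim()
        }
    }
}
stage("stage 2") {
    agent { docker { image 'my_other_image:latest' } }
    steps {
        // the variables are injected into this agent's environment as well
        sh 'echo "$VAR1 $VAR2"'   // prints "foo bar"
    }
}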

Jenkins declarative pipeline, multiple slaves

I have a pipeline with multiple stages, some of them are in parallel. Up until now I had a single code block indicating where the job should run.
pipeline {
    triggers { pollSCM '0 0 * * 0' }
    agent {
        dockerfile {
            label 'jenkins-slave'
            filename 'Dockerfile'
        }
    }
    stages {
        stage('1') {
            steps { sh "blah" }
        } // stage
    } // stages
} // pipeline
What I need to do now is run a new stage on a different slave, NOT in docker.
I tried adding an agent statement for that stage, but it seems like it tries to run that stage within a docker container on the second slave.
stage('test new slave') {
    agent { node { label 'e2e-aws' } }
    steps {
        sh "ifconfig"
    } // steps
} // stage
I get the following error message
13:14:23 unknown flag: --workdir
13:14:23 See 'docker exec --help'.
I tried setting the agent to none for the pipeline and using an agent for every step and have run into 2 issues
1. My post actions show an error
2. The stages that have parallel stages also had an error.
I can't find any examples that are similar to what I am doing.
You can use the node block to select a node to run a particular stage.
pipeline {
    agent any
    stages {
        stage('Init') {
            steps {
                node('master') {
                    echo "Run inside a MASTER"
                }
            }
        }
    }
}
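Applied to the question above, the same idea would look roughly like this (my own adaptation, using the e2e-aws label from the question):
stage('test new slave') {
    steps {
        // allocate the plain e2e-aws node and run the shell step there,
        // outside the Dockerfile-based top-level agent
        node('e2e-aws') {
            sh "ifconfig"
        }
    }
}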

Create new Jenkins jobs using Pipeline Job and Groovy script

I have a Jenkins pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy)
I want to create a new job (with the name name_group_taskNumber) every time I build the main Pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the Pipeline script look like?
node {
    stage('Build') {
        def pipeline = load "CreateJob.groovy"
        pipeline.run()
    }
}
You can use and configure a shared library like this one (a git repo): https://github.com/lvthillo/shared-library . You need to configure this in your Jenkins global configuration.
It contains a vars/ folder. Here you can manage pipelines and groovy scripts, such as my slackNotifier.groovy. That script is just a groovy script which posts the build result to Slack.
In the Jenkins pipeline job we import our shared library:
@Library('name-of-shared-pipeline-library') _
mavenPipeline {
    //define parameters
}
In the case above the pipeline itself also lives in the shared library, but this isn't necessary.
You can also write your pipeline in the job itself and only call the function from the shared library, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
    echo "Hello, ${name}."
}
And in your pipeline:
library 'my-shared-library'
...
stage('stage name') {
    echo "output"
    sayHello('Peter')
}
...
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it by using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): add another build step in the pipeline script and pass the parameters that are to be considered.
