I've tried to use a different agent for different environments (dev/prod) using an if-else inside the agent directive, but I'm getting errors with the pipeline script below. Any help is much appreciated!
pipeline {
    agent {
        if (env.ENVIRONMENT == 'prod') {
            label {
                label "EC2-1"
                customWorkspace "/home/ubuntu/eks-prod-backend/"
            }
        }
        else if (env.ENVIRONMENT == 'dev') {
            label {
                label "EC2-2"
                customWorkspace "/home/ubuntu/eks-dev-backend/"
            }
        }
    }
}
This is the approach I would suggest. Define a variable before the "pipeline" block, for example:
def USED_LABEL = env.ENVIRONMENT == 'prod' ? "EC2-1" : "EC2-2"
def CUSTOM_WORKSPACE = env.ENVIRONMENT == 'prod' ? "/home/ubuntu/eks-prod-backend/" : "/home/ubuntu/eks-dev-backend/"
Then, just use it like this:
pipeline {
    agent {
        node {
            label USED_LABEL
            customWorkspace CUSTOM_WORKSPACE
        }
    }
}
Per the declarative syntax docs, customWorkspace belongs inside a node block, so the nested label from your snippet isn't needed. The point is to use variables that are computed before the pipeline block runs.
Maybe something like this could help, in case you have only two environments?
pipeline {
    agent {
        node {
            label(env.ENVIRONMENT == 'prod' ? "EC2-1" : "EC2-2")
            customWorkspace(env.ENVIRONMENT == 'prod' ? "/home/ubuntu/eks-prod-backend/" : "/home/ubuntu/eks-dev-backend/")
        }
    }
    stages {
        stage("Build") {
            steps {
                echo "Hello, World!"
            }
        }
    }
}
Otherwise, you can check this thread; perhaps it will help you.
Related
I have a pipeline that generates dynamic stages based on file content. I want the stages to run on different containers, so I thought to move the agent { docker { image '' } } block inside the stage in the generateStage() function, but that's not possible since it's a scripted pipeline. How can I run these stages on separate containers, while still generating them dynamically and running them in parallel?
Would really appreciate your help.
Thanks!
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            //do something
        }
    }
}
pipeline {
    agent none
    stages {
        .
        .
        .
        stage('parallel stages') {
            agent {
                docker {
                    image 'some-image:tag'
                }
            }
            steps {
                script {
                    def list = ["STAGE-A", "STAGE-B"....] // DYNAMIC LIST CREATED FROM A FILE
                    parallelStages = list.collectEntries {
                        ["$it": generateStage(it)]
                    }
                    parallel parallelStages
                }
            }
        }
    }
}
Instead of using the agent option, you can do something like the below.
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            docker.image('your-image').inside {
                sh 'DO SOMETHING'
            }
        }
    }
}
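Wired into your original pipeline, that could look like this (a minimal sketch; 'some-image:tag' and the two stage names are placeholders from the question, and each generated stage allocates its own node so the container has an executor to start on):
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            node {
                // each parallel branch starts its own container
                docker.image('some-image:tag').inside {
                    sh "echo running ${job}"
                }
            }
        }
    }
}

pipeline {
    agent none
    stages {
        stage('parallel stages') {
            steps {
                script {
                    def list = ["STAGE-A", "STAGE-B"] // dynamic list read from a file in the real job
                    parallel list.collectEntries { ["$it": generateStage(it)] }
                }
            }
        }
    }
}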
My non-working pipeline (declarative):
def parentWorkspace = "${params.WS}" // comes from upstream job
def parentNode = "${params.NODENAME}" // comes from upstream job
pipeline {
    agent none
    stages {
        stage("checkout source and run tests") {
            steps {
                script {
                    if (parentWorkspace != "null") {
                        agent {
                            node {
                                customWorkspace "${parentWorkspace}"
                                label "${parentNode}"
                            }
                        }
                    }
                    else {
                        agent {
                            label 'linuxVM'
                        }
                    }
                }
                steps {
                    git url: gitLocation, branch: branchName, poll: false
                    sh "mvn test ..."
                }
            }
        }
    }
}
If the parentWorkspace variable is not null, use a custom workspace with the provided parentWorkspace path and node label; otherwise use the default workspace and a node with label 'linuxVM'. Is that possible?
I tried a few options but couldn't really make it work. Any help is appreciated.
Try to define the if condition outside your pipeline block like below:
def parentWorkspace = "${params.WS}"
def parentNode = "${params.NODENAME}"
def agentLabel

if (parentWorkspace != "null") {
    agentLabel = "${parentNode}"
} else {
    agentLabel = 'linuxVM'
}

pipeline {
    agent { label agentLabel }
    //your pipeline code
}
For allocation of the workspace, use ws("${params.WS}") inside the stage.
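For example (a minimal sketch; the git and mvn lines are illustrative, taken from your stage):
stage("checkout source and run tests") {
    steps {
        // ws() allocates the given workspace path, creating it if needed
        ws("${params.WS}") {
            git url: gitLocation, branch: branchName, poll: false
            sh "mvn test"
        }
    }
}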
Reply back if this doesn't solve the issue.
I found this works perfectly for me for the custom workspace issue:
agent {
    node {
        label "whatever"
        customWorkspace params.WS
    }
}
If params.WS is null or an empty string, you get the original workspace, since the value used for customWorkspace can be a relative path with respect to the original workspace.
I am trying to create multiple pipeline jobs under a folder. Under this folder I have created some folder properties. I am having a hard time using these folder properties across multiple stages in a job.
Plugin used: https://wiki.jenkins.io/display/JENKINS/Folder+Properties+Plugin
def region
pipeline {
    agent any
    stages {
        stage('Assign values to global properties') {
            steps {
                withFolderProperties {
                    region = "${env.appRegion}"
                }
            }
        }
        stage('Print') {
            steps {
                print(region)
            }
        }
    }
}
Error:
Expected a step # line 8, column 21.
region = "${env.appRegion}"
Thanks in Advance
region = "${env.appRegion}" is not a reserved pipeline step or directive name; it's a Groovy statement. You should put it inside a script step. In a Scripted Pipeline you can put any kind of Groovy statement anywhere, but in a Declarative Pipeline any Groovy statement must be wrapped in a script step.
steps {
    script {
        withFolderProperties {
            region = "${env.appRegion}"
        }
    }
}
or
steps {
    withFolderProperties {
        script {
            region = "${env.appRegion}"
        }
    }
}
I'm not sure which of the two code blocks above works, but you can give them a try.
#!groovy
def CI_NAMESPACE = ""
withFolderProperties {
    CI_NAMESPACE = "${env.CI_NAMESPACE}"
}
println "CI_NAMESPACE = ${CI_NAMESPACE}"
if (CI_NAMESPACE == '' || CI_NAMESPACE == null || CI_NAMESPACE == 'null') {
    currentBuild.result = 'ABORTED'
    error('CI_NAMESPACE is not defined in the Folder Properties plugin!')
}
pipeline {
    agent any // declarative pipelines require an agent section
    environment {
        CI_NAMESPACE = "${CI_NAMESPACE}"
    }
    stages {
        stage('Test') {
            steps {
                echo "CI_NAMESPACE: ${env.CI_NAMESPACE}"
            }
        }
    }
}
I previously asked a question about how to overwrite variables defined in an environment directive, and it seems that's not possible.
I want to set a variable in one stage and have it accessible to other stages.
In a declarative pipeline it seems the only way to do this is in a script{} block.
For example I need to set some vars after checkout. So at the end of the checkout stage I have a script{} block that sets those vars and they are accessible in other stages.
This works, but it feels wrong. For the sake of readability I'd much prefer to declare these variables at the top of the pipeline and have them overwritten later. That would mean having a "set variables" stage at the beginning with a script{} block that just defines vars, and that's ugly.
I'm pretty sure I'm missing an obvious feature here. Do declarative pipelines have a global variable feature, or must I use script{}?
This works without an error:
def my_var
pipeline {
    agent any
    environment {
        REVISION = ""
    }
    stages {
        stage('Example') {
            steps {
                script {
                    my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                script {
                    echo "$my_var"
                }
            }
        }
    }
}
Like @mkobit says, you can define the variable at global level, outside the pipeline block. Have you tried that?
def my_var
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                script {
                    println(my_var)
                }
            }
        }
    }
}
For strings, add it to the 'environment' block:
pipeline {
    environment {
        myGlobalValue = 'foo'
    }
}
But for non-string variables, the easiest solution I've found for declarative pipelines is to wrap the values in a method.
Example:
pipeline {
    // Now I can reference myGlobalValue() in my pipeline.
    ...
}
def myGlobalValue() {
    return ['A', 'list', 'of', 'values']
}
// I can also reference myGlobalValue() in other methods below:
def myGlobalSet() {
    return myGlobalValue().toSet()
}
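For instance, a call site could look like this (a minimal sketch; the stage name and echo are illustrative):
pipeline {
    agent any
    stages {
        stage('Show values') {
            steps {
                script {
                    // methods defined outside the pipeline block are callable here
                    for (v in myGlobalValue()) {
                        echo "value: ${v}"
                    }
                }
            }
        }
    }
}

def myGlobalValue() {
    return ['A', 'list', 'of', 'values']
}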
@Sameera's answer is good for most use cases. I had a problem with the append operator += though. So this did NOT work (MissingPropertyException):
def globalvar = ""
pipeline {
    agent any
    stages {
        stage("whatever") {
            steps {
                script {
                    globalvar += "x"
                }
            }
        }
    }
}
But this did work (without def, the variable goes into the script's binding rather than being a local variable, so the stage's script block can resolve it):
globalvar = ""
pipeline {
    agent any
    stages {
        stage("whatever") {
            steps {
                script {
                    globalvar += "x"
                }
            }
        }
    }
}
The correct syntax is as follows.
For a global static variable:
Somewhere at the top of the file, before pipeline {, declare:
def MY_VAR = 'something'
For a global variable that you can edit and reuse across stages:
At the top of your file, add an import for Field:
import groovy.transform.Field
Somewhere before pipeline {, declare:
@Field def myVar
Then inside your step, inside a script block, set the variable:
stage('some stage') {
    steps {
        script {
            myVar = 'I mutate myVar with success'
        }
    }
}
To go even further, you can declare functions.
Before the pipeline {:
def initSteps() {
    cleanWs()
    checkout scm
}
and then:
stages {
    stage('Init') {
        steps {
            initSteps()
        }
    }
}
This worked for me:
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    env.my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                echo "${env.my_var}"
            }
        }
    }
}
I am trying to do this
pipeline {
    agent any
    environment {
        LOCAL_BUILD_PATH = env.WORKSPACE + '/build/'
    }
    stages {
        stage('Stuff') {
            steps {
                echo LOCAL_BUILD_PATH
            }
        }
    }
}
Result:
null/build/
How can I use Global Environments to create my environments?
So this is the method that I ended up using:
pipeline {
    agent {
        label 'master'
    }
    stages {
        stage("Setting Variables") {
            steps {
                script {
                    LOCAL_BUILD_PATH = "$env.WORKSPACE/build"
                }
            }
        }
        stage('Print Variable') {
            steps {
                echo LOCAL_BUILD_PATH
            }
        }
    }
}
You can use something like this...
LOCAL_BUILD_PATH="${env.WORKSPACE}/build/"
Remember: use double quotes (") for strings that contain variables; single-quoted Groovy strings are not interpolated.
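A quick illustration of the difference (a minimal sketch):
script {
    def name = 'world'
    echo "hello ${name}" // double quotes: interpolated, prints "hello world"
    echo 'hello ${name}' // single quotes: literal, prints "hello ${name}"
}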
I think you should use:
steps {
    echo "${env.LOCAL_BUILD_PATH}"
}
since in the environment block you're defining environment variables, which are later accessible as env.your-variable-name.
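Combined with the double-quoted definition from the previous answer, that gives (a minimal sketch):
pipeline {
    agent any
    environment {
        LOCAL_BUILD_PATH = "${env.WORKSPACE}/build/"
    }
    stages {
        stage('Stuff') {
            steps {
                // values from the environment block are read back via env.
                echo "${env.LOCAL_BUILD_PATH}"
            }
        }
    }
}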
This is a scope issue. Declare the variable at the top and set it to null, something like:
def var = null
You should be able to set the value in one block/closure/stage and access it in another.
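A minimal sketch of that pattern (stage names and the value are illustrative):
def myValue = null // declared at the top, shared across stages

pipeline {
    agent any
    stages {
        stage('Set') {
            steps {
                script {
                    myValue = 'computed in stage one' // set here...
                }
            }
        }
        stage('Use') {
            steps {
                script {
                    echo "myValue: ${myValue}" // ...read here
                }
            }
        }
    }
}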