Equivalent block to environment in scripted pipeline with Jenkins

I have the following pipeline:
pipeline {
    agent any
    environment {
        branch = 'master'
        scmUrl = 'ssh://git@myrepo.git'
        serverPort = '22'
    }
    stages {
        stage('Stage 1') {
            steps {
                sh '/var/jenkins_home/contarpalabras.sh'
            }
        }
    }
}
I want to change the pipeline to a scripted pipeline in order to use try / catch blocks and have better error handling. However, I could not find the equivalent of the environment block in the official documentation.

You can use the withEnv block like:
node {
    withEnv(['DISABLE_AUTH=true',
             'DB_ENGINE=sqlite']) {
        stage('Build') {
            sh 'printenv'
        }
    }
}
This info is also in the official documentation : https://jenkins.io/doc/pipeline/tour/environment/#
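Applied to the pipeline from the question, a minimal scripted equivalent with try / catch might look like this (a sketch; the variable names and script path are taken from the question, and the error handling shown is just one possible approach):
node {
    withEnv(['branch=master',
             'scmUrl=ssh://git@myrepo.git',
             'serverPort=22']) {
        stage('Stage 1') {
            try {
                sh '/var/jenkins_home/contarpalabras.sh'
            } catch (err) {
                // mark the build as failed and rethrow so the stage still shows the error
                currentBuild.result = 'FAILURE'
                throw err
            }
        }
    }
}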

Related

Run parallel inside steps of a stage in declarative jenkins

So, I want to run my parallel stages inside a stage, but I also want to run some shared code for each parallel stage, which I have written in the steps of the parallel parent stage.
The problem I face is that the parallel stages are not being run:
stages {
    stage('parent stage 1') {
        something here
    }
    stage('parent stage 2') {
        steps {
            // common code for parallel stages
            parallel {
                stage('1') {
                    // some shell command
                }
                stage('2') {
                    // some shell command
                }
            }
        }
    }
}
For executing shared code you can define variables and functions outside of the declarative pipeline:
def foo = true

def checkFoo() {
    return foo
}

pipeline {
    agent any
    stages {
        stage('parallel stage') {
            parallel {
                stage('stage 1') {
                    steps {
                        script {
                            def baz = checkFoo()
                            sh "echo ${baz}"
                        }
                    }
                }
                stage('stage 2') {
                    steps {
                        script {
                            def baz = checkFoo()
                            sh "echo ${baz}"
                        }
                    }
                }
            }
        }
    }
}
You can also write a shared library, which you can use in all or certain jobs.
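A minimal sketch of that approach, assuming a library configured under Manage Jenkins as my-shared-lib (the library name is hypothetical): a file vars/checkFoo.groovy in the library repository
// vars/checkFoo.groovy
def call() {
    // shared logic reused by several jobs
    return true
}
can then be called as a regular step from any Jenkinsfile that loads the library:
@Library('my-shared-lib') _

pipeline {
    agent any
    stages {
        stage('stage 1') {
            steps {
                script {
                    def baz = checkFoo()
                    sh "echo ${baz}"
                }
            }
        }
    }
}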

How to pass variable from pipeline to job in Jenkins?

I need to create a unique identifier in a pipeline and then all jobs started from this pipeline should have access to this unique identifier.
I do not want to parameterize those jobs.
I thought that an environment variable defined at the pipeline level would be accessible from those jobs, but it isn't.
pipeline {
    agent any
    environment {
        TEST_VAR = 'TEST_VAR'
    }
    stages {
        stage('Stage1') {
            steps {
                build (job: 'job1')
            }
        }
    }
}
You do not really need to parameterize the downstream pipelines; you can still pass the variable as a parameter from the upstream pipeline and access it in the downstream one.
Upstream pipeline
pipeline {
    agent any
    environment {
        TEST_VAR = 'hello_world'
    }
    stages {
        stage('Build-downstream-job') {
            steps {
                build job: 'downstream-job', parameters: [string(name: 'TEST_VAR', value: env.TEST_VAR)], wait: false
            }
        }
    }
}
Downstream pipeline
pipeline {
    agent any
    stages {
        stage('Get-test-var') {
            steps {
                println(params.TEST_VAR)
            }
        }
    }
}
Downstream pipeline console output
[Pipeline] stage
[Pipeline] { (Get-test-var)
[Pipeline] echo
hello_world
[Pipeline] }
[Pipeline] // stage
You should try adding a '$' before TEST_VAR:
environment {
    TEST_VAR = '$TEST_VAR'
}

How to get Jenkins credentials variable in all stages of my Jenkins Declarative Pipeline

How do I make the Jenkins credentials variable, i.e. "mysqlpassword", accessible to all stages of my Jenkins Declarative Pipeline?
The below code snippet works fine and prints my credentials.
node {
    stage('Getting Database Credentials') {
        withCredentials([usernamePassword(credentialsId: 'mysql_creds', passwordVariable: 'mysqlpassword', usernameVariable: 'mysqlusername')]) {
            creds = "\nUsername: ${mysqlusername}\nPassword: ${mysqlpassword}\n"
        }
        println creds
    }
}
How can I incorporate the above code into my current pipeline so that the mysqlusername and mysqlpassword variables are accessible to all stages across the pipeline script, i.e. globally?
My pipeline script layout looks like below:
pipeline { // indicate the job is written in Declarative Pipeline
    agent { label 'Prod_Slave' }
    environment {
        STAGE_2_EXECUTED = "0"
    }
    stages {
        stage ("First Stage") {
            steps {
                echo "First called in pipeline"
                script {
                    echo "Inside script of First stage"
                }
            }
        } // end of first stage
        stage ("Second Stage") {
            steps {
                echo "Second stage called in pipeline"
                script {
                    echo "Inside script of Second stage"
                }
            }
        } // end of second stage
    } // end of stages
} // end of pipeline
I'm on the latest version of Jenkins.
Requesting solutions. Thank you.
You can do something like this. Here, you define your variables under environment { } and use them throughout your stages.
pipeline {
    agent any
    environment {
        // More detail:
        // https://jenkins.io/doc/book/pipeline/jenkinsfile/#usernames-and-passwords
        MYSQL_CRED = credentials('mysql_creds')
    }
    stages {
        stage('Run Some Command') {
            steps {
                echo "Running some command"
                sh '<some-command> -u $MYSQL_CRED_USR -p $MYSQL_CRED_PSW'
            }
        }
    }
}
Variables defined under environment are global to all the stages, so they can be used in the whole Jenkinsfile.
More information about credentials() in official documentation.
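If you only need the secret in a single stage, the withCredentials step from the question also works inside a Declarative steps block. A sketch, reusing the mysql_creds id from the question (the mysql command is only an example):
pipeline {
    agent any
    stages {
        stage('Getting Database Credentials') {
            steps {
                withCredentials([usernamePassword(credentialsId: 'mysql_creds', passwordVariable: 'mysqlpassword', usernameVariable: 'mysqlusername')]) {
                    // single quotes so the shell, not Groovy, expands the secrets
                    sh 'mysql -u $mysqlusername -p$mysqlpassword -e "SELECT 1"'
                }
            }
        }
    }
}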

Re-use agent in parallel stages of declarative pipeline

I'm using Declarative Pipelines 1.3.2 plugin and I want to use the same agent (as in only specifying the agent directive once) in multiple parallel stages:
stage('Parallel Deployment')
{
    agent { dockerfile { label 'docker'; filename 'Dockerfile'; } }
    parallel
    {
        stage('A') { steps { ... } }
        stage('B') { steps { ... } }
    }
}
However, Jenkins complains:
"agent" is not allowed in stage "Parallel Deployment" as it contains parallel stages
A solution is to duplicate the agent directive for each parallel stage, but this is tedious and leads to a lot of duplicated code when there are many parallel stages:
stage('Parallel Deployment')
{
    parallel
    {
        stage('A') {
            agent { dockerfile { label 'docker'; filename 'Dockerfile'; } }
            steps { ... }
        }
        stage('B') {
            agent { dockerfile { label 'docker'; filename 'Dockerfile'; } }
            steps { ... }
        }
    }
}
Is there a more idiomatic solution, or is duplicating agent directive necessary for each of the parallel stages?
Specifying the agent at pipeline level can be a solution, but has the potential downside that the agent is up & running for the whole duration of the build.
Also note that this means each stage (that doesn't define its own agent) is run on the same agent instance, not agent type. If the parallel processes are CPU / resource intensive, this might not be what you want.
Still, if you want to run parallel stages on one instance and cannot (or do not want to) define the agent at pipeline level, here's a workaround for the declarative syntax:
stage('Parallel Deployment') {
    agent { dockerfile { label 'docker'; filename 'Dockerfile'; } }
    stages {
        stage('A & B') {
            parallel {
                stage('A') { steps { ... } }
                stage('B') { steps { ... } }
            }
        }
    }
}
Or you go for a scripted pipeline, which doesn't have this limitation.
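A scripted sketch of the same idea, simplified to a plain node with the docker label instead of the dockerfile agent; both branches run on the same node and workspace:
node('docker') {
    parallel(
        'A': {
            sh 'echo A' // placeholder for the real deployment steps
        },
        'B': {
            sh 'echo B'
        }
    )
}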
Declare the agent at Pipeline level so all stages run on the same agent.
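For example (a sketch using the dockerfile agent from the question):
pipeline {
    agent { dockerfile { label 'docker'; filename 'Dockerfile' } }
    stages {
        stage('Parallel Deployment') {
            parallel {
                stage('A') { steps { echo 'A' } }
                stage('B') { steps { echo 'B' } }
            }
        }
    }
}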

Cannot define variable in pipeline stage

I'm trying to create a declarative Jenkins pipeline script but having issues with simple variable declaration.
Here is my script:
pipeline {
    agent none
    stages {
        stage("first") {
            def foo = "foo" // fails with "WorkflowScript: 5: Expected a step @ line 5, column 13."
            sh "echo ${foo}"
        }
    }
}
However, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: Expected a step @ line 5, column 13.
   def foo = "foo"
   ^
I'm on Jenkins 2.7.4 and Pipeline 2.4.
The Declarative model for Jenkins Pipelines has a restricted subset of syntax that it allows in the stage blocks - see the syntax guide for more info. You can bypass that restriction by wrapping your steps in a script { ... } block, but as a result, you'll lose validation of syntax, parameters, etc within the script block.
I think the error is not coming from the specified line but from the first 3 lines. Try this instead:
node {
    stage("first") {
        def foo = "foo"
        sh "echo ${foo}"
    }
}
I think you had some extra lines that are not valid...
From the declarative pipeline model documentation, it seems that you have to use an environment declaration block to declare your variables, e.g.:
pipeline {
    environment {
        FOO = "foo"
    }
    agent none
    stages {
        stage("first") {
            steps {
                sh "echo ${FOO}"
            }
        }
    }
}
Agree with @Pom12, @abayer. To complete the answer, you need to add a script block.
Try something like this:
pipeline {
    agent any
    environment {
        ENV_NAME = "${env.BRANCH_NAME}"
    }
    // ----------------
    stages {
        stage('Build Container') {
            steps {
                echo 'Building Container..'
                script {
                    if (ENVIRONMENT_NAME == 'development') {
                        ENV_NAME = 'Development'
                    } else if (ENVIRONMENT_NAME == 'release') {
                        ENV_NAME = 'Production'
                    }
                }
                echo 'Building Branch: ' + env.BRANCH_NAME
                echo 'Build Number: ' + env.BUILD_NUMBER
                echo 'Building Environment: ' + ENV_NAME
                echo "Running your service with environment ${ENV_NAME} now"
            }
        }
    }
}
In Jenkins 2.138.3 there are two different types of pipelines: Declarative and Scripted.
"Declarative pipelines is a new extension of the pipeline DSL (it is basically a pipeline script with only one step, a pipeline step with arguments, called directives; these directives should follow a specific syntax). The point of this new format is that it is more strict and therefore should be easier for those new to pipelines, allow for graphical editing and much more. Scripted pipelines is the fallback for advanced requirements."
From: jenkins pipeline: agent vs node?
Here is an example of using environment and global variables in a Declarative Pipeline. From what I can tell, environment variables are static after they are set.
def browser = 'Unknown'
pipeline {
    agent any
    environment {
        // Use Pipeline Utility Steps plugin to read information from pom.xml into env variables
        IMAGE = readMavenPom().getArtifactId()
        VERSION = readMavenPom().getVersion()
    }
    stages {
        stage('Example') {
            steps {
                script {
                    browser = sh(returnStdout: true, script: 'echo Chrome')
                }
            }
        }
        stage('SNAPSHOT') {
            when {
                expression {
                    return !env.JOB_NAME.equals("PROD") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "SNAPSHOT"
                echo "${browser}"
            }
        }
        stage('RELEASE') {
            when {
                expression {
                    return !env.JOB_NAME.equals("TEST") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "RELEASE"
                echo "${browser}"
            }
        }
    } // end of stages
} // end of pipeline
You are using a Declarative Pipeline, which requires a script step to execute arbitrary Groovy code. This is a huge difference compared to the Scripted Pipeline, where this is not necessary.
The official documentation says the following:
"The script step takes a block of Scripted Pipeline and executes that in the Declarative Pipeline."
pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}
You can define the variable globally, but when using it you must reference it inside a script block:
def foo = "foo"
pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                script {
                    sh "echo ${foo}"
                }
            }
        }
    }
}
Try this declarative pipeline; it works:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}
