Jenkinsfile Pipeline errors: "expected a symbol" and "undefined section"

Can anyone explain why I get the following errors, and what can be a possible solution for them?
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 20: Expected a symbol @ line 20, column 4.
   environment {
WorkflowScript: 17: Undefined section "error" @ line 17, column 1.
   pipeline {
The code in the Jenkinsfile is as follows:
#!groovy
def application, manifest, git, environment, artifactory, sonar

fileLoader.withGit('git@<reducted>', 'v1', 'ssh-key-credential-id-number') {
    application = fileLoader.load('<reducted>');
    manifest = fileLoader.load('<reducted>');
    git = fileLoader.load('<reducted>');
    environment = fileLoader.load('<reducted>');
}

pipeline {
    agent { label 'cf_slave' }

    environment {
        def projectName = null
        def githubOrg = null
        def gitCommit = null
    }

    options {
        skipDefaultCheckout()
    }

    stages {
        stage ("Checkout SCM") {
            steps {
                checkout scm
                script {
                    projectName = git.getGitRepositoryName()
                    githubOrg = git.getGitOrgName()
                    gitCommit = manifest.getGitCommit()
                }
            }
        }

        stage ("Unit tests") {
            steps {
                sh "./node_modules/.bin/mocha --reporter mocha-junit-reporter --reporter-options mochaFile=./testResults/results.xml"
                junit allowEmptyResults: true, testResults: 'testResults/results.xml'
            }
        }

        //stage ("SonarQube analysis") {
        //...
        //}

        // stage("Simple deploy") {
        //     steps {
        //         // Login
        //         sh "cf api <reducted>"
        //         sh "cf login -u <reducted> -p <....>"
        //
        //         // Deploy
        //         sh "cf push"
        //     }
        // }
    }

    post {
        // always {
        // }
        success {
            sh "echo 'Pipeline reached the finish line!'"
            // Notify in Slack
            slackSend color: 'yellow', message: "Pipeline operation completed successfully. Check <reducted>"
        }
        failure {
            sh "echo 'Pipeline failed'"
            // Notify in Slack
            slackSend color: 'red', message: "Pipeline operation failed!"
            //Clean the execution workspace
            //deleteDir()
        }
        unstable {
            sh "echo 'Pipeline unstable :-('"
        }
        // changed {
        //     sh "echo 'Pipeline was previously failing but is now successful.'"
        // }
    }
}

Your Pipeline is mostly fine; adding Scripted Pipeline elements before the Declarative pipeline block is generally not a problem.
However, at the very start you're defining variables called environment and git, which shadow the sections and steps declared by the various Pipeline plugins.
That is, when you attempt to do pipeline { environment { … } }, environment refers to your variable declaration, which causes things to go wrong.
Rename those two variables, and you'll fix the first error message.
To fix the second error message, remove the variable declarations from the environment block; this block is only intended for exporting environment variables for use during the build steps, e.g.:
environment {
    FOO = 'bar'
    BAR = 'baz'
}
The script block you have will work fine without these declarations. Alternatively, you can move those variable declarations to the top level of your script.
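Putting both fixes together, a minimal sketch of the top of the Jenkinsfile could look like the following (gitUtils and envConfig are placeholder names for the renamed loader variables; the <reducted> values are kept from the question):
#!groovy
// Renamed so they no longer shadow the 'git' step and the 'environment' section
def application, manifest, gitUtils, envConfig

fileLoader.withGit('git@<reducted>', 'v1', 'ssh-key-credential-id-number') {
    gitUtils = fileLoader.load('<reducted>')
    envConfig = fileLoader.load('<reducted>')
}

pipeline {
    agent { label 'cf_slave' }
    options { skipDefaultCheckout() }
    stages {
        stage('Checkout SCM') {
            steps {
                checkout scm
                script {
                    // Plain Groovy variables belong in a script block,
                    // not in the environment directive
                    def projectName = gitUtils.getGitRepositoryName()
                    echo "Checked out ${projectName}"
                }
            }
        }
    }
}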

If you're using Declarative Pipeline (which you are, given the outer pipeline block), then you may only declare the pipeline at the outer level; for example, you can't have variable and function definitions there. This is a downside of using Declarative Pipeline.
As I see it, you can solve this in the following ways:
Use a Scripted Pipeline instead.
Move the code at the beginning into a global pipeline library. (It might be tricky to resolve variable scoping if a value is used in several places, but it should be doable.)
Move the code at the beginning into a script step inside the pipeline and store the values as described here; see the sketch below.
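A minimal sketch of that last option; anything stored on env in an early script block is visible as an ordinary environment variable in later stages (PROJECT_NAME and its value are placeholders):
pipeline {
    agent any
    stages {
        stage('Init') {
            steps {
                script {
                    // Stored on env, so later stages can read it
                    env.PROJECT_NAME = 'my-project'
                }
            }
        }
        stage('Build') {
            steps {
                echo "Building ${env.PROJECT_NAME}"
            }
        }
    }
}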

Related

Jenkins. Use a shared library on the options phase

I have created a shared library in Jenkins with a listener that gets triggered each time the pipeline reads a FlowNode, so I can run Groovy code before and after each stage, step, etc.
I'm able to call the shared library in a steps block like this:
pipeline {
    agent any
    stages {
        stage('prepare') {
            steps {
                prepareStepsWrapper()
            }
        }
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
        stage('step2') {
            steps {
                echo 'step2'
            }
        }
        stage('step3') {
            steps {
                echo 'step3'
                // fail on purpose
                sh 'notfoundexecutablelol'
            }
        }
        stage('step4') {
            steps {
                echo 'step4'
            }
        }
    }
    post {
        always {
            println env.getEnvironment()
        }
    }
}
And it works great!
With this approach the 'prepare' stage needs to be filtered out, so I've switched to the options directive:
pipeline {
    agent any
    options {
        prepareStepsWrapper()
    }
    stages {
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
        ...
    }
}
But the pipeline fails with
WorkflowScript: 4: Invalid option type "prepareStepsWrapper"
tl;dr; How can I load a shared library within the options directive?
What does the options directive do?
The options directive allows configuring Pipeline-specific options from within the Pipeline itself.
You can't call a shared library in the options directive. It should not be used to execute any logic; rather, it sets configuration for the pipeline. All available options and their documentation can be found here.
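For illustration, a valid options block contains only predefined option symbols such as timeout or buildDiscarder:
pipeline {
    agent any
    options {
        // Only predefined option symbols are accepted here;
        // arbitrary steps like prepareStepsWrapper() are not
        timeout(time: 30, unit: 'MINUTES')
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    stages {
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
    }
}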
You could try to create a stage that simply calls your prepareStepsWrapper() and use locks to prevent other stages from executing before it.
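A rough sketch of that idea, assuming the Lockable Resources plugin provides the lock step (stages of a single run already execute sequentially, so the lock mainly guards against concurrent builds):
pipeline {
    agent any
    stages {
        stage('prepare') {
            steps {
                // Hold a named lock while preparation runs
                lock(resource: 'pipeline-prepare') {
                    prepareStepsWrapper()
                }
            }
        }
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
    }
}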

Jenkinsfile Pipeline errors: “expected a step” and “undefined section”

Can anyone explain why I get the following errors, and what can be a possible solution for them?
Running in Durability level: MAX_SURVIVABILITY
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 43: Expected a step @ line 43, column 17.
   if (isUnix()) {
   ^

WorkflowScript: 71: Undefined section "success" @ line 71, column 5.
   success {
   ^

WorkflowScript: 78: Undefined section "failure" @ line 78, column 5.
   failure {
   ^

WorkflowScript: 16: Tool type "maven" does not have an install of "MAVEN_HOME" configured - did you mean "Maven M2"? @ line 16, column 19.
   maven "MAVEN_HOME"
   ^

WorkflowScript: 17: Tool type "jdk" does not have an install of "JAVA_HOME" configured - did you mean "null"? @ line 17, column 17.
   jdk "JAVA_HOME"
   ^
The code in the Jenkinsfile is as follows:
pipeline {
    // agent defines where the pipeline will run.
    agent {
        // Here we define that we wish to run on the agent with the label SL202_win
        label "SL202_win"
    }
    // The tools directive allows you to automatically install tools configured in
    // Jenkins - note that it doesn't work inside Docker containers currently.
    tools {
        // Here we have pairs of tool symbols (not all tools have symbols, so if you
        // try to use one from a plugin you've got installed and get an error and the
        // tool isn't listed in the possible values, open a JIRA against that tool!)
        // and installations configured in your Jenkins master's tools configuration.
        maven "MAVEN_HOME"
        jdk "JAVA_HOME"
    }
    environment {
        // Environment variable identifiers need to be both valid bash variable
        // identifiers and valid Groovy variable identifiers. If you use an invalid
        // identifier, you'll get an error at validation time.
        // Right now, you can't do more complicated Groovy expressions or nesting of
        // other env vars in environment variable values, but that will be possible
        // when https://issues.jenkins-ci.org/browse/JENKINS-41748 is merged and
        // released.
        mvnHome = "D:/Tools/apache-maven-3.5.2"
    }
    stages {
        // At least one stage is required.
        stage("Preparation") {
            // Every stage must have a steps block containing at least one step.
            steps {
                // Get some code from a GitHub repository
                git 'https://git.ceesiesdomain.nl/scm/rsd/test_automation.git'
            }
        }
        stage('Build') {
            steps {
                // Run the maven build
                if (isUnix()) {
                    sh "'${mvnHome}/bin/mvn' clean test -Dtest=TestRunner"
                } else {
                    bat(/"${mvnHome}\bin\mvn" clean test -Dtest=TestRunner/)
                }
            }
        }
        stage('Results') {
            steps {
                cucumber buildStatus: 'UNSTABLE', failedFeaturesNumber: 999, failedScenariosNumber: 999, failedStepsNumber: 3, fileIncludePattern: '**/*.json', skippedStepsNumber: 999
            }
        }
    }
    // Post can be used both on individual stages and for the entire build.
    post {
        success {
            echo "Test run completed successfully."
        }
        failure {
            echo "Test run failed."
        }
        always {
            // Let's wipe out the workspace before we finish!
            deleteDir()
            echo "Workspace cleaned"
        }
    }
    success {
        mail(from: "jenkins@ceesiesdomain.nl",
             to: "ceesie@ceesiesdomain.nl",
             subject: "That build passed.",
             body: "Nothing to see here")
    }
    failure {
        mail(from: "jenkins@ceesiesdomain.nl",
             to: "ceesie@ceesiesdomain.nl",
             subject: "That build failed!",
             body: "Nothing to see here")
    }
    // The options directive is for configuration that applies to the whole job.
    options {
        // For example, we'd like to make sure we only keep 10 builds at a time, so
        // we don't fill up our storage!
        buildDiscarder(logRotator(numToKeepStr:'10'))
        // And we'd really like to be sure that this build doesn't hang forever, so
        // let's time it out after an hour.
        timeout(time: 60, unit: 'MINUTES')
    }
}
I managed to get it working by trimming back all the excess, starting with the bare essentials, and iteratively adding steps and configuration, running the pipeline after each change to verify it still worked.
The script I ended up with is:
pipeline {
    agent {
        label 'SL202_win'
    }
    stages {
        stage("Fetch repository") {
            steps {
                git 'https://git.ceesiesdomain.nl/scm/rsd/test_automation.git'
            }
        }
        stage('Run test') {
            steps {
                bat 'cd d:/SL202_Data/workspace/Front-end-SwiftNL/Sanctie_Regressie_Workflows_WCM'
                bat 'mvn clean test -f d:/SL202_Data/workspace/Front-end-SwiftNL/Sanctie_Regressie_Workflows_WCM/pom.xml -Dtest=TestRunner'
            }
        }
    }
    post {
        always {
            echo 'Test run completed'
            cucumber buildStatus: 'UNSTABLE', failedFeaturesNumber: 999, failedScenariosNumber: 999, failedStepsNumber: 3, fileIncludePattern: '**/*.json', skippedStepsNumber: 999
        }
        success {
            echo 'Successfully!'
        }
        failure {
            echo 'Failed!'
        }
        unstable {
            echo 'This will run only if the run was marked as unstable'
        }
        changed {
            echo 'This will run only if the state of the Pipeline has changed'
            echo 'For example, if the Pipeline was previously failing but is now successful'
        }
    }
    options {
        timeout(time: 60, unit: 'MINUTES')
    }
}
My way is to put the if or for block inside a script {} block (or a dir("") {} block) when using Declarative Pipelines.
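Applied to the failing Build stage from the question, that would look like this:
stage('Build') {
    steps {
        // Wrapping the conditional in script {} turns it into a valid step
        script {
            if (isUnix()) {
                sh "'${mvnHome}/bin/mvn' clean test -Dtest=TestRunner"
            } else {
                bat(/"${mvnHome}\bin\mvn" clean test -Dtest=TestRunner/)
            }
        }
    }
}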

How to force jenkins to reload a jenkinsfile?

My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, I don't see the job's input screen updated accordingly in Jenkins; I have to run a build, cancel it, and only then see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*', ),
        string(name: 'b', description: '*', ),
        string(name: 'c', description: '*', ),
    ])
])
any clues?
One of the ugliest things I've done to get around this is to create a Refresh parameter that basically exits the pipeline right away. This way I can run the pipeline just to update its properties.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
                     defaultValue: false,
                     description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately TomDotTom's answer was not working for me - I had the same issue, and my Jenkins required an additional stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax documentation).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
                     defaultValue: false,
                     description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
applied to Jenkins 2.233
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently it is a known Jenkins issue (or "hidden secret"): https://issues.jenkins.io/browse/JENKINS-41929.
I overcome this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipeline.
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and regenerates the pipeline's configuration, including its params. After the pipeline is created or updated, Job DSL queues it with the special param RELOAD.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline without an error at the end of a stage, which causes a "red" pipeline.)
As the parameters in the Jenkinsfile are declared via properties, they override anything set by the seed job, including RELOAD. At this point the pipeline is ready with the actual params and no sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])

pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters as they are not available on the first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed job are updated and run once to refresh the params list. A run labelled "Parameter Initialization" indicates such a job.
There is potentially a way to improve this so that only the affected pipelines are updated, but I haven't explored it, as all my pipelines are in one repository and I'm happy always updating them all.
Another possible upgrade: if you don't like aborting with error, you could add a when condition to every other stage so it is skipped when RELOAD is set, though I find adding when to every other stage cumbersome; a sketch follows.
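For reference, such a per-stage guard might look like this (the stage name is illustrative):
stage('Build') {
    // Skip the real work on a parameter-reload run
    when { expression { return !params.RELOAD } }
    steps {
        echo 'build'
    }
}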
I initially tried @TomDotTom's answer, but then I didn't like the manual effort.
A Scripted Pipeline workaround - you can probably make it work in Declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that Poll SCM must be enabled on the job so that Jenkinsfile changes are detected automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
I solve this by using the Jenkins Job Builder Python package. Its main goal is to achieve "Jenkins jobs as code".
To solve your problem, I keep a jobs.yaml like the one below in SCM, together with a Jenkins pipeline that listens for changes to that file and rebuilds the job definition for me, so that whenever I trigger my job all the needed parameters are already in place.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
        stage('info') {
          print params
        }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load / update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note: to use the jenkins-jobs command, you need to install the jenkins-job-builder Python package.
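It is available from PyPI:
pip install jenkins-job-builder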
This package has a lot of features, such as creating (free-style, pipeline, multibranch), updating, deleting, and validating Jenkins job configurations. It supports templates, meaning that from one generic template you can build any number of similar jobs, dynamically generate parameters, and so on.

Can I build stages with a function in a Jenkinsfile?

I'd like to use a function to build some of the stages of my Jenkinsfile. This build has a number of repetitive stages/steps, and I'd rather not generate everything manually.
I was wondering if it's possible to do something like this:
def _make_stage() {
    stage("xx") {
        step("A") {
            echo "A"
        }
        step("B") {
            echo "B"
        }
    }
}

def _make_stages() {
    stages {
        _make_stage()
    }
}

// pipeline starts here!
pipeline {
    agent any
    _make_stages()
}
Unfortunately Jenkins doesn't like this - when I run I get the error:
WorkflowScript: 24: Undefined section "_make_stages" @ line 24, column 5.
   _make_stages()
   ^

WorkflowScript: 22: Missing required section "stages" @ line 22, column 1.
   pipeline {
   ^
So what's going wrong here? The function _make_stages() really looks like it returns whatever the stages object returns. Why does it matter whether I put that in a function call or just inline it into the pipeline definition?
As explained here, Pipeline "scripts" are not simple Groovy scripts; they are heavily transformed before running, some parts on the master, some parts on the agents, with their state (variable values) serialized and passed to the next step. As such, not every Groovy feature is supported, and what looks like a simple function really is not.
That does not mean what you want to achieve is impossible. You can create stages programmatically, but apparently not with the declarative syntax; see also this question for good suggestions.
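For example, a minimal scripted sketch that builds stages from plain data (the component names are made up):
// Scripted pipeline: stages can be generated in a loop
def components = ['api', 'web', 'worker']

node {
    for (name in components) {
        stage("Build ${name}") {
            echo "Building ${name}"
        }
    }
}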
You can define a declarative pipeline in a shared library, for example:
// vars/evenOrOdd.groovy
def call(int buildNumber) {
    if (buildNumber % 2 == 0) {
        pipeline {
            agent any
            stages {
                stage('Even Stage') {
                    steps {
                        echo "The build number is even"
                    }
                }
            }
        }
    } else {
        pipeline {
            agent any
            stages {
                stage('Odd Stage') {
                    steps {
                        echo "The build number is odd"
                    }
                }
            }
        }
    }
}
// Jenkinsfile
@Library('my-shared-library') _
evenOrOdd(currentBuild.getNumber())
See Defining Declarative Pipelines

Cannot define variable in pipeline stage

I'm trying to create a declarative Jenkins pipeline script but having issues with simple variable declaration.
Here is my script:
pipeline {
    agent none
    stages {
        stage("first") {
            def foo = "foo" // fails with "WorkflowScript: 5: Expected a step @ line 5, column 13."
            sh "echo ${foo}"
        }
    }
}
However, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: Expected a step @ line 5, column 13.
   def foo = "foo"
   ^
I'm on Jenkins 2.7.4 and Pipeline 2.4.
The Declarative model for Jenkins Pipelines has a restricted subset of syntax that it allows in the stage blocks - see the syntax guide for more info. You can bypass that restriction by wrapping your steps in a script { ... } block, but as a result, you'll lose validation of syntax, parameters, etc within the script block.
I think the error is not coming from the specified line but from the first three lines. Try this instead:
node {
    stage("first") {
        def foo = "foo"
        sh "echo ${foo}"
    }
}
I think you had some extra lines that are not valid...
From the declarative pipeline model documentation, it seems that you have to use an environment declaration block to declare your variables, e.g.:
pipeline {
    environment {
        FOO = "foo"
    }
    agent none
    stages {
        stage("first") {
            steps {
                sh "echo ${FOO}"
            }
        }
    }
}
Agree with @Pom12, @abayer. To complete the answer, you need to add a script block.
Try something like this:
pipeline {
    agent any
    environment {
        ENV_NAME = "${env.BRANCH_NAME}"
    }
    // ----------------
    stages {
        stage('Build Container') {
            steps {
                echo 'Building Container..'
                script {
                    if (ENV_NAME == 'development') {
                        ENV_NAME = 'Development'
                    } else if (ENV_NAME == 'release') {
                        ENV_NAME = 'Production'
                    }
                }
                echo 'Building Branch: ' + env.BRANCH_NAME
                echo 'Build Number: ' + env.BUILD_NUMBER
                echo 'Building Environment: ' + ENV_NAME
                echo "Running your service with environment ${ENV_NAME} now"
            }
        }
    }
}
In Jenkins 2.138.3 there are two different types of pipelines: Declarative and Scripted.
"Declarative pipelines is a new extension of the pipeline DSL (it is basically a pipeline script with only one step, a pipeline step with arguments, called directives; these directives should follow a specific syntax). The point of this new format is that it is more strict and therefore should be easier for those new to pipelines, allow for graphical editing and much more. Scripted pipelines is the fallback for advanced requirements."
jenkins pipeline: agent vs node?
Here is an example of using environment and global variables in a Declarative Pipeline. From what I can tell, environment variables are static after they are set.
def browser = 'Unknown'

pipeline {
    agent any
    environment {
        // Use Pipeline Utility Steps plugin to read information from pom.xml into env variables
        IMAGE = readMavenPom().getArtifactId()
        VERSION = readMavenPom().getVersion()
    }
    stages {
        stage('Example') {
            steps {
                script {
                    browser = sh(returnStdout: true, script: 'echo Chrome')
                }
            }
        }
        stage('SNAPSHOT') {
            when {
                expression {
                    return !env.JOB_NAME.equals("PROD") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "SNAPSHOT"
                echo "${browser}"
            }
        }
        stage('RELEASE') {
            when {
                expression {
                    return !env.JOB_NAME.equals("TEST") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "RELEASE"
                echo "${browser}"
            }
        }
    } // end of stages
} // end of pipeline
You are using a Declarative Pipeline, which requires a script step to execute Groovy code. This is a big difference compared to the Scripted Pipeline, where this is not necessary.
The official documentation says the following:
The script step takes a block of Scripted Pipeline and executes that in the Declarative Pipeline.
pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}
You can define the variable globally, but any use of it must be inside a script block.
def foo="foo"
pipeline {
agent none
stages {
stage("first") {
script{
sh "echo ${foo}"
}
}
}
}
Try this declarative pipeline, it works:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}
