I have a shared library containing a declarative pipeline that is used from a repo's Jenkinsfile (it is in fact called through an intermediate): Jenkinsfile -> bupJavadocApiPipeline.groovy -> bupMavenPipeline.groovy
Jenkinsfile (shared library is implicit):
bupJavadocApiPipeline {}
bupJavadocApiPipeline.groovy:
def call(body, Map defaults = [:]) {
    if (defaults.mavenGoals == null) defaults.mavenGoals = 'javadoc:javadoc package'
    bupMavenPipeline(body, defaults)
}
bupMavenPipeline.groovy (bupParameters does the DELEGATE_FIRST trick):
def call(body, defaults = [:]) {
    if (defaults.maven == null) defaults.maven = 'MVN3'
    if (defaults.mavenGoals == null) defaults.mavenGoals = 'package'
    if (defaults.jdk == null) defaults.jdk = 'JDK8'
    if (defaults.buildsToKeep == null) defaults.buildsToKeep = '10'
    def parameters = bupParameters(body, defaults)
    pipeline {
        options {
            timestamps()
            buildDiscarder(logRotator(numToKeepStr: "${parameters.buildsToKeep}"))
        }
        agent ('docker') {
            tools {
                maven "${parameters.maven}"
                jdk "${parameters.maven}"
            }
            stages {
                stage ('Build') {
                    steps {
                        sh "mvn -Dmaven.test.failure.ignore=true clean ${parameters.mavenGoals}"
                    }
                    post {
                        success {
                            junit '**/target/surefire-report/**/*.xml'
                        }
                    }
                }
            }
        }
    }
}
This fails with:
[Pipeline] Start of Pipeline
[Pipeline] End of Pipeline
java.lang.NoSuchMethodError: No such DSL method 'options' found among steps [...
Versions are Jenkins 2.162 and Pipeline 2.6, but so many web resources say this is supported since September 2017!
I can make it all work if I only use scripted pipeline in bupMavenPipeline.groovy, but I like the "safety" of declarative (and there seem to be many more resources on that than on scripted).
Can you see what is tripping me up?
By inlining the pipeline into the actual Jenkinsfile, Jenkins kindly helped me identify the problem:
agent ('docker') { ... }
should be
agent { label 'docker' }
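For reference, a sketch of how the shared-library step can be laid out once the agent syntax is fixed, with tools, options and stages sitting directly under pipeline as declarative expects. Note the jdk line: the snippet in the question passes parameters.maven there, which looks like a copy-paste slip, so this sketch uses parameters.jdk instead.
def call(body, defaults = [:]) {
    // defaults and bupParameters as before
    def parameters = bupParameters(body, defaults)
    pipeline {
        agent { label 'docker' }
        tools {
            maven "${parameters.maven}"
            jdk "${parameters.jdk}"
        }
        options {
            timestamps()
            buildDiscarder(logRotator(numToKeepStr: "${parameters.buildsToKeep}"))
        }
        stages {
            stage('Build') {
                steps {
                    sh "mvn -Dmaven.test.failure.ignore=true clean ${parameters.mavenGoals}"
                }
                post {
                    success {
                        junit '**/target/surefire-report/**/*.xml'
                    }
                }
            }
        }
    }
}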
How can you create a pipeline that runs its stages on every branch, but the post-build action only on specific branches?
I have seen the when { branch 'production' } option, but it seems to me that this only works for stage blocks and not for post blocks.
Is there a way to do something like
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                bat script: "echo build"
            }
            post {
                always {
                    when {
                        branch 'production'
                    }
                    bat script: "echo publish"
                }
            }
        }
    }
}
The if (env.BRANCH_NAME == 'production') approach seems to be only for scripted pipelines.
Conditional post section in Jenkins pipeline has an answer: use script and if inside the post condition.
post {
    always {
        script {
            if (env.BRANCH_NAME == 'production') {
                // I have not verified whether your step works here;
                // conditional emailext works as expected.
                bat script: "echo publish"
            }
        }
    }
}
I have a common Jenkins shared library for all the repositories as below.
vars/_publish.groovy
pipeline {
    environment {
        abc = credentials('abc')
        def = credentials('def')
    }
    stages {
        stage('Build') {
            steps {
                sh 'docker build'
            }
        }
        stage('Unit-test') {
            steps {
                sh 'mvn test'
            }
        }
    }
}
Jenkinsfile
@Library('my-shared-library@branch') _
_publish() {
}
I have 10 repositories, each with its own Jenkinsfile as shown above, which refers to the Jenkins shared library (vars/_publish.groovy). I have a condition here that I need to pass: for a few repositories I want to skip the unit test and just execute the build stage, and for the rest I want both stages. Is there any way I can skip a particular stage based on the repository or repository name?
Yes, it's possible. You can use a when expression like this:
pipeline {
    agent any
    stages {
        stage('Test') {
            // Put your repository name here ('dev'), so the stage only executes
            // when the repository name is 'dev'.
            when { expression { return repositoryName.contains('dev') } }
            steps {
                script {
                }
            }
        }
    }
}
def repositoryName() {
    def repositoryName = ['dev', 'test'] // add your 10 repo names here
    return repositoryName
}
Here, in my case, the repo names are dev and test, so you can add yours accordingly.
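If the lookup above does not resolve as-is in your setup (repositoryName is declared as a method returning a list), a hedged variation is to keep the list and match it against a standard environment variable such as JOB_NAME. The repo names and the skip logic below are placeholders to adapt:
// Hypothetical names of repositories that should skip the unit-test stage.
def skipTestRepos = ['repo-a', 'repo-b']

pipeline {
    agent any
    stages {
        stage('Unit-test') {
            when {
                expression {
                    // Run the stage only if the current job name does not mention
                    // a repo from the skip list; JOB_NAME is a standard Jenkins env variable.
                    return !skipTestRepos.any { env.JOB_NAME.contains(it) }
                }
            }
            steps {
                sh 'mvn test'
            }
        }
    }
}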
I would decorate my shared library and Jenkinsfile like this to achieve your scenario.
vars/_publish.groovy
def call(body = {}) {
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent any
        stages {
            stage('build') {
                steps {
                    echo "BUILD"
                }
            }
            stage('unitest') {
                when {
                    anyOf {
                        equals expected: true, actual: pipelineParams.isEmpty()
                        equals expected: false, actual: pipelineParams.skipUnitest
                    }
                }
                steps {
                    echo "UNITEST"
                }
            }
        }
    }
}
I am enabling my shared library to accept a parameter from the Jenkinsfile and, with the when {} directive, deciding whether to skip the unitest stage or not.
Jenkinsfile
If the Jenkinsfile in your repo has the details below, the unitest stage will be skipped:
@Library('jenkins-shared-library') _
_publish() {
    skipUnitest = true
}
Both scenarios below will run the unitest stage:
@Library('jenkins-shared-library') _
_publish() {
    skipUnitest = false
}
and
@Library('jenkins-shared-library') _
_publish() {
}
This looks like a very basic question about Jenkins usage.
I have a Jenkinsfile located in the root folder of my Subversion repository tree. There are many branches (versions/tags) of the product, and the same Jenkinsfile is in every one of them. So far a very basic setup, I suppose.
I need to provide some steps with the current Subversion repository branch/URL.
There are some similar questions like this or this, but none of them is a working solution for Subversion.
pipeline {
    agent { label 'master' }
    stages {
        stage("test") {
            steps {
                echo "Start pipeline "
                // commented-out = not working
                //echo scm.getUserRemoteConfigs()
                //echo scm
                script {
                    println "Current svn url/branch: " //??? + scm.getUserRemoteConfigs()[0].getUrl()
                }
            }
        }
    }
}
It will be like this
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    def s = checkout scm
                    if (s.GIT_URL != null) {
                        print s.GIT_URL
                        print s.GIT_BRANCH
                        print s.GIT_COMMIT
                    } else if (s.SVN_URL != null) {
                        print s.SVN_REVISION
                        print s.SVN_REVISION_1
                        print s.SVN_URL
                        print s.SVN_URL_1
                    }
                }
            }
        }
    }
}
Note: this works fine with both Git and SVN, but a bit differently (the keys returned by checkout scm differ, as shown above).
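If you need the value in later stages, a small sketch (using the keys from the snippet above) is to stash it in an environment variable:
script {
    def s = checkout scm
    // Prefer the SVN URL when present, otherwise fall back to the Git URL.
    env.REPO_URL = s.SVN_URL ?: s.GIT_URL
    echo "Repository URL: ${env.REPO_URL}"
}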
As far as declarative pipelines go in Jenkins, I'm having trouble with the when keyword.
I keep getting the error No such DSL method 'when' found among steps. I'm sort of new to Jenkins 2 declarative pipelines and don't think I am mixing up scripted pipelines with declarative ones.
The goal of this pipeline is to run mvn deploy after a successful Sonar run and send out mail notifications of a failure or success. I only want the artifacts to be deployed when on master or a release branch.
The part I'm having difficulty with is the post section. The Notifications stage is working great. Note that I got this to work without the when clause, but I really need it or an equivalent.
pipeline {
    agent any
    tools {
        maven 'M3'
        jdk 'JDK8'
    }
    stages {
        stage('Notifications') {
            steps {
                sh 'mkdir tmpPom'
                sh 'mv pom.xml tmpPom/pom.xml'
                checkout([$class: 'GitSCM', branches: [[name: 'origin/master']], doGenerateSubmoduleConfigurations: false, submoduleCfg: [], userRemoteConfigs: [[url: 'https://repository.git']]])
                sh 'mvn clean test'
                sh 'rm pom.xml'
                sh 'mv tmpPom/pom.xml ../pom.xml'
            }
        }
    }
    post {
        success {
            script {
                currentBuild.result = 'SUCCESS'
            }
            when {
                branch 'master|release/*'
            }
            steps {
                sh 'mvn deploy'
            }
            sendNotification(recipients,
                    null,
                    'https://link.to.sonar',
                    currentBuild.result,
            )
        }
        failure {
            script {
                currentBuild.result = 'FAILURE'
            }
            sendNotification(recipients,
                    null,
                    'https://link.to.sonar',
                    currentBuild.result
            )
        }
    }
}
In the documentation of declarative pipelines, it's mentioned that you can't use when in the post block. when is allowed only inside a stage directive.
So what you can do is test the conditions using an if in a script:
post {
    success {
        script {
            if (env.BRANCH_NAME == 'master')
                currentBuild.result = 'SUCCESS'
        }
    }
    // failure block
}
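Applied to the pipeline from the question, the post section could then look roughly like this; sendNotification and recipients are taken from the question, and the branch check approximates its 'master|release/*' pattern:
post {
    success {
        script {
            currentBuild.result = 'SUCCESS'
            // Deploy only from master or release branches.
            if (env.BRANCH_NAME == 'master' || env.BRANCH_NAME.startsWith('release/')) {
                sh 'mvn deploy'
            }
            sendNotification(recipients,
                    null,
                    'https://link.to.sonar',
                    currentBuild.result)
        }
    }
    failure {
        script {
            currentBuild.result = 'FAILURE'
            sendNotification(recipients,
                    null,
                    'https://link.to.sonar',
                    currentBuild.result)
        }
    }
}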
Using a GitHub Repository and the Pipeline plugin I have something along these lines:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh '''
                make
                '''
            }
        }
    }
    post {
        always {
            sh '''
            make clean
            '''
        }
        success {
            script {
                if (env.BRANCH_NAME == 'master') {
                    emailext (
                        to: 'engineers@green-planet.com',
                        subject: "${env.JOB_NAME} #${env.BUILD_NUMBER} master is fine",
                        body: "The master build is happy.\n\nConsole: ${env.BUILD_URL}.\n\n",
                        attachLog: true,
                    )
                } else if (env.BRANCH_NAME.startsWith('PR')) {
                    // also send email to tell people their PR status
                } else {
                    // this is some other branch
                }
            }
        }
    }
}
And that way, notifications can be sent based on the type of branch being built. See the pipeline model definition and also the global variable reference available on your server at http://your-jenkins-ip:8080/pipeline-syntax/globals#env for details.
Ran into the same issue with post. Worked around it by annotating the variable with @groovy.transform.Field. This was based on info I found in the Jenkins docs for defining global variables.
e.g.
#!groovy
pipeline {
    agent none
    stages {
        stage("Validate") {
            parallel {
                stage("Ubuntu") {
                    agent {
                        label "TEST_MACHINE"
                    }
                    steps {
                        sh "run tests command"
                        recordFailures('Ubuntu', 'test-results.xml')
                        junit 'test-results.xml'
                    }
                }
            }
        }
    }
    post {
        unsuccessful {
            notify()
        }
    }
}

// Make testFailures global so it can be accessed from a 'post' step
@groovy.transform.Field
def testFailures = [:]

def recordFailures(key, resultsFile) {
    def failures = ... parse test-results.xml script for failures ...
    if (failures) {
        testFailures[key] = failures
    }
}

def notify() {
    if (testFailures) {
        ... do something here ...
    }
}
I'm trying to create a declarative Jenkins pipeline script but having issues with simple variable declaration.
Here is my script:
pipeline {
    agent none
    stages {
        stage("first") {
            def foo = "foo" // fails with "WorkflowScript: 5: Expected a step @ line 5, column 13."
            sh "echo ${foo}"
        }
    }
}
However, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: Expected a step @ line 5, column 13.
def foo = "foo"
^
I'm on Jenkins 2.7.4 and Pipeline 2.4.
The Declarative model for Jenkins Pipelines has a restricted subset of syntax that it allows in the stage blocks - see the syntax guide for more info. You can bypass that restriction by wrapping your steps in a script { ... } block, but as a result, you'll lose validation of syntax, parameters, etc within the script block.
I think the error is not coming from the specified line but from the first 3 lines. Try this instead:
node {
    stage("first") {
        def foo = "foo"
        sh "echo ${foo}"
    }
}
I think you had some extra lines that are not valid...
From the declarative pipeline model documentation, it seems that you have to use an environment declaration block to declare your variables, e.g.:
pipeline {
    environment {
        FOO = "foo"
    }
    agent none
    stages {
        stage("first") {
            steps {
                sh "echo ${FOO}"
            }
        }
    }
}
Agree with @Pom12, @abayer. To complete the answer, you need to add a script block.
Try something like this:
pipeline {
    agent any
    environment {
        ENV_NAME = "${env.BRANCH_NAME}"
    }
    // ----------------
    stages {
        stage('Build Container') {
            steps {
                echo 'Building Container..'
                script {
                    // ENVIRONMENT_NAME is assumed to be provided elsewhere
                    // (e.g. a job parameter or an injected environment variable).
                    if (ENVIRONMENT_NAME == 'development') {
                        ENV_NAME = 'Development'
                    } else if (ENVIRONMENT_NAME == 'release') {
                        ENV_NAME = 'Production'
                    }
                }
                echo 'Building Branch: ' + env.BRANCH_NAME
                echo 'Build Number: ' + env.BUILD_NUMBER
                echo 'Building Environment: ' + ENV_NAME
                echo "Running your service with environment ${ENV_NAME} now"
            }
        }
    }
}
In Jenkins 2.138.3 there are two different types of pipelines.
Declarative and Scripted pipelines.
"Declarative pipelines is a new extension of the pipeline DSL (it is basically a pipeline script with only one step, a pipeline step with arguments, called directives; these directives should follow a specific syntax). The point of this new format is that it is more strict and therefore should be easier for those new to pipelines, allow for graphical editing and much more.
Scripted pipelines is the fallback for advanced requirements."
jenkins pipeline: agent vs node?
Here is an example of using environment and global variables in a Declarative Pipeline. From what I can tell, environment variables are static after they are set.
def browser = 'Unknown'

pipeline {
    agent any
    environment {
        // Use the Pipeline Utility Steps plugin to read information from pom.xml into env variables
        IMAGE = readMavenPom().getArtifactId()
        VERSION = readMavenPom().getVersion()
    }
    stages {
        stage('Example') {
            steps {
                script {
                    browser = sh(returnStdout: true, script: 'echo Chrome')
                }
            }
        }
        stage('SNAPSHOT') {
            when {
                expression {
                    return !env.JOB_NAME.equals("PROD") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "SNAPSHOT"
                echo "${browser}"
            }
        }
        stage('RELEASE') {
            when {
                expression {
                    return !env.JOB_NAME.equals("TEST") && !env.VERSION.contains("RELEASE")
                }
            }
            steps {
                echo "RELEASE"
                echo "${browser}"
            }
        }
    } // end of stages
} // end of pipeline
You are using a Declarative Pipeline which requires a script-step to execute Groovy code. This is a huge difference compared to the Scripted Pipeline where this is not necessary.
The official documentation says the following:
The script step takes a block of Scripted Pipeline and executes that in the Declarative Pipeline.
pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}
You can define the variable globally, but when using this variable you must write it inside a script block.
def foo = "foo"

pipeline {
    agent none
    stages {
        stage("first") {
            steps {
                script {
                    sh "echo ${foo}"
                }
            }
        }
    }
}
Try this declarative pipeline; it's working:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "foo"
                    sh "echo ${foo}"
                }
            }
        }
    }
}