Jenkins declarative pipeline: add groovy postbuild script

I have a groovy postbuild script:
def error = manager.getLogMatcher(".*Error:(.*)")
if (error?.matches()) {
    manager.addShortText(error.group(1))
}
Now I'm trying to convert this into declarative pipeline syntax
pipeline {
    post {
        failure {}
    }
}
So, in the failure block, can I add a Groovy script, or do I have to add a stage? I see there is the jenkins-badge-plugin, but I'm not sure how to add a regex to find the text and then add the badge.

You just need to add a script block inside failure as follows, and there you can put your post-build Groovy script:
pipeline {
    post {
        failure {
            script {
                // Add your post build script code in case of failure
            }
        }
    }
}
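For example, here is a minimal sketch of what could go in that script block to replicate the original postbuild behaviour - assuming the Badge plugin is installed (it provides the addShortText step) and that reading currentBuild.rawBuild has been approved under script security:
post {
    failure {
        script {
            // Read the tail of the build log and look for the error line
            def log = currentBuild.rawBuild.getLog(200).join('\n')
            def matcher = (log =~ /.*Error:(.*)/)
            def errorText = matcher.find() ? matcher.group(1).trim() : null
            matcher = null   // drop the non-serializable Matcher before calling a step
            if (errorText) {
                addShortText(text: errorText)   // Badge plugin equivalent of manager.addShortText
            }
        }
    }
}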

Related

What is the difference between using def and not using def in a script block in a Jenkinsfile?

I have two sample Jenkinsfiles.
The content of A_Jenkinsfile is:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    foo = "bar"
                }
                sh "echo ${foo}"
            }
        }
        stage("two") {
            steps {
                sh "echo ${foo}"
            }
        }
    }
}
The other one is B_Jenkinsfile and its content is:
pipeline {
    agent any
    stages {
        stage("first") {
            steps {
                script {
                    def foo = "bar"
                }
                sh "echo ${foo}"
            }
        }
        stage("two") {
            steps {
                sh "echo ${foo}"
            }
        }
    }
}
When I build them, B_Jenkinsfile fails and A_Jenkinsfile succeeds.
What is the difference between using def and not using def in a script block in a Jenkinsfile?
There are two types of Pipeline syntax: Declarative Pipeline and Scripted Pipeline. A Declarative Pipeline starts with a pipeline {} wrapper and has stages and steps. Declarative Pipeline limits what is available to the user with a stricter, pre-defined structure. Scripted Pipeline, on the other hand, is much closer to plain Groovy, and users have more flexibility in what they can do. When you run something in a script block in a Declarative Pipeline, the script step takes a block of Scripted Pipeline and executes it within the Declarative Pipeline. Basically, it runs a Groovy script for you. So your question can be rephrased as: what does def mean in a Groovy script?
Simply put, in a Groovy script, if you omit the def keyword the variable is added to the current script's binding, so it is treated as a global variable. If you use def, the variable is scoped, and you will only be able to use it in the current script block. There are multiple detailed answers for this here, so I'm not going to repeat them.
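A minimal plain-Groovy illustration of that difference (not Jenkins-specific):
x = 1        // no def: goes into the script binding, visible to anything sharing it
def y = 2    // def: local to this script/block only

println binding.hasVariable('x')   // true
println binding.hasVariable('y')   // false
This is exactly why the sh "echo ${foo}" steps in A_Jenkinsfile can see foo (it lives in the binding), while the def foo in B_Jenkinsfile goes out of scope as soon as its script block ends.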

Pass maven argument in jenkins pipeline shell command

I have a question about the issue below; can someone please help me with it?
I want to pass a Maven property for the pom.xml from the shell step in a Jenkins pipeline, and it needs to be substituted by Maven, not by Groovy or the shell.
Example:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=${db.url} package'
            }
        }
    }
}
Here ${db.url} should be substituted with the url from the Maven settings.xml properties, not by Groovy or the shell in the Jenkins pipeline.
I have tried different combinations, but each gives me an error in the Jenkins pipeline.
If the above Maven property is a constant (some fixed url) then it is easy to pass, but when I want to pass a variable property (${db.url}) I am not able to do so with any syntax.
If you want Maven to evaluate ${db.url}, it has to look like this:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=\\${db.url} package'
            }
        }
    }
}
Now Jenkins will prepare the Maven command as mvn -Doracle.db.url=${db.url} package, leaving the property for Maven to resolve.
If you don't escape the dollar sign, the shell will fail with a Bad substitution error.
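An equivalent escape, as a sketch, if you prefer a double-quoted Groovy string: Groovy collapses \\ to \ and \$ to $, so the shell still receives \${db.url} and passes the literal ${db.url} on to Maven.
sh "mvn -Doracle.db.url=\\\${db.url} package"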

Can I use a Closure to define a stage in a Jenkins Declarative Pipeline?

I'm trying to do something like this:
def makeStage = {
    stage('a') {
        steps {
            echo 'Hello World'
        }
    }
}
pipeline {
    agent none
    stages {
        makeStage()
    }
}
But it gives me this exception:
WorkflowScript: 11: Expected a stage # line 11, column 5.
makeStage()
^
Is it possible to define a stage as an external closure, and if so, how?
Super late, but in case anyone runs into this issue: a possible solution is to wrap your generated stage in a script block and invoke .call() on the generated stage.
So for you:
def makeStage = {
    return {
        stage('a') {
            echo 'Hello World'
        }
    }
}
pipeline {
    agent none
    stages {
        stage('hello world') {
            steps {
                script {
                    makeStage().call()
                }
            }
        }
    }
}
Whoops, edited: I had steps inside my stage('a') in the makeStage declaration. steps is a declarative pipeline directive, so it was throwing an error inside the script block.
You can't define stages outside the declarative pipeline. The main purpose of declarative pipeline is to provide a simplified and opinionated syntax so you can focus on what should be done (by using some of the available steps) and not on how to do it.
If you are interested in a more flexible way of implementing pipelines, you may choose the Scripted Pipeline approach, which is not that strict when it comes to syntax - it's only limited by Groovy and the CPS execution module.
Working (scripted) pipeline from your example would look like this:
#!groovy
def makeStage = {
    stage('a') {
        echo 'Hello World'
    }
}
node {
    makeStage()
}
Attention: there is no steps method inside stage in a scripted pipeline. If you leave it there, you will get:
java.lang.NoSuchMethodError: No such DSL method 'steps' found among
steps [archive, bat, build, catchError, checkout, deleteDir, dir,
dockerFingerprintFrom, ...
Scripts in declarative pipeline
Declarative pipeline defines a script step that allows you to embed a block of scripted pipeline. However, it still does not allow you to define a stage dynamically and/or extract a stage definition into a function or closure. The script step gets executed inside the stage, so from within this block you can't control whether the stage is executed or not. In some cases, however, this step might be very useful if you want to do something more complex than just calling a pre-defined step of a declarative pipeline.
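By contrast, a scripted pipeline can generate stages dynamically, which is the kind of flexibility described above - a minimal sketch:
#!groovy
node {
    // Each element becomes its own stage; this is not possible in declarative syntax.
    for (name in ['build', 'test', 'deploy']) {
        stage(name) {
            echo "running ${name}"
        }
    }
}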

Create new Jenkins jobs using Pipeline Job and Groovy script

I have a Jenkins pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy)
I want to create a new job (with the name name_group_taskNumber) every time I build the main pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the pipeline script look like? :
node {
    stage('Build') {
        def pipeline = load "CreateJob.groovy"
        pipeline.run()
    }
}
You can use and configure a shared library like this one (a git repo): https://github.com/lvthillo/shared-library . You need to configure it in your Jenkins global configuration.
It contains a vars/ folder, where you can manage pipelines and groovy scripts such as my slackNotifier.groovy. That script is just a groovy script that posts the build result to Slack.
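As a rough sketch, such a vars/slackNotifier.groovy could look like this, assuming the Slack Notification plugin is installed (it provides the slackSend step used here):
// vars/slackNotifier.groovy
def call(String buildResult) {
    // Map the build result to a Slack colour and post a message
    def color = (buildResult == 'SUCCESS') ? 'good' : 'danger'
    slackSend(color: color, message: "Build finished with result: ${buildResult}")
}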
In the jenkins pipeline job we will import our shared library:
@Library('name-of-shared-pipeline-library') _
mavenPipeline {
    //define parameters
}
In the case above the pipeline itself is also in the shared library, but this isn't necessary.
You can just write your pipeline in the job itself and only call the function from the pipeline, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
    echo "Hello, ${name}."
}
And in your pipeline:
library 'my-shared-library'
...
stage('stage name') {
    echo "output"
    sayHello('Peter')
}
...
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it by using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): add another build step in the pipeline script and pass in the parameters that need to be considered.
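Another common way to create the name_group_taskNumber job from the main pipeline is the Job DSL plugin, which provides a jobDsl step. A minimal sketch, assuming that plugin is installed and the parameters from the question exist (the jobDsl script may need approval depending on your security settings):
pipeline {
    agent any
    parameters {
        string(name: 'name', defaultValue: 'demo')
        string(name: 'group', defaultValue: 'grp')
        string(name: 'taskNumber', defaultValue: '1')
    }
    stages {
        stage('create job') {
            steps {
                // jobDsl is provided by the Job DSL plugin
                jobDsl scriptText: """
                    pipelineJob('${params.name}_${params.group}_${params.taskNumber}') {
                        definition {
                            cps {
                                script('node { echo "generated job" }')
                            }
                        }
                    }
                """
            }
        }
    }
}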

Jenkins: "Execute system groovy script" in a pipeline step (SCM committed)

Is there a way to use the Jenkins "Execute system groovy script" step from a pipeline file which is committed to SCM?
If yes, how would I access the predefined variables (like build) in it ?
If no, would I be able to replicate the functionality otherwise, using for example the Shared Library Plugin ?
Thanks !
You can put groovy code in a pipeline in an (always source-controlled) Jenkinsfile, like this:
pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    // i used a script block because you can jam arbitrary groovy in here
                    // without being constrained by the declarative Jenkinsfile DSL
                    def awesomeVar = 'so_true'
                    print "look at this: ${awesomeVar}"
                    // accessing a predefined variable:
                    echo "currentBuild.number: ${currentBuild.number}"
                }
            }
        }
    }
}
Produces console log:
[Pipeline] echo
look at this: so_true
[Pipeline] echo
currentBuild.number: 84
Click on the "Pipeline Syntax" link in the left navigation of any pipeline job to get a bunch of examples of things you can access in the "Global Variables Reference."
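For the specific question about the build variable from "Execute system groovy script", a hedged sketch of the closest pipeline equivalents - currentBuild.rawBuild is the underlying hudson.model.Run object and typically requires in-process script approval:
script {
    // Whitelisted, no approval needed:
    echo "result so far: ${currentBuild.currentResult}"
    echo "job: ${env.JOB_NAME}, build number: ${env.BUILD_NUMBER}"

    // Roughly what 'build' gave you in a system groovy script
    // (needs script approval on most installations):
    def run = currentBuild.rawBuild
    echo "display name: ${run.displayName}"
}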
