In a Jenkins pipeline I need to run multiple commands in a loop, each in a different directory
preBuild = ['test\testSchemas\testSchemas', 'Common\Common']
stage('PreBuild') {
    when { expression { pipelineParams.preBuild != null } }
    steps {
        script {
            pipelineParams.preBuild.each {
                dir(pipelineParams.preBuild.${it}) {
                    executeShellCommand("\"${env.NUGET}\" restore -source ${env.NEXUS_SERVER_URL}/repository/nuget-group/ -PackagesDirectory \"Packages\" ${it}")
                    executeShellCommand("\"${buildTool}\" ${pipelineParams.buildCommand} ${it}")
                }
            }
        }
    }
}
I am getting an error due to some syntax problem. Can you please help with this? I am new to Jenkinsfiles.
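The problem is the line dir(pipelineParams.preBuild.${it}): that is not valid Groovy. Inside each, it already holds the current list element (the directory path), so it can be passed straight to dir. Note also that backslashes in single-quoted Groovy strings must be doubled ('test\\testSchemas\\testSchemas'), otherwise \t is read as a tab character. A minimal corrected sketch, assuming executeShellCommand, pipelineParams, buildTool, and the environment variables are defined elsewhere in your setup:

preBuild = ['test\\testSchemas\\testSchemas', 'Common\\Common']

stage('PreBuild') {
    when { expression { pipelineParams.preBuild != null } }
    steps {
        script {
            pipelineParams.preBuild.each { dirPath ->
                // dirPath is the directory itself, e.g. 'Common\\Common'
                dir(dirPath) {
                    executeShellCommand("\"${env.NUGET}\" restore -source ${env.NEXUS_SERVER_URL}/repository/nuget-group/ -PackagesDirectory \"Packages\" ${dirPath}")
                    executeShellCommand("\"${buildTool}\" ${pipelineParams.buildCommand} ${dirPath}")
                }
            }
        }
    }
}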
I have a Groovy script to set up a scheduled job in Jenkins.
I want to execute some shell scripts on a failed build.
If I add the scripts manually after the job has been created or updated by the Groovy script, they run.
But the Groovy script does not add them:
job('TestingAnalysis') {
    triggers {
        cron('H 8 28 * *')
    }
    steps {
        shell('some jiberish to create error')
    }
    publishers {
        postBuildScripts {
            steps {
                shell('echo "fff"')
                shell('echo "FFDFDF"')
            }
            onlyIfBuildSucceeds(false)
            onlyIfBuildFails(true)
        }
        retryBuild {
            rerunIfUnstable()
            retryLimit(3)
            fixedDelay(600)
        }
    }
}
Everything works fine except:
postBuildScripts {
    steps {
        shell('echo "fff"')
        shell('echo "FFDFDF"')
    }
    onlyIfBuildSucceeds(false)
    onlyIfBuildFails(true)
}
This is my result: the post-build steps do not show up in the job.
I tried postBuildSteps and also got an error.
I also tried the following, again with an error:
postBuildScripts {
    steps {
        sh' echo "ggg" '
    }
    onlyIfBuildSucceeds(false)
    onlyIfBuildFails(true)
}
Take a look at JENKINS-66189: there is an issue with version 3.0 of the PostBuildScript plugin in which the old syntax (which you are using) is no longer supported. In order to use the new version in a Job DSL script you will need to use the Dynamic DSL syntax.
Use the following link in your own Jenkins instance to see the correct usage:
YOUR_JENKINS_URL/plugin/job-dsl/api-viewer/index.html#path/freeStyleJob-publishers-postBuildScript
It will help you build the correct command. In your case it will be:
job('TestingAnalysis') {
    triggers {
        cron('H 8 28 * *')
    }
    steps {
        shell('some jiberish to create error')
    }
    publishers {
        postBuildScript {
            buildSteps {
                postBuildStep {
                    stopOnFailure(false) // Mandatory setting
                    results(['FAILURE']) // Replaces onlyIfBuildFails(true)
                    buildSteps {
                        shell {
                            command('echo "fff"')
                        }
                        shell {
                            command('echo "FFDFDF"')
                        }
                    }
                }
            }
            markBuildUnstable(false) // Mandatory setting
        }
    }
}
Notice that instead of using functions like onlyIfBuildSucceeds and onlyIfBuildFails, you now just pass a list of the relevant build results to the results function (SUCCESS, UNSTABLE, FAILURE, NOT_BUILT, ABORTED).
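For example, to run the post-build shell steps for both failed and unstable builds, only the results line would change:

results(['FAILURE', 'UNSTABLE']) // run the shell steps on failure or instability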
This question ties in with one of my earlier questions here.
Tl;dr of the linked question:
Basically I want a generic pipeline to generate distributable bundles (zip files etc) of any of my applications. An application can have multiple components (almost all components are Java/Spring or NodeJS projects).
The plan was to store a pipeline descriptor for each application in a JSON file like this:
{
    "app": "MyApp",
    "components": [
        {
            "name": "Component1",
            "scmUrl": "https://git.mycompany.com/app/component1.git",
            "buildCmd": "mvn clean install"
        },
        {
            "name": "Component2",
            "scmUrl": "https://git.mycompany.com/app/component2.git",
            "buildCmd": "npm run build"
        }
    ]
}
There will be a descriptor for each application, and they will all be checked into a separate repository.
When the pipeline runs, the application name is a required input parameter; the descriptor repo is cloned and the respective JSON descriptor is loaded.
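A sketch of that loading step, assuming the Pipeline Utility Steps plugin (for readJSON) and a hypothetical descriptors/<app>.json layout in the cloned repo:

// Hypothetical: parse the descriptor after cloning the descriptor repository
def descriptor = readJSON file: "descriptors/${params.APP_NAME}.json"
def components = descriptor.components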
This is where things start to get tricky. All components share some common stages (Checkout, Build, Docker Build), so I am trying to loop over the components array and run the stages in parallel:
def parallelCheckoutStages = components.collectEntries {
    [ "Checkout ${it.name}", generateCheckoutStage(it) ]
}

def generateCheckoutStage(component) {
    return {
        stage("Stage: ${component.name}") {
            script {
                git(url: component.scmUrl, branch: component.branch)
            }
        }
    }
}

def parallelBuildStages = components.collectEntries {
    [ "Build ${it.name}", generateBuildStage(it) ]
}

def generateBuildStage(component) {
    return {
        stage("Stage: ${component.name}") {
            script {
                sh(script: "${component.buildCmd}")
            }
        }
    }
}
pipeline {
    agent any
    stages {
        // ... clone repo and load JSON ...
        stage("Checkout Components") {
            steps {
                script {
                    parallel parallelCheckoutStages
                }
            }
        }
        stage("Build Components") {
            steps {
                script {
                    parallel parallelBuildStages
                }
            }
        }
    }
}
Sometimes I need to run the build command inside a Docker container (only for some components). To accomplish this I want to change the generateBuildStage method to something like this:
def generateBuildStage(component) {
    if (component.requiresDocker) {
        return {
            stage("Stage: ${component.name}") {
                agent {
                    docker {
                        image 'jdk11-mvn3.6'
                    }
                }
                script {
                    sh(script: "${component.buildCmd}")
                }
            }
        }
    } else {
        return {
            stage("Stage: ${component.name}") {
                script {
                    sh(script: "${component.buildCmd}")
                }
            }
        }
    }
}
When I run the above code I get the error java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps.
As a sort of second part to my question: does my pipeline seem hacky? I could replace the parallel stages entirely by creating individual jobs for each component and calling them from the pipeline. Which approach is better?
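As to the error: the agent directive is declarative-only syntax and is not available inside the scripted closures that parallel executes, which is why 'No such DSL method agent' is thrown. A minimal sketch of a scripted alternative, assuming the same component fields as above, uses docker.image(...).inside instead:

def generateBuildStage(component) {
    return {
        stage("Stage: ${component.name}") {
            if (component.requiresDocker) {
                // Scripted equivalent of the declarative docker agent
                docker.image('jdk11-mvn3.6').inside {
                    sh(script: "${component.buildCmd}")
                }
            } else {
                sh(script: "${component.buildCmd}")
            }
        }
    }
}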
I am new to Jenkins and I am trying to build a declarative pipeline according to the tutorial.
On the page https://jenkins.io/doc/book/pipeline/syntax/#matrix-cell-directives
there is an example of how to build a pipeline with a matrix, which I tried.
Unfortunately I get the following error:
WorkflowScript: 32: Unknown stage section "matrix". Starting with version 0.5, steps in a stage must be in a 'steps' block. @ line 32, column 5.
   stage ('Deploy NB') {
   ^
WorkflowScript: 32: Expected one of "steps", "stages", or "parallel" for stage "Deploy NB" @ line 32, column 5.
   stage ('Deploy NB') {
My pipeline in the Jenkinsfile looks like this (the functions from the lib are certainly fine; they are used in several other Jenkinsfiles that run without problems):
pipeline {
    agent {
        node {
            label ""
            // Location of the output files
            customWorkspace "/home/wf/builds/${env.JOB_NAME}"
        }
    }
    environment {
        // mail addresses that get notifications about failures, success etc. - comma delimited
        MAIL_NOTIFY = "mustbeanonymous"
        // Server admin (not necessary for WildFly)
        ADMIN_USER = " "
        ADMIN_PWD = " "
        // home directory
        HOME_DIR = "/home/wf"
        // Product name
        PRODUCT_NAME = "MYPRD"
    }
    options {
        disableConcurrentBuilds()
        durabilityHint("PERFORMANCE_OPTIMIZED")
    }
    stages {
        stage('Deploy NB') {
            matrix {
                axes {
                    axis {
                        name 'ENVIRONMENT'
                        values 'NB', 'TEST1'
                    }
                    axis {
                        name 'DATABASE'
                        values 'ORA', 'ORA_INIT', 'DB2', 'DB2_INIT'
                    }
                }
                environment {
                    // Server scripts installation path
                    SERVER_PATH = "${HOME_DIR}/WildFly16_${PRODUCT_NAME}_${ENVIRONMENT}_${DATABASE}"
                    // EAR to deploy on server
                    DEPLOY_EAR = "${PRODUCT_NAME}_WF_${DATABASE}.ear"
                }
                stages {
                    /* BUILD */
                    stage('Init tools') {
                        steps {
                            script {
                                def lib = load "${workspace}/build/Jenkinsfile.lib"
                                lib.initTools()
                            }
                        }
                    }
                    stage('Copy Deployment') {
                        steps {
                            script {
                                def lib = load "${workspace}/build/Jenkinsfile.lib"
                                lib.copyDeployment()
                            }
                        }
                    }
                    /* DEPLOY */
                    stage('Install EAR') {
                        steps {
                            script {
                                def lib = load "${workspace}/build/Jenkinsfile.lib"
                                lib.installEARDeploy()
                            }
                        }
                    }
                }
            }
        }
    }
    /* POST PROCESSING */
    post {
        success {
            script {
                def lib = load "${workspace}/build/Jenkinsfile.lib"
                lib.onSuccess()
            }
        }
        failure {
            script {
                def lib = load "${workspace}/build/Jenkinsfile.lib"
                lib.onFailure()
            }
        }
        unstable {
            script {
                def lib = load "${workspace}/build/Jenkinsfile.lib"
                lib.onUnstable()
            }
        }
        always {
            script {
                def lib = load "${workspace}/build/Jenkinsfile.lib"
                lib.onAlways()
            }
        }
    }
}
What I am trying to achieve is that the pipeline runs the stages for every combination of ENVIRONMENT and DATABASE (each matrix cell). But where did I make a mistake?
I use Jenkins 2.198.
Update: The solution was to upgrade the plugin to version 1.5.0 or above. See the accepted answer for more information.
Which version of Declarative Pipeline do you use?
The matrix section was only added in version 1.5.0 of the Declarative Pipeline plugin.
See https://github.com/jenkinsci/pipeline-model-definition-plugin/releases
To verify the version, search for pipeline-model-definition on jenkins.yourcompany.com/pluginManager/api/xml?depth=1
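Another quick way to check, as a sketch (run in Manage Jenkins -> Script Console; assumes you have admin access):

// Prints the installed version of the Declarative Pipeline plugin
println(Jenkins.instance.pluginManager.getPlugin('pipeline-model-definition')?.version)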
I want to be able to select whether a pipeline stage is executed with the dockerfile agent, depending on the presence of a Dockerfile in the repository. If there is no Dockerfile, the stage should run locally.
I tried something like:
pipeline {
    stage('AwesomeStage') {
        when {
            beforeAgent true
            expression { return fileExists("Dockerfile") }
        }
        agent { dockerfile }
        steps {
            // long list of awesome steps that should be run either in Docker or locally,
            // depending on the presence of a Dockerfile
        }
    }
}
But the result is that the whole stage is skipped when there's no Dockerfile.
Is it possible to do something like the following block?
//...
if (fileExists("Dockerfile")) {
    agent { dockerfile }
} else {
    agent none
}
//...
I came up with this solution, which relies on defining a function to avoid repetition and defines two different stages according to the type of agent.
If anyone has a more elegant solution, please let me know.
def awesomeScript() {
    // long list of awesome steps that should be run either in Docker or locally,
    // depending on the presence of a Dockerfile
}

pipeline {
    stage('AwesomeStageDockerfile') {
        when {
            beforeAgent true
            expression { return fileExists("Dockerfile") }
        }
        agent { dockerfile }
        steps {
            awesomeScript()
        }
    }
    stage('AwesomeStageLocal') {
        when {
            beforeAgent true
            expression { return !fileExists("Dockerfile") }
        }
        agent none
        steps {
            awesomeScript()
        }
    }
}
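A possibly simpler variant, as a sketch only: a single stage that decides at runtime inside a script block. This assumes the Docker Pipeline plugin and uses a hypothetical image tag, local-build-image:

stage('AwesomeStage') {
    steps {
        script {
            if (fileExists('Dockerfile')) {
                // Build the repository's Dockerfile and run the steps inside the container
                docker.build('local-build-image').inside {
                    awesomeScript()
                }
            } else {
                awesomeScript()
            }
        }
    }
}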
I have the Groovy script below for my Jenkins pipeline, but when running it I get an error saying a step was expected, even though my script already has steps. Can anyone suggest what's wrong here?
Script file:
pipeline {
    agent any
    stages {
        stage('Workspace Preparation') {
            steps {
                sh """
                    rm -rf ${workspace}/*
                """
            }
        }
        stage('Get Deployment files') {
            steps {
                dir("${workspace}/deployfiles") {
                    if ("${params.componentType}" == "A") {
                        echo "A component deployment"
                        checkout(## necessary step)
                    }
                    else if ("${params.componentType}" == "B") {
                        echo "B component deployment"
                        checkout(## necessary step)
                    }
                    else {
                        echo "Invalid"
                    }
                }
            }
        }
    }
}
Getting the error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 19: Expected a step @ line 14, column 6.
   if("${params.componentType}"=="A") {
   ^
You are missing a script block (source).
Such a block gives you access to execute Groovy code (for, if-else, etc.):
stage('Check') {
    steps {
        script { // Allows executing Groovy code
            dir(...) {
                if (...) { ... }
            }
        }
    }
}
See also: How to fix Pipeline-Script “Expected a step” error
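Applied to the pipeline above, the failing stage would look something like this sketch (the checkout placeholders still have to be filled in):

stage('Get Deployment files') {
    steps {
        script { // the script block makes if/else legal here
            dir("${workspace}/deployfiles") {
                if (params.componentType == "A") {
                    echo "A component deployment"
                    // checkout(...) for component A
                } else if (params.componentType == "B") {
                    echo "B component deployment"
                    // checkout(...) for component B
                } else {
                    echo "Invalid"
                }
            }
        }
    }
}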