Can someone help me convert the Jenkins scripted pipeline below to a declarative pipeline?
node('agent') {
    if ( ! "${GIT_BRANCH}".isEmpty()) {
        branch = "${GIT_BRANCH}"
    } else {
        echo 'The git branch is not provided, exiting..'
        sh 'exit 1'
    }
    version = extract_version("${GIT_BRANCH}")
    if ( "${GIT_BRANCH}".contains("feature")) {
        currentBuild.displayName = "${version}-SNAPSHOT-${env.BUILD_ID}"
    } else {
        currentBuild.displayName = "${version}-${env.BUILD_ID}"
    }
}
I am trying to check whether a git branch has been provided and set the Jenkins build display name dynamically based on the git branch.
pipeline {
    agent {
        label 'agent'
    }
    stages {
        stage('stage1') {
            steps {
                script {
                    if ( ! "${GIT_BRANCH}".isEmpty()) {
                        branch = "${GIT_BRANCH}"
                    } else {
                        echo 'The git branch is not provided, exiting..'
                        sh 'exit 1'
                    }
                    version = extract_version("${GIT_BRANCH}")
                    if ( "${GIT_BRANCH}".contains("feature")) {
                        currentBuild.displayName = "${version}-SNAPSHOT-${env.BUILD_ID}"
                    } else {
                        currentBuild.displayName = "${version}-${env.BUILD_ID}"
                    }
                }
            }
        }
    }
}
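If the only purpose of sh 'exit 1' is to fail the build, the built-in error step is a slightly cleaner way to do it. A minimal sketch of that variant, assuming GIT_BRANCH arrives as a parameter or environment variable and extract_version is defined elsewhere in the Jenkinsfile:

pipeline {
    agent { label 'agent' }
    stages {
        stage('set display name') {
            steps {
                script {
                    // Fail fast when no branch is supplied (assumes GIT_BRANCH is a parameter or env var)
                    if ("${GIT_BRANCH}".isEmpty()) {
                        error 'The git branch is not provided, exiting..'
                    }
                    def version = extract_version("${GIT_BRANCH}")   // extract_version is assumed to exist elsewhere
                    def suffix = "${GIT_BRANCH}".contains('feature') ? '-SNAPSHOT' : ''
                    currentBuild.displayName = "${version}${suffix}-${env.BUILD_ID}"
                }
            }
        }
    }
}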
node('Docker3') {
    try {
        cleanWs()
        wrap([$class: 'AnsiColorBuildWrapper', 'colorMapName': 'XTerm']) {
            git branch: "${params.branch}", url: "${params.scmUrl}"
            gitCommitHash = sh(script: "git log -n 1 --pretty=format:'%H'", returnStdout: true)
            load("jenkins_modules/moduleMain.groovy")
        }
    } catch (all) {
        print(all)
    }
}
I have this code which I am using in a Jenkinsfile, but I want to schedule the Jenkins job from the script. Can anyone please tell me how I can combine the above code with the following code?
pipeline {
    agent any
    triggers {
        cron('H * * * *')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
I tried combining the above code in the following way,
pipeline {
    agent any
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}
node('Docker3') {
    try {
        cleanWs()
        wrap([$class: 'AnsiColorBuildWrapper', 'colorMapName': 'XTerm']) {
            git branch: "${params.branch}", url: "${params.scmUrl}"
            gitCommitHash = sh(script: "git log -n 1 --pretty=format:'%H'", returnStdout: true)
            load("jenkins_modules/moduleMain.groovy")
        }
    } catch (all) {
        print(all)
    }
}
It is working, but I don't think it is the correct way.
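One way to keep this in a single declarative Jenkinsfile, rather than appending a node block after the pipeline block, is to move the scripted logic into a script step inside a stage so the cron trigger and the Docker3 work live in the same job. A rough sketch under that assumption (params.branch, params.scmUrl and jenkins_modules/moduleMain.groovy are taken from the snippet above and would need matching job parameters):

pipeline {
    agent { label 'Docker3' }
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Checkout and run modules') {
            steps {
                cleanWs()
                wrap([$class: 'AnsiColorBuildWrapper', 'colorMapName': 'XTerm']) {
                    git branch: "${params.branch}", url: "${params.scmUrl}"
                    script {
                        // keep the commit hash and load the shared module, as in the scripted version
                        gitCommitHash = sh(script: "git log -n 1 --pretty=format:'%H'", returnStdout: true)
                        load('jenkins_modules/moduleMain.groovy')
                    }
                }
            }
        }
    }
}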
I want to execute multiple jobs from a single pipeline in parallel using declarative syntax. Is this possible? I know we can make a declarative parallel pipeline using the "parallel" directive.
pipeline {
    agent any
    parallel {
        stages {
            stage('Test1') {
                steps {
                    sh 'pip install -r requirements.txt'
                }
            }
            stage('Test2') {
                steps {
                    echo 'Stage 2'
                    sh 'behave -f allure_behave.formatter:AllureFormatter -o allure-results features/scenarios/**/*.feature'
                }
            }
            stage('Test3') {
                steps {
                    script {
                        allure([
                            includeProperties: false,
                            jdk: '',
                            properties: [],
                            reportBuildPolicy: 'ALWAYS',
                            results: [[path: 'allure-results']]
                        ])
                    }
                }
            }
        }
    }
}
The image below shows the flow that I want. Any approach for how to do it?
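For reference, in declarative syntax a parallel block has to be nested inside a stage rather than sitting directly under pipeline; a minimal sketch of that structure, using build steps to trigger downstream jobs (the job names jobA and jobB are placeholders):

pipeline {
    agent any
    stages {
        stage('Run downstream jobs') {
            parallel {
                stage('Job A') {
                    steps {
                        build job: 'jobA', wait: true   // 'jobA' is a placeholder job name
                    }
                }
                stage('Job B') {
                    steps {
                        build job: 'jobB', wait: true   // 'jobB' is a placeholder job name
                    }
                }
            }
        }
    }
}

The matrix-based answer below generalizes this to a whole list of downstream projects.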
// Pipeline project: SO-69680107-1-parallel-downstream-jobs-matrix
pipeline {
    agent any
    stages {
        stage('Clean Workspace') {
            steps {
                cleanWs()
            }
        }
        stage('Job matrix') {
            matrix {
                axes {
                    axis {
                        name 'job'
                        values 'SO-69680107-2', 'SO-69680107-3', 'SO-69680107-k' // , ...
                    }
                }
                stages {
                    stage('Run job') {
                        steps {
                            build "$job"
                            copyFiles( "$WORKSPACE\\..\\$job", "$WORKSPACE")
                        }
                    } // stage 'Run job'
                }
            } // matrix
        } // stage 'Job matrix'
        stage('List upstream workspace') {
            steps {
                bat "@dir /b \"$WORKSPACE\""
            }
        }
    } // stages
}

def copyFiles( downstreamWorkspace, upstreamWorkspace ) {
    dir("$downstreamWorkspace") {
        bat """
            @set prompt=\$g\$s
            @echo Begin: %time%
            dir /b
            xcopy /f *.* \"$upstreamWorkspace\\\"
            @echo End: %time%
        """
    }
}
Template for downstream projects SO-69680107-2, SO-69680107-3, SO-69680107-k:
// Pipeline project: SO-69680107-X
pipeline {
    agent any
    stages {
        stage('Stage X') {
            steps {
                sh 'set +x; echo "Step X" | tee SO-69680107-X.log; date; sleep 3; date'
            }
        }
    }
}
I have a very simple pipeline. Everything is defined in my pom.xml files and .m2/settings.xml. I want to mark my build as unstable in Jenkins when SonarQube's Quality Gate fails. Here is what I did, but I get several errors like "expected }". Does anyone know how this works?
Note that the environment part is optional.
Thank you.
pipeline {
    agent {
        label "master"
    }
    tools {
        // Note: this should match with the tool name configured in your jenkins instance (JENKINS_URL/configureTools/)
        maven "Maven 3.6.0"
        jdk 'Java 1.8'
    }
    environment {
        // This can be nexus3 or nexus2
        NEXUS_VERSION = "nexus3"
        // This can be http or https
        NEXUS_PROTOCOL = "http"
        // Where your Nexus is running
        NEXUS_URL = "192.168.1.8:8081"
        // Repository where we will upload the artifact
        NEXUS_REPOSITORY = "repository-example"
        // Jenkins credential id to authenticate to Nexus OSS
        NEXUS_CREDENTIAL_ID = "nexus-credentials"
    }
    stages {
        stage ('Initialize') {
            steps {
                sh '''
                    echo "PATH = ${PATH}"
                    echo "M2_HOME = ${M2_HOME}"
                '''
            }
        }
        stage("mvn clean deploy") {
            steps {
                script {
                    // If you are using Windows then you should use "bat" step
                    // Since unit testing is out of the scope we skip them
                    sh "mvn -B clean deploy"
                }
            }
        }
        stage ("SonarQube check") {
            steps {
                script {
                    sh 'mvn -B sonar:sonar'
                }
                step {
                    qualitygate = waitForQualityGate()
                    if (qualitygate.status != "OK") {
                        currentBuild.result = "UNSTABLE"
                    }
                }
            }
        }
    }
}
You need to wrap the quality gate logic inside a script block, as shown below:
stage ("SonarQube check") {
steps {
script {
sh 'mvn -B sonar:sonar'
qualitygate = waitForQualityGate()
if (qualitygate.status != "OK") {
currentBuild.result = "UNSTABLE"
}
}
}
}
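Note that waitForQualityGate normally requires the analysis to have been run inside withSonarQubeEnv (plus a webhook from SonarQube back to Jenkins) so the plugin knows which analysis task to wait for. A sketch under those assumptions; the server name 'SonarQube' and the 10-minute timeout are placeholders:

stage ("SonarQube check") {
    steps {
        script {
            withSonarQubeEnv('SonarQube') {           // 'SonarQube' must match the server name configured in Jenkins (assumption)
                sh 'mvn -B sonar:sonar'
            }
            timeout(time: 10, unit: 'MINUTES') {      // don't wait forever for the quality gate webhook
                def qualitygate = waitForQualityGate()
                if (qualitygate.status != "OK") {
                    currentBuild.result = "UNSTABLE"
                }
            }
        }
    }
}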
As far as declarative pipelines go in Jenkins, I'm having trouble with the when keyword.
I keep getting the error No such DSL method 'when' found among steps. I'm sort of new to Jenkins 2 declarative pipelines and don't think I am mixing up scripted pipelines with declarative ones.
The goal of this pipeline is to run mvn deploy after a successful Sonar run and send out mail notifications of a failure or success. I only want the artifacts to be deployed when on master or a release branch.
The part I'm having difficulties with is in the post section. The Notifications stage is working great. Note that I got this to work without the when clause, but really need it or an equivalent.
pipeline {
    agent any
    tools {
        maven 'M3'
        jdk 'JDK8'
    }
    stages {
        stage('Notifications') {
            steps {
                sh 'mkdir tmpPom'
                sh 'mv pom.xml tmpPom/pom.xml'
                checkout([$class: 'GitSCM', branches: [[name: 'origin/master']], doGenerateSubmoduleConfigurations: false, submoduleCfg: [], userRemoteConfigs: [[url: 'https://repository.git']]])
                sh 'mvn clean test'
                sh 'rm pom.xml'
                sh 'mv tmpPom/pom.xml ../pom.xml'
            }
        }
    }
    post {
        success {
            script {
                currentBuild.result = 'SUCCESS'
            }
            when {
                branch 'master|release/*'
            }
            steps {
                sh 'mvn deploy'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result,
            )
        }
        failure {
            script {
                currentBuild.result = 'FAILURE'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result
            )
        }
    }
}
In the documentation of declarative pipelines, it's mentioned that you can't use when in the post block. when is allowed only inside a stage directive.
So what you can do is test the conditions using an if in a script:
post {
    success {
        script {
            if (env.BRANCH_NAME == 'master')
                currentBuild.result = 'SUCCESS'
        }
    }
    // failure block
}
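Applied to the original goal (deploy only from master or a release branch, then notify), the same pattern looks roughly like this; a sketch, assuming recipients and sendNotification are defined as in the question and that BRANCH_NAME is populated (e.g. in a multibranch job):

post {
    success {
        script {
            currentBuild.result = 'SUCCESS'
            // Deploy only from master or release/* branches
            if (env.BRANCH_NAME == 'master' || env.BRANCH_NAME.startsWith('release/')) {
                sh 'mvn deploy'
            }
            sendNotification(recipients, null, 'https://link.to.sonar', currentBuild.result)
        }
    }
    failure {
        script {
            currentBuild.result = 'FAILURE'
            sendNotification(recipients, null, 'https://link.to.sonar', currentBuild.result)
        }
    }
}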
Using a GitHub Repository and the Pipeline plugin I have something along these lines:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh '''
                    make
                '''
            }
        }
    }
    post {
        always {
            sh '''
                make clean
            '''
        }
        success {
            script {
                if (env.BRANCH_NAME == 'master') {
                    emailext (
                        to: 'engineers@green-planet.com',
                        subject: "${env.JOB_NAME} #${env.BUILD_NUMBER} master is fine",
                        body: "The master build is happy.\n\nConsole: ${env.BUILD_URL}.\n\n",
                        attachLog: true,
                    )
                } else if (env.BRANCH_NAME.startsWith('PR')) {
                    // also send email to tell people their PR status
                } else {
                    // this is some other branch
                }
            }
        }
    }
}
And that way, notifications can be sent based on the type of branch being built. See the pipeline model definition and also the global variable reference available on your server at http://your-jenkins-ip:8080/pipeline-syntax/globals#env for details.
Ran into the same issue with post. Worked around it by annotating the variable with @groovy.transform.Field. This was based on info I found in the Jenkins docs for defining global variables.
e.g.
#!groovy
pipeline {
    agent none
    stages {
        stage("Validate") {
            parallel {
                stage("Ubuntu") {
                    agent {
                        label "TEST_MACHINE"
                    }
                    steps {
                        sh "run tests command"
                        recordFailures('Ubuntu', 'test-results.xml')
                        junit 'test-results.xml'
                    }
                }
            }
        }
    }
    post {
        unsuccessful {
            notify()
        }
    }
}

// Make testFailures global so it can be accessed from a 'post' step
@groovy.transform.Field
def testFailures = [:]

def recordFailures(key, resultsFile) {
    def failures = ... parse test-results.xml script for failures ...
    if (failures) {
        testFailures[key] = failures
    }
}

def notify() {
    if (testFailures) {
        ... do something here ...
    }
}
I am creating a sample Jenkins pipeline; here is the code.
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                sh 'echo hello'
            }
        }
        stage('test1') {
            steps {
                sh 'echo $TEST'
            }
        }
        stage('test3') {
            if (env.BRANCH_NAME == 'master') {
                echo 'I only execute on the master branch'
            } else {
                echo 'I execute elsewhere'
            }
        }
    }
}
This pipeline fails with the following error logs:
Started by user admin
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 15: Not a valid stage section definition: "if (env.BRANCH_NAME == 'master') {
echo 'I only execute on the master branch'
} else {
echo 'I execute elsewhere'
}". Some extra configuration is required. # line 15, column 9.
stage('test3') {
^
WorkflowScript: 15: Nothing to execute within stage "test3" @ line 15, column 9.
stage('test3') {
^
But when I execute the following example from this URL, it executes successfully and prints the else part.
node {
    stage('Example') {
        if (env.BRANCH_NAME == 'master') {
            echo 'I only execute on the master branch'
        } else {
            echo 'I execute elsewhere'
        }
    }
}
The only difference I can see is that the working example has no stages block, but mine does.
What is wrong here? Can anyone please suggest?
Your first try is using declarative pipelines, and the second working one is using scripted pipelines. You need to enclose steps in a steps declaration, and you can't use if as a top-level step in declarative, so you need to wrap it in a script step. Here's a working declarative version:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                sh 'echo hello'
            }
        }
        stage('test1') {
            steps {
                sh 'echo $TEST'
            }
        }
        stage('test3') {
            steps {
                script {
                    if (env.BRANCH_NAME == 'master') {
                        echo 'I only execute on the master branch'
                    } else {
                        echo 'I execute elsewhere'
                    }
                }
            }
        }
    }
}
You can simplify this and potentially avoid the if statement (as long as you don't need the else) by using when. See "when directive" at https://jenkins.io/doc/book/pipeline/syntax/. You can also validate Jenkinsfiles using the Jenkins REST API. It's super sweet. Have fun with declarative pipelines in Jenkins!
It requires a bit of rearranging, but when does a good job of replacing the conditionals above. Here's the example from above written using the declarative syntax. Note that the test3 stage is now two different stages: one that runs on the master branch and one that runs on anything else.
stage ('Test 3: Master') {
    when { branch 'master' }
    steps {
        echo 'I only execute on the master branch.'
    }
}
stage ('Test 3: Dev') {
    when { not { branch 'master' } }
    steps {
        echo 'I execute on non-master branches.'
    }
}
If you want to execute a stage only when a condition based on an expression holds, you can use the when keyword:
stage ('test3') {
    when { expression { return env.BRANCH_NAME == 'master' } }
    steps {
        echo 'I only execute on the master branch.'
    }
}
With the expression keyword you can add any condition, e.g. if a stage depends on a file generated in the workspace:
stage ('File Dependent stage') {
    when { expression { return fileExists('myfile') } }
    steps {
        echo "file exists"
    }
}
// Trigger the downstream job 'k8s-core-user_deploy' with parameters when build_deploy is set
if ( params.build_deploy == '1' ) {
    println "build_deploy is ${params.build_deploy}"
    jobB = build job: 'k8s-core-user_deploy', propagate: false, wait: true, parameters: [
        string(name: 'environment', value: "${params.environment}"),
        string(name: 'branch_name', value: "${params.branch_name}"),
        string(name: 'service_name', value: "${params.service_name}"),
    ]
    println jobB.getResult()
}
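If this snippet needs to live inside a declarative pipeline, it can sit in a script step; a minimal sketch, assuming build_deploy, environment, branch_name and service_name are job parameters and k8s-core-user_deploy exists as in the snippet above:

pipeline {
    agent any
    parameters {
        string(name: 'build_deploy', defaultValue: '0', description: 'Set to 1 to trigger the deploy job')
        string(name: 'environment', defaultValue: '')
        string(name: 'branch_name', defaultValue: '')
        string(name: 'service_name', defaultValue: '')
    }
    stages {
        stage('Trigger deploy') {
            steps {
                script {
                    if (params.build_deploy == '1') {
                        // propagate: false keeps this job green even if the downstream job fails
                        def jobB = build job: 'k8s-core-user_deploy', propagate: false, wait: true, parameters: [
                            string(name: 'environment', value: params.environment),
                            string(name: 'branch_name', value: params.branch_name),
                            string(name: 'service_name', value: params.service_name),
                        ]
                        echo "Downstream result: ${jobB.getResult()}"
                    }
                }
            }
        }
    }
}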