groovy.lang.MissingPropertyException: No such property: failedjob - jenkins

I'm trying to implement e-mail notification and I'd appreciate any help. I'm trying to use the emailext plugin, but I don't know how to make the subject contain the build result.
This is my Jenkinsfile:
node('G') {
    properties([disableConcurrentBuilds()])
    boolean failedJob = 'null'
    stage('Checkout') { checkout scm }
    stage('Install dependencies') { whatever }
    stage('Packaging Development') { whatever }
    stage('Notify') {
        notify(failedjob)
    }
} //end node
def notify(failedJob) {
    def subject, body
    def notifyEmail = "example@hotmail.com"
    if (failedJob) {
        subject = "Failed Deployment. Job Name: " + env.JOB_NAME + " Build: " + env.BUILD_NUMBER
        body = "<p>A build triggered in Jenkins has failed in the " + env.STAGE_NAME + " stage.</p>" + "<p>Check console output at Jenkins"
    }
    else {
        subject = "Successful Deployment. Job Name: " + env.JOB_NAME + " Build: " + env.BUILD_NUMBER
        body = "<p>A build triggered in Jenkins has succeeded.</p>" + "<p>Check console output at Jenkins"
    }
    emailext(attachLog: true, compressLog: false, mimeType: 'text/html', subject: subject, body: body, from: 'Jenkins@jenkins.com', to: notifyEmail)
}
These are the logs:
groovy.lang.MissingPropertyException: No such property: failedjob for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:270)
at org.kohsuke.groovy.sandbox.impl.Checker$7.call(Checker.java:353)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:357)
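For reference, the stack trace complains about the lowercase name: the variable is declared as failedJob but passed to notify() as failedjob, so Groovy cannot find a local variable with that name, falls back to the script binding, and fails. A minimal sketch of the likely fix (an assumption on my part; the thread itself does not include an accepted answer):
node('G') {
    properties([disableConcurrentBuilds()])
    boolean failedJob = false          // use a real boolean; the string 'null' is truthy in Groovy
    stage('Checkout') { checkout scm }
    // ... other stages, setting failedJob = true (e.g. in a catch block) when something fails ...
    stage('Notify') {
        notify(failedJob)              // same spelling as the declaration
    }
} //end node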

Related

groovy current scope error in jenkins pipeline

I need your help, please.
I'm working on a Groovy script to list all SCM polling jobs.
The script works fine in the Jenkins script console, but when I integrate it into a Jenkinsfile and run it in a pipeline I get this error:
12:51:21 WorkflowScript: 10: The current scope already contains a variable of the name it
12:51:21 @ line 10, column 25.
12:51:21 def logSpec = { it, getTrigger -> String spec = getTrigger(it)?.getSpec(); if (spec ) println ("job_name " + it.name + " job_path " + it.getFullName() + " with spec " + spec )}
12:51:21 ^
12:51:21
12:51:21 1 error
12:51:21
12:51:21 at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
12:51:21 at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:958)
Here is the Jenkinsfile:
#!/usr/bin/env groovy
import hudson.triggers.*
import hudson.maven.MavenModuleSet
import org.jenkinsci.plugins.workflow.job.*

pipeline {
    agent any
    stages {
        stage('list jobs with scm polling') {
            steps {
                def logSpec = { it, getTrigger -> String spec = getTrigger(it)?.getSpec(); if (spec) println ("job_name " + it.name + " job_path " + it.getFullName() + " with spec " + spec) }
                println("--- SCM Frequent Polling for Pipeline jobs ---")
                Jenkins.getInstance().getAllItems(WorkflowJob.class).each() { logSpec(it, { it.getSCMTrigger() }) }
                println("\n--- SCM Frequent Polling for FreeStyle jobs ---")
                Jenkins.getInstance().getAllItems(FreeStyleProject.class).each() { logSpec(it, { it.getSCMTrigger() }) }
                println("\n--- SCM Frequent Polling for Maven jobs ---")
                Jenkins.getInstance().getAllItems(MavenModuleSet.class).each() { logSpec(it, { it.getTrigger(SCMTrigger.class) }) }
                println("--- SCM Frequent Polling for Abstract jobs ---")
                Jenkins.getInstance().getAllItems(AbstractProject.class).each() { logSpec(it, { it.getTrigger(SCMTrigger.class) }) }
                println '\nDone.'
            }
        }
    }
}
Can anyone help?
Thanks!
it is an implicit variable that is provided in closures when the closure doesn't have an explicitly declared parameter. So when you declare a parameter, make sure it is not called it, to avoid conflicts with parent scopes that already define it (in your case, the closure of .each()).
Also, to integrate a script section in a pipeline, either use the script step or define a function that you can call like a built-in step.
Lastly, .each() doesn't work well in pipeline code, due to the restrictions imposed by the CPS transformations that Jenkins applies to pipeline code (unless the method is tagged @NonCPS, which has other restrictions). So .each() should be replaced by a for loop, as shown below.
pipeline {
    agent any
    stages {
        stage('list jobs with scm polling') {
            steps {
                script {
                    def logSpec = { job, getTrigger -> String spec = getTrigger(job)?.getSpec(); if (spec) println ("job_name " + job.name + " job_path " + job.getFullName() + " with spec " + spec) }
                    println("--- SCM Frequent Polling for Pipeline jobs ---")
                    for (item in Jenkins.getInstance().getAllItems(WorkflowJob.class)) {
                        logSpec(item, { item.getSCMTrigger() })
                    }
                    // ... other code ...
                    println '\nDone.'
                }
            }
        }
    }
}
Variant with separate function:
pipeline {
    agent any
    stages {
        stage('list jobs with scm polling') {
            steps {
                doStuff()
            }
        }
    }
}

void doStuff() {
    def logSpec = { job, getTrigger -> String spec = getTrigger(job)?.getSpec(); if (spec) println ("job_name " + job.name + " job_path " + job.getFullName() + " with spec " + spec) }
    println("--- SCM Frequent Polling for Pipeline jobs ---")
    for (item in Jenkins.getInstance().getAllItems(WorkflowJob.class)) {
        logSpec(item, { item.getSCMTrigger() })
    }
    // ... other code ...
    println '\nDone.'
}
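If you do want to keep .each(), a rough sketch of the @NonCPS route mentioned above (my addition, not part of the original answer): move the iteration into a method annotated with @NonCPS, defined at the top level of the Jenkinsfile or in a shared library. Such a method must not call pipeline steps (sh, echo, ...), and its println output may land in the controller log rather than the build console.
// import com.cloudbees.groovy.cps.NonCPS   // only needed if @NonCPS is not already resolvable in your setup
@NonCPS
void listWorkflowPollingSpecs() {
    // .each() is fine here because this method is not CPS-transformed
    Jenkins.getInstance().getAllItems(WorkflowJob.class).each { job ->
        String spec = job.getSCMTrigger()?.getSpec()
        if (spec) {
            println("job_name " + job.name + " job_path " + job.getFullName() + " with spec " + spec)
        }
    }
}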

How to get the Build url of the triggered job in jenkins pipeline?

script {
    def job = build job: '<job>'
    if (job.getResult().equals('FAILURE')) {
        def jobName = '<job>'
        def date = new Date().format('dd/MM/yyyy')
        echo "Build Result: " + job.getResult()
        slackSend color: "danger", message: "${jobName} failed on " + date + "\n Build URL - ${absoluteUrl}"
    }
    else {
        echo "Build Result: " + job.getResult()
    }
}
How can I get the build URL of the triggered '<job>'?
By using 'JOB_URL' I get the URL of the parent job,
and 'absoluteUrl' throws a 'No such property: absoluteUrl for class: groovy.lang.Binding' error.
Thanks for the help.
absoluteUrl is not defined. job.getAbsoluteUrl() would give you the URL.
See documentation
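Applied to the snippet from the question, that looks roughly like this (the '<job>' placeholder is kept as-is):
script {
    def job = build job: '<job>'
    def jobName = '<job>'
    def date = new Date().format('dd/MM/yyyy')
    echo "Build Result: " + job.getResult()
    if (job.getResult().equals('FAILURE')) {
        // getAbsoluteUrl() on the object returned by the build step points at the downstream build
        slackSend color: "danger", message: "${jobName} failed on ${date}\n Build URL - ${job.getAbsoluteUrl()}"
    }
}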

Load jenkins parameters from external groovy file

I have a big Jenkinsfile that I would like to reuse for other projects, but I have different parameters per project, so I've tried having one file per project containing only those parameters, like this:
Jenkinsfile
node {
    checkout scm
    def options = []
    def optionsBuilder = load pwd() + '/global-scripts/optionsBuilder.groovy'
    options.addAll(optionsBuilder.buildOptions(env.JOB_BASE_NAME))
    properties { options }
}
global-scripts/optionsBuilder.groovy
def buildOptions(jobName) {
    echo "load " + pwd() + "/project-scripts/" + jobName + ".groovy"
    def jobOptionsBuilder = load pwd() + "/project-scripts/" + jobName + ".groovy"
    return jobOptionsBuilder.buildOptions()
}
return this
project-scripts/job.groovy
def buildOptions() {
    def options = [buildDiscarder(logRotator(numToKeepStr: '5')),
                   parameters([string(name: 'releaseVersion', defaultValue: env.releaseVersion, description: 'Version that needs to be released'),
                               string(name: 'nextVersion', defaultValue: env.nextVersion, description: 'Next snapshot version'),
                               string(name: 'branch', defaultValue: env.branch, description: 'Branch that needs to be released'),
                               booleanParam(name: 'sendRocketChatNotification', defaultValue: true, description: 'Send notification to Rocket_Chat'),
                               booleanParam(name: 'sendEmail', defaultValue: true, description: 'Send an email with changelog'),
                               booleanParam(name: 'dryRun', defaultValue: false, description: 'No git push and no mvn deploy')])]
    return options
}
return this
But it seems I can't find the right syntax. Jenkins throws me this error:
java.lang.ClassCastException: org.jenkinsci.plugins.workflow.multibranch.JobPropertyStep.properties expects java.util.List<hudson.model.JobProperty> but received class org.jenkinsci.plugins.workflow.cps.CpsClosure2
at org.jenkinsci.plugins.structs.describable.DescribableModel.coerce(DescribableModel.java:394)
at org.jenkinsci.plugins.structs.describable.DescribableModel.buildArguments(DescribableModel.java:318)
I have the impression that in your Jenkinsfile you should just write:
properties(options)
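That is, pass the list directly; the ClassCastException comes from handing the properties step a closure (CpsClosure2) where it expects a List of job properties. A sketch of the Jenkinsfile with only that line changed:
node {
    checkout scm
    def options = []
    def optionsBuilder = load pwd() + '/global-scripts/optionsBuilder.groovy'
    options.addAll(optionsBuilder.buildOptions(env.JOB_BASE_NAME))
    properties(options)   // a List argument, not a { ... } block
}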

Jenkins pipeline - "cannot invoke method on null object" on function outside the pipeline

I get the error above when trying to run my pipeline.
I tried to run it inside and outside the Groovy sandbox.
I also tried debugging, and it fails on this method call: "last_build_number(lastBuildToGenerateNumber)".
Before adding the try/catch and the recursion this code was working well outside the pipeline. Don't get me wrong - this code cannot run inside the pipeline, so I did not try that.
/*
 SEIIc DPS_NIGHTLY_BUILD JenkinsFile
*/
def buildDescription // for setting the build name, based on the downstream jobs name

def last_build_number(build) {
    println 'the display name is' + build.displayName
    return build.displayName
    if (build != null) {
        if (build.displayName != null) {
            println 'the display name is' + build.displayName
            return build.displayName
        }
    }
    else {
        return '0.0.0.0'
    }
    return '0.0.0.0'
}
def autoIncBuildNightlyNumber(build) {
    def debugEnable = 1
    println 'build is: ' + build.displayName
    def lastBuildToGenerateNumber = build; //a build variable
    def last_build_number; //last build number i.e: "52.88.0.7" or "#43"
    build_number = 0;
    try {
        println 'last build to genreate from ' + lastBuildToGenerateNumber.displayName
        last_build_number = last_build_number(lastBuildToGenerateNumber);
        if (debugEnable == 1) println 'last successfull build: ' + last_successfull_build
        def tokens = last_build_number.tokenize('.')
        if (debugEnable == 1) println 'tokens: ' + tokens
        // update global variable - if it's not a legit number the crash will be catched
        build_number = tokens[3].toInteger() + 1
        if (debugEnable == 1) println 'new build number: ' + build_number
        return build_number
    } catch (e) {
        if (debugEnable == 1) println 'error is ' + e
        if (debugEnable == 1) println 'build number: ' + build_number + ' is not valid. recurse now to find a valid number'
        build_number = autoIncBuildNightlyNumber(lastBuildToGenerateNumber.getPreviousBuild());
        println 'genrate ' + lastBuildToGenerateNumber
        return build_number
    }
}
// Declarative Pipeline
pipeline {
    /*
     maximum time for this job
    */
    options { //maximum time for this job
        timeout(time: 1, unit: 'HOURS')
    }
    environment {
        AUTO_BUILD_NUMBER = autoIncBuildNightlyNumber(currentBuild.getPreviousBuild())
        PLASTICSCM_TARGET_SERVER = "g-plasticscm-server.gilat.com:8087"
        PLASTICSCM_TARGET_REPOSITORY = "SEIIc_DPS"
        PLASTICSCM_WORKSPACE_NAME = "${env.JOB_BASE_NAME}_${env.BUILD_NUMBER}"
        AUTOMATION_FOLDER = "${env.PLASTICSCM_WORKSPACE_NAME}\\Tools\\Automation"
        Branch = "/main"
        TEST_BRANCH = "/QualiTest for SW Automation"
        QUALITEST_FOLDER = "${env.PLASTICSCM_WORKSPACE_NAME}\\QualiTest for SW Automation"
        PLASTICSCM_TEST_REPOSITORY = "SW_Utiles"
        PLASTICSCM_TEST_WORKSPACE = "TEST_${env.JOB_BASE_NAME}_${env.BUILD_NUMBER}"
    }
    // Select target host for building this pipeline
    agent { node { label "SEIIc_DPS" } }
    // Stages to run for this pipeline
    stages {
        /*
         Checkout files from source control. In this case the pipeline use PlasticSCM plugin to checkout a branch with given parameter "Branch".
         When this stage run, it will checkout the branch in the parameter string from the defined repository and server.
         It will not
        */
        stage('SCM Checkout') {
            steps {
                cm branch: env.Branch, changelog: true, repository: env.PLASTICSCM_TARGET_REPOSITORY, server: env.PLASTICSCM_TARGET_SERVER, useUpdate: false, workspaceName: env.PLASTICSCM_WORKSPACE_NAME
                //checkOut QualiTest
                cm branch: env.TEST_BRANCH, changelog: false, repository: 'SW_Utiles', server: env.PLASTICSCM_TARGET_SERVER, useUpdate: false, workspaceName: env.PLASTICSCM_TEST_WORKSPACE
            }
        }
    }//stages
}//pipeline

Jenkins pipeline - How to give choice parameters dynamically

pipeline {
    agent any
    stages {
        stage("foo") {
            steps {
                script {
                    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
                            parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
                                    description: 'What is the release scope?')]
                }
                echo "${env.RELEASE_SCOPE}"
            }
        }
    }
}
In the above code the choices are hardcoded (patch\nminor\nmajor). My requirement is to provide the choice values in the dropdown dynamically.
I get the values by calling an API - the artifact (.zip) file names from Artifactory.
In the above example, input is requested while the build runs, but I want to do a "Build with parameters".
Please suggest/help on this.
It depends on how you get the data from the API; there will be different options for it. For example, let's imagine that you get the data as a List of Strings (let's call it releaseScope). In that case your code would be the following:
...
script {
    def releaseScopeChoices = ''
    releaseScope.each {
        releaseScopeChoices += it + '\n'
    }
    parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices, description: 'What is the release scope?')]
}
...
hope it will help.
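To connect that back to the input step from the question, a minimal sketch using join('\n') instead of the loop (assuming releaseScope is the List of Strings returned by your API call):
script {
    // releaseScope is assumed to be a List<String>, e.g. the artifact names fetched from Artifactory
    def releaseScopeChoices = releaseScope.join('\n')
    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
            parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices,
                    description: 'What is the release scope?')]
}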
This is a cutdown version of what we use. We separate stuff into shared libraries but I have consolidated a bit to make it easier.
Jenkinsfile looks something like this:
#!groovy
@Library('shared') _

def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')

properties([
    buildDiscarder(logRotator(numToKeepStr: '20')),
    parameters([
        choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
    ])
])
The shared library that looks at Artifactory is pretty simple.
Essentially it makes a GET request (providing auth credentials on it), then filters/splits the result to whittle it down to the desired values and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher

List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
    // URL components
    String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
    String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
    Object responseJson = getRequest(url)
    String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
    Pattern regex = ~regexPattern
    List<String> outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
    List<String> artifactlist = []
    for (i in outlist) {
        artifactlist.add(i['uri'].tokenize('/')[-1])
    }
    return artifactlist.reverse()
}
// Artifactory Get Request - Consume in other methods
@NonCPS
Object getRequest(url_string) {
    URL url = url_string.toURL()
    // Open connection
    URLConnection connection = url.openConnection()
    connection.setRequestProperty("Authorization", basicAuthString())
    // Open input stream
    InputStream inputStream = connection.getInputStream()
    json_data = new groovy.json.JsonSlurper().parseText(inputStream.text)
    // Close the stream
    inputStream.close()
    return json_data
}
// Artifactory Get Request - Consume in other methods
@NonCPS
Object basicAuthString() {
    // Retrieve password
    String username = "artifactoryMachineUsername"
    String credid = "artifactoryApiKey"
    credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
        'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
    )
    credentials_store[0].credentials.each { it ->
        if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
            if (it.getId() == credid) {
                apiKey = it.getSecret()
            }
        }
    }
    // Create authorization header format using Base64 encoding
    String userpass = username + ":" + apiKey;
    String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes());
    return basicAuth
}
I could achieve it without any plugin:
With Jenkins 2.249.2, using a declarative pipeline,
the following pattern prompts the user with a dynamic dropdown menu
(for them to choose a branch).
(The surrounding withCredentials block is optional, required only if your script and Jenkins configuration use credentials.)
node {
    withCredentials([[$class: 'UsernamePasswordMultiBinding',
                      credentialsId: 'user-credential-in-gitlab',
                      usernameVariable: 'GIT_USERNAME',
                      passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
        BRANCH_NAMES = sh(script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\' ', returnStdout: true).trim()
    }
}
pipeline {
    agent any
    parameters {
        choice(
            name: 'BranchName',
            choices: "${BRANCH_NAMES}",
            description: 'to refresh the list, go to configure, disable "this build has parameters", launch build (without parameters) to reload the list and stop it, then launch it again (with parameters)'
        )
    }
    stages {
        stage("Run Tests") {
            steps {
                sh "echo SUCCESS on ${BranchName}"
            }
        }
    }
}
The drawback is that one has to refresh the Jenkins configuration and do a blank run for the list to be refreshed by the script.
Solution (not from me): this limitation can be made less annoying by using an additional parameter that is used specifically to refresh the values:
parameters {
    booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
Then within a stage:
stage('a stage') {
    when {
        expression {
            return !params.REFRESH_BRANCHES.toBoolean()
        }
    }
    ...
}
This is my solution:
def envList
def dockerId

node {
    envList = "defaultValue\n" + sh(script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}

pipeline {
    agent any
    parameters {
        choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
        booleanParam(name: 'SECURITY_SCAN', defaultValue: false, description: 'container vulnerability scan')
    }
    // ... stages ...
}
The example Jenkinsfile below contains an AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. The Active Choices plugin is required.
Note! You need to approve the script specified in the parameters after the first run in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it automatically (might require administrator permissions).
properties([
    parameters([[
        $class: 'ChoiceParameter',
        choiceType: 'PT_SINGLE_SELECT',
        name: 'image',
        description: 'Docker image',
        filterLength: 1,
        filterable: false,
        script: [
            $class: 'GroovyScript',
            fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
            script: [
                classpath: [],
                sandbox: false,
                script: '''\
                    def repository = "frontend"
                    def aws_ecr_cmd = "aws ecr list-images" +
                                      " --repository-name ${repository}" +
                                      " --filter tagStatus=TAGGED" +
                                      " --query imageIds[*].[imageTag]" +
                                      " --region us-east-1 --output text"
                    def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
                    def images = aws_ecr_out.text.tokenize().reverse()
                    return images
                '''.stripIndent()
            ]
        ]
    ]])
])

pipeline {
    agent any
    stages {
        stage('First stage') {
            steps {
                sh 'echo "${image}"'
            }
        }
    }
}
choiceArray = ["patch", "minor", "major"]

properties([
    parameters([
        choice(choices: choiceArray.collect { "$it\n" }.join(' '),
               description: '',
               name: 'SOME_CHOICE')
    ])
])
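As an aside (my note, not part of the original answer), the collect/join above inserts a space after each newline; the choices string can also be built with a plain newline join, as in the Artifactory example earlier:
properties([
    parameters([
        choice(choices: choiceArray.join('\n'),
               description: '',
               name: 'SOME_CHOICE')
    ])
])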
