Let's assume a scenario where Job A calls Job B:
...
...
...
crID = build (job: "Open Change Request", wait: true, parameters: [
string(name: "assignedTo", value: "${BUILD_USER_EMAIL}"),
string(name: "crType", value: "Upgrade worker nodes"),
string(name: "environment", value: "${region}")]).result
The above code is flawed: result returns the build status (SUCCESS, FAILURE, etc.), not anything Job B produced.
What I actually need is to retrieve the value that Job B generates.
Is it possible at all to retrieve the response of a job that ran as part of a build step?
Possibilities:
Read log from the other job?
Global properties?
I ended up solving this by reading the build log.
In Job B, print the value to the log:
echo "Change Request ID:${crID}"
In Job A, process the log text to extract the printed value:
openCrRawData = build (job: "Open Change Request", wait: true, parameters: [
string(name: "assignedTo", value: "${jobInitiator}"),
string(name: "crType", value: "Upgrade worker nodes"),
string(name: "environmentsForCR", value: "${region}")])
crIDRaw = sh (script: "echo \"${openCrRawData.rawBuild.log}\" | grep \"Change Request ID:\"", returnStdout: true).trim().split(":")
crID = crIDRaw[1]
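Shelling out just to grep the log can be avoided. Here is a sketch of the same extraction in plain Groovy (note that accessing rawBuild requires script approval in a sandboxed pipeline, and this assumes Job B prints exactly one "Change Request ID:" line):

```groovy
// Find the marker line in Job B's log and take everything after the first colon.
def logLine = openCrRawData.rawBuild.log
                  .readLines()
                  .find { it.contains("Change Request ID:") }
crID = (logLine == null) ? null : logLine.split(":", 2)[1].trim()
```

This keeps the extraction on the controller and avoids quoting problems when the log text is passed through a shell.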
Related
build job: 'build_Test', parameters: [validatingString(name: 'version', value: '1.0.0.1'), string(name: 'TASK', value: 'build')]
I am trying to trigger another job via a Jenkinsfile. The above script triggers the job, but I can see the error below in the triggered job's console log:
java.lang.NullPointerException
at java.util.regex.Pattern.<init>(Pattern.java:1350)
at java.util.regex.Pattern.compile(Pattern.java:1028)
at java.util.regex.Pattern.matches(Pattern.java:1133)
at hudson.plugins.validating_string_parameter.ValidatingStringParameterValue.createBuildWrapper(ValidatingStringParameterValue.java:87)
validatingString parameters should include the options below; the NullPointerException indicates that your regex is null.
validatingString(name: "test", defaultValue: "", regex: /^abc-[0-9]+$/, failedValidationMessage: "Validation failed!", description: "ABC")
I am looking for a solution to pass a file (.csv) parameter value to a downstream job. I have tried the code below, but it is not working.
build job: "DownstreamJobName",
parameters: [
string(name: 'Releases', value: "1.2.9"),
[$class: "FileParameterValue", name: "test.csv", file: new FileParameterValue.FileItemImpl(new File(env.WORKSPACE/env.filepath))],
string(name: 'UserEmail', value: "testemail")
]
While researching I found the link below, describing an existing defect with file parameters in Jenkins Pipeline; I don't know whether it has been fixed: https://issues.jenkins.io/browse/JENKINS-27413
I was able to solve it as below:
propertiesFilePath = "${env.WORKSPACE}/test.csv"
build job: "DownstreamJobName",
parameters: [
[$class: "FileParameterValue", name: "test.csv", file: new FileParameterValue.FileItemImpl(new File(propertiesFilePath))]
]
I am creating a Dataflow Jenkins auto-scheduler job where users can enter the parameters required to run the Dataflow job and schedule it. Users can reschedule the job whenever they need to by changing the cron values in the Jenkins UI/console.
The job works as expected: it schedules the job at the specified time and triggers it with whatever values the user entered in the input text fields in the Jenkins console. But after the scheduler/timer runs and the job completes, the subsequent runs overwrite the user-entered values with the default values in the Jenkins UI.
"Jenkins pipeline Cron Auto Scheduler job Overwriting the user entered values with the default values in Jenkins UI after scheduler/timer runs"
Has anyone faced a similar issue, or does anyone have a solution for the above? I appreciate your help in this matter.
Here is my Jenkins groovy script file code
//Dataflow Jenkins pipeline auto scheduler job
properties([
parameters([
choice(name: 'jarType', choices: ['snapshot','release'], description: 'Select the JarType to execute the files'),
string(name: 'hour', description: 'The hour of the day in 24 hour in (0-23) format', defaultValue: '14'),
string(name: 'dayInMonth', description: 'DOM: The day in the month in (1–31) format', defaultValue: '*'),
string(name: 'theMonth', description: 'MONTH: The month in (1–12) format', defaultValue: '*'),
string(name: 'dayInWeek', description: 'DOW: The day in the week in (0–7) format', defaultValue: '*'),
string(name: 'jobName', description: 'Please enter jobName for the Dataflow job', defaultValue: 'test'),
string(name: 'project', description: 'Please enter project for the Dataflow job', defaultValue: 'test'),
]),
pipelineTriggers(createPipelineTriggers())
])
//below is the scheduling part
def createPipelineTriggers() {
echo "H ${params.hour} ${params.dayInMonth} ${params.theMonth} ${params.dayInWeek} ${params.jobName} ${params.project}"
if (env.BRANCH_NAME == 'develop') {
// return [cron("H ${params.hour} ${params.dayInMonth} ${params.theMonth} ${params.dayInWeek} %jobName=${params.jobName}; project=${params.project}")]
return [cron("H ${params.hour} ${params.dayInMonth} ${params.theMonth} ${params.dayInWeek}")]
}
return []
}
node("test-pod") {
container('test-container') {
stage("Checkout Scm"){
checkout scm
}
stage("Download ${params.jarType} CI jar from Nexus") {
echo "${params.jarType}"
//****
}
stage("Download xxxx and xxxx from Nexus") {
//**
}
stage("Running dataflow job") {
rootDir = pwd()
try {
sh "java -jar ${params.application}.jar \\\n" + "--runner=DataflowRunner --jobName=${params.jobName} --project=${params.project}"
} catch (Exception e) {
echo "dataflow job Failed"
}
}
}
}
I have a Jenkins job with a string parameter (P1). How do I pass multiple values to this parameter (P1=a,b,c...)? I then have to pass all the values of P1 to my Jenkins build step (a batch script). How can this be done?
stage ('Invoke_pipeline') {
steps {
build job: 'test-docker-image',
parameters: [
string(name: 'ProjectName', value: "atm"),
string(name: 'Product', value:"renojournalman,reno-keymanagement,reno-management,renoalertman,reno-atm-update,reno-eventmonitor,reno-valmediamon"),
]
}
}
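On the receiving side, one sketch of handling this (assuming the downstream job defines Product as a plain string parameter; the batch script name build_image.bat is hypothetical) is to split the comma-separated value and invoke the batch step once per entry:

```groovy
// Downstream job: fan the comma-separated Product parameter out
// to the batch script, one invocation per product.
def products = params.Product.split(',')
for (p in products) {
    bat "build_image.bat ${p.trim()}"
}
```

If the batch script itself must receive the whole list in one call, pass params.Product through unchanged and do the splitting inside the script instead.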
Here is my problem, simplified:
I have a main job (a pipeline job) and a job x (freestyle). In my main job I build job x using the following:
Code in the main job:
res = build job: 'x', parameters: [string(name: 'JOBNAME', value: string(name: 'JIRACHEF', value: "oldvalue")], quietPeriod: 2
Now, in job x, I change the value of the JIRACHEF parameter and print it to check that it actually changed:
import os
os.environ["JIRACHEF"] = "newvalue"
print os.environ["JIRACHEF"]
This works in job x's console output. As per the solution presented, I presume this updated value should now be available in the main job, so I do the following in the main job just after building x:
res = build job: 'x', parameters: [string(name: 'JOBNAME', value: string(name: 'JIRACHEF', value: "oldvalue")], quietPeriod: 2
print "$res.buildVariables"
which should print "newvalue" but prints "oldvalue", making me believe the value isn't actually being passed upstream.
Note - I realize my job x is freestyle, but I have also tried the above solution with x as a pipeline job and still get the same result, 'oldvalue'.
Main job - configuration: pipeline job
node {
x = build job: 'test1', quietPeriod: 2
build job: 'test2', parameters: [
string(name: 'aValue1FromX', value: "$x.buildVariables.value1fromx"),
string(name: 'aValue2FromX', value: "$x.buildVariables.value2fromx")
], quietPeriod: 2
}
test1 - configuration: pipeline job
node {
env.value1fromx = "bull"
env.value2fromx = "bear"
}
test2 - configuration: pipeline job, parameterized, with two string parameters aValue1FromX and aValue2FromX
node {
echo "$env.aValue1FromX"
echo "$env.aValue2FromX"
}