I run a job like this:
def job = build job: job_name, parameters: [
    string(name: 'Param1', value: Value1),
    string(name: 'Param2', value: Value2),
    string(name: 'Param3', value: Value3),
], wait: false
return job
I use wait: false because I need to run a few jobs at the same time and then wait for all of them to finish. But in this case job is null, because with wait: false the build step does not return a build object.
Do you know another way to run multiple jobs at the same time and get their build objects?
Another way that I tried:
def buildsNumberStart = GetLastDockerBuildNumber(DockerBuildJobName)
def finishedJobs = overrideBranches.collect { item ->
    RunJobBuild(item.Tag, author, GetImageNameFromRepository(item.DockerRepository), item.Name, item.Branch, item.Repository, DockerBuildJobName)
}
sleep(30)
def buildNumberFinish = GetLastDockerBuildNumber(DockerBuildJobName)
while (true) {
    if (IsAllDockerBuildJobsFinished(buildsNumberStart, buildNumberFinish, DockerBuildJobName)) {
        break
    }
    sleep(5)
}
for (build in GetFinishedJobs(buildsNumberStart, buildNumberFinish, DockerBuildJobName)) {
    def listener = build.getListener()
    build.getEnvironment(listener).each {
        // HERE IS THE PROBLEM: this only shows the environment variables
        // that were set at start, not the ones I added in the pipeline
        println it
    }
}
@NonCPS
def GetFinishedJobs(buildNumbersStart, buildNumberFinish, String dockerBuildJobName) {
    return jenkins.model.Jenkins
        .instance
        .getItemByFullName(dockerBuildJobName)
        .builds
        .findAll { it.number > buildNumbersStart && it.number <= buildNumberFinish }
}
I found a solution: https://stackoverflow.com/a/40148397/14392639
For my case it looks like this:
def jobs = [:]
some_source.each { item ->
    jobs[item.Name] = {
        build job: job_name, parameters: [
            string(name: 'Param1', value: Value1),
            string(name: 'Param2', value: Value2),
            string(name: 'Param3', value: Value3),
        ]
    }
}
parallel jobs
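Since the original goal was to get the build objects back, the same pattern can also capture what the build step returns in each branch. A minimal sketch (results is an illustrative name; propagate: false keeps a failing downstream build from aborting its branch):

def jobs = [:]
def results = [:]
some_source.each { item ->
    jobs[item.Name] = {
        // with the default wait: true, build returns the downstream build object
        results[item.Name] = build job: job_name, propagate: false, parameters: [
            string(name: 'Param1', value: Value1),
        ]
    }
}
parallel jobs
// after parallel returns, results holds one build object per item
results.each { name, run -> println "${name}: ${run.result}" }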
I have to run a for loop in Groovy over 40 items, but I wish to run it for 4 items in parallel, then the next batch, and so on. I know of parallel deployments in a Jenkinsfile, but that triggers all 40 at once.
def i = 0
mslist.collate(4).each {
    build job: 'deploy', parameters: [string(name: 'PROJECT', value: "${it[i]}"), string(name: 'ENVIRONMENT', value: params.ENVIRONMENT)]
    i = i + 1
}
My updated code:
stages {
    stage('Parallel Deployments') {
        steps {
            script {
                def m = rc.ic()
                m = m.collect { "${it}" }
                println "$m"
                m.collate(4).each { batch ->
                    def deployments = [:]
                    deployments.failFast = false
                    batch.each { name ->
                        deployments[name] = {
                            build job: 'jb', parameters: [string(name: 'pt', value: name), string(name: 'pl', value: params.gh), string(name: 'dc', value: params.nb)]
                        }
                    }
                    parallel deployments
                }
            }
        }
    }
}
It can be done like this:
node {
    def items = (1..40).collect { "item-${it}" }
    items.collate(4).each { List batch ->
        def n = [:]
        batch.each { item ->
            n[item] = {
                stage(item) {
                    build job: 'x', parameters: [string(name: 'it', value: item)]
                }
            }
        }
        parallel n
    }
}
Jenkinsfile content of job x:
node {
    echo "Hello from Pipeline x"
    print params
}
This will invoke 4 jobs at a time and run them in parallel. Make sure you have more than 4 executors configured on Jenkins.
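One more note: if you want a whole batch to abort as soon as one of its branches fails, the failFast entry has to go into the same map that is passed to parallel (in the question's updated code it was set after parallel had already run). For example, using the same map n as above:

def n = [:]
n.failFast = true  // abort the remaining branches of this batch on the first failure
// ... add the batch's branches to n as above, then:
parallel n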
You can do something like:
def items = (1..40).collect { "item-${it}" }
items.collate(4).each { List batch ->
    // batch is now a list of 4 items, do something with it here
}
This uses the Groovy Iterable.collate method to split the items into batches of four and loop through the batches.
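For illustration, collate keeps any leftover items in a final, smaller batch:

assert [1, 2, 3, 4, 5].collate(2) == [[1, 2], [3, 4], [5]]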
If you really want to do this in parallel, as in using multiple threads, then that is a different question.
My Jenkins job runs multiple builds in parallel as below:
def branches = [:]
for (int i = 0; i < 4; i++) {
    def index = i
    branches["branch${i}"] = {
        build job: 'Test', parameters: [
            string(name: 'param1', value: 'test_param'),
            string(name: 'dummy', value: "${index}")]
    }
}
parallel branches
For the above code I want to print all build results. How can I get the build result (e.g. SUCCESS, FAILURE, ...) of each of the parallel jobs?
If you want to print all branches' results in the same console, you can do it like this:
def branches = [:]
for (int i = 0; i < 4; i++) {
    def index = i
    branches["branch${i}"] = {
        build job: 'Test', parameters: [
            string(name: 'param1', value: 'test_param'),
            string(name: 'dummy', value: "${index}")]
        println currentBuild.result
    }
}
parallel branches
currentBuild.result holds the status of the build, so if you print it in each branch you will get what you need.
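If you need each downstream job's own result rather than the current build's, a small variation (a sketch using the object the build step returns; propagate: false keeps a failed downstream build from aborting its branch):

def branches = [:]
for (int i = 0; i < 4; i++) {
    def index = i
    branches["branch${i}"] = {
        def downstream = build job: 'Test', propagate: false, parameters: [
            string(name: 'param1', value: 'test_param'),
            string(name: 'dummy', value: "${index}")]
        // downstream is the build object of the triggered job
        println "branch${index}: ${downstream.result}"
    }
}
parallel branches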
To explain the issue, consider that I have 2 Jenkins jobs.
Job1: PARAM_TEST1
It accepts a parameterized value called 'MYPARAM'.
Job2: PARAM_TEST2
It also accepts a parameterized value called 'MYPARAM'.
Sometimes I need to run these 2 jobs in sequence, so I created a separate pipeline job as shown below. It works just fine.
It also accepts a parameterized value called 'MYPARAM' and simply passes it to the build job steps.
pipeline {
    agent any
    stages {
        stage("PARAM 1") {
            steps {
                build job: 'PARAM_TEST1', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
        stage("PARAM 2") {
            steps {
                build job: 'PARAM_TEST2', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
    }
}
My question:
This example is simple. In reality I have 20 jobs, and I do not want to repeat parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")] in every single stage.
Is there any way to set the parameters for all the build job steps in one single place?
What you could do is place the common params at the pipeline level and add the job-specific ones to them in the stages:
pipeline {
    agent any
    parameters {
        string(name: 'PARAM1', description: 'Param 1?')
        string(name: 'PARAM2', description: 'Param 2?')
    }
    stages {
        stage('Example') {
            steps {
                echo "${params}"
                script {
                    // turn the current params map into a list of string parameters,
                    // then append the extra one for the downstream job
                    def myparams = params.collect { string(name: it.key, value: "${it.value}") }
                    myparams += string(name: 'MYPARAM', value: "${params.MYPARAM}")
                    build job: 'downstream-pipeline-with-params', parameters: myparams
                }
            }
        }
    }
}
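If all 20 downstream jobs share the same common parameters, another option is a small helper closure inside a script { } block, so the shared list lives in one place. A sketch with illustrative names:

def commonParams = [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
def buildWithCommonParams = { String jobName, List extra = [] ->
    build job: jobName, parameters: commonParams + extra
}
buildWithCommonParams('PARAM_TEST1')
buildWithCommonParams('PARAM_TEST2')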
I am new to Jenkins. I have spent the last few weeks creating jobs that execute chains of shell commands, but now that I am trying to find out how to chain jobs together, I have failed to find the answer I was looking for.
I have a CreateStack job, and if it fails somehow, I'd like to run DeleteStack to remove the stuff that CreateStack left behind while failing. If CreateStack does not fail, build the rest of the jobs.
Something like this:
b = build(job: "CreateStack", propagate: false, parameters: [string(name: 'TASVersion', value: "$TASVersion"), string(name: 'CloudID', value: "$CloudID"), string(name: 'StackName', value: "$StackName"), booleanParam(name: 'Swap partition required', value: true)]).result
if (b == 'FAILURE') {
    echo "CreateStack has failed. Running DeleteStack."
    build(job: "DeleteStack", parameters: [string(name: 'CloudID', value: "$CloudID"), string(name: 'StackName', value: "$StackName")])
} else {
    build job: 'TAS Deploy', parameters: [string(name: 'FT_NODE_IP', value: "$FT-NodeIP"), string(name: 'TASVersion', value: "RawTASVersion")]
}
Could somebody help me out with this, please?
Also, can I use variables in a pipeline script like this? I set the project to be parameterized and added the necessary choice parameters, e.g.: $StackName
You can try something like this in a scripted pipeline:
node {
    try {
        stage('CreateStack') {
            build(job: 'CreateStack', parameters: [<parameters>])
        }
        stage('OtherJobs') {
            // build the rest of the jobs
        }
    } catch (error) {
        build(job: 'DeleteStack', parameters: [<parameters>])
        currentBuild.result = "FAILURE"
        throw error
    } finally {
        build(job: 'LastJob', parameters: [<parameters>])
    }
}
Please note that the catch block is executed if any job fails, so there you have to implement a little additional logic to tell which job it was.
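For the specific case in the question, that logic could be sketched with propagate: false (as in the original snippet) to check CreateStack's result explicitly; <parameters> are placeholders as above:

node {
    def create = build(job: 'CreateStack', propagate: false, parameters: [<parameters>])
    if (create.result == 'FAILURE') {
        echo "CreateStack has failed. Running DeleteStack."
        build(job: 'DeleteStack', parameters: [<parameters>])
        error 'CreateStack failed and the stack was cleaned up.'
    }
    build(job: 'TAS Deploy', parameters: [<parameters>])
}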
I've tried using the following script, but all downstream jobs run on different nodes.
Any idea how I can get a random node and run all the downstream jobs on the same one?
#!/usr/bin/env groovy
pipeline {
    agent { label 'WindowsServer' }
    stages {
        stage("Get Dev Branch") {
            steps {
                script {
                    build(job: "GetDevBranchStep", parameters: [string(name: 'DevBranchName', value: "${params.CloudDevBranch}")])
                }
            }
        }
        stage("Get SA Branch") {
            steps {
                script {
                    build(job: "GetSABranchStep", parameters: [string(name: 'SABranchName', value: "${params.SABranch}")])
                }
            }
        }
        stage("Compile Models and Copy To Network Folder") {
            steps {
                script {
                    build(job: "CompileNewModelsAndCopyToNetwork", parameters: [string(name: 'DevBranchName', value: "${params.CloudDevBranch}"), string(name: 'SABranchName', value: "${params.SABranch}"), string(name: 'GetSAStepJobName', value: "GetSABranchStep"), string(name: 'GetDevRepoJobName', value: "GetDevBranchStep"), string(name: 'NetworkFoderToCopyTo', value: "NetworkFolderAddress")])
                }
            }
        }
    }
}
Provide the downstream jobs with ${NODE_NAME} as an additional parameter.
In the downstream job's agent section you can then use:
agent { label "${params.NODE_NAME}" }
(Meanwhile I have not found a way to inject the upstream job's parameters into the downstream job without actually inserting them one by one as input parameters.)
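A minimal sketch of the upstream side, assuming each downstream job declares a NODE_NAME string parameter: Jenkins sets env.NODE_NAME to the node the current build runs on, so forwarding it pins the downstream build to the same node:

build(job: "GetDevBranchStep", parameters: [
    string(name: 'DevBranchName', value: "${params.CloudDevBranch}"),
    string(name: 'NODE_NAME', value: env.NODE_NAME)  // same node as this build
])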