Trigger a list of external Jenkins jobs in parallel

joblist = ['A', 'B'] // A and B are the paths of the external jobs
for (job in joblist) {
    build job: "${job}", parameters: params
}
The script above triggers the jobs in series; I want to trigger them in parallel.
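A hedged sketch of fanning the same list out with the Pipeline `parallel` step (the job paths 'A' and 'B' and the `params` variable are placeholders from the question):

```groovy
// Sketch: build a map of branch name -> closure, then let the
// parallel step run all the closures at once.
def joblist = ['A', 'B'] // external job paths from the question

def branches = joblist.collectEntries { jobPath ->
    [(jobPath): { build job: jobPath, parameters: params }]
}
parallel branches
```

Each map entry becomes one parallel branch; `parallel` waits until every branch has finished before the calling job continues.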


Concatenated jobs in Jenkins

I have to solve the following problem:
I have a job A in Jenkins. In one of its stages, another job B is started with the "build job" command.
For job A to finish, it needs to wait for job B to finish.
The problem is that job B never starts because it is waiting for job A to finish.
All of my jobs use Pipeline scripts.
I expected that with the command "build job: 'my Job', propagate: true, wait: true" Jenkins would start job B and, after B completed, return to finish job A.
I'm assuming you have only one executor configured on your Jenkins server/slave, so multiple builds cannot run at the same time, which causes your deadlock.
To increase the executor count: if you are building only on the master, go to Dashboard > Manage Jenkins > Configure System and increase the executor count.
If you have slaves, go to Dashboard > Manage Jenkins > Manage Nodes and Clouds, click the edit icon on the slave, and set its executor count to more than 1.
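If adding executors is not an option, one known workaround in Scripted Pipeline is to call the build step outside any node block: steps outside node {} run on Jenkins's lightweight (flyweight) executor and do not hold a regular executor slot, so A can wait for B without deadlocking. A sketch, assuming the wait can be moved out of the workspace-bound part of job A:

```groovy
// Sketch (Scripted Pipeline for job A):
node {
    // ...build steps of job A that need a workspace and an executor...
}

// Outside node {}: waiting here runs on the flyweight executor and
// does not occupy the single heavyweight executor slot, so job B
// can start on it.
def result = build job: 'B', propagate: true, wait: true
echo "Job B finished with status ${result.result}"
```

This keeps the "start B, wait, then finish A" semantics from the question even on a single-executor controller.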

Trigger multiple builds parallel at one fire in Jenkins Pipeline

Context
I have a few Jenkins pipelines which use a regex to select the matching jobs to run. The number of downstream jobs for each pipeline is around 60.
So I have written declarative scripts that select the matching jobs and build them in parallel with the Jenkins build step. In general, it looks something like this:
Finish deploy to A --> Pipeline A (master node)
->> 60 downstream jobs (slave nodes)
Finish deploy to B --> Pipeline B (master node)
->> 60 downstream jobs (slave nodes)
Finish deploy to C --> Pipeline C (master node)
->> 60 downstream jobs (slave nodes)
The master node is the server used to run Jenkins.
The slave nodes are AWS EC2 instances used to run the tasks; my pool is around 32 servers, which can handle up to 180 tasks at once.
Expected
Let's say those pipelines are triggered in sequence: pipeline A, pipeline B, pipeline C. I then expect the downstream jobs to be queued in the same sequence: the first 60 jobs from A are scheduled and built, then pipeline B's, then pipeline C's, given that the slave executors are still free and available.
Observed
Even when the AWS EC2 instances are running no tasks (all available), the pipeline does not trigger all the jobs at once, but only part of them: a random number of jobs from pipeline A are built first, and only after a while are the rest built.
The pipeline script:
stage('Integration Test Run') {
    steps {
        script {
            matchingJobs = commonPipelineMethods.getTestJobs(venture_to_test, testAgainst)
            parallel matchingJobs.collectEntries { downstreamJob ->
                [downstreamJob.name, commonPipelineMethods.buildSingleJob(this, downstreamJob)]
            }
        }
    }
}
def buildSingleJob(steps, downstreamJob) {
    return {
        def result = steps.build job: downstreamJob.fullName, propagate: false
        steps.echo "${downstreamJob.fullName} finished: ${result.rawBuild.result}"
    }
}
So I'm not sure whether I need to configure anything, or change the pipeline script, to get those downstream jobs running in one burst.
Could anybody please look into this and give me some suggestions? Thanks.

How to trigger a Jenkins Pipeline job to use the same workspace

I have 2 pipeline jobs. JOB A & JOB B
I want to trigger JOB B to run on the same node and workspace of JOB A.
When I do so, Jenkins appends "#2" to the workspace of JOB B. For example:
JOB A workspace = /DATA/Jenkins_User/workspace/JOB-A
JOB B receives the correct workspace as a parameter,
but when I call ws(customWorkspace) the build runs in
/DATA/Jenkins_User/workspace/JOB-A#2
Any idea how I can remove the "#2" characters?

Get final result Multijob from several hosts (Jenkins)

I have a MultiJob Project with structure:
Master MultiJob Project (Job)
|----- Phase 1
|------> JOB A
|------> JOB D
|----- Phase 2
|------> JOB B
|------> JOB D
|----- Phase 3
|------> JOB C
The main job (Master MultiJob Project) runs on the master, but the other jobs can run on any free worker; the result of each job (A/B/C/D) must be sent back to the master MultiJob to collect and summarize the results of all jobs.
When all the jobs were on one host, I used:
ln -s $WORKSPACE/$REPORTSDIR
where $WORKSPACE is passed from the master MultiJob as a parameter, but if they are on different hosts I can't use this solution. What is the best way to solve this problem?
Wait for the sub-jobs to complete; they must keep the reports as artifacts.
From the master job, get the sub-jobs' build numbers and copy the artifacts from the finished sub-jobs.
This is, in general, what you have to do. But you should clarify what type of build you are using: a pipeline or a simple freestyle project? You might need to install the Copy Artifact Plugin.
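For a Pipeline master job, the copy step can be sketched with the Copy Artifact Plugin's copyArtifacts step; the job name 'JOB-A' and the 'reports/**' filter below are hypothetical placeholders:

```groovy
// Sketch: trigger a sub-job, then pull its archived reports back
// to the master job's workspace.
def resultA = build job: 'JOB-A', propagate: false, wait: true

// Copy artifacts from the exact build we just triggered,
// selected by its build number.
copyArtifacts projectName: 'JOB-A',
              selector: specific("${resultA.number}"),
              filter: 'reports/**'
```

Repeating this for each sub-job collects all reports on the master regardless of which worker each sub-job ran on.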

Jenkins Parallel Trigger and Wait

I have 4 jobs which needs to be executed in the following sequence
JOB A
|------> JOB B
|------> JOB C
|------> JOB D
In the above,
A should trigger B and C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins and couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin I haven't tried that would help me resolve this? Or can this be achieved in a different way? Please advise.
Use a DSL script with the Build Flow plugin.
Try this example for your execution:
build("job A")
parallel(
    { build("job B") },
    { build("job C") }
)
build("job D")
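The Build Flow plugin has since been deprecated; a hedged Scripted Pipeline sketch of the same layout (job names as in the question, with D triggered from C's branch so that A blocks until all three finish) would be:

```groovy
// Sketch: B and C run in parallel; D runs right after C inside C's
// branch. The parallel step does not return until both branches are
// done, so job A stays running for the full duration.
parallel(
    'B': { build job: 'job B' },
    'C': {
        build job: 'job C'
        build job: 'job D' // C in turn triggers D, per the question
    }
)
```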
Try the Locks and Latches plugin.
This may not be the optimal way, but it should work. Use the Parameterized Trigger Plugin. In Job A, add a build step (NOT a Post Build Action) that starts both Jobs B and C in the same build step AND blocks until they finish. In Job C, add a build step (NOT a Post Build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal though: Job A is held open waiting for B and C to finish. Then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C have started, but add a Promotion on Job A that executes your final actions once jobs B, C, and D are successful.
I am trying to build the same kind of system. I am building a certification pipeline where I need to run packager/build/deploy jobs and their corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger the release job that performs an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. Initially I tried the Parameterized Trigger Plugin with blocking builds. I could not set up artifact archiving/fingerprinting and the downstream build relationship that way, since archiving artifacts works only post-build. Then I put the Parameterized Trigger in a post-build action. This way I was able to set up downstream builds, fingerprinting, and aggregated test results, but build failures were not bubbling up the upstream job chain, and upstream jobs were non-blocking.
I was finally able to achieve this using these plugins-
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
System looks like this
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job -
Create it as a MultiJob and create a fingerprint file in a shell build step:
echo $(date +%s) > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
ARCHIVEDIR=$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive
mkdir -p $ARCHIVEDIR
cp fingerprint.txt $ARCHIVEDIR
Create a MultiJob phase consisting of the build/deploy job.
The build/deploy job is itself a MultiJob:
Follow the same fingerprinting steps as above when creating the build/deploy job.
Copy the fingerprint.txt artifact from the upstream job.
Set up a MultiJob phase in the deploy job that triggers the test job.
Create a new fingerprint file and force-archive it as in the step above.
Collect JUnit results in the final test job.
In the trigger job, use the Join Plugin to execute the release job by choosing 'Run Post Build Actions at join', and execute the release project only on a stable build of the trigger job.
This way all the steps show up in the Build Pipeline view, and the trigger job blocks until all downstream builds finish, setting its status to the worst downstream build result to give a decision point for the release job.
Multijob Plugin
Use it if you'd like to stop the mess of downstream/upstream job chain definitions, or when you want to define a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel. You can add context to your build flow by inheriting parameters from the MultiJob down to all its phases and jobs. Phases run sequentially, while the jobs inside each phase run in parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin
