Pipeline within a pipeline? - Jenkins

I have three jobs on my jenkins server. One of them triggers the other two, and all three of them run one after another in a sequence. All are free style jobs.
Now, I want to convert all three jobs into pipeline jobs. So in my case, all three jobs will have their own separate pipelines and there would be an outer pipeline that will show me the three jobs running one after another. Is it possible at all to have a situation where the first job completes building in pipeline, then triggers the second job which runs in its own pipeline stages and then the third job also completes all the stages in its pipeline?
From the outside there would be a larger open pipeline : Job1->Job2->Job3
and on the inside there'd be smaller pipelines with each stage of each job like Clone->Build->Report Generation->.....
Please help.

At the end of the Job1 pipeline, when it succeeds, put this code:
build job: 'JOB_NAME_2', propagate: false, wait: false
Then at the end of the Job2 pipeline put this code:
build job: 'JOB_NAME_3', propagate: false, wait: false
Shape propagate and wait according to your needs.
To check the syntax, refer to the Pipeline Syntax generator in your Jenkins instance at
/pipeline-syntax/
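In a declarative pipeline, that trigger would typically live in a post { success { ... } } block, so Job2 only starts when Job1 succeeds. A minimal sketch, assuming placeholder job and stage names:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Job1 stages run here (Clone, Build, Report Generation, ...)'
            }
        }
    }
    post {
        success {
            // Fire-and-forget: don't wait for Job2 and don't let
            // its result affect this build's status.
            build job: 'JOB_NAME_2', propagate: false, wait: false
        }
    }
}
```

Job2 would carry the same pattern to trigger Job3, giving the outer Job1->Job2->Job3 chain while each job keeps its own inner stages.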

Related

Can I set up dependency builds in Jenkins similar to TeamCity?

I have not found information about this: I want to trigger a build, but before it executes, I want it to trigger some other pipelines, wait for them, and pass a file created by those other pipelines to the main pipeline. Can I do this in Jenkins?
You can, yes, in multiple ways depending on your actual use case.
The simplest way would be to create the job you want to call and then add a step for calling that job, copy artifacts from the job and then continue with the pipeline.
Using Jenkins Build step:
stage('Child job') {
    steps {
        build(job: 'foo', wait: true, propagate: true, parameters: [parameters_list])
    }
}
The parameter wait makes it so your pipeline waits for the child to complete run before continuing and propagate means that the result of the job is shown in the parent pipeline as well. The parameters section is needed only if your child job requires parameters. See the parameter types in the Build step documentation.
You can also use the Jenkins pipeline snippet generator to create the call properly in your own Jenkins instance.
To get any build artifacts from the child job, the easiest way to go is to use the Copy Artifact plugin. You'll need to archiveArtifacts in the job that produces them ('foo' here) and then, when the artifacts are needed, run:
copyArtifacts fingerprintArtifacts: true, projectName: 'foo', selector: lastCompleted()
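Putting the pieces together, here is a hedged sketch of the parent pipeline; the job name 'foo', the ENV parameter, and the artifact file name are placeholders:

```groovy
pipeline {
    agent any
    stages {
        stage('Child job') {
            steps {
                // Run the child job and fail this stage if it fails.
                build job: 'foo',
                      wait: true,
                      propagate: true,
                      parameters: [string(name: 'ENV', value: 'staging')]
            }
        }
        stage('Collect results') {
            steps {
                // Requires the Copy Artifact plugin; 'foo' must have
                // called archiveArtifacts on output.txt for this to work.
                copyArtifacts projectName: 'foo',
                              selector: lastCompleted(),
                              fingerprintArtifacts: true
                sh 'cat output.txt'
            }
        }
    }
}
```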

Jenkins pipeline: how to trigger another job and wait for it without using an extra agent/executor

I am trying to setup various Jenkins pipelines whose last stage is always to run some acceptance tests. To cut a long story short, acceptance tests and test data (much of which is shared) for all products are checked into the same repository which is about 0.5 GB in size. It therefore seemed best to have a separate job for the acceptance tests and trigger it with a "build" step from each pipeline with the appropriate arguments to run the relevant tests. (It is also sometimes useful to rerun these tests without rebuilding the product)
stage('AcceptanceTest') {
    steps {
        build job: 'run-tests', parameters: ..., wait: true
    }
}
So far I have seen that I can either:
- Trigger the job as normal. But this uses an extra agent/executor; there doesn't seem to be a way to tell it to reuse the one from the main pipeline's build. Both pipelines start with "agent { label 'master' }", but that seems to mean "allocate a new agent on a node matching master".
- Trigger the job with the "wait: false" argument. This doesn't block an executor, but it does mean I can't report the results of the tests in the main pipeline. It gives the impression that the test stage has always succeeded.
Is there a better way?
I seem to have solved this by adding "agent none" at the top of my main pipeline and moving "agent { label 'master' }" into the build stage. I can then leave my 'AcceptanceTest' stage without an agent and define the agent in the 'run-tests' job as before. I was under the impression from the docs that if you put agents in stages then all stages needed to have one, but it seems that is not the case. Which is lucky for this use case...
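A minimal sketch of that layout; the 'master' label and the 'run-tests' job name are taken from the question, the stage bodies are placeholders:

```groovy
pipeline {
    // No pipeline-level agent: stages that declare no agent
    // of their own don't hold an executor while they run.
    agent none
    stages {
        stage('Build') {
            agent { label 'master' }
            steps {
                echo 'Build the product here'
            }
        }
        stage('AcceptanceTest') {
            // No agent here: the 'run-tests' job allocates its own,
            // so this stage doesn't tie up a second executor while waiting.
            steps {
                build job: 'run-tests', wait: true
            }
        }
    }
}
```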
I don't think that there's another way for declarative pipeline.
On the other hand for scripted pipeline you could execute this outside of node {} and it would just hold onto one executor on master releasing the one on slave.
stage("some") {
    build job: 'test'
    node {
        // rest of the stage runs on an allocated executor
    }
}
Related question: Jenkins - Trigger another pipeline job in same machine - without creating new "Executor"

How to run jenkins job only once but with two conditions

Thanks for looking into my concern.
I have 3 Jenkins jobs: Job A, B & C.
Job A starts at 10 PM at night.
Job B is a downstream job of Job A and runs only if Job A succeeds.
Job C is a downstream job of Job B.
Now I want Job C to be triggered after successful completion of Job B, or at a scheduled time. The problem is that if I configure Job C both as a downstream job and with a schedule, it runs twice.
But it should run only once.
Please help me to achieve this.
Did you try the "Conditional BuildStep" plugin? You can execute a downstream job (or a script) based on the "Build cause".
You can add more than one "single" condition for each build cause.
You'll still need to decide when to run the job: as a timer or as a downstream build.
You can use the Jenkins Pipeline plugin. Create a pipeline job with stages; a pipeline proceeds to the next stage only if the previous stage succeeds. Refer to the documentation for more details on Pipeline.
Pipeline comes with a lot of flexibility in defining the flow. You can use either a declarative or a scripted pipeline. A good number of examples can be found here.
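For instance, the A->B->C chain could be collapsed into a single scheduled pipeline, so Job C runs exactly once per triggered chain. A sketch, assuming the existing jobs are named 'A', 'B', and 'C':

```groovy
pipeline {
    agent any
    // Replaces Job A's 10 PM schedule.
    triggers { cron('0 22 * * *') }
    stages {
        stage('Job A') {
            steps { build job: 'A' }
        }
        stage('Job B') {
            // Only reached if the previous stage succeeded.
            steps { build job: 'B' }
        }
        stage('Job C') {
            steps { build job: 'C' }
        }
    }
}
```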

Visualizing build steps in Jenkins pipeline

In my Jenkins pipeline, I trigger several other jobs using the build step and pass some parameters to it. I'm having issues visualizing the different jobs I've triggered in addition to my pipeline. I have set up the Jenkins Delivery Pipeline plugin but the documentation for it is extremely vague and I have only been able to visualize the steps within my pipeline, despite tagging the jobs with both a stage and task name.
Example:
I have two jobs in Jenkins as pipelines/workflow jobs with the following pipeline script:
Job Foo:
stage('Building') {
    println 'Triggering job'
    build 'Bar'
}
Job Bar:
node('master') {
    stage('Child job stage') {
        println 'Doing stuff in child job'
    }
}
When visualizing this with the Jenkins Delivery Pipeline plugin, I only get the stage from job Foo.
How do I make it also show the stage in job Bar in a separate box?
Unfortunately, this use case is currently not supported in the Delivery Pipeline plugin version 1.0.0. The delivery pipeline plugin views for Jenkins pipelines are only rendering what is contained within one pipeline definition at this point. This feature request is tracked in JENKINS-43679.

Jenkins Pipeline and Promotions

When having a build job with an implemented promotion cycle, i.e. Dev->QA->Performance->Production:
What is the correct way to migrate this cycle into a pipeline? It looks rather clean/structured to call each of the above-mentioned jobs. Yet, how can I query the build ID (to be able to call the deployment job)? Or have I totally misunderstood the pipeline concept?
You might consider multiple solutions:
Trigger each job sequentially
Just call each job sequentially using the build step:
node() {
    stage "Dev"
    build job: 'Dev'

    stage "QA"
    build job: 'QA'

    // Your other promotion cycles...
}
It is easy to use and will probably be already compliant with your actual solution, but I'm not a big fan of this solution because the actual output of your pipeline stages (Dev, QA, etc.) will really be in the dedicated job (Dev job, QA job) instead of being directly inside your pipeline. Your pipeline will be an empty shell just calling other jobs...
Call pipelines functions instead of jobs
Define a pipeline function for each of your promotion cycle (preferably in an external file) and then call each function sequentially. Example :
node {
    git 'http://urlToYourGit/projectContainingYourFunctions'
    cycles = load 'promotions-cycles.groovy'

    stage "Dev"
    cycles.dev()

    stage "QA"
    cycles.qa()

    // Your other promotion cycles calls...
}
The biggest advantage is that your promotion cycles' code is committed in your Git repository and that all your stage output is actually part of your pipeline output, which is great for easy debugging.
Plus you can easily apply conditions based on the success/failure of your functions (e.g. if your QA stage fails you don't want to go any further).
Note that both solutions should allow you to launch your promotions cycles in parallel if needed and to pass parameters to either your jobs or functions.
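A hedged sketch of what the loaded promotions-cycles.groovy file might look like; the function bodies here are placeholders for your real cycle logic:

```groovy
// promotions-cycles.groovy -- loaded from the pipeline with load()
def dev() {
    // Build and deploy to the Dev environment.
    echo 'Running Dev promotion cycle'
}

def qa() {
    // Deploy the Dev output to QA and run the test suite.
    echo 'Running QA promotion cycle'
}

// Must return 'this' so the calling pipeline can invoke
// the functions on the object returned by load().
return this
```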
It is better to call each build in separate pipeline stages. Something like this:
stage "Dev"
node {
    build job: 'Dev', parameters: [
        [$class: 'StringParameterValue', name: 'param', value: 'param'],
    ]
}

stage "QA"
node {
    build job: 'QA'
}

etc...
To repeat this process you can use the retry option or an endless loop in Groovy.
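For example, a scripted-pipeline retry around one promotion stage might look like this (the 'QA' job name is a placeholder):

```groovy
node {
    stage("QA") {
        // Re-run the QA job up to 3 times before failing the stage.
        retry(3) {
            build job: 'QA'
        }
    }
}
```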
