Jenkins Workflow Plugin Link to Downstream Jobs

I've just started working with the Workflow plugin.
The set-up I have currently consists of a Workflow script that uses the build step to basically define a pipeline made up of multiple downstream jobs.
This is working well, but there isn't really any link between the output of the Workflow build and the output from all the downstream builds. Is there a way to either:
Link from the Workflow project build output to all the corresponding downstream builds.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
I'm hoping with either of these options it will be possible to see the output from the whole pipeline via the Workflow job output.
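The setup described above looks roughly like this; a minimal sketch, with made-up job names:

    // Workflow script chaining existing downstream jobs with the build step.
    // Each build step triggers a downstream job and waits for it to finish,
    // but its console output stays on that downstream build's own page.
    build 'compile'
    build 'unit-tests'
    build 'deploy'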

IMO, the intention of Workflow is to replace pipelines of various Jenkins jobs with just a single job. This may be why Workflow doesn't make any significant effort to link to downstream jobs. I've been converting my "pipelines" to monolithic Workflow jobs, and really appreciating the fact that all the actions are more tightly grouped together.
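For illustration, a rough sketch of such a monolithic conversion in the Workflow DSL of that era (the shell commands are made up):

    // One monolithic Workflow job replacing a chain of separate jobs;
    // every step's console output lands in this single build's log.
    node {
        stage 'Compile'
        sh 'make'
        stage 'Test'
        sh 'make test'
        stage 'Deploy'
        sh 'make deploy'
    }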

Link from the Workflow project build output to all the corresponding downstream builds.
PR 218 under review as of this writing.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
JENKINS-26124

Related

Jenkins build summary link to post-build build summary

I have a job that is triggered as a post-build action for dozens of other jobs. It essentially organizes and processes the artifacts of those upstream jobs (using the Copy Artifact plugin), and publishes the reformatted logs, along with the originals, as artifacts of its own.
I want the build summary page of an upstream job to have a link to that downstream job. From what I gather, this is not an intended use case. Conventional wisdom seems to be that if we want a link to a downstream job, we should run it as a sub-project within the Build step of the upstream job. But if I do that, I don't have the artifacts to pass to the downstream job. Catch-22.
Is there something (even something really hacky and nasty) I can do to make this work? People want to get the processed artifacts directly from the build page.
One way (and I think the only way) to do this would be to call the Jenkins API from the downstream job to put a link to itself in the upstream job's description. But this seemed like more work than it was worth, so we just didn't do anything, and we're all fine.
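For anyone who wants to try it anyway, here is a rough sketch of that approach, assuming the downstream build knows the upstream job name and build number (e.g. passed in as build parameters); it posts a link to itself into the upstream build's description via Jenkins' submitDescription endpoint. Authentication and CSRF crumb handling are omitted for brevity, and all names are placeholders:

    // Standalone Groovy sketch, run from the downstream job.
    // JENKINS_URL and BUILD_URL are standard Jenkins environment variables;
    // UPSTREAM_JOB and UPSTREAM_BUILD_NUMBER are assumed build parameters.
    def jenkinsUrl  = System.getenv('JENKINS_URL')           // ends with '/'
    def upstreamJob = System.getenv('UPSTREAM_JOB')
    def upstreamNum = System.getenv('UPSTREAM_BUILD_NUMBER')
    def selfUrl     = System.getenv('BUILD_URL')

    def desc = URLEncoder.encode("Processed artifacts: ${selfUrl}", 'UTF-8')
    def url  = new URL("${jenkinsUrl}job/${upstreamJob}/${upstreamNum}/submitDescription")
    def conn = url.openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.outputStream.withWriter { it << "description=${desc}" }
    println "Upstream description update returned HTTP ${conn.responseCode}"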

Configure Jenkins jobs depending on other jobs

Currently I have one big job for a big C++ project, which does everything: compiling, running unit tests, measuring coverage, building release binaries and creating docs.
As the job takes 40 minutes, I would like to split it into smaller jobs.
I want to use the following approach:
A main job every 15 minutes, which checks out the SCM, compiles the Debug configuration and runs basic unit tests
Several jobs for code analysis, coverage, integration tests, compiling Release builds and deployment to our application server, running once per night if the main job and each previous job were successful
I need the SVN revision, the build number and the workspace of the main job in all following jobs.
So far I was unable to achieve this.
The Parameterized Trigger plugin doesn't support triggering only once a day, the Build Trigger plugin doesn't support parameters, and the built-in trigger didn't work either.
I understand that pipelines would probably make my approach easier, but, for example, the CMake plugin I use won't support pipelines for a while.
Any other ideas or solutions?
You can configure all your downstream jobs as parameterized jobs (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Build) and trigger them as post-build jobs with this plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin.
As parameters you can pass any variables you need, such as the build number and the workspace path.
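For example, the trigger's Predefined parameters field could contain something like this (the parameter names are just examples; $SVN_REVISION, $BUILD_NUMBER and $WORKSPACE are standard upstream environment variables):

    SVN_REVISION=$SVN_REVISION
    UPSTREAM_BUILD_NUMBER=$BUILD_NUMBER
    UPSTREAM_WORKSPACE=$WORKSPACE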
Or just have a look at Jenkins Pipeline.
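In a Pipeline script the equivalent would be roughly this (job and parameter names are made up):

    // Trigger a parameterized downstream job, handing over the upstream
    // build number and the SVN revision from the checkout.
    build job: 'nightly-analysis', parameters: [
        string(name: 'UPSTREAM_BUILD_NUMBER', value: env.BUILD_NUMBER),
        string(name: 'SVN_REVISION', value: env.SVN_REVISION)
    ]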

Jenkins pipeline using upstream and downstream dependency

I had some standalone Jenkins jobs to build, package and deploy. Now I am connecting them, making the 'build' job trigger the 'package' job, and the 'package' job trigger the 'deploy' job, and I am passing the required parameters between them. I can also see them neatly in the pipeline view.
My question is: can this technically be called a pipeline? Or can I call it a pipeline only if I use the Pipeline plugin and write a Groovy script?
Thanks
p.s: Please do not downvote this question. It is a sincere question for which I am not able to find the right answer. I want to be technically correct.
In the Jenkins context, a pipeline is a job that defines a workflow using the pipeline DSL (here, based on Groovy). A pipeline aims to define a bunch of steps (e.g. build + package + deploy in your case) in a single place, and allows you to define a complex workflow (e.g. parallel steps, input steps, try/catch instructions) that can be both replayed and versioned (because it can be saved to git). For more information you should read the official Jenkins pipeline documentation, which explains in detail what a pipeline is.
The kind of jobs you are currently using are called freestyle jobs, and even if they do define a "flow" (by chaining jobs together), they cannot be called pipeline jobs.
In short, pipelines are jobs that use the Pipeline plugin and Groovy syntax to define the whole application lifecycle, while standard Jenkins 1.x jobs are called freestyle jobs.
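To make the distinction concrete, here is your build/package/deploy chain as a minimal scripted pipeline, using one of the workflow features mentioned above (the commands are placeholders):

    // Minimal scripted pipeline: the build -> package -> deploy chain as
    // stages of a single job instead of three chained freestyle jobs.
    node {
        stage('Build')   { sh 'mvn -B compile' }   // placeholder commands
        stage('Package') { sh 'mvn -B package' }
        stage('Deploy')  {
            input 'Deploy to production?'          // manual approval gate
            sh './deploy.sh'                       // placeholder script
        }
    }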

Creating tasks as part of a build in Jenkins

I currently have a project set up in Bamboo with various tasks attached to it: source code checkouts, scripts and commands.
I would like to transition over to Jenkins and perform the same actions, and found that Pipelines may be what I am looking for.
Can anyone provide me with a basic understanding of how I can create a Pipeline or freestyle job in Jenkins with sequential tasks that are executed as part of a build? A project in Bamboo has a plan with various tasks and I would like to replicate the same thing in Jenkins.
Thanks

How to get the URL of a pipeline job in Jenkins

We are setting up a continuous delivery pipeline in Jenkins, using the Build Pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the then-deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a Manually Triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin, Manually Execute Downstream Project check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.b. multiple projects can be specified with commas, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different for a pipeline job than for a general project.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
As stated above by @rafal S, do the following:
Read a file which has the list of project names for which a build has to be triggered, then do a curl HTTP POST on JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN within a for loop, where the loop iterates over all the project names read from the file.
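A sketch of that loop as a standalone Groovy script, assuming a projects.txt with one job name per line (the URL and token are placeholders, and authentication is omitted):

    // Standalone Groovy sketch: fire the remote build trigger for every
    // job listed in projects.txt (one job name per line).
    def jenkinsUrl = 'https://jenkins.example.com'   // placeholder
    def token      = 'TOKEN'                         // per-job trigger token

    new File('projects.txt').eachLine { jobName ->
        def url  = new URL("${jenkinsUrl}/job/${jobName}/build?token=${token}")
        def conn = url.openConnection()
        conn.requestMethod = 'POST'
        println "${jobName}: HTTP ${conn.responseCode}"
    }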
