Jenkins build summary link to post-build build summary

I have a job that is triggered as a post-build action for dozens of other jobs. It essentially organizes and processes the artifacts of those upstream jobs (using the Copy Artifact Plugin), and publishes the reformatted logs, along with the originals, as artifacts of its own.
I want the build summary pages of an upstream job to have a link to that downstream job. From what I gather, this is not an intended use case. Conventional wisdom seems to be that if we want a link to a downstream job, we should run it as a sub-project within the Build step of the upstream job. But if I do that, I don't have the artifacts to pass to the downstream job. Catch-22.
Is there something (even something really hacky and nasty) I can do to make this work? People want to get the processed artifacts directly from the build page.

One way (and I think the only way) to do this would be to call the Jenkins API from the downstream job to put a link to itself in the upstream job's description. But this seemed like more work than it was worth, so we just didn't do anything, and we're all fine.
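If someone did want to go that hacky route, a minimal sketch of the idea is below, written as a system Groovy build step in the downstream job (Groovy plugin assumed). The parameter names UPSTREAM_JOB and UPSTREAM_BUILD are hypothetical; whatever triggers this job would have to pass them in:

    // System Groovy build step (runs inside the Jenkins JVM; Groovy plugin assumed).
    // UPSTREAM_JOB and UPSTREAM_BUILD are hypothetical parameters that identify the
    // upstream build whose summary page should link back to this build.
    import jenkins.model.Jenkins

    def upstreamJobName = build.buildVariableResolver.resolve('UPSTREAM_JOB')
    def upstreamBuildNo = build.buildVariableResolver.resolve('UPSTREAM_BUILD') as int

    def upstreamJob   = Jenkins.instance.getItemByFullName(upstreamJobName, hudson.model.Job)
    def upstreamBuild = upstreamJob?.getBuildByNumber(upstreamBuildNo)

    if (upstreamBuild != null) {
        // Append a link to this (downstream) build to the upstream build's description,
        // which appears on the upstream build's summary page.
        def link    = "<a href='${build.absoluteUrl}'>processed artifacts</a>"
        def current = upstreamBuild.description ?: ''
        upstreamBuild.setDescription(current + (current ? '<br/>' : '') + link)
        upstreamBuild.save()
    }

Note that the HTML link only renders if the configured markup formatter allows HTML; with the default plain-text formatter a bare URL works just as well.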

Related

Jenkins and GitLab: How to set up an SCM-aware job which is not triggered by the hook?

To give some context, the question is about a GitLab and Jenkins setup.
I know how to set up a webhook, and I know how to set up a job to be triggered by the hook. The problem is that I need to have multiple jobs and only a single entry-point (parent job) trigger for them.
At the same time, the downstream jobs need to be aware of the git repo, so I have to set the repo URL for them. This causes them to be triggered independently by the hook, which I don't want, as it means they are triggered twice.
On the other hand, if I don't configure the repo URL on a downstream job and the parent job triggers it, it fails because it is not able to do a checkout.
I could hack around this with some 'execute shell' build step, but I believe that's not a valid way to go. Does anybody have a good tip on how to solve this?
For reference, here is the GitLab Jenkins plugin documentation, according to which:
Plugin will parse the GitLab payload and extract the branch for which
the commit is being pushed and changes made. It will then scan all Git
projects in Jenkins and start the build for those that:
match url of the GitLab repo
match the configured refspec pattern if any
and match committed GitLab branch
I tried playing around with different settings, without a great result though.
For the projects you want to be triggered only locally, just enable 'Don't trigger a build on commit notification' in the 'Additional Behaviours' section of the Git plugin
(https://github.com/elvanja/jenkins-gitlab-hook-plugin/issues/11#issuecomment-35385032, as you actually have discovered).
But a better solution could be to make your downstream jobs reference the repository locally cloned by the main job (not sure if that is actually possible), so the plugin will never consider them for scheduling a build, as the git URLs don't match.

Jenkins Workflow Plugin Link to Downstream Jobs

I've just started working with the Workflow plugin.
The set-up I have currently consists of a Workflow script that uses the build step to basically define a pipeline made up of multiple downstream jobs.
This is working well, but there isn't really any link between the output of the Workflow build and the output from all the downstream builds. Is there a way to either:
Link from the Workflow project build output to all the corresponding downstream builds.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
I'm hoping with either of these options it will be possible to see the output from the whole pipeline via the Workflow job output.
IMO, the intention of Workflow is to replace pipelines of various Jenkins jobs with just a single job. This may be why Workflow doesn't make any significant effort to link to downstream jobs. I've been converting my "pipelines" to monolithic Workflow jobs, and really appreciating the fact that all the actions are more tightly grouped together.
Link from the Workflow project build output to all the corresponding downstream builds.
PR 218 under review as of this writing.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
JENKINS-26124
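As an aside, a minimal sketch of the kind of monolithic Workflow script described above looks something like the following; the job names 'unit-tests' and 'deploy' are placeholders, and it assumes a version of the plugin where the build step returns a handle to the downstream run:

    // Minimal Workflow script sketch; 'unit-tests' and 'deploy' are placeholder job names.
    node {
        // Each 'build' step triggers a downstream job and waits for it to complete,
        // so the pipeline is defined (and summarized) in this single Workflow job.
        def unitTests = build job: 'unit-tests'
        echo "unit-tests -> ${unitTests.result} (${unitTests.absoluteUrl})"

        build job: 'deploy'
    }

This does not inline the downstream console output (that is what JENKINS-26124 is about), but it does give you one place from which every stage of the pipeline is reachable.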

Run a Jenkins job on another Jenkins instance from a Jenkins job

I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because the Jenkins Template Project Plugin allows us to create a build step of the type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
In case you only want to trigger a new build of the job, you have plenty of ways to accomplish it.
You can use the remote access API and trigger a request to build the target job from the source job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Or you can use https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin
which is handy for handling server details and other configuration. You should ensure SSH keys are shared by both servers.
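As a rough sketch, the remote access API call could be issued from a Groovy build step like this; REMOTE_JENKINS, JOB_NAME, TOKEN, user and apiToken are all placeholders for your own values:

    // Trigger a build of JOB_NAME on a remote Jenkins instance via the remote access API.
    // TOKEN is the "Trigger builds remotely" token configured on the remote job;
    // user/apiToken are credentials on the remote instance (assuming it requires auth).
    def url  = new URL('https://REMOTE_JENKINS/job/JOB_NAME/build?token=TOKEN')
    def conn = url.openConnection()
    conn.requestMethod = 'POST'
    conn.setRequestProperty('Authorization',
            'Basic ' + 'user:apiToken'.bytes.encodeBase64().toString())
    println "Remote Jenkins answered with HTTP ${conn.responseCode}"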

How to get URL of pipeline job in jenkins

We are setting up a continuous delivery pipeline in Jenkins, using the Build Pipeline Plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need to have an additional Jenkins step for acceptance tests on the then-deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a Manually Triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin, Manually Execute Downstream Project check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.B. multiple projects can be specified by using a comma, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different for a general project than for a pipeline one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
As stated above by rafal S, do the following:
Read a file which has the list of project names for which the build job has to be triggered, then do a curl HTTP POST on JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN within a for loop, where the for loop iterates over all project names from the file you read.
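A rough Groovy equivalent of that loop (instead of curl) might look like this; projects.txt, JENKINS_URL and TOKEN are placeholders:

    // Trigger every job listed in projects.txt (one job name per line) via the remote access API.
    new File('projects.txt').eachLine { jobName ->
        def name = jobName.trim()
        if (name) {
            def conn = new URL("https://JENKINS_URL/job/${name}/build?token=TOKEN").openConnection()
            conn.requestMethod = 'POST'
            println "${name}: HTTP ${conn.responseCode}"
        }
    }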

Can Jenkins get the BUILD_URL of a hand-run upstream job?

I have a job that uses the Copy Artifacts Plugin to upload a .ipa file to TestFlight. I would like to only run this job by hand, not trigger it automatically. The job is configured with a build-selector parameter so that I can start the upload from one of a handful of similar jobs.
Is there an easy way (possibly with a plugin or script) to get the URL to the specific job that provided the artifact being uploaded?
Essentially I want to take the $BUILD_URL value from the upstream job so that I can include it in the TestFlight build notes.
(I am not sure if it directly pertains to what I want, but Get Jenkins upstream jobs seems to suggest that Groovy scripting might be the way to go. I also found a post on the Jenkins forums, http://jenkins-ci.361315.n4.nabble.com/Getting-upstream-job-s-build-number-td3167291.html that looked promising, but does not seem to apply to my scenario of manually triggered builds.)
Unless you are "Triggering Parameterized build..." a downstream job, in which case you could pass along Predefined Parameters "UPSTREAM_BUILD_URL=$BUILD_URL", I think you would have to store the BUILD_URL with the artifacts.
