Spinnaker - Using the Parameter of Trigger Jenkins Build - jenkins

I'm trying to trigger a Jenkins build from Spinnaker and pull in the parameters that were used in the trigger for the build I'm triggering.
So pipeline A runs with params a:1, b:2, c:3.
I want this to trigger my pipeline B with params a:1, b:2, c:3, somehow retrieving those parameters from pipeline A.
Can Spinnaker do this? I'm not able to make any alterations to pipeline A, and it doesn't produce any artifact files I can make use of.

Related

How to get upstream build information in a script step of Jenkins classic UI pipeline

I have an old classic UI Jenkins pipeline. Now I need this pipeline to be triggered on the completion of other pipelines, and to get the upstream pipeline information in this old pipeline.
I know how to set the upstream build trigger in the Jenkins pipeline. However, I cannot find a way to get the upstream build information (e.g. project name, git commit).
When I output the environment variables in the downstream pipeline, I can only see BUILD_CAUSE=UPSTREAMTRIGGER, which is not useful for me.
Trigger Downstream Job With Parameters
The old job would need to be updated to be parameterised; then you can pass the required information as parameters when you build the downstream job.
Example:
build job: 'DOWNSTREAM_JOB_NAME',
      parameters: [string(name: 'upstreamJobName', value: env.JOB_NAME),
                   string(name: 'upstreamJobVar', value: "${upstreamJobVar}")]
Trigger Downstream Job Without Parameters
When parameters are not being sent from the triggering upstream job, we can still get some of the upstream information in the downstream job like this:
currentBuild.upstreamBuilds[0].projectName
All available methods for the upstreamBuilds information can be found in the RunWrapper documentation.
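A slightly fuller sketch of the same idea, assuming the downstream job is a Pipeline job (upstreamBuilds returns RunWrapper objects, so the property names here come from that interface):

```groovy
// Downstream Pipeline step: inspect the build(s) that triggered us.
// Sketch only -- upstreamBuilds is empty when the build was started manually.
def upstream = currentBuild.upstreamBuilds
if (upstream) {
    def cause = upstream[0]   // nearest upstream build
    echo "Triggered by ${cause.projectName} #${cause.number}"
    echo "Upstream result so far: ${cause.currentResult}"
} else {
    echo 'Not triggered by an upstream build'
}
```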

Jenkins - How to pass environmental variable from freestyle job to pipeline job

I have a freestyle job in Jenkins and would normally pass parameters to another freestyle job using the Predefined parameters option.
example:
PROJECT=Myproject
PATH=/depot/workspace/
Previously I could access the above values in the downstream job through the environment, using ${PROJECT} or ${PATH}.
My problem now is that I have a pipeline job that needs to access the above values, but using ${PROJECT} or ${PATH} does not work.
So, in general, how I want it to work is to have the freestyle job run first and pass the parameters to the downstream pipeline job.
You might need to use "${params.PROJECT}" in your pipeline to access the parameters.
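A minimal sketch of the downstream Pipeline side (parameter names match the example above; note that PATH is a risky name to reuse, since a parameter called PATH can shadow the shell's own PATH):

```groovy
// Declarative Pipeline sketch: declare the parameters the upstream
// freestyle job passes in, then read them via params.*
pipeline {
    agent any
    parameters {
        string(name: 'PROJECT', defaultValue: '', description: 'Set by upstream job')
    }
    stages {
        stage('Show parameters') {
            steps {
                echo "PROJECT is ${params.PROJECT}"
            }
        }
    }
}
```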

How to pass output from one pipeline to another in jenkins

I'm new to Jenkins and I've been tasked with the simple task of passing the output from one pipeline to another.
Let's say that the first pipeline has a script that says echo HelloWorld; how would I pass this output to another pipeline so it displays the same thing?
I've looked at parameterized triggers and a couple of other answers, but I was hoping someone could lay out the step-by-step procedure for me.
If you want to implement it purely with Jenkins Pipeline code, what I do is have an orchestrator pipeline job that builds all the pipeline jobs in my process, waits for them to complete, and then gets the build number:
Orchestrator job
def result = build job: 'jobA'
def buildNumber = result.getNumber()
echo "jobA build number : ${buildNumber}"
In each job like say 'jobA' I arrange to write the output to a known file (a properties file for example) which is then archived:
jobA
writeFile encoding: 'utf-8', file: 'results.properties', text: 'a=123\r\nb=foo'
archiveArtifacts 'results.properties'
Then after the build of each job like jobA, use the build number and use the Copy Artifacts plugin to get the file back into your orchestrator job and process it however you want:
Orchestrator job
step([$class : 'CopyArtifact',
filter : 'results.properties',
flatten : true,
projectName: 'jobA',
selector : [$class : 'SpecificBuildSelector',
buildNumber: buildNumber.toString()]])
You will find these plugins useful to look at:
Copy Artifact Plugin
Pipeline Utility Steps Plugin
If you are chaining jobs instead of using an orchestrator - say jobA builds jobB builds jobC etc - then you can use a similar method. CopyArtifacts can copy from the upstream job or you can pass parameters with the build number and name of the upstream job. I chose to use an orchestrator job after changing from chained jobs because I need some jobs to be built in parallel.
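Once CopyArtifact has placed results.properties in the orchestrator's workspace, the Pipeline Utility Steps plugin can parse it; a hedged sketch:

```groovy
// Orchestrator, after the CopyArtifact step: read the values jobA wrote.
def props = readProperties file: 'results.properties'
echo "a = ${props['a']}, b = ${props['b']}"   // e.g. a = 123, b = foo
```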

Jenkins Parallel Trigger and Wait

I have 4 jobs which needs to be executed in the following sequence
JOB A
|------> JOB B
|------> JOB C
|------> JOB D
In the above
A should trigger B & C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins but couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin I haven't tried that would help me resolve this? Or can this be achieved in a different way? Please advise.
Use a DSL script with the Build Flow plugin.
Try this example for your execution:
build("job A")
parallel
(
    { build("job B") },
    { build("job C") }
)
build("job D")
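Note that the Build Flow plugin has since been deprecated; a rough equivalent in a scripted Pipeline orchestrator job (job names as in the question) would be:

```groovy
// Scripted Pipeline sketch of the same flow: A, then B and C in
// parallel, then D. Each build() step waits for its downstream job.
node {
    build 'job A'
    parallel(
        B: { build 'job B' },
        C: { build 'job C' }
    )
    build 'job D'
}
```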
Try the Locks and Latches plugin.
This may not be the optimal way, but it should work. Use the Parameterized Trigger Plugin. To Job A, add a build step (NOT a Post Build Action) to start both Jobs B and C in the same build step AND block until they finish. In Job C, add a build step (NOT a Post Build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal, though: Job A is held open waiting for B and C to finish, then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C are started, but have a Promotion on Job A that will execute your final actions after jobs B, C and D are successful.
I am trying to build the same kind of system. I am building a certification pipeline where I need to run packager/build/deploy jobs and corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger the release job that can do an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. Initially I tried the Parameterized Trigger plugin with blocking builds. I could not set up artifact archiving/fingerprinting and the downstream build relationship this way, since archiving the artifacts works only in post-build. Then I put the Parameterized Trigger in the post-build activity. This way I was able to set up downstream builds, fingerprinting, and aggregated test results, but build failures were not bubbling up the upstream job chain and the upstream jobs were non-blocking.
I was finally able to achieve this using these plugins-
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
System looks like this
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job -
Create it as a MultiJob and create a fingerprint file in a shell exec step:
date +%s > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
ARCHIVEDIR=$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive
mkdir -p "$ARCHIVEDIR"
cp fingerprint.txt "$ARCHIVEDIR"
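Writing into $JENKINS_HOME directly is fragile; in a Pipeline job the same fingerprint-and-archive effect is available as built-in steps (hedged sketch):

```groovy
// Pipeline sketch: create the fingerprint file, then archive it
// mid-build with fingerprinting enabled.
node {
    sh 'date +%s > fingerprint.txt'
    archiveArtifacts artifacts: 'fingerprint.txt', fingerprint: true
}
```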
Create MultiJob Phase consisting of build/deploy job.
Build/deploy job is itself a MultiJob.
Follow the same steps for creating the build/deploy job as above, relative to fingerprinting.
Copy the fingerprint.txt artifact from the upstream job.
Set up a MultiJob phase in the deploy job that triggers the test job.
Create a new fingerprint file and force-archive it as in the step above.
Collect JUnit results in the final test job.
In the trigger job, use the Join plugin to execute the release job by choosing 'Run Post Build Actions at join', and execute the release project only on a stable build of the trigger job.
This way all the steps show up in the Build Pipeline view, and the trigger job blocks until all downstream builds finish, setting its status to the worst downstream build result to give a decision point for the release job.
Multijob Plugin
Use it if you'd like to stop the mess with downstream/upstream job chain definitions, or when you want to add a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel. Add context to your build flow by implementing parameter inheritance from the MultiJob down to all its phases and jobs. Phases are sequential, while jobs inside each phase are parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin

Can one Jenkins Trigger a job on a remote jenkins

I have 2 Jenkins hosts and would like the first Jenkins to trigger a job on the remote Jenkins based on a "SUCCESS" result on the first one.
I have looked at various plugins, but they all seem to assume ONE Jenkins host, where multiple jobs can be chained in this manner.
Meanwhile, a Jenkins plugin became available which makes this a lot easier:
https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin
It's very easy to do using cURL requests; no need for plugins or master/slave relations. It took me 5 minutes from beginning to end.
Use the following manual:
https://www.nczonline.net/blog/2015/10/triggering-jenkins-builds-by-url/
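From a Pipeline job on the first Jenkins, the same cURL trigger can be scripted; a sketch in which the host name, job name, token, and credentials ID are all placeholders:

```groovy
// Trigger a job on a remote Jenkins over its REST build endpoint.
// 'remote-jenkins-creds' is an assumed username/API-token credential.
node {
    withCredentials([usernamePassword(credentialsId: 'remote-jenkins-creds',
                                      usernameVariable: 'JUSER',
                                      passwordVariable: 'JTOKEN')]) {
        sh '''curl -fsS -X POST -u "$JUSER:$JTOKEN" \
  "https://remote-jenkins.example.com/job/jobB/build?token=MY_TRIGGER_TOKEN"'''
    }
}
```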
You could set up a downstream job on host1 that only builds if the first job on host1 succeeds.
In this job you would trigger a remote build much like I described in this answer.
Step 1: Install following plugins in both Jenkins.
Generic Webhook Trigger: Job can be triggered from http request.
HTTP Request Plugin: To send http request as build step
Any Build Step Plugin: To use any build step in post-build action.
Step 2: Configure the job to be triggered (Jenkins B).
Select Generic Webhook Trigger in the build triggers, then generate a token and paste it in.
After saving, this job can be triggered by sending an HTTP request to
http://JENKINS_B_URL/generic-webhook-trigger/invoke?token=TOKEN_VALUE
Step 3: In the master Jenkins (Jenkins A), configure the Flexible Publish settings in Configure System to allow using all build steps as post-build actions.
Step 4: In the post-build actions, add a "Flexible publish" step.
Using this, any build action can be used as a post-build action. Add an HTTP Request action.
Provide the Jenkins B webhook URL in the URL field and save.
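If Jenkins A is itself running Pipeline jobs, the HTTP Request plugin exposes a step that can call the webhook directly; a hedged sketch (URL and token are the placeholders from step 2):

```groovy
// Pipeline step from the HTTP Request plugin: POST to Jenkins B's
// Generic Webhook Trigger endpoint to start the remote job.
def resp = httpRequest httpMode: 'POST',
                       url: 'http://JENKINS_B_URL/generic-webhook-trigger/invoke?token=TOKEN_VALUE'
echo "Remote trigger responded with ${resp.status}"
```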
Yes. Configure your Jenkins nodes and label them, say master and slave (Manage Jenkins -> Manage Nodes).
1) Configure Job A and specify that it can only run on master ("Restrict where this project can be run" and in the label field put master).
2) Configure Job B so that it is only triggered if Job A is successful:
"Post-build Actions" -> "Trigger only if build succeeds"
3) Pin Job B to slave similar to step 1.
