I have Jobs A and B in Jenkins. How can I always execute a file from the most recent build of A in the build steps of Job B?

I have these Jenkins jobs A and B. Job A builds a bunch of files for my project. In Job B I want to execute a command that runs a file from the most recent build of Job A.
The execution works fine, but only because I have hard-coded the build number and am picking the file up from the files Jenkins stores in my C:\JenkinsData directory; I would rather have it called from the workspace instead.
See the image for clarification: [Jenkins build steps illustration]
For example, my last build right now is 70. I want to know how I can always execute those same files, but from the most recent build.
Or, if it is even better that way, can I execute those same files from Job A, since the built files are in its workspace?

You could get the number of the last build using this API and its "number" property:
/lastBuild/api/json
e.g.:
http://localhost:8080/job/yourJobName/lastBuild/api/json
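A minimal shell sketch of reading that value in a build step, assuming curl is available on the node (host, port, job name, and credentials are placeholders):
# Hypothetical sketch: read the "number" property of the last build over the REST API.
# Add -u user:apitoken to the curl call if your Jenkins requires authentication.
LAST_BUILD=$(curl -s "http://localhost:8080/job/yourJobName/lastBuild/api/json?tree=number" \
  | sed -n 's/.*"number":\([0-9]*\).*/\1/p')
echo "Most recent build of Job A: ${LAST_BUILD}"
With that number in hand you can construct the path to the file you want to execute.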
This could help:
Jenkins - Get last completed build status

Related

Previous Build Number in Jenkins

I have a batch job (not a pipeline job) in Jenkins where I am using a plugin called "Naginator", which checks whether the current build failed or is unstable. If the current build failed or is unstable, it immediately runs the next build to confirm the error is genuine. If the error is not genuine, that run is successful and the next build is scheduled to run periodically. Along with this, I use some CMD commands to move my data into another folder, with the build number as the folder name.
Now here is the issue: if the previous build was unstable, Naginator runs the next build, and if that build is stable I have to delete the previous unstable build's data from the folder manually. Is it possible to fetch the previous build number in Jenkins so that I can delete the files in an automated way, let's say with CMD commands in a .BAT file?
Jenkins has its own global variables. You can check them in your pipeline job -> Pipeline Syntax -> Global Variables Reference.
Additionally, check http://jenkins_url:port/env-vars.html
For your purpose, BUILD_NUMBER exists.
Just create a new variable from it, like this (shell syntax):
PREV_BUILD_NUMBER=$((BUILD_NUMBER - 1))
Excuse me if this piece of code does not work; I'm not good at scripting, it is just an example.
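As a rough sketch of the cleanup itself (shell syntax; the same idea works in a .BAT file with set /a and rd /s /q), assuming your data folders are named after the build number under a directory of your choosing:
# Hypothetical sketch: remove the data folder left behind by the previous build.
# DATA_DIR is a placeholder; BUILD_NUMBER is provided by Jenkins.
DATA_DIR="/path/to/build/data"
PREV_BUILD_NUMBER=$((BUILD_NUMBER - 1))
if [ -d "${DATA_DIR}/${PREV_BUILD_NUMBER}" ]; then
  rm -rf "${DATA_DIR}/${PREV_BUILD_NUMBER}"
fi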
UPDATE:
You can also find a list of related variables in the mentioned reference:
previousBuild
previousBuildInProgress
previousBuiltBuild
previousCompletedBuild
previousFailedBuild
previousNotFailedBuild
previousSuccessfulBuild

Get build number of triggering project in Jenkins

I configured a Jenkins project B to run when project A completes successfully.
How can I find the build number of A in the project B pipeline?
If you just need the last successful build of A, you can read it straight from Jenkins:
http://JenkinsMaster:Port/job/MyJob/lastSuccessfulBuild/buildNumber
If you need the build that triggered B, you can use the Parameterized Trigger Plugin and use:
TRIGGERED_BUILD_NUMBER_MyJob="Last build number triggered"
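A hypothetical shell-step sketch in project B showing both options (host, port, and job name are placeholders; the TRIGGERED_BUILD_NUMBER_* variable is only set when B was started by the Parameterized Trigger Plugin):
# Option 1: ask Jenkins for A's last successful build number over HTTP.
A_BUILD=$(curl -s "http://JenkinsMaster:Port/job/MyJob/lastSuccessfulBuild/buildNumber")
echo "Last successful build of MyJob: ${A_BUILD}"
# Option 2: use the variable exposed by the Parameterized Trigger Plugin.
echo "Build of MyJob that triggered this run: ${TRIGGERED_BUILD_NUMBER_MyJob}"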
You can do the following:
Use the Execute Windows batch command or Execute shell build step to store the build number in a file during the build of project A, e.g. from a Windows batch:
echo VARIABLEA=%BUILD_NUMBER% > %WORKSPACE%\myartifact.properties
Use the Archive the artifacts post-build step to store the file against that build in project A.
At the start of project B, use the Copy artifacts from another project build step, point it to project A, use the Artifacts to copy field to filter down to the file you created, and choose Last successful build for the Which build field.
Read the file in a shell script during the build of project B to pick up the build number.
If you output the artifact in the format:
VARIABLEA=${BUILD_NUMBER}
VARIABLEB=${BUILD_NUMBER}
and you're using Linux on the Jenkins server, you could use the source command to make VARIABLEA and VARIABLEB available in that shell session, e.g.:
source "${WORKSPACE}/myartifact.properties"
echo ${VARIABLEA}
You could then do something with that variable in the shell script.
Alternatively, you could simply use the Trigger parameterized build on other projects post-build step (which I believe requires the Parameterized Trigger Plugin) on project A and set up project B to accept those parameters.

Add Multiple Workspace Cleanup for Jenkins

Can I have multiple "Delete workspace when build is done" executions in a single job?
failure status: clear the whole workspace
success status: clear only the distribution package directories (**/target/dist)
We break our builds into compilation and test jobs, with the build-stalker plugin providing the link between the two. The compilation job doesn't clean up after itself because the test job will do so, but we don't run a test job for every compilation job (only the latest in a 4-hour period), leaving orphaned workspaces.
I'd like a way to make the orphaned workspaces have less impact, and a selective, status-based cleanup is one way to do this.
I'm not aware of a Jenkins plugin that supplies such a feature.
I'd establish the following:
Let each compile build append a line with its workspace path, e.g.:
.../jenkins/workspace/<...compile job...>/
to a file like:
.../jenkins/workspace/toBeWipedOut
using a script in a language of your choice (bash, Groovy, cmd, ...).
Create a job that periodically runs a script, in a language of your choice, that:
reads the file
deletes the workspaces mentioned therein except the last
removes all lines therein except the last
or
Let each compile build write a line with its job name.
See Wipe out workspaces of all jobs for how to wipe out workspaces by job name.
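A minimal sketch of that periodic cleanup script for the first approach, assuming GNU coreutils on the node and a list file at a placeholder path (each compile build would have appended its workspace with something like echo "${WORKSPACE}/" >> "${LIST}"):
# Hypothetical sketch: wipe every workspace recorded in the list except the newest.
LIST=/var/jenkins/workspace/toBeWipedOut   # placeholder path
if [ -f "${LIST}" ]; then
  # delete all listed workspaces except the last line (GNU head)
  head -n -1 "${LIST}" | while read -r ws; do
    rm -rf "${ws}"
  done
  # keep only the last line for the next run
  tail -n 1 "${LIST}" > "${LIST}.tmp" && mv "${LIST}.tmp" "${LIST}"
fi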

Jenkins - Upstream project dependency issue

Here is something I want to achieve:
I have a jenkins project which has 4 upstream projects. But I don't want to trigger this project when the upstream jobs are done building, but I want the trigger the project via remote API, which then waits on upstream projects until they are done building, if these projects are building.
Lets say all the 4 upstream projects can build the source code from any branch passed via API, but I want the downstream project to start only when a specific branch is passed to these upstream projects.
Scenario:
Lets say I have two clusters A and B, for the sake of this question, I want to deploy my code to cluster A, i.e front end and backend code. Now I have a project to build front end and 1 project to build backend (these two projects can build code for cluster A and B, based on the branch passed). Now, I have two deploy projects for cluster A which will deploy front end and backend. So, when I pass a branch to build code for cluster A, it will trigger the build projects. But now I only want these two deploy projects to start when this specific branch was passed.
If you want to control the builds remotely, then use the Jenkins CLI - I have found it very useful: http://jenkinshost:8080/cli
You need to get the SSH key configuration right: add the public key of the user running the CLI to the user you want to run the job as in Jenkins, using the Jenkins user configuration (not on the command line).
Test the key setup with:
java -jar jenkins-cli.jar -s http://jenkinshost:8080 who-am-i
This should then report which user will be used to run the build in Jenkins.
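Once that works, a hypothetical example of starting a parameterized job remotely from the CLI (job and parameter names are placeholders):
# Hypothetical example: trigger a parameterized build via the Jenkins CLI
# (add -s after the job name if you want the command to wait for completion).
java -jar jenkins-cli.jar -s http://jenkinshost:8080 build my-deploy-job -p branch=branchA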
But I think you can use the Conditional Build Step plugin for your problem
https://wiki.jenkins-ci.org/display/JENKINS/Conditional+BuildStep+Plugin
This will allow you to put a conditional wrapper around a build step i.e.
if branch==branchA then
trigger step - deploy to clusterA
if branch==branchB then
trigger step - deploy to clusterB
Personally I find this plugin a bit clunky and it makes the job config page a little messy
Another solution I came up with was to always call the child job and then let it decide if it runs.
So I have a script step at the start of the child job to see if it should run:
if [ "${branch}" = "Not the right branch name" ]; then
  echo "EXIT_GREEN"
  exit 1
fi
You have now failed this job, which would cause the parent job to go red, but by using the Groovy Postbuild plugin https://wiki.jenkins-ci.org/display/JENKINS/Groovy+Postbuild+Plugin you can add a post-build step like this:
if (manager.logContains(".*EXIT_GREEN.*")) {
  manager.addBadge("info.gif", "This job had nothing to do")
  manager.build.@result = hudson.model.Result.SUCCESS
}
The child job then runs green (with an info icon against the build) but has actually not done anything. Obviously, if the branch is one you want to deploy, the first script step does not hit the exit 1 and the job continues as normal.

Copying artifacts from multiple upstream jobs at join in Jenkins

Is it possible to have a Jenkins job which has been triggered by the Join plugin copy artifacts from multiple upstream jobs?
I'm trying to set up a Jenkins configuration with a "diamond" of jobs: my-trigger runs and spawns two jobs, my-fork1 and my-fork2, that can run concurrently and take varying amounts of time, and the Join plugin sets off the job my-join once both forks have completed.
Each of my-trigger, my-fork1 and my-fork2 creates and fingerprints artifacts (say, text files).
I want to copy the artifacts from each of the upstream jobs in my-join using the "Copy artifacts from another project" tool, with the "Which build" parameter set to "Upstream build that triggered this job". However, I see output like this in the console of my-join:
Building remotely on build-machine in workspace /path/to/workspace/my-join
Copied 1 artifact from "my-trigger" build number 63
Copied 1 artifact from "my-fork1" build number 63
Unable to find a build for artifact copy from: my-fork2
and the job fails. In this case, my-fork2 finished first, so my-fork1 triggered the join step. I believe that means my-join only has a record of my-fork1 and my-trigger as being upstream. If my-fork1 finishes first, then my-fork2 kicks off the join, and the job fails when trying to copy from my-fork1.
If I change the configuration to copy the artifact from "Latest successful build" then the build succeeds, but my-trigger may run many times in succession, so there would be no guarantee that my-join is joining related artifacts.
How can I get the join step to copy artifacts from multiple forks upstream?
Note: the second point of this question seems to be asking the same thing, but the only answer there doesn't address it, and has been accepted.
Thanks
tensorproduct
If your builds are parameterized with a unique parameter for each run of the join-diamond, you can use that parameter in the CopyArtifact plugin to determine which build to copy from. You would want to specify "Latest successful build" and qualify it with the parameter and value.
We have a similar situation where I work; multiple simultaneous runs of a join-diamond. The parameter in the build allows the downstream jobs to get the correct artifacts from the upstream jobs.
Step-by-step settings for the solution provided by Jason Swager:
Project dependencies:
diamond->fork->diamond_ready
Project "fork":
String parameter "UNIQUE_ID" (only dummy not used inside)
(Creates an artifcat and Archive the artifacts)
Project "diamond_ready"
String parameter: UNIQUE_ID
Copy artifacts from another project
Project name: fork
Parameter filters: UNIQUE_ID=${UNIQUE_ID}
Project "diamond":
Trigger parameterized build on other project
Projects to build: fork
Predefined parameters: UNIQUE_ID=${BUILD_TAG}
Join Trigger:
Post-Join Actions:
Trigger parameterized build on other projects
Projects to build: diamond_ready
Predefined Generator parameters: UNIQUE_ID=${BUILD_TAG}
