Storing Jenkins pipeline job metadata?

Is there a way to store some metadata from a Jenkins pipeline job? For example:
We have a Jenkinsfile which builds a Gradle project, creates a Docker image and pushes it to Google Cloud.
Then a "Subjob" is launched which runs integration tests (IT) on that Docker image. The Subjob receives a couple of parameters (one of them being the generated Docker image name).
Now sometimes that IT job fails, and I would like to re-run it from the main job view, so ideally:
we have a plugin which renders a custom button in the Blue Ocean UI on the main job
by clicking that button the Subjob is invoked again with the same parameters (the plugin queries the Jenkins API, gets the params of this job, and resubmits the Subjob).
The problem? How to get/set those parameters. I could not find a mechanism for that, except artifact storage. I could get away with that by creating a simple JSON/text file and uploading it as an artifact, then retrieving it in my plugin, but maybe there is a better way?
Stage restart is not coming to Scripted Pipelines, so that does not look like an option.

Maybe you can use the Jenkins API to get the details of the build?
https://your_jenkins_url.com/job/job_name/lastBuild/api/json?pretty=true
Instead of lastBuild you can also use the build number or one of lastStableBuild, lastSuccessfulBuild, lastFailedBuild, lastUnstableBuild, lastUnsuccessfulBuild, lastCompletedBuild
There is a parameters key there with all parameter names and values used in the build.
More details on https://your_jenkins_url.com/job/job_name/api/
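For instance, a minimal sketch of such a query, assuming Java 11+'s built-in HttpClient and user/API-token authentication; the Jenkins URL, job name and user:api_token pair below are placeholders:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    // Fetches the last build's JSON and prints it; the parameter names and
    // values appear under the "parameters" key in the response.
    // Jenkins URL, job name and user:api_token are placeholders.
    public class LastBuildParams {
        public static void main(String[] args) throws Exception {
            String jenkins = "https://your_jenkins_url.com";
            String job = "job_name";
            String auth = "Basic " + Base64.getEncoder()
                    .encodeToString("user:api_token".getBytes());

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(jenkins + "/job/" + job + "/lastBuild/api/json?pretty=true"))
                    .header("Authorization", auth)
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // Print the raw JSON; look for the "parameters" name/value pairs.
            System.out.println(response.body());
        }
    }

A plugin (or script) can read the parameter values out of that JSON and resubmit the Subjob with them.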
Also, any reason you can't use the replay button in the IT job?

Related

How to Send Gradle Parameters TO Jenkins?

I’ve seen many articles about sending Jenkins parameters to Gradle, where they are passed as system variables, but I have not come across anything about doing the opposite. In my build.gradle file I have specified 2 parameters for a file name and a version number. In order to pass this file name to the next job in the pipeline, I’d like to send the parameters to the next Jenkins job so it can make use of the file.
Is sending Gradle params back to Jenkins possible using built-in functionality?

How to copy parameters from one pipeline to another without copying entire pipeline?

On our team, only a few people have Jenkins access to perform admin operations, as it is a production Jenkins server which developers continuously use for builds.
Sometimes I have to enhance a pipeline or fix pipeline issues. For that, the admin has created one pipeline for me so I can add code there and test it. I am supposed to use only that pipeline to test anything.
But I test different pipelines, and each pipeline has a different parameter list. In this case, I have to add the parameters one by one, copying all the details of each parameter such as the Groovy script, default value etc., which takes a lot of time.
Is there any way/plugin with which we can copy only the parameters from one pipeline to another?
I think you should know that each job has a config.xml which represents the job configuration. You can get it at <job_url>/config.xml.
Get the config.xml of the job you want to debug, then extract the XML block for the job parameters from it.
Prepare an empty skeleton config.xml and inject the job parameters' XML block into it.
Call the Jenkins REST API to update/save the config.xml to your debug job; your debug job then has the target job's params.
You can write a script that implements the above 3 steps.
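For instance, a rough sketch of such a script, assuming Java 11+'s HttpClient and user/API-token authentication; the job names, Jenkins URL and credentials are placeholders, the parameter definitions are assumed to sit in a <hudson.model.ParametersDefinitionProperty> block, and as a small variation it injects the block into the debug job's existing config.xml instead of a prepared empty one:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    // Copies the parameter definitions from one job's config.xml into another's.
    // Job names, the Jenkins URL and user:api_token are placeholders; the debug
    // job's config.xml is assumed to contain a non-empty <properties> element.
    public class CopyJobParameters {
        static final String JENKINS = "https://your_jenkins_url.com";
        static final String AUTH = "Basic " + Base64.getEncoder()
                .encodeToString("user:api_token".getBytes());
        static final HttpClient CLIENT = HttpClient.newHttpClient();

        public static void main(String[] args) throws Exception {
            // Step 1: get the config.xml of the job whose parameters you want.
            String sourceXml = getConfig("target_job");

            // Step 2: extract the job parameters' XML block.
            String open = "<hudson.model.ParametersDefinitionProperty>";
            String close = "</hudson.model.ParametersDefinitionProperty>";
            String paramsBlock = sourceXml.substring(
                    sourceXml.indexOf(open),
                    sourceXml.indexOf(close) + close.length());

            // Step 3: inject the block into the debug job's config.xml
            // (right after its <properties> tag) and save it back via the REST API.
            String debugXml = getConfig("debug_job");
            int at = debugXml.indexOf("<properties>") + "<properties>".length();
            postConfig("debug_job",
                    debugXml.substring(0, at) + paramsBlock + debugXml.substring(at));
        }

        static String getConfig(String job) throws Exception {
            HttpRequest req = HttpRequest.newBuilder()
                    .uri(URI.create(JENKINS + "/job/" + job + "/config.xml"))
                    .header("Authorization", AUTH).GET().build();
            return CLIENT.send(req, HttpResponse.BodyHandlers.ofString()).body();
        }

        static void postConfig(String job, String xml) throws Exception {
            HttpRequest req = HttpRequest.newBuilder()
                    .uri(URI.create(JENKINS + "/job/" + job + "/config.xml"))
                    .header("Authorization", AUTH)
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(xml))
                    .build();
            CLIENT.send(req, HttpResponse.BodyHandlers.ofString());
        }
    }

Note that if the debug job already has a parameters block, you would want to replace it rather than add a second one.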

How to programmatically generate config.xml from Jenkinsfile?

Jenkins has the ability to upload new jobs via its REST API. Those new jobs require an XML document which, to the best of my searching, has no schema available.
When creating jobs as part of an SCM repo, you can include a Jenkinsfile and it automatically gets translated into a job with the config.xml filled out.
I tried creating a minimal config.xml and including the Jenkinsfile content in the <script>…</script> section of the xml file. This works for trivial jobs, but does not work for jobs that have parameters: The job gets uploaded as a parameterless job. The first time you trigger a build of the job, it fails - but then the job turns into a job-with-parameters, and can properly be built.
How do I convert a Jenkinsfile, possibly with parameters or other "advanced" features, into a working config.xml file on the first try? Or, alternatively, is it possible to directly upload the Jenkinsfile to the Jenkins REST API to create the job?

Jenkins plugin code that should execute before any kind of job is run in Jenkins

I am new to Jenkins plugin development. I'm trying to write a plugin that should be executed before any multi-configuration type job runs in Jenkins.
In this plugin I want to write rules that check which configuration parameters the user has selected while submitting the job; based on the selected parameters, I want to decide whether to allow the job to run or to restrict it.
The user should be shown the reason why the job cannot be run in the Console Output.
Does anyone have any ideas on which class I need to extend or which interface I need to implement in order to get a hook into the Jenkins job run?
You could look at the Matrix Execution Strategy plugin, which allows a Groovy script to select which matrix combinations to run. I would think that if your script threw an exception it would stop the build.
For background, multi-configuration projects run a control job (or flyweight build) which runs the SCM phase and then starts all the actual combinations. This plugin runs after the flyweight SCM checkout.
If nothing else, this will give you a working plugin to start from.
Disclaimer: I wrote this plugin
The Blocked queue job plugin was what I needed.
Out of the box, that plugin supports two ways to block jobs:
Based on the result of the last run of another project.
Based on the result of the last run of the current project.
In that plugin, BlockQueueItemTaskDispatcher.java extends Jenkins' QueueTaskDispatcher, providing a hook into Jenkins' logic to allow or block the jobs present in the queue from running.
I used this plugin as a starting point for developing a new plugin that allows us to restrict a project based on the selected parameters and the current time. The ultimate goal is to prevent production migrations from running during the day.
Overriding the isBlocked() method of QueueTaskDispatcher gave me access to a hudson.model.Queue.Item instance as an argument. I then used the Item instance's getParams method to access the build parameters selected by the user at runtime, parsed the lifecycle value from them, and checked the current time. If the lifecycle was Production and the current time was during the day, I restricted the job by returning a non-null CauseOfBlockage from isBlocked(); if that condition was false, I returned a null CauseOfBlockage, allowing the queued job to run.
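A rough sketch of that idea, written against the stock QueueTaskDispatcher extension point (whose hook is canRun(Queue.Item) rather than an isBlocked() method) and reading the parameters through ParametersAction instead of getParams(); the "lifecycle" parameter name and the 09:00-17:00 daytime window are only illustrative assumptions:

    import hudson.Extension;
    import hudson.model.ParameterValue;
    import hudson.model.ParametersAction;
    import hudson.model.Queue;
    import hudson.model.queue.CauseOfBlockage;
    import hudson.model.queue.QueueTaskDispatcher;
    import java.time.LocalTime;

    // Blocks queued "Production" builds during the day. The parameter name
    // "lifecycle" and the 09:00-17:00 window are illustrative assumptions.
    @Extension
    public class ProductionWindowDispatcher extends QueueTaskDispatcher {

        @Override
        public CauseOfBlockage canRun(Queue.Item item) {
            // Read the build parameters selected by the user at runtime.
            ParametersAction action = item.getAction(ParametersAction.class);
            ParameterValue lifecycle =
                    action == null ? null : action.getParameter("lifecycle");
            boolean production = lifecycle != null
                    && "Production".equals(String.valueOf(lifecycle.getValue()));

            LocalTime now = LocalTime.now();
            boolean daytime = now.isAfter(LocalTime.of(9, 0))
                    && now.isBefore(LocalTime.of(17, 0));

            if (production && daytime) {
                // A non-null CauseOfBlockage keeps the item waiting in the queue.
                return new CauseOfBlockage() {
                    @Override
                    public String getShortDescription() {
                        return "Production migrations are blocked during the day";
                    }
                };
            }
            return null; // null allows the queued job to run
        }
    }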

How to call a Jenkins job based on user inputs

The issue here is that once the first job in a Jenkins pipeline is done, we need to ask for some inputs from the user and, based on those inputs, decide the next job to be triggered (Job2 or Job3).
I tried the Build Flow and Parameterized Trigger plugins but didn't find any suitable option in them.
Is there any other plugin or Jenkins feature which can help in achieving the above scenario?
There are a few plugins I have tried which collect user input on manually triggered jobs in a build pipeline: Active Choices Plug-in 1.2 and Extensible Choice Parameter 1.3.2.
With Active Choices you define a list of selections and a default value. With Extensible Choice Parameter you can have a text box and a default value.
This is how they work for me in Build Pipeline 1.4.8 on Jenkins 1.628:
If you run the manual step directly in the pipeline, the default is used and other parameters propagate through correctly.
If you open the step, there is an option to Build with Parameters which will ask for the user input. This works, but other parameters like the build number do not propagate through, so the pipeline is broken and the pipeline screen does not show the status.
Jenkins will never pause and ask a user for inputs. It is an automated build system. It doesn't expect anyone sitting at the console watching the progress.
You can provide "inputs" or parameters when you manually trigger the job, i.e on the first job in your pipeline. You can them pass these parameters to downstream jobs, either through the Parameterized Trigger plugin or through a file copied between jobs.
If you need a human decision in the middle of your build flow, consider Promoted Builds plugin. With this plugin, a human can select a build, and then decide which "Promotion" to execute (which could branch your workflow as you need). The promotions can also be automated if needed, based on criteria and not human input.
