Each application has its own build job and deploy job. I want to create a dashboard Jenkins job that shows the builds from the different applications and lets me select which application to deploy, instead of going to each deploy job's page. Can I trigger multiple deploys from a single Jenkins job?
You'll have to create some Python and Groovy code for this one.
You can:
Divide jobs under different views.
Create a job with an Active Choices parameter in its configuration.
Write a Groovy script that populates the parameter by fetching jobs view-wise.
Write Python code that makes HTTP calls to the Jenkins REST API, using the parameters selected in Step 3, and executes the chosen jobs.
Basically, you create one executor job that lets you select job names and then executes them through the Jenkins API.
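Steps 1 and 4 above could be sketched in Python like this, using only the standard library. This is a minimal sketch: the Jenkins URL, view name, and job names are placeholders, and authentication (API token, CSRF crumb) is omitted.

```python
import urllib.request

def view_jobs_url(base, view):
    """Remote access API URL that lists the jobs of one view."""
    return f"{base}/view/{view}/api/json?tree=jobs[name]"

def build_url(base, job):
    """Remote access API URL that queues a build of one job."""
    return f"{base}/job/{job}/build"

def trigger_jobs(base, job_names, opener=urllib.request.urlopen):
    """POST a build request for every selected job; returns the URLs hit."""
    hit = []
    for name in job_names:
        req = urllib.request.Request(build_url(base, name), method="POST")
        opener(req)  # raises urllib.error.HTTPError on failure
        hit.append(req.full_url)
    return hit
```

The `opener` parameter is only there so the trigger logic can be exercised without a live Jenkins; in the executor job you would call `trigger_jobs` with the job names the user picked in the Active Choices parameter.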
I need to create a set of pipelines every month for the new version.
I want to set up a job that creates the Jenkins jobs from the Jenkinsfiles in the repo.
It should either be triggered or run automatically, creating a job for each Jenkinsfile in a folder or workspace,
so that I don't have to access the server whenever I need to set up a job.
This way I can create the jobs for the new version and the patch jobs.
Also new views, if needed. I can manage the dashboard, but creating that many similar jobs in a similar format is a waste of time and resources. Is there a way to automate it?
Is there a way/plugin to manipulate artifacts after the build on Jenkins?
Basically, from the build result page, you could click a "MANIPULATE" link or button and execute a script/command.
Thank you.
One way you could accomplish this is:
Create a RESTful web service on the box running Jenkins. The endpoint would contain the logic/script that you want to execute. If you edit the artifact on the Jenkins master directly, it will get updated in the UI.
Add some simple HTML to the build description that calls the RESTful endpoint you made in #1, for example via a link target in a button or an <a> tag. Note that you can also update the build description dynamically for each build using a plugin like Groovy Postbuild.
UPDATE
Note that if you don't want to run a web service on the master, you can instead invoke another job via the Jenkins remote access API and let that job do the update for you, e.g. with the Groovy Postbuild plugin. A Groovy Postbuild step runs on the Jenkins master, so it technically has access to the entire filesystem as well as the Jenkins API.
For example, you would:
Create a button that starts another job by hitting the URL "http://myjenkinsurl/job/myjob/buildWithParameters?param1=test". Depending on your security settings, you might need to specify a token to kick off jobs like this.
Inside that job, write a Groovy Postbuild script that updates the artifacts. For example, a small script like
'cmd /c copy C:\\Jenkins\\jobs\\myjob\\myartifact C:\\Backup'.execute() could be used to back up the artifact somewhere else (the cmd /c prefix is needed because copy is a cmd built-in, not a standalone executable).
I am creating a list of Jenkins jobs for sanity test of our Jenkins build environment. I want to create layers of jobs. The first layer of jobs will check the environment, e.g. if all slaves are up, the 2nd layer then can check the integration to other tools such as GitHub, TFS, SonarQube, then the 3rd layer can run some typical build projects. This sanity test can also be used to verify the environment after any major changes to the Jenkins servers.
We have about 10 slaves created on two servers, one Windows and one Linux. I know I can create a job to run on a specific slave, and therefore test whether that slave is online, but this way I would need to create 10 jobs just to test all the slaves. Is there a better approach to check that all slaves are online?
One option is to use Jenkins Groovy scripting for a task like this. The Groovy plugin provides the Jenkins Script Console (a useful way to experiment) and the ability to run groovy scripts as build steps. If you're going to use the Script Console for periodic maintenance, you'll also want the Scriptler plugin which allows you to manage the scripts that you run.
From Manage Jenkins -> Script Console, you can write a groovy script that iterates through the slaves and checks whether they are online:
// Iterate over all configured nodes and report their status
for (node in Jenkins.instance.nodes) {
    println "${node.name}, ${node.numExecutors}"
    def computer = node.computer
    println "Online: ${computer.online}, ${computer.connectTime} (${computer.offlineCauseReason})"
}
Once you have the basic checks worked out, you can create either a standalone script in Scriptler, or a special build to run these checks periodically.
It often takes some iteration to figure out the right set of properties to examine. As I describe in another answer, you can write functions to introspect the objects available to scripting. And so, with some trial and error, you can develop a script that performs the checks you want to run.
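If you would rather run the check from outside the script console, the same information is exposed by the remote access API under /computer/api/json. A minimal Python sketch, assuming an unauthenticated Jenkins at a placeholder URL:

```python
import json
import urllib.request

def offline_nodes(computer_api_json):
    """Names of nodes reported offline in /computer/api/json output."""
    data = json.loads(computer_api_json)
    return [c["displayName"] for c in data["computer"] if c["offline"]]

def check(base_url):
    """Raise if any node on the given Jenkins master is offline."""
    with urllib.request.urlopen(f"{base_url}/computer/api/json") as resp:
        down = offline_nodes(resp.read())
    if down:
        raise RuntimeError("offline slaves: " + ", ".join(down))
```

Because the parsing is separated from the HTTP call, `offline_nodes` can be tested against a captured JSON payload, and `check` can be wired into a cron job or a periodic Jenkins build.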
I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because Jenkins Template Project Plugin allows us to create a build step of a type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
In case you only want to trigger a new build of the job, you have plenty of ways to accomplish it.
You can use the remote access API and trigger a request to build the target job from the source job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Or you can use the Parameterized Remote Trigger plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin),
which is handy for handling server details and other configuration. You should ensure SSH keys are shared by both servers.
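The plain remote access API route could be sketched in Python as below. This assumes API-token basic auth is enabled on the remote server; the URL, job, user, and token are placeholders.

```python
import base64
import urllib.request

def trigger_remote(base, job, user, api_token, opener=urllib.request.urlopen):
    """POST to the remote job's build endpoint with HTTP basic auth."""
    req = urllib.request.Request(f"{base}/job/{job}/build", method="POST")
    cred = base64.b64encode(f"{user}:{api_token}".encode()).decode()
    req.add_header("Authorization", f"Basic {cred}")
    opener(req)  # raises urllib.error.HTTPError on failure
    return req.full_url
```

The `opener` hook is there only so the request construction can be verified without a live remote Jenkins.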
We are setting up a continuous delivery pipeline in Jenkins, using the build pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the then-deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a manually triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin's "Manually Execute Downstream Project" check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.b. multiple projects can be specified by using commas, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact, I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which, to my knowledge, is no different between a general project and a pipeline one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
As stated above by @rafal S:
Read a file that has the list of project names for which the build job has to be triggered, then, in a for loop over all the project names from the file, do a curl HTTP POST on JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN.