I am trying to combine the nice branch handling of Workflow Multibranch with the powerful job generation of the Job DSL plugin. So basically I want each branch to regenerate its jobs from a script in the repository and then run the main one.
But I don't see a way to run the "Process Job DSLs" step from a Workflow script. Maybe there is a built-in way to execute custom steps in Workflow, but I just can't find it.
You could create a separate job that processes the job-dsl, and then call it with the proper parameters from the workflow via a "build job: xxx" step.
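For example, something along these lines in the Workflow script would hand the branch over to such a seed job and then run the job it generated; the job and parameter names here are just placeholders:

```groovy
// hand the current branch to a separate seed job that runs "Process Job DSLs"
// (job and parameter names are made up for illustration)
build job: 'job-dsl-seed',
      parameters: [string(name: 'BRANCH_NAME', value: env.BRANCH_NAME)]

// then run whichever job the seed job (re)generated for this branch
build job: "generated/${env.BRANCH_NAME}/main-build"
```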
Not quite sure where you are going with this, but perhaps what you really want is multibranch binding for Job DSL, or to manually iterate branches.
Alternatively, with Workflow alone you can probably accomplish your goal, whatever that is.
It seems that the jobDsl step can be used in a pipeline.
Have a look at the Snippet Generator to generate some code:
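For illustration, the generated code might look roughly like this (the path to the DSL scripts inside the repository is an assumption):

```groovy
node {
    // get the Job DSL scripts that live in the repository
    checkout scm

    // let the Job DSL plugin process them
    jobDsl targets: 'jobs/*.groovy',
           removedJobAction: 'DELETE',
           removedViewAction: 'DELETE'
}
```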
Short: Can I set the name of a Jenkins job created automatically by a Multibranch Pipeline job to something calculated in the job itself?
Long:
We do a lot of microservices with mostly identical build processes, and we would like to have as little hassle building and testing them as we can.
To that end, I am considering Jenkins Multibranch Pipeline jobs, where I could just add another project repository and have it built, with new jobs being created automatically for new branches that contain Jenkinsfiles. That would also make the new jobs appear on the build monitor. And here is where the problems start.
I would like to see the name of the project on the build monitor cells, rather than something like my_multibranch_pipeline_t » temp_branch_one. However, I couldn't find a way to set the JOB_NAME to anything.
Am I missing something?
I really don't know any way to set the job name from the Jenkinsfile. However, we solve the same issue you describe using seed jobs. These are basically freestyle jobs executing Jenkins Job DSL, which is able to define as many jobs as you like.
We are using a map of service names mapped to their Git URLs and iterate over that using Groovy's each.
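Roughly, such a seed job script could look like this; the service names and Git URLs are placeholders, and the displayName is what ends up on the build monitor:

```groovy
// map of service names to their Git URLs (placeholders)
def services = [
    'order-service'  : 'git@git.example.com:team/order-service.git',
    'billing-service': 'git@git.example.com:team/billing-service.git',
]

services.each { name, gitUrl ->
    multibranchPipelineJob(name) {
        displayName(name)              // the name shown on the build monitor
        branchSources {
            git {
                remote(gitUrl)
            }
        }
    }
}
```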
I am trying to upgrade my current regression infrastructure to use the Pipeline plugin, and I realize there are two flavours: scripted pipelines and declarative pipelines. Going through multiple articles, I get the impression that declarative pipelines are more future-proof and more powerful, hence I am inclined to use them. But there seem to be the following restrictions, which I don't want in my setup:
The Jenkinsfile needs to be in the repository. I don't want to keep my Jenkinsfile in the code repository.
Since the Jenkinsfile needs to be in SCM, does that mean I cannot test any modification to the file until I check it in to the repository?
Any details on the above will be very helpful.
Declarative pipelines are compiled to scripted ones, so those will definitely not go away. But declarative ones are a bit easier to handle, so you should be fine.
You don't have to check a Jenkinsfile into VCS. You can also set up a job of type Pipeline and define it there. But this has the usual disadvantages like no history etc.
When using multi-branch pipelines, i.e., every branch containing a Jenkinsfile gets its own job, you just push your changed pipeline to a new branch and execute it there. Once it's done, you merge it.
This approach certainly makes the feedback cycle a bit longer, but it just applies the same principles as when writing your software. For experimentation, just set up a Pipeline-type job and play around. Afterwards, commit it to a branch, test it, review it, merge it.
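For example, a throwaway declarative pipeline to play around with can be as small as this (the stage contents are placeholders):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'build goes here'
            }
        }
        stage('Test') {
            steps {
                echo 'tests go here'
            }
        }
    }
}
```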
You can use the Pipeline Multibranch Defaults Plugin for that. It allows you to define the Jenkinsfile in the web UI itself (via the Config File Provider plugin) and then reference that file from a Multibranch Pipeline.
I am new to Jenkins. I need to execute one job that runs multiple other jobs in parallel, and it should not stop even if one of those jobs fails.
I am not sure how to achieve this. After googling, I found several possible ways: the Multi-Job plugin, a pipeline that triggers multiple Jenkins jobs, "Build after other projects", and the Build Flow plugin.
Can anybody please point me to the correct way?
Update: I am trying to achieve this using the Pipeline plugin. Can anybody tell me whether that is the right choice? Please suggest!
We use the Parameterized Trigger Plugin to do this.
In your build configuration, add a "Trigger/call builds on other projects" build step. Add the names of the builds you want to trigger as a comma-separated list and make sure that the "Block until the triggered projects finish their builds" box is unchecked. Your build will trigger each of the listed builds; note, however, that your parent build won't wait for them to finish, it will just trigger them and then carry on with the rest of its build steps.
If you do want to wait, check the "block until the triggered builds finish" box, and set the options for whether to fail the build or the build step, or to mark the build as unstable, as appropriate.
If you need to pass parameters to the jobs, you can add them with this plugin as well. If your downstream jobs need different parameters, click the "Add trigger" button, which adds another project to build for which you can specify different options.
If these other jobs are follow-up jobs to the current job and you don't need to wait for them to finish, you can also achieve what you want with the post-build action "Build other projects", but again this runs after the current job and you won't be able to use its results.
Can anybody please point me to the correct way?
I wouldn't approach Jenkins with a "one correct way" mentality. More often than not, the requirements of your build will dictate which method or plugin you use in your build configurations.
The job can start other jobs via the Jenkins API.
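For example, a build step could POST to the downstream job's buildWithParameters endpoint. This is only a sketch, and the URL, job name, parameter and credentials are made up (depending on your security setup, a CSRF crumb may also be required):

```groovy
// trigger another job through the Jenkins remote API (all values are placeholders)
def url = new URL('https://jenkins.example.com/job/downstream-job/buildWithParameters?TARGET_ENV=staging')
def conn = url.openConnection()
conn.requestMethod = 'POST'
conn.setRequestProperty('Authorization',
        'Basic ' + 'user:apitoken'.bytes.encodeBase64().toString())
println conn.responseCode   // Jenkins answers 201 once the build has been queued
```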
Updated answer: I used the Pipeline plugin to achieve my task, and tuffwer was right too if you have the Parameterized Trigger plugin!
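For reference, the core of such a pipeline can be as small as this: the parallel step runs the downstream builds side by side, and propagate: false keeps a failing job from aborting or failing the rest (the job names are placeholders):

```groovy
parallel(
    'job A': { build job: 'job-a', propagate: false },
    'job B': { build job: 'job-b', propagate: false },
    'job C': { build job: 'job-c', propagate: false },
    failFast: false
)
```

If you still want the parent build to reflect failures, the build step returns the downstream run, so you can inspect its result and, for example, mark the parent build unstable.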
I want to automate the process of creating Jenkins jobs and want to trigger a script that will automatically create a job with a certain set of required parameters. One way I explored is using the Job DSL and Parameterized Trigger plugins.
As far as I can tell:
Parameterized Trigger Plugin: used to trigger new builds when one build has completed, with various ways of specifying parameters for the new build.
Job DSL Plugin: allows the programmatic creation of projects using a DSL.
My requirement is to write a script that creates a job with parameters such as string or choice parameters, including the Source Code Management configuration and the build trigger specification.
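To make it concrete, I am imagining a Job DSL script roughly like this; the job name, repository URL, credentials id, parameters and cron spec are just placeholders:

```groovy
job('my-generated-job') {
    parameters {
        stringParam('VERSION', '1.0.0', 'Version to build')
        choiceParam('ENVIRONMENT', ['dev', 'qa', 'prod'], 'Target environment')
    }
    scm {
        git {
            remote {
                url('git@git.example.com:team/my-app.git')
                credentials('git-credentials-id')
            }
            branch('*/master')
        }
    }
    triggers {
        scm('H/15 * * * *')   // poll SCM roughly every 15 minutes
    }
    steps {
        shell('./build.sh "$VERSION" "$ENVIRONMENT"')
    }
}
```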
Any input is appreciated :)
I have a fairly complicated Jenkins job that builds, unit tests and packages a web application. Depending on the situation, I would like to do different things once this job completes. I have not found a re-usable/maintainable way to do this. Is that really the case or am I missing something?
The options I would like to have once my complicated job completes:
Do nothing
Start my low-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to production
Start my high-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to test
run acceptance tests
deploy to production
I have not found an easy way to do this. The simplest, but not very maintainable, approach would be to make three separate jobs, each of which kicks off a downstream build. This approach scares me for a few reasons, including the fact that changes would have to be made in three places instead of one. In addition, many of the downstream jobs are also nearly identical; the only difference is which downstream jobs they call. The proliferation of jobs seems like it would lead to an unmaintainable mess.
I have looked at using several approaches to keep this as one job, but none have worked so far:
Make the job a multi-configuration project (https://wiki.jenkins-ci.org/display/JENKINS/Building+a+matrix+project). This provides a way to inject the job with a parameter. I have not found a way to make the "build other projects" step respond to a parameter.
Use the Parameterized-Trigger plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin). This plugin lets you trigger downstream-jobs based on certain triggers. The triggers appear to be too restrictive though. They're all based on the state of the build, not arbitrary variables. I don't see any option provided here that would work for my use case.
Use the Flexible Publish plugin (https://wiki.jenkins-ci.org/display/JENKINS/Flexible+Publish+Plugin). This plugin has the opposite problem as the parameterized-trigger plugin. It has many useful conditions it can check, but it doesn't look like it can start building another project. Its actions are limited to publishing type activities.
Use Flexible Publish + Any Build Step plugin (https://wiki.jenkins-ci.org/display/JENKINS/Any+Build+Step+Plugin). The Any Build Step plugin allows making any build action available to the Flexible Publish plugin. While more actions were made available once this plugin was activated, those actions didn't include "build other projects."
Is there really not an easy way to do this? I'm surprised that I haven't found it, and even more surprised that I haven't really seen anyone else trying to do this. Am I doing something unusual? Is there something obvious that I am missing?
If I understood it correctly, you should be able to do this by following these steps:
First Build Step:
Does the regular work, in your case building, unit testing and packaging the web application.
Depending on the result, let it create a file with a specific name.
This means: if you want the low-risk pipeline to run afterwards, create a file named low-risk.prop.
Second Build Step:
Create a "Trigger/call builds on other projects" step from the Parameterized Trigger plugin.
Enter the name of your low-risk job into the "Projects to build" field
Click on: Add Parameter
Choose: Parameters from properties file
Enter low-risk.prop into the "Use properties from file" field
Enable Don't trigger if any files are missing
Third Build Step:
Check if a low-risk.prop file exists
Delete the file
Do the same for the high-risk job
Now you should have the following setup:
if a file called low-risk.prop appears during the first build step, the low-risk job will be started
if a file called high-risk.prop appears during the first build step, the high-risk job will be started
if there is no .prop file, nothing happens
And that's what you wanted to achieve, isn't it?
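As a side note, if you ever move the build to a Pipeline job, the same branching can be written directly in the script instead of going through .prop files. A rough sketch, where the risk parameter and the downstream job names are invented for illustration:

```groovy
node {
    // build, unit test and package the web application here ...

    def risk = params.RISK_LEVEL   // e.g. 'none', 'low' or 'high'
    if (risk == 'low') {
        build job: 'low-risk-pipeline'
    } else if (risk == 'high') {
        build job: 'high-risk-pipeline'
    }
    // 'none': do nothing
}
```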
Have you looked at the Conditional BuildStep Plugin? (https://wiki.jenkins.io/display/JENKINS/Conditional+BuildStep+Plugin)
I think it can do what you're looking for.
If you want a conditional post-build step, there is a plugin for that:
https://wiki.jenkins-ci.org/display/JENKINS/Post+build+task
It will search the console log for a regex you specify and, if found, will execute a custom script. You can configure fairly complex criteria, and you can configure multiple sets of criteria, each executing a different post-build task.
It doesn't provide you with the usual "build step" actions, so you've got to write your own script there. You can trigger execution of the same job with different parameters, or of another job with some parameters, in the standard ways that Jenkins supports (for example using curl).
Yet another alternative is Jenkins text finder plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Text-finder+Plugin
This is a post-build step that allows you to forcefully mark a build as "unstable" if a regex is found in the console text (or even in some file in the workspace). So, in your build steps, depending on your conditions, echo a unique line into the console log, and then match that line with a regex. You can then use "Trigger parameterized builds" and set the condition to "unstable". This has the added benefit of visually marking the build differently (with a yellow ball); however, you only get one conditional option with this method, and from your OP it looks like you need two.
Try a combination of these two methods.
Do you use Ant for your builds?
If so, it's possible to do conditional building in Ant by having a set of environment variables that your build scripts can use to build conditionally. In Jenkins, your job will then go through all of the projects, but the actual build script decides whether it really builds or just short-circuits.
I think the way to do it is to add an intermediate job that you put in the post-build step and pass to it all the parameters your downstream jobs could possibly need, and then within that job place conditional builds for the real downstream jobs.
The simplest approach I found is to trigger other jobs remotely, so that you can use Conditional Build Plugin or any other plugins to build other jobs conditionally.